Realtime PowerTrack, Historical PowerTrack, the 30-Day Search API, and Volume Streams (such as the Decahose product) are now available on the Gnip 2.0 platform. These enhanced versions of our core API products offer new enrichments, new filtering capabilities, and increased reliability from having multiple data centers.

The first step in the migration process is updating your data consumption applications for Gnip 2.0. These are the applications that make requests to Gnip APIs and process the API responses. Our team will help you set up a 30-day migration period with access to both the 1.0 and 2.0 versions so you can migrate your applications to Gnip 2.0. All migrations need to be completed by December 1, 2016, at which point we will sunset all 1.0 endpoints.

The purpose of this article is to dig into the details that you and your developer colleagues will need to consider as you start to migrate to Gnip 2.0. We’ll start off by discussing migration steps common to all products, then provide product-specific details, including code checklists.

As you’ll see below, migrating to Gnip 2.0 is a relatively straightforward process. We have already heard back from customers that have completed the process in a few days.

Updating Gnip API Clients 

Regardless of the Gnip 2.0 products you use, here are migration steps common to all of them:

  • Review the product-specific Migration Guides.
  • Update endpoint URLs to 2.0 versions.
  • Review new Gnip Enrichment offerings and consider how they can help with your use-case. New offerings include Expanded URLs and Klout 2.0. Also, Gnip Enrichment metadata is now available in the Twitter ‘original’ format.
  • For filtered products (PowerTrack, Historical PowerTrack, and 30-Day Search), review the product-specific Migration Guides for changes in rule Operators:
    • New Operators.
    • Replaced and Deprecated Operators.
    • Language classification.
    • Changes in tokenization and matching behavior.
    • All Gnip 2.0 products support long rules, with up to 2,048 characters.
      • Note that the Realtime and Historical PowerTrack products will no longer return rule values/syntax in the JSON payload (same behavior as with version 1.0 ‘long’ rules). However, these products will return rule IDs and tags. If your Gnip 1.0 system depends on the ‘matching rules’ rule value metadata, you will need to instead use the new rule IDs to look up the rule syntax on the client side.
    • Both Gnip 1.0 and 2.0 use Basic Authentication with the same Gnip credentials. If you are unable to authenticate, make sure you are explicitly adding the “Authorization: Basic” HTTP header to your API requests.

For more information on migrating PowerTrack rules/filters, see this Support article on preparing 2.0 rule sets.
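The authentication step above can be sketched in a few lines. This is a minimal example of building the explicit Basic Authentication header (the standard HTTP header name is “Authorization”); the credentials shown are placeholders for your own Gnip account credentials.

```python
import base64

def basic_auth_header(username, password):
    # Build the explicit "Authorization: Basic <token>" header; the token is
    # the base64 encoding of "username:password".
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return {"Authorization": f"Basic {token}"}

# Placeholder credentials -- substitute your own Gnip account credentials.
headers = basic_auth_header("user", "pass")
```

Adding the header explicitly (rather than relying on your HTTP library to answer a 401 challenge) avoids an extra round trip and works with endpoints that expect preemptive authentication.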

As Gnip 2.0 has rolled out, we have been updating the Gnip clients we use in-house. To help develop the information below, we updated example clients for Realtime PowerTrack, 30-Day Search, and Historical PowerTrack.

Product-Specific Details  

Beyond the general to-dos listed above, below are some more product-specific migration details to consider.

Realtime PowerTrack  

If you are porting PowerTrack 1.0 streams to 2.0, be sure to study its Migration Guide.

New URLs:

  • Streaming API
    • Version 1.0:{ACCOUNT_NAME}/publishers/twitter/streams/track/{STREAM_LABEL}.json
    • Version 2.0:{ACCOUNT_NAME}/publishers/twitter/{STREAM_LABEL}.json
  • Rules API
    • Version 1.0:{ACCOUNT_NAME}/publishers/twitter/streams/track/{STREAM_LABEL}.json
    • Version 2.0:{ACCOUNT_NAME}/publishers/twitter/{STREAM_LABEL}.json
  • Replay API
    • Version 1.0:{ACCOUNT_NAME}/publishers/twitter/replay/track/{STREAM_LABEL}.json
    • Version 2.0:{ACCOUNT_NAME}/publishers/twitter/{STREAM_LABEL}.json

API Updates:

Streaming API
  • Backfill is triggered with the new backfillMinutes request parameter. The version 1.0 client parameter has been deprecated, and connection request parameter validation will prevent its use. See our Backfill documentation for more details.

  • With version 2.0, the connection heartbeat interval is decreased from every 15 seconds to every 10 seconds. This provides the opportunity to shorten your data read timeout intervals.
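The two streaming changes above can be sketched together. The host, account, and label in the URL below are placeholders, and the three-missed-heartbeats reconnect threshold is an illustrative choice, not a Gnip recommendation.

```python
from urllib.parse import urlencode

# The 2.0 heartbeat arrives every 10 seconds (down from 15 in 1.0), so the
# read timeout can be tightened. Three missed heartbeats is an illustrative
# reconnect threshold, not a Gnip recommendation.
HEARTBEAT_SECONDS = 10
READ_TIMEOUT_SECONDS = HEARTBEAT_SECONDS * 3

def stream_url(base_url, backfill_minutes=5):
    # backfillMinutes replaces the 1.0 "client" parameter, which 2.0 rejects.
    return base_url + "?" + urlencode({"backfillMinutes": backfill_minutes})

# Placeholder host/account/label -- substitute your own 2.0 stream endpoint.
url = stream_url("https://example.com/accounts/ACCOUNT/publishers/twitter/prod.json")
```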

Rules API
  • As discussed HERE, deleting rules is now done with the POST method and the _method=delete request parameter.

  • Rules are assigned universally-unique IDs (UUIDs) when created and those IDs are returned under the ‘matching_rules’ JSON metadata for all Tweets consumed. If you are using rule tags as a unique ID in version 1.0, you can now revert to using tags to logically group rules. See HERE for more information.

  • When adding or deleting rules, the request payload can now be up to 5 MB in size (up from 1 MB in version 1.0).

  • There are new rule validations applied when adding rules. For example, rules containing an explicit AND or a lowercase “or” are not allowed. See HERE for more information.

  • The Rules API request responses have been updated. See HERE for an example response.
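The new delete pattern can be sketched as follows. The snippet builds (but does not send) a delete-by-ID request; the URL is a placeholder, and the {"rule_ids": [...]} body shown is the 2.0 delete-by-ID form.

```python
import json
from urllib.parse import urlencode

def build_delete_request(rules_url, rule_ids):
    # 2.0 deletes rules with POST plus the _method=delete parameter
    # (1.0 used the DELETE verb). The request body may now be up to 5 MB.
    url = rules_url + "?" + urlencode({"_method": "delete"})
    body = json.dumps({"rule_ids": rule_ids})
    return url, body

# Placeholder URL and rule ID.
url, body = build_delete_request("https://example.com/rules/prod.json", [1234567890])
```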

Other Details

  • See the Volume Streams Migration Guide for migration details for Decahose, Firehose, and User Mention streams. Note that Volume Streams share the same streaming API migration details discussed above, but do not involve any filtering or Rules API updates.

The Twitter Hosebird Client was recently updated for 2.0. This Support article describes its 1.0 to 2.0 migration process. Note that the Hosebird Client does not manage Rules API requests or implement any handling/parsing of consumed data.

Realtime Code Checklist

□ Review the Realtime Migration Guide.

Streaming PowerTrack API

□ Update URL.
□ Update Backfill trigger and logic, if you have that feature.
□ Review and test your data timeout logic with respect to the heartbeat frequency decreasing from 15 to 10 seconds.
□ Add new Rule ID metadata details to parsing and storage code.
□ If using new Enrichments, add metadata details to parsing and storage code.

PowerTrack Rules API

□ Update URL.
□ Update ‘Delete Rule’ method.
□ Add new Rule ID to parsing and storage code.
□ Using 1.0 tags as a unique rule ID? We recommend using the new auto-generated rule UUIDs instead, and reverting to using tags to logically group rules.
□ Handle new Rules API request responses. For example, if any rule fails to get added, get rule-by-rule feedback on its validity. With these metadata you can readily adjust the rules submitted.

PowerTrack Replay API

□ Update URL.
□ Same parsing and storage details as with streaming PowerTrack API.
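The Rule ID checklist items above imply a client-side lookup, since 2.0 payloads carry only rule IDs and tags under the matching_rules metadata. A minimal sketch, assuming you store each rule's syntax when you add it via the Rules API:

```python
import json

# Client-side map of rule ID -> rule syntax, populated when rules are added.
rule_store = {"1234567890": "snow OR ski lang:en"}

def rules_for_tweet(tweet, store):
    # 2.0 payloads carry only rule IDs and tags under "matching_rules";
    # resolve the syntax from the client-side store (None if unknown).
    matched = []
    for rule in tweet.get("matching_rules", []):
        rule_id = str(rule.get("id"))
        matched.append({"id": rule_id,
                        "tag": rule.get("tag"),
                        "value": store.get(rule_id)})
    return matched

# Illustrative payload fragment with only the matching_rules metadata.
tweet = json.loads('{"matching_rules": [{"id": 1234567890, "tag": "winter"}]}')
```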

30-Day Search API 

If you are porting a 30-Day Search API 1.0 client to 2.0, be sure to study its Migration Guide.

If you have migrated from 30-Day Search 1.0 to Full-Archive Search, then you are already familiar with the differences between the 1.0 and 2.0 versions of the 30-Day Search API: Full-Archive Search and 30-Day Search 2.0 share the same data mutability behavior, the same unsupported Operators, and common API request responses.

New 30-Day Search API URLs:

  • Version 1.0:
    • Data:{ACCOUNT_NAME}/search/{LABEL}.json
    • Counts:{ACCOUNT_NAME}/search/{LABEL}/counts.json
  • Version 2.0:
    • Data:{ACCOUNT_NAME}/{LABEL}.json
    • Counts:{ACCOUNT_NAME}/{LABEL}/counts.json

API Updates

  • Search API 2.0 no longer requires the publisher=twitter request parameter, and it should be removed.
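As a sketch of the change above, assuming your client sends the search query as a JSON request body, a 1.0-style request needs only the publisher key removed (the other field names here are illustrative):

```python
import json

def build_search_request(query, max_results=100):
    # A 1.0-style request body; 2.0 needs only the publisher key removed.
    request_1_0 = {"publisher": "twitter", "query": query, "maxResults": max_results}
    request_2_0 = {k: v for k, v in request_1_0.items() if k != "publisher"}
    return json.dumps(request_2_0)

body = build_search_request("snow")
```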

If you are not already using Full-Archive Search, these videos are another resource for exploring the differences between versions of 30-Day Search.

A Ruby client for the 30-Day Search API was recently updated. Its README includes notes on the 1.0 to 2.0 migration process.

30-Day Search Code Checklist

□ Review the 30-Day Search API Migration Guide.
□ Update Data and Count URLs.
□ Remove publisher=twitter from request parameters.
□ Handle new Search API request responses.
□ If using new Enrichments, add metadata details to parsing and storage code.

Historical PowerTrack  

If you are porting a Historical PowerTrack API 1.0 client app to 2.0, be sure to study its Migration Guide.

New Historical PowerTrack URLs:

The root host domain has been updated from to

  • Version 1.0:
    • Get Jobs:{ACCOUNT_NAME}/jobs.json
    • Creating a Job:{ACCOUNT_NAME}/jobs.json
    • Accepting Job:{ACCOUNT_NAME}/publishers/twitter/historical/track/jobs/{JOB_UUID}.json
    • Getting Job status:{ACCOUNT_NAME}/publishers/twitter/historical/track/jobs/{JOB_UUID}.json
    • Getting Download Links:{ACCOUNT_NAME}/publishers/twitter/historical/track/jobs/{JOB_UUID}/results.json
  • Version 2.0:
    • Get Jobs:{ACCOUNT_NAME}/publishers/twitter/jobs.json
    • Creating a Job:{ACCOUNT_NAME}/publishers/twitter/jobs.json
    • Accepting Job:{ACCOUNT_NAME}/publishers/twitter/jobs/{JOB_UUID}.json
    • Getting Job status:{ACCOUNT_NAME}/publishers/twitter/jobs/{JOB_UUID}.json
    • Getting Download Links:{ACCOUNT_NAME}/publishers/twitter/jobs/{JOB_UUID}/results.json

See the HPT API Reference documentation and this article on downloading HPT data for more details.

A Ruby client for Historical PowerTrack was recently updated. Its README includes notes on the 1.0 to 2.0 migration process.

API Updates

  • When creating a new Job, set the streamType to track_v2.
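A sketch of a 2.0 Job description with the new streamType value. The title, dates, and rules below are placeholders; substitute your own Job details.

```python
import json

# Placeholder title, dates, and rules -- substitute your own Job details.
job = {
    "publisher": "twitter",
    "streamType": "track_v2",   # was "track" in 1.0
    "dataFormat": "original",
    "fromDate": "201607010000",
    "toDate": "201607020000",
    "title": "example-migration-job",
    "rules": [{"value": "snow", "tag": "weather"}],
}
payload = json.dumps(job)
```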

Historical PowerTrack Code Checklist

□ Review the Historical PowerTrack Migration Guide.
□ Update URLs.
□ Update the Job description streamType setting to track_v2.
□ If using new Enrichments, add metadata details to parsing and storage code. Note the availability of new Enrichment metadata HERE.
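If you add the Expanded URLs enrichment, parsing might look like the sketch below. It assumes the enrichment metadata arrives under a top-level “gnip” object containing a “urls” array, and treats it as optional; check the enrichment documentation for the exact payload shape in your chosen format.

```python
import json

def expanded_urls(tweet):
    # Enrichment metadata is optional: unenriched payloads omit the "gnip"
    # object entirely, so default to empty containers at each level.
    gnip = tweet.get("gnip", {})
    return [u["expanded_url"] for u in gnip.get("urls", []) if "expanded_url" in u]

# Illustrative payload fragment with only the enrichment metadata.
tweet = json.loads(
    '{"gnip": {"urls": [{"url": "https://t.co/abc",'
    ' "expanded_url": "https://example.com/article"}]}}'
)
```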