What is a Connector?

Updated on Thursday, 3 July, 2014 - 12:57

Connectors let you select and configure Data Destinations: choose where to send the data generated by your live streams or by your queries against the Historics archive.


Note: All of the connectors are available to customers on the Enterprise Edition; the Pay-As-You-Go plan and the Professional Edition include only a subset of the connectors.


Connector (output_type)              Status  Throughput limit per delivery           Type   Enterprise Edition only
CouchDB (couchdb)                    live    50MB                                    NoSQL
DynamoDB (dynamodb)                  live    Variable, based on your configuration * NoSQL
ElasticSearch (elasticsearch)        live    200MB                                   NoSQL
FTP (ftp)                            live    200MB                                   File   Yes
Google BigQuery (bigquery)           live    100MB                                   SQL    Yes
HTTP (http)                          live    200MB                                   File
MongoDB (mongodb)                    live    50MB                                    NoSQL
MySQL (mysql)                        live    50MB                                    SQL    Yes
PostgreSQL (postgresql)              live    50MB                                    SQL    Yes
Pull (pull)                          live    50MB                                    File
Redis (redis)                        live    50MB                                    NoSQL
S3 (s3)                              live    200MB                                   File
SFTP (sftp)                          live    50MB                                    File   Yes
Splunk Storm REST (splunkstormrest)  live    50MB                                    NoSQL  Yes
Splunk Storm (splunkstorm)           live    20MB                                    NoSQL  Yes
Splunk Enterprise (splunk)           live    50MB                                    NoSQL  Yes
Zoomdata (zoomdata)                  live    10MB                                    NoSQL  Yes

* Amazon generally limits users to 5MB, but you can contact them to discuss your account and potentially increase your limit.

DynamoDB and S3 are part of Amazon Web Services. For the other connectors, you have to set up delivery points on your own servers, or you can lease servers from one of a number of hosting companies. If you do not want a long-term commitment, Amazon Web Services offers EC2 cloud servers that you can use for that purpose.

How do I set up a connector?

Each connector has its own set of parameters. For example, for FTP you would specify a username, password, and directory, whereas for HTTP you would specify parameters such as the URL of the endpoint that will receive your data.

To get started with Push, prepare all the parameters you'll need for the connector you've chosen; make sure you have all of the passwords, keys, and so on ready. Then hit the /push/create endpoint. It requires a stream hash (if you're delivering from a live stream) or a Historics id (if you're running a query against Historics data).
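As a sketch of what that call looks like, the snippet below assembles the parameter set for a /push/create request to an SFTP connector. The hash, host, and directory values are illustrative assumptions, and connector-specific keys beyond the output_params. prefix vary by connector, so check each connector's page for the exact names it requires.

```python
from urllib.parse import urlencode

def build_push_create_params(stream_hash=None, historics_id=None,
                             name="", output_type="", **output_params):
    """Assemble the parameters for a POST to /push/create.

    Exactly one of stream_hash (a live stream) or historics_id
    (a Historics query) must be supplied.
    """
    if (stream_hash is None) == (historics_id is None):
        raise ValueError("supply either a stream hash or a Historics id")
    params = {"name": name, "output_type": output_type}
    if stream_hash is not None:
        params["hash"] = stream_hash
    else:
        params["historics_id"] = historics_id
    # Connector-specific settings go under an "output_params." prefix.
    for key, value in output_params.items():
        params["output_params." + key] = value
    return params

params = build_push_create_params(
    stream_hash="42d388f8b1db997faaf7dab487f11290",  # illustrative hash
    name="my-sftp-delivery",
    output_type="sftp",
    host="sftp.example.com",    # illustrative connector parameters
    directory="/deliveries",
)
print(urlencode(params))
```

The same helper works for a Historics delivery: pass historics_id instead of stream_hash and keep the rest of the parameters unchanged.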

To learn more, read our Push introduction and Push API step-by-step guide. Then, follow the instructions published on each connector's page (see the table at the top of this page for links).

What format do you use for the data?

We deliver data to you in JSON format. JSON is a streamlined, lightweight format that is easy to parse and easy for humans to interpret. Free resources such as jsonlint.com can format raw JSON to make it as human-readable as possible. If you're new to the format, take a look at our Understanding JSON page to get started.
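To give a feel for what parsing a delivery involves, here is a simplified, made-up payload; real interactions carry many more fields, but they are plain JSON that any standard parser can handle, as in this Python sketch:

```python
import json

# A simplified, invented delivery payload for illustration only.
delivery = '''
{
  "interactions": [
    {
      "interaction": {
        "type": "tweet",
        "content": "Just tried the new connector - works great!"
      }
    }
  ]
}
'''

data = json.loads(delivery)
for item in data["interactions"]:
    # Each entry nests the interaction details under an "interaction" key.
    print(item["interaction"]["content"])
```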