This connector is available only to customers on the Enterprise Edition.
Splunk Storm is a cloud-based data analysis and visualization service. Splunk Storm REST is a Push connector that delivers interactions into your Splunk Storm account using the Splunk Storm REST API.
Configuring Splunk Storm REST for Push delivery
To use the Splunk Storm REST API with Push delivery, follow the steps below, skipping any you have already completed. Any operating system will work as long as you have an internet connection:
- Create a Splunk Storm account, if you do not already have one.
- Create a new Splunk Storm project.
- Set the project time zone to UTC.
- You are now ready to set up the Splunk Storm connector.
Configuring Push for delivery to Splunk Storm REST
To enable delivery, you will need to define a stream or a Historics query. Both return details required for a Push subscription: a successful stream definition returns a hash, while a Historics query returns an id. You will need one (but not both) to set the value of the hash or historic_id parameter in a call to /push/create. You can obtain that information with a call to /push/get or /historics/get, or from the DataSift dashboard.
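The either-or rule for hash versus historic_id can be sketched in Python. The helper below is an illustration only (the function name is ours, not part of the DataSift API); it assembles the basic /push/create parameters and enforces that exactly one of the two identifiers is supplied.

```python
# Sketch: a /push/create call takes exactly one of `hash` (for a live
# stream) or `historic_id` (for a Historics query). This helper is a
# hypothetical convenience, not part of the DataSift API itself.
def push_create_params(name, hash=None, historic_id=None):
    if (hash is None) == (historic_id is None):
        raise ValueError("supply exactly one of hash or historic_id")
    params = {"name": name}
    if hash is not None:
        params["hash"] = hash
    else:
        params["historic_id"] = historic_id
    return params
```
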
Once you have the stream hash or the Historics id, you can give that information to /push/create. In the example below we are making that call using curl, but you are free to use any programming language or tool.
curl -X POST 'https://api.datasift.com/v1.6/push/create' \
    -d 'name=connectorsplunkstormrest' \
    -d 'hash=2558e17de13072fa126370c37c5bd8f' \
    -d 'output_type=splunkstormrest' \
    -d 'output_params.api_hostname=api.splunkstorm.com' \
    -d 'output_params.project_id=74b232308d9413e291c99231340e9c3e' \
    -d 'output_params.format=json_new_line_timestamp' \
    -d 'output_params.auth.access_token=KcD77o-eGcuc45jIAx5ZAIP8RsD0FQIrU4D3LP2dXC4nYs1aBlooWy9jGQFKBcPTzVMZgi7s2SM=' \
    -H 'Authorization: datasift-user:your-datasift-api-key'
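If you prefer a programming language over curl, the same call can be sketched in Python with the standard library. The request below is built but deliberately not sent, so the sketch stays offline; the parameter values are the same placeholders used in the curl example.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Build (but do not send) the same /push/create request shown in the
# curl example. The values below are placeholders, not working credentials.
params = {
    "name": "connectorsplunkstormrest",
    "hash": "2558e17de13072fa126370c37c5bd8f",
    "output_type": "splunkstormrest",
    "output_params.api_hostname": "api.splunkstorm.com",
    "output_params.project_id": "74b232308d9413e291c99231340e9c3e",
    "output_params.format": "json_new_line_timestamp",
    "output_params.auth.access_token": "your-access-token",
}
request = Request(
    "https://api.datasift.com/v1.6/push/create",
    data=urlencode(params).encode(),
    headers={"Authorization": "datasift-user:your-datasift-api-key"},
)
# urllib.request.urlopen(request) would submit it; omitted here.
```
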
For more information, read the step-by-step guide to the API to learn how to use Push with DataSift's APIs.
When a call to /push/create is successful, you will receive a response that contains a Push subscription id. You will need that information to make successful calls to all other Push API endpoints (/push/delete, /push/stop, and others). You can retrieve the list of your subscription ids with a call to /push/get.
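Extracting the subscription id from the response is a one-line JSON parse. The response envelope below is a placeholder shape we assume for illustration; check the /push/create reference for the authoritative field names.

```python
import json

# Hypothetical /push/create response body -- the shape and the id value
# are placeholders for illustration, not taken from a live API call.
response_body = '{"id": "d468655cfe5f93741ddcd30bb309a8c7", "status": "active"}'
subscription_id = json.loads(response_body)["id"]
```
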
Now check that data is being delivered to your Splunk Storm project's input. Log in to your Splunk Storm account and search using the * wildcard character, which matches any document it can find. If the results are empty, you may need to wait a while for DataSift to populate your project's database.
If there are no results, the most likely reason is that your DataSift stream has not yet produced any data, and you simply need to wait for a few seconds. When you are filtering for content that appears regularly, your stream will produce a high volume of data which will probably reach Splunk with minimal delay. However, a filter for content that appears rarely will result in a low-volume stream and might take several minutes or hours to find just one match. Therefore, it's a good idea to test a stream in the DataSift UI to get an idea of the throughput you should expect.
If there is a longer delay, either the stream has no data in it or there is a problem with the Push platform or the Splunk platform. In the first case, preview your stream in the DataSift web console; in the second, make a call to /push/log to see if there are any clues in there.
Please watch your usage and add funds to your account when it is running low. Also, stop any subscriptions that are no longer needed, otherwise you will be charged for their usage; there is no need to delete them. You can have as many stopped subscriptions as you like without paying for them. Remember that any subscriptions that were paused automatically due to insufficient funds will resume when you add funds to your account.
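When housekeeping your subscriptions, it can help to separate those still running from those already stopped. The sketch below assumes a /push/get-style listing; the field names and values are illustrative assumptions, not the documented response shape.

```python
# Sketch: from a /push/get-style listing, pick out the subscriptions
# that are still running and may need stopping. The field names and
# values here are assumptions for illustration.
subscriptions = [
    {"id": "sub-1", "status": "active"},
    {"id": "sub-2", "status": "stopped"},
]
running = [s["id"] for s in subscriptions if s["status"] == "active"]
```
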
- Familiarize yourself with the output parameters (for example, the project name) you'll need when you send data to a Splunk Storm project.
Twitter sends delete messages which identify Tweets that have been deleted. Under your licensing terms, you must process these delete messages and delete the corresponding Tweets from your storage.
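Processing delete messages amounts to removing the matching Tweets from whatever store you keep. The sketch below assumes a message shape with a "deleted" flag and a nested interaction id; that shape is an assumption for illustration, so match it to the actual payload you receive.

```python
# Sketch of honouring Twitter delete messages. The message shape (a
# "deleted" flag plus the interaction id) is an assumed example, not
# the documented payload format.
def apply_deletes(store, messages):
    """Remove deleted Tweets from `store`, a dict keyed by interaction id."""
    for msg in messages:
        if msg.get("deleted"):
            store.pop(msg["interaction"]["id"], None)
    return store
```
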
The following parameters need values that can be found on your Splunk Storm project Inputs page.
You will need to complete the following steps to get there:
- Visit the project list page.
- Click on the project you want to use.
- Click on the Settings icon.
- Click on the Inputs tab.
- Click on the API tab.
| Parameter | Default | Description | Example values |
| --- | --- | --- | --- |
| output_params.api_hostname | api.splunkstorm.com | The name of the Splunk Storm REST API host. | http://api.splunkstorm.com |
| output_params.project_id | none | The ID of your Splunk Storm REST project. | AaBbCcDdEeFf0123456789AaBbCcDdEe |
| output_params.format | json_new_line_timestamp_meta | The output format for your data. Take a look at our Sample Output for File-Based Connectors page. | json_new_line_timestamp |
| output_params.auth.access_token | none | Your Splunk Storm REST access token. | AaBbCcDdEeFf0123456789AaBbCcDdEeFf0123456789AaBbCcDdEeFf0123456789AaBbCcDdE= |
Data format delivered:
JSON format, with one interaction per document.
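With a newline-delimited JSON format, each non-empty line of a delivered payload parses as one interaction document. The payload content below is a made-up example for illustration.

```python
import json

# Sketch: split a newline-delimited payload into one JSON document per
# interaction. The payload content is an illustrative example only.
payload = (
    '{"interaction": {"content": "first"}}\n'
    '{"interaction": {"content": "second"}}\n'
)
docs = [json.loads(line) for line in payload.splitlines() if line.strip()]
```
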
DataSift cannot currently send more than 5MB of data every 10 seconds to Splunk Storm. This is due to limitations of the Splunk Storm infrastructure and may change in the future.
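If you feed data onward at a similar rate, a simple windowed byte counter matching the 5MB-per-10-seconds limit looks like this. The class is a minimal sketch of sender-side throttling, not part of any DataSift or Splunk library.

```python
import time

# Sketch of a sender-side cap matching 5 MB per 10-second window.
# This is an illustrative helper, not part of the DataSift platform.
LIMIT_BYTES = 5 * 1024 * 1024
WINDOW_SECONDS = 10

class Throttle:
    def __init__(self, now=time.monotonic):
        self.now = now
        self.window_start = now()
        self.sent = 0

    def admit(self, size):
        """Return True if `size` bytes still fit in the current window."""
        if self.now() - self.window_start >= WINDOW_SECONDS:
            self.window_start = self.now()
            self.sent = 0
        if self.sent + size > LIMIT_BYTES:
            return False
        self.sent += size
        return True
```

Chunks that are refused can be queued and retried once the current 10-second window rolls over.
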
Please refer to the Splunk Storm support page.