Exporting Events to Syslog
Author: Sila Kissuu, IBM On Demand Consulting
Created: 5/19/2017

Introduction

One of the product enhancements in IBM API Connect v5.0.7.0 is the capability to offload analytics data to external systems for further processing. This capability is useful if you want to consolidate data from multiple sources, if you require enhanced monitoring, or if you want to enrich your analytics data. Supported third-party systems include:

• HTTP servers
• Elasticsearch clusters
• Kafka clusters
• Syslog servers

In this article, we illustrate the process of configuring IBM API Connect to offload event data to a Splunk server via syslog.

Prerequisites

Collect the connection details and other configuration information that is required to configure a data offload to the target system. The configuration requirements are as follows:

• To configure data offload to an HTTP server, you must, at a minimum, provide the server URL. If you require encrypted communication with the server, you must also have a Transport Layer Security (TLS) profile defined. You can optionally add standardized or custom HTTP headers to define additional operating parameters.

• To configure data offload to an Elasticsearch cluster, you must, at a minimum, provide one or more server URLs, define how indexes should be created by specifying a dynamically configured name, specify the number of primary shards for storing the indexed documents, and specify a number of replica shards for redundancy. If you require encrypted communication with the cluster, you must also have a TLS profile defined. You can optionally specify server authentication credentials.

• To configure data offload to a Kafka cluster, you must, at a minimum, provide host and port connection details for one or more servers, and the name of the Kafka topic to which you want to publish the offloaded data. If you require encrypted communication with the cluster, you must also have a TLS profile defined.
For logging and monitoring purposes within Kafka, you can optionally specify a string identifier by which API Connect can be uniquely identified.

• To configure data offload to Syslog, you must, at a minimum, provide host and port connection details for the server. If you require encrypted communication with Syslog using the TCP protocol, you must also have a TLS profile defined.

It is recommended that you create one or more TLS profiles to use for encrypted communication instead of using the default TLS profile, named Cloud Manager and API Manager TLS Profile.

Procedure: Splunk

1. Configure a data input that listens on a TCP or UDP port:
   a. Specify the port number on which Splunk will listen.
   b. Click Next.
   c. In the Select Source Type drop-down list, select Operating System, then syslog.
   d. Specify a preferred method for setting the host field for events coming from the Management server.
   e. Specify the index where Splunk will store incoming data.
   f. Click Review, then Submit.
2. If the configuration finishes successfully, your Splunk server is ready to collect events from the Management server.
3. Proceed to enable syslog in the Cloud Manager.

Procedure: Enable syslog in the Cloud Manager console

1. Run the command debug tail file /var/log/cmc.out to monitor log entries as you go through these steps.
2. In the Cloud Manager, click Settings.
3. From the navigation pane, click Analytics.
4. Complete the following fields, available under the API Events, Monitoring Events, Log Events, or Audit Events section (configure any or all event types of interest):
   a. Select the Export events to a third-party system check box.
   b. From the Select Analytics Platform drop-down list that is displayed, select Syslog.
   c. Click Configure to specify connection details and other configuration options for the Syslog server.
   d. Complete the fields in the Syslog Output window as follows:
      i. In the Host field, specify a fully qualified host name or IP address of the Syslog server.
      ii. In the Port field, accept the default port, which is set based on the protocol that you specify in the next field, or specify another port number on which Syslog listens for incoming connections. The default port is 514 when connecting using the UDP protocol, and 601 for the TCP protocol.
      iii. From the Protocol drop-down list, specify your preferred protocol from the following options:
         1. UDP: The connectionless User Datagram Protocol (UDP), which is generally used for simpler message transmissions; it offers no guarantee of delivery and has no handshaking dialogues.
         2. TCP: The more complex Transmission Control Protocol (TCP), which is used for connection-oriented transmissions, with reliable, ordered data streaming between communicating network applications and error-correction facilities.
      iv. To use TLS to set up a private connection to the Syslog server and secure the transmission of the data being offloaded, select the Use TLS check box. Then, select your preferred profile from the drop-down list, which shows all TLS profiles that have been created in the Cloud Manager. The default TLS profile (Cloud Manager and API Manager TLS Profile), which is defined in the Cloud Manager, is selected by default.
         NOTE: The Use TLS check box and the corresponding TLS profile drop-down list are available only if you selected TCP in the Protocol field.
      v. If you specified TCP as the protocol and selected the Use TLS check box, select the Validate Certificate check box if you want to verify the authenticity of the TLS certificate presented by the Syslog server before establishing a connection.
      vi. To verify that your connection and configuration details are valid, click Send Test Event to generate and transmit a test event to the Syslog server.
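Independently of the Send Test Event button, it can help to first confirm that a UDP syslog listener (such as the Splunk data input configured earlier) is reachable at all. The following is a minimal sketch, not part of API Connect; the hostname, tag, and message values are illustrative assumptions.

```python
# Sketch: build and send an RFC 3164-style (BSD) syslog message over UDP.
# Useful for checking that a syslog listener is reachable before configuring
# the offload. All host/port values are placeholders.
import socket
import time

def rfc3164_message(facility: int, severity: int, hostname: str, tag: str, text: str) -> bytes:
    """Build a BSD-syslog line: <PRI>TIMESTAMP HOSTNAME TAG: TEXT."""
    pri = facility * 8 + severity  # e.g. facility 1 (user) + severity 6 (info) -> PRI 14
    timestamp = time.strftime("%b %d %H:%M:%S")
    return f"<{pri}>{timestamp} {hostname} {tag}: {text}".encode("utf-8")

def send_udp_syslog(message: bytes, host: str, port: int = 514) -> None:
    # UDP is connectionless: sendto() returns immediately with no delivery
    # guarantee, which mirrors the UDP caveat described in the steps above.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, (host, port))
```

For example, send_udp_syslog(rfc3164_message(1, 6, "apic-mgmt", "apic-test", "hello"), "192.168.1.199", 514) emits one datagram to the listener configured in the Splunk procedure; check the target index in Splunk for the message.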
Messages are displayed beneath the primary banner in the Cloud Manager UI to indicate that an analytics event is being sent, and that the transmission was successful. The log entries below show submission of a test event:

2017-05-19 19:21:02.058 INFO [T-172] [com.ibm.apimgmt.api.rest.ApiServlet.logRequest] 192.168.1.199: POST /v1/cloud/analyticsoutputs/test
2017-05-19 19:21:02.915 INFO [T-172] [com.ibm.apimgmt.config.impl.ETagRepositoryServiceConfig.invalidate] Invalidate path: /cloud/analyticsoutputs/test

A successful test returns a response similar to this:

2017-05-19 19:22:41.129 INFO [T-172] [com.ibm.apimgmt.resources.analytics.AnalyticsOutputsResource.analyticsoffloadTestPost] Response: {
  "code": 0,
  "stdOut": "Sending Logstash's logs to /opt/logstash/testlogs which is now configured via log4j2.properties\n[2017-05-19T19:22:31,627][INFO ][logstash.pipeline] Starting pipeline {\"id\"=>\"main\", \"pipeline.workers\"=>4, \"pipeline.batch.size\"=>125, \"pipeline.batch.delay\"=>5, \"pipeline.max_inflight\"=>500}\n[2017-05-19T19:22:31,651][INFO ][logstash.pipeline] Pipeline main started\n[2017-05-19T19:22:31,788][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9805}\n",
  "stdErr": "",
  "eventSent": true
}
2017-05-19 19:22:41.131 INFO [T-172] [com.ibm.apimgmt.api.util.ApiResponseHandler.log] 192.168.1.199: POST /v1/cloud/analyticsoutputs/test 200

If the transmission failed, an error message is displayed on top of the current window to inform you of the failure. Click OK to close the error message and then verify that your configuration settings are correct.
The log file cmc.out also provides additional failure information to assist with troubleshooting:

2017-05-19 19:30:52.512 INFO [T-172] [com.ibm.apimgmt.resources.analytics.AnalyticsOutputsResource.analyticsoffloadTestPost] Response: {"code": 99, "stdOut": "Error running command, see logs for details", "stdErr": ""}
2017-05-19 19:30:52.514 WARNING [T-172] [com.ibm.apimgmt.resources.analytics.AnalyticsOutputsResource.analyticsoffloadTestPost] Logstash configure test command failed, see /var/log/logstashconfigure.out for more info.

You can verify that the target system has received the event by looking for an event that includes apimanagement_testevent as a value in the event record. [Illustration: the event apimanagement_testevent as received by Splunk.]

To see examples of the test events that are generated for API, monitoring, log, and audit events, see Sample test events for analytics offload.

Below are sample log entries created in /var/log/cmc.out indicating a successful test:

2017-05-19 19:07:08.040 INFO [T-172] [com.ibm.apimgmt.resources.analytics.AnalyticsOutputsResource.analyticsoffloadTestPost] Response: {
  "code": 0,
  "stdOut": "Sending Logstash's logs to /opt/logstash/testlogs which is now configured via log4j2.properties\n[2017-05-19T19:06:56,722][INFO ][logstash.pipeline] Starting pipeline {\"id\"=>\"main\", \"pipeline.workers\"=>4, \"pipeline.batch.size\"=>125, \"pipeline.batch.delay\"=>5, \"pipeline.max_inflight\"=>500}\n[2017-05-19T19:06:56,779][INFO ][logstash.pipeline] Pipeline main started\n[2017-05-19T19:06:56,882][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9805}\n",
  "stdErr": "",
  "eventSent": true
}
2017-05-19 19:07:08.043 INFO [T-172] [com.ibm.apimgmt.api.util.ApiResponseHandler.log] 192.168.1.199: POST /v1/cloud/analyticsoutputs/test 200

      vii. Click Update to store the settings configured for offloading data to the Syslog server.
   e. Click the Save icon to collectively save your defined settings for enabling or disabling access to analytics, and for offloading data.

Procedure: Enable syslog in the Management server

To enable syslog in the Management server, run the following command:

mgmt syslog set remote ip <Splunk_server_IP> port <portNumber>

For example:

management/APIConnect> mgmt syslog set remote ip 192.168.1.199 port 6001
syslog messages will now be sent to 192.168.1.199:6001
Reloading syslog configuration
syslog-ng start/running, process 28025

The command enables syslog and sends events to the Splunk server at 192.168.1.199, port 6001.

Other syslog-specific commands include:

• mgmt syslog del config: Delete the syslog configuration.
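To confirm that the Management server is actually emitting syslog traffic after running the mgmt syslog set remote command, a throwaway listener can temporarily stand in for the Splunk input. The sketch below is a test aid, not part of API Connect; port 6001 simply matches the example above, and it assumes the TCP protocol.

```python
# Sketch: a single-shot TCP listener that stands in for the Splunk syslog
# input, so you can watch for traffic after 'mgmt syslog set remote'.
import socket

def receive_one_message(host: str = "0.0.0.0", port: int = 6001, timeout: float = 30.0) -> bytes:
    """Accept one TCP connection and return the first chunk of received data."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        srv.settimeout(timeout)
        conn, _addr = srv.accept()
        with conn:
            return conn.recv(4096)  # enough for a single syslog line
```

Run receive_one_message() on the host that the Splunk server address resolves to, then issue the mgmt syslog set remote command on the Management server; the returned bytes should be a syslog-formatted event line. If you configured the UDP protocol instead, use a SOCK_DGRAM socket rather than a TCP listener.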