IBM Cloud Application Business Insights Version 1.1.3

Administering IBM Cloud Application Business Insights

IBM

SC27-8793-03

Note: Before using this information and the product it supports, read the information in “Notices” on page 55.

This edition applies to the IBM® Cloud Application Business Insights, Version 1.1.3 and to all subsequent releases and modifications until otherwise indicated in new editions.

© Copyright International Business Machines Corporation 2018, 2019. US Government Users Restricted Rights – Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.

Contents

Chapter 1. Administering
   User administration
      Default Dashboard Designer roles
      Tool Content Groups and Engine User Groups
      Cloud Application Business Insights Users
   Configuring Connectors and Sources
      Connectors for IBM products
      Connectors for other products
      Generic Connectors
      Managing connector sources
   Configuring the scheduler
      Configuring the scheduler email
      Updating the scheduler user login details
   Application backup and restore
      Performing data backup
      Restoring the backup data
   Additional configuration settings
      Enabling the display of external application URL
      Modifying the default number of rows retrieved from the JDBC data source
      Setting the engine timeout interval from command line
      Configuring the keystore password
      Configuring logging
      Configuring SMTP server for email notifications
      Configuring to use the utility scripts on IBM Cloud Private

Notices
   Trademarks
   Terms and conditions for product documentation
   IBM Online Privacy Statement

Chapter 1. Administering

Administering the Cloud Application Business Insights application includes user administration, application configuration, application backup and restore, and third-party database configuration.

User administration

Use the role-based and group-based access control options to create users and assign user roles and groups to them. User roles enable users to perform different tasks in the Cloud Application Business Insights application.

Security is based on the role and groups that are assigned to a user. A role is a group of permissions that control the actions that a user can perform in the Cloud Application Business Insights application.

During installation, a default user is added to Dashboard Designer. This default user has access to Dashboard Designer and an Engine instance. However, the user cannot access any Engine User Groups. The default user must first create a System Administrator user, and that System Administrator can then grant the default user access to the Engine User Groups.

The default user can add multiple Tool Content Groups, Engine User Groups, Dashboard Designer users, Engine users, and Cloud Application Business Insights users. The default user can assign only one user role and one Engine instance to each user, but can assign multiple Tool Content Groups and Engine User Groups to each user.

Default Dashboard Designer roles

Dashboard Designer contains predefined, default roles. You cannot modify or delete these roles, nor can you create custom roles. The default user who is added during installation must create more users and assign them roles and groups as required.

Table 1. Predefined roles and their privileges

Dashboard Developer
   Create dashboards.
Publisher
   Create and deploy widgets and dashboards.
Menu Administrator
   Create and publish widgets and dashboards. Dashboards are deployed by using Menu Access.
System Administrator
   Complete access to Dashboard Designer, and can perform all the tasks that are available within Dashboard Designer. Can also manage connectors, data definition queries, and Dashboard Designer users.

Note: The default user who is created while installing Cloud Application Business Insights does not have access to Engine User Groups. The default user first needs to create a System Administrator user, and then that System Administrator can grant access to the Engine User Groups to the default user.

Only the default user and users with System Administrator role can complete the following user management tasks:
• Create Tool Content Groups.
• Create Engine User Groups.

• Create Dashboard Designer users, Engine users, and users who can access both Dashboard Designer and Engine.
• Assign Tool Roles, Tool Content Groups, Engine User Groups, and Engine Instances to users.

Note: If you assign the Dashboard Developer role to users, then such users cannot preview tool contents or dashboards that are created by them or their group members. To enable such users to preview tool contents or dashboards, you must assign an Engine instance and an Engine User Group to such users. As a default user or as a System Administrator, you cannot edit your own user profile.

Tool Content Groups and Engine User Groups

Dashboard Designer displays Tool Content Groups and Engine User Groups. Only a System Administrator or a default user can grant access to these user groups. Dashboard Designer displays the following types of user groups:

Tool Content Groups
   Content indicates all the dashboard components and contents such as Layouts, Widgets, Data Definitions, Filters, Dashboards, and Menus that are created by users. Users belonging to the same Tool Content Group can view, use, modify, and delete contents that are created by each other. A default user or a System Administrator can create multiple Tool Content Groups, and assign one or more Tool Content Groups to a single user.

Engine User Groups
   Group of users who can access only those dashboards and widgets that are published on a specified Engine instance. All the published dashboards and widgets are displayed on an Engine instance. Users belonging to the same Engine User Group can access all the dashboards and widgets that are published on the Engine instance that is assigned to that group. A default user or System Administrator can create multiple Engine User Groups, and assign one or more Engine User Groups to a single user.

Managing Tool Content Groups and Engine User Groups

You can add, search, modify, or delete Tool Content Groups and Engine User Groups.

Before you begin

To manage Tool Content Groups and Engine User Groups, you must be logged in as a System Administrator or as a default user.

About this task

You need to perform similar tasks and steps to manage Tool Content Groups and Engine User Groups. This topic provides information about all the common tasks that can be performed on the Tool Content Groups page and the Engine User Groups page.

Procedure

Complete the following steps to view, add, search, edit, or delete Tool Content Groups and Engine User Groups:
1. To manage Tool Content Groups, in the left navigation pane of Dashboard Designer, click Users and Groups > Tool Content Groups. In the Tool Content Groups page, complete the tasks that are provided in the following task table.
2. To manage Engine User Groups, in the left navigation pane of Dashboard Designer, click Users and Groups > Engine User Groups. In the Engine User Groups page, complete the tasks that are provided in the following task table.
3. Complete any of the following tasks:

Table 2. Common Tool Content Groups and Engine User Groups tasks

Add a group
   a. Click Add Group. A window opens.
   b. In the Name field, enter a name for the group.
   c. In the Description field, enter a description for the group.
   d. Complete any of the following steps:
      • To save the newly added group, click Save.
      • If you don’t want to add the group, click Cancel.

Edit a group description
   a. To edit the description of a group, click the Edit icon that is displayed on that row.
   b. Modify the existing description.
   c. Complete any of the following steps:
      • To save the modifications, click Save.
      • To restore the original description, click Cancel.

Search a group
   To find a group, enter the name of the group in the Search field.

Delete groups
   a. To delete a group or multiple groups, complete any of the following steps:
      • Click the Delete icon on the group row.
      • Select the group, and then click the Delete button.
      • To delete multiple groups, select multiple groups, and then click the Delete button.
      A confirmation message is displayed.
   b. Click Ok.

Cloud Application Business Insights Users

You can add different types of users, such as users who can access both Dashboard Designer and Engine, or users who can access only Engine or only Dashboard Designer. You can also grant these users access to one or more Tool Content Groups and Engine User Groups. Using Dashboard Designer, you can create the following users:

Dashboard Designer users
   Dashboard Designer users can create dashboard content such as Layouts, Widgets, Filters, Data Definitions, Dashboards, and Menus based on their user roles. Based on the user role, Dashboard Designer users can be categorized as follows:

   Users with access to Dashboard Designer only
      These users have the Dashboard Developer role assigned to them. They cannot preview the content that is created by them or their group members as they are not assigned any Engine instance or Engine User Groups. If you want such users to preview any dashboard content, then you must assign an Engine instance and an Engine User Group to such users.

   Users with access to Dashboard Designer and Engine
      These users have the Publisher, System Administrator, or Menu Administrator role. They can preview dashboard content, publish widgets and dashboards, create menus, and deploy dashboards, as they have access to Engine. These users have an Engine instance, and one or more Engine User Groups assigned to them.

Engine users
   Can access only Engine. These users are not assigned any default Dashboard Designer role or Tool Content Groups.

Adding users

You can view, add, modify, search, or delete Dashboard Designer users and Engine users by using Dashboard Designer. You can also import users in bulk by using a CSV file.

Before you begin

Before you add users, you must ensure that the following tasks are completed:
• An Engine instance is added.
• At least one Tool Content Group is created.
• At least one Engine User Group is created.

To add users, you must be logged in as a System Administrator or as a default user.

Procedure

Adding Dashboard Designer users with access to Dashboard Designer only
• To add Dashboard Designer users only, complete the following steps:
  a) In the left navigation pane of Dashboard Designer, click Users and Groups > Users. A Users page opens in a new tab.
  b) Click Add User. An Add User window that displays the TOOL ACCESS pane opens.
  c) In the User name field, enter the name of the user.
  d) In the Tool Role pane, click Dashboard Developer.
  e) In the Tool Content Group(s) pane, click the Assign Groups list. A pop-up window opens where the Tool Content Groups are listed alphabetically in ascending order. Complete the following steps in the pop-up window:
     – Click one or more Tool Content Groups that are listed under the Content Groups (select one or more) pane, and add the Tool Content Groups to the Selected Content Groups pane.
     – If you want to delete any group from the Selected Content Groups pane, then click the Delete icon that is displayed next to it.
  f) Click Save.
  A Dashboard Designer user is created. This user can access Dashboard Designer and create tool content and dashboards, but cannot preview them. If you want the user to preview tool content and dashboards, then you must assign an Engine instance and an Engine User Group to the user. Complete the steps to assign an Engine instance and an Engine User Group that are provided in the Adding Engine users section.

Adding Dashboard Designer users with access to Dashboard Designer and Engine
• To add a user who can access both Dashboard Designer and Engine, complete the following steps:
  a) In the left navigation pane of Dashboard Designer, click Users and Groups > Users. A Users page opens in a new tab.

  b) Click Add User. An Add User window that displays the TOOL ACCESS pane opens.
  c) In the User name field, enter the name of the user.
  d) In the Tool Role pane, click either Publisher, Menu Administrator, or System Administrator.
  Note: A System Administrator has access to all the Tool Content Groups. Therefore, if you select System Administrator, then you do not need to select a specific content group from the Tool Content Group(s) pane.
  e) In the Tool Content Group(s) pane, click the Assign Groups list. A pop-up window opens where the Tool Content Groups are listed alphabetically in ascending order. Complete the following steps in the pop-up window:
     – Click one or more Tool Content Groups that are listed under the Content Groups (select one or more) pane, and add the Tool Content Groups to the Selected Content Groups pane.
     – If you want to delete any group from the Selected Content Groups pane, then click the Delete icon that is displayed next to it.
  f) Assign an Engine instance and an Engine User Group to the user by completing the steps that are provided in the Adding Engine users section.

Adding Engine users
• To add Engine users only, complete the following steps:
  a) In the left navigation pane of Dashboard Designer, click Users and Groups > Users. A Users page opens in a new tab.
  b) Click Add User. An Add User window that displays the TOOL ACCESS pane opens.
  c) In the User name field, enter the name of the user.
  d) In the Tool Role pane, click None.
  e) To assign an Engine instance, complete the following steps:
     a. Expand the ENGINE ACCESS pane, and in the Engine Instance(s) pane, click Assign Instances. A pop-up window opens displaying an Engine instance.
     b. Click the Engine instance that is listed under the Engine Instance (select one or more) pane, and add the instance to the Selected Engine Instances pane. To delete the selected instance from the Selected Engine Instances pane, click the Delete icon that is displayed next to the instance.
     c. To enable the user to access the Schedule Tasks menu on Engine, select the Scheduler check box. The user is able to create and manage scheduled tasks for all the dashboards that are displayed on Engine, irrespective of the assigned User Group.
  f) To assign Engine User Groups, in the User Group(s) pane, complete the following steps:
     a. Click the Assign Groups list. A pop-up window opens where the Engine Groups are listed alphabetically in ascending order.

     b. Click one or more Engine Groups that are listed under the User Groups (select one or more) pane, and add the Engine Groups to the Selected User Groups pane. If you want to delete any group from the Selected User Groups pane, then click the Delete icon that is displayed next to it.

Results

After a user is created, the user details are stored in the following tables in the PostgreSQL database:
• Dashboard Designer user details are in the USERS table under the T_DEFAULT_TOOL table.
• Engine user details are in the USERS table under the T_DEFAULT_ENGINE table.
• Cloud Application Business Insights users are stored in both the tables.

What to do next

After you add or delete a user, you must add or delete the user name and password of that user in the basic-registry.xml file.

Importing users in bulk

You can import users, Tool Content Groups, and Engine User Groups to the latest Cloud Application Business Insights version in bulk by using a CSV file.

Before you begin

Ensure that you upgrade your Cloud Application Business Insights installation to its latest version or install its latest version on your server. The latest Cloud Application Business Insights installation image contains the following CSV file:
• Import_users_sample.csv: A sample file that contains dummy data.

After you install Cloud Application Business Insights, the CSV file is available at the following locations:

$install_dir/prdutil/data/importUsers

Where: install_dir is the directory where you installed the application.

$install_dir/post_installation/prdutil/data/importUser

Where: install_dir is the directory where you extracted the UI TAR file.

You must edit any one of the CSV files to include the following information for each user that you plan to import:

User name
   The user name must not be more than 50 characters in length. It can contain alphanumeric characters with underscores. The user name cannot contain spaces or special characters.
Tool Role
   Enter either Dashboard Developer, System Administrator, Publisher, None, or Menu Administrator. If you enter any other role or change the syntax or capitalization of the available roles, then the import for the record fails.
Tool Content Groups
   Can contain blank, single, or multiple values based on the following scenarios:

   • If the user has access to multiple content groups, then you can enter multiple group names, each separated by a pipe symbol.
   • If the user's role is None or System Administrator, then you must leave this column blank.
   • If the user's role is Dashboard Developer, Publisher, or Menu Administrator, then this column must contain at least one value. If it does not contain a value, then the import fails. You can enter multiple content group names for a single user by separating each group with a pipe symbol.
   • If you enter new content groups, then those groups are also created.
Engine Instances
   Can contain a single value or you can leave it blank.
   Note: If you leave it blank, then you cannot create any Engine users or Cloud Application Business Insights users, nor can you enable any Dashboard Designer users to preview any tool content.
Engine User Groups
   Enter either single or multiple values. For multiple values, separate each value by a pipe symbol. If you enter new user groups, then those groups are also created.

Ensure that the CSV file is complete in context to the following parameters:
• Contains valid data.
• Does not contain any additional columns or blank rows.
• Does not contain duplicate user names. For example, if Dashboard Designer already contains a user who is named 'User_abc', then you cannot add another user with the same name.
• For a user, if you add an Engine instance, then you must add an Engine User Group. Else, the import for that user record fails.

To import users, you must be logged in as a root user. A sketch of the expected row layout follows.
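The following rows are a minimal sketch of the layout that these rules describe; the column order is assumed from the field descriptions above, and the user names, group names, and Engine instance name are placeholders, so check Import_users_sample.csv in your installation for the exact header row and column order:

   dashdev_user1,Dashboard Developer,ContentGroupA|ContentGroupB,Engine1,EngineGroupA
   engine_user1,None,,Engine1,EngineGroupA|EngineGroupB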

Procedure

Complete the following steps to import users in bulk:
1. Copy the CSV file that contains user information to the Cloud Application Business Insights installation directory.
2. Run the following command to import users to an on-premises installation of Cloud Application Business Insights:

./import_users.sh csv_file_path

Where csv_file_path is the path of your .csv file. For example:

• ./import_users.sh $install_dir/prdutil/data/importUsers/import_users_sample.csv
  Where: install_dir is the directory where you installed the application.

• ./import_users.sh $install_dir/postinstallation/prdutil/data/importUsers/import_users_sample.csv
  Where: install_dir is the directory where you extracted the UI TAR file.

Note: To resolve any import users-related issues, check the ImportUsersReport.log file that is placed at install_dir/prdutil.

Results

The imported users are displayed on the Users tab in Dashboard Designer.

What to do next

After you add or delete a user, you must add or delete the user name and password of that user in the basic-registry.xml file for the on-premises installation package and in the user-registry.yaml file for the IBM Cloud™ Private installation package.

Managing users

You can view, modify, search, or delete users by using Dashboard Designer.

Before you begin

To manage users, you must be logged in as a System Administrator or as a default user.

Procedure

Complete the following steps to view, modify, search, or delete users:
• To view or search users, complete the following steps:
  a) In the left navigation pane of Dashboard Designer, click Users and Groups > Users. The Users page opens. It displays all the users.
  b) To find a user, enter the name of the user in the Search field.
• To delete a user or delete multiple users, complete the following steps:
  a) To delete a user or users, complete any one of the following steps in the Users page:
     – To delete a user, click the Delete icon that is displayed next to the user, or select the user and click the Delete button.
     – To delete multiple users, select multiple users, and then click the Delete button.
  b) In the confirmation message window that is displayed, click Ok.
• To modify a user, complete the following steps:
  a) In the left navigation pane of Dashboard Designer, click Users and Groups > Users. The Users page opens. It displays all the users.
  b) To edit a user, click the Edit icon that is displayed on the user row. The Edit a user window is displayed.
  c) In the Tool Role pane, click the role that you want to assign to the user.
  Note:
     – If you click System Administrator, then the already selected Content Groups that are displayed in the Tool Content Group(s) pane are deleted, as the System Administrator has access to all the Tool Content Groups.
     – If you click None, then the already selected Content Groups that are displayed in the Tool Content Group(s) pane are deleted. You cannot select any Content Group as the user is an Engine user only and cannot access any Tool content.
     – If you click any Tool Role other than System Administrator and None, then the existing Content Groups are retained.
  d) To add or delete Content Groups, complete the following steps in the Tool Content Group(s) pane:
     a. Click the Assign Groups list. A pop-up window opens where the Tool Content Groups are listed alphabetically in ascending order.
     b. Click one or more Tool Content Groups that are listed under the Content Groups (select one or more) pane, and add the Tool Content Groups to the Selected Content Groups pane.

        – If you want to delete any group from the Selected Content Groups pane, then click the Delete icon that is displayed next to it.
  e) To delete an Engine instance that is assigned to a user, or to disable the access to Schedule Tasks, complete any of the following steps in the ENGINE ACCESS pane:
     a. In the Engine Instance(s) pane, click the Assign Instance list. A pop-up window opens displaying the Engine instance.
     b. To delete the selected instance from the Selected Engine Instances pane, click the Delete icon that is displayed next to the instance.
     c. To disable the access to Schedule Tasks, clear the Scheduler check box.
  Note:
     – If you delete an Engine instance, then the Engine User Groups that are assigned to that user are also deleted. The user cannot preview any Tool content.
     – Users with the System Administrator, Publisher, or Menu Administrator role, and Engine users must have at least one instance assigned to them.
  f) To add or delete Engine User Groups, complete the following steps in the User Group(s) pane:
     a. Click the Assign Groups list. A pop-up window opens where the Engine Groups are listed alphabetically in ascending order.
     b. Click one or more Engine Groups that are listed under the User Groups (select one or more) pane, and add the Engine Groups to the Selected User Groups pane.
        – If you want to delete any group from the Selected User Groups pane, then click the Delete icon that is displayed next to it.
  Note: Users with the System Administrator, Publisher, or Menu Administrator role, and Engine users must have at least one Engine User Group assigned to them.
  g) To save the changes, click Save. Else, click Cancel.

What to do next

After you add or delete a user, you must add or delete the user name and password of that user in the basic-registry.xml file.

Adding or modifying user information in registry

After you add or delete a user in the Dashboard Designer tool, you must also update the basic-registry.xml file to include or delete the user name and password of that user.

Before you begin

Ensure that the user is already added or deleted in Dashboard Designer or Engine.

Procedure

Complete the following steps to add a user name and password to the basic-registry.xml file:
1. Encrypt the passwords. For more information, see the Encrypting passwords section in Installing IBM Cloud Application Business Insights.
2. As a root user, open the basic-registry.xml file from the following location:

cd $install_dir/wlp/usr/servers/icabiap
vi basic-registry.xml

Where, install_dir is the directory where you installed the application. If you do not specify a different installation location during installation, then by default the application is installed in the /opt/icabi folder.
3. Add, modify, or delete the user and password entry in the basic-registry.xml file as follows:
   a) Add a user name and its encrypted password entry within the basicRegistry element, as follows:
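A minimal sketch of such an entry, assuming the WebSphere Liberty basicRegistry format that this file uses; the user name newuser, the realm name, and the encrypted password value are placeholders that you replace with your own values:

   <basicRegistry id="basic" realm="BasicRealm">
       <user name="newuser" password="{aes}EncryptedPasswordValue" />
   </basicRegistry>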

   Note: If ignoreCaseForAuthentication is set to true, a case-insensitive user name can be used to log in to the application.
   b) Optional: Modify an existing user name or password from within the basicRegistry element in basic-registry.xml.
   c) Optional: Delete an existing user name or password entry from the basicRegistry element in basic-registry.xml.
   Note: When you delete a Dashboard Designer tool or Engine user from the user interface, make sure to delete the user entry from the basic-registry.xml file also.
4. Save the basic-registry.xml file.

Adding or modifying user information in user registry

After you add or delete a user in the Dashboard Designer tool, you must also update the user-registry.yaml file to include or delete the user name and password of that user.

About this task

Ensure that the user is already added or deleted in Dashboard Designer or Engine.

Procedure

Complete the following steps to add a user name and password in the user-registry.yaml file:
1. Open the user-registry.yaml file from the following location:

cd $install_dir/icabi_helm/templates
vi user-registry.yaml

Where: install_dir is the directory where you extracted the UI TAR file.
2. Add, modify, or delete the user and password entry in the user-registry.yaml file as follows:
Important:
• The default administrator's user name, which is smadmin, and password, which is smadmin, are mentioned in the values.yaml file that is placed in the install_dir/icabi_helm folder. These values are accepted in the values.yaml file and passed to the user-registry.yaml file.
• The default Scheduler user's user name, which is schedulerAdmin, and password, which is schedulerAdmin, are updated in the values.yaml file that is placed in the install_dir/icabi_helm folder. These values are accepted in the values.yaml file and passed to the user-registry.yaml file.

apiVersion: v1
kind: Secret
metadata:
  name: user-registry
  namespace: {{ .Values.namespace }}
type: Opaque
stringData:
  basic-registry.xml: |-
    #Default Admin user
    #Default scheduler user
    # Add new ICABI users here and redeploy the secret to take effect.

a. Add more users to the user-registry.yaml file by adding a user entry in clear text within the basic-registry.xml content of the secret. A sketch of such an entry is shown after this procedure.
b. Optional: Modify an existing user name or password in user-registry.yaml.
c. Optional: Delete an existing user name or password entry in user-registry.yaml.
Note:
• When you delete a Dashboard Designer tool or Engine user from the user interface, make sure to delete the user entry from the user-registry.yaml file.
• If you modify the user-registry.yaml file after the application deployment, manually update the default administrator's credentials, the default scheduler user's credentials, and all the previously existing users' credentials, and then redeploy the user-registry.yaml file. To deploy, run the following command:

kubectl apply -f user-registry.yaml
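A minimal sketch of what an added entry might look like, assuming the secret embeds user entries in the Liberty basicRegistry XML syntax referenced by the basic-registry.xml key shown above; the user name newuser and its password are placeholders:

stringData:
  basic-registry.xml: |-
    #Default Admin user
    #Default scheduler user
    <user name="newuser" password="newuserpassword" />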

Restarting the Cloud Application Business Insights server

You can restart the Cloud Application Business Insights server, the Cloud Application Business Insights OIDC server, the Apache NiFi server, and the PostgreSQL database by running the stop_icabi.sh and start_icabi.sh commands.

Procedure

1. Stop the Cloud Application Business Insights server, Cloud Application Business Insights OIDC server, Apache NiFi server, and PostgreSQL database with the following command:

cd $install_dir
./stop_icabi.sh

Where: install_dir is the directory where the application is installed.

Note: The Cloud Application Business Insights server, Cloud Application Business Insights OIDC server, Apache NiFi server, and PostgreSQL database are stopped.
2. Start the Cloud Application Business Insights server, Cloud Application Business Insights OIDC server, Apache NiFi server, and PostgreSQL database with the following command:

cd $install_dir
./start_icabi.sh

Note: Cloud Application Business Insights server, Cloud Application Business Insights OIDC server, Apache NiFi server, and PostgreSQL database are started.

Configuring Connectors and Sources

Use the Connector component to connect to a data source and to select and configure a specific data provider. A connector is a predefined interface template and is the base for connector sources.

About this task

Connector sources are instances of connector types, which can be configured to connect directly to data providers. Based on the supported connector types, you can configure connector sources for various databases or REST-compliant web services. These connector sources can then be used to create data definitions. The Cloud Application Business Insights connector framework can be categorized as follows:

Connectors for IBM products

Connectors to connect to data sources for various IBM products. List of connectors for IBM products:

Table 3. Connectors for IBM products

apmcdp
   Product name: IBM Cloud Application Performance Management
   Description: Connects to Cloud APM instances. The display name of the connector is APM.

itmdp
   Product name: IBM Tivoli® Monitoring
   Description: Connects to IBM Tivoli Monitoring instances. The display name of the connector is ITM.

icamsvc
   Product name: IBM Cloud App Management
   Description: Connects to Cloud App Management instances. The display name of the connector is ICAM.

omnibus_restapi
   Product name: Tivoli Netcool®/OMNIbus
   Description: Connects to Tivoli Netcool/OMNIbus web services. The display name of the connector is Omnibus. Note: It is based on Apache NiFi-platform and uses REST API for data retrieval.

qradar_restapi
   Product name: QRadar®
   Description: Connects to QRadar REST-compliant web services. The display name of the connector is QRadar. Note: It is based on Apache NiFi-platform and uses REST API for data retrieval.

Note: You can edit only the display name of the connector types.

Configuring IBM Cloud Application Performance Management connector sources

You can create connector sources to IBM Cloud Application Performance Management by using the built-in connectors that are available in Cloud Application Business Insights.

Before you begin

Ensure that the IBM Cloud Application Performance Management database environment is up and running.

Procedure

1. Open Dashboard Designer.
2. In the navigation pane, go to Connector & Sources > Connector Sources.
3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed. By default, APM is selected in the Connector Type list.
4. In all the remaining fields, enter the information that is provided in the following table:

Table 4. Cloud APM connector data source information

Connector Source Name
   Enter a name of the Cloud APM data source that you want to add. Source name can contain alphanumeric characters and underscores.

URL
   The default URL format is https://Server_IP_or_HOSTNAME:port
   Where:
   • Server_IP_or_HOSTNAME is the IP address or host name of APM.
   • port is the default port for Cloud APM. The default port is 9050.

System Config URL
   The default system configuration URL format for Cloud APM is https://APM_Server_IP_or_HOSTNAME:port
   Where:
   • The system configuration URL is secured and uses https.
   • APM_Server_IP_or_HOSTNAME is the IP address or host name of the Cloud APM server.
   • port is the default port for Cloud APM. The default port is 8091.

Username
   Enter the user name for the Cloud APM CURI data provider. Note: The default user name for the Cloud APM CURI data provider is smadmin.

Password
   Enter a password for the Cloud APM CURI data provider. Contact the administrator for this information.

5. To make sure that the connection to the data source is successful, click Test Connection. If the connection to the data source fails, then a message that indicates the reason for connection failure is displayed. You must fix the errors and test the connection again.
6. When the connection to the source is successful, click Save.

Results

The connector source is added and it can be used to create data definitions.

Configuring IBM Tivoli Monitoring connector sources

You can create connector sources to IBM Tivoli Monitoring by using the built-in connectors that are available in Cloud Application Business Insights.

Before you begin

Ensure that the IBM Tivoli Monitoring database environment is up and running. For the ITM data source, get the managed system name for the ITM connector source. For more information, see “Getting the Managed System Name for ITM connectors” on page 15.

Procedure

1. Open Dashboard Designer.
2. In the navigation pane, go to Connector & Sources > Connector Sources.
3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed.
4. From the Connector Type list, select ITM and enter the information that is provided in the following table:

Table 5. IBM Tivoli Monitoring connector data source information

Connector Source Name
   Enter a name of the ITM data source that you want to add. Source name can contain alphanumeric characters and underscores.

URL
   The default URL format is https://Server_IP_or_HOSTNAME:port
   Where:
   • Server_IP_or_HOSTNAME is the IP address or host name of the Tivoli Enterprise Portal Server.
   • port is the default port for ITM. The default port is 15200.

Managed System Name
   Enter the correct Tivoli Enterprise Monitoring Server name. If you enter an incorrect name for a valid connector source, then an error message that prompts you to select a different source is displayed. For more information, see “Getting the Managed System Name for ITM connectors” on page 15.

Username
   Enter the user name for the ITM CURI data provider. Note: The default user name for the ITM data provider is sysadmin.

Password
   Enter a password for the ITM CURI data provider. Contact the administrator for this information.

Important: To connect to the OMEGAMON® data source, use the ITM connector and create the source. Make sure that you are able to connect to the system where OMEGAMON is available and the agents are up and running.
5. To make sure that the connection to the data source is successful, click Test Connection. If the connection to the data source fails, then a message that indicates the reason for connection failure is displayed. You must fix the errors and test the connection again.
6. When the connection to the source is successful, click Save.

Results

The connector source is added and it can be used to create data definitions.

Getting the Managed System Name for ITM connectors

You require the Managed System Name for ITM connector configuration. The Managed System Name is the name of a particular operating system, subsystem, or application in an enterprise where a monitoring agent is installed and running.

Before you begin

• Make sure that the data providers are set up for the ITM servers for which you want to create connectors on Cloud Application Business Insights. See Verifying the dashboard data provider is enabled.
• If the portal server has heavy load, you must install a separate dedicated portal server to service the requests from Cloud Application Business Insights. If you set up a separate portal server:
  – Ensure that the dedicated portal server has the application support for the agents whose data needs to be fetched.
  – Ensure that Tivoli Enterprise Portal clients are not connected to this dedicated portal server to complete administrative tasks such as creating custom workspaces, creating situations, and creating managed system groups.

Procedure

Complete the following steps to get the managed system name for the ITM connector source:
• Log in to the Tivoli Enterprise Portal Server by using the following URL:
  http://TEPS_IP_or_HOSTNAME:15200/ITMRESTProvider/test.html
  Where, TEPS_IP_or_HOSTNAME is the Tivoli Enterprise Portal Server IP address or host name.
• Enter your Tivoli Enterprise Portal Server user name and password and click OK.
• Click GET to send a GET request for the provider information. If the data provider is enabled successfully, the following data provider information is displayed in the text field:

{ "filteredRows": 1, "identifier": "id", "items": [ { "MSSName": "ibm-cdm:\/\/\/CDMSS\/Hostname=HV6-COMMON-1+ManufacturerName =IBM+ProductName=IBM Tivoli Monitoring Services", "baseUrl": "http:\/\/:15200\/ibm\/tivoli\/rest", "datasetsUri": "\/providers\/itm.HUB_HV6-COMMON-1\/datasets", "datasourcesUri": "\/providers\/itm.HUB_HV6-COMMON-1\/datasources", "description": "IBM Tivoli Monitoring dashboard data provider", "id": "itm.HUB_HV6-COMMON-1", "label": "HUB_HV6-COMMON-1", "remote": false, "type": "IBMTivoliMonitoringServices", "uri": "\/providers\/itm.HUB_HV6-COMMON-1", "version": "06.30.02.00" } ], "numRows": 1, "totalRows": 1 }

Important: If the URI field is displayed blank, click Reset URI so that the URI field value is set to /ibm/tivoli/rest/providers, and then click GET.
• Note the value for id from the data provider information. The value of id is the managed system name that you require to configure the ITM connector source. For example, itm.HUB_HV6-COMMON-1 is the managed system name.
• Optional: If the responses from the ITM Data Provider Connector Source are slow or inconsistent, set or increase the heap size of the ITM data provider. For more information, see Setting the TEPS/e maximum heap size.

Configuring IBM Cloud App Management connector sources

You can create connector sources to IBM Cloud App Management by using the built-in connectors that are available in Cloud Application Business Insights.

Before you begin

• Ensure that the IBM Cloud App Management database environment on IBM Cloud Private is up and running.
• Ensure that the Cloud App Management Service Data Collector is installed. For more information, see the Installing the Cloud App Management Service Data Collector topic in Administering IBM Cloud Application Business Insights.

Procedure

1. Open Dashboard Designer.

2. In the navigation pane, go to Connector & Sources > Connector Sources.
3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed.
4. From the Connector Type list, select ICAM and enter the information that is provided in the following table:

Table 6. IBM Cloud App Management connector data source information

Connector Source Name
   Enter a name of the Cloud App Management data source that you want to add. Source name can contain alphanumeric characters and underscores.

URL
   The default URL format is https://CLUSTER_IP_or_HOSTNAME:port
   Where:
   • CLUSTER_IP_or_HOSTNAME is the IP address or host name of the master node where the Cloud App Management Service Data Collector is installed. Note: Ensure that the Cloud App Management Server's host name is resolved from the system where you installed Cloud Application Business Insights.
   • port is the default port for the Cloud App Management Service Data Collector. The default port is 30950.

Tenant ID
   Enter your tenant ID (subscription ID) to access Cloud App Management. Your subscription ID is a part of the Cloud App Management login URL. For example, https://CLUSTER_IP_or_HOSTNAME/cemui/landing?subscriptionId=TENANT_ID

5. To make sure that the connection to the data source is successful, click Test Connection. If the connection to the data source fails, then a message that indicates the reason for connection failure is displayed. You must fix the errors and test the connection again.
6. When the connection to the source is successful, click Save.

Results

The connector source is added and it can be used to create data definitions.

Configuring IBM QRadar connector sources

Use QRadar connector to connect to QRadar REST-compliant services. QRadar connector is based on Apache NiFi-platform and uses REST API for data retrieval.

Before you begin

• Ensure that the web service that you want to connect is up and running.
• For the sample request, if you need to provide custom headers, then ensure that you have that information available with you. Custom headers are request headers that are needed to retrieve REST API information from a web service. For example, Authorization request headers or Accept-Charset request headers. Authorization request headers contain authorization information that is required by the REST services. Accept-Charset indicates the acceptable data characters that the response must contain.

Procedure

Complete the following steps to add QRadar REST-compliant services by using the QRadar connector:
1. Open Dashboard Designer.
2. In the navigation pane, go to Connector & Sources > Connector Sources.
3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed.
4. From the Connector Type list, select the QRadar connector.
5. In the Connector Source Name field, enter a name for the QRadar source. Source name can contain alphanumeric characters and underscores.
6. In the Endpoint URL field, enter URL details for the QRadar web service in the following format:
   https://server_IP_or_HOSTNAME/api
   Where, server_IP_or_HOSTNAME is the IP address or host name of the server where the web service is hosted.
7. From the Authentication Type list, select none or Basic Authentication based on whether a user name and password are configured during the installation of the web service. If you select Basic Authentication, then you must complete the following steps:
   a. In the User Name field, enter the user name that is used during the configuration of the web service.
   b. In the Password field, enter the password that is used during the configuration of the web service.
8. Click Next.
9. From the Method list, select the GET method, and in the URI field, enter the uniform resource identifier (URI) for the QRadar source, where URI is the resource URI. A sketch of a possible request is shown after this procedure. For more information about URI formats, see https://www.ibm.com/support/knowledgecenter/SS42VS_7.2.8/com.ibm.qradar.doc/t_api_using_the_doc_user_interface.html.
10. Optional: Under Custom Headers, complete the following steps:
   • In the Name field, enter the request header name that is provided by the REST API web service provider.
   • In the Value field, enter the request header value that is provided by the REST API web service provider.
11. Click Save.
12. To make sure that the connection to the QRadar data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message to indicate that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again.
Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.
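For illustration only, a request against an endpoint URL such as https://qradar.example.com/api might use a URI and custom headers along the following lines; the host name, the chosen resource path, and the token value are assumptions, so confirm them against the QRadar API documentation linked above:

   Method: GET
   URI: /siem/offenses
   Custom header: SEC = your_authorized_service_token
   Custom header: Accept = application/json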

Results

The QRadar web service is added and can be used to create your custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.

Configuring IBM Tivoli Netcool/OMNIbus connector sources

Use Omnibus connector to connect to Tivoli Netcool/OMNIbus REST-compliant services. Omnibus connector is based on Apache NiFi-platform and uses REST API for data retrieval.

Before you begin

• Ensure that the web service that you want to connect is up and running.
• For the sample request, if you need to provide custom headers, then ensure that you have that information available with you. Custom headers are request headers that are needed to retrieve REST API information from a web service. For example, Authorization request headers or Accept-Charset request headers. Authorization request headers contain authorization information that is required by the REST services. Accept-Charset indicates the acceptable data characters that the response must contain.

Procedure

Complete the following steps to add a Tivoli Netcool/OMNIbus web service by using the Omnibus connector:
1. Open Dashboard Designer.
2. In the navigation pane, go to Connector & Sources > Connector Sources.
3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed.
4. From the Connector Type list, select the Omnibus connector.
5. In the Connector Source Name field, enter a name for the Tivoli Netcool/OMNIbus source. Source name can contain alphanumeric characters and underscores.
6. In the Endpoint URL field, enter URL details for the Tivoli Netcool/OMNIbus web service in any of the following formats:
   • http://server_IP_or_HOSTNAME:port
   • https://server_IP_or_HOSTNAME:port
   Where,
   • server_IP_or_HOSTNAME is the IP address or host name of the server where the web service is hosted.
   • port is the default port of the web service.
7. From the Authentication Type list, select none or Basic Authentication based on whether a user name and password are configured during the Tivoli Netcool/OMNIbus installation. If you select Basic Authentication, then you must complete the following steps:
   a. In the User Name field, enter the user name that is used during the Tivoli Netcool/OMNIbus configuration.
   b. In the Password field, enter the password that is used during the Tivoli Netcool/OMNIbus configuration.
8. Click Next.
9. From the Method list, select a method and complete any of the following steps based on the method that you select:
   – For the GET method, in the URI field, enter the uniform resource identifier (URI) for the source, where URI is the resource URI.
   – For the POST method, in the URI field, enter the URI for the source, and in the Request Body field, enter the post request.
   – Under Custom Headers, complete the following steps:
     - In the Name field, enter the request header name that is provided by the REST API web service provider.
     - In the Value field, enter the request header value that is provided by the REST API web service provider.
   A sketch of a possible URI is shown after this procedure. For more information about URI formats, see https://www.ibm.com/support/knowledgecenter/en/SSNFET_9.2.0/com.ibm.netcool_OMNIbus.doc_7.4.0/omnibus/wip/api/reference/omn_api_http_httpinterface.html.
10. Click Save.
11. To make sure that the connection to the Tivoli Netcool/OMNIbus data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again.
Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.
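For illustration only, a GET request against an ObjectServer that has the HTTP interface enabled might use a URI such as the following one to read rows from the alerts.status table; this path follows the interface format in the linked OMNIbus REST API reference, but the exact path on your system is an assumption to verify there:

   Method: GET
   URI: /objectserver/restapi/alerts/status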

Results

The Tivoli Netcool/OMNIbus web service is added and can be used to create custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.

Connectors for other products

Cloud Application Business Insights has connectors to connect to multiple non-IBM product data sources. These connectors are integrated with the Apache NiFi platform. List of connectors for non-IBM products:

Table 7. Connectors for other non-IBM products

cassandra
   Description: Connects to Cassandra databases. The display name of the connector is Cassandra.

mongodb
   Product name: Mongo DB
   Description: Connects to MongoDB databases. The display name of the connector is Mongo DB.

nagiosxi_restapi
   Product name: Nagios XI
   Description: Connects to Nagios XI REST-compliant services. The display name of the connector is Nagios XI.

solarwinds_restapi
   Product name: SolarWinds
   Description: Connects to SolarWinds REST-compliant services. The display name of the connector is SolarWinds.

es_restapi
   Product name: Elasticsearch
   Description: Connects to Elasticsearch application instances. The display name of the connector is Elasticsearch.

hbase
   Product name: Apache HBase
   Description: Connects to HBase databases. The display name of the connector is HBase.

druid_restapi
   Description: Connects to Druid REST-compliant services. The display name of the connector is Druid.

prometheus_restapi
   Product name: Prometheus
   Description: Connects to Prometheus REST-compliant services. The display name of the connector is Prometheus.

hiveql
   Description: Connects to Hive data warehouse. The display name of the connector is HiveQL.

Note: You can edit only the display name of the connector types.

Configuring Apache HBase database connector sources

You can add Apache HBase database to Cloud Application Business Insights by using HBase connector. HBase connector is based on Apache NiFi-platform.

Before you begin

• Ensure that the database that you want to connect is up and running, and is not Secure Sockets Layer (SSL) secured.
• Copy the HBase database configuration files, core-site.xml and hbase-site.xml, to the system where Cloud Application Business Insights is installed. For example, /opt/hbaseconf/core-site.xml and /opt/hbaseconf/hbase-site.xml, where hbaseconf is a folder that you created within the installation directory to copy the XML files.
• Ensure that the DNS entry for the HBase server is configured. If the DNS entry is missing, then you must add the host name and IP address of the HBase server to the /etc/hosts file on the system where Cloud Application Business Insights is installed.

About this task

The HBase database tables are indexed by using row keys. The row keys are implemented as byte arrays and are sorted in byte-lexicographical order. So, in a table that contains row keys that are indexed from 1 - 20, the data for row keys 10 - 19 is stored before the row key 2. Also, if you specify the Start Rowkey as 1 and the End Rowkey as 5, then the data for all the row keys that start from 1 and end before 5 is retrieved. Data for the row key 5 is not retrieved.
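As an aside, the same start-row and stop-row semantics can be observed in the HBase shell, assuming a hypothetical table named mytable; the stop row is excluded from the results, which matches the End Rowkey behavior described above:

   scan 'mytable', {STARTROW => '1', STOPROW => '5'}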

Procedure

Complete the following steps to add an HBase database by using the HBase connector:
1. Open Dashboard Designer.
2. In the navigation pane, go to Connector & Sources > Connector Sources.
3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed.
4. From the Connector Type list, select the HBase connector.
5. In the Connector Source Name field, enter a name for the source. Source name can contain alphanumeric characters and underscores.
6. In the Hadoop Configuration Files field, enter the path to the XML configuration files that you saved on the Cloud Application Business Insights server. For example, /opt/hbaseconf/core-site.xml, /opt/hbaseconf/hbase-site.xml. Note: The file paths must be separated by a comma.
7. Click Next.
8. In the Table Name field, enter a database table name from which you want to retrieve data. Note: If you provide the table name only, then the entire data of that table is retrieved. To retrieve selective data, you must specify Start Rowkey, End Rowkey, and either Columns or Filter Expression.
9. Optional: In the Start Rowkey field, enter the row key value from which you want to retrieve data.
10. Optional: In the End Rowkey field, enter the row key value up to which you want to retrieve the data. Note: Data is retrieved for all the row key values that lie between the start and the end row keys, including the start row key data. The end row key data is not retrieved.
11. Optional: In the Columns field, enter column families or columns in any of the following formats, based on your requirement:

    cf1,cf2
       To retrieve data for all the columns within the specified column families. Where, cf1 and cf2 are the column families in the HBase database table. Each column family contains multiple columns.
    cf1:colA,cf2:colB
       To retrieve data for specific columns within the column families. Where, cf1 and cf2 are the column families and colA and colB are the column names within the column families.

    For example, if an HBase database table contains five column families, cf1, cf2, and so on up to cf5, and each column family contains three columns, such as colA, colB, and colC, then you can enter any of the following values:
    • To retrieve data for all the columns in the column families 1 and 4, enter cf1,cf4.
    • To retrieve data for columns A and C in the column families 1 and 4, enter cf1:colA,cf1:colC,cf4:colA,cf4:colC.
    Note: Enter values either in the Columns field or in the Filter Expression field.
12. Optional: In the Filter Expressions field, enter any of the HBase filter expressions. For more information, see https://www.cloudera.com/documentation/enterprise/5-5-x/topics/admin_hbase_filtering.html
13. Click Save.

14. To make sure that the connection to the database is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message that indicates that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Also, ensure that you enter valid row key values.
Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

Results

The HBase database is added and can be used to create custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.

Configuring Apache Hive connector sources

You can connect to an Apache Hive data warehouse by using the HiveQL connector. The connector uses the Hive Query Language (HiveQL) to fetch data from a Hive data source. The HiveQL connector is based on Apache NiFi-platform.

Before you begin

• Ensure that the Apache Hive data source that you want to connect is up and running, and is not Secure Sockets Layer (SSL) secured.
• Copy the Apache Hive configuration file, hive-site.xml, to the system where Cloud Application Business Insights is installed. For example, /opt/hiveconf/hive-site.xml, where hiveconf is a folder that you created within the installation directory to copy the XML file.
• Ensure that the DNS entry for the Apache Hive server is configured. If the DNS entry is missing, then you must add the host name and IP address of the Apache Hive server to the /etc/hosts file on the system where Cloud Application Business Insights is installed.

Procedure

Complete the following steps to add an Apache Hive data warehouse source by using the HiveQL connector:
1. Open Dashboard Designer.
2. In the navigation pane, go to Connector & Sources > Connector Sources.
3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed.
4. From the Connector Type list, select the HiveQL connector.
5. In the Connector Source Name field, enter a name for the source. Source name can contain alphanumeric characters and underscores.
6. In the Database Connection URL field, enter URL details for the Apache Hive data source that you want to add. For example, jdbc:hive2://server_IP_or_HOSTNAME:port/db_name
   Where:
   • server_IP_or_HOSTNAME is the IP address or host name where the Apache Hive data warehouse is installed.
   • port is the default port for Apache Hive. The default port is 10000.
   • db_name is the database name.

Chapter 1. Administering 23 7. In the Hive Configuration Resources field, enter the path to the XML configuration file that you saved on Cloud Application Business Insights server. For example, /opt/hiveconf/hive-site.xml 8. Optional: In the Database User field, enter a user name that is used during Apache Hive configuration. 9. Optional: In the Password field, enter a password that is used during Apache Hive configuration. 10. Click Next. 11. In the Query field, enter a sample query for Apache Hive. For more information about query formats, see https://docs.hortonworks.com/HDPDocuments/ HDP3/HDP-3.1.0/using-hiveql/content/hive_hive_query_language_basics.html. 12. Click Save. 13. To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message that indicates that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

Results The Apache Hive data source is added and can be used to create custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
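As an illustration, a hypothetical HiveQL source might be configured with the following values. The host, database, table, and column names are placeholders, not product defaults.

Database Connection URL: jdbc:hive2://hiveserver.example.com:10000/sales
Hive Configuration Resources: /opt/hiveconf/hive-site.xml
Query: SELECT region, SUM(amount) AS total_amount FROM orders GROUP BY region LIMIT 100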

Configuring Apache Druid connector sources Use Druid connector to connect to Apache Druid REST-compliant services. Druid connector is based on Apache NiFi-platform and uses REST APIs for data retrieval.

Before you begin • Ensure that the web service that you want to connect is up and running. • To retrieve data from Druid, you need to enter queries in SQL format. Therefore, you must ensure that the druid.sql.enable property is set to true in the common.runtime.properties file in the Druid setup environment.

Procedure Complete the following steps to add Druid REST-compliant services by using the Druid connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed. 4. From the Connector Type list, select Druid connector. 5. In the Connector Source Name field, enter a name for the Druid source. Source name can contain alphanumeric characters and underscores. 6. In the Endpoint URL field, enter URL details for the Druid web service in any one of the following formats: • http://server_IP_or_HOSTNAME:port • https://server_IP_or_HOSTNAME:port Where,

24 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights • server_IP_or_HOSTNAME is the IP address or host name of the server where the web service is hosted. • port is the default port of the web service. 7. From the Authentication Type list, select none or Basic Authentication based on whether user name and password are configured during the installation of the web service. If you select Basic Authentication, then you must complete the following steps: a. In the User Name field, enter a user name that is used during the configuration of the Druid web service. b. In the Password field, enter a password that is used during the configuration of the Druid web service. 8. Click Next. 9. From the Method list, select POST method, and in the URI field, enter the uniform resource identified (URI) for Druid source. For example, /druid/v2/sql/ 10. In the Request Body field, enter an SQL query to fetch data from the Druid REST-compliant services. For more information about the query formats, see http://druid.io/docs/latest/tutorials/tutorial- query.html. 11. Click Save. 12. To make sure that the connection to Druid data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

Results The Druid web service is added and can be used to create your custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
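For example, a sample request against the Druid SQL endpoint might look like the following one. The datasource and column names are illustrative.

Method: POST
URI: /druid/v2/sql/
Request Body:
{
  "query": "SELECT channel, COUNT(*) AS edit_count FROM wikipedia GROUP BY channel LIMIT 10"
}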

Configuring Cassandra database connector sources You can add Cassandra database to Cloud Application Business Insights by using Cassandra connector. Cassandra connector is based on Apache NiFi-platform.

Before you begin Ensure that the database that you want to connect is up and running, and is not Secure Sockets Layer (SSL) secured.

Procedure Complete the following steps to add Cassandra database by using Cassandra connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed. 4. From the Connector Type list, select Cassandra connector. 5. In the Connector Source Name field, enter a name for the source. Source name can contain alphanumeric characters and underscores.

Chapter 1. Administering 25 6. In the Cassandra Contact Points field, enter URL details for Cassandra database that you want to add. For example, server_IP_or_HOSTNAME:port Where: • server_IP_or_HOSTNAME is the IP address or host name where Cassandra is installed. • port is the default port for Cassandra. The default port is 9042. 7. Optional: In the Keyspace field, enter the key space value that is used during Cassandra configuration. A keyspace in Cassandra is a namespace that defines data replication on nodes. A cluster contains one keyspace per node. For more information about the Keyspace, see http:// cassandra.apache.org/doc/4.0/cql/ddl.html. 8. Optional: In the User Name field, enter a user name that is used during Cassandra configuration. 9. Optional: In the Password field, enter a password that is used during Cassandra configuration. 10. Click Next. 11. In the Query field, enter a sample SQL query for the database. For example, select * from sample_table For more information about query formats, see http://cassandra.apache.org/doc/latest/cql/ dml.html#select. 12. Click Save. 13. To make sure that the connection to the database is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

Results The Cassandra data source is added and can be used to create custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
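For example, a hypothetical Cassandra source might use the following values. The contact point, keyspace, and table names are placeholders.

Cassandra Contact Points: cassandra01.example.com:9042
Keyspace: monitoring
Query: select * from metrics limit 50;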

Configuring Elasticsearch connector sources You can add Elasticsearch application to Cloud Application Business Insights by using Elasticsearch connector. Elasticsearch connector is based on Apache NiFi-platform.

Before you begin Ensure that the Elasticsearch application instance that you want to connect is up and running, and is not Secure Sockets Layer (SSL) secured.

Procedure Complete the following steps to add an Elasticsearch application instance by using Elasticsearch connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed.

26 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights 4. From the Connector Type list, select Elasticsearch connector. 5. In the Connector Source Name field, enter a name for the source. Source name can contain alphanumeric characters and underscores. 6. In the Elasticsearch URL field, enter URL details for Elasticsearch instance that you want to add. For example, http://server_IP_or_HOSTNAME:port Where: • server_IP_or_HOSTNAME is the IP address or host name where Elasticsearch application is installed. • port is the default port for Elasticsearch. The default port is 9200. 7. Optional: In the User Name field, enter a user name that is used during Elasticsearch configuration. 8. Optional: In the Password field, enter a password that is used during Elasticsearch configuration. 9. Click Next. 10. In the Index field, enter an index value to fetch data from the Elasticsearch instance. 11. Optional: In the Type field, enter the type value to fetch data from the Elasticsearch instance. 12. In the Query field, enter a sample query for the Elasticsearch instance. For more information about query formats, see https://www.elastic.co/guide/en/elasticsearch/ reference/current/docs-get.html. 13. Optional: In the Fields field, enter the comma separated field values to fetch data from the Elasticsearch instance. 14. Click Save. 15. To make sure that the connection to the Elasticsearch instance is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

Results The Elasticsearch application instance is added and can be used to create custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
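As a rough illustration only, assuming that the Query field accepts a Lucene-style query string (as the Apache NiFi Elasticsearch processors commonly do), a hypothetical source might use the following values. Confirm the exact query syntax for your Elasticsearch version in the linked documentation.

Elasticsearch URL: http://es01.example.com:9200
Index: app-logs
Type: _doc
Query: severity:ERROR AND service:checkout
Fields: timestamp,service,severity,message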

Configuring MongoDB connector sources You can add MongoDB to Cloud Application Business Insights by using Mongo DB connector.

Before you begin Ensure that the database that you want to connect is up and running and is not Secure Sockets Layer (SSL) secured.

Procedure Complete the following steps to add MongoDB databases by using Mongo DB connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed. 4. From the Connector Type list, select Mongo DB connector.

Chapter 1. Administering 27 5. In the Connector Source Name field, enter a name for the source. Source name can contain alphanumeric characters and underscores. 6. In the Mongo URI field, enter URI details for MongoDB that you want to add. For example, mongodb://server_IP_or_HOSTNAME:port Where: • server_IP_or_HOSTNAME is the IP address or host name where MongoDB is installed. • port is the default port for MongoDB. The default port is 27017. 7. In the Mongo Database Name field, enter the name of the MongoDB database. 8. In the Mongo Collection Name field, enter the collection name of the MongoDB database. 9. Click Next. 10. In the Query field, enter a sample query in JSON format for the MongoDB database. For example, {"id":"value"} For more information about query formats, see https://docs.mongodb.com/manual/tutorial/query- embedded-documents/. 11. Click Save. 12. To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

Results MongoDB is added and can be used to create custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
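For example, a hypothetical MongoDB source might use the following values. The host, database, collection, and field names are placeholders, and the query uses the embedded-document syntax that is described in the linked MongoDB documentation.

Mongo URI: mongodb://mongo01.example.com:27017
Mongo Database Name: inventory
Mongo Collection Name: orders
Query: { "status": "shipped", "customer.city": "Pune" }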

Configuring Nagios XI connector sources Use Nagios XI connector to connect to Nagios XI REST-compliant services. Nagios XI connector is based on Apache NiFi-platform and uses REST APIs for data retrieval.

Before you begin • Ensure that the web service that you want to connect is up and running. • For the sample request, if you need to provide custom headers, then ensure that you have that information available with you. Custom headers are request headers that are needed to retrieve REST API information from a web service. For example, Authorization request headers or Accept-Charset request headers. Authorization request headers contain authorization information that is required by the REST services. Accept-Charset indicates the acceptable data characters that the response must contain.

Procedure Complete the following steps to add Nagios XI REST-compliant services by using Nagios XI connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed.

28 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights 4. From the Connector Type list, select Nagios XI connector. 5. In the Connector Source Name field, enter a name for the Nagios XI source. Source name can contain alphanumeric characters and underscores. 6. In the Endpoint URL field, enter URL details for the Nagios XI web service in the following format: https://server_IP_or_HOSTNAME/nagiosxi/api/v1 Where, server_IP_or_HOSTNAME is the IP address or host name of the server where the Nagios XI web service is hosted. 7. From the Authentication Type list, select none or Basic Authentication based on whether user name and password are configured during the installation of the web service. If you select Basic Authentication, then you must complete the following steps: a. In the User Name field, enter a user name that is used during the configuration of the web service. b. In the Password field, enter a password that is used during the configuration of the web service. 8. Click Next. 9. From the Method list, select GET method, and in the URI field, enter the uniform resource identified (URI) for Nagios XI source. For example, URI?apikey=API_KEY&pretty=1 Where: • URI is the resource URI. • API_KEY is the security token to access Nagios XI REST APIs. For more information about URL formats, see https://support.nagios.com/kb/article.php?id=176. 10. Click Save. 11. To make sure that the connection to Nagios XI data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

Results The Nagios XI web service is added and can be used to create your custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
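For example, a sample request that retrieves host status objects might look like the following one. Replace API_KEY with the key of a Nagios XI user that is authorized for API access; the host name is a placeholder.

Endpoint URL: https://nagios.example.com/nagiosxi/api/v1
Method: GET
URI: /objects/hoststatus?apikey=API_KEY&pretty=1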

Configuring Prometheus connector sources Prometheus is an application that records time series data and that resides in a cloud environment. The Prometheus connector connects to the REST API services of the Prometheus application. The Prometheus connector is based on Apache NiFi-platform and uses REST APIs for data retrieval.

Before you begin • Ensure that the Prometheus system that you want to connect is up and running.

• To access Prometheus REST API services from IBM Cloud Private, you need an authentication token that is generated dynamically by using a custom filter. For more information, see the Creating a custom filter for Prometheus token-based authentication topic in Using IBM Cloud Application Business Insights.

Chapter 1. Administering 29 • For the sample request, if you need to provide custom headers, then ensure that you have that information available with you. Custom headers are request headers that are needed to retrieve REST API information from a Prometheus system. For example, Authorization request headers or Accept- Charset request headers. Authorization request headers contain authorization information that is required by the Prometheus system. Accept-Charset indicates the acceptable data characters that the response must contain.

Procedure Complete the following steps to add a Prometheus REST API service by using the Prometheus connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed. 4. From the Connector Type list, select Prometheus connector. 5. In the Connector Source Name field, enter a name for the Prometheus source. Source name can contain alphanumeric characters and underscores. 6. In the Endpoint URL field, enter URL details for the Prometheus system in the following format: • http://server_IP_or_HOSTNAME:port • https://server_IP_or_HOSTNAME:port Where, • server_IP_or_HOSTNAME is the IP address or host name of the server where the Prometheus application is installed. • port is the default port for Prometheus. The default port is 9090. 7. From the Authentication Type list, select none or Basic Authentication based on whether user name and password are configured during the installation of Prometheus. Note: On IBM Cloud Private setup, you must select none. If you select Basic Authentication, then you must complete the following steps: a. In the User Name field, enter a user name that is used during the configuration of the Prometheus system. b. In the Password field, enter a password that is used during the configuration of the Prometheus system. 8. Click Next. 9. From the Method list, select GET method, and in the URI field, enter the uniform resource identified (URI) for Prometheus source. The URI must contain the parameter name and the start and end time in Universal Time Coordinated (UTC) format. For example, /api/v1/query_range? query=process_cpu_seconds_total&start=2019-07-04T08:00:19.699Z&end=2019-07- 04T18:30:19.699Z&step=5m You can specify instant queries, range queries, or alert queries only. For more information about URI formats, see https://prometheus.io/docs/prometheus/2.10/ querying/api/. 10. Optional: Under Custom Headers, complete the following steps:

30 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights • In the Name field, enter the request header name that is provided by the Prometheus REST API service provider. • In the Value field, enter the request header value that is provided by the Prometheus REST API service provider. Note: On IBM Cloud Private setup, in the Name field, enter Authorization and in the Value field, enter the authentication token that is generated by the following URL: https://server_IP_or_HOSTNAME:8443/idprovider/v1/auth/identitytoken Where: • server_IP_or_HOSTNAME is the IP address or host name of the server where the Cloud Application Business Insights is installed on IBM Cloud Private. • port is the default port of the IBM Cloud Private master node. The default port number is 8443. 11. Click Save. 12. To make sure that the connection to Prometheus data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: • Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request. • On IBM Cloud Private setup, if you do not enter any authentication token in the Custom Header field, then the test connection fails.

Results The Prometheus REST API service is added and can be used to create your custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
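On IBM Cloud Private, the authentication token can be requested from the identity provider endpoint that is mentioned in step 10. The following curl command is an indicative sketch only; verify the exact form parameters and which token field to use against your IBM Cloud Private documentation. ICP_USER and ICP_PASSWORD are placeholders.

curl -k -X POST \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=password&username=ICP_USER&password=ICP_PASSWORD&scope=openid" \
  https://server_IP_or_HOSTNAME:8443/idprovider/v1/auth/identitytoken

Use the returned token as the value of the Authorization custom header; depending on your configuration, the value might need the Bearer prefix.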

Configuring SolarWinds connector sources Use SolarWinds connector to connect to SolarWinds REST-compliant services. SolarWinds connector is based on Apache NiFi-platform and uses REST API for data retrieval.

Before you begin • Ensure that the SolarWinds system that you want to connect is up and running. • For the sample request, if you need to provide custom headers, then ensure that you have that information available with you. Custom headers are request headers that are needed to retrieve REST API information from a SolarWinds system. For example, Authorization request headers or Accept- Charset request headers. Authorization request headers contain authorization information that is required by the SolarWinds system. Accept-Charset indicates the acceptable data characters that the response must contain.

Procedure Complete the following steps to add SolarWinds XI REST-compliant services by using SolarWinds connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed. 4. From the Connector Type list, select SolarWinds connector.

Chapter 1. Administering 31 5. In the Connector Source Name field, enter a name for the SolarWinds source. Source name can contain alphanumeric characters and underscores. 6. In the Endpoint URL field, enter URL details for the SolarWinds system in the following format: https://server_IP_or_HOSTNAME:port Where: • server_IP_or_HOSTNAME is the IP address or host name where SolarWinds application is installed. • port is the default port for SolarWinds. The default port is 17778. 7. From the Authentication Type list, select none or Basic Authentication based on whether user name and password are configured during the installation of the REST service. If you select Basic Authentication, then you must complete the following steps: a. In the User Name field, enter a user name that is used during the configuration of the REST service. b. In the Password field, enter a password that is used during the configuration of the REST service. 8. Click Next. 9. From the Method list, select a method and complete any of the following steps based on the method that you select: • – For GET method, in the URI field, enter the uniform resource identified (URI) for the source. – For POST method, in the URI field, enter the URI for the source, and in the Request Body, enter the post request. – Under Custom Headers, complete the following steps: - In the Name field, enter the request header name that is provided by the REST API service provider. - In the Value field, enter the request header value that is provided by the REST API service provider. For example, • URI for GET method: /SolarWinds/InformationService/v3/Json/Query?query=QUERY • URI POST method: /SolarWinds/InformationService/v3/Json/Query Request body for POST method:

{ "query":QUERY_WITH_PARAM", "parameters":{ "PARAM_NAME":"PARAM_VALUE" } }

Following are the variables that are used in the examples: • QUERY is the query in SolarWinds Query Language (SWQL) format. • QUERY_WITH_PARAM is the SWQL query with parameters. • PARAM_NAME is the parameter name that is used in the QUERY_WITH_PARAM query. • PARAM_VALUE is the parameter value for the parameter that is used in the QUERY_WITH_PARAM query. For more information about URL formats, see https://github.com/solarwinds/OrionSDK/wiki/REST. 10. Click Save.

32 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights 11. To make sure that the connection to SolarWinds data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

Results The SolarWinds REST service is added and can be used to create your custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
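For example, hypothetical SWQL requests against the Orion.Nodes entity might look like the following ones. The entity and column names are illustrative and depend on the SolarWinds modules that are installed in your environment.

URI for GET method:
/SolarWinds/InformationService/v3/Json/Query?query=SELECT TOP 10 NodeID, Caption, Status FROM Orion.Nodes

Request body for POST method:
{
  "query": "SELECT NodeID, Caption FROM Orion.Nodes WHERE Status=@status",
  "parameters": { "status": 1 }
}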

Generic Connectors Use generic connectors to connect to multiple supported data sources. List of generic Connectors:

Table 8. Generic Connectors Connector name Description jdbc Connects to any relational database that supports the JDBC interface. The connector is integrated with Apache NiFi platform. The display name of the connector is JDBC. For more information about the supported third-party data sources, see Supported third-party database drivers for JDBC Connector topic in Installing IBM Cloud Application Business Insights.

restapi Connects to any generic REST-compliant web service. The connector is integrated with Apache NiFi platform and uses REST APIs for data retrieval. The display name of the connector is REST API.

realtime Connects to a Kafka server that publishes real-time data for applications. The connector subscribes to a specific topic that is published on the Kafka server. This connector is not integrated with Apache NiFi platform. The display name of the connector is Real Time.

Note: You can edit only the display name of the connector types.

Configuring third-party databases You can add third-party databases such as Db2, Oracle, PostgreSQL, MySQL, MSSQL, H2, and Derby by using JDBC connector. These databases support JDBC interface.

Before you begin • Ensure that the third-party database environments are up and running. • See the supported database and driver versions.

Chapter 1. Administering 33 For more information, see Supported third-party database drivers for JDBC Connector topic in Installing IBM Cloud Application Business Insights. • Download the related third-party database drivers. For more information, see “Downloading the database driver JAR files” on page 35.

Procedure Complete the following steps to add third-party databases by using JDBC connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed. 4. From the Connector Type list, select JDBC connector. 5. In the Connector Source Name field, enter a name of the third-party database. Source name can contain alphanumeric characters and underscores. 6. In the Database Connection URL field, enter URL for the third-party databases in the following format:

Table 9. Third-party database URLs

Database name: URL format
MySQL: jdbc:mysql://server_IP_or_HOSTNAME:port/database_name
Oracle: jdbc:oracle:thin:@server_IP_or_HOSTNAME:port:database_name
Db2: jdbc:db2://server_IP_or_HOSTNAME:port/database_name
MSSQL: jdbc:sqlserver://server_IP_or_HOSTNAME:port;databaseName=database_name
Derby: jdbc:derby://server_IP_or_HOSTNAME:port/database_name
H2: jdbc:h2:tcp://server_IP_or_HOSTNAME:port/database_name
PostgreSQL: jdbc:postgresql://server_IP_or_HOSTNAME:port/database_name

Following are the variables that are used in the table: • server_IP_or_HOSTNAME is the IP address or host name of the third-party database. • port is the default port for the third-party database. • database_name is the name of the third-party database. 7. In the Database Driver Class Name field, enter driver class name for the third-party database in the following format:

Table 10. Third-party database driver class names

Database name: Driver class name
MySQL: com.mysql.jdbc.Driver
Oracle: oracle.jdbc.driver.OracleDriver
Db2: com.ibm.db2.jcc.DB2Driver
MSSQL: com.microsoft.sqlserver.jdbc.SQLServerDriver
Derby: org.apache.derby.jdbc.ClientDriver

34 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights Table 10. Third-party database driver class names (continued) Database name Driver class name H2 org.h2.Driver PostgreSQL org.postgresql.Driver 8. In the Database Driver Location field, enter the location of the database driver JAR file. For example, install_dir/connector/jdbcdrivers/driver.jar Where install_dir is Cloud Application Business Insights installation location. By default, it is /opt/ icabi. Note: Save the JAR files at the specified location. If you change this location, then the files cannot be backed up or restored by using the default backup and restore scripts that are provided along with Cloud Application Business Insights. See “Application backup and restore” on page 46. 9. In the Database User field, enter user name for the third-party database provider. 10. In the Password field, enter password for the third-party database provider. Note: Even if the third-party database administrators change passwords of the databases, the already active connections remain active and unchanged, unless the connections are deleted and added again. 11. Click Next. 12. In the Query field, enter a sample query for the third-party database. 13. Click Save. 14. To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: • Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request. • If you enter a sample query that retrieves larger number of records, then even if the data source is configured correctly, Test Connection displays an error message which indicates that the data source is not connected. To resolve this issue, you must enter a sample query which returns limited number of rows and is faster in execution.

Results The third-party database is added and it can be used to create data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
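For example, a hypothetical Db2 source might be configured with the following values. The host, database, schema, user, and driver JAR file names are placeholders, and the sample query deliberately limits the number of returned rows, as recommended in the note for Test Connection.

Database Connection URL: jdbc:db2://db2host.example.com:50000/SAMPLE
Database Driver Class Name: com.ibm.db2.jcc.DB2Driver
Database Driver Location: /opt/icabi/connector/jdbcdrivers/db2jcc4.jar
Database User: dbuser
Query: SELECT * FROM SALES.ORDERS FETCH FIRST 10 ROWS ONLY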

Downloading the database driver JAR files If you want to use a database other than the PostgreSQL database, you must download the related third-party database driver.

Procedure 1. Download the specific driver JAR file for the third-party database from the respective site. For example, • For the Oracle database, download ojdbc.jar from the following site: http://www.oracle.com/technetwork/database/features/jdbc/index-091264.html

Chapter 1. Administering 35 • For IBM Db2® specific drivers, download from the following site: http://www-01.ibm.com/support/docview.wss?uid=swg21363866 • For PostgreSQL database, download postgresql-version.jar from the following site: https://jdbc.postgresql.org/download.html • For Derby database, download the db-derby-version.tar.gz file from the following site: https://db.apache.org/derby/derby_downloads.html • For Microsoft JDBC Driver for SQL Server, download mssql-jdbc-version.jre8.jar from the following site: https://www.microsoft.com/en-us/download/details.aspx?id=56615 • For the H2 database, download h2-version.jar from the following site: http://www.h2database.com/html/download.html Where version is the version of the driver that you download. • PostgreSQL is the default database. Therefore, you don’t need to download the driver JAR file for the PostgreSQL database. It’s available at the following location: install_dir/prddb Where: install_dir is where Cloud Application Business Insights is installed.

2. Copy the downloaded JAR file in the install_dir/connector/jdbcdrivers/ folder. Where install_dir is Cloud Application Business Insights installation location. By default, it is /opt/ icabi. Note: You must save the JAR files at the mentioned location. If you change this location, then the files cannot be backed up or restored by using the default backup and restore scripts that are provided along with Cloud Application Business Insights.
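For example, for a PostgreSQL driver that is downloaded to /tmp, the copy command might look like the following one. The driver file name and version are illustrative.

cp /tmp/postgresql-42.2.5.jar /opt/icabi/connector/jdbcdrivers/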

Configuring REST API connector sources Use REST API connector to connect to all other generic REST-compliant services.

Before you begin • Ensure that the REST API service that you want to connect is up and running. • Ensure that the REST API service provides response in JSON array format. If the JSON response is in a format that is not supported by Cloud Application Business Insights, then data is not displayed on dashboards. Sample JSON array format

[ { "name": "John", "age": 30 }, { "name": "Greg", "age": 35 } ]

If the response is not in the required format, then you must add response transformations for such service by using Apache NiFi user interface. For more information, see “Adding response transformation for REST API connector sources” on page 38. • For the sample request, if you need to provide custom headers, then ensure that you have that information available with you. Custom headers are request headers that are needed to retrieve REST API information from a web service. For example, Authorization request headers or Accept-Charset request headers. Authorization request headers contain authorization information that is required by

36 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights the web services. Accept-Charset indicates the acceptable data characters that the response must contain.

Procedure Complete the following steps to add REST-compliant web services by using the REST API connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed. 4. From the Connector Type list, select REST API connector. 5. In the Connector Source Name field, enter a name for the source. Source name can contain alphanumeric characters and underscores. 6. In the Endpoint URL field, enter URL details for the web service that you want to add in the following format: https://server_IP_or_HOSTNAME:port Where: • server_IP_or_HOSTNAME is the IP address or host name of the server where the web service is hosted. • port is the default port of the web service. If the port number is not provided in the endpoint URL, then you can use 443 as the default port number. 7. From the Authentication Type list, select none or Basic Authentication based on whether user name and password are configured during the installation of the web service. If you select Basic Authentication, then you must complete the following steps: a. In the User Name field, enter a user name that is used during the configuration of the web service. b. In the Password field, enter a password that is used during the configuration of the web service. 8. Click Next. 9. From the Method list, select a method and complete any of the following steps based on the method that you select: – For GET method, in the URI field, enter the uniform resource identifier (URI) for the source. – For POST method, in the URI field, enter the URI for the source, and in the Request Body field, enter the post request. – Under Custom Headers, complete the following steps: - In the Name field, enter the request header name that is provided by the REST API web service provider. - In the Value field, enter the request header value that is provided by the REST API web service provider. For example, URI Where, URI is the resource URI. 10. Click Save. 11. To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

Chapter 1. Administering 37 Results The REST API service is added and can be used to create your custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
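For example, a hypothetical REST API source that returns open incidents might use the following values. The endpoint, URI, and token are placeholders for your own web service.

Endpoint URL: https://api.example.com:443
Method: GET
URI: /v1/incidents?status=open
Custom Header Name: Authorization
Custom Header Value: Bearer API_TOKEN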

Adding response transformation for REST API connector sources If the JSON response that is provided by a web service is in a format that is not supported by Cloud Application Business Insights, then data is not displayed on dashboards. You must add response transformations for such web service by using Apache NiFi user interface.

Procedure Complete the following steps to add response transformations for web services: 1. Log in to Apache NiFi user interface by using the following URL and user credentials: https://server_ip:8081/nifi Where, server_ip is the host name or IP address of the server where Cloud Application Business Insights is installed. User ID: adminDEFAULT Password: pswdDEFAULT A page that shows process groups, PRD Connector Data Service and PRD Connector Config Service is displayed. 2. Click PRD Connector Data Service, zoom in to the process flow, and click the DEFAULT process group. 3. Zoom in the DEFAULT process group, and click the REST API connector’s source that must be edited for response transformation. 4. Zoom in to highlight invokeHTTP processor, and add a response transformation between response relation of invokeHTTP processor and success port. For more information about adding response transformation, see https://nifi.apache.org/developer- guide.html You can also contact IBM support professional services to add such response transformations.

Example As an example, following are the steps to add response transformation for Tivoli Netcool/OMNIbus REST API by using JoltTransformJSON processor. The JoltTransformJSON processor converts the JSON response that is received from Tivoli Netcool/OMNIbus to a JSON format that is required by Cloud Application Business Insights. 1. Log in to Apache NiFi user interface. A page that shows PRD Connector Data Service and PRD Connector Config Service is displayed. 2. Click PRD Connector Data Service, zoom in to the process flow, and click the DEFAULT process group. 3. Zoom in the DEFAULT process group, and click the Tivoli Netcool/OMNIbus REST API source that must be edited for response transformation. 4. Zoom in to highlight invokeHTTP processor.

5. From the toolbar, drag the Processor icon in between invokeHTTP processor and success port. An Add Processor window opens. 6. In the Add Processor window, from the Type column, select JoltTransformJSON, and then click ADD.

38 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights The JoltTransformJSON processor is added to the process flow. 7. Right-click JoltTransformJSON > Configure. 8. In the Configure Processor window, click the PROPERTIES tab, and in the Jolt Specification window, enter the transformation specification that must be used for Tivoli Netcool/OMNIbus. For example,

[{ "operation": "shift", "spec": { "rowset": { "rows": { "*": "rows[]" } } } }, { "operation": "shift", "spec": { "rows": "" } }]

9. Click OK, and APPLY. 10. To stop all the process flows, in the Operate palette, click the stop icon. 11. Connect invokeHTTP processor, JoltTransformJSON , and success port. Select the required relationships from Create Connection window. 12. To start all the process flows, in the Operate palette, click the start icon. Related information NiFi - System requirements

Configuring the Real Time connector sources Use Real Time connector to connect to a Kafka server that publishes real-time data for applications. The connector uses a Real Time Data Collector that subscribes to a specific topic on the Kafka server and renders that data to Cloud Application Business Insights.

Before you begin • Ensure that Real Time Data Collector is installed and is up and running. Usually, it is installed along with Cloud Application Business Insights. However, if you installed Real Time Data Collector on a separate server, then ensure that it is up and running, and connected to Cloud Application Business Insights. To establish a connection, install the Real Time Data Collector security certificate on the computer where you are accessing Cloud Application Business Insights. For more information, see the Restarting Real Time Data Collector topic in Installing IBM Cloud Application Business Insights. • Ensure that the Kafka server is up and running and data is generated for the Kafka topics that you subscribed to and want to display on the dashboard. Authentication must not be enabled on the Kafka server. • Ensure that the Kafka topic renders data at a frequency of one data point per second. If the data is rendered at a higher frequency, then the real-time widgets or dashboards do not display any data. • Ensure that the JSON response that is received from the Kafka server is in JSON array format. If the JSON response is in a format that is not supported by Cloud Application Business Insights, then data is not displayed on dashboards. Sample JSON array format

[ { "name": "John", "age": 30 },

Chapter 1. Administering 39 { "name": "Greg", "age": 35 } ]

Procedure Complete the following steps to add real-time data sources by using a Real Time connector: 1. Open Dashboard Designer. 2. In the navigation pane, go to Connector & Sources > Connector Sources. 3. In the Connector Sources tab, click Add Source. An Add Connector Source window is displayed. 4. From the Connector Type list, select Real Time connector. 5. In the Connector Source Name field, enter a name for the source. Source name can contain alphanumeric characters and underscores. 6. In the Real Time Data Collector Server IP or Host Name field, enter the IP address or host name of the server where Real Time Data Collector is installed. 7. In the Port Number field, enter the port number of the Real Time Data Collector. The default port is 7443. 8. Click Save.

Results The real-time data source is added and can be used to create your custom data definitions. For more information, see Creating a custom data definition topic in Using IBM Cloud Application Business Insights.
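To verify that data is flowing before you build a dashboard, you can publish a test message to the subscribed topic with the standard Kafka console producer. The broker address and topic name are placeholders, and the --broker-list option is named --bootstrap-server in newer Kafka releases.

bin/kafka-console-producer.sh --broker-list kafka01.example.com:9092 --topic app_metrics
> [{"name":"cpu_usage","value":42}]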

Managing connector sources You can view, search, modify, and delete the connector sources.

Procedure • To view, search, or delete connector sources, complete the following steps: a) In the left navigation pane of Dashboard Designer, click Connector & Sources > Connector Sources. The All Connector Sources page opens in a new tab. The page displays all the already created data sources. b) Complete any of the following steps: – To find all the data sources that belong to a connector, from the View list, select a connector type. – To find a connector source, enter the name of the source in the Search field.

– To delete a connector source, click the Delete icon that is displayed on that source row. • To modify APM, ITM, or Cloud App Management connector source, complete the following steps:

a) Under the Actions column for a source, click icon. b) In the Edit Connector Source window, complete any of the following steps based on the connector source: – For APM and ITM data sources, in the Username and Password field, edit the user name and password for the source.

40 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights – For Cloud App Management data sources, in the Tenant ID field, edit the tenant ID for the source. c) To make sure that the connection to edited data source is active and successful, click Test Connection. If the connection to the data source fails, then a message that indicates the reason for connection failure is displayed. You must fix the errors and test the connection again. d) Click Save. • To modify the third-party JDBC connector sources that use JDBC connector, complete the following steps:

a) Under the Actions column for a third-party connector source, click icon. An Edit Connector Source window opens. b) Edit any of the following fields: – In the Database Driver Class Name field, enter the driver class name for the third-party data source. – In the Database Driver Location field, enter the location of the directory where the JAR file of the database is placed. For example, install_dir/connector/jdbcdrivers/ jar_name.jar folder. Where install_dir is Cloud Application Business Insights installation location. By default, it is /opt/icabi. – In the Database User field, enter user name for the third-party database provider. – In the Password field, enter password for the third-party database provider. Note: Even if the third-party database administrators change passwords of the databases, the already active connections remain active and unchanged, unless the connections are deleted and added again. c) Click Next. d) In the Query field, enter or edit the sample query for the third-party database. e) Click Save. f) To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request. • To modify REST API, Nagios XI, QRadar, SolarWinds, Druid, Prometheus, or Tivoli Netcool/OMNIbus services, complete the following steps, based on the connector type:

a) Under the Actions column for any REST API services or Tivoli Netcool/OMNIbus service, click icon. An Edit Connector Source window opens. b) From the Authentication Type list, select none or Basic Authentication based on whether user name and password are configured during the installation of the service. If you select Basic Authentication, then you must complete the following steps: a. In the User Name field, enter a user name to connect to the web service. b. In the Password field, enter a password to connect to the web service. c) Click Next, and complete the following steps to modify the sample request.

Chapter 1. Administering 41 d) From the Method list, select a method and complete any of the following steps based on the method that you select: – For GET method, in the URI field, enter the uniform resource identified (URI) for the source. Note: For Nagios XI, Prometheus, and QRadar connectors, you can select GET method only. – For POST method, in the URI field, enter the URI for the source, and in the Request Body method, enter the post request. Note: For Druid connector, you can select POST method only. – Under Custom Headers, complete the following steps: - In the Name field, enter the request header name that is provided by the REST API web service provider. - In the Value field, enter the request header value that is provided by the REST API web service provider. e) Click Save. f) To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request. • To modify a Cassandra data source, complete the following steps:

a) Under the Actions column for a Cassandra database, click icon. An Edit Connector Source window opens. b) Optional: In the Keyspace field, enter a Keyspace value. Note: If you enter an incorrect Keyspace value the first time, then Cloud Application Business Insights does not display an error message. However, QueryCassandra processor for that Cassandra source displays the following error message on its bulletin on Apache NiFi user interface: Failed to properly initialize Processor. If still scheduled to run, NiFi will attempt to initialize and run the Processor again after the 'Administrative Yield Duration' has elapsed. For more information about resolving the error, see PRDCONNCS018E error displayed on updating Keyspace value for Cassandra data source. c) Optional: In the User Name field, enter a user name to connect to the Cassandra database. d) Optional: In the Password field, enter a password to connect to the Cassandra database. Note: If you enter a user name, then you must ensure that you also enter a password. You can either leave both the fields blank or enter information in both the fields. e) Click Next. f) In the Query field, enter or edit the sample query for the Cassandra database. g) Click Save. h) To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request.

42 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights • To modify a MongoDB data source, complete the following steps:

a) Under the Actions column for a MongoDB database, click icon. An Edit Connector Source window opens. b) Optional: In the Mongo Database Name field, enter the name of the MongoDB database. c) In the Mongo Collection Name field, enter the collection name of the MongoDB database. d) Click Next. e) In the Query field, enter or edit the sample query for the MongoDB database. f) Click Save. g) To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request. • To modify an Elasticsearch data source, complete the following steps:

a) Under the Actions column for an Elasticsearch data source, click icon. An Edit Connector Source window opens. b) Optional: In the User Name field, enter a user name that is used during Elasticsearch instance configuration. c) Optional: In the Password field, enter a password that is used during Elasticsearch instance configuration. d) Click Next. e) In the Index field, enter an index value to fetch data from the Elasticsearch instance. f) Optional: In the Type field, enter the type value to fetch data from the Elasticsearch instance. g) In the Query field, enter a sample query for the Elasticsearch instance. h) Optional: In the Fields field, enter the field values to fetch data from the Elasticsearch instance. i) Click Save. j) To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request. • To modify an Apache Hive data source, complete the following steps:

a) Under the Actions column for an Apache Hive data source, click icon. An Edit Connector Source window opens. b) If you changed the location of the XML file, then in the Hive Configuration Resources field, enter the modified path to the XML configuration file. For more information, see “Configuring Apache Hive connector sources” on page 23. c) Optional: In the Database User field, enter a user name that is used during Apache Hive configuration. d) Optional: In the Password field, enter a password that is used during Apache Hive configuration.

Chapter 1. Administering 43 e) Click Next. f) In the Query field, enter a sample query for Apache Hive data source. g) Click Save. h) To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request. • To modify an HBase data source, complete the following steps:

a) Under the Actions column for an HBase data source, click icon. An Edit Connector Source window opens. b) If you changed the location of the XML configuration files, then in the Hadoop Configuration Files field, enter the modified path to the XML configuration files. For more information, see “Configuring Apache HBase database connector sources” on page 21. c) In the Table Name field, enter any other database table name from which you want to retrieve data. d) Optional: In the Start Rowkey field, enter any other row key value from which you want to retrieve data. e) Optional: In the End Rowkey field, enter any other row key value up to which you want to retrieve the data. Note: Data is retrieved for all the row key values that lie between the start and the end row keys, including the start row key data. The end row key data is not retrieved. f) Optional: In the Columns field, enter column family names or column names for which you want to retrieve data. Note: Enter values either in the Columns field or in the Filter Expression field. For more information about the End Rowkey and Columns fields, see “Configuring Apache HBase database connector sources” on page 21. g) Optional: In the Filter Expressions field, enter any of the HBase filter expressions. For more information, see https://www.cloudera.com/documentation/enterprise/5-5-x/topics/ admin_hbase_filtering.html h) Click Save. i) To make sure that the connection to the data source is successful, click Test Connection. If the source details and the sample request are valid and complete, then a message indicating that the connection is successful is displayed. For connection failure messages, you must fix the errors that are mentioned in the messages and test the connection again. Note: Even if the source details are valid, a connection failure message might be displayed due to an invalid response received for the sample request. • To modify a real-time data source, complete the following steps:

a) Under the Actions column for a real-time data source, click icon. An Edit Connector Source window opens. b) Optional: If you changed the Cloud Application Business Insights installation server or if you installed the Real Time Data Collector on another server, then complete the following steps:

44 IBM Cloud Application Business Insights: Administering Cloud Application Business Insights – In the Real Time Data Collector Server IP or Host Name field, enter the IP address or host name of the server where Real Time Data Collector is installed. – In the Port Number field, enter the port number for the Real Time Data Collector. c) Click Save.

Configuring the scheduler Create the scheduled reports at a specified time and date as one-time tasks or recurring tasks. You can also email the reports to yourself or anyone else in your organization. You can configure and administer the scheduler as needed.

Configuring the scheduler email Use this information for configuring the email settings that can be used for sending the scheduled dashboards from Engine.

Procedure Update the bootstrap.properties file with the required information. • Open the bootstrap.properties file from the following location:

cd $install_dir/wlp/usr/servers/icabi
vi bootstrap.properties

Where: install_dir is the directory where you installed the application. Update the following section:

#Add SMTP relay
email.smtp.relay=

If required, update the value of prdEmailFrom parameter in following section:

#scheduler mail configuration
[email protected]

• Save the file. • Restart the Cloud Application Business Insights server.
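For example, a populated configuration might look like the following lines. The relay host and sender address are placeholders for your own SMTP environment.

#Add SMTP relay
email.smtp.relay=smtp.example.com

#scheduler mail configuration
[email protected]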

Updating the scheduler user login details Run the update_scheduler_user_login_template.sh script to update the user login details for creating the scheduled reports.

About this task The update_scheduler_user_login_template.sh script must be run in the following scenarios: • When the Cloud Application Business Insights application is configured on a custom port. For more information, see the Changing default port numbers section in Installing IBM Cloud Application Business Insights. • When you want to change the context root to a non-default one. • When you must reset your user name and password according to your organization's password policy settings.

Procedure 1. Run the update_scheduler_user_login_template.sh file as follows:

• For the on-premises installation package:

cd $install_dir/prdutil
./update_scheduler_user_login_template.sh

Where: install_dir is the directory where you installed the application.
• For the IBM Cloud Private installation package:

cd $install_dir/postinstallation/prdutil
./update_scheduler_user_login_template.sh

Where: install_dir is the directory where you extracted the UI TAR file. When prompted, provide the following details:

Enter 1 to update scheduler user credentials.
Enter 2 to update user name of the scheduler user.
Enter 3 to update password of the scheduler user.
Enter any other key to get information about the scheduler user.
Enter your choice:

Where:
• User name is the user name to authenticate to the Engine.
• Password is the password for the user. It is saved in encrypted format in the database.
Important: Update the user name and password in the basic-registry.xml file for an on-premises installation and in the user-registry.yaml file for an IBM Cloud Private installation.
2. If you updated the scheduler user's user name, restart the Cloud Application Business Insights server.

Application backup and restore This section provides information about the essential features to back up and restore your dashboard data and connector data that is stored in the default PostgreSQL database and connector folder. The database and connector folders are bundled with the Cloud Application Business Insights application.

Performing data backup To prevent data loss if there’s a service disruption or data corruption, you must back up the data at regular, predefined time intervals. You can trigger a manual backup by using the backup.sh script, or you can schedule a backup by setting up a cron job, as shown in the example after the procedure.

Before you begin Ensure that the third-party database driver JAR files are placed at install_dir/connector/jdbcdrivers, where install_dir is the Cloud Application Business Insights installation location. By default, it is /opt/icabi. Note: If the JAR files are not available at the mentioned location, then the files are not backed up or restored by using the default backup and restore scripts that are provided along with Cloud Application Business Insights.
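For example, if your connectors use an Oracle database, you might place its JDBC driver in the expected folder before you run the backup; the driver file name and source path in this sketch are illustrative:

cp /tmp/ojdbc8.jar /opt/icabi/connector/jdbcdrivers/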

About this task Data backup in Cloud Application Business Insights is done by using the backup.sh script that is available in the install_dir folder.

Where: install_dir is the directory where you installed the application. By default, it is /opt/icabi.

Procedure 1. Go to the directory where you installed Cloud Application Business Insights. 2. Run the backup script by using the following command:

./backup.sh /backup-dir

Where backup-dir is the folder where the ICABI_Backup folder must be created. An ICABI_Backup folder is created in the backup-dir directory that contains the ICABI_Backup_<hostname>_<tenant>_<date>_<time>.tar file. This TAR file contains the database, connector, and Cloud Application Business Insights application backup.
Where:
• <hostname> is the Cloud Application Business Insights host name.
• <tenant> is the tenant name. For example, default.
• <date> is the day of the year in year-month-day format.
• <time> is the time of the day in hour-minute-second format.
Note: During the backup operation, the Dashboard Designer tool is unresponsive. You can’t modify and save the existing dashboards or create new ones until the operation is complete.
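Because backup.sh takes the backup directory as its argument, you can schedule it with a cron job. The following crontab entry is a sketch that runs a nightly backup at 01:30; the installation path, backup directory, and log file are illustrative:

# Run the Cloud Application Business Insights backup every day at 01:30
30 1 * * * /opt/icabi/backup.sh /backup >> /var/log/icabi_backup.log 2>&1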

Results The backup.sh script archives the following files along with dashboard data and connector data in the database:
• Some configuration files from the following folders:
– install_dir/wlp/usr/servers/icabi
– install_dir/wlp/usr/servers/icabiap
– install_dir/icabi_config
• Some properties files from the install_dir/wlp/usr/servers/icabi folder:
– config
– config/.properties
– config/logback.properties
– config/oed_connector_core.properties
– bootstrap.properties
– jvm.options
– lib
– lib/RestFullConf.jar
– lib/postgresql-42.2.5.jar
– server.env
• Some key files from the install_dir/wlp/usr/servers/icabiap folder:
– basic-registry.xml
– server.env
– bootstrap.properties
– config
– resources/
– resources/AES_Encryptor.sh
– resources/security/

– resources/security/key.jks
– resources/security/ltpa.keys
– config/auth_app.properties
– jvm.options
The backup script also archives the following folders and files from the install_dir/connector folder:
• From the apacheds-2.0.0-M24 folder, the organization folder that is placed at the following location is archived: /instances/default/partitions/organization
• The jdbcdrivers folder
• From the nifi-1.8.0 folder, the following folders are archived:
– bin
– conf
– lib
– template
– docs
– conf - only the bootstrap.conf file is archived.
• From the prdrtdc folder, the prdrtdc.properties file is archived.
• From the prdutil folder, the config.ini and importUser files are archived.
Note: If you added any other files, then you must ensure that you back up and restore those files separately.
Related tasks
“Downloading the database driver JAR files” on page 35
If you want to use a database other than the PostgreSQL database, you must download the related third-party database driver.

Restoring the backup data After you install Cloud Application Business Insights, if there’s a service disruption or data corruption, then you can use this information to restore a full backup from a specified location.

Before you begin • Ensure that the backup-dir contains the compressed backup folder, ICABI_Backup. • Make sure your Cloud Application Business Insights application is up and running.

About this task The backup contents are restored in Cloud Application Business Insights by using the restore.sh script that is available in the install_dir folder. Where: install_dir is the directory where you installed the application. By default, it is /opt/icabi. The restore.sh script restores the backup data.

Procedure 1. Go to the directory where you installed Cloud Application Business Insights. 2. Run the restore script by using the following command:

./restore.sh <backup-dir>/ICABI_Backup_<hostname>_<tenant>_<date>_<time>.tar

Where:

• <backup-dir> is the path to the directory that contains the backup TAR file.
• <hostname> is the Cloud Application Business Insights host name.
• <tenant> is the tenant name. For example, default.
• <date> is the day of the year in year-month-day format.
• <time> is the time of the day in hour-minute-second format.
Important:
• The install_dir must be the same for backup and restore operations. For example, if the install_dir on the server from where the backup is performed is /opt/icabi, it must also be /opt/icabi on the server where the data is restored.
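For illustration, a restore command with the placeholders filled in might look like the following; the path assumes the backup was created with ./backup.sh /backup, and the host name and timestamp are example values only:

./restore.sh /backup/ICABI_Backup/ICABI_Backup_myhost_default_2019-06-15_01-30-00.tar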

What to do next If you want to restore your data on a different server, complete the following steps. 1. Open the bootstrap.properties file from the following location:

cd $install_dir/wlp/usr/servers/icabiap
vi bootstrap.properties

2. Modify the values of currentServer.hostname and oidc.client1.hostname with the host name or IP address of your server. 3. Open the bootstrap.properties file from the following location:

cd $install_dir/wlp/usr/servers/icabi
vi bootstrap.properties

4. Modify the value of oidc.hostname with the host name or IP address of your server. 5. Restart the Cloud Application Business Insights server. For more information, see the Restarting the Cloud Application Business Insights server topic in Administering IBM Cloud Application Business Insights.
Note: After the restore, you might find that some of the NiFi-based connectors do not work correctly. To resolve this issue, configure the connectors again from Dashboard Designer and test that the connection is successful. For more information, see the Connectors for other products section in Administering IBM Cloud Application Business Insights.
Related tasks
“Performing data backup” on page 46
To prevent data loss if there’s a service disruption or data corruption, you must back up the data at regular, predefined time intervals. You can trigger a manual backup by using the backup.sh script or you can schedule a backup by setting up a cron job.
“Restarting the Cloud Application Business Insights server” on page 11
You can restart the Cloud Application Business Insights server, Cloud Application Business Insights OIDC server, and PostgreSQL database by running the stop_icabi.sh and start_icabi.sh commands.

Additional configuration settings Use this information to complete additional configuration settings in your Cloud Application Business Insights environment. Use these settings as applicable for your specific usage scenario. These are one-time tasks that can be performed by your Application Administrators to improve the system performance.

Enabling the display of external application URL Configure the HTTP security headers to enable IFrames so that you can embed a Cloud Application Business Insights dashboard in an external application and embed an external web application in Cloud Application Business Insights widgets.

About this task Configurable parameters are provided in the bootstrap.properties file to embed and render the Cloud Application Business Insights application in an external website or web application and to allow an external website or web application to embed into Cloud Application Business Insights.

Procedure 1. Configure the HTTP security header to enable IFrames as follows: • To allow an external website or web application to embed in Cloud Application Business Insights, open the bootstrap.properties file and modify the value of the ALLOW_IFRAME_INSIDE_PRD parameter and set it to true. For example, ALLOW_IFRAME_INSIDE_PRD=true. The bootstrap.properties file is present at install_dir/wlp/usr/servers/icabi. • To allow Cloud Application Business Insights to embed in an external website or web application, open the bootstrap.properties files from the following locations and modify the value of the ALLOW_PRD_INSIDE_IFRAME parameter and set it to true. For example, ALLOW_PRD_INSIDE_IFRAME=true. The bootstrap.properties file is present at the following locations:

– cd $install_dir/wlp/usr/servers/icabi

– cd $install_dir/wlp/usr/servers/icabiap

By default, the values of the ALLOW_IFRAME_INSIDE_PRD and ALLOW_PRD_INSIDE_IFRAME parameters are set to false. 2. Restart the Cloud Application Business Insights server. For more information, see the Restarting the Cloud Application Business Insights server topic from Administering IBM Cloud Application Business Insights.
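After you set ALLOW_PRD_INSIDE_IFRAME to true, an external host page might embed a published dashboard in an IFrame similar to the following sketch; the host name, port, and dashboard path are placeholders that depend on how your dashboards are deployed:

<!-- Placeholder URL; replace with the address of your deployed dashboard -->
<iframe src="https://icabi-host.example.com:9443/dashboards/MyDashboard" width="100%" height="800" style="border:0"></iframe>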

Modifying the default number of rows retrieved from the JDBC data source You can modify the number of rows that are retrieved from the JDBC data source by using the JDBC Connector.

Procedure Complete the following steps: Note: install_dir is the directory where you installed the application. By default, it is /opt/icabi. 1. Stop the Connector service by using the following command:

cd $install_dir/connector/scripts
./stop_connector.sh

2. Edit the prdconfigdb.properties file to add the maxRowsLimit parameter as follows:

cd $install_dir/connector/nifi-1.8.0/conf
prdconn.ds.jdbc.query.maxRowsLimit=15000

Note: By default, the maximum number of rows that are retrieved is 15000. Modify this value according to your requirement. A value that is too high might impact the dashboard performance. 3. Start the Connector service with the following command:

cd $install_dir/connector/scripts
./start_connector.sh

Setting the engine timeout interval from command line You can modify the timeout value to display a large amount of data streaming from a connector or database without interruption on the Cloud Application Business Insights Engine.

Procedure Complete the following steps to modify the timeout value from the command line: 1. Open the config.ini file from the following location: • For the on-premises installation package:

cd $install_dir/prdutil
vi config.ini

Where: install_dir is the directory where you installed the application. By default, it is /opt/icabi. • For the IBM Cloud Private installation package:

cd $install_dir/postinstallation/prdutil
vi config.ini

Where: install_dir is the directory where you extracted the UI TAR file. 2. Modify the value of the SetPreferenceValue parameter to true. For example, SetPreferenceValue=true 3. Modify the value of PreferenceValue. You can change PreferenceValue to increase or decrease the refresh interval. The default timeout is 30 seconds. 4. Use the following command to run the update_preference_value.sh script.

./update_preference_value.sh
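For example, to allow the Engine roughly twice the default time before it times out, the relevant config.ini entries might look like the following before you run the script; the value 60 is illustrative and assumes the interval is specified in seconds, as the 30-second default suggests:

SetPreferenceValue=true
PreferenceValue=60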

Configuring the keystore password Configure a unique password for the cryptographic keystore, and encrypt it with the AES encryption algorithm.

About this task If you want to change the keystore password to a custom password, follow these instructions:

Procedure 1. Go to the keystore file location on the Cloud Application Business Insights server.

cd $install_dir/wlp/usr/servers/icabi/resources/security

2. Change the key password by running the following command:

keytool -keypasswd -keystore key.jks -alias default

When prompted, provide the following inputs: • Existing key password. By default, it is persistent123. • New password and confirm the same password. 3. Change the keystore password by running the following command:

keytool -storepasswd -keystore key.jks

When prompted, provide the following inputs:

• Existing keystore password. By default, it is persistent123. • New password and confirm the same password. Make sure that the password for the key and the keystore are the same. 4. Encrypt the newly generated password by using the AESEncryption.sh file. For more information, see Encrypting passwords in Installing IBM Cloud Application Business Insights. 5. Add the encrypted password in the install_dir/wlp/usr/servers/icabi/server.xml file to the following tag:
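The exact element in your server.xml might differ, but a WebSphere Liberty keystore entry typically resembles the following sketch; the id, location, and encrypted password value shown here are illustrative:

<!-- Illustrative keystore entry; replace the password with your encrypted value -->
<keyStore id="defaultKeyStore" location="resources/security/key.jks" password="your-encrypted-password" />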

6. Restart the Cloud Application Business Insights server. For more information, see the Restarting the Cloud Application Business Insights server topic in Administering IBM Cloud Application Business Insights.

What to do next Make sure all your configured Connector Sources are working correctly.

Configuring logging You can modify the log level, the number of log files to be retained, and the size of the log file for the Dashboard Designer tool and Engine by using the logback.properties file.

About this task The logback.properties file that is located in install_dir/wlp/usr/servers/icabi/config has the following parameters with default values: • log_level=INFO The type of log messages that are saved in the log files depends on the log level that is defined in the logback.properties file. By default, it is INFO. The following table shows the log levels and the messages that are captured for each option:

Table 11. Log level rules for different options

Logging level   DEBUG   INFO   WARN   ERROR
DEBUG           YES     YES    YES    YES
INFO            NO      YES    YES    YES
WARN            NO      NO     YES    YES
ERROR           NO      NO     NO     YES

• log_min_index=1
Index of the most recent log file.
• log_max_index=10
Index of the oldest log file that must be retained.
• log_max_size=10MB
The size of each log file.
Important:
• The values of the log_min_index, log_max_index, and log_max_size parameters must be numeric.
• The value of log_max_index must be greater than log_min_index.
• The number of log files that are generated is (log_max_index - log_min_index) + 1.
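For example, a logback.properties configuration that captures DEBUG messages and keeps up to 20 log files of 20 MB each might contain the following entries; the values are illustrative:

log_level=DEBUG
log_min_index=1
log_max_index=20
log_max_size=20MB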

Procedure Complete the following steps to modify the log level, number of log files to be retained, and size of the log file in the logback.properties file: 1. Open the logback.properties file as follows:

cd $install_dir/wlp/usr/servers/icabi/config
vi logback.properties

Where: install_dir is the directory where you installed the application. 2. Enter the required values for log_level, log_min_index, log_max_index, and log_max_size parameters. 3. Save the file. 4. Restart the Cloud Application Business Insights application. Related tasks “Restarting the Cloud Application Business Insights server” on page 11 You can restart the Cloud Application Business Insights server, Cloud Application Business Insights OIDC server, and PostgreSQL database by running the stop_icabi.sh and start_icabi.sh commands.

Configuring SMTP server for email notifications The simple mail transfer protocol (SMTP) configuration specifies how Cloud Application Business Insights connects to the mail server that delivers email notifications. The server can reside external to Cloud Application Business Insights. Contact your email system administrator to set up an account.

Procedure Update the bootstrap.properties file in Cloud Application Business Insights server as follows:

cd $install_dir/wlp/usr/servers/icabi
vi bootstrap.properties

Where: install_dir is the directory where you installed the application. Update the following section:

#Configure from Email
email.from=Sender's email ID

#Add SMTP relay
email.smtp.relay=
email.smtp.port=25
email.smtp.username=
email.smtp.password=

For example:

#Configure from Email
[email protected]
email.smtp.relay=myserver.ibm.com
email.smtp.port=25
email.smtp.username=abc
email.smtp.password=teramera

For more information about sending a dashboard PDF by email, see Viewing dashboards section in Using IBM Cloud Application Business Insights.

Configuring to use the utility scripts on IBM Cloud Private Complete the following configuration settings to use the Cloud Application Business Insights utility scripts on IBM Cloud Private.

Procedure Complete the following configuration settings in the prdutil_config.sh file. 1. Open the prdutil_config.sh file from the following location:

cd $install_dir/postinstallation/prdutil
vi prdutil_config.sh

Where: install_dir is the directory where you extracted the UI TAR file. 2. Modify the value of the ICABI_DB_SVC_HOST parameter with the database service IP address. Note: Do not use the Domain Service Name. 3. Modify the JAVA_HOME parameter with the path where you installed Java on your machine. Important: You must provide a valid JAVA_HOME path for the utility scripts to work as expected on IBM Cloud Private. 4. Open the import_user_config.properties file from the following location:

cd $install_dir/postinstallation/prdutil/lib
vi import_user_config.properties

Where: install_dir is the directory where you extracted the UI TAR file. 5. Modify the value of the ICABI_DB_SVC_HOST parameter with the database service IP address and enter the database password for the PRD_DATABASE_PASSWORD parameter. Note: Do not use the Domain Service Name.
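For illustration, the edited entries might look like the following; the IP address, Java path, and database password are placeholder values for your environment:

# In prdutil_config.sh
ICABI_DB_SVC_HOST=10.1.2.3
JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk

# In lib/import_user_config.properties
ICABI_DB_SVC_HOST=10.1.2.3
PRD_DATABASE_PASSWORD=yourDatabasePassword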

What to do next You can use the following scripts that are present in the install_dir/postinstallation/prdutil directory.
• import_users.sh
This script imports the users from a .csv file. For more information, see the Importing users in bulk topic from Administering IBM Cloud Application Business Insights.
• update_preference_value.sh
This script controls the data refresh frequency for the Cloud Application Business Insights Engine. For more information, see the Setting the engine timeout interval from command line topic from Administering IBM Cloud Application Business Insights.
• icabi_user_info.sh
This script generates a .slmtag file that provides information about the authorized users that use the Cloud Application Business Insights application and their count with license details. The .slmtag file is placed at install_dir/postinstallation/prdutil/data/icabiuserinfo.
• update_scheduler_user_login_template.sh
Use this script to update the user name and password for the scheduler user. For more information, see the Updating the scheduler user login details topic from Administering IBM Cloud Application Business Insights.

Notices

This information was developed for products and services offered in the US. This material might be available from IBM in other languages. However, you may be required to own a copy of the product or product version in that language in order to access it. IBM may not offer the products, services, or features discussed in this document in other countries. Consult your local IBM representative for information on the products and services currently available in your area. Any reference to an IBM product, program, or service is not intended to state or imply that only that IBM product, program, or service may be used. Any functionally equivalent product, program, or service that does not infringe any IBM intellectual property right may be used instead. However, it is the user's responsibility to evaluate and verify the operation of any non-IBM product, program, or service. IBM may have patents or pending patent applications covering subject matter described in this document. The furnishing of this document does not grant you any license to these patents. You can send license inquiries, in writing, to: IBM Director of Licensing IBM Corporation North Castle Drive, MD-NC119 Armonk, NY 10504-1785 US For license inquiries regarding double-byte character set (DBCS) information, contact the IBM Intellectual Property Department in your country or send inquiries, in writing, to: Intellectual Property Licensing Legal and Intellectual Property Law IBM Japan Ltd. 19-21, Nihonbashi-Hakozakicho, Chuo-ku Tokyo 103-8510, Japan INTERNATIONAL BUSINESS MACHINES CORPORATION PROVIDES THIS PUBLICATION "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE. Some jurisdictions do not allow disclaimer of express or implied warranties in certain transactions, therefore, this statement may not apply to you. This information could include technical inaccuracies or typographical errors. Changes are periodically made to the information herein; these changes will be incorporated in new editions of the publication. IBM may make improvements and/or changes in the product(s) and/or the program(s) described in this publication at any time without notice. Any references in this information to non-IBM websites are provided for convenience only and do not in any manner serve as an endorsement of those websites. The materials at those websites are not part of the materials for this IBM product and use of those websites is at your own risk. IBM may use or distribute any of the information you provide in any way it believes appropriate without incurring any obligation to you. Licensees of this program who wish to have information about it for the purpose of enabling: (i) the exchange of information between independently created programs and other programs (including this one) and (ii) the mutual use of the information which has been exchanged, should contact: IBM Director of Licensing IBM Corporation North Castle Drive, MD-NC119 Armonk, NY 10504-1785 US

Such information may be available, subject to appropriate terms and conditions, including in some cases, payment of a fee. The licensed program described in this document and all licensed material available for it are provided by IBM under terms of the IBM Customer Agreement, IBM International Program License Agreement or any equivalent agreement between us. The performance data discussed herein is presented as derived under specific operating conditions. Actual results may vary. Information concerning non-IBM products was obtained from the suppliers of those products, their published announcements or other publicly available sources. IBM has not tested those products and cannot confirm the accuracy of performance, compatibility or any other claims related to non-IBM products. Questions on the capabilities of non-IBM products should be addressed to the suppliers of those products. Statements regarding IBM's future direction or intent are subject to change or withdrawal without notice, and represent goals and objectives only. This information is for planning purposes only. The information herein is subject to change before the products described become available. This information contains examples of data and reports used in daily business operations. To illustrate them as completely as possible, the examples include the names of individuals, companies, brands, and products. All of these names are fictitious and any similarity to actual people or business enterprises is entirely coincidental. COPYRIGHT LICENSE: This information contains sample application programs in source language, which illustrate programming techniques on various operating platforms. You may copy, modify, and distribute these sample programs in any form without payment to IBM, for the purposes of developing, using, marketing or distributing application programs conforming to the application programming interface for the operating platform for which the sample programs are written. These examples have not been thoroughly tested under all conditions. IBM, therefore, cannot guarantee or imply reliability, serviceability, or function of these programs. The sample programs are provided "AS IS", without warranty of any kind. IBM shall not be liable for any damages arising out of your use of the sample programs. Each copy or any portion of these sample programs or any derivative work must include a copyright notice as follows: © (your company name) (year). Portions of this code are derived from IBM Corp. Sample Programs. © Copyright IBM Corp. 2005, 2016.

Trademarks IBM, the IBM logo, and ibm.com are trademarks or registered trademarks of International Business Machines Corp., registered in many jurisdictions worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the web at "Copyright and trademark information" at www.ibm.com/legal/copytrade.shtml. Linux is a trademark of Linus Torvalds in the United States, other countries, or both. Microsoft, Windows, Windows NT, and the Windows logo are trademarks of Microsoft Corporation in the United States, other countries, or both. UNIX is a registered trademark of The Open Group in the United States and other countries.

Java™ and all Java-based trademarks and logos are trademarks or registered trademarks of Oracle and/or its affiliates.

Terms and conditions for product documentation Permissions for the use of these publications are granted subject to the following terms and conditions.

Applicability These terms and conditions are in addition to any terms of use for the IBM website.

Personal use You may reproduce these publications for your personal, noncommercial use provided that all proprietary notices are preserved. You may not distribute, display or make derivative work of these publications, or any portion thereof, without the express consent of IBM.

Commercial use You may reproduce, distribute and display these publications solely within your enterprise provided that all proprietary notices are preserved. You may not make derivative works of these publications, or reproduce, distribute or display these publications or any portion thereof outside your enterprise, without the express consent of IBM.

Rights Except as expressly granted in this permission, no other permissions, licenses or rights are granted, either express or implied, to the publications or any information, data, software or other intellectual property contained therein. IBM reserves the right to withdraw the permissions granted herein whenever, in its discretion, the use of the publications is detrimental to its interest or, as determined by IBM, the above instructions are not being properly followed. You may not download, export or re-export this information except in full compliance with all applicable laws and regulations, including all United States export laws and regulations. IBM MAKES NO GUARANTEE ABOUT THE CONTENT OF THESE PUBLICATIONS. THE PUBLICATIONS ARE PROVIDED "AS-IS" AND WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO IMPLIED WARRANTIES OF MERCHANTABILITY, NON-INFRINGEMENT, AND FITNESS FOR A PARTICULAR PURPOSE.

IBM Online Privacy Statement IBM Software products, including software as a service solutions, ("Software Offerings") may use cookies or other technologies to collect product usage information, to help improve the end user experience, to tailor interactions with the end user or for other purposes. In many cases no personally identifiable information is collected by the Software Offerings. Some of our Software Offerings can help enable you to collect personally identifiable information. If this Software Offering uses cookies to collect personally identifiable information, specific information about this offering's use of cookies is set forth in the following paragraphs. Depending upon the configurations deployed, this Software Offering may use session cookies that collect each user's user name for purposes of session management, authentication, and single sign-on configuration. These cookies can be disabled, but disabling them will also likely eliminate the functionality they enable. If the configurations deployed for this Software Offering provide you as customer the ability to collect personally identifiable information from end users via cookies and other technologies, you should seek

your own legal advice about any laws applicable to such data collection, including any requirements for notice and consent. For more information about the use of various technologies, including cookies, for these purposes, see IBM's Privacy Policy at http://www.ibm.com/privacy and IBM's Online Privacy Statement at http://www.ibm.com/privacy/details the section entitled "Cookies, Web Beacons and Other Technologies" and the "IBM Software Products and Software-as-a-Service Privacy Statement" at http://www.ibm.com/software/info/product-privacy.


IBM®

SC27-8793-03