PUBLIC SAP Data Services Document Version: 4.2 Support Package 12 (14.2.12.0) – 2020-02-06

Supplement for Adapters: DP Bridge, Hive, HTTP, JDBC, JMS, MongoDB, OData, Salesforce.com, Shapefile, and SuccessFactors

© 2020 SAP SE or an SAP affiliate company. All rights reserved.

Content

1 Naming Conventions...... 5

2 Data Services adapters...... 9
2.1 Adapter required knowledge and expertise...... 11

3 Adapter installation and configuration...... 12
3.1 Adding an adapter instance...... 13
3.2 Adapter Configuration options...... 14
JDBC adapter-specific configuration settings...... 19
DP Bridge runtime configuration options...... 23
HTTP adapter-specific configuration settings...... 24
JMS adapter-specific configuration settings...... 25
3.3 Starting and stopping the adapter instance...... 27
3.4 Monitoring the adapter instances and operations...... 28
3.5 Viewing adapter instance statistics...... 29

4 Creating an adapter datastore...... 31

5 Adapter datastore configuration options...... 33
5.1 DP Bridge for SDI Outlook adapter datastore options...... 33
5.2 Hive adapter datastore configuration options...... 35
5.3 JDBC adapter datastore configuration options...... 37
5.4 MongoDB adapter datastore configuration options...... 38
5.5 OData adapter datastore configuration options...... 45
5.6 Salesforce.com adapter datastore configuration options...... 49
5.7 Shapefile adapter datastore configuration options...... 51
5.8 SuccessFactors adapter datastore configuration options...... 54

6 Browse and import metadata...... 55

7 Adapter metadata mapping...... 56
7.1 Metadata mapping for DP Bridge for SDI Outlook data...... 57
7.2 Metadata mapping for Hive...... 58
Apache Hive data type conversion...... 59
7.3 Metadata mapping for JDBC...... 60
7.4 Metadata mapping for MongoDB...... 62
7.5 Metadata mapping for OData...... 63
7.6 Metadata mapping for Salesforce.com...... 64
7.7 Metadata mapping for SuccessFactors...... 65

8 The DP Bridge adapter...... 67
8.1 SDI Outlook mail attachment table...... 67
8.2 SDI Outlook mail message table...... 68

9 Hive adapter datastores...... 70
9.1 Hive adapter source options...... 71
9.2 Hive adapter target options...... 72
9.3 Hive adapter datastore support for SQL function and transform...... 73
9.4 Pushing the JOIN operation to Hive...... 73
9.5 About partitions...... 74
9.6 Previewing Hive table data...... 74
9.7 Using Hive template tables...... 75
9.8 SSL connection support for Hive adapter...... 76

10 HTTP adapter...... 77
10.1 HTTP adapter datastore...... 78
10.2 HTTP adapter architecture...... 79
10.3 HTTP adapter operations...... 80
10.4 Configuring an HTTP operation instance...... 81
10.5 HTTP adapter operation instance configuration options...... 82
Target URL for HTTP requests...... 87
10.6 Operation instance invocation...... 87
Importing functions...... 88
10.7 Testing the HTTP operations...... 88
Files and settings for testing Request/Reply operation...... 89
Files and settings for testing the Request/Acknowledge operation...... 92
10.8 Configure SSL for the HTTP adapter...... 94
10.9 Error handling and tracing...... 95

11 JMS adapter...... 97
11.1 Scope of the JMS adapter...... 98
11.2 JMS adapter functional overview...... 98
11.3 Design considerations...... 100
11.4 Configuring a JMS adapter—overview...... 101
JMS adapter operations...... 102
Adding an operation to an adapter instance...... 105
JMS adapter operation options...... 106
11.5 JMS adapter datastore...... 111
Import message functions and outbound messages to the datastore...... 111
11.6 Testing the JMS adapter and operations...... 113
Configure the JMS provider...... 115
Using MQ instead of JNDI configuration...... 116

JMS adapter sample files...... 117
Test Get: Request/Reply...... 118
Test Get: Request/Acknowledge...... 120
Test GetTopic: Request/Acknowledge...... 122
Test PutGet: Request/Reply...... 124
Test Put: Request/Acknowledge...... 127
Test PutTopic: Request/Acknowledge...... 129
11.7 WebLogic as JMS provider...... 132
11.8 Error handling and tracing...... 133

12 MongoDB adapter...... 134
12.1 MongoDB metadata...... 134

13 OData adapter...... 136
13.1 OData as a source...... 136
OData Depth level...... 137
13.2 OData as a target...... 138
13.3 OData pushdown behavior...... 141

14 Salesforce.com adapter...... 143
14.1 The Salesforce.com DI_PICKLIST_VALUES table...... 144
14.2 The CDC datastore table...... 144
14.3 Use Salesforce.com for changed data...... 145
Reading changed data from Salesforce.com...... 146
Using check points...... 150
Using the CDC table source default start date...... 150
Limitations...... 151
14.4 Salesforce.com error messages...... 152
14.5 Administering the Salesforce.com adapter...... 154

15 SuccessFactors adapter...... 155
15.1 SuccessFactors push-down operations...... 155
15.2 SuccessFactors ID field...... 156
15.3 SuccessFactors as a source...... 156
15.4 SuccessFactors as a target...... 157
15.5 CompoundEmployee API...... 159
Importing data from an .xsd file...... 160
Specify filters for CompoundEmployee as a source...... 160
Retrieve information from CompoundEmployee...... 163

16 SSL connection support...... 165
16.1 Importing certificate chain when errors occur...... 166

1 Naming Conventions

We refer to certain systems by shortened names, and we use specific environment variables when we refer to locations for SAP and SAP Data Services files.

Shortened names

● The terms “Data Services system” and “SAP Data Services” mean the same thing.
● The term “BI platform” refers to “SAP BusinessObjects Business Intelligence platform.”
● The term “IPS” refers to “SAP BusinessObjects Information platform services.”

 Note

Data Services requires BI platform components. However, IPS, a scaled-back version of the BI platform, also provides these components.

● CMC refers to the Central Management Console provided by the BI or IPS platform.
● CMS refers to the Central Management Server provided by the BI or IPS platform.

Variables

Variables Description

<INSTALL_DIR> The installation directory for the SAP software.

Default location:

● For Windows: C:\Program Files (x86)\SAP BusinessObjects
● For UNIX: $HOME/businessobjects

 Note

<INSTALL_DIR> is not an environment variable. The installation location of the SAP software may differ from the listed default, depending on the location that your administrator set during installation.

<BIP_INSTALL_DIR> The root directory of the BI or IPS platform.

Default location:

● For Windows: <INSTALL_DIR>\SAP BusinessObjects Enterprise XI 4.0

 Example

C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0

● For UNIX: <INSTALL_DIR>/enterprise_xi40

 Note

These paths are the same for both BI and IPS.

<LINK_DIR> The root directory of the Data Services system.

Default location:

● All platforms: <INSTALL_DIR>\Data Services

 Example

C:\Program Files (x86)\SAP BusinessObjects\Data Services

<DS_COMMON_DIR> The common configuration directory for the Data Services system.

Default location:

● If your system is on Windows (Vista and newer): <ALLUSERSPROFILE>\SAP BusinessObjects\Data Services

 Note

The default value of environment variable <ALLUSERSPROFILE> for Windows Vista and newer is C:\ProgramData.

 Example

C:\ProgramData\SAP BusinessObjects \Data Services

● If your system is on Windows (older versions, such as XP): <ALLUSERSPROFILE>\Application Data\SAP BusinessObjects\Data Services

 Note

The default value of environment variable <ALLUSERSPROFILE> for older Windows versions is C:\Documents and Settings\All Users.

 Example

C:\Documents and Settings\All Users\Application Data\SAP BusinessObjects\Data Services

● UNIX systems (for compatibility)

The installer automatically creates this system environment variable during installation.

 Note

Starting with Data Services 4.2 SP6, users can designate a different default location for <DS_COMMON_DIR> during installation. If you cannot find <DS_COMMON_DIR> in the listed default location, ask your System Administrator where the default location for <DS_COMMON_DIR> is.

<DS_USER_DIR> The user-specific configuration directory for the Data Services system.

Default location:

● If you are on Windows (Vista and newer): <USERPROFILE>\AppData\Local\SAP BusinessObjects\Data Services

 Note

The default value of environment variable <USERPROFILE> for Windows Vista and newer versions is C:\Users\{username}.

● If you are on Windows (older versions, such as XP): <USERPROFILE>\Local Settings\Application Data\SAP BusinessObjects\Data Services

 Note

The default value of environment variable <USERPROFILE> for older Windows versions is C:\Documents and Settings\{username}.

The installer automatically creates this system environment variable during installation.

 Note

This variable is used only for Data Services client applications on Windows. <DS_USER_DIR> is not used on UNIX platforms.

2 Data Services adapters

Adapters enable you to browse and import application metadata, and move batch and real-time data between SAP Data Services and the application.

A typical enterprise infrastructure is a complex mix of standard and custom applications, databases, enterprise resource planning (ERP) applications, and so on. Data Services combines and extends critical extraction, transformation, and loading (ETL) and enterprise application integration (EAI) technology components required for true enterprise data integration.

Adapters allow you to integrate disparate applications with the Data Services platform. Adapters enable otherwise incompatible applications and systems to work together and share data.

The following table describes the adapters supported by Data Services.

Data Services Adapters

Adapter name Description

DP Bridge Use data provisioning (DP) Bridge as a connection to Smart Data Integration (SDI) Outlook adapter.

Set up DP Bridge as an adapter instance in the Data Services Management Console Administrator application.

Currently, the DP Bridge supports importing Outlook PST data, which includes mail message and attachment data.

Hive Use the SAP Hive adapter to connect to a Hive server to work with tables from Hadoop. Hive is a data-warehousing infrastructure built on Hadoop.

 Note

You can also use the Cloudera ODBC driver to connect remotely to the Hive Server. For more information, see the Supplement for Hadoop.

HTTP Use the HTTP adapter to rapidly integrate diverse systems and applications using the HTTP protocol. The HTTP adapter supports the following functionality:

● SSL for security
● Compressed data encoding
● Initiate Request-Reply and Request-Acknowledge services


JDBC Use the JDBC adapter to connect any JDBC source to Data Services.

After you create an adapter instance and a datastore, use JDBC tables as a source in a Data Services data flow to fetch, insert, update, and delete data.

JMS Use the Java Messaging Service (JMS) adapter to initiate the Request-Reply and the Request-Acknowledgment messages.

The JMS adapter supports Information Resource (IR) requests, which are JMS-compatible application requests. You can also set the JMS adapter to subscribe to IR published messages.

Use the JMS adapter with a batch job or real-time data flow (RTDF) under the following circumstances:

● When the batch job or RTDF passes an outbound message for Request-Acknowledge operations.
● When the batch job or RTDF passes a Message Function for Request-Reply operations.

MongoDB Use the MongoDB adapter for the following read and write capabilities:

● Read data from MongoDB to other Data Services targets.
● Write data to MongoDB from other Data Services sources.

OData Use the Open Data Protocol (OData) adapter to browse and import database tables to use as sources and targets in data flows.

OData is a standardized protocol for creating and consuming data APIs.

 Example

Load and extract data from new OData-based objects in the SuccessFactors API.

After you customize data objects or extensions for OData, you can load that data only through OData objects.

 Note

Currently, you can’t expose older objects, like SuccessFactors, CompoundEmployee, and BizX tables, through OData.


Salesforce.com Use the Salesforce.com adapter to access Salesforce.com data from within the native Data Services ETL environment.

After you create a Salesforce.com adapter, use the adapter to do the following tasks:

● Automate the process for Salesforce.com configuration.
● Browse Salesforce.com schema metadata in the same manner as all sources and targets from within SAP Data Services Designer.

Shapefile Use the Shapefile adapter to load geospatial vector data from Shapefiles into the SAP HANA database for further analysis.

 Note

For more information about using Shapefile, see the SAP HANA Spatial Reference Guide in the SAP Help Portal, SAP HANA Platform.

SuccessFactors Use the SuccessFactors adapter to view, import, and use SuccessFactors data in Data Services data flows.

Adapter required knowledge and expertise [page 11]
Before you use SAP Data Services adapters, ensure that you understand how to use the features and systems that support and enable adapters.

2.1 Adapter required knowledge and expertise

Before you use SAP Data Services adapters, ensure that you understand how to use the features and systems that support and enable adapters.

To work with adapters, you must have the following knowledge and expertise:

● Design and run Data Services data flows.
● Manage Data Services processes with the SAP Data Services Management Console Administrator application.
● Know the role an adapter plays in business systems integration.
● Work in the environment that the adapter targets.
● Solve administration and integration issues when you integrate Data Services with external systems.
● Understand, create, and troubleshoot SQL query statements, XML markup language, and XML configuration schemas.

Parent topic: Data Services adapters [page 9]

3 Adapter installation and configuration

To use an adapter, you create and configure an adapter instance and an adapter datastore.

To use an adapter, first create and configure the adapter instance in the SAP Data Services Management Console Administrator application. Then create an adapter datastore in SAP Data Services Designer, and associate the adapter instance that you created in the Management Console with the datastore.

The Data Services installer automatically installs the adapter files required to create an adapter. The following table lists the adapters that are automatically installed with Data Services, the applicable Data Services version, and other requirements.

Adapter Data Services version you need and other requirements

DP Bridge Adapter for SDI Outlook 4.2.7 or later

Hive 4.1.1 or later

HTTP 4.0.0 or later

HTTP adapter servlet

JDBC 4.2.2 or later

JMS 11.7.0 or later

You also need the following applications:

● A JMS provider, such as WebLogic Application Server
● SAP Data Services Adapter SDK version 2.0.0.0 or later

MongoDB 4.2.4 or later

OData 4.2.2 or later

SuccessFactors 4.2.1 or later

Salesforce.com 12.0.0 or later

Shapefile 4.2.3 or later

Additionally, adapters are associated with the following files:

● Adapter JAR files
● Adapter configuration templates
● Software System extensions, such as Salesforce.com

The following table describes the adapter-related objects that you configure in the Management Console.

Adapter-related object descriptions

Configure Description

Adapter instance Creates the specific adapter type in the system.

Adapter operations Identifies the integration options for the adapter instance. Applicable for HTTP and JMS adapter types.


Administrator connection Establishes a connection between the Administrator and your adapter-enabled repository.

For additional information about configuring adapters, see the Management Console Guide.

Adding an adapter instance [page 13]
Before you can create an adapter datastore in SAP Data Services Designer, you must add and configure the adapter in the SAP Data Services Management Console.

Adapter Configuration options [page 14]
When you create an adapter instance in the Administrator, complete options in the Adapter Configuration tab.

Starting and stopping the adapter instance [page 27]
If you make configuration changes to an adapter, restart the adapter instance for the changes to take effect.

Monitoring the adapter instances and operations [page 28]
To monitor the adapter instances and operations, open the adapter in the SAP Data Services Management Console Administrator application.

Viewing adapter instance statistics [page 29]
View adapter instance statistics in SAP Data Services Management Console Administrator application.

Related Information

Adapters

3.1 Adding an adapter instance

Before you can create an adapter datastore in SAP Data Services Designer, you must add and configure the adapter in the SAP Data Services Management Console.

Log in to the SAP Data Services Management Console and open the Administrator application.

1. Expand the Adapter Instances node and select the applicable Job Server.

If the Adapter Instances node isn’t available in Administrator, open the SAP Data Services Server Manager and open the applicable Job Server properties. Make sure the Support adapter and message broker communication option in the Job Server Properties dialog box is enabled.

 Note

Find the Server Manager through the Windows Start option or in $LINK_DIR/SAP Data Services by default.

2. Open the Adapter Configuration tab and click Add.

The Installed Adapters tab opens listing the adapters that are installed on the Job Server.

3. Click the applicable adapter type.

The Adapter Configuration tab opens showing the Adapter instance startup configuration options.

4. Complete the Adapter instance startup configuration options.

For option descriptions, see Adapter Configuration options [page 14].

5. Click Apply.

The Administrator adds the adapter instance to the Data Services system, and lists the adapter on the Adapter Instance Status tab.

The HTTP and JMS adapters require that you add at least one operation instance for each adapter instance. Operations identify the integration options for the adapter instance. For more information about operation instances, see the individual adapter sections.

Task overview: Adapter installation and configuration [page 12]

Related Information

Adapter Configuration options [page 14]
Starting and stopping the adapter instance [page 27]
Monitoring the adapter instances and operations [page 28]
Viewing adapter instance statistics [page 29]
DP Bridge runtime configuration options [page 23]
HTTP adapter-specific configuration settings [page 24]
JMS adapter-specific configuration settings [page 25]

3.2 Adapter Configuration options

When you create an adapter instance in the Administrator, complete options in the Adapter Configuration tab.

The following table contains descriptions for the common adapter options in the Adapter Configuration tab.

Adapter Configuration option descriptions

Parameter Description

Adapter Instance Name Required. Specifies a unique name to identify this instance of the adapter. Don’t include spaces.

 Note

Optional. Use the name that you enter for Adapter Instance Name as the adapter datastore name.


Access Server Host Specifies the host ID of the computer that runs the access server that connects to this adapter instance.

For real-time jobs, configure a service that the Access Server uses to run the job. When a job uses adapter-based data, the Access Server must be able to connect to the adapter instance.

If you don’t know this information, leave this option blank.

Access Server Port Specifies the port number for the Access Server. Applicable when the adapter accesses real-time services.

If you don’t know this information, leave this blank.

Use SSL Protocol Specifies to use Secure Sockets Layer (SSL) protocol for communication between the adapter and the Job Server/engine.

● True: Uses SSL protocol.
● False: Doesn’t use SSL protocol.

 Note

SSL protocol isn’t applicable for the DP Bridge SDI Outlook adapter.

Character Set Specifies a character set used to convert text characters to and from bytes for data.

Metadata Character Set Specifies a character set used to convert text characters to and from bytes for metadata.

Adapter Retry Count Specifies the number of times Data Services tries to reconnect to the adapter instance when the adapter instance fails or crashes.

● Zero (0): Doesn’t attempt to reconnect.
● Negative number: Attempts to reconnect an indefinite number of times.
● Positive number: Attempts to reconnect the specified number of times.

Adapter Retry Interval Specifies the number of milliseconds to wait between retry attempts. Not applicable when you set Adapter Retry Count to zero (0).


Classpath Specifies the -classpath Java parameter value when the adapter starts.

Adapters are preconfigured with most of the necessary JAR files. In some cases, you must add the JAR files required by the adapter to the CLASSPATH. The following lists contain the JAR files required for specific adapters. (A sample CLASSPATH appears in the example following this table.)

DP Bridge SDI Outlook adapter:

● <LINK_DIR>\ext\lib\com.sap.hana.dp.adapterframework.jar
● <LINK_DIR>\ext\lib\com.sap.hana.dp.agent.jar
● <LINK_DIR>\ext\lib\com.sap.hana.dp.cdcadaptercommons.jar
● <LINK_DIR>\ext\lib\org.eclipse.osgi_3.9.1.v20140110-1610.jar
● <LINK_DIR>\ext\lib\org.antlr.runtime_3.2.0.v201101311130.jar
● <LINK_DIR>\ext\lib\commons-codec-1.9.jar
● <LINK_DIR>\ext\lib\com.sap.hana.dp.outlookadapter.jar
● <LINK_DIR>\ext\lib\java-libpst.jar

HTTP adapter:

● <LINK_DIR>/lib/acta_adapter_sdk.jar
● <LINK_DIR>/lib/acta_broker_client.jar
● <LINK_DIR>/lib/acta_tool.jar
● <LINK_DIR>/ext/lib/xerces.jar
● <LINK_DIR>/lib/acta_http_adapter.jar
● <LINK_DIR>/lib/jcert.jar
● <LINK_DIR>/lib/jnet.jar
● <LINK_DIR>/lib/jsse.jar

JDBC adapter:

Add the path to the ojdbc6.jar file to the JDBC adapter CLASSPATH.


JMS adapter:

Add the JAR files provided with the JMS provider that you’re using to the CLASSPATH. For example, for WebLogic, the JAR file is weblogic.jar.

Also add the j2ee.jar file to the CLASSPATH.

 Note

The j2ee.jar file is required. Get the j2ee.jar file from Java EE 1.6 and copy it to the adapter Job Server machine.

The JMS adapter JAR files include the following:

● <LINK_DIR>/lib/acta_adapter_sdk.jar
● <LINK_DIR>/lib/acta_broker_client.jar
● <LINK_DIR>/lib/acta_tool.jar
● <LINK_DIR>/ext/lib/xerces.jar
● <LINK_DIR>/lib/acta_jms_adapter.jar
● <LINK_DIR>/ext/lib/jms/j2ee.jar

Autostart Specifies whether the adapter interface starts when the Administrator starts.

● True: Adapter interface starts when the Administrator starts.
● False: Adapter interface doesn’t start when the Administrator starts.


Trace Mode Specifies whether the adapter interface writes information and error messages to the log output to help debug problems.

● True: Adapter interface writes information and error messages to the log output.

 Note

The adapter writes information and error messages using the following name and location: <LINK_DIR>\adapters\logs\<adapter instance name>_trace.txt.

● False: Adapter interface writes only error messages to the log output.

 Note

The adapter writes error messages using the following name and location: <LINK_DIR>\adapters\logs\<adapter instance name>_error.txt.

Additional Java Launcher Options Specifies additional Java application launcher code that the adapter instance enables when it launches the Java process that hosts the adapter.

For all adapter types, the field is automatically populated with: -Xms64m -Xmx256m. These values specify the initial memory allocation pool (Xms) and the maximum memory allocation pool (Xmx). You can edit, delete, or add to these values, as shown in the example following this table.

 Note

If you’re connecting to the adapter from behind a proxy server, add the following to the end of the Additional Java launcher options:

-Dhttps.proxyHost=<proxy host> -Dhttps.proxyPort=<proxy port>

 Note

For Unicode character support in MongoDB, add the following: -Dfile.encoding=UTF-8

Adapter Type Name Displays the name of the adapter type for this instance. The option is read only.

Supplement for Adapters 18 PUBLIC Adapter installation and configuration Parameter Description

Adapter Version Displays the version of the adapter for this instance. The op­ tion is read only.

Adapter Class Displays the adapter class name based on the adapter type. The option is read only.
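When the adapter instance starts, the Administrator launches a Java process assembled from the Classpath and Additional Java Launcher Options values described above. The following launch line is an illustrative sketch only: the UNIX-style classpath, the heap sizes, the proxy host, and the adapter main class placeholder are hypothetical, not values from a real installation.

 Example

java -classpath "<LINK_DIR>/lib/acta_adapter_sdk.jar:<LINK_DIR>/lib/acta_broker_client.jar:<LINK_DIR>/lib/acta_tool.jar" -Xms64m -Xmx512m -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080 -Dfile.encoding=UTF-8 <adapter main class>

Here -Xmx raises the maximum heap from the default 256 MB to 512 MB, and the -Dhttps.proxyHost and -Dhttps.proxyPort properties route adapter connections through a proxy server, as described in the notes above.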

JDBC adapter-specific configuration settings [page 19]
When you add a JDBC adapter in the Administrator, complete options in the Adapter instance startup configuration page.

DP Bridge runtime configuration options [page 23]
In addition to the common options, complete runtime parameters for the DP Bridge Outlook adapter when you configure the adapter in SAP Data Services Management Console.

HTTP adapter-specific configuration settings [page 24]
Additional configuration settings to make in the Adapter instance startup configuration page for the HTTP adapter.

JMS adapter-specific configuration settings [page 25]
Additional configuration settings to make in the Adapter instance startup configuration page for the JMS adapter.

Parent topic: Adapter installation and configuration [page 12]

Related Information

Adding an adapter instance [page 13]
Starting and stopping the adapter instance [page 27]
Monitoring the adapter instances and operations [page 28]
Viewing adapter instance statistics [page 29]

3.2.1 JDBC adapter-specific configuration settings

When you add a JDBC adapter in the Administrator, complete options in the Adapter instance startup configuration page.

Complete the options in the following table for a JDBC adapter. The options establish the JDBC connection and pushdown capabilities at the adapter instance level, instead of the datastore level.


JDBC driver class name Specifies the class that implements the JDBC driver.

 Example

com.microsoft.sqlserver.jdbc.SQLServerDriver

JDBC driver url Specifies the connection URL for the JDBC driver. The URL format is driver-specific; for example, a Microsoft SQL Server URL has the form jdbc:sqlserver://<host>:<port>;databaseName=<database>. (See also the sketch following this table.)

User Specifies the name of the user connecting to the JDBC driver.

Password Specifies the password for the specified user to connect to the JDBC driver.

JDBC Pushdown Capability Enables SAP Data Services to push down a simple or nested expression or function to the JDBC driver.

● Yes: Pushes down a simple or nested expression or function to the JDBC driver.
● No: Doesn't push down a simple or nested expression or function to the JDBC driver.

 Example

If you know that the driver doesn’t support a certain expression or function, select No.


JDBC Math Function Support Specifies whether to push down math functions to the JDBC driver.

● Yes: Pushes down math functions to the JDBC driver. Yes is the default setting.
● No: Doesn't push down math functions to the JDBC driver.

This option applies to the following math functions:

● abs: Returns the absolute value of the input number.
● ceil: Returns the smallest integer value that is greater than or equal to the input number.
● floor: Returns the largest integer value that is less than or equal to the input number.
● round: Returns the input number, rounded to the specified number of decimal places to the right of the decimal point.
● trunc: Returns the input number, truncated to the specified number of decimal places to the right of the decimal point.
● sqrt: Returns the square root of the input number.
● log: Returns the base-10 logarithm of the given numeric expression.
● ln: Returns the natural logarithm of the given numeric expression.
● power: Returns the value of the given expression to the specified power.
● mod: Returns the remainder when one number is divided by another.


JDBC String Function Support Specifies whether to push down string functions to the JDBC driver.

● Yes: Pushes down string functions to the JDBC driver. Yes is the default setting.
● No: Doesn't push down string functions to the JDBC driver.

This option applies to the following string functions:

● lower: Converts the input string to lowercase.
● upper: Converts the input string to uppercase.
● rtrim_blanks: Returns the input string with blanks on the right removed.
● ltrim_blanks: Returns the input string with blanks on the left removed.
● length: Returns the length of the input string.
● substr: Returns the portion of the string specified by the offset and length.
● soundex: Returns the Soundex encoding of the input string.

JDBC Aggregate Function Support Specifies whether to push down aggregate functions to the JDBC driver.

● Yes: Pushes down aggregate functions to the JDBC driver. Yes is the default setting.
● No: Doesn't push down aggregate functions to the JDBC driver.

This option applies to the following aggregate functions:

● avg: Calculates the average of a given set of values.
● count: Counts the number of values in a table column.
● count_distinct: Counts the number of distinct non-null values in a table column.
● max: Returns the maximum value from a list.
● min: Returns the minimum value from a list.
● sum: Calculates the sum of a given set of values.


JDBC Date Function Support Specifies whether to push down date functions to the JDBC driver.

● Yes: Pushes down date functions to the JDBC driver. Yes is the default setting.
● No: Doesn't push down date functions to the JDBC driver.

This option applies to the following date functions:

● week_in_year: Returns the week number relative to the year for the input date.
● month: Returns the month number for the input date.
● quarter: Returns the quarter number for the input date.
● year: Returns the year number for the input date.
● day_in_month: Returns the day number relative to the month for the input date.
● day_in_year: Returns the day number relative to the year for the input date.

JDBC Miscellaneous The following miscellaneous functions are also available:

● ifthenelse: Evaluates expression A; if A evaluates to TRUE, returns B. Otherwise, returns C. For example, ifthenelse(QTY > 0, 'in stock', 'out of stock') returns 'in stock' for positive quantities.
● nvl: Replaces the input with the replacement value when the input is NULL.
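As a concrete illustration of how the JDBC settings above are consumed, the following minimal Java sketch performs the same steps any JDBC client performs when it connects: load the driver class, then open a connection with the URL, user, and password. The class name matches the earlier Microsoft SQL Server example; the host, database, credentials, and table are hypothetical placeholders, and this is an illustration of the JDBC API, not adapter source code.

 Example

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class JdbcConnectSketch {
    public static void main(String[] args) throws Exception {
        // Corresponds to the "JDBC driver class name" option
        Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        // Corresponds to the "JDBC driver url", "User", and "Password" options
        try (Connection conn = DriverManager.getConnection(
                "jdbc:sqlserver://dbhost.example.com:1433;databaseName=SALES",
                "ds_user", "ds_password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 // With pushdown enabled, functions such as upper() run in the database
                 "SELECT upper(CUSTOMER_NAME) FROM CUSTOMERS")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}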

Parent topic: Adapter Configuration options [page 14]

Related Information

DP Bridge runtime configuration options [page 23]
HTTP adapter-specific configuration settings [page 24]
JMS adapter-specific configuration settings [page 25]

3.2.2 DP Bridge runtime configuration options

In addition to the common options, complete runtime parameters for the DP Bridge Outlook adapter when you configure the adapter in SAP Data Services Management Console.

The following table contains options specific for the DP Bridge SDI Outlook adapter in the Adapter instance startup configuration dialog box.

DP Bridge Outlook adapter runtime parameters

Parameter Description

Adapter Factory Class Specifies the factory class for the DP Bridge SDI Outlook adapter.

Enter the following text: com.sap.hana.dp.outlookadapter.DSBridgeOutlookAdapterFactory

Adapter Jar File Leave this option blank. You add the applicable value in the list of JAR files in the Classpath option.

 Note

For a list of JAR files, see the Classpath description in the table at Adapter Configuration options [page 14].

Adapter Name Specifies the name for the adapter type.

Enter the following value: DSBridgeOutlookAdapter

Parent topic: Adapter Configuration options [page 14]

Related Information

JDBC adapter-specific configuration settings [page 19]
HTTP adapter-specific configuration settings [page 24]
JMS adapter-specific configuration settings [page 25]
Adding an adapter instance [page 13]

3.2.3 HTTP adapter-specific configuration settings

Additional configuration settings to make in the Adapter instance startup configuration page for the HTTP adapter.

Parameter Description

Keystore Password Required if requests are made using the HTTPS protocol. If a password is given, it is used to check the integrity of the keystore data. Otherwise, the integrity of the keystore is not checked.
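The integrity check that the Keystore Password enables is standard Java keystore behavior: loading a keystore with a password verifies a digest stored in the keystore file. The following minimal Java sketch, with a hypothetical keystore path and password, shows the call that fails when the configured password is wrong; it illustrates the Java KeyStore API, not adapter source code.

 Example

import java.io.FileInputStream;
import java.security.KeyStore;

public class KeystoreCheck {
    public static void main(String[] args) throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream("/opt/adapter/keystore.jks")) {
            // load() verifies the keystore integrity digest against the password;
            // a wrong password causes an IOException
            ks.load(in, "changeit".toCharArray());
        }
        System.out.println("Keystore integrity verified; entries: " + ks.size());
    }
}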

Parent topic: Adapter Configuration options [page 14]

Related Information

JDBC adapter-specific configuration settings [page 19]
DP Bridge runtime configuration options [page 23]
JMS adapter-specific configuration settings [page 25]

3.2.4 JMS adapter-specific configuration settings

Additional configuration settings to make in the Adapter instance startup configuration page for the JMS adapter.

Select the configuration type to use for the JMS adapter, and make the applicable settings. Select either Java Naming and Directory Interface (JNDI) or the IBM Message Queue (MQ) application in the Configuration Type dropdown list in the JMS Adapter section of the adapter configuration. The following table describes the two types of configurations.

JMS configuration types

Type Description

JNDI Java Naming and Directory Interface is a Java API for a directory service. Set connection options for the applicable Web container you use, such as WebLogic.

MQ IBM MQ provides a communications layer for visibility and control of the flow of messages and data between internal and external servers.

JNDI configuration type parameters

Parameter Description

Server URL Represents the URL of the JMS provider. For example: t3://<host>:<port>.

JNDI Context Factory JMS-Provider specific. Choose the context factory from a dropdown list that includes common context factories.

Context Factory allows remote Java clients to connect to a single server such as WebLogic.

If you require a context factory that is not listed, you can add it to the list by editing the JMSAdapter.xml file at <LINK_DIR>/adapters/config/templates/JMSAdapter.xml and updating the corresponding element.

For WebLogic as a JMS provider, the JNDI Factory name is: weblogic.jndi.WLInitialContextFactory.


Queue Connection Factory Queue connection factory name. For example: JMSConnections.AdapterConnectionFactory.

A Queue Connection Factory creates connections for the point-to-point messaging domain.

Topic Connection Factory Topic connection factory name. For example: JMSConnections.AdapterTopicConnectionFactory.

A Topic Connection Factory creates connections for the publish/subscribe messaging domain.

User Name Specifies the user name associated with the JNDI Security Principal.

A user name is required to authenticate a client. Not applicable for server-side code.

Password Specifies the password associated with the JNDI Security Credentials.

A password is required to authenticate a client. Not applicable for server-side code.
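The JNDI parameters above map directly onto standard JNDI and JMS client calls. The following minimal Java sketch, using hypothetical WebLogic host, credentials, and the factory name from the example above, shows where each value is used; it is an illustration of the JMS API, not adapter source code.

 Example

import java.util.Properties;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.naming.Context;
import javax.naming.InitialContext;

public class JndiLookupSketch {
    public static void main(String[] args) throws Exception {
        Properties env = new Properties();
        // "JNDI Context Factory" option
        env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
        // "Server URL" option
        env.put(Context.PROVIDER_URL, "t3://jmshost.example.com:7001");
        // "User Name" and "Password" options
        env.put(Context.SECURITY_PRINCIPAL, "jmsuser");
        env.put(Context.SECURITY_CREDENTIALS, "jmspassword");

        Context ctx = new InitialContext(env);
        // "Queue Connection Factory" option
        QueueConnectionFactory qcf =
            (QueueConnectionFactory) ctx.lookup("JMSConnections.AdapterConnectionFactory");
        QueueConnection connection = qcf.createQueueConnection();
        connection.close();
        ctx.close();
    }
}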

MQ Series from IBM configuration type parameters

Parameter Description

MQ Queue Manager Name Optional. Provides messaging services to applications that put messages on queues and receive messages on queues.

Leave this setting blank when you use the default MQ Queue Manager on the system that runs MQ. The default queue manager processes any command for which a queue manager name is not specified.

MQ Channel Name Optional. Provides a communication path from one queue manager to another.

Specify if not using the default MQ Channel on the system running the adapter.

MQ Computer Name Optional. Specify if not using the MQ Queue Manager on the same system running the adapter.


MQ Transport Type Specifies how the JMS client connects to the Queue Manager.

● CLIENT: JMS client is on a different computer from the Queue Manager. Uses a client connection (TCP/IP).
● BINDINGS: JMS client is on the same computer as the Queue Manager. Connects directly using bindings.

MQ Port Optional. Specify if not using the default MQ port (1414).

MQ User ID Optional. Specify if required to log in to the MQ Queue Manager.

MQ Password Optional. Specify if required to log in to the MQ Queue Manager.

SSL Cipher Suite Specifies the Cipher Suites to use if you connect using SSL protocol.
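For comparison with the JNDI sketch earlier, the MQ parameters map roughly onto the IBM MQ classes for JMS as follows. This is a sketch under the assumption that the IBM MQ client JARs (the com.ibm.mq.jms package) are on the CLASSPATH; the host, queue manager, channel, and credentials are hypothetical placeholders.

 Example

import com.ibm.mq.jms.MQQueueConnectionFactory;
import com.ibm.msg.client.wmq.WMQConstants;

public class MqConnectSketch {
    public static void main(String[] args) throws Exception {
        MQQueueConnectionFactory cf = new MQQueueConnectionFactory();
        cf.setHostName("mqhost.example.com");            // MQ Computer Name
        cf.setPort(1414);                                // MQ Port (default 1414)
        cf.setQueueManager("QM1");                       // MQ Queue Manager Name
        cf.setChannel("SYSTEM.DEF.SVRCONN");             // MQ Channel Name
        cf.setTransportType(WMQConstants.WMQ_CM_CLIENT); // MQ Transport Type: CLIENT
        // MQ User ID and MQ Password
        cf.createQueueConnection("mquser", "mqpassword").close();
    }
}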

Parent topic: Adapter Configuration options [page 14]

Related Information

JDBC adapter-specific configuration settings [page 19]
DP Bridge runtime configuration options [page 23]
HTTP adapter-specific configuration settings [page 24]
JMS adapter [page 97]

3.3 Starting and stopping the adapter instance

If you make configuration changes to an adapter, restart the adapter instance for the changes to take effect.

Log in to the SAP Data Services Management Console and open the Administrator application.

1. Expand the Adapter Instances node and select the applicable Job Server.
2. Open the Adapter Instance Status tab.

The Administrator displays all adapter instances and the status for each. The status for the applicable adapter indicates that a restart is required to implement the changes.

3. Select the checkbox next to the adapter that you want to start or stop.
4. Click Start to start the adapter instance. Click Shutdown to stop the adapter instance.

The adapter icon indicator next to the adapter name indicates the overall status.

5. Optional. Click the Trace or Error links under the Log files column.

Management Console displays the Adapter trace log and Adapter error log tabs that contain the log content.

Task overview: Adapter installation and configuration [page 12]

Related Information

Adding an adapter instance [page 13]
Adapter Configuration options [page 14]
Monitoring the adapter instances and operations [page 28]
Viewing adapter instance statistics [page 29]

3.4 Monitoring the adapter instances and operations

To monitor the adapter instances and operations, open the adapter in the SAP Data Services Management Console Administrator application.

Log in to the SAP Data Services Management Console and open the Administrator application.

1. Expand the Adapter Instances node and select the applicable Job Server.
2. Open the Adapter Instance Status tab.

The Administrator displays all adapter instances and the status for each.

3. Find the overall status of a particular adapter instance or operation by examining the icon indicators.

Adapter icon indicator descriptions

Icon indicator Description

A green circle with a check mark indicates that the adapter instance or operation has started and is currently running.

A yellow circle with an exclamation point indicates that the adapter instance or operation isn’t currently running.

A red circle with an “X” indicates that the adapter instance or operation has experienced an error.

4. View the information under each column for additional details about the adapter instance.

The following table describes the columns and the possible status values.

Column Description

Requests Processed The number of requests for this operation instance that Data Services processed. Processing for these requests is complete.

Requests Pending The number of requests for this operation instance that are pending. Processing for these requests isn’t complete.

Requests Failed The number of requests for this operation instance that have failed. Processing for these requests has stopped.

Status Text that indicates the status for operations, including errors.

Possible values include:

○ Initialized
○ Starting
○ Started
○ Shutting Down
○ Shutdown
○ Error text—Displays the last error message that occurred as the adapter instance shut down. Or, indicates that the configuration has changed. To enable the adapter instance to use the changes, restart the adapter instance.

5. Click the Trace or Error links under the Log files column.

The applicable log file appears.

Task overview: Adapter installation and configuration [page 12]

Related Information

Adding an adapter instance [page 13]
Adapter Configuration options [page 14]
Starting and stopping the adapter instance [page 27]
Viewing adapter instance statistics [page 29]

3.5 Viewing adapter instance statistics

View adapter instance statistics in SAP Data Services Management Console Administrator application.

Log in to the SAP Data Services Management Console and open the Administrator application.

1. Expand the Adapter Instances node and select the applicable Job Server.
2. Click the name of an adapter instance.

The Adapter Status tab opens. The information that appears on this page depends on the latest activity for the adapter instance.

 Example

The Adapter Status tab for a Hive adapter contains the following information:
○ Adapter Instance Name
○ Adapter Instance Process ID
○ Status
○ Last Error
○ Config Status

Task overview: Adapter installation and configuration [page 12]

Related Information

Adding an adapter instance [page 13]
Adapter Configuration options [page 14]
Starting and stopping the adapter instance [page 27]
Monitoring the adapter instances and operations [page 28]

4 Creating an adapter datastore

Create at least one adapter datastore in the SAP Data Services Designer for each adapter with which you extract or load data.

Before you complete the following steps, make sure that you complete the following tasks:

● Install and configure the adapter files for the specific adapter type, and configure an adapter instance in the SAP Data Services Management Console.
● Ensure that you have the proper access privileges to the application that the adapter serves.

Log in to Data Services.

1. Open the Datastores tab in the object library.
2. Right-click in an empty area of the tab and select New from the dropdown menu.

The datastore editor dialog box opens.

3. Enter a unique name in the Datastore Name text box.

The name can be the same as the adapter instance name.

4. Select Adapter from the Datastore Type dropdown list.
5. Select a job server from the Job Server dropdown list.
6. Choose the name of the adapter instance from the Adapter Instance Name dropdown list. Only the adapter instances that you configured for the selected job server computer appear in the list.

The Advanced options appear.

7. Complete the advanced options based on your adapter type.

Find specific adapter option descriptions in Adapter datastore configuration options [page 33]. The following table contains descriptions for some common configuration options.

Parameter Description

Username and Password Specifies the user name and password associated with the adapter database to which you’re connecting.

Web service end point or URL Specifies the URL where your service can be accessed by a client application.

Default Base64 binary field length in kilobytes (KB) Specifies the size for the Data Services varchar field. The default is 16 KB.

 Note

Binary data is encoded in ASCII using Base64 format. Data Services stores this ASCII data in a varchar field.
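As context for choosing this size, Base64 encoding expands binary data by a factor of about 4/3, so the 16-KB default holds roughly 12 KB of raw binary data. The following minimal Java sketch illustrates the arithmetic; it is an illustration of Base64 sizing, not adapter code.

 Example

import java.util.Base64;

public class Base64SizeSketch {
    public static void main(String[] args) {
        byte[] binary = new byte[12 * 1024]; // 12 KB of raw binary data
        String encoded = Base64.getEncoder().encodeToString(binary);
        // Base64 emits 4 characters for every 3 input bytes:
        // 12,288 bytes -> 16,384 characters, which fills a 16-KB varchar
        System.out.println(encoded.length() / 1024 + " KB"); // prints: 16 KB
    }
}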

8. Click OK to save values and finish creating the datastore.

Data Services saves the datastore configuration in your metadata repository and displays the new datastore in the Datastores tab of the object library.

 Note

If you don’t provide the correct connection information, such as user name and password, or if you entered an invalid option value, an error message appears when you click OK.

 Tip

While in the datastore editor, click Show ATL at the bottom of the editor to open a text window that displays the Data Services generated scripting language. The generated language shows how Data Services codes the settings for this datastore.

Related Information

Change a datastore definition
Create and manage multiple datastore configurations

5 Adapter datastore configuration options

Each type of adapter has unique datastore options to complete as well as options common to all adapter types.

For descriptions of common options, see Creating an adapter datastore [page 31]. For complete information about datastores, see the Designer Guide.

DP Bridge for SDI Outlook adapter datastore options [page 33]
When you create the datastore for the Data Provisioning (DP) Bridge for the SDI Outlook adapter, complete the specific options in both the common and Advanced sections of the datastore editor.

Hive adapter datastore configuration options [page 35]
To configure a Hive adapter datastore, include connection information to your data in Hadoop.

JDBC adapter datastore configuration options [page 37]
When you configure a JDBC adapter datastore, ensure that you complete the JDBC adapter-related advanced option.

MongoDB adapter datastore configuration options [page 38]
Complete MongoDB-specific options in the Advanced section of the adapter datastore editor.

OData adapter datastore configuration options [page 45]
When you create an OData adapter datastore, supply information such as OData version, authentication type, and scope in Advanced options.

Salesforce.com adapter datastore configuration options [page 49]
Complete the Salesforce-specific Advanced options when you create a Salesforce.com adapter datastore.

Shapefile adapter datastore configuration options [page 51]
Complete Shapefile-specific options for source and/or target purposes in the Advanced options of the datastore editor.

SuccessFactors adapter datastore configuration options [page 54]
There is one SuccessFactors-specific option in the Advanced options of the datastore editor.

Related Information

Datastores

5.1 DP Bridge for SDI Outlook adapter datastore options

When you create the datastore for the Data Provisioning (DP) Bridge for the SDI Outlook adapter, complete the specific options in both the common and Advanced sections of the datastore editor.

The following table contains the datastore options specific for the DP Bridge for SDI Outlook adapter datastore.

SDI Outlook adapter common datastore options

Option Value

Datastore Type Choose Adapter from the dropdown list.

Adapter Instance Name Choose the name of the DP Bridge adapter instance that you created in Management Console for the SDI Outlook adapter.

SDI Outlook adapter advanced options

SDI Outlook adapter option Value

PST file location Specifies the full path and file name of the PST file to access. The file must be on your local computer.

Ensure that you have permission to access the file.

Support large object Specifies whether Data Services imports BLOB and CLOB data types from the PST file.

● Yes: Imports BLOB and CLOB data types from the PST file. Yes is the default setting.
● No: Skips BLOB and CLOB data types when you execute the data flow.

 Restriction

Many fields from e-mail messages and attachments are large object data types. However, importing this data type can slow down job performance.

Default Base64 LOB field length in kilobytes (KB) Specifies a set number of kilobytes to import for each large object type field.

The default is 16 KB.

Keep in mind that this setting affects job performance. When you set the field size, consider the field content and how much of the field you want to import to make the data useful.

 Example

If the mail attachment table contains large object fields that have 20 KB of data, the default setting of 16 KB causes Data Services to import only a portion of the 20-KB field. Setting the field size to less than the imported field size could exclude valuable information on import.

If you set the option Support large object to No, the software ignores the setting in this option.

Parent topic: Adapter datastore configuration options [page 33]

Related Information

Hive adapter datastore configuration options [page 35]
JDBC adapter datastore configuration options [page 37]
MongoDB adapter datastore configuration options [page 38]
OData adapter datastore configuration options [page 45]
Salesforce.com adapter datastore configuration options [page 49]
Shapefile adapter datastore configuration options [page 51]
SuccessFactors adapter datastore configuration options [page 54]
The DP Bridge adapter [page 67]

5.2 Hive adapter datastore configuration options

To configure a Hive adapter datastore, include connection information to your data in Hadoop.

The following table contains descriptions for the datastore configuration options that apply to the Hive adapter datastore.

Hive adapter datastore option descriptions

Option Description

Datastore Type Select Adapter.

Adapter Instance Name Select the specific instance that you created in the Management Console.

Advanced options

User name Specifies the user name associated with the data to which you are connecting.

If you select Kerberos for the Authentication, include the Kerberos realm with the user name. For example: dsuser@BIGDATA.COM.

If you select Kerberos keytab for the Authentication, do not complete the User name option.

Password Specifies the password associated with the data to which you are connecting.

Local working directory Specifies the path to your local working directory.


HDFS working directory Specifies the path to your Hadoop Distributed File System (HDFS) directory. If you leave this blank, Data Services uses /user/sapds_hivetmp as the default.

 Note

If you use Beeline CLI, enter the directory that your administrator created, and assign permission 755 to each directory in the path.

String size Specifies the size of the Hive STRING data type. The default is 100.

SSL enabled Specifies whether to use SSL (Secure Socket Layer), or the newer Transport Layer Security (TLS), for secure communication over the network.

Select Yes to use an SSL connection to connect to the Hive server.

 Note

If you use Kerberos or Kerberos keytab for authentica­ tion, set this option to No.

SSL Trust Store Specifies the path and file name of the trust store that verifies credentials and stores certificates.

Trust Store Password Specifies the password associated with the trust store.

Authentication Indicates the type of authentication you are using for the Hive connection:

● None
● Kerberos
● Kerberos keytab

 Note

Complete the remaining Kerberos options based on your selection for Authentication.


Additional Properties Specifies additional connection properties.

For multiple property value pairs, use a semicolon as a delimiter between pairs. End the string of property values with a semicolon.

 Example

name1=value1; name2=value2;

To enable SASL-QOP support, set the Authentication option to Kerberos. Then enter one of the following values, which should match the value on the Hive server:

● Authentication only: ;sasl.qop=auth;
● Authentication with integrity protection: ;sasl.qop=auth-int;
● Authentication with integrity and confidentiality protection: ;sasl.qop=auth-conf;

Parent topic: Adapter datastore configuration options [page 33]

Related Information

DP Bridge for SDI Outlook adapter datastore options [page 33]
JDBC adapter datastore configuration options [page 37]
MongoDB adapter datastore configuration options [page 38]
OData adapter datastore configuration options [page 45]
Salesforce.com adapter datastore configuration options [page 49]
Shapefile adapter datastore configuration options [page 51]
SuccessFactors adapter datastore configuration options [page 54]
Configuring Kerberos authentication for Hive connection
Hive adapter datastores [page 70]

5.3 JDBC adapter datastore configuration options

When you configure a JDBC adapter datastore, ensure that you complete the JDBC adapter-related advanced option.

When you configure an adapter datastore for JDBC, there is one option to complete in addition to the common and advanced options in the datastore editor.

JDBC adapter option description

Option Description

Convert unknown data type to VARCHAR Specifies whether SAP Data Services imports unsupported data types as VARCHAR.

● Yes: Imports all unsupported data types as VARCHAR.
● No: Ignores the metadata column for unsupported data types during import.

Parent topic: Adapter datastore configuration options [page 33]

Related Information

DP Bridge for SDI Outlook adapter datastore options [page 33]
Hive adapter datastore configuration options [page 35]
MongoDB adapter datastore configuration options [page 38]
OData adapter datastore configuration options [page 45]
Salesforce.com adapter datastore configuration options [page 49]
Shapefile adapter datastore configuration options [page 51]
SuccessFactors adapter datastore configuration options [page 54]
Metadata mapping for JDBC [page 60]

5.4 MongoDB adapter datastore configuration options

Complete MongoDB-specific options in the Advanced section of the adapter datastore editor.

The following table contains configuration options that apply to the MongoDB adapter.

MongoDB adapter Advanced option descriptions

MongoDB options Description

Server host Specifies the host name or IP address of the database server to which the datastore connects.

Server port Specifies the port number of the database server to which the datastore connects.

 Note

If you use MongoDB sharded clusters, specify the port for the mongos instance.


Authentication Type Specifies the authentication type for the MongoDB connec­ tion.

● MongoDB-CR: Authenticates using a challenge-response mechanism with a user name and password.
● LDAP: Authenticates using a Lightweight Directory Access Protocol (LDAP) service.

 Note

Use only secure encrypted or trusted connections between the client and the server, and between saslauthd and the LDAP server. The LDAP server uses the SASL PLAIN mechanism to send and receive data in plain text. Use a trusted channel, such as a VPN, an encrypted connection with SSL, or a wired network.

● Kerberos: Authenticates with the MongoDB server using Username and KeyTab. Kerberos uses tickets to authenticate, which means that passwords aren’t stored locally or sent over the internet. When using Kerberos, first export the KeyTab file and then copy it to the machine that runs the MongoDB adapter instance.

 Note

Kerberos is supported in MongoDB Enterprise version 2.4 and later.

For more information about Kerberos, visit http://web.mit.edu/kerberos/.

● SCRAM-SHA-1: Authenticates user credentials against the user name, password, and the database on which the user was created.

 Note

SCRAM-SHA-1 is the preferred mechanism for MongoDB versions 3.0 and later. It isn’t supported in earlier versions.

● No Authentication: Doesn’t use authentication. No Authentication is the default setting.


Username Specifies the user name associated with the selected authentication type.

Required for LDAP, MongoDB-CR, Kerberos, and SCRAM-SHA-1 authentication.

Password Specifies the password associated with the selected authentication type.

Required for LDAP, MongoDB-CR, Kerberos, and SCRAM-SHA-1 authentication.

Kerberos Realm Specifies the name of the applicable Kerberos realm.

 Note

The Realm name is case-sensitive.

A realm contains the services, host machines, and so on, that users can access.

Required for Kerberos authentication.

Kerberos KDC Specifies the hostname of the Key Distribution Center (KDC).

Secret keys for user machines and services are stored in the KDC database.

Required for Kerberos authentication.

Kerberos KeyTab Specifies the path to the .keytab file.

The .keytab file stores long-term keys for one or more principals.

 Note

The .keytab file must be accessible on the machine that runs the MongoDB adapter instance.

Required for Kerberos authentication.

Varchar size Specifies the length of the varchar type for string columns.

The default is 1024.

Varchar size determines the column size during table import. If an actual value is longer than the specified length, Data Services truncates the string during import.


Rows to scan Specifies the number of scanned records to generate the metadata schema during import and the number of rows to display when previewing document data.

 Example

Enter -1 to scan and display all rows. Enter 50 to scan and display 50 rows.

The default setting is 100.

Sample directory Specifies the location of the folder that stores files that you name to match the collection names on MongoDB.

 Example

You have a folder named c:\mongo_sample\. The folder has three files:

● a.json
● b.json
● c.json

You set the Sample directory option to c:\mongo_sample\ and import collections named a, b, and c.

The datastore is now able to find the corresponding files and use those files to generate schemas for the collections.

 Note

The Job Server must be able to access the folders and files you’re using because the Job Server may not be on the same machine as the Data Services Designer application.

If Data Services doesn’t find a file name that matches the collection name in the location that you specify, it generates a schema from data in the MongoDB server.


Use cache Indicates whether Data Services stores the generated schema in a cache file.

● Yes: Stores the generated schema in /ext/mongo/mcache/ on the Job Server machine that contains the adapter instance.

 Note

To use cache, Data Services must be able to access the directory.

● No: Doesn’t use the cached schema but generates a new schema.

To understand how the Use cache option impacts Data Services processing, view the following scenarios.

 Example

Scenario 1 uses repository 1 and Job Server 1.

With the Use cache option set to Yes, Data Services generates a cache file when you import a collection for the first time. When you reimport the collection, and Use cache is still set to Yes, Data Services reads the metadata from the cache file.

 Note

Reading from a cache file is faster, but the cache file doesn’t pick up any schema changes that you make after Data Services initially created the cache file.

If Use cache is set to No when you reimport, Data Services scans the collection and generates a new schema instead of reading the schema from the cache file.

 Example

Scenario 2 uses repository 2 and Job Server 2.

You now switch to repository 2 and Job Server 2. You set the Use cache option to Yes and you import the same collection that you used with repository 1.

Data Services tries to access the cache from Job Server 2 but, because the cache file resides on the Job Server 1 machine, it can't find the cache file. Instead, Data Services scans the collection and creates a new schema.


To reuse the cache generated by Job Server 1, ensure that Job Server 1 and Job Server 2 are under the same Data Services installation. Alternatively, manually copy the cache file to the folder from which Job Server 2 reads the cache.

Use SSL Indicates whether to use Secure Sockets Layer (SSL), with or without a privacy-enhanced mail (PEM) file, to connect to MongoDB.

 Note

SSL improves security for data exchange, but it can reduce application performance. The extent of the reduction depends on the SSL configuration parameters and the hardware you use.

SSL PEM File Specifies the path to the SSL PEM file to use when Data Services connects to a MongoDB instance that requires client certificates.

If you don't provide a path to a PEM file, Data Services connects using SSL without a certificate.

 Note

Data Services doesn’t support a passphrase-protected certificate in this case. If the .pem file is passphrase-encrypted, decrypt the file with the passphrase before you use it.

Replica Set Specifies whether to connect to a replica set.

● Yes: Connects using a replica set.

 Note

Also set the Secondary servers option.

● No: Doesn't connect using a replica set.

 Note

If you set Sharded Cluster to Yes, Data Services ignores this setting.


Secondary servers Specifies the name of the secondary database server that Data Services uses for the replica set.

Required when you select Yes for the Replica Set option.

When you enter the value for Secondary servers for the replica set, use a comma to separate multiple secondary database server names and ports. For example: <host1>:<port1>,<host2>:<port2>,<host3>:<port3>.

Sharded Cluster Specifies whether to connect to a routing service (mongos) as a front end to a MongoDB sharded cluster.

● Yes: Connects to a routing service as a front end to a sharded cluster.

 Note

If you set this option to Yes, Data Services ignores the Replica Set option.

If you select Yes, also enter the port for the mongos instance into the Server port option.
● No: Doesn't connect to a routing service as a front end to a sharded cluster.
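
 Example

The following sketch shows how several of these options map onto a MongoDB connection. It uses the MongoDB Java driver purely for illustration; it is not the adapter's internal implementation, and the host names, replica set name, and credentials are placeholder assumptions.

import com.mongodb.ConnectionString;
import com.mongodb.MongoClientSettings;
import com.mongodb.MongoCredential;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;

public class MongoOptionsSketch {
    public static void main(String[] args) {
        // Server host/port plus Secondary servers as comma-separated
        // host:port pairs mirrors Replica Set = Yes; ssl=true mirrors
        // Use SSL = Yes. All values are placeholders.
        ConnectionString uri = new ConnectionString(
            "mongodb://primary.example.com:27017,secondary1.example.com:27017"
            + "/?replicaSet=rs0&ssl=true");

        // Mirrors Authentication Type = SCRAM-SHA-1: the user name, password,
        // and the database on which the user was created. For Kerberos, the
        // driver equivalent is MongoCredential.createGSSAPICredential, and
        // the keytab must be readable on the adapter machine.
        MongoCredential credential = MongoCredential.createScramSha1Credential(
            "dsUser", "admin", "dsPassword".toCharArray());

        MongoClientSettings settings = MongoClientSettings.builder()
            .applyConnectionString(uri)
            .credential(credential)
            .build();

        try (MongoClient client = MongoClients.create(settings)) {
            // List collection names to verify the connection.
            client.getDatabase("admin").listCollectionNames()
                  .forEach(System.out::println);
        }
    }
}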

Parent topic: Adapter datastore configuration options [page 33]

Related Information

DP Bridge for SDI Outlook adapter datastore options [page 33]
Hive adapter datastore configuration options [page 35]
JDBC adapter datastore configuration options [page 37]
OData adapter datastore configuration options [page 45]
Salesforce.com adapter datastore configuration options [page 49]
Shapefile adapter datastore configuration options [page 51]
SuccessFactors adapter datastore configuration options [page 54]
MongoDB metadata [page 134]

5.5 OData adapter datastore configuration options

When you create an OData adapter datastore, supply information such as OData version, authentication type, and scope in the Advanced options.

 Note

Before you create the OData adapter datastore:

If you plan to use OAuth 2.0 authentication, register your application in the Microsoft Azure Portal to obtain the following information:

● Client ID
● Scope (permissions)
● Client secret
● Token endpoint

 Note

If you are using an HTTPS URL for an OData service, you must have the right certificates imported into the Java Keystore. To import the certificate, do the following:

1. Check the SSL certificate (for an HTTPS URL only).
2. Download the CA (certificate) from the browser.
3. Copy the certificate to %LINK_DIR%\ssl\trusted_certs.
4. Run %LINK_DIR%\bin\SetupJavaKeystore.bat.
5. Restart the JobServices service.
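
 Example

After running SetupJavaKeystore.bat, you can verify that the certificate was imported. The following sketch is one way to check from Java; the keystore path, keystore type, password, and alias are placeholder assumptions, and keytool -list performs the same check from the command line.

import java.io.FileInputStream;
import java.security.KeyStore;

public class KeystoreCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder path: point this at the keystore that
        // SetupJavaKeystore.bat maintains for Data Services.
        String keystorePath = System.getenv("LINK_DIR")
            + "\\ssl\\trusted_certs\\jssecacerts.keystore";

        KeyStore ks = KeyStore.getInstance("JKS"); // assumption: JKS keystore
        try (FileInputStream in = new FileInputStream(keystorePath)) {
            ks.load(in, "changeit".toCharArray()); // assumption: default password
        }

        // "odata_ca" is a placeholder alias for the imported CA certificate.
        System.out.println("Certificate present: " + ks.containsAlias("odata_ca"));
    }
}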

The following table contains Advanced configuration option descriptions for the OData Adapter.

OData Advanced option descriptions

OData option Description

Default varchar length Specifies the default size for the Data Services varchar field.

Depth level Specifies whether the OData data contains navigation properties.

● 1: OData data doesn't contain navigation properties.
● 2: OData data contains navigation properties.

Read about navigation properties in your OData documentation.


OData version Specifies the OData version.

● V1
● V2: V2 is the default setting.
● V4
● AUTO: Data Services detects the version from the URL.

 Note

Data Services doesn’t support job migration between OData V2 and V4 because each version uses different metadata. Data Services doesn’t support OData V3.

The OData adapter uses the odata4j library, which supports V1 and V2. It also uses the Apache Olingo library, which supports V2 and V4. For more information about OData libraries, see http://www.odata.org/libraries/.

Require CSRF Header Specifies whether to use a Cross-Site Request Forgery (CSRF) token to provide additional security when Data Services writes data to an OData API.

● True: Uses a CSRF token to provide additional security.

 Note

Using a CSRF token is supported only for OData V2.

● False: Doesn't use a CSRF token to provide additional security.


Authentication type Specifies the authentication method to use when connecting to OData.

● Basic: Uses Username and Password for authentication. Basic is the default setting.
● OAuth 2.0: Uses Microsoft Graph API v4 for authentication.

When you select OAuth 2.0, you need a token from the Azure Active Directory (AD) v2.0 endpoint. The service uses the token to call Microsoft Graph API v4 under its own identity. The following list outlines the basic steps to configure a service and obtain a token (a token-request sketch follows this table):

1. Register your application in the Azure Portal.
2. Configure permissions for Microsoft Graph for your application.
3. Get administrator consent.
4. Get an access token.
5. Use the access token to call Microsoft Graph.

 Restriction

Perform steps 1 through 3 before configuring the datastore.

To find specific instructions for the steps, go to the Microsoft Graph API Web page.

Grant Type Specifies the grant type. Select client-credentials for Microsoft Graph API.

Client Id Specifies the unique application (client) ID assigned by Azure AD when you clicked Register in the Register an application page in the Microsoft Azure portal.

Applicable only when you select OAuth 2.0 for Authentication type and you use Microsoft Graph API.

Token Endpoint Specifies the Azure AD v2.0 /token token endpoint.

Data Services uses the endpoint to communicate with the Microsoft platform.

Applicable only when you select OAuth 2.0 for Authentication type and you use Microsoft Graph API.


Client Secret Specifies the password that the application uses to authenticate with the Microsoft identity platform.

Obtain the client secret when you register your application on the Microsoft Azure Portal.

Applicable only when you select OAuth 2.0 for Authentication type and you use Microsoft Graph API.

Scope Specifies the scope (permissions) applicable for the request.

Set the permissions when you register your application on the Microsoft Azure Portal. The value passed for the scope parameter in this request consists of the following elements:

● The application ID URI assigned when you registered the application.
● The default suffix .default.

 Example

For Microsoft Graph, the value is https://graph.microsoft.com/.default.

This value requests tokens from the Azure AD v2.0 endpoint for the application resources for which you have permission.

Applicable only when you select OAuth 2.0 for Authentication type and you use Microsoft Graph API.
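
 Example

The following sketch shows the client-credentials token request that these options describe, using the standard Azure AD v2.0 token protocol and the Java 11 HTTP client. It illustrates the protocol rather than what the adapter executes internally; the tenant ID, client ID, and client secret are placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TokenRequestSketch {
    public static void main(String[] args) throws Exception {
        // Token Endpoint option: the Azure AD v2.0 /token endpoint.
        // YOUR_TENANT_ID is a placeholder for your directory (tenant) ID.
        String tokenEndpoint =
            "https://login.microsoftonline.com/YOUR_TENANT_ID/oauth2/v2.0/token";

        // Grant Type, Client Id, Client Secret, and Scope map onto the form
        // fields of the request body. Credential values are placeholders; the
        // scope is the URL-encoded form of https://graph.microsoft.com/.default.
        String form = "grant_type=client_credentials"
            + "&client_id=YOUR_CLIENT_ID"
            + "&client_secret=YOUR_CLIENT_SECRET"
            + "&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(tokenEndpoint))
            .header("Content-Type", "application/x-www-form-urlencoded")
            .POST(HttpRequest.BodyPublishers.ofString(form))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

        // The JSON response carries an access_token field; the caller then
        // presents it as "Authorization: Bearer <token>" on Graph API calls.
        System.out.println(response.body());
    }
}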

Parent topic: Adapter datastore configuration options [page 33]

Related Information

DP Bridge for SDI Outlook adapter datastore options [page 33]
Hive adapter datastore configuration options [page 35]
JDBC adapter datastore configuration options [page 37]
MongoDB adapter datastore configuration options [page 38]
Salesforce.com adapter datastore configuration options [page 49]
Shapefile adapter datastore configuration options [page 51]
SuccessFactors adapter datastore configuration options [page 54]
Metadata mapping for OData [page 63]
OData adapter [page 136]

5.6 Salesforce.com adapter datastore configuration options

Complete the Salesforce-specific Advanced options when you create a Salesforce.com adapter datastore.

The following table contains descriptions for the Advanced options in the datastore editor to complete when you create or edit a Salesforce.com adapter datastore.

Salesforce.com option descriptions

Salesforce.com options Description

Batch size Specifies the batch size to use in queries.

When loading data, Salesforce.com accepts a maximum of 200 rows per batch.

Enable CDC Specifies whether to enable changed data capture (CDC) for this datastore.

● Yes: Uses CDC when loading data to the target.
● No: Doesn't use CDC when loading data to the target. No is the default setting.

Disable CDC deleted record Specifies whether to retrieve deleted records for a CDC load.

● Yes: Retrieves records deleted for a CDC load.
● No: Doesn't retrieve records deleted for a CDC load.

Disable CDC upserted records Specifies whether to retrieve upserted records for a CDC load.

● Yes: Retrieves upserted records for a CDC load.
● No: Doesn't retrieve upserted records for a CDC load.

Disable HTTP chunking Specifies whether Data Services uses HTTP chunking.

● Yes: Uses HTTP chunking.

 Note

If you get the following error when trying to browse metadata, set this parameter to Yes: There was a communication error when talking to Salesforce.com: Transport error: 411 Error: Length Required

● No: Doesn't use HTTP chunking.

HTTP chunking reduces the number of new connections by including larger chunks of data per transaction.


Convert Date value to UTC Specifies whether Data Services converts date data to UTC format when reading and loading data.

● Yes: Converts date data to UTC format when reading and loading data.
● No: Doesn't convert date data to UTC format when reading and loading data.

Metadata resilience? Specifies whether the adapter sends an error message to Data Services when it reads from changed data capture (CDC) sources.

● No: Issues error messages when reading from CDC sources.
● Yes: Doesn't issue error messages when reading from CDC sources.

When you select Yes, the adapter behaves as follows in these situations:

When reading from normal or CDC sources:

● When the CDC source no longer exists, the adapter doesn’t send an error message to Data Services.
● When a source field no longer exists, the adapter returns a NULL value for that field to Data Services.
● When a source field in a Query WHERE clause no longer exists, all conditions that use that field automatically evaluate to “FALSE”, which can reduce the clause.

 Example

WHERE clause: 'WHERE ColumnA = A and (ColumnB = B or ColumnC = C)'

When ColumnC no longer exists, the adapter processes the clause using “FALSE” in place of ColumnC:

'WHERE ColumnA = A and (ColumnB = B or FALSE)'

which reduces to:

'WHERE ColumnA = A and ColumnB = B'

When loading data to Salesforce.com:

● When a table no longer exists, the adapter doesn't load that table to Salesforce.com.
● When a column no longer exists, the adapter doesn't load that column to Salesforce.com.

Parent topic: Adapter datastore configuration options [page 33]

Related Information

DP Bridge for SDI Outlook adapter datastore options [page 33]
Hive adapter datastore configuration options [page 35]
JDBC adapter datastore configuration options [page 37]
MongoDB adapter datastore configuration options [page 38]
OData adapter datastore configuration options [page 45]
Shapefile adapter datastore configuration options [page 51]
SuccessFactors adapter datastore configuration options [page 54]
Salesforce.com adapter [page 143]

5.7 Shapefile adapter datastore configuration options

Complete Shapefile-specific options for source and target purposes in the Advanced options of the datastore editor.

Before you create a Shapefile adapter datastore, keep the following information in mind:

● Each Shapefile consists of one set of .dbf, .shp, and .shx files. Separate any duplicate files into multiple subdirectories so that each file folder contains only one .dbf, .shp, and .shx file.
● Before you use Shapefile data as a source in SAP Data Services, perform the following steps:
  1. Open the Adapter source page.
  2. Find the DBF File Charset field.
  3. Add the Java code page number associated with the Shapefile language to the field. Find the code online.

 Note

To load the Shapefile data into SAP HANA, create the appropriate spatial reference system in SAP HANA. For information on spatial reference systems, see the SAP HANA Spatial Reference Guide.

The following table contains descriptions for Shapefile-specific options in the Advanced section of the datastore editor.

Shapefile Advanced option descriptions

Option Description

Directory path Specifies the directory that contains subdirectories of shapefile formats.


Import unsupported data types as varchar Specifies whether SAP Data Services imports unsupported data types as VarChar.

● Yes: Imports unsupported data types as VarChar. Yes is the default value.
● No: Doesn't import unsupported data types as VarChar.

VARCHAR size for unknown data type Specifies the length of the VarChar data type when you select Yes for Import unsupported data types as varchar.

The default value is 255.

Include shapefile name as column Indicates whether Data Services includes the Shapefile name as a column for each row.

● Yes: Includes the Shapefile name as a column for each row.
● No: Doesn't include the Shapefile name as a column for each row. No is the default value.

Column name for shapefile name Specifies the name for the column that includes the Shapefile name.

The default value is DI_SHAPEFILE_NAME.

VARCHAR size of column name for shapefile Specifies the VarChar length of the column that includes the Shapefile name.

Applicable when you specify a value for Column name for shapefile name option.

Include rowid column Specifies whether Data Services includes a rowid column for each row.

● Yes: Includes a unique row ID in a rowid column for each row in the table.
● No: Doesn't include a unique row ID in a rowid column for each row in the table. No is the default value.

Column name for rowid Specifies the name for the rowid column.

The default value is DI_ROWID.

Applicable when you select Yes for Include rowid column.

The following table contains Shapefile-specific option descriptions in the datastore editor related to reading Shapefile data.

Shapefile reader option descriptions

Option Description

Batch size Specifies the number of rows the shapefile adapter sends in a batch.

The default value is 10.

This value affects the amount of memory Data Services uses. The higher the number, the higher the amount of memory Data Services uses.

DBF File Charset Specifies the Java code page number associated with the language of the source shapefile.

 Note

This value is the same value that you enter into the DBF File Charset field in the Adapter Source page.

Find the Java code page number online.

Include full path in shapefile name Specifies whether to include the full path of the source shapefile in the shapefile name.

For more information about using Shapefile, see the SAP HANA Spatial Reference Guide in the SAP Help Portal, SAP HANA Platform.

Parent topic: Adapter datastore configuration options [page 33]

Related Information

DP Bridge for SDI Outlook adapter datastore options [page 33]
Hive adapter datastore configuration options [page 35]
JDBC adapter datastore configuration options [page 37]
MongoDB adapter datastore configuration options [page 38]
OData adapter datastore configuration options [page 45]
Salesforce.com adapter datastore configuration options [page 49]
SuccessFactors adapter datastore configuration options [page 54]

5.8 SuccessFactors adapter datastore configuration options

There is one SuccessFactors-specific option in the Advanced options of the datastore editor.

SuccessFactors Advanced option description

SuccessFactors option Description

Company ID Specifies a unique company ID that identifies the SuccessFactors client instance.

Parent topic: Adapter datastore configuration options [page 33]

Related Information

DP Bridge for SDI Outlook adapter datastore options [page 33]
Hive adapter datastore configuration options [page 35]
JDBC adapter datastore configuration options [page 37]
MongoDB adapter datastore configuration options [page 38]
OData adapter datastore configuration options [page 45]
Salesforce.com adapter datastore configuration options [page 49]
Shapefile adapter datastore configuration options [page 51]
SuccessFactors adapter [page 155]

6 Browse and import metadata

After you create an adapter datastore for the applicable adapter, view and import metadata to use as a source or a target in your data flows.

View adapter metadata by right-clicking the datastore name and selecting Open. The Metadata explorer opens in the Designer window. The data available to view and import is based on the adapter type. Some metadata appears as a list of table names with the table icon. Some objects appear in a file tree with expandable nodes. Double-click an object, such as a table name, to open the object metadata. For example, open a table to view the table schema.

The import options available are based on the adapter type.

 Example

For Salesforce.com, the following import options are available when you right-click an object:

Option What is imported

Table node The selected table.

Reference by node All tables directly under the selected node.

Reference node All tables directly under the selected node.

For complete instructions for browsing and importing metadata using a datastore, see the “Database datastores” section of the Designer Guide.

Related Information

Creating an adapter datastore [page 31]
Datastore metadata
Imported metadata from database datastores

7 Adapter metadata mapping

Each application that you access using an adapter has its own data types that may or may not match SAP Data Services data types.

SAP Data Services maps imported data types to its own data types for processing. Then it converts the data types back to the application data types when it loads generated data.

For data type mapping between a specific adapter type and Data Services, see the applicable mapping topic. For more information about Data Services data types and data type conversion, see the Designer Guide.

 Note

For information about Shapefile data types, see the SAP HANA Spatial Reference Guide in the SAP Help Portal, SAP HANA Platform. Also, to learn how to work with spatial data in SAP HANA, see the Data Services Supplement for Big Data.

Metadata mapping for DP Bridge for SDI Outlook data [page 57] SAP Data Services matches SDI Outlook PST metadata data types to supported data types when it reads data from SDI Outlook sources.

Metadata mapping for Hive [page 58] SAP Data Services matches Hive metadata data types to supported data types when it reads data from Hive adapter sources.

Metadata mapping for JDBC [page 60] SAP Data Services matches JDBC metadata data types to supported data types when it reads data from JDBC adapter sources.

Metadata mapping for MongoDB [page 62] SAP Data Services matches MongoDB metadata data types to supported data types when it reads data from MongoDB adapter sources.

Metadata mapping for OData [page 63] SAP Data Services matches OData metadata data types to supported data types when it reads data from OData adapter sources.

Metadata mapping for Salesforce.com [page 64] SAP Data Services matches Salesforce.com metadata data types to supported data types when it reads data from Salesforce.com adapter sources.

Metadata mapping for SuccessFactors [page 65] SAP Data Services matches SuccessFactors metadata data types to supported data types when it reads data from SuccessFactors adapter sources.

Related Information

Data Services data types
Data type conversion

Using spatial data with SAP HANA

7.1 Metadata mapping for DP Bridge for SDI Outlook data

SAP Data Services matches SDI Outlook PST metadata data types to supported data types when it reads data from SDI Outlook sources.

The following table shows the data type mapping between Outlook PST data types and Data Services data types.

 Note

PST files contain BLOB and CLOB data types. Importing these data types can slow down job performance. Additionally, Data Services has specific limitations for BLOB data types. In general, you can’t use blob data-type columns in comparisons, calculations, or data type conversions. See “Limitations for long and blob” in the Reference Guide for additional restrictions and information.

Data type mapping between Outlook and Data Services

Outlook PST data type Data Services data type

Tinyint Int

Integer Int

Smallint Int

Alphanum Varchar

NVarchar Varchar

Varchar Varchar

Bigint Double

Time Time

Seconddate Datetime

Timestamp Datetime

Date Datetime

Double Double

Real Real

Decimal Decimal

NClob Varchar


Blob Varchar

Clob Varchar

Varbinary Varchar

Parent topic: Adapter metadata mapping [page 56]

Related Information

Metadata mapping for Hive [page 58]
Metadata mapping for JDBC [page 60]
Metadata mapping for MongoDB [page 62]
Metadata mapping for OData [page 63]
Metadata mapping for Salesforce.com [page 64]
Metadata mapping for SuccessFactors [page 65]
Limitations for long and blob data types

7.2 Metadata mapping for Hive

SAP Data Services matches Hive metadata data types to supported data types when it reads data from Hive adapter sources.

The following table shows the conversion between Hive data types and Data Services data types when Data Services imports metadata from a Hive source or target.

Data type mapping between Hive and Data Services

Hive data type Data Services data type

tinyint int

smallint int

int int

bigint decimal(20,0)

float real

double double

string varchar

boolean varchar(5)


complex not supported

Parent topic: Adapter metadata mapping [page 56]

Related Information

Metadata mapping for DP Bridge for SDI Outlook data [page 57]
Metadata mapping for JDBC [page 60]
Metadata mapping for MongoDB [page 62]
Metadata mapping for OData [page 63]
Metadata mapping for Salesforce.com [page 64]
Metadata mapping for SuccessFactors [page 65]

7.2.1 Apache Hive data type conversion

SAP Data Services converts some Apache Hive data types when importing data and when loading data into external tables or files.

The following table shows the conversion between Apache Hive data types and Data Services data types. Data Services converts Apache Hive data types to Data Services data types when you import metadata from an Apache Hive source or target into the repository. Data Services also converts data types back to Apache Hive data types when it loads data into an external table or file.

Hive data type Data Services data type Additional information

TINYINT INT

SMALLINT INT

INT/INTEGER INT

BIGINT DECIMAL(19,0) By default, the precision is 19.

FLOAT DOUBLE

DOUBLE DOUBLE

DECIMAL DECIMAL

VARCHAR VARCHAR

CHAR VARCHAR


STRING VARCHAR(255)

BOOLEAN INT

TIMESTAMP DATETIME

Date Date

INTERVAL Not Supported Available with Hive 1.2.0 and later

complex Not Supported Complex types are array, map, and so on.

If Data Services encounters a column that has an unsupported data type, it does not import the column. However, you can configure Data Services to import unsupported data types. In the applicable datastore, check the Import unsupported data types as VARCHAR of size checkbox located in the left corner of the datastore editor dialog box.

7.3 Metadata mapping for JDBC

SAP Data Services matches JDBC metadata data types to supported data types when it reads data from JDBC adapter sources.

The following table shows the conversion between JDBC data types and Data Services data types when Data Services imports metadata from a JDBC source.

 Note

Data Services has specific limitations for BLOB data types. In general, you can’t use blob data-type columns in comparisons, calculations, or data type conversions. See “Limitations for long and blob” in the Reference Guide for additional restrictions and information.

SQL JDBC/Java data type AWDataType

VARCHAR java.lang.String AWT_VARCHAR

CHAR java.lang.String AWT_VARCHAR

NVARCHAR java.lang.String AWT_VARCHAR

LONGVARCHAR java.lang.String AWT_VARCHAR

BIT boolean AWT_VARCHAR

NUMERIC java.math.BigDecimal AWT_DECIMAL

TINYINT byte AWT_INT

SMALLINT short AWT_INT


INTEGER int AWT_INT

BIGINT long AWT_DECIMAL

REAL float AWT_REAL

FLOAT float AWT_DOUBLE

DOUBLE double AWT_DOUBLE

VARBINARY byte[] Not supported

BINARY byte[] Not supported

DATE java.sql.Date AWT_DATETIME

TIME java.sql.Time AWT_TIME

TIMESTAMP java.sql.Timestamp AWT_TIME

CLOB java.sql.Clob Not supported

BLOB java.sql.Blob Not supported

ARRAY java.sql.Array Not supported

REF java.sql.Ref Not supported

STRUCT java.sql.Struct Not supported
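
 Example

The mapping in this table is defined against the SQL type codes that the standard JDBC metadata API reports. The following sketch shows where those codes come from; the mapping logic is a simplified illustration of the table, not the adapter's actual code, and the in-memory H2 connection URL is a placeholder that assumes the H2 driver is on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Types;

public class JdbcTypeSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:demo")) {
            conn.createStatement().execute(
                "CREATE TABLE t (c1 VARCHAR(10), c2 INTEGER, c3 TIMESTAMP)");

            // DatabaseMetaData.getColumns reports the java.sql.Types code
            // (DATA_TYPE column) that drives the mapping in the table above.
            try (ResultSet cols = conn.getMetaData()
                    .getColumns(null, null, "T", null)) {
                while (cols.next()) {
                    System.out.println(cols.getString("COLUMN_NAME")
                        + " -> " + describe(cols.getInt("DATA_TYPE")));
                }
            }
        }
    }

    // Simplified restatement of the mapping table, for illustration only.
    static String describe(int sqlType) {
        switch (sqlType) {
            case Types.VARCHAR:
            case Types.CHAR:
            case Types.LONGVARCHAR: return "AWT_VARCHAR";
            case Types.TINYINT:
            case Types.SMALLINT:
            case Types.INTEGER:     return "AWT_INT";
            case Types.NUMERIC:
            case Types.BIGINT:      return "AWT_DECIMAL";
            case Types.REAL:        return "AWT_REAL";
            case Types.FLOAT:
            case Types.DOUBLE:      return "AWT_DOUBLE";
            case Types.DATE:        return "AWT_DATETIME";
            case Types.TIME:
            case Types.TIMESTAMP:   return "AWT_TIME";
            default:                return "not supported";
        }
    }
}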

Parent topic: Adapter metadata mapping [page 56]

Related Information

Metadata mapping for DP Bridge for SDI Outlook data [page 57]
Metadata mapping for Hive [page 58]
Metadata mapping for MongoDB [page 62]
Metadata mapping for OData [page 63]
Metadata mapping for Salesforce.com [page 64]
Metadata mapping for SuccessFactors [page 65]
Limitations for long and blob data types
Limitations for long
Unsupported data types

7.4 Metadata mapping for MongoDB

SAP Data Services matches MongoDB metadata data types to supported data types when it reads data from MongoDB adapter sources.

 Note

Data type mapping doesn’t always follow what is described in this table. For the same key with different object types, Data Services uses xs:string as a general data type.

MongoDB Schema Data Services Notes

String xs:string varchar

Double xs:double double

Integer xs:integer int

Boolean xs:boolean varchar

Date xs:datetime datetime

Timestamp xs:datetime datetime

ObjectId xs:string varchar If the “_id” type is ObjectId, then the value displayed in Data Services appears as: ObjectId("5330fb1052935853002e54fa")

BINARY xs:string varchar

NumberDecimal xs:decimal decimal Applicable for MongoDB version 3.4 and higher.

xs:string varchar If the data doesn’t contain “E” (scientific notation), it’s identified as decimal(28,2) by default.

If the data contains “E”, it’s parsed into varchar to preserve the full expression of the data.

Other xs:string varchar

Parent topic: Adapter metadata mapping [page 56]

Related Information

Metadata mapping for DP Bridge for SDI Outlook data [page 57]
Metadata mapping for Hive [page 58]
Metadata mapping for JDBC [page 60]
Metadata mapping for OData [page 63]
Metadata mapping for Salesforce.com [page 64]
Metadata mapping for SuccessFactors [page 65]

7.5 Metadata mapping for OData

SAP Data Services matches OData metadata data types to supported data types when it reads data from OData adapter sources.

The following table shows how Data Services maps OData data types.

OData Data Services data types Notes

Int16, Int32, Int64 int

Double double

String varchar

Boolean varchar(5) Boolean true/false value.

Datetime datetime OData V2

DateTimeOffset datetime

Binary varchar In Base64 format. Size is defined in a datastore parameter named Default varchar length.

Byte int

Decimal decimal(20,0)

Single double

Float double OData V2

Guid varchar

SByte int

TimeOfDay time OData V4

Duration time OData V4

Date datetime OData V4

Parent topic: Adapter metadata mapping [page 56]

Related Information

Metadata mapping for DP Bridge for SDI Outlook data [page 57]
Metadata mapping for Hive [page 58]
Metadata mapping for JDBC [page 60]
Metadata mapping for MongoDB [page 62]
Metadata mapping for Salesforce.com [page 64]
Metadata mapping for SuccessFactors [page 65]

7.6 Metadata mapping for Salesforce.com

SAP Data Services matches Salesforce.com metadata data types to supported data types when it reads data from Salesforce.com adapter sources.

The following table shows the conversion between Salesforce.com data types and Data Services data types.

Salesforce.com data type Data Services data types Description

xsd:base64Binary varchar Base 64-encoded binary data

xsd:boolean varchar ('true' or 'false') Boolean (True/False) values

xsd:date date Date values

xsd:datetime datetime Date/time values (timestamps)

xsd:double decimal Double values

xsd:int int Integer values

xsd:string varchar Character strings

The date/time values that the Salesforce.com adapter retrieves from Salesforce.com are in ISO 8601 format, reflect GMT time, and include a time zone field. To adjust for time zone differences, the Salesforce.com adapter automatically performs a translation based on the associated local and server clocks. When the Salesforce.com adapter communicates datetime information to SAP Data Services, Data Services receives those values in local time and it doesn't consider the time zone field.

 Note

If your local and server clocks aren't synchronized, translation speed is unaffected. However, if your local clock isn’t set to the correct time, Data Services could send incorrect times to Salesforce.com. In this case, changes that you expected to be returned may not be returned until a later synchronization.

 Example

Data Services is set for Pacific Standard Time (PST). The adapter receives a time of '2005-08-10T23:00:00Z' where 'Z' means GMT time. However, the value sent to Data Services is '2005.08.10 15:00:00'.

 Example

You want to retrieve information that changed since yesterday at 6:00 PM local time. You write a condition stating: SFDC_TIMESTAMP >='2005.08.10 18:00:00'. Data Services sends this condition "as is" to the adapter. However, Salesforce.com does not understand the timestamp because it lacks a time zone indicator. Therefore, the Salesforce.com adapter automatically converts the time specified in Data Services to a format that it understands: '2005-08-11T01:00:00Z'.
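
 Example

A minimal sketch of the two translations described above, using java.time with the fixed offsets implied by each example (GMT-8 inbound, GMT-7 outbound); real time zone handling, including daylight saving time, is more involved than this.

import java.time.Instant;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class TimestampSketch {
    public static void main(String[] args) {
        // Inbound: ISO 8601 GMT value from Salesforce.com to local time.
        OffsetDateTime local = Instant.parse("2005-08-10T23:00:00Z")
            .atOffset(ZoneOffset.ofHours(-8));
        System.out.println(local.format(
            DateTimeFormatter.ofPattern("yyyy.MM.dd HH:mm:ss")));
        // Prints 2005.08.10 15:00:00, as in the first example.

        // Outbound: local condition value to ISO 8601 GMT with a zone indicator.
        OffsetDateTime condition = OffsetDateTime.of(
            2005, 8, 10, 18, 0, 0, 0, ZoneOffset.ofHours(-7));
        System.out.println(condition.withOffsetSameInstant(ZoneOffset.UTC));
        // Prints 2005-08-11T01:00Z, matching the second example.
    }
}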

Parent topic: Adapter metadata mapping [page 56]

Related Information

Metadata mapping for DP Bridge for SDI Outlook data [page 57]
Metadata mapping for Hive [page 58]
Metadata mapping for JDBC [page 60]
Metadata mapping for MongoDB [page 62]
Metadata mapping for OData [page 63]
Metadata mapping for SuccessFactors [page 65]

7.7 Metadata mapping for SuccessFactors

SAP Data Services matches SuccessFactors metadata data types to supported data types when it reads data from SuccessFactors adapter sources.

The following table shows the conversion of data types between SuccessFactors and Data Services.

SuccessFactors data types Data Services data types Description

Integer int Integer value.

Long decimal(20,0)

Float double Double values.

Double double Double values.

String varchar Character strings. SuccessFactors provides the size. Data is in UTF-8.

Boolean varchar(5) Boolean true/false value.

Date date Date values.

Datetime datetime The date/time values that the adapter retrieves from SuccessFactors are in ISO 8601 format. ISO 8601 also reflects GMT time and includes a time zone field. The adapter adjusts for any time zone differences by performing a translation based on the associated local and server clocks. When the adapter communicates datetime information to Data Services, Data Services receives those values in local time and the time zone field is not considered.

Binary varchar In Base64 format. Size is defined in a datastore option named Default Base64 binary field length.

Parent topic: Adapter metadata mapping [page 56]

Related Information

Metadata mapping for DP Bridge for SDI Outlook data [page 57]
Metadata mapping for Hive [page 58]
Metadata mapping for JDBC [page 60]
Metadata mapping for MongoDB [page 62]
Metadata mapping for OData [page 63]
Metadata mapping for Salesforce.com [page 64]

8 The DP Bridge adapter

Use the data provisioning (DP) Bridge to access smart data integration (SDI) Outlook functionality.

The DP Bridge SDI Outlook adapter imports tables that contain Outlook mail message data and mail attachment data from the designated PST file. You can use the tables as sources in a data flow.

SDI Outlook mail attachment table [page 67] An SDI Outlook mail attachment table has set column names with assigned data types.

SDI Outlook mail message table [page 68] An SDI Outlook mail message table has set column names with assigned data types.

8.1 SDI Outlook mail attachment table

An SDI Outlook mail attachment table has set column names with assigned data types.

The following table contains the column names and data types for an SDI Outlook mail attachment table.

Column name Data type

MSG_ID Varchar (1024)

Primary key

LONG_FILENAME NVarchar (4096)

FILENAME NVarchar (4096)

DISPLAY_NAME NVarchar (1024)

PATH_NAME NVarchar (1024)

CREATION_TIME Timestamp

MODIFICATION_TIME Timestamp

SIZE Integer

COMMENT NCLOB

CONTENT BLOB

Parent topic: The DP Bridge adapter [page 67]

Related Information

SDI Outlook mail message table [page 68]

8.2 SDI Outlook mail message table

An SDI Outlook mail message table has set column names with assigned data types.

The following table contains the column names and data types for an SDI Outlook mail message table.

Column name Data type

MSG_ID Varchar (256)

Primary key

SUBJECT NVarchar (4096)

SENDER_NAME NVarchar (4096)

CREATION_TIME Timestamp

LAST_MDF_TIME Timestamp

COMMENT NCLOB

DESC_NODE_ID Varchar (1024)

SENDER_MAIL_ADDR Varchar (256)

RECIPIENTS CLOB

DISPLAYTO CLOB

DISPLAYCC CLOB

DISPLAYBCC CLOB

IMPORTANCE Varchar (100)

PRIORITY Varchar (100)

ISFLAGGED Tinyint

MESSAGEBODY NCLOB

Parent topic: The DP Bridge adapter [page 67]

Related Information

SDI Outlook mail attachment table [page 67]

9 Hive adapter datastores

Use a Hive adapter datastore to connect to a Hive server and work with your tables stored in Hadoop.

Import Hive tables and use them as sources or targets in data flows.

 Note

Data Services supports Apache Hive and HiveServer2 version 0.11 and higher. For the most recent compatibility information, see the Product Availability Matrix (PAM) on the SAP Support Portal.

The Hive adapter datastore requires a supported Hive ODBC driver, such as Cloudera, to connect remotely to the Hive Server.

For more information about Hadoop and Hive, see the Supplement for Hadoop.

Which type of Hive datastore to use

SAP Data Services supports two types of Hive datastores: Hive adapter datastore and Hive database datastore.

Use a Hive database datastore whether SAP Data Services is installed on a machine within the Hadoop cluster or not. Use the Hive database datastore for a DSN or a DSN-less connection. Also include SSL (or the newer TLS) for secure communication over the network. For details, see the Supplement for Hadoop.

Use the Hive adapter datastore when Data Services is installed within the Hadoop cluster. Use the Hive adapter datastore for server-named (DSN-less) connections. Also include SSL (or the newer TLS) for secure communication over the network. For more information about adapters, see the Supplement for Adapters.

Hive adapter source options [page 71] Set specific source options when you use a Hive adapter datastore table as a source in a data flow.

Hive adapter target options [page 72] Set specific Hive target options in the target editor when you use a Hive adapter datastore table as a target in a data flow.

Hive adapter datastore support for SQL function and transform [page 73] The Hive adapter datastore can process data using SQL functions and the SELECT statement in a SQL transform.

Pushing the JOIN operation to Hive [page 73] Stage non-Hive data in a data flow with the Data Transfer transform before joining it with a Hive source.

About partitions [page 74] SAP Data Services imports Hive partition columns the same way as regular columns, but displays the partition columns at the end of the table column list.

Previewing Hive table data [page 74] After you import Hive table metadata using a Hive datastore, preview data in Hive tables.

Using Hive template tables [page 75]

You can use Hive template tables as targets in your data flow.

SSL connection support for Hive adapter [page 76] SAP Data Services supports SSL connections for the Hive adapter.

9.1 Hive adapter source options

Set specific source options when you use a Hive adapter datastore table as a source in a data flow.

Open the Adapter Source tab of the source table editor and complete the options. The following table contains Hive-specific options.

Hive adapter source option descriptions

Option Description

Clean up working directory Specifies whether Data Services cleans up the working directory after the job completes.

● True: Deletes the working directory after successful job completion.
● False: Doesn't delete the working directory after successful job completion.

Execution engine type Specifies the type of engine to use for executing the job.

● Default: Uses the default Hive engine.
● Spark: Uses the Spark engine to read data from Spark.
● Map Reduce: Uses the Map Reduce engine to read data from Hive.

Parallel process threads Specifies the number of threads for parallel processing.

More than one thread can improve performance by maximizing CPU usage on the Job Server computer. For example, if you have four CPUs, enter 4 for the number of parallel process threads.

Related Information

Parallel process threads for flat files

9.2 Hive adapter target options

Set specific Hive target options in the target editor when you use a Hive adapter datastore table as a target in a data flow.

Open the Adapter Target tab of the target table editor and complete the Hive-specific options as described in the following table.

Hive adapter target option descriptions

Option Description

Append Specifies whether Data Services appends new data to the table or partition.

● True: Adds new data to the existing data in the table or partition.
● False: Deletes all existing data and adds new data to the table or partition.

Clean up working directory Specifies whether Data Services cleans up the working directory after the job completes.

● True: Deletes the working directory after successful job completion.
● False: Doesn't delete the working directory after successful job completion.

Dynamic partition Specifies whether Hive evaluates the table partitions when it scans data before loading.

● True: Uses table partitions when scanning data before loading.
● False: Uses static partitions for loading data.

SAP Data Services supports only all-dynamic or only all-static partitions.

Drop and re-create table before loading Specifies whether to drop the existing table and create a new table with the same name before loading.

● True: Drops the existing table and creates a new table before loading data.
● False: Doesn't drop the existing table, but uses the existing table for loading data.

The Drop and re-create table before loading option is applicable only when you use template tables in the design or test environment.


Number of loaders Specifies the number of loaders (threads) to run in parallel for loading data to the target table.

Specify a non-negative integer. The default is 1.

There are two types of loaders based on the number you enter:

● Single loader loading: Loading with one loader.
● Parallel loading: Loading when the number of loaders is greater than one.

9.3 Hive adapter datastore support for SQL function and transform

The Hive adapter datastore can process data using SQL functions and the SELECT statement in a SQL transform.

When you use a Hive table in a data flow, use the SQL transform and add SQL functions to manipulate the data in the table.

● Use the SQL Transform to select specific data from the Hive table to process.

 Note

The SQL transform supports only a single SELECT statement. Also, SAP Data Services does not support SELECT for table columns with a constant expression.

● Use a sql() function to manipulate data in the following ways:
  ○ Create, drop, or INSERT Hive tables
  ○ Return a single string value from a Hive table
  ○ Select a Hive table that contains aggregate functions (max, min, count, avg, and sum)
  ○ Perform inner and outer joins

9.4 Pushing the JOIN operation to Hive

Stage non-Hive data in a data flow with the Data Transfer transform before joining it with a Hive source.

When you include a join operation in a data flow between Hive and non-Hive data, stage the non-Hive data before the operation for better performance. Staging data is more efficient because SAP Data Services doesn't have to read all the data from the Hive data source into memory before performing the join.

Before you stage the data, enable the Enable automatic data transfer option in the Hive datastore editor.

When you construct the data flow, add the Data_Transfer transform. Open the transform editor and make the following settings:

● Transfer Type = Table
● Database type = Hive

 Caution

For non-Hive relational databases: In the Data_Transfer transform, if the option Data Transfer Type is set to Automatic, disable the option Enable automatic data transfer. This rule applies to all relational databases except for Hive.

Related Information

Data_Transfer transform for push-down operations

9.5 About partitions

SAP Data Services imports Hive partition columns the same way as regular columns, but displays the partition columns at the end of the table column list.

The column attribute Partition Column identifies whether the column is partitioned.

When loading to a Hive target, select whether or not to use the Dynamic partition option on the Adapter Target tab of the target table editor.

Hive evaluates the partitioned data dynamically when it scans the data. If Dynamic partition is not selected, Data Services uses Hive static loading, in which it loads all rows to the same partition. The partitioned data comes from the first row that the loader receives.

Related Information

Hive adapter target options [page 72]

9.6 Previewing Hive table data

After you import Hive table metadata using a Hive datastore, preview data in Hive tables.

To preview Hive table data, first import table metadata using the Hive datastore. Then, right-click a Hive table name in the SAP Data Services Designer object library and click View Data.

Alternatively, click the magnifying glass icon on Hive source and target objects in a data flow or open the View Data tab of the Hive table view.

 Note

The ability to preview Hive table data is available only with Apache Hive version 1.1 and later.

For more information about how to use the View Data tab, see the Designer Guide.

Related Information

Using View Data

9.7 Using Hive template tables

You can use Hive template tables as targets in your data flow.

Ensure that the Hive adapter datastore is correctly configured in both SAP Data Services Management Console and SAP Data Services. To add a Hive template table as a target, start to create a data flow in Data Services Designer and perform the following steps:

1. Add a Hive template table object to the data flow.

Use one of two methods:
○ Select a template table icon from the toolbar at right and click anywhere in the data flow in the workspace.
○ Expand the Template node under the applicable Hive adapter datastore in the object library and drag and drop a template table onto your workspace. Note that the template table has to already exist before it is in the object library.

The Create Template dialog box opens.
2. Enter a template table name in Template name.
3. Select the applicable Hive datastore name from the In datastore dropdown list.
4. Enter the Hive dataset name in Owner name.
5. Select the format of the table from the Format dropdown list.
6. Click OK to close the Create Template dialog box.
7. Connect the Hive template table to the data flow.
8. Click the template table target icon in the data flow to open the target editor.
9. Open the Target tab and set applicable options. The software completes the input and output schema areas based on the schema in the stated Hive dataset.
10. Save your changes and execute the applicable job.

Data Services opens the applicable Hive project and dataset, and creates the table. The table name is the name that you entered for Template name in the Create Template window. Data Services populates the table with the data generated from the data flow.

9.8 SSL connection support for Hive adapter

SAP Data Services supports SSL connections for the Hive adapter.

The SSL connection support is through a database connection. For a database connection, configure SSL options on the Data Services server side. Provide Data Services with the necessary certificates.

Data Services automatically includes certificates in its Java keystore so that it recognizes an adapter datastore instance as a trusted Web site. However, if there’s an error regarding a certificate, manually add a certificate back into the Java keystore. For instructions, see the Supplement for Adapters.

Related Information

SSL connection support [page 165]

10 HTTP adapter

Use the HTTP adapter for secure transfer of data for request/reply and request/acknowledge services.

How it works

HTTP is a request/response protocol over a connection with a server. A client sends a request to the server in the form of a request method, a uniform resource identifier (URI), and protocol version, followed by a Multipurpose Internet Mail Extension (MIME)-like message that contains the following information:

● Modifiers
● Client information
● Possible body content

The server responds with a status line, including the message protocol version and a success or error code, followed by a MIME-like message that contains the following information:

● Server information
● Entity meta information
● Possible entity-body content

HTTP communication usually takes place over TCP/IP connections. The default port is TCP 80, but you can use other ports. You can also implement HTTP on top of any other protocol on the Internet, or on other networks. HTTP only presumes a reliable transport; you can use any protocol that provides such guarantees.

HTTP Adapter advantages

The following table describes the advantages of using the SAP Data Services HTTP adapter.

Scope Description

Rapid integration of diverse systems and applications Uses HTTP protocol with the Data Services platform to meet unique business process requirements, saving time and effort.

SSL (Secure Socket Layer) Implements security over the HTTP protocol. Use the HTTPS protocol to protect data from any unscrupulous elements.

Compressed data encoding Saves network traffic by supporting compress-type data encoding while sending and receiving information.

Request/Reply and Request/Acknowledge services Initiates services in Data Services through the adapter.

 Tip

When you create and use the HTTP adapter, follow the steps and refer to the general adapter installation and configuration information that appears earlier in this guide. Additionally, keep the Designer Guide, Administrator Guide, and Installation Guide handy for reference.

HTTP adapter datastore [page 78] Define a datastore for each instance of an HTTP adapter.

HTTP adapter architecture [page 79] The HTTP adapter uses a Web Server and HTTP protocol to read and upload data using specific SAP Data Services architecture.

HTTP adapter operations [page 80] To use the HTTP adapter, create an adapter instance in SAP Data Services Management Console, and configure the instance for a specific operation.

Configuring an HTTP operation instance [page 81] After adding an HTTP adapter instance, configure an HTTP operation instance in the SAP Data Services Management Console Administrator.

HTTP adapter operation instance configuration options [page 82] After you configure the HTTP Adapter, configure the applicable operation instances.

Operation instance invocation [page 87] SAP Data Services needs either a message function or an outbound message to call the operation instances.

Testing the HTTP operations [page 88] Use sample test files to test the operation of your HTTP adapter before you use it in your production environment.

Configure SSL for the HTTP adapter [page 94] With Secure Sockets Layer (SSL) configured, the HTTP Adapter uses secure transport over the TCP/IP network.

Error handling and tracing [page 95] When you execute jobs that include the HTTP adapter datastore, SAP Data Services logs error and trace messages to the LINK_DIR/adapters/log directory.

Related Information

Adapter installation and configuration [page 12]

10.1 HTTP adapter datastore

Define a datastore for each instance of an HTTP adapter.

Use an HTTP adapter when the data flow passes a message to an operation instance. Use either an outbound message (Request/Acknowledge operation) or a message function (Request/Reply operation).

A real-time data flow can then pass a message to one of the adapter operation instances defined in the datastore.

After you define an adapter datastore, define one function or outbound message for each operation instance to which you pass a message.

Parent topic: HTTP adapter [page 77]

Related Information

HTTP adapter architecture [page 79]
HTTP adapter operations [page 80]
Configuring an HTTP operation instance [page 81]
HTTP adapter operation instance configuration options [page 82]
Operation instance invocation [page 87]
Testing the HTTP operations [page 88]
Configure SSL for the HTTP adapter [page 94]
Error handling and tracing [page 95]
Creating an adapter datastore [page 31]

10.2 HTTP adapter architecture

The HTTP adapter uses a Web Server and HTTP protocol to read and upload data using specific SAP Data Services architecture.

The following diagram depicts a local and a remote Data Services installation that use the HTTP Adapter to exchange information. This diagram also shows the interaction between SAP Data Services and any other third-party software supporting the HTTP protocol.

[Diagram: HTTP adapter information flow between a local Data Services installation, a remote Data Services installation, and third-party applications that support the HTTP protocol]

The following steps describe the flow of control depicted in the diagram. The reference numbers in the diagram relate to the specific step number:

1. An external application requests a service on Data Services.
2. A real-time data flow calls the adapter operation instance.
3. The operation instance receives the XML data from the real-time data flow and makes a request on the remote Data Services server. The operation instance forms the request URL by reading its configuration file. The URL contains the servlet name and the service name, which you configure as part of the operation instance configuration. If you send the request to an information resource, configure a resource-specific URL as part of the operation instance configuration.
4. The information resource, such as Siebel, makes a request on the remote Data Services server by using the HTTP or HTTPS protocol. The information resource forms the URL, which contains the servlet name and service name.
5. The servlet runs on the HTTP server, such as Tomcat, that is part of Data Services. This HTTP server can be SSL-enabled, depending on your requirements. The servlet processes the request to get the service name and XML data. It then invokes that service running locally in Data Services and sends the reply to the client.
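
 Example

As a concrete illustration of steps 3 through 5, the following sketch posts an XML request to an HTTP adapter servlet and reads the reply, which is what a Request/Reply operation amounts to at the wire level. The URL, service name, and XML payload are placeholder assumptions; in practice the operation instance configuration determines the request URL.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RequestReplySketch {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: the host, port, servlet path, and service name
        // come from the operation instance configuration in practice.
        String url = "http://remotehost:8080/HttpAdapter/servlet"
            + "?service=TestService";

        String xmlPayload = "<request><id>1001</id></request>"; // placeholder

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(url))
            .header("Content-Type", "text/xml")
            .POST(HttpRequest.BodyPublishers.ofString(xmlPayload))
            .build();

        // The servlet extracts the service name and XML, invokes the
        // real-time service, and returns the reply in the response body.
        // A Request/Acknowledge operation returns only an acknowledgment.
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}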

Parent topic: HTTP adapter [page 77]

Related Information

HTTP adapter datastore [page 78] HTTP adapter operations [page 80] Configuring an HTTP operation instance [page 81] HTTP adapter operation instance configuration options [page 82] Operation instance invocation [page 87] Testing the HTTP operations [page 88] Configure SSL for the HTTP adapter [page 94] Error handling and tracing [page 95]

10.3 HTTP adapter operations

To use the HTTP adapter, create an adapter instance in SAP Data Services Management Console, and configure the instance for a specific operation.

When you start your adapter instance and its operations in the Management Console Administrator, the message “Started” appears in the Status column. To confirm that all operations are started, click Operations in the Dependent Objects column.

If you have a real-time service set up on your system, you can call it through the HTTP interface: http://localhost:8080/admin/jsp/InvokeService.jsp

Using this interface, you can call the selected service by sending the input XML to the HTTP Adapter servlet running on the remote machine where the service is configured.

 Note

For information about how to set up a test service, see the Installation Guide.

The following table describes the operation instances that you configure for the HTTP adapter.

Operation instances

Operation instance Information

Request/Reply from Data Services Sends the request to the remote Data Services machine and waits for the reply.

Request/Acknowledge from Data Services Sends the message to the remote Data Services machine and receives an acknowledgment.

Test the operation instances after you configure them.

Parent topic: HTTP adapter [page 77]

Related Information

HTTP adapter datastore [page 78] HTTP adapter architecture [page 79] Configuring an HTTP operation instance [page 81] HTTP adapter operation instance configuration options [page 82] Operation instance invocation [page 87] Testing the HTTP operations [page 88] Configure SSL for the HTTP adapter [page 94] Error handling and tracing [page 95] Testing the path from client to service

10.4 Configuring an HTTP operation instance

After adding an HTTP adapter instance, configure an HTTP operation instance in the SAP Data Services Management Console Administrator.

Log into the Management Console and open the Administrator application.

1. Expand the Adapter Instance node and select the applicable Job Server.
2. Open the Adapter Configuration tab.
3. Click Operations under the Dependent Objects column for the applicable adapter instance.
   The Adapter Operation Configuration tab opens.
4. Click Add for a new operation instance. To edit an existing operation, click the applicable instance.
   The Supported Adapter Operations tab opens.
5. Select an operation type from the Operation Type dropdown list and click Apply.
   The Operation Details tab opens. The options that appear are based on the operation type you choose.
6. Complete the operation instance configuration form.
   Configure the following operation instances for the HTTP adapter:
   ○ Request/Reply
   ○ Request/Acknowledge
7. Click Apply.

Task overview: HTTP adapter [page 77]

Related Information

HTTP adapter datastore [page 78] HTTP adapter architecture [page 79] HTTP adapter operations [page 80] HTTP adapter operation instance configuration options [page 82] Operation instance invocation [page 87] Testing the HTTP operations [page 88] Configure SSL for the HTTP adapter [page 94] Error handling and tracing [page 95] Adding an adapter instance [page 13]

10.5 HTTP adapter operation instance configuration options

After you configure the HTTP Adapter, configure the applicable operation instances.

The Request/Reply operation instance sends an outbound message as an HTTP request from the SAP Data Services data flow to the target URL, waits for a reply, and returns the reply to the data flow.

The Request/Acknowledge operation instance sends outbound messages from SAP Data Services data flows as HTTP requests to the target URL.

Configure the HTTP operation instances in the Management Console Administrator application in the Operation Details tab. The following table contains descriptions for the operation instance options in alphabetical order. The options apply to both types of operation instances except where noted.

 Note

After you complete the operation instance configuration, restart the HTTP adapter so that the Management Console applies the operation instance to the adapter.

Operation instance option descriptions

Option Description

Content-Encoding Specifies the encoding mechanism to use for sending the request.

● Application/nocompress: Doesn’t compress the request. Application/nocompress is the default.
● Application/x-gzip: Uses x-gzip to compress the request.
● Application/x-compress: Uses x-compress to compress the request.

Content-Language Specifies the language in which the request document is written.

Use the two-character ISO language code. For example, enter en to indicate that the content is in English.

Content-Type Specifies the content type header of the request.

Content type specifies the nature of the data by giving type and subtype identifiers.

Continue if untrusted Specifies whether Data Services continues the operation when it doesn’t trust the HTTP server.

● True: Continues the operation when it doesn’t trust the HTTP server. True is the default setting.
● False: Terminates the operation when it doesn’t trust the HTTP server.

Convert Input XML to Specifies whether Data Services converts the input XML to parameter-value pairs.

● None: Doesn’t convert the input XML to parameter-value pairs. None is the default setting.
● param-value: Converts the input XML to parameter-value pairs.

Delete header/comments from input XML Specifies whether Data Services deletes the header and/or comments from the input XML.

● Yes: Deletes the header and/or comments from the input XML.
● No: Doesn’t delete the header and/or comments from the input XML. No is the default setting.


Description Specifies the operation instance description.

Data Services displays the description that you enter here in the Metadata Browsing dialog box.

Display name Specifies the operation instance display name.

Data Services displays the name that you enter for this option in the Metadata Browsing dialog box.

DOCTYPE header Specifies the DOCTYPE of the request header when required.

Enter the complete DOCTYPE declaration string. Leave blank when not required.

 Note

Make sure to include the angle brackets around the value.

Enable Specifies whether Data Services starts the operation instance when you start the adapter.

● True: Starts the operation instance when you start the adapter.
● False: Doesn’t start the operation instance when you start the adapter.

HTTP basic authentication password Specifies the password related to the HTTP basic authentication username option.

Applicable only when your service requires basic access authentication.

Data Services encodes the information using Base64 binary encoding.

HTTP basic authentication username Specifies a user name for when you use HTTP basic access authentication.

Applicable only when your service requires basic access authentication.

Input attribute Specifies the input attribute name for the input XML.

Operation instance Specifies a unique operation instance name.

Data Services uses the operation instance name when it imports the Request/Reply or the Request/Acknowledge metadata objects.


Operation retry count If the operation fails, specifies the number of times Data Services retries the operation.

Enter an integer:

● 5: If the operation fails, Data Services tries to rerun the operation five times. 5 is the default setting.
● Enter 0 for no retries.
● Enter a negative number for an indefinite number of retries.

Data Services retries the Request/Reply operation for the set number of times. If the operation still fails after Data Services retries for the set number of times, the operation fails with no more retry attempts.

Operation retry interval Specifies the time in milliseconds that Data Services waits between retries.

Enter a positive integer. The default value is 15000.

Request message format Specifies the file name of the Request message format file.

The file has either a DTD or XSD extension. The Request message file contains a definition of the request XML message for this operation.

Request message root element Specifies the name of the XML root element contained in the Request message.

Request method Specifies the HTTP request method that Data Services uses for submitting the request.

● GET: Requests data from a specified resource.
● POST: Sends data to a server to create or update a resource. Data Services stores the sent data in the request body of the HTTP request. POST is the default setting.

Response message format Specifies the file name of the Response message format file.

The file has either a DTD or XSD extension. The Response message file contains a definition of the response XML message for this operation.

Applicable for Request/Reply operation instance only.

Response message root element Specifies the name of the XML root element contained in the Response message.

Applicable for Request/Reply operation instance only.


Target URL Specifies the URL for the location to which Data Services sends the HTTP request.

Enter the URL using the following server URL format: http://<hostname>:<port>/DataServices/servlet/http?ServiceName=<service name>

Where:

● <hostname>: The IP address or host name of the Access Server.
● <port>: The port number of the Tomcat Server. If you don’t use the bundled Tomcat Web server, the port number for your third-party Web server.
● <service name>: The name of the service.

Thread count Specifies the number of copies of the Request/Reply or Request/Acknowledge operation to run in parallel. Enter an integer:

● 1: If the sequence of messages is important, such as in synchronous processing, don’t enter more than 1. 1 is the default setting.
● Greater than 1: Enter a value greater than 1 for parallel (asynchronous) processing of messages coming from a real-time service.

 Note

Ensure that you support multiple copies of real-time services with multiple copies of Request/Reply.

Target URL for HTTP requests [page 87] When you configure an HTTP operation, you enter the target URL along with the service name.

Parent topic: HTTP adapter [page 77]

Related Information

HTTP adapter datastore [page 78] HTTP adapter architecture [page 79] HTTP adapter operations [page 80] Configuring an HTTP operation instance [page 81] Operation instance invocation [page 87] Testing the HTTP operations [page 88]

Configure SSL for the HTTP adapter [page 94] Error handling and tracing [page 95]

10.5.1 Target URL for HTTP requests

When you configure an HTTP operation, you enter the target URL along with the service name.

The SAP Data Services server target URL format is:

http://<hostname>:<port>/DataServices/servlet/http?ServiceName=<service name>

Where:

● <hostname> is the IP address or host name of the Access Server
● <port> is the port number of the Access Server
● <service name> is the name of the service

These values are the same as in the URL of the Administrator.
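For example, assuming the Access Server runs on a hypothetical host named dshost, the Web server listens on port 8080, and the service is named Test, the target URL would be:

http://dshost:8080/DataServices/servlet/http?ServiceName=Test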

Parent topic: HTTP adapter operation instance configuration options [page 82]

10.6 Operation instance invocation

SAP Data Services needs either a message function or an outbound message to call the operation instances.

Use the HTTP adapter datastore to import either a message function or an outbound message based on the type of operation involved. After you import the message functions and/or outbound messages, Data Services lists them under the HTTP adapter datastore in the Datastore tab of the object library.

The following table describes the action in real-time data flows when Data Services invokes the operation instance.

Operation type Invocation type Process

Request/Reply Message function A real-time data flow passes messages to an operation instance (request) and waits for a return XML message (reply) from the information resource.

Request/Acknowledge Outbound message A real-time data flow passes a message to an operation instance (request) and waits for a confirmation (acknowledge) from the information resource.

For outbound messages, the real-time data flow doesn’t get a return XML message.

Importing functions [page 88] Use the HTTP adapter datastore to import message functions and outbound messages from the source.

Parent topic: HTTP adapter [page 77]

Related Information

HTTP adapter datastore [page 78] HTTP adapter architecture [page 79] HTTP adapter operations [page 80] Configuring an HTTP operation instance [page 81] HTTP adapter operation instance configuration options [page 82] Testing the HTTP operations [page 88] Configure SSL for the HTTP adapter [page 94] Error handling and tracing [page 95]

10.6.1 Importing functions

Use the HTTP adapter datastore to import message functions and outbound messages from the source.

1. Log into SAP Data Services Designer and open the Datastore tab in the object library.
2. Right-click the applicable HTTP adapter datastore and select Open from the dropdown menu.
   The adapter metadata browser opens at right.
3. Right-click the applicable operation instance in the metadata browser and select Import from the dropdown menu.
   Data Services adds the selected operation instance, with the related message functions or outbound messages, under the HTTP adapter datastore node in the Datastore tab. Use the message functions and outbound messages for creating a real-time data flow in Data Services.

Task overview: Operation instance invocation [page 87]

10.7 Testing the HTTP operations

Use sample test files to test the operation of your HTTP adapter before you use it in your production environment.

Perform the following tasks before you test the HTTP operation:

● Install and configure the HTTP adapter in the SAP Data Services Management Console.
● Import the Acta_HTTPAdapter_Sample.atl file in SAP Data Services Designer.

1. Log into the Management Console, open the Administrator application, and open the HTTP adapter instance.
2. Configure the HTTP Request/Reply or the Request/Acknowledge operation instance using the test settings provided.
   ○ Request/Reply operation: Files and settings for testing Request/Reply operation [page 89]
   ○ Request/Acknowledge operation: Files and settings for testing the Request/Acknowledge operation [page 92]
3. Log into Data Services Designer and execute the job HTTP_ReqRep_BatchJob or HTTP_ReqAck_BatchJob based on the operation instance you are testing.
   After the batch job completes, Data Services creates an output file named either OutputRep.xml or OutputAck.xml in LINK_DIR\adapters\HTTP\samples\xml.
4. Examine the applicable XML file to ensure the operation instance functions correctly.
5. If there are errors in your job, view the Trace and Error logs through the Management Console Administrator.

Task overview: HTTP adapter [page 77]

Related Information

HTTP adapter datastore [page 78] HTTP adapter architecture [page 79] HTTP adapter operations [page 80] Configuring an HTTP operation instance [page 81] HTTP adapter operation instance configuration options [page 82] Operation instance invocation [page 87] Configure SSL for the HTTP adapter [page 94] Error handling and tracing [page 95]

10.7.1 Files and settings for testing Request/Reply operation

Test the Request/Reply operation instance using sample files and provided settings.

The Acta_HTTPAdapter_Sample.atl file contains files to test the Request/Reply and the Request/Acknowledge operations. The following figure shows the Import Plan dialog box for the ATL file, with a list of the files included in the import.

Use the settings in the following table to configure the Request/Reply operation instance in SAP Data Services Management Console Administrator.

Request/Reply test settings Field Configuration information

Operation instance HTTP_ReqReply_Function

Thread count 1

Operation retry count 5

Operation retry interval 15000

Display name HTTP_ReqReply_Function

Description Performs the Request/Reply operation

Enable Keep the default true.

Target URL For HTTP operation, use:

http://<hostname>:<port>/DataServices/servlet/http?ServiceName=Test

For HTTPS operation, use:

https://<hostname>:<port>/DataServices/servlet/http?ServiceName=Test

 Note

By default, the HTTPS port of the Tomcat server is 8443. You can change the default port in the Tomcat configuration file (acta-server.xml on Windows, and acta-server1.xml on UNIX).

Request method Keep the default Post.

Content-Type Keep the default text/xml.

Content-Language en

Content-Encoding Keep the default application/nocompress.

Continue if untrusted Keep the default true.

DOCTYPE header Leave blank.

HTTP Basic Authentication Username Set if applicable.

HTTP Basic Authentication Password Set if applicable.

Request message format LINK_DIR/adapters/Http/samples/dtd/HTTPTestIn.dtd

Request message root element test

Response message format LINK_DIR/adapters/Http/samples/dtd/HTTPTestOut.dtd

Response message root element test

Delete header/comments from input XML Keep the default no.

Input attribute Leave blank.

Convert Input XML to Keep the default none.

10.7.2 Files and settings for testing the Request/Acknowledge operation

Test the Request/Acknowledge operation instance using sample files and provided settings.

The Acta_HTTPAdapter_Sample.atl file contains files to test the Request/Reply and the Request/Acknowledge operations. The following figure shows the Import Plan dialog box for the ATL file, with a list of the files included in the import.

Use the settings in the following table to configure the Request/Acknowledge operation instance in SAP Data Services Management Console Administrator.

Request/Acknowledge test settings Field Configuration information

Operation instance HTTP_ReqAck_Outbound

Thread count 1

Operation retry count 5

Operation retry interval 15000

Display name HTTP_ReqAck_Outbound

Description Performs the Request/Acknowledge operation

Enable Keep the default true.

Target URL For HTTP operation, use:

http://<hostname>:<port>/DataServices/servlet/http?ServiceName=Test

For HTTPS operation, use:

https://<hostname>:<port>/DataServices/servlet/http?ServiceName=Test

 Note

By default, the HTTPS port of the Tomcat server is 8443. Change the default port in the Tomcat configuration file (acta-server.xml on Windows, and acta-server1.xml on UNIX).

Request method Keep the default Post.

Content-Type Keep the default text/xml.

Content-Language en

Content-Encoding Keep the default application/nocompress.

DOCTYPE header Leave blank.

HTTP Basic Authentication Username Set if applicable.

HTTP Basic Authentication Password Set if applicable.

Request message format LINK_DIR/adapters/Http/samples/dtd/HTTPTestOut.dtd

Request message root element test

Input attribute Leave blank.

Convert Input XML to Keep the default none.

10.8 Configure SSL for the HTTP adapter

With Secure Sockets Layer (SSL) configured, the HTTP Adapter uses secure transport over the TCP/IP network.

Server side

Configure SSL in your Web application server.

Use either the bundled Tomcat Web application server that is included with the SAP BusinessObjects BI Platform or the Information Platform Service installation, or use a third-party Web application server.

For a complete list of supported Web application servers, consult the Product Availability Matrix (PAM).

 Note

If you use a third-party Web application server, your administrator must have installed and configured it before installing SAP BusinessObjects BI Platform or Information Platform Service. See the Installation Guide for more information.

Client side

The HTTP adapter client handles the details of certificate authentication internally. It implements the X509TrustManager interface and uses the SSLSocketFactory class with the HttpsURLConnection class.

Whenever a job makes an HTTPS request to the SSL-enabled Web server, the client requests the server certificate. The certificate can be issued by a standard authority, such as VeriSign.

The HTTP client compares the certificate to the certificate store in LINK_DIR/ext/jre/lib/security and processes as follows:

● If the client determines the certificate is trustworthy, it retrieves all data from the Web server.
● If the client determines the certificate isn’t trustworthy, it throws an SSL exception to the caller.
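The following Java sketch illustrates this general pattern; it is not the adapter's actual source code. It loads a keystore, builds an SSLContext from it, and lets HttpsURLConnection fail the handshake when the server certificate is not trusted by the store. The keystore path is passed as an argument, and the changeit password and dshost name are assumptions.

// Illustrative sketch of the trust check described above, not adapter code.
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import java.io.FileInputStream;
import java.net.URL;
import java.security.KeyStore;

public class TrustStoreCheck {
    public static void main(String[] args) throws Exception {
        // Load the local keystore; "changeit" is the JDK default password
        // and an assumption here.
        KeyStore ks = KeyStore.getInstance(KeyStore.getDefaultType());
        try (FileInputStream in = new FileInputStream(args[0])) {
            ks.load(in, "changeit".toCharArray());
        }
        TrustManagerFactory tmf =
                TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(ks);
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, tmf.getTrustManagers(), null);
        HttpsURLConnection conn =
                (HttpsURLConnection) new URL("https://dshost:8443/").openConnection();
        conn.setSSLSocketFactory(ctx.getSocketFactory());
        // connect() throws an SSLHandshakeException if the server certificate
        // is not trusted by the keystore -- the "untrusted" case above.
        conn.connect();
        System.out.println("Server certificate trusted.");
    }
}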

The HTTP client requires the password for querying the local keystore for verification. You enter this password in the Keystore Password option when you create the HTTP adapter instance in the SAP Data Services Management Console.

The operation instance reads the setting for the Continue if untrusted option, which you set when you configure the operation instance. Based on the setting, the operation instance behaves as follows:

● If Continue if untrusted = False, then a message shows the SSL exception in Data Services trace files and the client doesn’t retrieve any data from the server.
● If Continue if untrusted = True, then a message shows the SSL exception in Data Services error and trace files and the client retrieves data from the server.

Data Services downloads the certificate file named untrust.cer and adds it to the working directory or to the LINK_DIR/bin directory. Import this certificate file into the JDK certificate keystore by using the keytool command-line utility. Use the following command:

keytool -import -alias <alias name> -file untrust.cer -keystore cacerts -storepass changeit
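To confirm that the import succeeded, you can list the entry afterward, assuming the same alias and the default changeit password used above:

keytool -list -keystore cacerts -storepass changeit -alias <alias name>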

Parent topic: HTTP adapter [page 77]

Related Information

HTTP adapter datastore [page 78] HTTP adapter architecture [page 79] HTTP adapter operations [page 80] Configuring an HTTP operation instance [page 81] HTTP adapter operation instance configuration options [page 82] Operation instance invocation [page 87] Testing the HTTP operations [page 88] Error handling and tracing [page 95] SSL connection support [page 165]

10.9 Error handling and tracing

When you execute jobs that include the HTTP adapter datastore, SAP Data Services logs error and trace messages to the LINK_DIR/adapters/log directory.

The names of the error and trace log files match the names that you set in the adapter instance configuration in the SAP Data Services Management Console Administrator application. Data Services further amends the file names with _error.txt for error logs and _trace.txt for trace logs.

Parent topic: HTTP adapter [page 77]

Related Information

HTTP adapter datastore [page 78] HTTP adapter architecture [page 79] HTTP adapter operations [page 80] Configuring an HTTP operation instance [page 81] HTTP adapter operation instance configuration options [page 82]

Operation instance invocation [page 87] Testing the HTTP operations [page 88] Configure SSL for the HTTP adapter [page 94]

11 JMS adapter

Java Message Service (JMS) enables applications to create, send, receive, and read messages.

JMS provides a common method for Java language programs to access enterprise-messaging or message-oriented middleware (MOM) products. In addition to the traditional MOM vendors, several database vendors and Internet-related companies also provide enterprise-messaging products. Java language clients and Java language middle-tier services must be able to use these messaging systems. JMS provides a common way for Java language programs to access these systems.

JMS provides communication that is:

● Asynchronous: A client does not need to be present to receive messages that a provider delivers.
● Reliable: JMS delivers messages only once.
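As a minimal illustration of this asynchronous, point-to-point model, the following sketch sends an XML payload to a queue and then consumes it. It is a sketch only: the JNDI names Qcf and Queue.MyQueue follow the sample names used later in this chapter, and the JMS provider must be configured to resolve them.

// Minimal JMS point-to-point sketch; JNDI names are the chapter's sample names.
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.jms.QueueReceiver;
import javax.jms.QueueSender;
import javax.jms.QueueSession;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class JmsQueueDemo {
    public static void main(String[] args) throws Exception {
        // Resolve the connection factory and queue through JNDI
        // (provider settings come from a jndi.properties file).
        InitialContext jndi = new InitialContext();
        QueueConnectionFactory qcf = (QueueConnectionFactory) jndi.lookup("Qcf");
        Queue queue = (Queue) jndi.lookup("Queue.MyQueue");
        QueueConnection conn = qcf.createQueueConnection();
        try {
            QueueSession session = conn.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
            // Asynchronous: the sender does not wait for a consumer to be present.
            QueueSender sender = session.createSender(queue);
            sender.send(session.createTextMessage("<test><value>hello</value></test>"));
            // Reliable: the provider retains the message until it is consumed.
            conn.start();
            QueueReceiver receiver = session.createReceiver(queue);
            TextMessage msg = (TextMessage) receiver.receive(5000);
            System.out.println(msg == null ? "no message" : msg.getText());
        } finally {
            conn.close();
        }
    }
}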

Scope of the JMS adapter [page 98] The SAP Data Services JMS adapter supports Request/Reply and Request/Acknowledgment operations and use of an external Information Resource (IR).

JMS adapter functional overview [page 98] The JMS adapter works with SAP Data Services data flows and other components, including external JMS applications, Web servers, and Java EE applications.

Design considerations [page 100] To avoid possible errors in configuration, view information about configuring a JMS adapter.

Configuring a JMS adapter—overview [page 101] All SAP Data Services adapters communicate with the software through a designated Adapter Manager Job Server.

JMS adapter datastore [page 111] Use metadata from the SAP Data Services JMS adapter datastore with a batch job or real-time data flow to pass messages to an operation instance.

Testing the JMS adapter and operations [page 113] Use sample test files to test the operation of your JMS adapter before you use it in your production environment.

WebLogic as JMS provider [page 132] Use WebLogic as your JMS provider when you use the JMS adapter.

Error handling and tracing [page 133] View error and trace logs after processing messages with the JMS adapter to view processing information and messages for error resolution.

11.1 Scope of the JMS adapter

The SAP Data Services JMS adapter supports Request/Reply and Request/Acknowledgment operations and use of an external Information Resource (IR).

The basic JMS adapter process in Data Services encompasses the following operations:

● Data Services initiates the Request/Reply operation: Data Services sends the message request on a preconfigured request queue and gets the reply on a preconfigured reply queue.
● Data Services initiates the Request/Acknowledgment operation: Data Services sends the message on a preconfigured target queue or by publishing a message to a JMS topic. When Data Services uses a JMS topic to send the message, JMS sends back only the acknowledgment to Data Services.
● An external Information Resource (IR) initiates Request/Acknowledgment and Request/Reply operations: An external IR sends the requests to Data Services, which sends back a reply or acknowledgment. Alternatively, the IR publishes a message to a JMS topic to which the JMS adapter subscribes.

 Note

An IR is a JMS-compatible application.

Parent topic: JMS adapter [page 97]

Related Information

JMS adapter functional overview [page 98] Design considerations [page 100] Configuring a JMS adapter—overview [page 101] JMS adapter datastore [page 111] Testing the JMS adapter and operations [page 113] WebLogic as JMS provider [page 132] Error handling and tracing [page 133]

11.2 JMS adapter functional overview

The JMS adapter works with SAP Data Services data flows and other components, including external JMS applications, Web servers, and Java EE applications.

The following diagram shows a functional overview of the JMS adapter, how it works with other components, and how the adapter interacts with the external JMS application through the JMS provider. JMS sends or receives data using one or both of the following message domains:

● Point-to-Point (P2P): Uses message queues, senders, and receivers for addressing, receiving, and retaining messages until they are consumed or expire.
● Publish/Subscribe: Uses message topics, which are like bulletin boards. Only subscribers can access the messages from the topic.

The following steps describe the flow of control depicted in the diagram. The reference numbers in the diagram correspond to the step numbers, and diagram elements are in bold:

1. External Applications invoke a JMS Operation on Data Services.
2. Based on the JMS operation invoked, the applicable real-time data flow (RTDF) calls the operation instance using the XML data from the external application as input.
3. The JMS Adapter Operation Instance sends the message to the configured queue or topic in the JMS provider. Based on the type of operation, the JMS provider sends the Reply/Acknowledgment message back to the software.
4. An External JMS Application sends messages to the JMS Provider on a request queue or publishes the message to a topic (Sending & Receiving). The JMS Adapter Operation Instance receives these messages after polling them from the JMS Provider and, for P2P, sends the reply back to the external JMS application on a configured reply queue. No reply is sent if the message was from a topic.

Parent topic: JMS adapter [page 97]

Related Information

Scope of the JMS adapter [page 98] Design considerations [page 100] Configuring a JMS adapter—overview [page 101] JMS adapter datastore [page 111] Testing the JMS adapter and operations [page 113] WebLogic as JMS provider [page 132] Error handling and tracing [page 133]

11.3 Design considerations

To avoid possible errors in configuration, view information about configuring a JMS adapter.

Before adding and configuring a JMS adapter, and creating the JMS adapter datastore, consider the following information:

● If you plan to use JMS queues for the Point-to-Point message domain, and/or topics for the Publish/Subscribe message domain, preconfigure the queues and topics in the messaging system first.
● Remember that SAP Data Services uses only XML messages in replies or acknowledgments.
● When you configure the GetTopic operation, set the Thread Count option to 1 to avoid multiple copies of the same message going to a service (see the sketch after this list).

 Note

Each thread is a subscriber to the topic; therefore, any thread count over 1 receives and sends the same message.

● Install adapters on the computer that contains the specific Job Server that you designate for adapter processes. Make sure that you enable adapters for this Job Server in the SAP Data Services Server Manager.
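The duplication mentioned in the note can be seen with plain JMS publish/subscribe, as in the following hypothetical sketch (not adapter code): every subscriber on a topic receives its own copy of each message, so two polling threads acting as two subscribers both deliver the same message. The JNDI names Tcf and Topic.MyTopic follow the sample names used later in this chapter.

// Hypothetical illustration: two subscribers on one topic each get a copy of
// every message, which is why Thread Count > 1 duplicates GetTopic deliveries.
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.jms.Topic;
import javax.jms.TopicConnection;
import javax.jms.TopicConnectionFactory;
import javax.jms.TopicPublisher;
import javax.jms.TopicSession;
import javax.jms.TopicSubscriber;
import javax.naming.InitialContext;

public class TopicFanoutDemo {
    public static void main(String[] args) throws Exception {
        InitialContext jndi = new InitialContext();
        TopicConnectionFactory tcf = (TopicConnectionFactory) jndi.lookup("Tcf");
        Topic topic = (Topic) jndi.lookup("Topic.MyTopic");
        TopicConnection conn = tcf.createTopicConnection();
        try {
            TopicSession session = conn.createTopicSession(false, Session.AUTO_ACKNOWLEDGE);
            TopicSubscriber sub1 = session.createSubscriber(topic); // "thread 1"
            TopicSubscriber sub2 = session.createSubscriber(topic); // "thread 2"
            conn.start();
            TopicPublisher pub = session.createPublisher(topic);
            pub.publish(session.createTextMessage("<msg/>"));
            // Both subscribers receive the same message.
            TextMessage m1 = (TextMessage) sub1.receive(5000);
            TextMessage m2 = (TextMessage) sub2.receive(5000);
            System.out.println("sub1: " + (m1 == null ? "none" : m1.getText()));
            System.out.println("sub2: " + (m2 == null ? "none" : m2.getText()));
        } finally {
            conn.close();
        }
    }
}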

Parent topic: JMS adapter [page 97]

Related Information

Scope of the JMS adapter [page 98] JMS adapter functional overview [page 98] Configuring a JMS adapter—overview [page 101] JMS adapter datastore [page 111] Testing the JMS adapter and operations [page 113] WebLogic as JMS provider [page 132] Error handling and tracing [page 133]

11.4 Configuring a JMS adapter—overview

All SAP Data Services adapters communicate with the software through a designated Adapter Manager Job Server.

The following process involves the SAP Data Services Server Manager utility, the SAP Data Services Management Console Administrator application, and Data Services.

To configure the JMS adapter, perform the following steps:

1. Open the SAP Data Services Server Manager utility and ensure that the Job Server on which the adapter is installed has the Support adapter and message broker communication option enabled.
2. Log into the Management Console and open the Administrator application.
3. Perform the following tasks in the Administrator for the applicable Job Server:
   a. Add at least one instance of the adapter to the system. For JMS-specific option descriptions, see JMS adapter-specific configuration settings [page 25].
   b. Configure at least one operation for the adapter instance.
   c. Start or restart the adapter instance to enable the configuration options.
4. Open the SAP Data Services Designer and create a JMS adapter datastore.
5. Import metadata using the JMS adapter datastore.
6. Use the imported metadata to create both batch and real-time jobs as applicable.

JMS adapter operations [page 102] Create an adapter instance to identify the JMS application to use in the integration operation.

Adding an operation to an adapter instance [page 105] Add an operation instance to a JMS adapter instance in the SAP Data Services Management Console.

JMS adapter operation options [page 106] To set up an operation instance, complete the options in the Operation Details tab of the SAP Data Services Management Console Administrator.

Task overview: JMS adapter [page 97]

Related Information

Scope of the JMS adapter [page 98] JMS adapter functional overview [page 98] Design considerations [page 100] JMS adapter datastore [page 111] Testing the JMS adapter and operations [page 113] WebLogic as JMS provider [page 132] Error handling and tracing [page 133] Adapter installation and configuration [page 12] Starting and stopping the adapter instance [page 27] Creating an adapter datastore [page 31]

11.4.1 JMS adapter operations

Create an adapter instance to identify the JMS application to use in the integration operation.

Before the JMS adapter can integrate the JMS provider with the SAP Data Services system, create and configure one operation for each adapter instance. The operation identifies the JMS application to use.

The following table contains descriptions for the integration operations included with the JMS adapter.

 Note

The operations include the Point-to-Point (using queues) and Publish/Subscribe (using topics) message domains.

Integration operation descriptions

Operation: Get
Type: Request/Reply; Request/Acknowledgment
Application: From: External Information Resource (IR); To: Data Services
Process: An external IR sends a request XML message to a JMS queue.

● The JMS adapter polls the queue at a time interval that you specify in the configuration.
● The JMS adapter receives the XML message from the queue and sends it to the preconfigured job service.
● After the job service processes the XML message, the job service sends a reply message to the JMS adapter.
● The JMS adapter puts the reply message in a preconfigured response queue.
● If the response queue was not preconfigured, the message becomes a Request/Acknowledgment operation and the job service does not send a reply.
● If there is an error in invoking another service from the job service, the job service sends the original message to the undelivered queue for reference by the IR.


Operation: GetTopic
Type: Request/Acknowledgment
Application: From: External Information Resource (IR); To: Data Services
Process:

● An external IR publishes an XML message to a JMS topic.
● The JMS adapter polls the topic at the time interval that you specified in the configuration.
● When the JMS adapter receives the message from the JMS topic, it sends the message to the service that handles the message.

Operation: PutGet
Type: Request/Reply
Application: From: Data Services; To: JMS adapter
Process: Data Services initiates a message request and sends it to a preconfigured request queue. At the same time, Data Services listens for a reply on a preconfigured reply queue.

An external JMS-compatible application:

● Listens on the request queue for a request.
● Gets and processes the request.
● Returns an XML response to the reply queue.

The JMS adapter picks up the response from the reply queue and sends it to the job service.

Operation: Put
Type: Request/Acknowledgment
Application: From: Data Services; To: JMS adapter
Process: Data Services initiates a request and sends it to a preconfigured target queue.

● If the message is sent successfully, the JMS adapter sends an acknowledgment to the job service.
● If the JMS adapter is unable to send the message, it raises an exception.

Operation: PutTopic
Type: Request/Acknowledgment
Application: From: Data Services; To: JMS adapter
Process: Data Services initiates a request and publishes it as an XML message to a preconfigured target topic.

● If the message is sent successfully, the JMS adapter sends an acknowledgment to the job service.
● If Data Services cannot send the message, the JMS adapter raises an exception.

Parent topic: Configuring a JMS adapter—overview [page 101]

Related Information

Adding an operation to an adapter instance [page 105] JMS adapter operation options [page 106]

11.4.2 Adding an operation to an adapter instance

Add an operation instance to a JMS adapter instance in the SAP Data Services Management Console.

Before you add an operation instance, perform the following tasks:

● Log into the Management Console and open the Administrator application.
● Add and configure a JMS adapter instance.

Perform the following steps to add an operation instance to a JMS adapter instance:

1. Select the applicable Job Server under the Adapter Instances node and open the Adapter Configuration tab.
2. Click Operations under the Dependent Objects column for the applicable JMS adapter instance.
   The Adapter Operation Configuration tab opens.
3. Click Add to configure a new operation.
   The Supported Adapter Operations tab opens.
4. Select an operation type from the Operation Type dropdown list and click Apply.
   The Operation Details tab opens. The options that appear are based on the operation type you chose.
5. Complete the options and click Apply to save the operation instance configuration.
6. Stop and restart the applicable JMS adapter instance to enable the operation instance settings.

Task overview: Configuring a JMS adapter—overview [page 101]

Related Information

JMS adapter operations [page 102] JMS adapter operation options [page 106]

11.4.3 JMS adapter operation options

To set up an operation instance, complete the options in the Operation Details tab of the SAP Data Services Management Console Administrator.

The following table contains option descriptions for configuring a JMS adapter operation instance. Each entry lists the applicable operations. The options are listed in alphabetical order.

Operation option descriptions

Field: Continue After Error
Description: Specifies whether to continue processing after the operation instance encounters an error.

● True: The operation instance remains in the start state even after the error.
● False: The operation instance stops after the error occurs during the process.

Applicable operations: PutGet (request/reply); Get (request/reply and request/acknowledgment); GetTopic (request/acknowledgment only)


Field: Default Response Queue
Description: Optional. Specifies the response queue name. The operation instance sends the reply back to the external JMS application (Information Resource) on this queue.
Enter the name based on the configuration type you chose for the adapter instance: JNDI or MQ.
Applicable operations: Get (request/reply)

Field: Description
Description: Specifies the operation instance description. Enter a meaningful description.
Data Services displays the description that you enter here in the Metadata Browsing dialog box.
Applicable operations: Put (request/acknowledgment); PutTopic (request/acknowledgment); PutGet (request/reply)

Field: Destination Queue
Description: Specifies the name of the destination queue (target) for the message.
Enter the name based on the configuration type you chose for the adapter instance: JNDI or MQ.
Applicable operations: Put (request/acknowledgment)

Field: Destination Topic
Description: Specifies the name of the topic to which the operation instance publishes messages.
Enter the name based on the configuration type you chose for the adapter instance: JNDI or MQ.
Applicable operations: PutTopic (request/acknowledgment)

Field: Display name
Description: Specifies the operation instance display name.
Data Services displays the name that you enter for this option in the Metadata Browsing dialog box.
Applicable operations: Put (request/acknowledgment); PutTopic (request/acknowledgment); PutGet (request/reply)

Field: Durable Subscriber
Description: Specifies the subscription name of the JMS durable subscriber (message consumer). Leave blank if not applicable.
Applicable operations: GetTopic (request/acknowledgment only)

Field: Enable
Description: Specifies whether the operation instance starts when the adapter starts.

● True: Starts the operation instance when the adapter starts.
● False: Doesn’t start the operation instance when the adapter starts.

Applicable operations: All


Field: Message Format
Description: Specifies the DTD or XSD file name that defines the XML message in this operation.
Applicable operations: PutTopic (request/acknowledgment)

Field: Operation instance
Description: Specifies a unique operation instance name.
SAP Data Services Designer imports the operation metadata using this operation instance name.
Applicable operations: All

Field: Operation retry count
Description: If the operation fails, specifies the number of times the adapter retries the operation.
Enter an integer:

● 5: If the operation fails, Data Services tries to rerun the operation five times. 5 is the default setting.
● Enter 0 for no retries.
● Enter a negative number for an indefinite number of retries.

The adapter retries the operation for the set number of times. If the operation still fails after the adapter retries for the set number of times, the operation fails with no more retry attempts.
Applicable operations: All

Field: Operation retry interval
Description: Specifies the time in milliseconds that the adapter waits between retries.
Enter a positive integer. The default value is 15000 milliseconds.
Applicable operations: All

Field: Polling interval
Description: Specifies the time interval in milliseconds for polling the source topic by this operation instance.

 Example

If Polling interval = 1000, then the operation polls the source topic every 1 second.

Applicable operations: Get (request/reply and request/acknowledgment); GetTopic (request/acknowledgment only)

Field: Reply format
Description: Specifies the DTD or XSD file name that defines the Reply XML message in the operation.
Applicable operations: PutGet (request/reply)


Field: Reply Queue
Description: Specifies the name of the destination queue where the reply message is sent.
Enter the name based on the configuration type you chose for the adapter instance: JNDI or MQ.
Applicable operations: PutGet (request/reply)

Field: Reply XML Root Element
Description: Specifies the name of the XML root element in the reply DTD or XSD.
Applicable operations: PutGet (request/reply)

Field: Request DTD Root Element
Description: Specifies the name of the root element for the input DTD for this operation.
Applicable operations: Get (request/reply and request/acknowledgment)

Field: Request Format
Description: Specifies the DTD or XSD file name that defines the XML request message in the operation.
Applicable operations: Put (request/acknowledgment); PutGet (request/reply)

Field: Request Queue
Description: Specifies the name of the destination queue where the message is sent.
Enter the name based on the configuration type you chose for the adapter instance: JNDI or MQ.
Applicable operations: PutGet (request/reply)

Field: Request XML Root Element
Description: Specifies the name of the XML root element in the request DTD or XSD.
Applicable operations: Put (request/acknowledgment); PutTopic (request/acknowledgment); PutGet (request/reply)

Field: Service
Description: Specifies the name of the real-time service that the operation invokes when it receives a new message from the source queue.
Applicable operations: Get (request/reply and request/acknowledgment); GetTopic (request/acknowledgment only)

Field: Source Queue
Description: Specifies the name of the queue where the IR sends the message and the adapter receives it.
Enter the name based on the configuration type you chose for the adapter instance: JNDI or MQ.
Applicable operations: Get (request/reply and request/acknowledgment)

Field: Source Topic
Description: Specifies the name of the topic to which the operation subscribes.
Enter the name based on the configuration type you chose for the adapter instance: JNDI or MQ.
Applicable operations: GetTopic (request/acknowledgment only)


Field: Thread count
Description: Specifies the number of copies of the operation to run in parallel.

● 1: If the sequence of messages is important, such as in synchronous processing, don’t enter more than 1. 1 is the default setting.
● Greater than 1: Enter a value greater than 1 for parallel (asynchronous) processing of messages coming from a real-time service.

 Note

When you set this option to greater than 1, ensure that multiple copies of real-time services are supported by multiple copies of the operation.

Applicable operations: Put (request/acknowledgment); PutTopic (request/acknowledgment); PutGet (request/reply)

Field: Timeout
Description: Specifies the maximum time in milliseconds that the operation waits for the reply message.
Applicable operations: PutGet (request/reply); Get (request/reply and request/acknowledgment); GetTopic (request/acknowledgment only)

Field: Undelivered Queue
Description: Optional. Specifies the undelivered queue for receiving any error messages.
Enter the name based on the configuration type you chose for the adapter instance: JNDI or MQ.
Applicable operations: Get (request/reply and request/acknowledgment)

Parent topic: Configuring a JMS adapter—overview [page 101]

Related Information

JMS adapter operations [page 102] Adding an operation to an adapter instance [page 105]

11.5 JMS adapter datastore

Use metadata from the SAP Data Services JMS adapter datastore with a batch job or real-time data flow to pass messages to an operation instance.

After you install and configure the JMS adapter, and configure the applicable operations, create the JMS adapter datastore in Data Services Designer. Use the datastore to import metadata, which consists of outbound message and message functions. When you create the datastore, consider the following information:

● Define one datastore for each adapter instance.
● Include a message function or an outbound message for each operation instance to which you pass a message.

Configure either a batch or real-time data flow to pass a message to one of the adapter operation instances defined for that datastore. The datastore can pass one of the following message types:

● An outbound message for Request/Acknowledge operations
● A message function for Request/Reply operations

Import message functions and outbound messages to the datastore [page 111] Pass messages from a batch job or a real-time data flow to an operation instance.

Parent topic: JMS adapter [page 97]

Related Information

Scope of the JMS adapter [page 98] JMS adapter functional overview [page 98] Design considerations [page 100] Configuring a JMS adapter—overview [page 101] Testing the JMS adapter and operations [page 113] WebLogic as JMS provider [page 132] Error handling and tracing [page 133] Creating an adapter datastore [page 31]

11.5.1 Import message functions and outbound messages to the datastore

Pass messages from a batch job or a real-time data flow to an operation instance.

Import either a function or an outbound message, depending on the type of operation involved, for each operation instance.

The following table contains methods to use for Real-time data flows.

Method Description

Message functions Pass messages to an operation instance if the real-time data flow (RTDF) waits for a return XML message from the IR.

Outbound messages Pass messages to an operation instance if the real-time data flow waits only for a confirmation from the IR, not a return XML message.

The following table contains operation types and the corresponding invocation types in the SAP Data Services Adapter for JMS.

Operation type Invocation type

Request/Reply Operation Message Function

Request/Acknowledge Operation Outbound Message

Importing message functions and outbound messages [page 112] For each operation instance, import a function or an outbound message using the applicable adapter datastore in SAP Data Services Designer.

Parent topic: JMS adapter datastore [page 111]

11.5.1.1 Importing message functions and outbound messages

For each operation instance, import a function or an outbound message using the applicable adapter datastore in SAP Data Services Designer.

Open Designer and perform the following steps to import message functions and outbound messages:

1. Double-click the applicable datastore associated with your JMS adapter instance.
   Designer opens the Adapter metadata browser window at right.
2. Right-click the operation instance to import and select Import.
   Data Services adds the selected operation instance to the applicable folder under the datastore node.

Use the imported message functions and outbound messages for creating batch jobs or real-time data flows in Data Services.

Task overview: Import message functions and outbound messages to the datastore [page 111]

11.6 Testing the JMS adapter and operations

Use sample test files to test the operation of your JMS adapter before you use it in your production environment.

The following list contains an overview of the tasks to perform before you can test a JMS adapter in SAP Data Services:

1. Import the JMSAdapter.atl file into the Designer. Find the .atl file in LINK_DIR\adapters\jms\samples. The imported project name is Acta_JMSAdapter_Sample.
2. Open the source and target editors of each batch job and update the location of the input and output XML file paths to the location of the SAP Data Services installation. For example, C:\Program Files (x86)\SAP BusinessObjects\Data Services\adapters\jms\samples\xml.
   To make this task easier, create an environment variable for this location.
3. Add the Queue and Topic for the TestService_Job and TestServiceTopic_Job files in the Real-Time Services node of the SAP Data Services Management Console Administrator. Create the following:
   ○ Queue.TestService referencing the TestService_Job file
   ○ Topic.TestService referencing the TestServiceTopic_Job file
   See the Management Console Guide for information about Real-Time Services.
4. Configure a JMS adapter and add an operation instance based on the operation you are testing.
   See the individual test topics for option settings.
5. In SAP Data Services Designer, edit the JMSAdapter datastore that you imported with the ATL file. If necessary, rename the datastore to match the name of the adapter you just created.
6. Open your JMS provider utility and create the following queues and topics:
   ○ Queue.MyQueue
   ○ Queue.ActaQueuePutGet
   ○ Queue.ActaQueuePutGet1
   ○ Queue.ActaQueueGet
   ○ Queue.ActaReplyQueueGet
   ○ Queue.ActaUndeliveredQueue
   ○ Topic.MyTopic

    Note

   The JMSAdapterTest.properties file and the scripts to execute the samples are located in the LINK_DIR/adapters/jms/samples directory.

7. If you configured the JMS adapter with a factory name other than what came with the sample files, perform the following substeps:
   a. Open the JMSAdapterTest.properties file located in LINK_DIR.
   b. Change the value for TopicConnectionFactoryName from the default setting of Tcf to the value for your factory name.
   c. Change the value for QueueConnectionFactoryName from the default setting of Qcf to the value for your factory name.
8. In the JMSAdapterTest.properties file, ensure that the MessageSource line contains LINK_DIR/adapters/jms/samples/xml/JMSSource.xml and reflects the actual location on your system.
9. Open setTestEnv.bat (Windows) or setTestEnv.sh (UNIX) and set the JMS provider JAR file names for the sample test programs.

Configure the JMS provider [page 115] Set the Connection Factory, JMS Server, JMS queues, and JMS topics based on the JMS provider you use.

Using MQ instead of JNDI configuration [page 116] To use MQ instead of JNDI, change the configuration type of your JMS adapter and reset queues and topic names.

JMS adapter sample files [page 117] SAP Data Services includes sample files for you to test the JMS adapter and all applicable operation instances.

Test Get: Request/Reply [page 118] Before you run your JMS adapter with the Get (Request/Reply) operation, run a test operation to ensure you get the desired results.

Test Get: Request/Acknowledge [page 120] Before you run your JMS adapter with the Get (Request/Acknowledge) operation, run a test operation to ensure you get the desired results.

Test GetTopic: Request/Acknowledge [page 122] Before you run your JMS adapter with the GetTopic (Request/Acknowledge) operation, run a test operation to ensure you get the desired results.

Test PutGet: Request/Reply [page 124] Before you run your JMS adapter with the PutGet Request/Reply operation, run a test operation to ensure you get the desired results.

Test Put: Request/Acknowledge [page 127] Before you run your JMS adapter with the Put Request/Acknowledge operation, run a test operation to ensure you get the desired results.

Test PutTopic: Request/Acknowledge [page 129] Before you run your JMS adapter with the PutTopic Request/Acknowledge operation, run a test operation to ensure you get the desired results.

Task overview: JMS adapter [page 97]

Related Information

Scope of the JMS adapter [page 98] JMS adapter functional overview [page 98] Design considerations [page 100] Configuring a JMS adapter—overview [page 101] JMS adapter datastore [page 111] WebLogic as JMS provider [page 132]

Error handling and tracing [page 133]

11.6.1 Configure the JMS provider

Set the Connection Factory, JMS Server, JMS queues, and JMS topics based on the JMS provider you use.

A JMS provider is a messaging system that includes the JMS interface and provides administrative and control features. For example, WebLogic is a JMS provider.

After you select what JMS provider to use, create a JMS Server, select a Connection Factory, and configure JMS queues and topics to run the JMS adapter in SAP Data Services.

When you test the JMS adapter, use a sample application and use specific names for the queue and topic locations. For example, use the following names for the JNDI configuration type:

● Queue.MyQueue
● Queue.ActaQueuePutGet
● Queue.ActaQueuePutGet1
● Queue.ActaQueueGet
● Queue.ActaReplyQueueGet
● Queue.ActaUndeliveredQueue
● Topic.MyTopic

 Note

The individual operation instance testing topics specify when to use these names.

Parent topic: Testing the JMS adapter and operations [page 113]

Related Information

Using MQ instead of JNDI configuration [page 116] JMS adapter sample files [page 117] Test Get: Request/Reply [page 118] Test Get: Request/Acknowledge [page 120] Test GetTopic: Request/Acknowledge [page 122] Test PutGet: Request/Reply [page 124] Test Put: Request/Acknowledge [page 127] Test PutTopic: Request/Acknowledge [page 129]

11.6.2 Using MQ instead of JNDI configuration

To use MQ instead of JNDI, change the configuration type of your JMS adapter and reset queues and topic names.

SAP Data Services uses the settings in the properties file named JMSAdapterTest.properties for the JMS configuration type settings, and the names for topics and queues. By default, the Properties file is set to use the JNDI configuration. Change the configuration in either the Properties file or in the SAP Data Services Management Console. For instructions to change the settings in the Management Console, see Adding an adapter instance [page 13].

The following steps are for changing the configuration type to MQ in the Properties file:

1. Open the JMSAdapterTest.properties file using a text editor.
   The file is in LINK_DIR\adapters\jms\samples.
2. Find ConfigType and set it to MQ.
3. Set any of the following properties as required by your system:
   ○ MqQueueManager
   ○ MqChannel
   ○ MqComputerName
   ○ MqPort
   ○ MqUserID
   ○ MqPassword
4. For the queue and topic names, use MQ names instead of the JNDI names for the following properties:
   ○ TopicGetName
   ○ TopicPutName
   ○ QueueSourceGetName
   ○ QueueResponseGetName
   ○ QueuePutName
   ○ QueueRequestPutGetName
   ○ QueueReplyPutGetName
5. Save and close the file.

Task overview: Testing the JMS adapter and operations [page 113]

Related Information

Configure the JMS provider [page 115] JMS adapter sample files [page 117] Test Get: Request/Reply [page 118] Test Get: Request/Acknowledge [page 120] Test GetTopic: Request/Acknowledge [page 122] Test PutGet: Request/Reply [page 124]

Test Put: Request/Acknowledge [page 127] Test PutTopic: Request/Acknowledge [page 129]

11.6.3 JMS adapter sample files

SAP Data Services includes sample files for you to test the JMS adapter and all applicable operation instances.

The sample files are in the JMSAdapter.atl file located in LINK_DIR\adapters\JMS\samples. The following figure shows the Import Plan dialog box that contains the files in the ATL file.

Parent topic: Testing the JMS adapter and operations [page 113]

Related Information

Configure the JMS provider [page 115]
Using MQ instead of JNDI configuration [page 116]
Test Get: Request/Reply [page 118]
Test Get: Request/Acknowledge [page 120]
Test GetTopic: Request/Acknowledge [page 122]
Test PutGet: Request/Reply [page 124]
Test Put: Request/Acknowledge [page 127]
Test PutTopic: Request/Acknowledge [page 129]

11.6.4 Test Get: Request/Reply

Before you run your JMS adapter with the Get (Request/Reply) operation, run a test operation to ensure you get the desired results.

The Get Request/Reply is a JMS adapter operation that moves messages from an external Information Resource (IR) to SAP Data Services.

To test the Get Request/Reply operation on your system, create the operation using the settings in the following table. Follow the instructions in Adding an operation to an adapter instance [page 105]. The settings are based on the JNDI configuration type.

Get Request/Reply configuration options
Option Value

Operation instance JMSGetOperation

Polling interval 1000

Thread count 1

Enable true

Source queue Queue.ActaQueueGet

Service Queue.TestService

Timeout 2000

Continue after error true

Default response queue Queue.ActaReplyQueueGet

Undelivered queue (optional) Queue.ActaUndeliveredQueue

Enable Transaction no

After entering the information in the operation instance configuration page, follow these steps to enable the options:

1. Click Apply.
2. Open the Adapter Instance Status tab.
3. Select the applicable JMS adapter instance and click Shutdown.
4. Select the applicable JMS adapter instance and click Start.
5. When the JMS Adapter Status changes from “Starting” to “Started”, the operation instance also starts running.

Parent topic: Testing the JMS adapter and operations [page 113]

Related Information

Configure the JMS provider [page 115]
Using MQ instead of JNDI configuration [page 116]
JMS adapter sample files [page 117]
Test Get: Request/Acknowledge [page 120]
Test GetTopic: Request/Acknowledge [page 122]
Test PutGet: Request/Reply [page 124]
Test Put: Request/Acknowledge [page 127]
Test PutTopic: Request/Acknowledge [page 129]

11.6.4.1 Testing Get Request/Reply on Windows

Test the Get Request/Reply operation instance using the sample files and the provided settings.

Use the following two sample files, which represent external Information Resources (IR):

● sampleTest_Send.bat
● sampleTest_Get.bat

The files are located in LINK_DIR\adapters\jms\samples.

Open a command prompt and perform the following steps:

1. Change directories to the location of the sample files and run the sampleTest_Send.bat file.

The sampleTest_Send.bat sample application sends a message to the source queue of the Get operation instance configured in the software.
2. Run the sampleTest_Get.bat file, an external IR.

The sampleTest_Get.bat sample application receives the reply from SAP Data Services on a default response queue.

The sample application sampleTest_Send.bat sends the message as a request on a source queue configured for the JMSGetOperation instance. The JMSGetOperation instance invokes the real-time batch job and also sends the reply back to the default response queue.

The sample application sampleTest_Get.bat receives the reply on this default response queue. If any error occurs while invoking another service from this Job service, then the original message is sent to the undelivered queue, for reference by the IR.
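
For example, assuming LINK_DIR is available as an environment variable on the machine where you run the test (an assumption; otherwise substitute the actual installation path), the session looks like this:

    cd /d "%LINK_DIR%\adapters\jms\samples"
    sampleTest_Send.bat
    sampleTest_Get.bat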

11.6.4.2 Testing Get Request/Reply on Unix

Run the sample application with the Get Request/Reply operation on Unix.

Use the following files:

● sampleTest_Send.sh

● sampleTest_Get.sh

The files are located in $LINK_DIR/adapters/jms/samples.

Open a command prompt and perform the following steps:

1. Change directories to $LINK_DIR/adapters/jms/samples and run the sampleTest_Send.sh.

The sample application sampleTest_Send.sh sends a message to the source queue of the operation instance configured in the software.
2. Run the sampleTest_Get.sh file.

The sample application sampleTest_Get.sh sends the message as a request on a source queue configured for JMSGetOperation instance. JMSGetOperation instance invokes the real-time batch job and also sends the reply back at the default response queue.

The sample application sampleTest_Get.sh receives the reply on this default response queue. If any error occurs while invoking another service from this Job service, then the error message is sent to the undelivered queue for reference by the IR.

11.6.5 Test Get: Request/Acknowledge

Before you run your JMS adapter with the Get (Request/Acknowledge) operation, run a test operation to ensure you get the desired results.

The Get Request/Acknowledge is a JMS adapter operation that moves messages from an external Information Resource (IR) to SAP Data Services.

To test the Get Request/Acknowledge operation on your system, create the operation using the settings in the following table. The options are in the operation instance configuration page in the SAP Data Services Management Console Administrator application. For instructions to configure an operation instance, see Adding an operation to an adapter instance [page 105].

Get Request/Acknowledge configuration options
Option Value

Operation instance JMSGetOperation

Polling interval 1000

Thread count 1

Enable true

Source queue Queue.ActaQueueGet

Service Queue.TestService

Timeout 2000

Continue after error true

Option Value

Default response queue Leave blank.

 Note

When you specify a value, this operation changes from Request/Acknowledge to Request/Reply.

Undelivered queue Leave blank.

 Note

When you specify a value, this operation changes from Request/Acknowledge to Request/Reply.

Enable Transaction no

After entering the information in the operation instance configuration page, follow these steps to enable the options:

1. Click Apply.
2. Open the Adapter Instance Status tab.
3. Select the applicable JMS adapter instance and click Shutdown.
4. Select the applicable JMS adapter instance and click Start.
5. When the JMS Adapter Status changes from “Starting” to “Started”, the operation instance also starts running.

Parent topic: Testing the JMS adapter and operations [page 113]

Related Information

Configure the JMS provider [page 115]
Using MQ instead of JNDI configuration [page 116]
JMS adapter sample files [page 117]
Test Get: Request/Reply [page 118]
Test GetTopic: Request/Acknowledge [page 122]
Test PutGet: Request/Reply [page 124]
Test Put: Request/Acknowledge [page 127]
Test PutTopic: Request/Acknowledge [page 129]

11.6.5.1 Testing Get Request/Acknowledge on Windows

Run the sample application with the Get Request/Acknowledge operation on Windows.

Use the sample file sampleTest_Send.bat located in LINK_DIR\adapters\jms\samples. It is a sample application (external Information Resource).

Open a command prompt and perform the following steps:

1. Run the sampleTest_Send.bat file.

The sampleTest_Send.bat application sends a message as a request on a source queue that is configured for the JMSGetOperation instance. The JMSGetOperation instance invokes the real-time batch job, which creates an XML output file as an acknowledgment.
2. Find the acknowledgment file named JMSSourceOutput_Get.xml in LINK_DIR\adapters\jms\samples\xml.

No reply is sent to a default response queue because none is configured for this operation.

11.6.5.2 Testing Get Request/Acknowledge on Unix

Run the sample application with the Get Request/Acknowledge operation on Unix.

Use the sample file sampleTest_Send.sh located in $LINK_DIR/adapters/jms/samples. It is a sample application (external Information Resource).

Open a command prompt and perform the following steps:

1. Change directories to the sample files and run the file sampleTest_Send.sh.

The sampleTest_Send.sh application sends a message as a request on a source queue that is configured for the JMSGetOperation instance. The JMSGetOperation instance invokes the real-time batch job, which creates an output XML file as an acknowledgment.
2. Find the acknowledgment file JMSSourceOutput_Get.xml in $LINK_DIR/adapters/jms/samples/xml.

No reply is sent to a default response queue because none is configured for this type of operation.

11.6.6 Test GetTopic: Request/Acknowledge

Before you run your JMS adapter with the GetTopic (Request/Acknowledge) operation, run a test operation to ensure you get the desired results.

The GetTopic Request/Acknowledge is a JMS adapter operation that moves messages from an external Information Resource (IR) to SAP Data Services.

To test the GetTopic Request/Acknowledge operation on your system, create the operation using the settings in the following table. The options are in the operation instance configuration page in the SAP Data Services

Management Console Administrator application. For instructions to configure an operation instance, see Adding an operation to an adapter instance [page 105].

GetTopic Request/Acknowledge configuration options
Option Value

Operation instance JMSGetTopicOperation

Polling interval 1000

Operation retry count 5

Operation retry interval 15000

Enable true

Source topic Topic.MyTopic

Durable Subscriber Leave blank

Service Topic.TestService

Timeout 2000

Continue after error true

Enable Transaction no

After entering the information in the operation instance configuration page, follow these steps to enable the options:

1. Click Apply.
2. Open the Adapter Instance Status tab.
3. Select the applicable JMS adapter instance and click Shutdown.
4. Select the applicable JMS adapter instance and click Start.
5. When the JMS Adapter Status changes from “Starting” to “Started”, the operation instance also starts running.

Parent topic: Testing the JMS adapter and operations [page 113]

Related Information

Configure the JMS provider [page 115]
Using MQ instead of JNDI configuration [page 116]
JMS adapter sample files [page 117]
Test Get: Request/Reply [page 118]
Test Get: Request/Acknowledge [page 120]
Test PutGet: Request/Reply [page 124]
Test Put: Request/Acknowledge [page 127]
Test PutTopic: Request/Acknowledge [page 129]

11.6.6.1 Testing GetTopic Request/Acknowledge on Windows

Run the sample application with the GetTopic Request/Acknowledge operation on Windows.

Use the sample file sampleTest_GetTopic.bat located in LINK_DIR\adapters\jms\samples.

Perform the following steps in a command prompt:

1. Run the sampleTest_GetTopic.bat.

The sampleTest_GetTopic.bat application publishes a message to the source topic of the GetTopic operation instance. JMSGetTopicOperation, which subscribes to the topic, receives the message and sends it to the real-time service. The service then puts the message into the file JMSFileTarget_GetTopic.xml.
2. Find JMSFileTarget_GetTopic.xml in LINK_DIR\adapters\jms\samples\xml.

11.6.6.2 Testing GetTopic Request/Acknowledge on Unix

Run the sample application with the GetTopic Request/Acknowledge operation on Unix.

Use the sample file sampleTest_GetTopic.sh located in $LINK_DIR/adapters/jms/samples.

Perform the following steps in a command prompt:

1. Run the sampleTest_GetTopic.sh file.

The sampleTest_GetTopic.sh sample application publishes a message to the source topic of the GetTopic operation instance. JMSGetTopicOperation, which subscribes to the topic, receives the message and sends it to the real-time service. The service then puts the message into the file JMSFileTarget_GetTopic.xml.
2. Find the JMSFileTarget_GetTopic.xml file in $LINK_DIR/adapters/jms/samples/xml.

11.6.7 Test PutGet: Request/Reply

Before you run your JMS adapter with the PutGet Request/Reply operation, run a test operation to ensure you get the desired results.

The PutGet Request/Reply is a JMS adapter operation that sends a request message from SAP Data Services to an external Information Resource (IR) and receives a reply.

To test the PutGet Request/Reply operation on your system, create the operation using the settings in the following table. The options are in the operation instance configuration page in the SAP Data Services Management Console Administrator application. For instructions to configure an operation instance, see Adding an operation to an adapter instance [page 105].

PutGet Request/Reply test values
Option Value

Operation instance JMSPutGetOperation

Thread count 1

Operation retry count 5

Operation retry interval 15000

Display name JMSPutGetOperation

Description This operation instance represents the PutGet Request/Reply operation. It sends the request message to the request queue and receives the reply message from the reply queue.

Enable true

Request queue Queue.ActaQueuePutGet

Reply queue Queue.ActaQueuePutGet1

Timeout 200000

Continue after error true

Request format LINK_DIR/adapters/JMS/samples/dtd/JMSPUTGET_SOURCE.dtd

Request XML root element source

Reply format LINK_DIR/adapters/JMS/samples/dtd/JMSPUTGET_RESPONSE1.dtd

Reply XML root element source

Enable transaction no

After entering the information in the operation instance configuration page, follow these steps to enable the options:

1. Click Apply.
2. Open the Adapter Instance Status tab.
3. Select the applicable JMS adapter instance and click Shutdown.
4. Select the applicable JMS adapter instance and click Start.
5. When the JMS Adapter Status changes from “Starting” to “Started”, the operation instance also starts running.

Parent topic: Testing the JMS adapter and operations [page 113]

Related Information

Configure the JMS provider [page 115]
Using MQ instead of JNDI configuration [page 116]
JMS adapter sample files [page 117]
Test Get: Request/Reply [page 118]
Test Get: Request/Acknowledge [page 120]
Test GetTopic: Request/Acknowledge [page 122]
Test Put: Request/Acknowledge [page 127]
Test PutTopic: Request/Acknowledge [page 129]

11.6.7.1 Testing PutGet Request/Reply on Windows

Run the sample application with the PutGet Request/Reply operation on Windows.

Use the following sample files:

● sampleTest_PutGet.bat
● JMSPutGetOperation_BatchJob

The files are located in LINK_DIR\adapters\jms\samples.

1. Open a command prompt and run the sampleTest_PutGet.bat file.

The application displays the message:

Ready to receive message from queue Queue.ActaQueuePutGet

2. Execute the batch job JMSPutGetOperation_BatchJob in SAP Data Services Designer.

The batch job sends the message to the request queue. The sample application sampleTest_PutGet.bat listens for a message at the request queue of the JMSPutGetOperation instance. When it receives the message:
1. It prints a message to the command prompt window such as:

Message received: 18 200000000 2356376438743

2. The sample test program then sends a reply message to the reply queue configured for the JMSPutGetOperation instance.
3. The sample test program echoes a message to the command prompt window such as:

Message sent: ReplyFromJMSIR1 ReplyFromJMSIR2 ReplyFromJMSIR3

4. After the adapter operation receives the reply from the reply queue, it sends the acknowledgment message to the batch job, which then generates the output file.
5. The batch job generates an XML output file named JMSSourceOutput_PutGet.xml located in LINK_DIR\adapters\jms\samples\xml. The contents of the file are similar to the message sent from the sample test, with the addition of a timestamp and error information.

11.6.7.2 Testing PutGet Request/Reply on Unix

Run the sample application with the PutGet Request/Reply operation on Unix.

Use the following sample files:

● sampleTest_PutGet.sh
● JMSPutGetOperation_BatchJob

Both files are located in $LINK_DIR/adapters/jms/samples.

1. Open a command prompt and run the sample application file sampleTest_PutGet.sh.
2. Execute the batch job JMSPutGetOperation_BatchJob in SAP Data Services Designer.

The following sequence occurs:
1. The batch job sends a message to the request queue.
2. The sample application sampleTest_PutGet.sh listens for the message at the request queue of the JMSPutGetOperation instance.
3. The sample application then sends a reply message to the reply queue configured for the JMSPutGetOperation instance.
4. After the batch job receives the reply from the reply queue, it generates an output XML file acknowledging the receipt. Find the XML file named JMSSourceOutput_PutGet.xml in $LINK_DIR/adapters/jms/samples.

11.6.8 Test Put: Request/Acknowledge

Before you run your JMS adapter with the Put Request/Acknowledge operation, run a test operation to ensure you get the desired results.

The Put Request/Acknowledge is a JMS adapter operation that sends messages from SAP Data Services to an external Information Resource (IR).

To test the Put Request/Acknowledge operation on your system, create the operation using the settings in the following table. The options are in the operation instance configuration page in the SAP Data Services Management Console Administrator application. For instructions to configure an operation instance, see Adding an operation to an adapter instance [page 105].

Put Request/Acknowledge test values
Option Value

Operation instance JMSPutOperation

Thread count 1

Display name JMSPutOperation

Description This operation instance represents the Put Request/Acknowledge operation. It queues the message to the configured destination queue.

Enable true

Destination queue Queue.MyQueue

Option Value

Request format LINK_DIR/adapters/JMS/samples/dtd/JMSPUT_SOURCE.dtd

Request XML root element source

After entering the information in the operation instance configuration page, follow these steps to enable the options:

1. Click Apply.
2. Open the Adapter Instance Status tab.
3. Select the applicable JMS adapter instance and click Shutdown.
4. Select the applicable JMS adapter instance and click Start.
5. When the JMS Adapter Status changes from “Starting” to “Started”, the operation instance also starts running.

Parent topic: Testing the JMS adapter and operations [page 113]

Related Information

Configure the JMS provider [page 115]
Using MQ instead of JNDI configuration [page 116]
JMS adapter sample files [page 117]
Test Get: Request/Reply [page 118]
Test Get: Request/Acknowledge [page 120]
Test GetTopic: Request/Acknowledge [page 122]
Test PutGet: Request/Reply [page 124]
Test PutTopic: Request/Acknowledge [page 129]

11.6.8.1 Testing Put Request/Acknowledge on Windows

Run the sample application with the Put Request/Acknowledge operation on Windows.

Use the following sample files:

● sampleTest_Put.bat
● JMSPutOperation_BatchJob

The files are located in LINK_DIR\adapters\jms\samples.

1. Open a command prompt and run the sample application sampleTest_Put.bat.

The application displays the following message:

Ready to receive message from queue Queue.MyQueue.

2. Execute the batch job JMSPutOperation_BatchJob in SAP Data Services Designer.

The sample application sampleTest_Put.bat (external IR) listens for a message to arrive at the destination queue of the JMSPutOperation instance. When it receives the message, it prints a message to the command prompt window such as:

Received message: 18 200000000 2356376438743

After the JMS adapter acknowledges sending the message to the sample application, the batch job generates the output file JMSSourceOutput_Put.xml. Find the XML file in LINK_DIR\adapters\jms\samples.

The contents of the XML file should be similar to the message received by the sample test with the addition of a timestamp.

 Note

This file is created as a result of the design of the job, not as a result of the adapter operation sending a reply message to the job.

11.6.8.2 Testing Put Request/Acknowledge on Unix

Run the sample application with the Put Request/Acknowledge operation on Unix.

Use the following sample files:

● sampleTest_Put.sh
● JMSPutOperation_BatchJob

The files are located in $LINK_DIR/adapters/jms/samples.

1. Open a command prompt and run the sampleTest_Put.sh file.

The sample application (external IR) listens at the destination queue that you configured for the Put operation instance.
2. Execute the batch job JMSPutOperation_BatchJob in SAP Data Services Designer.

The sample application sampleTest_Put.sh receives the message from the destination queue and outputs an acknowledgment in an XML file named JMSSourceOutput_Put.xml to the $LINK_DIR/adapters/jms/samples/xml directory.

11.6.9 Test PutTopic: Request/Acknowledge

Before you run your JMS adapter with the PutTopic Request/Acknowledge operation, run a test operation to ensure you get the desired results.

The PutTopic Request/Acknowledge is a JMS adapter operation that publishes messages from SAP Data Services to a JMS topic for external applications (IRs).

To test the PutTopic Request/Acknowledge operation on your system, create the operation using the settings in the following table. The options are in the operation instance configuration page in the SAP Data Services

Management Console Administrator application. For instructions to configure an operation instance, see Adding an operation to an adapter instance [page 105].

PutTopic Request/Acknowledge test values
Option Value

Operation instance JMSPutTopicOperation

Thread count 1

Operation retry count 5

Operation retry interval 15000

Display name JMSPutTopicOperation Display Name

Description JMSPutTopicOperation Display Name

Enable true

Destination queue Topic

Message format C:\ProgramFiles\SAP BusinessObjects\Data Services

Request XML root element source

Persistent message true

After entering the information in the operation instance configuration page, follow these steps to enable the options:

1. Click Apply.
2. Open the Adapter Instance Status tab.
3. Select the applicable JMS adapter instance and click Shutdown.
4. Select the applicable JMS adapter instance and click Start.
5. When the JMS Adapter Status changes from “Starting” to “Started”, the operation instance also starts running.

Parent topic: Testing the JMS adapter and operations [page 113]

Related Information

Configure the JMS provider [page 115]
Using MQ instead of JNDI configuration [page 116]
JMS adapter sample files [page 117]
Test Get: Request/Reply [page 118]
Test Get: Request/Acknowledge [page 120]
Test GetTopic: Request/Acknowledge [page 122]
Test PutGet: Request/Reply [page 124]
Test Put: Request/Acknowledge [page 127]

11.6.9.1 Testing PutTopic Request/Acknowledge on Windows

Run the sample application with the PutTopic Request/Acknowledge operation on Windows.

Use the sampleTest_PutTopic.bat file and the JMSPutTopicOperation_BatchJob file located in LINK_DIR\adapters\jms\samples.

1. Open a command prompt window and change directory to LINK_DIR\adapters\jms\samples.
2. Run the sample file sampleTest_PutTopic.bat.

The sample application sampleTest_PutTopic.bat displays the message:

Ready to receive message from topic Topic.MyTopic

 Note

If you do not see this message, then start the JMS publish/subscribe broker. The message should appear after you start the broker.

3. Execute the batch job JMSPutTopicOperation_BatchJob in SAP Data Services Designer.

The sample application sampleTest_PutTopic.bat listens for a message to be published by the JMSPutTopicOperation instance. When it receives the message, it prints a message to the command prompt window such as:

Received message: 18 200000000 2356376438743

After the adapter operation acknowledges sending the message to the sample application, the batch job outputs the acknowledgment XML file named JMSSourceOutput_PutTopic.xml to the directory LINK_DIR\adapters\jms\samples.

The contents of the XML file should be similar to the message received by the sample test with the addition of a timestamp.

 Note

This file is created as a result of the design of the job, not as a result of the adapter operation sending a reply message to the job.

11.6.9.2 Testing PutTopic Request/Acknowledge on Unix

Run the sample application with the PutTopic Request/Acknowledge operation on Unix.

Use the sampleTest_PutTopic.sh file and the JMSPutTopicOperation_BatchJob file located in $LINK_DIR/adapters/jms/samples.

1. Run the sampleTest_PutTopic.sh file from the command prompt.

The sampleTest_PutTopic.sh file listens at the destination topic configured for the PutTopic operation instance.
2. Execute the batch job JMSPutTopicOperation_BatchJob in SAP Data Services Designer.

The batch job initiates the message request. The external JMS-compatible application sends back an acknowledgment. The batch job writes the acknowledgment to an XML file.

11.7 WebLogic as JMS provider

Use WebLogic as your JMS provider when you use the JMS adapter.

To use WebLogic as your JMS provider, create a JMS Server and a Connection Factory, and configure the JMS queues.

A JMS server puts the JMS infrastructure onto a WebLogic server. You then create your queues so that they target the WebLogic server.

See your WebLogic documentation for instructions to configure a JMS Server and a Connection Factory.

When you configure your queues for testing purposes, use the following values:

● Queue.MyQueue
● Queue.ActaQueuePutGet
● Queue.ActaQueuePutGet1
● Queue.ActaQueueGet
● Queue.ActaReplyQueueGet
● Queue.ActaUndeliveredQueue

Parent topic: JMS adapter [page 97]

Related Information

Scope of the JMS adapter [page 98]
JMS adapter functional overview [page 98]
Design considerations [page 100]
Configuring a JMS adapter—overview [page 101]
JMS adapter datastore [page 111]
Testing the JMS adapter and operations [page 113]
Error handling and tracing [page 133]

11.8 Error handling and tracing

After processing messages with the JMS adapter, view the error and trace logs for processing information and messages that help you resolve errors.

SAP Data Services writes error and trace log files to the LINK_DIR/adapters/log directory. Data Services logs error messages in the error log file before throwing any exceptions, and names the error log file after the associated adapter.

Data Services also names the trace file after the associated adapter. Enable the trace log in the SAP Data Services Management Console when you configure the adapter. Trace messages contain the execution flow of the adapter and information that you can use to find the causes of errors.

If you have to contact SAP Support, they may ask you for the trace log to help them solve your issue.

Parent topic: JMS adapter [page 97]

Related Information

Scope of the JMS adapter [page 98]
JMS adapter functional overview [page 98]
Design considerations [page 100]
Configuring a JMS adapter—overview [page 101]
JMS adapter datastore [page 111]
Testing the JMS adapter and operations [page 113]
WebLogic as JMS provider [page 132]

12 MongoDB adapter

Use a MongoDB adapter to import your MongoDB metadata to process with SAP Data Services.

MongoDB is an open-source document database. MongoDB uses BSON, a JSON-like document format with dynamic schemas, instead of traditional table-based relational database structures.

MongoDB is schema-free; however, Data Services needs metadata for task design and execution. Therefore, Data Services generates schema data by sampling a certain number of records. Data Services also allows you to provide a JSON file from which to generate a schema for each collection.

After you create a MongoDB adapter instance and datastore, perform the following tasks:

● Browse, import, and read MongoDB entities, which are similar to MongoDB collections.
● Write data back to MongoDB after mapping entities, just as with other targets in Data Services.

For complete information about using MongoDB in Data Services, see the Data Services Supplement for Big Data.

MongoDB metadata [page 134] Use data from MongoDB as a source or target in a data flow, and also create templates.

Related Information

Adapter installation and configuration [page 12]

12.1 MongoDB metadata

Use data from MongoDB as a source or target in a data flow, and also create templates.

The embedded documents and arrays in MongoDB are represented as nested data. SAP Data Services converts MongoDB BSON documents to XML and then to XSD. Data Services saves the XSD file to the following location: LINK_DIR\ext\mongo\mcache.
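
As an illustration, consider a hypothetical document from a customers collection (the collection and field names are invented for this example). The embedded address document and the phones array become nested schemas in the generated XSD:

    {
      "_id": "c-1001",
      "name": "Maria Gomez",
      "address": { "city": "Madrid", "zip": "28001" },
      "phones": [ "555-0100", "555-0199" ]
    }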

Restrictions and limitations

Data Services has the following restrictions and limitations for working with MongoDB:

● In the MongoDB collection, the tag name cannot contain special characters that are invalid for the XSD file. For example, the following special characters are invalid for XSD files: >, <, &, /, \, #, and so on. If special characters exist, Data Services removes them.

● MongoDB data is always changing, so the XSD may not reflect the entire data structure of all the documents in the database.
● Data Services does not support projection queries on adapters.
● Data Services ignores any new fields that you add after the metadata schema creation that were not present in the common documents.
● Data Services does not support push down operators when you use MongoDB as a target.

Parent topic: MongoDB adapter [page 134]

Related Information

Formatting XML documents
Source and target objects

13 OData adapter

Use an OData adapter to import data to use in SAP Data Services data flows.

Open Data (OData) Protocol provides a method to transfer data using standard queries and interoperable RESTful APIs with Web Services. OData is flexible in that it enables interoperability between disparate data sources, applications, services, and clients.

Import OData entity sets as tables.

OData as a source [page 136] Use an imported OData table as a source in an SAP Data Services data flow.

OData as a target [page 138] Define an OData target object in a data flow in the target editor.

OData pushdown behavior [page 141] SAP Data Services uses the OData specification to push down operations to the source or target database.

Related Information

Adapter installation and configuration [page 12]
OData adapter datastore configuration options [page 45]

13.1 OData as a source

Use an imported OData table as a source in an SAP Data Services data flow.

When you use an OData table as a source in a data flow, you open the source editor to complete options specific to the table. The following table describes the OData adapter source options in the source editor.

OData Source object configuration
Option Description

Batch size Specifies the number of rows Data Services processes in the batch job.

Column delimiter Specifies the nonprintable character present in the source table for the column delimiter.

Represented as “/nnn”

Supplement for Adapters 136 PUBLIC OData adapter Option Description

Row delimiter Specifies the nonprintable character present in the source table for the row delimiter.

Represented as “/nnn”

Top Specifies the maximum number of rows that Data Services returns, starting from the beginning, and skipping any rows indicated in Skip.

Skip Specifies the number of rows that Data Services ignores at the beginning of a collection.

 Example

Enter 5 to skip the first five rows and return the rows starting with row 6. Data Services uses the following formula: if Skip=5, then the OData service returns rows starting at position 5+1 = row 6.

Number of concurrent threads Controls the number of concurrent threads that Data Services uses to load data. For more information about parallel processing, see the Performance Optimization Guide.
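
The Top and Skip options correspond to the OData system query options $top and $skip. For example, with Top=10 and Skip=5, the adapter issues a request equivalent to the following (the Customers entity set is a hypothetical example), which returns rows 6 through 15 of the collection:

    GET <service root>/Customers?$skip=5&$top=10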

OData Depth level [page 137] Depth level determines whether the OData source contains navigation properties, which affect how SAP Data Services imports OData metadata and uses OData entities.

Parent topic: OData adapter [page 136]

Related Information

OData as a target [page 138]
OData pushdown behavior [page 141]

13.1.1 OData Depth level

Depth level determines whether the OData source contains navigation properties, which affect how SAP Data Services imports OData metadata and uses OData entities.

You set the OData depth level when you configure the OData adapter datastore.

The following table describes how Data Services imports OData metadata based on the setting you make in the OData Depth level option.

Depth level value and OData entity use
Value Description

1 A Depth value of 1 indicates that the OData data does not contain navigation properties.

With no navigation properties, Data Services imports OData data as flat database tables and does not expand navigation entities. However, Data Services retains entity primary key attributes.

2 A Depth value of 2 indicates that the OData data contains navigation properties.

With navigation properties, Data Services imports all properties, including navigation and main entities, as flat database tables. Imported navigation properties are named in the format “navigation entity name”/“property name”.

Data Services expands the OData associated navigation entity, which is joined to the main entity through the navigation properties.

Data Services does not keep the entity primary key. In addition, Data Services uses the expand operator to retrieve the property data of the main and associated navigation entities. The expand operator is similar to the Join operation for tables.
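
As an illustration, the expand operator corresponds to the OData $expand system query option. Assuming hypothetical Orders and Customer entities connected by a navigation property, a depth-2 retrieval is equivalent to a request such as the following, and a property reached this way is imported under a column name such as Customer/CompanyName:

    GET <service root>/Orders?$expand=Customer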

Parent topic: OData as a source [page 136]

Related Information

OData adapter datastore configuration options [page 45]

13.2 OData as a target

Define an OData target object in a data flow in the target editor.

When you use an OData table as a target in a data flow, you open the target editor to complete options specific to the table. The following table describes the OData adapter target options in the target editor.

 Note

If you use a different target type, the target options may differ from the target options detailed in the table.

OData target object configuration
Option Description

Column delimiter Specifies the nonprintable character present in the source table for the target column delimiter. Can't be edited.

Represented as “/nnn”

Row delimiter Specifies the nonprintable character present in the source table for the target row delimiter. Can't be edited.

Represented as “/nnn”

Option Description

Loader action Specifies how Data Services loads generated data to a target with existing data.

● Create: Creates a new entity in the given target entity set.

 Note

If you load to a Microsoft Graph API object, Create is the only option to select.

● Update: Modifies an existing entity in the target using update semantics.
● Merge: Modifies an existing target entity using merge semantics.
● Upsert: Modifies an existing target entity and adds new entities if they do not already exist.

 Restriction

Because each OData adapter uses a different third-party API per OData version, there is no single method to send upsert requests to the OData service. Therefore, for the Upsert option, Data Services uses the following workflow:
○ OData version 4: The OData adapter sends an update request. If the update request fails, it sends a create request.
○ OData versions 1 and 2: The OData adapter sends a create request. If the create request fails, it sends a merge request. If both the create request and the merge request fail to process, Data Services generates an error message.

● Upsert function: Modifies an existing entity in the target and adds new entities when the entity does not exist.

 Note

For use with OData version 2 and SuccessFactors only. For SuccessFactors, unlike the Upsert option, the Upsert function option sends the function by HTTP request to SuccessFactors.

● Delete: Deletes an existing entity in the target.
● Create link: Creates a new related entity link between two entities in the target.
● Update link: Updates related entity links between two entities in the target by navigation property.

Option Description

● Delete link: Deletes related entity links between two entities in the target by navigation property.

Audit Specifies whether to log data for auditing.

● True: Logs the status for each row and creates audit files. Stores audit files in \log\LoaderAudit. The format of the file is ____.dat.
● False: Returns an error to the user interface if the OData server throws an error. Does not check for the row statuses.

 Note

Selecting False may improve performance. Therefore, if you do not need auditing data, select False.

Parent topic: OData adapter [page 136]

Related Information

OData as a source [page 136]
OData pushdown behavior [page 141]

13.3 OData pushdown behavior

SAP Data Services uses the OData specification to push down operations to the source or target database.

Data Services adheres to the following rules when pushing down operations to the source or target database:

● Includes only selectable and filterable columns in the projection.
● Uses the following Where clause form: operation .
● Includes only sortable columns in an ORDER BY clause.

 Note

You cannot use $orderby as part of an OData query.

After Data Services accepts the OData query, it checks the column metadata for allowable operations, such as LIKE and IN. The following table contains the column operations allowed by Data Services.

Allowable column operations
Operation Description

AllowFilter Allows for binary operation such as equal (=) and greater than (>).

AllowSelect Indicates that the column can be used in projection.

AllowSort Indicates that the column can be used in ORDER BY.

AllowInsert Indicates that the column can be used in target.

AllowUpdate Indicates that the column can be used in target.

AllowUpsert Indicates that the column can be used in target.
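
For example (with a hypothetical Products entity set and a Price column whose metadata includes AllowFilter), a WHERE clause such as Price > 100 can be pushed down as an OData query similar to the following; conditions on columns without AllowFilter would instead be evaluated by Data Services after the rows are retrieved:

    GET <service root>/Products?$filter=Price gt 100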

Parent topic: OData adapter [page 136]

Related Information

OData as a source [page 136]
OData as a target [page 138]
Push down operations to the database server

14 Salesforce.com adapter

Use a Salesforce.com adapter to access Salesforce.com data from the native SAP Data Services extract, transform, and load (ETL) environment.

Create and configure a Salesforce.com adapter instance in the SAP Data Services Management Console. Then create an adapter datastore in Data Services Designer to import Salesforce.com metadata. After import, Salesforce.com metadata appears under the Salesforce.com adapter node in the Datastore tab in the object library. Each table node contains two or three folders:

● The Referenced by and References subfolders contain “referenced by” and “references” relationships with the table in the parent node, as well as relationships with other tables.

 Example

If a Contact belongs to an Account, it has an AccountId column that points to its parent account. Account is “referenced by” Contact and Contact “references” Account.

● The Columns folder lists the table columns and their descriptions.

For information about datastores in general and application design processes, see the Designer Guide. For administration information, see the Administrator Guide.

The Salesforce.com DI_PICKLIST_VALUES table [page 144] The DI_PICKLIST_VALUES table is an SAP Data Services proprietary table that contains all Salesforce.com enumerated values, called picklists.

The CDC datastore table [page 144] The Salesforce.com Changed Data Capture (CDC) table contains a columns folder on input.

Use Salesforce.com for changed data [page 145] One simple usage of the Salesforce.com tables is to read changed data.

Salesforce.com error messages [page 152] When you encounter error messages during Salesforce.com adapter configuration and processing, use the information in the following table to help you understand why the error occurred and what solutions to use.

Administering the Salesforce.com adapter [page 154] Use the features in the SAP Data Services Management Console Administrator application to start services, monitor processes, and solve problems.

Related Information

Adding an adapter instance [page 13]
Creating an adapter datastore [page 31]
Salesforce.com adapter datastore configuration options [page 49]

14.1 The Salesforce.com DI_PICKLIST_VALUES table

The DI_PICKLIST_VALUES table is an SAP Data Services proprietary table that contains all Salesforce.com enumerated values, called picklists.

Import the DI_PICKLIST_VALUES table using a Salesforce.com adapter datastore.

To use the DI_PICKLIST_VALUES table as a source in a data flow, connect the picklist source to a Query transform and drill down to add a WHERE clause that filters for the picklist values to use in the data flow.

The picklist values table contains the following columns:

● OBJECT_NAME
● FIELD_NAME
● VALUE
● IS_DEFAULT_VALUE
● IS_ACTIVE
● LABEL

 Note

If you have translated picklist values in Salesforce.com, the LABEL column returns values for the language that you specify in your personal information settings. If your picklist values are not translated, the VALUE and LABEL columns return the same values.
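
For example, a WHERE clause in the Query transform might restrict the picklist rows to one field of one object. The object and field names below are illustrative; substitute names from your own Salesforce.com metadata:

    OBJECT_NAME = 'Account' AND FIELD_NAME = 'Industry'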

Parent topic: Salesforce.com adapter [page 143]

Related Information

The CDC datastore table [page 144]
Use Salesforce.com for changed data [page 145]
Salesforce.com error messages [page 152]
Administering the Salesforce.com adapter [page 154]

14.2 The CDC datastore table

The Salesforce.com Changed Data Capture (CDC) table contains a columns folder on input.

When you import a CDC table, the CDC table nodes differ from the other Salesforce.com tables. The columns folder contains the same columns as the Salesforce.com table plus three additional generated columns. Use the generated columns for CDC data retrieval. The following are the generated columns in the CDC table:

● DI_SEQUENCE_NUMBER: The sequence number (int).
● DI_OPERATION_TYPE: The operation type (varchar).
● SFDC_TIMESTAMP: The Salesforce.com timestamp (date/time).

Parent topic: Salesforce.com adapter [page 143]

Related Information

The Salesforce.com DI_PICKLIST_VALUES table [page 144]
Use Salesforce.com for changed data [page 145]
Salesforce.com error messages [page 152]
Administering the Salesforce.com adapter [page 154]

14.3 Use Salesforce.com for changed data

One simple usage of the Salesforce.com tables is to read changed data.

To be able to track changed data, Data Services enables you to specify dates, use check-pointing, or use information in the Salesforce.com tables.

The following topics describe how to configure a changed data capture (CDC) job with Salesforce.com, and update tables with changed data only.

The topics cover the following tasks:

● Read changed data from Salesforce.com
● How to use check points
● How to use the default start date in the CDC table source
● Limitations

Reading changed data from Salesforce.com [page 146] Use a Salesforce.com adapter and datastore to read changed data for regular database updates.

Using check points [page 150] A changed data capture (CDC) job uses the subscription name as check points when you read CDC from Salesforce.com.

Using the CDC table source default start date [page 150] The changed data capture (CDC) table source default start data can be a specified value, a check-point value, or a date related to the Salesforece.com retention period.

Limitations [page 151] There are some features in SAP Data Services that you cannot use when you collect changed data capture (CDC) data.

Parent topic: Salesforce.com adapter [page 143]

Related Information

The Salesforce.com DI_PICKLIST_VALUES table [page 144]
The CDC datastore table [page 144]
Salesforce.com error messages [page 152]
Administering the Salesforce.com adapter [page 154]

14.3.1 Reading changed data from Salesforce.com

Use a Salesforce.com adapter and datastore to read changed data for regular database updates.

Perform the following tasks before you configure SAP Data Services to read changed data from Salesforce.com:

● Create a Salesforce.com adapter in the SAP Data Services Management Console following the instructions in Adding an adapter instance [page 13].
● Create a Salesforce.com adapter datastore in Data Services Designer following the instructions in Creating an adapter datastore [page 31] and Salesforce.com adapter datastore configuration options [page 49].

1. Along with the Salesforce.com table, import changed data capture (CDC) table metadata into your local repository.
2. Start to build a data flow in Designer using the CDC table as the source object.
3. Add a Query transform and connect it to the CDC source.
4. Click the source object in the data flow to open the source editor.
5. Open the CDC Options tab and set options based on the descriptions in the following table.

CDC Options tab option descriptions
Option Name Description

CDC subscription name Required. Specifies a name that Data Services uses to keep track of your location in a Salesforce.com CDC table that is growing continuously.

Enter a new name to create a new subscription. A subscription name must be unique within a datastore, owner, and table name. For example, you can use the same subscription name without conflict with different tables that have the same name in the same datastore if they have different owner names.

Salesforce.com CDC uses this subscription name to mark the last row read so that the next job starts reading the CDC table from that position.

You can use multiple subscription names to identify different users who read from the same imported Salesforce.com CDC table. Salesforce.com CDC uses the subscription name to save the position of each user.

Enable check-point When you select Enable check-point, Data Services remembers the timestamp of the last load and automatically applies that timestamp as the start time for the next load. By using Enable check-point, you do not need to define a WHERE clause in the Query transform.

Get before-image for each update row Do not select Get before-image for each update row (for use only if your source can log before-images and you want to read them during change-data capture jobs) as Salesforce.com provides no before-images.


6. Open the Source tab and complete the options as described in the following table.

Source option descriptions
Option Name Description

Column delimiter Specify a one-character delimiter for data columns by entering the forward-slash (/) followed by a three-digit ASCII code to indicate an invisible character.

 Note

If the source already has a column delimiter, you cannot edit this field.

Row delimiter Specify a one-character delimiter for data rows by entering the forward-slash (/) followed by a three-digit ASCII code to indicate an invisible character.

 Note

If the source already has a row delimiter, you cannot edit this field.

Escape character Enter a one-character escape character.

 Note

If the source already has an escape character, you cannot edit this field.

CDC table source default start date This option works with the CDC Enable check-point option. Salesforce.com requires that Data Services supply a start date and end date as part of a changed data request.

Fetch deleted records Set this value to Yes to have Data Services fetch the deleted records from the table as well as the CDC data. The default value is No.

7. Add a Map_CDC_Operation transform after the Query transform.
8. Click the Map_CDC_Operation transform name to open the transform editor.
9. Configure the CDC columns based on the information in the following table.

CDC column configuration
Information Description

Sequencing Automatically populated with DI_SEQUENCE_NUMBER using sequential numbers starting at 0 each time the CDC operation starts.

Data Services sorts returned rows by the value in the DI_SEQUENCE_NUMBER column.

Row operation Automatically populated with DI_OPERATION_TYPE, which indicates the type of operation performed on the object.

Operations include:
○ I: INSERT
○ U: UPDATE
○ D: DELETE

Data Services doesn't return B, BEFORE-IMAGE records.

Information Description

Timestamp Data Services includes the SFDC_TIMESTAMP value with the time at which the operation was performed. For example, the time the object was inserted, deleted, or last updated.

 Note

Data Services may or may not set values for other columns depending on the operation type. For example, for a DELETE operation, Data Services sets only the ID. For UPDATE and INSERT, Data Services sets the columns to represent the state of the object after the operation.

10. Connect the Map_CDC_Operation transform to the target table.

The target table is where the INSERT, UPDATE, and DELETE commands are executed.

The following table shows the CDC operation mapping of data from Salesforce.com to the software:

Salesforce.com data since last CDC operation Records returned to Data Services

INSERT INSERT

UPDATE UPDATE

DELETE DELETE

INSERT & UPDATE INSERT & UPDATE

INSERT & DELETE DELETE

UPDATE & DELETE DELETE

INSERT & UPDATE & DELETE DELETE

If an object is inserted and updated after the reference time, the data flow returns two CDC records to Data Services, one for each operation. However, both records contain the same information, reflecting the state of the object after the UPDATE. So, in this type of situation, there is no way of knowing the object state after the INSERT operation.

For more information about CDC check points and using before images, see the Designer Guide.

Task overview: Use Salesforce.com for changed data [page 145]

Related Information

Using check points [page 150]
Using the CDC table source default start date [page 150]
Limitations [page 151]
Using mainframe check-points
Using before-images from mainframe sources

14.3.2 Using check points

A changed data capture (CDC) job uses the subscription name as check points when you read CDC from Salesforce.com.

Instead of reading all rows in a table to check for changed data, you can configure SAP Data Services to check for changed data beginning with the most recent set of appended rows. Because Salesforce.com saves changed data for only a limited time, it doesn't monitor the retrieving application or the data retrieved. Therefore, Data Services supplies the subscription name to read the most recent set of appended rows, and it also uses the Salesforce.com timestamp of the last record.

To enable this feature, enable check pointing in the CDC table source object in your data flow. If you disable check points, the CDC job always reads all the rows in the CDC data source, which increases processing time.

To use check-points, in the Source Table Editor, enter the CDC Subscription name and select the Enable check- point option. If you enable check-points and run a CDC job in recovery mode, the recovered job begins to review the CDC data source at the last check point.

 Note

To avoid data corruption problems, do not reuse data flows that use CDC datastores because each time a source table extracts data it uses the same subscription name. This means that identical jobs, depending upon when they run, can get different results and leave check-points in different locations in the file.
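
If you leave Enable check-point cleared and bound the extraction yourself, the WHERE clause in the Query transform typically compares the generated SFDC_TIMESTAMP column against a date that you manage. A minimal sketch, in which the table name CONTACT_CDC and the global variable $G_LAST_RUN are hypothetical:

    CONTACT_CDC.SFDC_TIMESTAMP > $G_LAST_RUN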

Parent topic: Use Salesforce.com for changed data [page 145]

Related Information

Reading changed data from Salesforce.com [page 146]
Using the CDC table source default start date [page 150]
Limitations [page 151]

14.3.3 Using the CDC table source default start date

The changed data capture (CDC) table source default start date can be a specified value, a check-point value, or a date related to the Salesforce.com retention period.

The CDC table source default start date depends on several factors. The following information presents situations for when you specify a start date and when you do not.

When you specify a start date value, Data Services uses or does not use the value based on the specified start date in relation to the Salesforce.com retention period.

● Data Services uses the specified start date under the following circumstances:
○ When the specified date is within the Salesforce.com retention period and there is no check-point information.
○ When the specified date is within the Salesforce.com retention period and after the check-point information.
● Data Services uses the check-point value as the start date under the following circumstance:
○ When the specified date is within the Salesforce.com retention period and before the check-point information.
● Data Services ignores the specified start date under the following circumstance:
○ When the specified date is outside of the Salesforce.com retention period.

When you do not specify a value for the start date, Data Services bases the date on other dates as follows:

● If a check-point value is not available during the initial execution, Data Services uses the beginning of the Salesforce.com retention period as the start date.
● If a check-point occurs within the Salesforce.com retention period, Data Services uses the check-point as the start date.
● If the check-point occurs before the Salesforce.com retention period, Data Services uses the beginning of the retention period as the start date.
● If a table is created within the Salesforce.com retention period and a check-point is not available, Data Services returns an execution error.

 Note

The solution to this situation is to open the source editor and enter a date for the CDC table source default start date. The date must fall after the date the table was created.

Parent topic: Use Salesforce.com for changed data [page 145]

Related Information

Reading changed data from Salesforce.com [page 146]
Using check points [page 150]
Limitations [page 151]

14.3.4 Limitations

There are some features in SAP Data Services that you cannot use when you collect changed data capture (CDC) data.

As noted earlier, you can use the Map_CDC_Operation transform with a CDC table as a source. However, you cannot use the following transforms in a CDC data flow:

● Table Comparison
● SQL Transform

In addition, you cannot use the LOOKUP and LOOKUP_EXT functions when you import the Salesforce.com source table with a CDC datastore. That is because Data Services imports the CDC table with three additional columns, which you cannot use to search or compare:

● DI_SEQUENCE_NUMBER
● DI_OPERATION_TYPE
● SFDC_TIMESTAMP

Parent topic: Use Salesforce.com for changed data [page 145]

Related Information

Reading changed data from Salesforce.com [page 146]
Using check points [page 150]
Using the CDC table source default start date [page 150]

14.4 Salesforce.com error messages

When you encounter error messages during Salesforce.com adapter configuration and processing, use the information in the following table to help you understand why the error occurred and what solutions to use.

During the course of designing and deploying your jobs, you may encounter error messages. The following table contains error messages and their descriptions, including suggested actions.

Salesforce.com error messages
Error Message Description

Login operation has failed. SForce.com message is {0} Invalid user name/password or user account is blocked for another reason, which is explained by the Salesforce.com message.

ACTION: Confirm password or contact Salesforce.com for more information.

Unknown object type. SForce.com message is {0} The table used in the query is no longer available or visible to the user.

ACTION: Browse Salesforce.com metadata and look for the table.

Invalid field. SForce.com message is {0} One or more fields used in the query are no longer available.

ACTION: Browse Salesforce.com metadata to determine if there is a difference between the imported table and the actual metadata. If necessary, rebuild your data flow.

Unsupported SQL statement: {0} Your data flow is not supported by Salesforce.com.

ACTION: Rebuild according to the restrictions described in this document.

Supplement for Adapters 152 PUBLIC Salesforce.com adapter Error Message Description

Malformed query: {0}. SForce.com message is {1} The submitted query is unsupported by Salesforce.com. Most likely you have encountered a bug translating between data flows and Salesforce.com queries.

ACTION: Contact product support.

Invalid session parameter: name = {0}, value = {1} The URL or batchSize session parameter is invalid. Either the URL is malformed or batchSize is not a positive integer.

ACTION: Check the integrity of the URL and confirm that the batch­ Size is a positive integer.

Invalid CDC query: {0} The data flow built over a CDC table is invalid.

ACTION: Check for (and fix) any missing WHERE clause condition for SFDC_TIMESTAMP.

There was a service connection error when talking to The adapter could not connect to Salesforce.com. SForce.com: {0} ACTION: Confirm that the web service end point is correct and ac­ cessible through your network.

There was a communication error when talking to A protocol error occurred. SForce.com: {0} ACTION: Contact product support.

There was an unexpected error. SForce.com mes­ An unknown, unexpected error occurred. sage is {0} ACTION: Contact product support.
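Where a job sets session parameters programmatically, a pre-flight check such as the following illustrative Python helper (not part of the adapter) catches both documented causes of the "Invalid session parameter" error:

from urllib.parse import urlparse

def check_session_parameters(url, batch_size):
    # The URL must be well formed and batchSize a positive integer.
    parsed = urlparse(url)
    if not (parsed.scheme and parsed.netloc):
        raise ValueError(f"Invalid session parameter: name = URL, value = {url}")
    if not (isinstance(batch_size, int) and batch_size > 0):
        raise ValueError(
            f"Invalid session parameter: name = batchSize, value = {batch_size}")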

Parent topic: Salesforce.com adapter [page 143]

Related Information

The Salesforce.com DI_PICKLIST_VALUES table [page 144]
The CDC datastore table [page 144]
Use Salesforce.com for changed data [page 145]
Administering the Salesforce.com adapter [page 154]

14.5 Administering the Salesforce.com adapter

Use the features in the SAP Data Services Management Console Administrator application to start services, monitor processes, and solve problems.

After you design Salesforce.com adapter instances and create data flows using data imported through the adapter datastore, run the applications to finalize the SAP Data Services-adapter integration. The following list contains the basic startup tasks:

● Ensure that each instance has a status of “started”.
  ○ For real-time services, start services and applications that use this service.
  ○ For batch services, start the service and schedule the job.
● Monitor the progress for each job:
  ○ Monitor pending requests.
  ○ Monitor processed requests.
  ○ Monitor failed requests.
  ○ Monitor status.

 Note

The Administrator does not automatically refresh views. To refresh views, go to the View menu and select Refresh.

● Monitor progress for each real-time service.
● Monitor the messaging progress for the adapter server.

If problems occur, try the following solutions:

● When you receive an error message, read Salesforce.com error messages [page 152] for a description and suggested troubleshooting action.
● To understand the source of a problem, use the error and trace logs.
● To trace and debug the adapter instance, enable debug tracing in the Administrator.

Parent topic: Salesforce.com adapter [page 143]

Related Information

The Salesforce.com DI_PICKLIST_VALUES table [page 144]
The CDC datastore table [page 144]
Use Salesforce.com for changed data [page 145]
Salesforce.com error messages [page 152]

15 SuccessFactors adapter

Use tables that you import with the SuccessFactors adapter as sources and targets in data flows.

Create a SuccessFactors adapter instance in the SAP Data Services Management Console Administrator application. Then create a SuccessFactors adapter datastore in SAP Data Services Designer. Use the SuccessFactors adapter datastore to import metadata from your SuccessFactors account into your local repository.

SuccessFactors push-down operations [page 155]
Include push-down operations to enable SAP Data Services to push down certain operations to the source or target database.

SuccessFactors ID field [page 156]
Each SuccessFactors table has an identification field, which is an internal key.

SuccessFactors as a source [page 156]
Use imported SuccessFactors data as a source in a data flow.

SuccessFactors as a target [page 157]
Use imported SuccessFactors data as a target in a data flow.

CompoundEmployee API [page 159]
SAP SuccessFactors CompoundEmployee API is a SOAP web service that extracts employee data out of Employee Central.

Related Information

Adding an adapter instance [page 13]
Creating an adapter datastore [page 31]
SuccessFactors adapter datastore configuration options [page 54]

15.1 SuccessFactors push-down operations

Include push-down operations to enable SAP Data Services to push down certain operations to the source or target database.

Use the SuccessFactors query language specification to create push-down operations. Keep in mind the following rules:

● Include only columns in the projection.
● Use the following format for the WHERE clause: <column> <operation> <value>.
● Include only columns in the ORDER BY clause.
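The following Python snippet builds a query string that follows the three rules above. The entity and column names are invented for illustration, and the string only shows the shape of a conforming query, not the exact SuccessFactors grammar:

columns = ["userId", "status", "lastModifiedDate"]   # columns only in the projection
where = "status = 'active'"                          # <column> <operation> <value>
order_by = ["lastModifiedDate"]                      # columns only in ORDER BY

query = ("SELECT " + ", ".join(columns) +
         " FROM User WHERE " + where +
         " ORDER BY " + ", ".join(order_by))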

For more information about push-down operations, see the Designer Guide.

Parent topic: SuccessFactors adapter [page 155]

Related Information

SuccessFactors ID field [page 156]
SuccessFactors as a source [page 156]
SuccessFactors as a target [page 157]
CompoundEmployee API [page 159]

15.2 SuccessFactors ID field

Each SuccessFactors table has an identification field, which is an internal key.

SuccessFactors creates a unique ID for each inserted row. Because SuccessFactors automatically creates and assigns the ID, ensure that the input data does not already include an ID field. If your input data already has an ID field, SuccessFactors returns an error.

When SuccessFactors updates or deletes a row, it requires that the input data includes an ID field. SuccessFactors uses the ID field to identify a row. If an updated or deleted row does not have an ID field, SuccessFactors returns an error.
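A minimal sketch of these two rules, assuming rows are plain dictionaries and the key field is named "id" (both assumptions; the helper is not a product API):

def prepare_row(row, operation):
    has_id = bool(row.get("id"))
    if operation == "insert" and has_id:
        # SuccessFactors assigns the ID itself; an incoming ID causes an error.
        raise ValueError("remove the ID field from rows to be inserted")
    if operation in ("update", "delete") and not has_id:
        # Updates and deletes identify the row by its ID.
        raise ValueError(operation + " rows must include the ID field")
    return row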

Parent topic: SuccessFactors adapter [page 155]

Related Information

SuccessFactors push-down operations [page 155]
SuccessFactors as a source [page 156]
SuccessFactors as a target [page 157]
CompoundEmployee API [page 159]

15.3 SuccessFactors as a source

Use imported SuccessFactors data as a source in a data flow.

Open the source editor by clicking the SuccessFactors table that you set as a source in a data flow. The following table contains the source option descriptions.

SuccessFactors source table option descriptions

Batch size: Specifies the number of rows to be processed as a batch.

Column delimiter: Specifies the delimiter that separates data between columns in the source table.

Row delimiter: Specifies the delimiter that separates data between rows in the source table.

Parent topic: SuccessFactors adapter [page 155]

Related Information

SuccessFactors push-down operations [page 155]
SuccessFactors ID field [page 156]
SuccessFactors as a target [page 157]
CompoundEmployee API [page 159]

15.4 SuccessFactors as a target

Use imported SuccessFactors data as a target in a data flow.

The following table contains options in the target editor for a SuccessFactors target in a data flow.

SuccessFactors target table option descriptions

Batch size: Specifies the number of rows to be processed as a batch.

Column delimiter: Specifies the delimiter from the source data that separates data between columns.

Row delimiter: Specifies the delimiter from the source data that separates data between rows.

Use auto correct: Checks the target table for existing rows before adding new rows to the table.

 Note

Enabling this option can slow job performance.

When you set this parameter to true, Data Services performs the following actions (a sketch after this table illustrates the logic):

● Row status = Insert: Inserts a new row when it doesn't already exist; updates an existing row with new data.
● Row status = Update: When the input row has no ID field, updates the existing row in the target, or inserts the row if it doesn't exist. When the input row has an ID field, updates the row with the changed data.
● Row status = Delete: Deletes the row.

Use audit: Specifies that Data Services creates a log for auditing.

● True: Logs auditing-related data for the following scenarios:
  ○ If there are no user input keys and a row cannot be deleted or updated, automatically logs the ID field.
  ○ Always logs user input keys.
  ○ If there is no input key for an inserted row, logs an error.

 Note

Specify input keys in the Query transform that is connected to the SuccessFactors target.

● False: Doesn't create an audit log.

Data Services stores the log in the \log\SFSF directory. The format of the file name is _ _.dat.
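As a conceptual sketch of the Use auto correct behavior described in the table above (illustrative only; modeling the target as a dict keyed by ID is an assumption, not how the product is implemented):

def auto_correct_load(target, row, row_status):
    rid = row.get("id")
    if row_status in ("insert", "update"):
        if rid is not None and rid in target:
            target[rid].update(row)    # existing row: update with new data
        else:
            # SuccessFactors generates the real ID on insert.
            target[rid or ("new-%d" % len(target))] = row
    elif row_status == "delete":
        target.pop(rid, None)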

Parent topic: SuccessFactors adapter [page 155]

Related Information

SuccessFactors push-down operations [page 155]
SuccessFactors ID field [page 156]
SuccessFactors as a source [page 156]
CompoundEmployee API [page 159]

15.5 CompoundEmployee API

SAP SuccessFactors CompoundEmployee API is a SOAP web service that extracts employee data out of Employee Central.

Import the CompoundEmployee entity using the SuccessFactors adapter datastore.

Import by right-clicking the applicable CompoundEmployee entity object in the metadata browser dialog box and selecting Import. Or you can import by name. After import, SAP Data Services places the CompoundEmployee entity under the Documents node of the SuccessFactors datastore. The CompoundEmployee entity has a nested structure.

For more information about importing metadata using an adapter datastore, and about nested data, see the Designer Guide.

Importing data from an .xsd file [page 160]
After importing CompoundEmployee into a datastore, SAP Data Services automatically creates and writes the CompoundEmployee metadata into LINK_DIR/ext/SFSF/EC_API_CompoundEmployee.xsd.bak.

Specify filters for CompoundEmployee as a source [page 160]
When you use the CompoundEmployee entity as a source in a data flow, use filters to select the information to receive.

Retrieve information from CompoundEmployee [page 163]
To retrieve specific CompoundEmployee data from SuccessFactors, include the XML_Map transform after the CompoundEmployee reader in the SAP Data Services data flow.

Parent topic: SuccessFactors adapter [page 155]

Related Information

SuccessFactors push-down operations [page 155]
SuccessFactors ID field [page 156]
SuccessFactors as a source [page 156]
SuccessFactors as a target [page 157]
Nested Data

15.5.1 Importing data from an .xsd file

After importing CompoundEmployee into a datastore, SAP Data Services automatically creates and writes the CompoundEmployee metadata into LINK_DIR/ext/SFSF/EC_API_CompoundEmployee.xsd.bak.

If there is only a .bak file in the SFSF directory at the time of import, Data Services imports the schema from SuccessFactors.

To read CompoundEmployee metadata from an .xsd file instead of importing it from the SuccessFactors API, perform the following steps:

1. Download the XSD file EC_API_CompoundEmployee.xsd from SuccessFactors. For information about how to download the .xsd file, see SAP Note 1900616.
2. Save the EC_API_CompoundEmployee.xsd to LINK_DIR/ext/SFSF.
3. In the Designer, re-import the CompoundEmployee API into the datastore.
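The file check behind this procedure can be expressed as follows. This is an illustrative Python sketch of the documented decision, not the product's own code:

import os

LINK_DIR = os.environ.get("LINK_DIR", ".")
XSD = os.path.join(LINK_DIR, "ext", "SFSF", "EC_API_CompoundEmployee.xsd")

def metadata_source():
    if os.path.exists(XSD):
        return "local .xsd file"      # steps 1-3 above were performed
    if os.path.exists(XSD + ".bak"):
        return "SuccessFactors API"   # only the .bak file is present
    return "SuccessFactors API"       # first import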

Task overview: CompoundEmployee API [page 159]

Related Information

Specify filters for CompoundEmployee as a source [page 160]
Retrieve information from CompoundEmployee [page 163]

15.5.2 Specify filters for CompoundEmployee as a source

When you use the CompoundEmployee entity as a source in a data flow, use filters to select the information to receive.

Open the source editor in your data flow for the CompoundEmployee source object. Then use filters to input text, global variables, or substitution parameters as filter values.

Filters for CompoundEmployee sources

LAST_MODIFIED_ON
Valid operations: =, >, >=, <, <=
Example: > to_date('2013-02-28','YYYY-MM-DD')
Range example: > $v1 and < $v2, which expands to LAST_MODIFIED_ON > to_date('2013-02-25','YYYY-MM-DD') AND LAST_MODIFIED_ON < to_date('2013-02-28','YYYY-MM-DD')

COMPANY_TERRITORY_CODE
Valid operations: =, IN
Examples: = 'IND' and IN ('IND','USA')

PERSON_ID
Valid operations: =, IN
Example: = '99'

PERSON_ID_EXTERNAL
Valid operations: =, IN
IN operator example: IN ('nkoo1', 'cgrant1')
Global variable example: = '$v1'
Note: You need to use single quotes for character data.
Substitution parameter example: = [$$param1]
Note: You need to use single quotes as part of the parameter.

COMPANY
Valid operations: =, IN
Examples: = 'SFIDC01' and = $$sp, where $$sp = 'SFIDC01'

EMPLOYEE_CLASS
Valid operations: =, IN
Example: = '2'

DEPARTMENT
Valid operations: =, IN
Examples: = 'US010001' and = 'DE010001'

DIVISION
Valid operations: =, IN
Examples: = 'DE01' and = 'divi1'

BUSINESS_UNIT
Valid operations: =, IN
Example: = '1107bufd'

LOCATION
Valid operations: =, IN
Example: = 'SE010010'

JOB_CODE
Valid operations: =, IN
Example: = 'US_U2'

PAY_GROUP
Valid operations: =, IN
Example: = 'A1'

EFFECTIVE_END_DATE
Valid operations: =, >=
Example: >= to_date('2013-02-10','YYYY-MM-DD')

 Example

Data Services pushes down filters to SuccessFactors. The following is an example of a filter and the resulting query that Data Services sends to SuccessFactors.

Filter

EFFECTIVE_END_DATE >= to_date('2013-02-10','YYYY-MM-DD')
PAY_GROUP IN ('01', '02')
JOB_CODE = '$jc'
LOCATION IN ('DE010001', '$loc')
BUSINESS_UNIT IN ('DE01','FR01')
DIVISION IN ('FR01', 'DE01')
DEPARTMENT IN ('FR010000', 'DE010001')
EMPLOYEE_CLASS = '$ec'
COMPANY IN ('FR01', 'DE01')
PERSON_ID_EXTERNAL IN ('99999', '9999_SACHIN', 'RAHUL', '11223347')
PERSON_ID IN ('16116', '15636', '14276')
COMPANY_TERRITORY_CODE IN ('FRA', 'DEU')
LAST_MODIFIED_ON >= $low_lmo AND <= $high_lmo

Where

$jc = 'DE_00'
$loc = 'FR010000'
$ec = '1'
$low_lmo = to_date('2013-03-01','yyyy-mm-dd')
$high_lmo = to_date('2013-05-30','yyyy-mm-dd')

Resulting query

SELECT person, personal_information, address_information, phone_information,
  email_information, employment_information, job_information,
  compensation_information, paycompensation_recurring,
  paycompensation_non_recurring, payment_information, accompanying_dependent,
  alternative_cost_distribution, job_relation, direct_deposit,
  national_id_card, person_relation
FROM CompoundEmployee
WHERE LAST_MODIFIED_ON >= to_date('2013-03-01','YYYY-MM-DD')
  AND LAST_MODIFIED_ON <= to_date('2013-05-30','YYYY-MM-DD')
  AND COMPANY_TERRITORY_CODE IN ('FRA', 'DEU')
  AND PERSON_ID IN ('16116', '15636', '14276')
  AND PERSON_ID_EXTERNAL IN ('99999', '9999_SACHIN', 'RAHUL', '11223347')
  AND COMPANY IN ('FR01', 'DE01')
  AND EMPLOYEE_CLASS = '1'
  AND DEPARTMENT IN ('FR010000', 'DE010001')
  AND DIVISION IN ('FR01', 'DE01')
  AND BUSINESS_UNIT IN ('DE01','FR01')
  AND LOCATION IN ('DE010001','FR010000')
  AND JOB_CODE = 'DE_00'
  AND PAY_GROUP IN ('01', '02')
  AND EFFECTIVE_END_DATE >= to_date('2013-02-10','YYYY-MM-DD')

Parent topic: CompoundEmployee API [page 159]

Related Information

Importing data from an .xsd file [page 160]
Retrieve information from CompoundEmployee [page 163]

15.5.3 Retrieve information from CompoundEmployee

To retrieve specific CompoundEmployee data from SuccessFactors, include the XML_Map transform after the CompoundEmployee reader in the SAP Data Services data flow.

 Tip

To view the SQL that Data Services sends to SuccessFactors, configure the web service information in the SAP Data Services Management Console Administrator application. Enable the trace element job_trace_sqlreaders in the Web Services Configuration dialog box. For more information about this option, see the Integrator Guide.

Obtain information from the following schema levels:

person
personal_information
address_information
phone_information
email_information
employment_information
job_information
compensation_information
paycompensation_recurring
paycompensation_non_recurring
payment_information
accompanying_dependent
alternative_cost_distribution
job_relation
direct_deposit
national_id_card
person_relation
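Conceptually, XML_Map selects one of these nested levels and joins it with the parent keys. The following Python sketch shows the idea on a parsed record; the dictionary structure and key names are assumptions for illustration, not the product's internal representation:

def rows_for_level(person, level):
    parent = {"person_id": person.get("person_id")}
    for item in person.get(level, []):
        row = dict(parent)
        row.update(item)    # one flat row per nested entry
        yield row

# e.g. list(rows_for_level(person_record, "email_information"))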

Parent topic: CompoundEmployee API [page 159]

Related Information

Importing data from an .xsd file [page 160]
Specify filters for CompoundEmployee as a source [page 160]
Configuring web service information using the Administrator

16 SSL connection support

SAP Data Services supports SSL connections in all its adapters. The setup and support can differ for each adapter.

The following table contains descriptions of the type of SSL connection for each adapter.

SSL type support for adapters

File-based (Shapefile, Excel): The connection between the Job Server and the adapter uses the SSL connection (VCF) internally. No extra configuration is needed to use an SSL connection with these adapters.

URL-based (HTTP, JMS, OData, Salesforce, SuccessFactors, Testadapter, WebService): Uses an external SSL connection. Import certificates into the Java keystore to get the SSL connection to work. For more information, see:
● Configuring SSL with the HTTP adapter [page 94]

Database connection-based (Hive, MongoDB): Configure SSL options on the Data Services server side. Also provide the necessary certificates. For more information about the SSL configuration options, see:
● Hive adapter datastore configuration options [page 35]
● MongoDB adapter datastore configuration options [page 38]

 Note

The JDBC adapter requires a local JDBC driver. The JDBC driver should handle SSL security when connecting to a database server.

Importing certificate chain when errors occur [page 166]
For URL-based services, when there is an error with the certificate, import the certification chain into SAP Data Services runtime services.

16.1 Importing certificate chain when errors occur

For URL-based services, when there is an error with the certificate, import the certification chain into SAP Data Services runtime services.

Data Services automatically includes certificates in its Java keystore so that Data Services recognizes an adapter datastore instance as a trusted website. If there is an error regarding a certificate, you can manually add a certificate back into the Java keystore. However, when there is an error for URL-based services, import the certification chain into Data Services runtime as follows:

1. In your web browser, click the lock icon in the address bar and select the wizard.
2. In the wizard, save the available certificate in base-64 encoded format to LINK_DIR\ssl\trusted_certs.
3. Stop Data Services Job Services.
4. Open the command prompt, enter cd LINK_DIR\bin, and run the program SetupJavaKeystore.bat. The command generates the jssecacerts and sslks.key files.
5. Restart Data Services Job Services and the adapter.
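As an alternative to step 1 for the server's own certificate, the following Python sketch fetches a certificate in base-64 (PEM) format. It retrieves only the leaf certificate, so intermediates in the chain may still need to be exported with the browser wizard; the host name and output path are examples only:

import ssl

pem = ssl.get_server_certificate(("login.salesforce.com", 443))
# Save under LINK_DIR\ssl\trusted_certs so SetupJavaKeystore.bat picks it up.
with open(r"trusted_certs\server.pem", "w") as f:
    f.write(pem)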

Task overview: SSL connection support [page 165]

Important Disclaimers and Legal Information

Hyperlinks

Some links are classified by an icon and/or a mouseover text. These links provide additional information. About the icons:

● Links with the icon: You are entering a Web site that is not hosted by SAP. By using such links, you agree (unless expressly stated otherwise in your agreements with SAP) to this:
  ○ The content of the linked-to site is not SAP documentation. You may not infer any product claims against SAP based on this information.
  ○ SAP does not agree or disagree with the content on the linked-to site, nor does SAP warrant the availability and correctness. SAP shall not be liable for any damages caused by the use of such content unless damages have been caused by SAP's gross negligence or willful misconduct.
● Links with the icon: You are leaving the documentation for that particular SAP product or service and are entering an SAP-hosted Web site. By using such links, you agree that (unless expressly stated otherwise in your agreements with SAP) you may not infer any product claims against SAP based on this information.

Beta and Other Experimental Features

Experimental features are not part of the officially delivered scope that SAP guarantees for future releases. This means that experimental features may be changed by SAP at any time for any reason without notice. Experimental features are not for productive use. You may not demonstrate, test, examine, evaluate or otherwise use the experimental features in a live operating environment or with data that has not been sufficiently backed up. The purpose of experimental features is to get feedback early on, allowing customers and partners to influence the future product accordingly. By providing your feedback (e.g. in the SAP Community), you accept that intellectual property rights of the contributions or derivative works shall remain the exclusive property of SAP.

Example Code

Any software coding and/or code snippets are examples. They are not for productive use. The example code is only intended to better explain and visualize the syntax and phrasing rules. SAP does not warrant the correctness and completeness of the example code. SAP shall not be liable for errors or damages caused by the use of example code unless damages have been caused by SAP's gross negligence or willful misconduct.

Gender-Related Language

We try not to use gender-specific word forms and formulations. As appropriate for context and readability, SAP may use masculine word forms to refer to all genders.

Videos Hosted on External Platforms

Some videos may point to third-party video hosting platforms. SAP cannot guarantee the future availability of videos stored on these platforms. Furthermore, any advertisements or other content hosted on these platforms (for example, suggested videos or by navigating to other videos hosted on the same site), are not within the control or responsibility of SAP.

www.sap.com/contactsap

© 2020 SAP SE or an SAP affiliate company. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP SE or an SAP affiliate company. The information contained herein may be changed without prior notice.

Some software products marketed by SAP SE and its distributors contain proprietary software components of other software vendors. National product specifications may vary.

These materials are provided by SAP SE or an SAP affiliate company for informational purposes only, without representation or warranty of any kind, and SAP or its affiliated companies shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP or SAP affiliate company products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.

SAP and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP SE (or an SAP affiliate company) in Germany and other countries. All other product and service names mentioned are the trademarks of their respective companies.

Please see https://www.sap.com/about/legal/trademark.html for additional trademark information and notices.
