Mastering Custom Integrations in SAP Datasphere: Leveraging Open Connectors for REST APIs 

Introduction 

SAP Datasphere offers a wide range of predefined source system connectors. However, there may be instances where a standard connector is unavailable, such as with certain REST APIs. To address this, SAP provides the option to leverage Open Connectors. This blog outlines the process, including:

  • Setting up the connection to SAP Open Connectors 
  • Creating a custom connector 
  • Integrating the data into SAP Datasphere 

Setting Up the BTP and Datasphere Connection

The initial step in setting up SAP Open Connectors in SAP Datasphere is to assign the Integration Suite entitlement to your BTP subaccount.

 

Once the assignment is completed, you can launch the Integration Suite from the subaccount.

 

It is crucial to ensure that the user has been granted the "Integration_Provisioner" role to enable the necessary access rights.

 

Within the Integration Suite, select "Open Connectors" as the required capability.

 

You can now launch Open Connectors from the Integration Suite.

Next, to integrate the Open Connectors account within SAP Datasphere, navigate to the "Connections" tab in the desired space and select "Integrate your SAP Open Connectors Account."

You will be prompted to enter specific parameters.

You can find the organization's Open Connectors secrets by clicking the settings button in Open Connectors.

Once the parameters are correctly configured, the account connection will be established successfully.

This process is also described in the SAP documentation.

Building a Custom Connector

SAP Open Connectors provides an extensive catalog of predefined connectors for various APIs and services (Documentation). In this example, we use a public API that provides Harry Potter-related data (Documentation). To create the custom connector, go to the "Connectors" section and click "Build New Connector."

You can either create a connector from scratch or import an existing configuration.

 

First, provide the connector's name, description, and classification. In the "Setup" step, enter the Base URL of the API and configure other relevant details.

 

Since the chosen API does not require authentication, remove the default authentication headers using a PreRequestHook.
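A minimal sketch of such a hook is shown below, assuming the standard Open Connectors JavaScript hook context; the exact context variable names, such as request_vendor_headers, depend on the hook type and are assumptions here.

```javascript
// Pre-request hook sketch: remove the default Authorization header,
// since the public API requires no authentication.
// request_vendor_headers and done() are provided by the Open Connectors
// hook runtime (variable names assumed here).
let headers = request_vendor_headers || {};
delete headers["Authorization"];
// Return the modified headers to the request pipeline.
done({ "request_vendor_headers": headers });
```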

 

In the "Resources" section, define the API endpoints to be consumed by mapping the endpoint URL to a custom endpoint name (in this case, they are identical). Additionally, configure the HTTP method (GET, POST, etc.) and any optional parameters.
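Conceptually, each resource entry pairs the connector's endpoint name with the vendor path and HTTP method. In a connector's exported JSON this corresponds roughly to the sketch below; the /characters resource name is illustrative, and field names such as vendorPath are assumptions about the export format.

```json
{
  "resources": [
    {
      "path": "/characters",
      "vendorPath": "/characters",
      "method": "GET",
      "parameters": []
    }
  ]
}
```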

 

It is essential to define the expected response model.

 

The response model can also be defined based on a sample payload. 
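For example, pasting a captured response such as the one below lets Open Connectors derive the model fields automatically; the payload shape is assumed for illustration, and the actual API may return different fields.

```json
[
  {
    "name": "Harry Potter",
    "house": "Gryffindor",
    "wizard": true,
    "wand": { "wood": "holly", "core": "phoenix feather" }
  }
]
```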

After saving the configuration, validate the connector in the "Validation" section. The validation process will highlight any errors or missing definitions.

 

Once the connector passes validation, you can "Authenticate" an instance of it for testing.

In the "API Docs" section, select the instance and test the endpoints to review the response codes and payloads.

 

The test result shows the response code and body.
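The same test can be reproduced outside the UI by calling the Open Connectors REST API directly. A minimal sketch follows: the host is region-specific, the /characters resource name is the illustrative endpoint from above, and the <...> values are placeholders for the instance's secrets.

```javascript
// Query the custom connector instance through the Open Connectors API
// (runs in Node 18+ ESM or a browser console, where fetch is built in).
const response = await fetch(
  "https://api.openconnectors.ext.hana.ondemand.com/elements/api-v2/characters",
  {
    headers: {
      // Documented Open Connectors authorization scheme:
      // "User <userSecret>, Organization <orgSecret>, Element <elementToken>"
      Authorization:
        "User <userSecret>, Organization <orgSecret>, Element <elementToken>",
    },
  }
);
console.log(response.status);       // 200 on success
console.log(await response.json()); // the endpoint's response body
```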

The "Activity" menu provides basic logging and monitoring, which is invaluable for troubleshooting issues during testing.

 

The next section describes how to add this connector as a source system in SAP Datasphere and consume it.

Consuming the Connector in SAP Datasphere

After connecting the Open Connectors account to your Datasphere space, create a new connection by selecting "Open Connector."

 

Choose the instance of your custom connector, noting that only Data Flows are currently supported for data integration.

 

With the connection established, create a new Data Flow in the Data Builder. The custom connector and its defined endpoints will now appear as selectable sources.

 

Once the data flow is configured and executed successfully, the REST API data is written into a local table in SAP Datasphere, making it available for reporting and analysis.

 

This process demonstrates how SAP Datasphere, combined with Open Connectors, can integrate external REST API data into your enterprise data landscape, providing enriched data insights and seamless connectivity for custom use cases. 

Conclusion 

In summary, integrating REST APIs into SAP Datasphere using Open Connectors provides a robust framework for expanding data connectivity beyond standard sources, enabling seamless integration of external data into your SAP landscape. This process involves configuring the BTP environment, building custom connectors, and defining API endpoints for efficient data flow into SAP Datasphere tables. While the integration offers extensive flexibility, it is crucial to be aware of its limitations, such as support constraints and the requirement for data flow-based consumption (Documentation). By leveraging Open Connectors' logging and validation tools, developers can ensure smooth implementation and quickly address potential errors. This capability enhances data-driven decision-making by combining internal datasets with external API sources, enriching reports and analytics through SAP Analytics Cloud for more comprehensive insights.

Links 

Open Connectors: https://help.openconnectors.ext.hana.ondemand.com/home 

Open Connectors Catalog: https://help.openconnectors.ext.hana.ondemand.com/home/catalog 

Building Custom Connectors: https://help.openconnectors.ext.hana.ondemand.com/home/build-elements 

Open Connectors Datasphere: https://help.sap.com/docs/SAP_DATASPHERE/be5967d099974c69b77f4549425ca4c0/9bfe7db51216449d985a0b59f5e181c4.html 


Christian has been working as a Business Intelligence Consultant at ZPARTNER since 2020. He specializes in advanced SAP BW/4HANA, HANA native modeling, and SAP Datasphere solutions. Christian has a strong technical background in ABAP programming, AMDP transformations, and Python-based data processing. He has worked on projects in various industries and developed solutions for complex data extraction, integration, and modeling.
