Channel: SAP HANA Central

General Availability of SAP HANA XS Advanced Cockpit

The new release of SAP HANA 2.0 SPS03 is around the corner, and we are excited to explain its new features. In this blog we will look specifically at the SAP HANA XS Advanced Cockpit, which replaces the old SAP HANA XS Advanced Administration Tools, announced as “deprecated” in this release. We will refer to the SAP HANA XS Advanced Cockpit as XSA Cockpit and to the SAP HANA XS Advanced Administration Tools as XSA Admin Tools in this blog. One of the major reasons for moving to the XSA Cockpit is to provide users a consistent, unified experience in the cloud and on premise. The XSA Cockpit includes all the functionality of the XSA Admin Tools and provides many additional capabilities. Its UX and design are very similar to the SAP Cloud Platform Cockpit, giving users a seamless experience in the cloud and on premise.

Before we dive deeper into the new features, let me briefly explain how to install and access the XSA Cockpit. If you are installing SAP HANA 2.0 SPS03, the XSA Cockpit is deployed automatically. If you are on an earlier release, you need to install it with the XS command-line client. Log in to the SAP Support Portal, click Download Software, and enter “SAP HANA Platform Edition 2.0” as the search term. Choose the second entry, which is the maintenance product, as shown in the image below. Click XSA Cockpit to download the zip file and, once the download is complete, install the XSA Cockpit by executing the command xs install <zip-file-name>. Then run the xs apps command to determine the URL of the XSA Cockpit and launch it in your browser.
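The installation steps can be summarized as a command sketch; the controller endpoint and user below are illustrative examples, and <zip-file-name> stands for the archive you actually downloaded:

```shell
# Log on to the XSA controller as an administrator (endpoint and user are examples)
xs login -a https://<hana-host>:30030 -u XSA_ADMIN
# Install the XSA Cockpit from the downloaded archive
xs install <zip-file-name>
# List the deployed applications to find the XSA Cockpit URL
xs apps
```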


In SAP HANA 2.0 SPS01 we provided a preview of the XSA Cockpit with basic functionality such as organization and space management, and in SAP HANA 2.0 SPS02 we added authorization management. Now, in SAP HANA 2.0 SPS03, the XSA Cockpit provides all the functionality of the XSA Admin Tools, and a lot more. The image below shows a screenshot of the XSA Cockpit. The user logged in here has the admin role and is therefore allowed to perform all CRUD operations on organizations. UI elements are enabled selectively, based on the role of the logged-in user.


Similar to the SAP Cloud Platform Cockpit, clicking on an organization shows the spaces assigned to it, as shown in the following image. Notice that the tabs in the left navigation panel are different because we are now at the organization scope. We can navigate back to the Home scope using the breadcrumb.


By choosing any of these spaces, we can see the list of applications deployed in that space. The details of application instances and the memory and disk limits are displayed here. We can view additional information by clicking the Monitoring tab in the navigation panel.


The additional capabilities of the XSA Cockpit compared to the XSA Admin Tools include the possibility to browse through the Service Marketplace, as shown in the image below. Using the Service Marketplace, we can choose a service and create service instances, which can then be bound to applications. The Service Marketplace also includes user-provided services.


Now, let’s navigate back to the Home level to view the security capabilities. Under the Security tab, we can create and maintain role collections, trust configurations, and trust certificates. The trust certificates, along with their status, can be seen in the image below.


The XSA Cockpit provides the Tenant Databases tab for creating new tenant databases, enabling them for XSA, and mapping them to an organization and/or space. This can all be done on the same page of the XSA Cockpit, as shown below, whereas the XSA Admin Tools provided it under two separate tiles.


The Host Management tab shows an overview of the different hosts on which XSA is running. As we are currently running a single-host system, we see only one host in the image below. In multi-host installations, however, we will see the different hosts along with the pinned apps and spaces on each host.


Under the User Management tab we can create and manage users. All further functionality, such as viewing audit logs and managing the application lifecycle, is included in the More tab. Additionally, every page of the XSA Cockpit has a help-panel button that explains the current page, as well as the option for users to create support tickets directly, if necessary. These are shown in the following two images.


The XSA Cockpit is also integrated into the SAP HANA Cockpit and SAP Web IDE. With the SAP HANA 2.0 SPS03 release, on choosing the Administer XS Advanced option in the SAP HANA Cockpit, users are informed of the deprecation of the XSA Admin Tools and can choose either the XSA Cockpit or the XSA Admin Tools. In future releases, the XSA Admin Tools will no longer be supported. And, of course, we included the XSA Cockpit in SAP HANA, express edition too.


Create Resource Group and Resources in SAP HANA Cockpit 2.0 SP 06

SAP HANA Cockpit is the new web-based administration tool for monitoring and managing SAP HANA 2.0 databases. We recently installed the latest release, SAP HANA Cockpit 2.0 SP 06 patch 04, for a client in order to monitor their BW/BPC HANA landscape.

To monitor and manage the HANA databases for the complete landscape, we installed HANA Cockpit 2.0 SP 06 patch 04 on a separate, dedicated server with 32 GB of RAM. Once the Cockpit instance was installed, we created resource groups and resources via the SAP HANA Cockpit Manager web URL. A resource is a remote HANA database that we want to monitor via HANA Cockpit, and a resource group is a set of commonly managed HANA databases for group-level monitoring. For example, we planned to manage all BW/BPC non-production HANA databases via a single resource group, so we added all non-production BPC HANA databases as resources in the HANA Cockpit and then assigned these resources to the resource group.

I want to share how we configured the remote BW/BPC development HANA databases in HANA Cockpit Manager. A few prerequisites should be in place before we configure resources:
1. HANA Cockpit Manager is accessible via the URL https://fullhostname:510024

2. We are able to log in to the Cockpit Manager via the default cockpit user cockpit_admin or with an individually created cockpit user, e.g. bsharma.

3. The logged-in user (cockpit_admin or bsharma) should have roles such as Cockpit Resource Administrator and Cockpit User.

Sequence of activities performed:

1. Create Resource Group
2. Register Resource
3. Assign the resource to a resource group
4. Validate resource monitoring via HANA Cockpit

Create a Resource Group:


Log in to SAP HANA Cockpit Manager at https://fullhostname:510024/ with a cockpit user.


We will see the Cockpit Manager welcome screen. Only the features permitted by the assigned authorizations are available, e.g. creating a resource group and registering a resource. Click ‘Create Resource Group’ to create a resource group for group-level monitoring.


Enter a name for the resource group according to your strategy for managing the customer landscape. For example, we created the group BW_BPC_NONPRD to monitor and manage all BW/BPC non-production HANA databases.


There are no resources registered yet, so continue by clicking ‘Step 2’.


Click the ‘add cockpit users’ button to browse the available users.


Select the new cockpit user, e.g. bsharma. This user will have access to the resource group BW_BPC_NONPRD and can therefore monitor and manage all the resources within the group.


Click the button – Create Group – to create the resource group.


The confirmation screen should display the details of the newly created resource group BW_BPC_NONPRD.


Register a Resource:


As discussed before, a resource is the remote HANA database we want to monitor and manage via SAP HANA Cockpit 2.0. Log in to the Cockpit Manager again, go to the home page, and click ‘Register a Resource’.


Select ‘System generated’ resource name; for multiple-container HANA database systems this creates a resource name such as SID@SID. Provide the details for the HANA database to be monitored and managed:

◈ Host: full hostname with domain
◈ Identifier: instance number (or port number)
◈ Container: Multiple Containers, as our remote HANA database system is a multi-container system. We selected Tenant Database so as to add the BW/BPC development tenant database as the resource to be monitored via HANA Cockpit
◈ Database Name: must be provided if we are adding a tenant database as a resource
◈ Description: a short description of the remote HANA database we want to monitor


Provide a technical user, i.e. a user that exists in the target remote HANA database. For example, we created a user COCKPIT_MONITORING in the tenant database. The user should have minimal authorizations and can be created in the remote database using SQL commands:
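As a sketch only: the user name matches the one above, but the password and the exact privilege set are assumptions; grant only what your Cockpit use cases actually need.

```sql
-- Hypothetical minimal monitoring user; adjust the password policy and
-- the privileges to your requirements.
CREATE USER COCKPIT_MONITORING PASSWORD "Initial1Password" NO FORCE_FIRST_PASSWORD_CHANGE;
GRANT CATALOG READ TO COCKPIT_MONITORING;
GRANT SELECT ON SCHEMA _SYS_STATISTICS TO COCKPIT_MONITORING;
```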


Select whether or not to encrypt the connection. Check the options and continue to step 4.


Click ‘Add Group’ to assign a resource group; the resource we are registering will then be a member of the resource group BW_BPC_NONPRD.


Select BW_BPC_NONPRD resource group, which we created before.


Continue with the selection after specifying the resource group.


Provide contact details for the person responsible for the resource/database and click Review.


Click the button – Register – to register the Resource.


‘Resource Details’ shows the details of the registered resource. The remote HANA tenant database has been registered as resource GHC@GHC in SAP HANA Cockpit 2.0.


Validate the resource monitoring via SAP HANA Cockpit 2.0


We can click ‘Go to SAP HANA Cockpit’ at the bottom of the screen in Cockpit Manager to open HANA Cockpit and monitor the added resource, or open SAP HANA Cockpit 2.0 directly via the web URL https://fullhostname:510022


Go to ‘Resources’ to see the details of the remote HANA tenant database. The resource state should be ‘Running’.


Click on Resource, for e.g. GHC@GHC, to monitor the resource / HANA tenant database.


S/4HANA Cloud Integration | Commercial Project Part 1


Introduction


In a professional services project you might use S/4HANA Cloud in combination with other solutions to run the relevant business processes for professional services, such as opportunity to cash. A typical scenario: project opportunities are managed in another solution, and once an opportunity is realized and priced, a project is created in S/4HANA Cloud.


To integrate these processes you need to understand how the S/4HANA Cloud APIs related to commercial projects work and how to address them. Depending on the source system where the priced opportunity is created, you might need a middleware such as SAP Cloud Platform Integration to map the message structure.

In this blog I would like to show you how to leverage these S/4HANA Cloud APIs. We will start by understanding what these APIs are, where to find information and documentation, and how to test the APIs directly in S/4HANA Cloud.

The scenario in this blog will look like this:


In a following blog we will create an OData Service and Integration Flows on SAP Cloud Platform Integration to consume the S/4HANA Cloud API. Once you have your OData Service established, other systems can connect to SAP Cloud Platform Integration to match their source structure to the structure of the OData service.


For this blog you should ideally have a basic understanding of S/4HANA Cloud, APIs, and SAP Cloud Platform Integration.

Discover and explore Commercial Project APIs


A good starting point for understanding which API to use, and how, is the SAP API Business Hub. It lists the APIs you can use for integration and includes a testing tool and detailed information on the structure of each API.

Under APIs –> S/4HANA Cloud –> Artifacts, you can search for your desired API on the right side; in our case I filter for “project”:


Currently we have five APIs related to projects. In this blog we will be using the APIs:

◈ Create and Update Commercial Projects (OData)
◈ Read Commercial Projects (OData)

The overview page gives you details regarding the communication scenario of your APIs, in our case SAP_COM_0054. You will need to activate this communication scenario in your S/4HANA Cloud system to be able to use the relevant APIs. Additionally, the overview page links to the documentation of your API under Business Documentation.


The API documentation is very important for understanding the complete functionality of the API, what the service nodes are, and which fields are mandatory and optional. For our read and create/update APIs the service nodes are ProjectSet, WorkpackageSet, WorkItemSet, PlanDataSet (read), and DemandSet (create/update). In this blog we will be using the service node ProjectSet.


Clicking on the service node ProjectSet leads you to the list of mandatory and optional parameters of the service. In our case there are 12 mandatory fields for the create API that need to be filled in to test the create/update service.


Enable APIs in S/4HANA Cloud


Communication Management

To be able to address the API, we need to activate the relevant communication scenario (SAP_COM_0054). You can find detailed steps on how to activate it in the API Business Hub under S/4HANA Cloud –> Documents –> Testing API Services of SAP S/4HANA Cloud.


In your S/4HANA Cloud system, under Communication Management, you need to set up:

◈ A communication user for the technical exchange with your source system
◈ Your source system, from which the messages will come; for this test you can type in any name (I used APIHUB)
◈ The communication arrangement SAP_COM_0054, which you then activate


Once you have activated the scenario, you can see that both the read and the create/update APIs are part of communication scenario SAP_COM_0054.


Consume S/4HANA Cloud API directly


Testing

In this scenario we will be testing the APIs that we just activated in S/4HANA Cloud directly via Postman.

Read project

Once you have activated your communication arrangement in S/4HANA Cloud the read API will have an endpoint similar to this. Please replace <S4HC> with your S/4HANA Cloud tenant:

https://<S4HC>api.s4hana.ondemand.com/sap/opu/odata/CPD/SC_EXTERNAL_SERVICES_SRV

As mentioned before you need to include the service node that you want to address in the URL, in our case ProjectSet:

https://<S4HC>api.s4hana.ondemand.com/sap/opu/odata/CPD/SC_EXTERNAL_SERVICES_SRV/ProjectSet

In Postman you can execute the GET method with the credentials of your communication user, which you created when you set up the communication arrangement, and you will receive the details of the current commercial projects in the system. If you want to search for a specific project, add the project ID at the end of the URL; in our case the project ID is “API”:

https://<S4HC>api.s4hana.ondemand.com/sap/opu/odata/CPD/SC_EXTERNAL_SERVICES_SRV/ProjectSet('API')
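The same read call can also be scripted outside Postman. A minimal sketch with Python’s standard library; the tenant prefix “my300000-” and the credentials are placeholders for your own values:

```python
import base64
import urllib.request

# Hypothetical tenant prefix -- replace with your own S/4HANA Cloud tenant
TENANT = "my300000-"
BASE = ("https://" + TENANT +
        "api.s4hana.ondemand.com/sap/opu/odata/CPD/SC_EXTERNAL_SERVICES_SRV")

def project_url(project_id=None):
    """ProjectSet URL, optionally keyed by a project ID in OData style."""
    suffix = "('%s')" % project_id if project_id else ""
    return BASE + "/ProjectSet" + suffix

def read_project(project_id, user, password):
    """GET one commercial project, authenticating as the communication user."""
    req = urllib.request.Request(project_url(project_id))
    token = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(req) as resp:  # performs the network call
        return resp.read().decode("utf-8")
```

Calling read_project("API", user, password) returns the Atom XML shown below.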


You will receive a response xml with the project details:

<?xml version="1.0" encoding="utf-8"?>
<entry xml:base="https://<S4HC>-api.s4hana.ondemand.com/sap/opu/odata/CPD/SC_EXTERNAL_SERVICES_SRV/" xmlns="http://www.w3.org/2005/Atom" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices">
  <id>https://<S4HC>-api.s4hana.ondemand.com/sap/opu/odata/CPD/SC_EXTERNAL_SERVICES_SRV/ProjectSet('API')</id>
  <title type="text">ProjectSet('API')</title>
  <updated>2018-04-03T13:31:28Z</updated>
  <category term="/CPD/SC_EXTERNAL_SERVICES_SRV.Project" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme"/>
  <link href="ProjectSet('API')" rel="self" title="Project"/>
  <link href="ProjectSet('API')/WorkpackageSet" rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/WorkpackageSet" type="application/atom+xml;type=feed" title="WorkpackageSet"/>
  <content type="application/xml">
    <m:properties>
      <d:ChangedBy>CC0000000002</d:ChangedBy>
      <d:ProfitCenter>YB102</d:ProfitCenter>
      <d:ChangedOn>2018-03-29T09:49:40.0000000Z</d:ChangedOn>
      <d:ProfitCenterName>Consulting Unit B</d:ProfitCenterName>
      <d:ProjectID>API</d:ProjectID>
      <d:ProjectName>ProjectAPI</d:ProjectName>
      <d:ProjectStage>P001</d:ProjectStage>
      <d:StageDesc>In Planning</d:StageDesc>
      <d:StartDate>2018-03-29T00:00:00</d:StartDate>
      <d:EndDate>2018-03-29T00:00:00</d:EndDate>
      <d:Customer>IC1020</d:Customer>
      <d:CustomerName><CUSTOMERNAME></d:CustomerName>
      <d:ProjManagerId>50000701</d:ProjManagerId>
      <d:ProjManagerName><PM_Name></d:ProjManagerName>
      <d:ProjAccountantId/>
      <d:ProjAccountantName/>
      <d:ProjControllerId/>
      <d:ProjControllerName/>
      <d:ProjPartnerId/>
      <d:ProjPartnerName/>
      <d:CostCenter>0010201903</d:CostCenter>
      <d:CostCenterName>Csltg Unit B-(DE)</d:CostCenterName>
      <d:ProjectCategory>P</d:ProjectCategory>
      <d:Currency>EUR</d:Currency>
      <d:Currencyname>European Euro</d:Currencyname>
      <d:OrgID>1020</d:OrgID>
      <d:OrgDesc><ORGDESCRIPTION></d:OrgDesc>
      <d:Confidential>N</d:Confidential>
    </m:properties>
  </content>
</entry>

Running the read API before using the create project API helps us understand the structure of the message with which we address the create API. Note:

◈ The cost center field requires 10 digits; our cost center (10201903) has only 8 digits, so we need to add 2 leading zeros
◈ Start and end dates require a specific date and time format
◈ The read API returns the project manager ID, while the create API requires the project manager external ID. The project manager ID is the personnel ID of the business user, while the project manager external ID is the user name of the business user.

Create project

To create a project you need to execute the POST method on the create project API with relevant service node, in our case ProjectSet:

https://<S4HC>api.s4hana.ondemand.com/sap/opu/odata/CPD/SC_PROJ_ENGMT_CREATE_UPD_SRV/ProjectSet

Before we can use the POST method, we require an x-csrf-token. We can get it by executing the GET method on the API service with the header x-csrf-token: fetch; the token is returned in the response headers. Note that you do not need the service node here:


You need to have the payload ready in JSON format, including all the relevant project details, and all 12 mandatory fields mentioned before must be maintained. You can get the model schema for the API in the API Business Hub:


The model schema includes some optional fields but not all mandatory fields, which I added manually. The payload below contains only the mandatory fields for service node ProjectSet:

{
  "ProjectCategory": "C",
  "OrgID": "your Org ID",
  "CostCenter": "your 10 digit cost center",
  "ProfitCenter": "your profit center",
  "Customer": "your customer ID",
  "Currency": "EUR",
  "ProjectID": "API",
  "ProjectName": "ProjectAPI",
  "ProjectStage": "P001",
  "ProjManagerExtId": "your user name",
  "StartDate": "2018-03-29T00:00:00.0000000",
  "EndDate": "2018-03-29T00:00:00.0000000"
}

Now you can use the POST method on the create API with the payload, the x-csrf-token, and the credentials of your communication user. If the post succeeds, you should be able to find the project in S/4HC:
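The token fetch and the POST can be sketched with Python’s standard library; the tenant prefix is a placeholder, and the header handling follows common OData CSRF usage:

```python
import json
import urllib.request

# Hypothetical endpoint -- replace "my300000-" with your tenant prefix
CREATE_SRV = ("https://my300000-api.s4hana.ondemand.com"
              "/sap/opu/odata/CPD/SC_PROJ_ENGMT_CREATE_UPD_SRV")

def token_fetch_request():
    """GET request that asks the service for an x-csrf-token."""
    req = urllib.request.Request(CREATE_SRV)
    req.add_header("x-csrf-token", "fetch")  # token comes back in the response headers
    return req

def build_create_request(payload, csrf_token, cookie):
    """POST request for ProjectSet, carrying the previously fetched token."""
    req = urllib.request.Request(
        CREATE_SRV + "/ProjectSet",
        data=json.dumps(payload).encode("utf-8"),
        method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("x-csrf-token", csrf_token)
    req.add_header("Cookie", cookie)  # session cookie returned by the token fetch
    return req
```

Both requests would still need an Authorization header for the communication user and would be sent with urllib.request.urlopen.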


By default, a work package is automatically created with the name of the project. If you want to post additional information when creating a project, such as work packages, work items, or project role demand, you can use the following payload structure:

{
  "ProjectCategory": "C",
  "OrgID": "your Org ID",
  "CostCenter": "your 10 digit cost center",
  "ProfitCenter": "your profit center",
  "Customer": "customer ID",
  "Currency": "EUR",
  "ProjectID": "API2",
  "ProjectName": "ProjectAPI2",
  "ProjectStage": "P001",
  "ProjManagerExtId": "user name",
  "StartDate": "2018-04-29T00:00:00.0000000",
  "EndDate": "2018-05-29T00:00:00.0000000",
  "WorkPackageSet": {
    "results": [
      {
        "ProjectID": "API2",
        "WorkPackageID": "API2.1.1",
        "WorkPackageName": "WorkPackageAPI2",
        "WPStartDate": "2018-04-29T00:00:00.0000000",
        "WPEndDate": "2018-05-29T00:00:00.0000000",
        "WorkItemSet": {
          "results": [
            {
              "ProjectID": "API2",
              "WorkPackageID": "API2.1.1",
              "WorkPackageName": "WorkPackageAPI2",
              "Workitem": "your work item",
              "Workitemname": "your work item name"
            }
          ]
        },
        "DemandSet": {
          "results": [
            {
              "ProjectID": "API2",
              "WorkPackageID": "API2.1.1",
              "ResType": "your resource type",
              "ResourceId": "your resource ID",
              "Effort": "10",
              "Uom": "H",
              "Workitem": "your work item",
              "WorkPackageName": "WorkPackageAPI2"
            }
          ]
        }
      }
    ]
  }
}

Again, if you struggle to find the values for any of these fields, you can use the read (GET) method to look at the existing data for these fields in your S/4HANA Cloud system; that gives you an indication of what values to enter.

If you set it up correctly another project is created:


Additionally a work package is created with the ID we specified and the work item is assigned to the work package. In our case the work item ID is “P002” and the work item name is “Testing”:


Also a project role is created in our work package:


S/4HANA Cloud Integration | Commercial Project Part 2


Introduction


Consume S/4HANA Cloud APIs indirectly via SAP Cloud Platform Integration

In this blog we will focus on indirect S/4HANA Cloud API consumption via SAP Cloud Platform Integration (in this blog referred to as CPI).

Ideally you have read through part 1: http://www.saphanacentral.com/2018/04/s4hana-cloud-integration-commercial.html


You might not want to address the API directly in S/4HANA Cloud but indirectly via a middleware. This is the use case when you plan to connect S/4HANA Cloud to another system and extensive mapping and routing operations are required. In our scenario we will use SAP Cloud Platform Integration (CPI). In CPI you can expose an OData API, which both the read and the create project APIs are, as an Integration Flow or as an OData Service. In our case we will create an OData Service, an Integration Flow with an OData sender adapter, and an Integration Flow with an HTTPS sender adapter. To do this you need to have activated the relevant APIs in S/4HANA Cloud (read part 1 for further information).

OData Catalog Service


If you are unsure of the URL to your OData service and want to view all deployed OData services in CPI, you can use the catalog service:

https://<CPI-tenant>/gw/CATALOGSERVICE;v=1/ServiceCollection

OData Service


In SAP Cloud Platform Integration you can expose existing S/4HANA Cloud SOAP and OData APIs as OData Services. The benefit is that this creates the relevant default mappings, while the Integration Flow does not. To do that, instead of creating an Integration Flow in your integration package, create an OData Service:


Then define a service name and namespace, which will be used in the URL to address the OData service. Next, at the top right, click “Import Model Wizard” and import the EDMX file you downloaded for your API from the API Business Hub. Select the OData hierarchy that you want to include in your OData Service; in our case we included all four service nodes:


Once you have checked the structure and pressed “Finish”, you can view your OData model in the Graphical Model Viewer at the top right corner:


It displays the structure of your OData service graphically:


Now we define the method with which to address the OData service. Because we want to create a project in S/4HC, we choose the method CREATE and press “Bind”:


Here we set the entity set to “ProjectSet” and our endpoint is the S/4HC API:


Once we press OK, an Integration Flow is created automatically for the CREATE method with these parameters. You need to check that both the sender and receiver OData adapters are configured correctly and that both mappings are correct:


Once this is done, you can deploy the OData service. In our case we use “CREATEPROJECTS” as the service name and “SAP” as the namespace, which means our URL has the following format:

https://<CPI-tenant>/gw/odata/SAP/CREATEPROJECTS;v=1/ProjectSet

Here again, you can use the same payload structure from part 1 of my blog series and just change the project ID and project name:

{
  "ProjectCategory": "C",
  "OrgID": "1020",
  "CostCenter": "0010201903",
  "ProfitCenter": "YB102",
  "Customer": "IC1020",
  "Currency": "EUR",
  "ProjectID": "API",
  "ProjectName": "ProjectAPI",
  "ProjectStage": "P001",
  "ProjManagerExtId": "<User_name>",
  "StartDate": "2018-03-29T00:00:00.0000000",
  "EndDate": "2018-03-29T00:00:00.0000000"
}

If successful, you should see a success message in the CPI monitor:


Once you check the S/4HANA Cloud system you will see that a new project has been created:


Integration Flow with OData sender adapter


In this case we create an Integration Flow with both an OData sender and an OData receiver adapter.

In the OData sender adapter
  • include the EDMX file of the OData API, which you can download from API Business Hub
  • set the Entity Set to ProjectSet:

In the receiver OData adapter
  • enter the URL of your S/4HANA endpoint, insert the name of your communication user, and set the resource path to “ProjectSet”:

In our scenario our integration flow consists of
  • 2 scripts
  • 1 request-reply
  • 1 message mapping
S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning
  • First we add a script that lets us read the request payload
  • on the left-hand side, navigate to Script
  • create a Groovy Script
  • replace the default script with the following:
/*
 * The integration developer needs to create the method processData 
 * This method takes Message object of package com.sap.gateway.ip.core.customdev.util
 * which includes helper methods useful for the content developer:
 * 
 * The methods available are:
    public java.lang.Object getBody()
    //This method helps User to retrieve message body as specific type ( InputStream , String , byte[] ) - e.g. message.getBody(java.io.InputStream)
    public java.lang.Object getBody(java.lang.String fullyQualifiedClassName)
    public void setBody(java.lang.Object exchangeBody)
    public java.util.Map<java.lang.String,java.lang.Object> getHeaders()
    public void setHeaders(java.util.Map<java.lang.String,java.lang.Object> exchangeHeaders)
    public void setHeader(java.lang.String name, java.lang.Object value)
    public java.util.Map<java.lang.String,java.lang.Object> getProperties()
    public void setProperties(java.util.Map<java.lang.String,java.lang.Object> exchangeProperties) 
    public void setProperty(java.lang.String name, java.lang.Object value)
 * 
 */
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
def Message processData(Message message) {
    def messageLog = messageLogFactory.getMessageLog(message);
    // attach the incoming request payload to the message processing log
    def bodyAsString = message.getBody(String.class);
    messageLog.addAttachmentAsString("Request Message", bodyAsString, "text/xml");
    return message;
}

Then add the Request Reply step:
  • on the left-hand side, go to External Call, then create a Request Reply:
S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

Then we will add a message mapping for the response message from S/4HANA Cloud. Here we do a 1-to-1 mapping of the entity set ProjectSet.
  • Create a message mapping
  • Go to “edit message”
  • under “Source Messages” go to “edit message”
  • choose the EDMX file of your OData API
  • select “ProjectSet” as Element
  • under “Target Messages” go to “edit message”
  • choose the EDMX file of your OData API
  • select “ProjectSet” as Element
  • press OK
S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

  • click on the ProjectSet node on both source and target structure
  • click the AB mapping button in the top-right corner; it will automatically create your 1-to-1 mapping
S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

Then we include a script to display the response payload:
  • on the left-hand side, navigate to Script
  • create a Groovy Script
  • replace the default script with the following:
/*
 * The integration developer needs to create the method processData 
 * This method takes Message object of package com.sap.gateway.ip.core.customdev.util
 * which includes helper methods useful for the content developer:
 * 
 * The methods available are:
    public java.lang.Object getBody()
    //This method helps User to retrieve message body as specific type ( InputStream , String , byte[] ) - e.g. message.getBody(java.io.InputStream)
    public java.lang.Object getBody(java.lang.String fullyQualifiedClassName)
    public void setBody(java.lang.Object exchangeBody)
    public java.util.Map<java.lang.String,java.lang.Object> getHeaders()
    public void setHeaders(java.util.Map<java.lang.String,java.lang.Object> exchangeHeaders)
    public void setHeader(java.lang.String name, java.lang.Object value)
    public java.util.Map<java.lang.String,java.lang.Object> getProperties()
    public void setProperties(java.util.Map<java.lang.String,java.lang.Object> exchangeProperties) 
    public void setProperty(java.lang.String name, java.lang.Object value)
 * 
 */
import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
def Message processData(Message message) {
    def messageLog = messageLogFactory.getMessageLog(message);
    // attach the response payload to the message processing log
    def bodyAsString = message.getBody(String.class);
    messageLog.addAttachmentAsString("Response Message", bodyAsString, "text/xml");
    return message;
}

Now we can test this integration flow. Below is the payload structure for posting. Please insert your own data into this payload before posting.

{
  "ProjectCategory": "C",
  "OrgID": "your org ID",
  "CostCenter": "your 10 digit cost center",
  "ProfitCenter": "your profit center",
  "Customer": "your customer ID",
  "Currency": "EUR",
  "ProjectID": "API3",
  "ProjectName": "ProjectAPI3",
  "ProjectStage": "P001",
  "ProjManagerExtId": "your user name",
  "StartDate": "2018-04-29T00:00:00.0000000",
  "EndDate": "2018-05-29T00:00:00.0000000",
  "WorkPackageSet": {
    "results": [
      {
        "ProjectID": "API3",
        "WorkPackageID": "API3.1.1",
        "WorkPackageName": "WorkPackageAPI3",
        "WPStartDate": "2018-04-29T00:00:00.0000000",
        "WPEndDate": "2018-05-29T00:00:00.0000000",
        "WorkItemSet": {
          "results": [
            {
              "ProjectID": "API3",
              "WorkPackageID": "API3.1.1",
              "WorkPackageName": "WorkPackageAPI3",
              "Workitem": "your work item",
              "Workitemname": "your work item name"
            }
          ]
        },
        "DemandSet": {
          "results": [
            {
              "ProjectID": "API3",
              "WorkPackageID": "API3.1.1",
              "ResType": "your resource type",
              "ResourceId": "your resource type ID",
              "Effort": "10",
              "Uom": "H",
              "Workitem": "your work item",
              "WorkPackageName": "WorkPackageAPI3",
              "DelvryServOrg": "your delivery service org"
            }
          ]
        }
      }
    ]
  }
}

First we need to get the x-csrf-token; see part 1 of this blog series.

Once we have the token, in Postman:
  • select the POST method
  • set the URL; it depends on the namespace and integration flow name, in our case:
    • https://<CPI>/gw/odata/SAP/CREATEPROJECT;v=1/ProjectSet
    • replace <CPI> with your CPI tenant
  • include our communication user for basic authentication
  • include the x-csrf-token
  • include the payload in the message body
If successful, a response message is displayed:

S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

In S/4HANA Cloud a new project is created, including a work package, a work item, and an assigned role:

S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

In the CPI monitor you will see a success message:

S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

Under the tab “Attachments” you can view your request payload:

S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

And your response payload:

S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

Integration Flow HTTP sender adapter (READ)


If you want to use the HTTP sender adapter in CPI to read projects, that is also possible. For the read operation, simply create an Integration Flow with an OData receiver adapter, where you include the EDMX file, choose the GET method, and select the relevant query options for your resource path. Essentially, you define here which service node you want to address (in our case ProjectSet, since we want to read projects) and which fields should be returned when viewing projects:

S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

As the sender adapter, choose HTTPS and define an address. In our case the URL is built as follows:

https://<CPI-tenant>/http/readprojects

S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

If the projects in S/4HANA Cloud were read successfully, you will receive a response message in XML format:

<ProjectSet>
    <Project>
        <Customer>IC1020</Customer>
        <ProfitCenter>YB102</ProfitCenter>
        <ProjectName>ProjectAPI</ProjectName>
        <CostCenter>0010201903</CostCenter>
        <OrgID>1020</OrgID>
        <CostCenterName>"Cost center name"</CostCenterName>
        <ProjectStage>P001</ProjectStage>
        <ProjManagerId>50000701</ProjManagerId>
        <ProfitCenterName>"Profit center name"</ProfitCenterName>
        <Currency>EUR</Currency>
        <CustomerName>"Customer name"</CustomerName>
        <ProjectID>API</ProjectID>
    </Project>
</ProjectSet>

You can also see a success message in the CPI monitor:

S/4HANA Cloud Integration, SAP HANA Certifications, SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning

SAP HANA 2.0 SPS 03: New Developer Features; Database Development

In this blog, I would like to introduce you to the new features for the database developer in SAP HANA 2.0 SPS 03. We will focus on database development topics, including Core Data Services as well as SQLScript.

Core Data Services (CDS)


Extended Table & Multi-Store Table Support
Given that SAP HANA Dynamic Tiering is correctly installed and running, and that extended storage is configured and available, CDS now allows you to partition column entities across both default storage and extended storage, or to place them fully in extended storage.

SQLScript


Exit Handler for UPDATE NOWAIT

Exception handling in SQLScript is designed to allow you to catch general SQL error codes, or even specific ones, but not all SQL error codes are catchable. For example, if you tried to catch SQL error code 146, which signals an UPDATE NOWAIT failure, you would get an error message saying that the feature is not supported. As of SAP HANA 2.0 SPS03, we have added support for defining an exit handler for this specific error code.
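A minimal sketch of such an exit handler might look as follows; the table and column names (locked_table, id, val) are illustrative and not from the original example:

```sql
CREATE OR REPLACE PROCEDURE try_update AS
BEGIN
    -- SQL error code 146: resource busy, lock could not be acquired with NOWAIT
    DECLARE EXIT HANDLER FOR SQL_ERROR_CODE 146
        SELECT ::SQL_ERROR_CODE AS error_code,
               ::SQL_ERROR_MESSAGE AS error_message
          FROM DUMMY;

    -- try to acquire the row lock without waiting; raises 146 if the row is locked
    SELECT id FROM locked_table WHERE id = 1 FOR UPDATE NOWAIT;
    UPDATE locked_table SET val = val + 1 WHERE id = 1;
END;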

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

BETWEEN Operator & Boolean Type


We continue to add language elements to SQLScript that simplify the programming model and improve the readability and maintainability of the code. For example, you might need to determine whether a value lies within a certain range. You could of course write this as IF VALUE > 0 AND VALUE < 100, but to simplify the code we now offer the BETWEEN operator to check the value. The following example also shows the use of the BOOLEAN type for the output parameter, which we can set to true or false; this is also a new feature as of SAP HANA 2.0 SPS03.
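A sketch of what such a function could look like; all names are illustrative:

```sql
CREATE OR REPLACE FUNCTION check_range (IN im_value INT)
RETURNS ev_in_range BOOLEAN AS
BEGIN
    -- BETWEEN is inclusive, i.e. equivalent to im_value >= 0 AND im_value <= 100
    IF :im_value BETWEEN 0 AND 100 THEN
        ev_in_range := TRUE;
    ELSE
        ev_in_range := FALSE;
    END IF;
END;
```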

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

Variable Declaration using LIKE


Another new language feature is the ability to declare variables using the LIKE keyword. For example, if we want an intermediate table variable exactly like a persistent table, or like another intermediate table variable, we can use the LIKE keyword instead of having to define the columns of the variable.
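For example, in an anonymous block (assuming the "MD.Products" table referenced elsewhere in this blog exists):

```sql
DO BEGIN
    -- lt_products inherits the exact column structure of the persistent table
    DECLARE lt_products TABLE LIKE "MD.Products";
    lt_products = SELECT * FROM "MD.Products";
    SELECT COUNT(*) AS row_count FROM :lt_products;
END;
```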

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

Support Table Variables with Dynamic SQL


Previously we introduced the ability to exchange scalar values with dynamic SQL, meaning that we could do a SELECT * INTO a variable inside of EXEC or EXECUTE IMMEDIATE statements, and even pass parameters into them. We have now extended this capability to table variables, so you can extract the result sets of SELECT statements into table variables. In this example, I pass in the table name as well as a multiplier, which is then used to construct the SELECT statement within dynamic SQL. The multiplier is passed in with the USING keyword, and the INTO keyword puts the results of the SELECT into the intermediate table variable LT_RESULTS.
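A sketch of such a procedure; the column names (productid, price) are illustrative, not from the original example:

```sql
CREATE OR REPLACE PROCEDURE get_scaled_amounts (IN iv_table NVARCHAR(256),
                                                IN iv_multiplier INT)
AS
BEGIN
    DECLARE lt_results TABLE (productid NVARCHAR(10), amount DECIMAL(15, 2));
    -- INTO binds the dynamic result set to the table variable,
    -- USING supplies the value for the ? parameter marker
    EXECUTE IMMEDIATE 'SELECT productid, price * ? AS amount FROM ' || :iv_table
        INTO lt_results USING :iv_multiplier;
    SELECT * FROM :lt_results;
END;
```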

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

SELECT INTO with Default Values


In some cases, we might expect a SELECT INTO statement to fail and not return a scalar value. We can now ensure that in those cases the scalar variables still contain a value, by using the DEFAULT keyword.
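A short sketch, again assuming the hypothetical "MD.Products" table and a category column:

```sql
DO BEGIN
    DECLARE lv_category NVARCHAR(20);
    -- if no row matches, lv_category receives 'NONE' instead of an error being raised
    SELECT category INTO lv_category DEFAULT 'NONE'
      FROM "MD.Products" WHERE productid = 'NO_SUCH_ID';
    SELECT :lv_category AS category FROM DUMMY;
END;
```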

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

SEARCH Operator for Table Variables


In a previous support package, we introduced intermediate table operators such as INSERT, UPDATE, and DELETE. In SAP HANA 2.0 SPS03, we have added a new SEARCH operator that enables efficient searching by key-value pairs in table variables. In this example, an IF statement searches for a particular PRODUCTID in the LT_PRODUCTS intermediate table variable.
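Such a lookup might look roughly like the sketch below; the table content and product ID are made up, and the exact call shape of SEARCH is documented in the SQLScript reference:

```sql
DO BEGIN
    DECLARE lv_pos INT;
    DECLARE lt_products TABLE (productid NVARCHAR(10), category NVARCHAR(20));
    lt_products = SELECT productid, category FROM "MD.Products";
    -- SEARCH returns the position of the first matching row, or NULL if none matches
    lv_pos = :lt_products.SEARCH((productid), ('HT-1000'));
    IF :lv_pos IS NOT NULL THEN
        SELECT 'found at position ' || :lv_pos AS result FROM DUMMY;
    END IF;
END;
```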

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

ROW Type


We now support ROW type variables. This gives us the ability to have a collection of scalar variables with different types, much like a structure in ABAP, and is useful for storing a single row of table data. For example, in the code below, LS_EMPLOYEES is a ROW type variable, and I do a SELECT * INTO LS_EMPLOYEES FROM "MD.Employees" WHERE EMPLOYEEID equals the input parameter. LS_EMPLOYEES then holds all the data of that row, and I can reference each field of that row individually, for example LS_EMPLOYEES.SALARYAMOUNT.
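A sketch of the pattern described above; the table name "MD.Employees" and its columns are assumptions inferred from the field names in the text:

```sql
CREATE OR REPLACE PROCEDURE get_salary (IN iv_employeeid NVARCHAR(10))
AS
BEGIN
    -- one structured variable holding a single row of the table
    DECLARE ls_employees ROW LIKE "MD.Employees";
    SELECT * INTO ls_employees FROM "MD.Employees"
      WHERE employeeid = :iv_employeeid;
    -- individual fields are addressed with dot notation
    SELECT :ls_employees.salaryamount AS salary FROM DUMMY;
END;
```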

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

Updatable Cursor


In some cases, we need to iterate over a cursor and update the table it currently points to. We use FOR UPDATE when defining the cursor to acquire a lock on the affected rows, and then use WHERE CURRENT OF <name_of_cursor> in the UPDATE statement. Keep in mind that several restrictions apply; for example, you currently cannot use this for tables that contain associations, or for partitioned tables.
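A minimal sketch of an updatable cursor; the table and its price column are illustrative:

```sql
DO BEGIN
    -- FOR UPDATE locks the rows the cursor will iterate over
    DECLARE CURSOR c_products FOR
        SELECT * FROM "MD.Products" FOR UPDATE;
    FOR r AS c_products DO
        -- update exactly the row the cursor currently points to
        UPDATE "MD.Products" SET price = r.price * 1.1
          WHERE CURRENT OF c_products;
    END FOR;
END;
```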

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

CREATE OR REPLACE Enhancements


In SAP HANA 2.0 SPS02, we introduced the ability to do CREATE OR REPLACE for procedures and functions, but there was a limitation: you could not change the signature. In SAP HANA 2.0 SPS03, we now allow you to change the signature of the procedure or function when using the CREATE OR REPLACE statement.
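For example (a hypothetical procedure whose signature changes on replace):

```sql
CREATE OR REPLACE PROCEDURE greet (OUT ev_msg NVARCHAR(50)) AS
BEGIN
    ev_msg := 'Hello';
END;

-- replacing with a different signature: this failed before SPS03
CREATE OR REPLACE PROCEDURE greet (IN iv_name NVARCHAR(30),
                                   OUT ev_msg NVARCHAR(50)) AS
BEGIN
    ev_msg := 'Hello ' || :iv_name;
END;
```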

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

MAP_REDUCE Operator


MapReduce is a programming model that allows easy development of scalable parallel applications for processing large amounts of data. In SAP HANA 2.0 SPS00, we introduced the MAP_MERGE operator, which is a specialized version of the MAP_REDUCE operator, a sort of reducer-less MapReduce. In SAP HANA 2.0 SPS03, we now support the MAP_REDUCE operator itself. I will try to explain this using a simple use case. Say you have a table containing IDs and a column of string data, where each string consists of letters separated by commas. The requirement is to count the number of strings that contain each character, and also the total number of occurrences of each character in the table. Here the “Mapper” function processes each row of the input table and returns a table for each row processed. When all rows of the input table have been processed by the “Mapper”, the output tables are aggregated into a single table (this is exactly what the MAP_MERGE operator does).

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

Now the rows in the aggregated table are grouped by key columns, which are defined in our code. For each group, we separate the key values from the table; we call the group table without key columns the “value table”. The “Reducer” function then processes each group (each value table) and returns one or more tables containing the STATEMENT_FREQ and the TOTAL_FREQ. When all groups have been processed by the “Reducer” function, its output tables are aggregated into a single table.

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

If we now look at the code a little closer: we have already seen the “Mapper” and “Reducer” functions, so let's look at how we use them in conjunction with the MAP_REDUCE operator. The example below shows how we would do the calculation described earlier without the MAP_REDUCE operator, using cursors and UNION ALL with the “Mapper” and “Reducer” functions. While this works just fine, it looks a little messy, and it may not perform well due to the use of the cursors.

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

Below is the same logic using the MAP_REDUCE operator instead. You will notice it is a lot cleaner, and it also hides some of the complexity of managing the groupings and aggregations. The MAP_REDUCE operator takes three inputs (the input table plus the “Mapper” and “Reducer” functions) and returns the final result table.
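The general call shape is roughly the following sketch; all names (lt_input, its ID and STR columns, the key column C, and the mapper/reducer function names) are illustrative, and the exact grammar is in the SQLScript reference:

```sql
-- "mapper" and "reducer" are the table functions described above
lt_result = MAP_REDUCE(:lt_input,
                       mapper(:lt_input.ID, :lt_input.STR) GROUP BY C AS grp,
                       reducer(grp.C, grp));
```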

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

SQLScript Code Analyzer Rules


In SAP HANA 2.0 SPS02, we introduced the SQLScript Code Analyzer, which initially had four rules. We have added five new rules to the SQLScript Code Analyzer in SAP HANA 2.0 SPS03.

◈ USE_OF_UNASSIGNED_SCALAR_VARIABLE – detects variables which are used but were never explicitly assigned. Such variables still have their default value when used, which might be undefined. It is recommended to assign a default value (which can be NULL) to be sure that you get the intended value when you read from the variable.

◈ USE_OF_CE_FUNCTIONS – checks whether Calculation Engine Plan Operators (CE functions) are used. Since they make optimization more difficult and lead to performance penalties, they should be avoided.

◈ DML_STATEMENTS_IN_LOOPS – detects the following DML statements inside of loops: INSERT, UPDATE, DELETE, REPLACE/UPSERT. Sometimes it is possible to rewrite the loop and use a single DML statement to improve performance instead.

◈ USE_OF_SELECT_IN_SCALAR_UDF – detects whether SELECT is used within a scalar UDF, which can lower performance. If table operations are really needed, procedures or table UDFs should be used instead.

◈ COMMIT_OR_ROLLBACK_IN_DYNAMIC_SQL – detects dynamic SQL that uses the COMMIT or ROLLBACK statement. Since COMMIT and ROLLBACK can be used directly in SQLScript without the need for dynamic SQL, it is recommended to use them directly.

SQLScript String Built-In Library


In SAP HANA 2.0 SPS02, we introduced built-in libraries and delivered one called SQLSCRIPT_SYNC, which contained functions for putting a process to sleep and for waking up connections. In SAP HANA 2.0 SPS03, we have added a new library called SQLSCRIPT_STRING, which contains several functions for string manipulation, including functions for splitting strings, splitting strings using regular expressions, and splitting into arrays and tables. There are also several formatting functions which use Python-style format strings. In this example, I take an intermediate table variable and use the FORMAT_TO_TABLE function to transform the data from a multi-column table into a single-column table containing the same data as comma-delimited strings. After that, I use the SPLIT function to break it apart again.
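A small sketch of consuming the library in an anonymous block; the input string is made up, and the exact function names and signatures are in the SQLSCRIPT_STRING reference:

```sql
DO BEGIN
    USING SQLSCRIPT_STRING AS STRLIB;
    DECLARE parts NVARCHAR(100) ARRAY;
    -- split a comma-delimited string into an array
    parts = STRLIB:SPLIT_TO_ARRAY('HT-1000,HT-1001,HT-1002', ',');
    SELECT :parts[1] AS first_part FROM DUMMY;
END;
```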

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

SQLScript Print Built-In Library


We also introduced the SQLSCRIPT_PRINT library, which we can use to write out results. This is currently only supported in the hdbsql interface. PRINT_LINE prints a string as a line to the hdbsql interface, and PRINT_TABLE renders a table variable as a single string and writes it to the hdbsql interface. We do have plans to make this functionality available in the Database Explorer in the future as well.
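A minimal sketch of both functions in an anonymous block (the demo table is made up):

```sql
DO BEGIN
    USING SQLSCRIPT_PRINT AS PRTLIB;
    DECLARE lt_demo TABLE (id INT, txt NVARCHAR(10));
    lt_demo = SELECT 1 AS id, 'one' AS txt FROM DUMMY;
    PRTLIB:PRINT_LINE('Result table:');  -- writes a string line to hdbsql
    PRTLIB:PRINT_TABLE(:lt_demo);        -- writes the table variable contents
END;
```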

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

SQLScript User-Defined Libraries


In SAP HANA 2.0 SPS03, we now allow you to create your own libraries as well, which we call User-Defined Libraries, or UDLs. Libraries are sets of related variables, procedures, and functions written in SQLScript; the HDI artifact for libraries is .hdblibrary. There are two access modes, public and private, so you can flag each library member individually according to how you would like it to be exposed. If you do not want a procedure to be callable from outside the library, you would use the PRIVATE keyword.

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

Libraries are only consumable from within SQLScript and cannot be used directly in any SQL statement or CALL statement. Currently, you must use them in a procedure or an anonymous block. We use the USING keyword after the BEGIN statement to declare the use of the library, as shown in the example below. Here I declare MasterData as the library I am using and assign it the alias MData. We can then reference the library's variables, functions, and procedures within our SQLScript code using the Alias:<name of procedure/function/variable> notation.
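The steps above can be sketched roughly as follows; the library, its members, and the consuming block are all hypothetical, and the exact member syntax is in the CREATE LIBRARY reference:

```sql
-- MasterData.hdblibrary (illustrative library definition)
CREATE LIBRARY MasterData LANGUAGE SQLSCRIPT AS
BEGIN
    PUBLIC VARIABLE gv_threshold INT DEFAULT 100;
    PUBLIC FUNCTION double_it (IN iv_x INT) RETURNS rv_y INT AS
    BEGIN
        rv_y := :iv_x * 2;
    END;
    PRIVATE PROCEDURE internal_helper AS  -- not callable from outside the library
    BEGIN
        SELECT 1 FROM DUMMY;
    END;
END;

-- consumption inside an anonymous block
DO BEGIN
    USING MasterData AS MData;
    SELECT MData:double_it(21) AS result FROM DUMMY;
END;
```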

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

SQLScript Plan Profiler


The SQLScript Plan Profiler is a new performance analysis tool, designed mainly from the perspective of stored procedures and functions. When the SQLScript Plan Profiler is enabled, results such as start and end time, CPU and wait time, etc., are generated per CALL statement in tabular format. The profiling results can be found in the new monitoring view M_SQLSCRIPT_PLAN_PROFILER_RESULTS. We use the ALTER SYSTEM statement to work with the profiler; the syntax is shown below.

ALTER SYSTEM <command> SQLSCRIPT PLAN PROFILER [<filter>]
 <command> := START | STOP | CLEAR
 <filter> := FOR SESSION <session_id> | FOR PROCEDURE <procedure_name>
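Applying the syntax above, a typical session might look like this (my_schema.my_proc is a hypothetical procedure):

```sql
-- start profiling for one procedure, run it, then inspect the results
ALTER SYSTEM START SQLSCRIPT PLAN PROFILER FOR PROCEDURE my_schema.my_proc;
CALL my_schema.my_proc;
SELECT * FROM M_SQLSCRIPT_PLAN_PROFILER_RESULTS;
ALTER SYSTEM STOP SQLSCRIPT PLAN PROFILER FOR PROCEDURE my_schema.my_proc;
```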

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

Procedure Result Cache


The procedure result cache is a server-wide in-memory cache that caches the output arguments of procedure calls, using the input arguments as keys. Previously, we introduced this same concept for scalar functions: if a function was deterministic, meaning that the results would be the same for the same input parameters on every call, its results would be cached. We now introduce the same for procedures as well, which we can likewise flag as DETERMINISTIC.
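A sketch of a procedure flagged this way; the rates table is hypothetical, and the exact keyword placement is in the CREATE PROCEDURE reference:

```sql
CREATE OR REPLACE PROCEDURE get_rate (IN iv_currency NVARCHAR(3),
                                      OUT ev_rate DECIMAL(10, 4))
READS SQL DATA DETERMINISTIC AS  -- same inputs always produce the same outputs
BEGIN
    SELECT rate INTO ev_rate DEFAULT 0
      FROM rates WHERE currency = :iv_currency;
END;
```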

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

SQLScript Code Analyzer in Database Explorer


Again, in SAP HANA 2.0 SPS02 we introduced the SQLScript Code Analyzer, but the only way to use it was to call its associated procedures via the SQL Console. Now in SAP HANA 2.0 SPS03, we have integrated the SQLScript Code Analyzer into the Database Explorer view in the SAP Web IDE for SAP HANA. You can right-click a procedure in the Database Explorer and choose “Analyze SQLScript code”, and the results will be displayed in a tab on the right.

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

SQLScript Debugger Enhancements


Finally, you might notice a different look and feel to the SQLScript Debugger in SAP HANA 2.0 SPS03. We have integrated it with the other debuggers in the SAP Web IDE for SAP HANA, and we have also added the ability to step into and out of procedure calls and scalar function assignment statements.

SAP HANA 2.0, SAP HANA 2.0 SPS 03, SAP HANA Certifications, SAP HANA Learning

Analysis of Memory Management & Host/Resource Utilization Statistics in BW ON HANA and above

SAP users often face performance issues that impact their daily tasks and business. One common cause of such issues is inefficient memory management or improper load distribution, which leads to high host/resource utilization. I am going to write a series of blogs on this topic and share them with you.

In today's blog, we will cover the data-flow creation and its analysis in BW for both. Let's start with the creation of a source system on the HANA DB.

How to use HANA Database as SOURCE SYSTEM in BW:-


1) Go to transaction RSA1 > Source Systems
2) Under Source Systems, right-click > Create

3) Choose the connection type “One logical system per DB Schema” and choose “SYS” (or another schema, as required) as DB Owner/Schema. Note that you can also consume your own custom views created under your user ID this way; just choose your ID as DB Owner.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

4) Similarly, create another source system for _SYS_STATISTICS (another SAP standard DB schema).

The final output will look something like this:

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

Now we need to create DataSources for these source systems as required. For example, we will consume the tables below via custom DataSources.

Tables under SYS schema:

M_CS_TABLES

M_TABLE_PERSISTENCE_STATISTICS

TABLE_GROUPS

Tables under _SYS_STATISTICS schema:

HOST_RESOURCE_UTILIZATION_STATISTICS

Create a DataSource for each schema with the respective properties shown in the screenshots:

Datasource 1(DS_1) : Host Resource Utilization Statistics

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

The sample data fetched by this DataSource will look something like this:

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

The selected fields have the following meanings:

◈ SERVER_TIMESTAMP – Timestamp (local server time) identifying when a data collection actually occurred on the server
◈ ALLOCATION_LIMIT – Allocation limit for all processes
◈ HOST – Host name/Server Name
◈ INSTANCE_TOTAL_MEMORY_USED_SIZE – Amount of memory from the memory pool that is currently in actual use by SAP HANA processes

There are other fields in this view as well, which can be made visible by selecting them under the Proposal tab, for example FREE_PHYSICAL_MEMORY, USED_PHYSICAL_MEMORY, TOTAL_CPU_SYSTEM_TIME_DELTA, etc.

Datasource 2 (DS_2): M_CS_TABLES – provides runtime data for column tables.

(There is also M_RS_TABLES for information on row tables: detailed table sizes and record counts.)

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

The meanings of a few fields:

PART_ID: Partition ID. 0 for non-partitioned tables, 1 through the number of partitions for partitioned tables; -1 if the whole table is unloaded.

MEMORY_SIZE_IN_TOTAL: Total memory size, i.e. the sum of the memory sizes of the main, delta, and history parts.

LOADED: Flag showing how many columns of the table are loaded in memory (NO, PARTIALLY, FULL); see M_CS_COLUMNS for details per column.

READ_COUNT: Number of read accesses on the table or partition. Note: this is not the number of SELECT statements against this table; a single SELECT may involve several read accesses.

WRITE_COUNT: Number of write accesses on the table or partition. Note: this is not the number of DML and DDL statements against this table; a single DML or DDL statement may involve several write accesses.

MERGE_COUNT: Number of delta merges performed on the table or partition.

Data preview:

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

Datasource 3 (DS_3) :  M_TABLE_PERSISTENCE_STATISTICS

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

Here DISK_SIZE is the total disk size of all table parts.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

Datasource 4 (DS_4):  TABLE_GROUPS

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

Data here is self-explanatory.

For more details on the views used and other views of interest, please refer to the SAP document “SAP HANA SQL and System Views Reference”.

To save memory in the PSA, you can use the DTP properties below to extract data from these DB views; extraction then happens directly from the source system, here the HANA DB schema.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

After creating these DataSources, we can consume them in ADSOs and a Composite Provider and create a BW query on top for analytics on data volume, memory management, and host utilization.

A sample dataflow with lookups is explained below:

Composite Provider Scenario:-

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

The ADSOs above are joined/unioned in the Composite Provider in such a way as to fulfill the lookup criteria (the transformation logic for this is explained in this blog). We can also make row-store tables part of this Composite Provider.

Data flow:

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

◈ To store memory-related key figures more efficiently, convert them to GB or TB while storing them in the ADSO.

For example: Disk Size (in GB) = DISK_SIZE / (1024^3)

◈ Table Type Info – transformation logic, as it will be used as master data for determining the table type:

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

◈ START ROUTINE

TYPES:
  BEGIN OF TY_TABLE_NAME,
    SCHEMA_NAME TYPE C LENGTH 256,
    TABLE_NAME  TYPE C LENGTH 256,
  END OF TY_TABLE_NAME.

DATA: L_TABLE_NAME TYPE STANDARD TABLE OF TY_TABLE_NAME.

FIELD-SYMBOLS:
  <L_TABLE_NAME> TYPE TY_TABLE_NAME.

* Read all active transparent/pool/cluster tables; TABNAME lands in the
* first structure component (SCHEMA_NAME) and is moved below.
SELECT TABNAME FROM DD02L
  INTO TABLE L_TABLE_NAME
  WHERE AS4LOCAL EQ 'A' AND TABCLASS IN ('TRANSP','POOL','CLUSTER').

LOOP AT L_TABLE_NAME ASSIGNING <L_TABLE_NAME>.
  <L_TABLE_NAME>-TABLE_NAME  = <L_TABLE_NAME>-SCHEMA_NAME.
  <L_TABLE_NAME>-SCHEMA_NAME = 'SAPBIW'.
ENDLOOP.

* Keep only tables not already contained in the source package.
LOOP AT SOURCE_PACKAGE ASSIGNING <SOURCE_FIELDS>.
  DELETE L_TABLE_NAME WHERE TABLE_NAME EQ <SOURCE_FIELDS>-TABLE_NAME.
ENDLOOP.

APPEND LINES OF L_TABLE_NAME TO SOURCE_PACKAGE.

CLEAR: L_TABLE_NAME.

◈ END ROUTINE

DATA: L_TDESC TYPE STRING.

LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.

* Read the table description text.
  SELECT DDTEXT FROM DD02T INTO L_TDESC
    WHERE TABNAME = <RESULT_FIELDS>-/B37/S_G_TABNAME.
  ENDSELECT.

* Purely numeric descriptions are prefixed to make them readable.
  IF L_TDESC CO '0123456789'.
    IF <RESULT_FIELDS>-/B37/S_G_GNAME IS INITIAL.
      CONCATENATE 'Info' L_TDESC INTO L_TDESC SEPARATED BY SPACE.
    ELSE.
      CONCATENATE <RESULT_FIELDS>-/B37/S_G_GNAME L_TDESC INTO L_TDESC
        SEPARATED BY SPACE.
    ENDIF.
  ENDIF.

  <RESULT_FIELDS>-/B37/S_G_TDESC = L_TDESC.
  TRANSLATE <RESULT_FIELDS>-/B37/S_G_TDESC TO UPPER CASE.

  CLEAR: L_TDESC.

* Derive the table type from the type code.
  IF ( <RESULT_FIELDS>-/B37/S_G_STYPE = 'ACTIVE' OR
       <RESULT_FIELDS>-/B37/S_G_STYPE = 'QUEUE' ).
    <RESULT_FIELDS>-/B37/S_G_STYPE = 'DSO'.
  ELSEIF ( <RESULT_FIELDS>-/B37/S_G_STYPE = 'FACT_IMO' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'FACT_F'   OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'FACT_E'   OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'DIM' ).
    <RESULT_FIELDS>-/B37/S_G_STYPE = 'CUBE'.
  ELSEIF ( <RESULT_FIELDS>-/B37/S_G_STYPE = 'Y' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'T' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'Q' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'P' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'X' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'H' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'J' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'I' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'K' OR
           <RESULT_FIELDS>-/B37/S_G_STYPE = 'SID' ).
    <RESULT_FIELDS>-/B37/S_G_STYPE = 'MASTER_DATA'.
  ENDIF.

ENDLOOP.

Note:

1) The Host Resource Utilization data has been loaded into an ADSO and is then consumed directly by the query.

2) IMPORTANT: For transports, please make sure that you maintain the logical system mapping for the newly created source systems. One thing I struggled with for days was that this mapping was not maintained in the target system in the table below, even though the source system existed in RSA1 -> Source System.

It can be maintained via either of the following two options:

◈ Use view maintenance V_RSLOGSYSMAP in transaction SM30.
◈ Call transaction RSA1 -> Transport Connection -> select the CONVERSION button.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

Query 1 Based on Host Resource Utilization Stats Flow:

Query showing maximum memory utilization on given day

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

Query 2: Based on datasource DS_2, DS_3, DS_4 and its flow:

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA Learning

Query consumption in SAP Lumira or BO Dashboards is left to your imagination.

With the analysis of these views and queries, an SAP HANA architect or user can derive the following potential benefits, among others:

◈ Find tables that are unevenly distributed across servers and may require redistribution.
◈ Identify which tables occupy memory and can be unloaded if they are not used for reporting or other system-related tasks (for example, DSOs used for intermediate transformation of data can be unloaded to free memory).
◈ Find tables which require a data-cleanup check from the Basis or BW side.
◈ Identify which server is heavily utilized compared to the others, so that table/load distribution can be adjusted accordingly.
◈ See how much a server is utilized at its peak (for example, if a server peaks at 30-35% utilization, memory and processors can be freed up there, which in turn leads to cost savings for the business).

Angular 2 on HANA XSA

From the day I heard SAP HANA would support Node.js, I was curious to experiment with what we can develop on HANA using the new platform. A few days ago I was able to run Angular 2 apps on HANA XSA and thought of sharing the steps. So let's start with the prerequisites.

Prerequisites:


◈ An internet connection (via proxy or direct) for downloading Angular packages
◈ A Git server connection
◈ And an XSA-enabled SAP HANA server (obviously :P)

Now that we have everything, let's start with setting the upstream link in the local npm registry.

Step 1:

Log in to the XS CLI and type the command:

“xs set-env di-local-npm-registry UPSTREAM_LINK http://registry.npmjs.org/”

Restage and restart the app.
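Assuming the standard XS CLI commands, that would be:

```
xs restage di-local-npm-registry
xs restart di-local-npm-registry
```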

This will let you add packages which are not present in local npm registry to your xsa app.

Note: If you don't have a proxy enabled for the whole HANA server, you can set a proxy just for di-local-npm-registry using the commands:

“xs set-env di-local-npm-registry HTTP_PROXY http://proxy.abc.com:8080”

“xs set-env di-local-npm-registry NO_PROXY HOST, localhost, 127.0.0.0”

Step 2:

Now we will create a new project (or just a new node module if you already have a project) using the Web IDE and push it to our Git server.

Right-click on the workspace, select New Project and then Project from Template. Give the project a name and click Finish.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Now create a new node.js module.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Now we will link our project to Git so we can add the Angular part. Right-click the project folder, go to Git and then Initialize Local Git Repository. After that, set the remote for your local repository.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Do one initial commit and push your code to remote repo.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Step 3:

Clone your repository to the local desktop or server where you have node installed with internet access.


SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Now we will install angular-cli (you will require Node version 6.9.0 or higher and npm version 3.x.x).

Run the command “npm install -g @angular/cli”

Rename your existing package.json to package_old.json and .gitignore to .oldgitignore (angular-cli creates its own package.json/.gitignore and fails if the files are already present).

Go inside the project folder (not inside the node module folder) and run the command

“ng new <your node module name>”

This will create a new Angular project inside your node module folder (if you give some other name, it will create a new folder with that name).

Your node module folder will now look like this:

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

angular-cli includes lots of modules; if you don't want some of them, you can remove them from package.json (like karma, e2e).

Now we will copy the contents of package_old.json into the new package.json. Note that we have to rename some of the scripts. Also copy over the contents of .oldgitignore.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Now use `ng build` to build the app. The build artifacts will be stored in the `dist/` directory. Use the `--prod` flag for a production build.

Use `ng serve` to run the Angular app locally (use `--open` to open it in your default browser).

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Now that our Angular part is all done, commit the changes and push them to the remote repository.

But before pushing, comment out the /dist and /out-tsc paths below #Compiled in .gitignore, as we will be using the compiled files inside these folders.
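For illustration, the relevant section of the generated .gitignore would then look roughly like this (a sketch based on a typical angular-cli .gitignore; yours may differ):

```
# compiled output
# /dist
# /out-tsc
/tmp
```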

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Step 4:

After configuring our Angular part, we are now back in the Web IDE.

Pull all the changes you made into the Web IDE project.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Now add the following code to the server.js file. We are setting up an Express app to serve our dist folder and routing all other requests to index.html inside dist. Remember to add express to package.json.

/*eslint no-console: 0, no-unused-lets: 0, no-undef:0, no-unused-vars: 0*/
/*eslint-env node, es6 */
"use strict";

//var https = require("https");
//var xsenv = require("@sap/xsenv");
var port = process.env.PORT || 3000;
var server = require("http").createServer();
//https.globalAgent.options.ca = xsenv.loadCertificates();
global.__base = __dirname + "/";

let express = require("express");
let app = express();

app.disable('x-powered-by');

app.options("/*", function(req, res, next){
  res.header('Access-Control-Allow-Origin', '*');
  res.header('Access-Control-Allow-Methods', 'GET,PUT,POST,DELETE,OPTIONS');
  res.send(200);
});

// Point static path to dist
app.use(express.static(global.__base + 'dist'));

// Catch all other routes and return the index file
app.all('*', (req, res) => {
  res.sendFile(global.__base + 'dist/index.html');
});

//Start the Server 
server.on("request", app);
server.listen(port, function() {
console.info(`HTTP Server: ${server.address().port}`);
});

Now we will build our app, but before the build we need to scale up some of the SAP applications (di-core, di-runner and di-builder). This part can be skipped, but as the number of packages used in an Angular app is high, there is a good chance that the di-core app will crash with an out-of-memory error. You can also scale up the Web IDE if its response time degrades.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Go to the XS Admin Cockpit and select Application Monitor. Select the above-mentioned applications and scale them as shown below.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Once scaling is done you are set to build your application.

Right-click the module and select Build. It might take some time, as it will download all the modules required for Angular.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

After the build we need to create a run configuration.

Right-click the module folder and select Run -> Run Configurations.

Create a new node application run configuration, give it a name, select “start with package.json script” and choose the start-node script.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

Click Save and Run. It will take some time depending on your server configuration.

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials

SAP HANA XSA, SAP HANA Certifications, SAP HANA Learning, SAP HANA Guides, SAP HANA Tutorials and Materials
Finally, our Angular 2 app is up and running on HANA XSA.

How To Integrate Hana Database with Excel


Connecting to SAP HANA with Microsoft Excel


Microsoft Excel is the most widely used BI tool across the globe, and with Excel we can explore SAP HANA data. SAP HANA supports the query languages SQL and MDX, where JDBC and ODBC are used for SQL-based access and ODBO is used for MDX-based access. MDX (Multi-Dimensional Expressions) is a query language for OLAP databases. It has been defined as part of the ODBO (OLE DB for OLAP) specification from Microsoft.

The SAP HANA MDX provider comes with the SAP HANA client, installed according to your operating system version.

One unique benefit of SAP HANA MDX is the native support of hierarchies defined for Attribute Views. There are two types of hierarchies in SAP HANA, level-based and parent-child hierarchies, and both types are accessible via MDX.
SAP HANA MDX is able to consume models defined in HANA Studio. This design-time environment allows you to define logical models on top of physical tables. The existing physical tables represent the data foundation for the logical model.

Please Note that all simple steps presented here are for Excel 2013 connecting to SAP HANA 1.0.

To create a connection, first specify the data source connection, then decide what you want to create with the connection, and finally use the data source to populate a table or chart report.

Click on blank workbook

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

It all begins with the Data Connection Wizard. To start it, select the Data tab:

Click Data tab -> From Other Sources -> From Data Connection Wizard.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

As shown below, select Other/Advanced and click Next.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Under Data Link Properties, select SAP HANA MDX Provider and click Next.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Provide the connection details such as hostname, instance number, user name, password and language.

Click Test Connection and then OK.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Once the test connection is successful, click ok and proceed further

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Here we see the different packages; these packages can also be seen in the SAP HANA database.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications


SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Now go back to the SAP HANA database.

Go to Content -> select any package -> expand that package and select one view.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Right-click on this analytic view and click on Data Preview.

Click on the Raw Data tab.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Now, access this package in excel

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Go back to excel.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

As shown below, open the Data Connection Wizard, select the package -> analytic view and click Next.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Click on finish.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Click on OK

Here we see values and attributes.

Double click on company code to see it.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Double-click on Customer; we can now see the customers along with the company codes.

In this way, select whichever fields you want to see, like Gross Revenue.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

Here we see gross revenue based on different company codes.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Studio, SAP HANA Certifications

SAP HANA 2.0 SPS 03 Installation – Preparation for HANA Migration using DMO

A lot of SAP customers have been migrating their existing SAP landscapes from traditional database platforms to SAP HANA. OS/DB migration using SAPinst (SWPM) has been available for a long time; however, migration to the SAP HANA database requires a certain minimum SAP release/SPS level as a prerequisite for the existing SAP systems in the customer landscape.

SAP recommends HANA migration using Database Migration Option (DMO) method so as to achieve an SAP upgrade and HANA database migration in one project downtime. DMO is the option introduced in SUM upgrade tool itself. I performed a test HANA migration using DMO for a customer recently. The target HANA database should already be installed during migration preparation.

I want to share how I installed the latest release HANA 2.0 SPS 03 revision 30 as the target database for DMO HANA migration. A few prerequisites should already be in place:

◈ Proper SAP sizing should be done for the upgrade and migration project
◈ The SAP HANA installation architecture should be discussed and finalized with the customer with respect to single-host, multi-host, scale-out and other aspects
◈ A supported OS platform should be ready; I tested the installation on SUSE Linux 12
◈ The HANA user sidadm should be created beforehand if required by company policies

The sequence of activities I performed:

◈ Download SAP HANA media from SAP support portal
◈ Extract and prepare HANA archives on the database server
◈ Install SAP HANA database system

Download SAP HANA 2.0 SPS 03 Media


Go to SAP support portal support.sap.com and click on quick link Download Software.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Go to Installations and Upgrades – By Alphabetical Index – H for SAP HANA

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Choose the category SAP IN-MEMORY SAP HANA

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Select SAP HANA Platform Edition

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Again, HANA platform edition

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Select HANA Platform Edition 2.0 – to install the latest release

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Click on Installation

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Select HANA Platform Edition 2.0 SPS03 rev30 according to the hardware (for example, I installed the database on Linux x86_64) and add it to the download basket.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Download the DVD using SAP download manager. Extract the RAR archive and upload the media 51052481 on Linux server.

Install HANA database using hdblcm


HDBLCM is the tool used to install or update HANA components. We can install HANA 2.0 with the hdblcmgui binary if we want to install the database in GUI mode; however, I preferred the hdblcm command-line option.

Log in to Linux as root. Go to 51052481 – DATA UNITS – HDB_LCM_LINUX (according to the hardware/OS platform, e.g. x86_64) and execute the HANA database lifecycle management tool using “./hdblcm”.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

The command will show the software components detected for version HANA 2.0 in the extracted HANA media.

Enter index option 1 – to install a new HANA system – and press Enter to go ahead.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

HANA database version 2.0 SPS 03 will be installed. Continue with the default option 3 (shown in brackets on the command line) to install the HANA client. Other components, or the option ‘all’, could also be selected depending on project requirements.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

I continued with the default options, pressing Enter without providing any input. A few notable inputs were, for example, the HANA system SID and instance number, according to the planned installation strategy.

The system usage can be development, production etc., and we can change it even after the installation.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Provide other necessary details according to installation plan along with the passwords.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Validate the details via summary before execution. Select Yes to continue the installation.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Installation should continue with a few phases. Preparing and installing packages.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Creating HANA instance.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Importing Delivery Unit.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Installing resident hdblcm and HANA Client.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Switch to the user sidadm (e.g. bhdadm here); we can verify the installed HANA services using the command ‘HDB info’ and the HANA version using ‘HDB version’.

SAP HANA 2.0, SAP HANA DMO, SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides

Exploring the SAP HANA XS Advanced Secure Store functionality with NodeJS


Introduction


Recently a question was raised here in the community asking how the secure store functionality can be used in an XS Advanced application without using the $.security.Store API available via the XSJS compatibility layer. Until HANA 2.0 SPS02 there was no official information available; although the functionality was already part of the system, it was not officially released, at least to my knowledge. With HANA 2.0 SPS03, the Application Security chapter in the SAP HANA Developer Guide for XS Advanced was enhanced by a new sub-chapter, Maintain Values in the SAP HANA Secure Store, which adds the information that the secure store can be used via new procedures:

◈ SYS.USER_SECURESTORE_INSERT to insert new entries
◈ SYS.USER_SECURESTORE_RETRIEVE to get a secure store entry
◈ SYS.USER_SECURESTORE_DELETE to delete a secure store entry


As the documentation at this point did not give me, let me say, enough information (e.g. the parameters and their meanings), and the examples are “just” for Java (also without further details), I experimented a little bit with the “new” procedures within an XS Advanced NodeJS module on an SAP HANA Express Edition installation (based on HANA 2.0 SPS03).

Finding the interface information for the Secure Store procedures


As we know from the documentation, we somehow have to call procedures to insert/retrieve/delete secure store values, so it would be good to know the interfaces of these procedures. In the current HANA 2.0 SPS03 documentation that information is missing, presumably because it is not really important (you recognize the irony). So where can we find it? Fortunately there is a system view PROCEDURES which shows at least the technical details. Let's switch to the Database Explorer, connect to the SystemDB and run the following query:

SAP HANA XS, SAP HANA Guides, SAP HANA Learning, SAP HANA Tutorials and Materials
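The query in the screenshot is not reproduced in the text; a query along the following lines, using the public SYS.PROCEDURES view, should return the secure store procedures together with their definitions:

```sql
SELECT SCHEMA_NAME,
       PROCEDURE_NAME,
       DEFINITION
  FROM SYS.PROCEDURES
 WHERE SCHEMA_NAME = 'SYS'
   AND PROCEDURE_NAME LIKE 'USER_SECURESTORE%';
```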

In the DEFINITION column of the results, the interface definition can be found.

SYS.USER_SECURESTORE_INSERT

Parameter | Direction | Type | Description
STORE_NAME | IN | NVARCHAR(530) | The store name of the secure store.
FOR_XS_APPLICATIONUSER | IN | BOOLEAN | Indicates whether the value is stored for the application user instead of the technical user.
KEY | IN | NVARCHAR(1024) | Key within the secure store.
VALUE | IN | VARBINARY(5000) | Value of the secure store entry in binary.

SYS.USER_SECURESTORE_RETRIEVE

Parameter | Direction
STORE_NAME | IN
FOR_XS_APPLICATIONUSER | IN
KEY | IN
VALUE | OUT

SYS.USER_SECURESTORE_DELETE

Parameter | Direction
STORE_NAME | IN
FOR_XS_APPLICATIONUSER | IN
KEY | IN
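Based on the interfaces above, a direct SQL test could look roughly like this (a sketch: the store name, key and value are made up, and how the OUT parameter is bound depends on your SQL client):

```sql
-- insert a value (the binary literal is just the string 'Hello')
CALL SYS.USER_SECURESTORE_INSERT('TEST_STORE', FALSE, 'TEST_KEY', x'48656C6C6F');

-- read it back; '?' stands for the VALUE OUT parameter
CALL SYS.USER_SECURESTORE_RETRIEVE('TEST_STORE', FALSE, 'TEST_KEY', ?);

-- and delete it again
CALL SYS.USER_SECURESTORE_DELETE('TEST_STORE', FALSE, 'TEST_KEY');
```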

Using the Secure Store Procedures within a NodeJS module


As we now know some more details about the procedures, we are going to use them within a NodeJS module of an XS Advanced Multi-Target Application. Please consider that the following code contains just quick-and-dirty examples, so code structuring/encapsulation/error handling/… were not in focus. In a real-world application you have to take care of that, but of course you know that.

Create a HANA service instance with “securestore” plan


To be able to use the secure store an instance of the “hana” (or “managed-hana”) service has to be created with service plan “securestore”.

With the new XS Advanced Cockpit this can be done in a very easy way by going to the “hana” service -> Instances -> New Instance. I created one with the name “securestore_test-hana”. This service instance will be bound later to our NodeJS application.

SAP HANA XS, SAP HANA Guides, SAP HANA Learning, SAP HANA Tutorials and Materials

If you want to create the service via the XS command-line client, you can do it as follows (make sure you are in the right org and space):

xs cs hana securestore securestore_test-hana

Create a NodeJS module


The next step is to create a Multi-Target Application within the SAP Web IDE for SAP HANA using the project-from-template functionality. Within the created MTA a new node module is added (I called it node_securestore_test). To this node module a simple Express application will be added, with routes to interact with the secure store.

SAP HANA XS, SAP HANA Guides, SAP HANA Learning, SAP HANA Tutorials and Materials

Prepare the development descriptor file (mta.yaml)


The resource “securestore_test-db”, pointing to the previously created hana securestore service, needs to be added to the development descriptor file. This resource is required by the NodeJS module.

For the test, an xsuaa service instance “securestore_test-uaa” was also created, added as a resource and listed as a required resource of the NodeJS module.

ID: xsa_securestore_test
_schema-version: '2.0'
version: 0.0.1

modules:
 - name: node_securestore_test
   type: nodejs
   path: node_securestore_test
   provides:
    - name: node_securestore_test_api
      properties:
         url: ${default-url}
   requires:
     - name: securestore_test-uaa
     - name: securestore_test-db
         
resources:
 - name: securestore_test-uaa
   type: com.sap.xs.uaa-space
   parameters:
     config_path: ./xs-security.json

 - name: securestore_test-db
   type: com.sap.xs.hana-securestore
   parameters:
     service-name: securestore_test-hana

Prepare the server.js file


Within the server.js file (created by the NodeJS module creation) an Express app is created, the hana securestore instance is added as middleware to the Express app, and an HTTP server is started. The routes for the secure store tests, defined in the router folder, are also added to the Express app (details are described in the next chapter).

"use strict";

// create express app
var app = require("express")();

// add secure store middleware to express app
var xsenv = require("@sap/xsenv");
var hdbext = require("@sap/hdbext");

var hanaOptions = xsenv.getServices({
secureStore: {
name: "securestore_test-hana"
}
});

app.use(
hdbext.middleware(hanaOptions.secureStore)
);

// create server instance
var server = require("http").createServer();

// setup routes of express app
var router = require("./router")(app, server);

// start server
var port = process.env.PORT || 3000;
server.on("request", app);
server.listen(port, function() {
console.info(`HTTP Server: ${server.address().port}`);
});

Prepare the express routes for inserting/retrieving/deleting a secure store entry


In this step the Express routes and the endpoint logic to interact with the secure store are described.

First a folder “router” is created. In this folder a file “index.js” is created with the following content, which defines the available routes. This file is used by the “setup routes” step within the “server.js” file (see above).

The following routes are made available by the router implementation:

◈ /createSecureStoreEntry to create a secure store entry
◈ /retrieveSecureStoreEntry to retrieve the created secure store entry
◈ /deleteSecureStoreEntry to delete the created secure store entry

"use strict";

module.exports = (app, server) => {
app.use("/createSecureStoreEntry", require("./routes/createSecureStoreEntry")());
app.use("/retrieveSecureStoreEntry", require("./routes/retrieveSecureStoreEntry")());
app.use("/deleteSecureStoreEntry", require("./routes/deleteSecureStoreEntry")());
};

As can be seen in the router coding above, the route implementations are created in a new folder “routes” (created within the “router” folder).

Creating a secure store entry

Within the file “createSecureStoreEntry.js”, the procedure “SYS”.”USER_SECURESTORE_INSERT” is loaded and executed via the module “@sap/hdbext”. “TEST_STORE” is used as the store name, the key is “TEST_VALUE” and the value itself is a dummy string converted to a binary.

"use strict";

module.exports = function() {
var express = require("express");
var hdbext = require("@sap/hdbext");
var app = express.Router();
app.get('/', function(req, res) {
hdbext.loadProcedure(req.db, "SYS", "USER_SECURESTORE_INSERT", function(error, proc) {
if(error) {
res.send("Error during procedure loading:" + error.message);
return;
}
proc({"STORE_NAME":"TEST_STORE", 
      "FOR_XS_APPLICATIONUSER": false, 
      "KEY": "TEST_VALUE", 
      "VALUE": Buffer.from("Test Secure Store Value")}, function(error){
if(error) {
res.send("Error during procedure execution: " + error.message);
return;
}
res.send("Entry in secure store successfully created.");
});
});
});

return app;
};

Retrieving a secure store entry

For retrieving the created value in store “TEST_STORE” with key “TEST_VALUE”, the procedure “SYS”.”USER_SECURESTORE_RETRIEVE” is used. The value, stored as binary, is converted back to a string for the output.

"use strict";

module.exports = function() {
var express = require("express");
var hdbext = require("@sap/hdbext");
var app = express.Router();
app.get('/', function(req, res) {
hdbext.loadProcedure(req.db, "SYS", "USER_SECURESTORE_RETRIEVE", function(error, proc) {
if(error) {
res.send("Error during procedure loading:" + error.message);
return;
}
proc({"STORE_NAME":"TEST_STORE", 
      "FOR_XS_APPLICATIONUSER": false, 
      "KEY": "TEST_VALUE"}, function(error, out_parameters){
if(error) {
res.send("Error during procedure execution: " + error.message);
return;
}
if(!out_parameters.hasOwnProperty("VALUE")) {
res.send("Value of secure store entry could not be determined.");
return;
}

res.send("Retrieved value: " + Buffer.from(out_parameters["VALUE"]).toString());
});
});
});

return app;
};
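The Buffer round trip used by the insert and retrieve routes can be sketched in isolation (plain Node.js, independent of HANA):

```javascript
// What the insert route sends: a string converted to a binary buffer.
const stored = Buffer.from("Test Secure Store Value");

// What the retrieve route does with the VALUE out parameter:
// convert the binary back into a readable string.
const restored = Buffer.from(stored).toString();

console.log(restored); // "Test Secure Store Value"
```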

Deleting a secure store entry

For the deletion of the secure store entry in store “TEST_STORE” with key “TEST_VALUE”, the procedure “SYS”.”USER_SECURESTORE_DELETE” is called in the same way as the other procedures.

"use strict";

module.exports = function() {
var express = require("express");
var hdbext = require("@sap/hdbext");
var app = express.Router();
app.get('/', function(req, res) {
hdbext.loadProcedure(req.db, "SYS", "USER_SECURESTORE_DELETE", function(error, proc) {
if(error) {
res.send("Error during procedure loading:" + error.message);
return;
}
proc({"STORE_NAME":"TEST_STORE", 
      "FOR_XS_APPLICATIONUSER": false, 
      "KEY": "TEST_VALUE"}, function(error){
if(error) {
res.send("Error during procedure execution: " + error.message);
return;
}

res.send("Entry in secure store successfully deleted.");
});
});
});

return app;
};

Build and run the NodeJS application


After the coding part is finished, the MTA can be built and the NodeJS application/module can be executed.

In this case the NodeJS application is running on port 51026.

Executing the /createSecureStoreEntry route results in the following:

SAP HANA XS, SAP HANA Guides, SAP HANA Learning, SAP HANA Tutorials and Materials

Executing the same route again results in an error, because of a duplicate key:

SAP HANA XS, SAP HANA Guides, SAP HANA Learning, SAP HANA Tutorials and Materials

Let's retrieve the secure store entry via the route /retrieveSecureStoreEntry:

SAP HANA XS, SAP HANA Guides, SAP HANA Learning, SAP HANA Tutorials and Materials

And finally delete it using route /deleteSecureStoreEntry:

SAP HANA XS, SAP HANA Guides, SAP HANA Learning, SAP HANA Tutorials and Materials

HANA Spatial Demos: Geocoding, Clustering, Aggregation


Business story


Let’s imagine a purely fictional story to back up our applications:

◈ Our customer is an insurance company specialised in house insurance.
◈ Currently, they have all their data in Excel files.
◈ They want to move it into a dedicated system capable of grouping houses in the same area together.

Looking back, this scenario might not be that far from reality after all.

Objectives


To fulfil our customer’s needs, we will take the following steps:

◈ Take the data from the Excel file (.csv) and import it into HANA. This data contains:
     ◈ The insurance policy number (unique business key for a policy).
     ◈ The policy holder name.
     ◈ The insured object’s address.
     ◈ And the total sum insured.
◈ Geocode the house addresses to get their coordinates.
◈ Build apps for visualising, aggregating and clustering the houses.

Technical setup


We will do all of this on a HANA Trial MDC, which we create following the steps described in the “Setup your trial SAP HANA MDC instance” SAP developer tutorial. The development itself can be done directly in the Web-based Workbench.

The Google Geocoding API is a fairly good tool for obtaining coordinates from textual addresses. To use it, we need an API key. In addition, HANA needs to “trust” Google’s SSL certificate, so we need to create a new trust store for it.

Data import


First we need to create a CDS entity for storing the input data:

    // the main table, should store all the necessary data
    entity eInsurance {
        key policyNumber:       String(32);
            personName:         String(128) not null;
            objectAddress:      String(256) not null;
            insuredSum:         Decimal(16,2) not null;
            objectLocation:     hana.ST_POINT(4326);
            objectGeocoded:     hana.TINYINT default 0;
    };

Then we can use a table import file to move the data from the CSV to the table itself:

import = [
    {
        cdstable  = "workspace.master.data::ctxSpatial.eInsurance";
        schema = "WORKSPACE_MASTER_SPATIAL";
        file = "workspace.master.data:eHouses.csv";
        header = false;
        delimField = ",";
        delimEnclosing="\"";
    }
];
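Since header is set to false and the import maps CSV fields onto the entity's columns in order, a row in the file has to follow the column order of eInsurance. A purely made-up sample row might look like this (the objectLocation field is left empty because it is filled by the geocoding job later):

```
POL-0001,"John Doe","Dietmar-Hopp-Allee 16, 69190 Walldorf",250000.00,,0
```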

Geocoding


Normally, we could use HANA’s built-in functionality: a GEOCODE INDEX with a user-defined geocoding provider. Back when I wrote these apps, I was not able to change the system configuration on an MDC to use my own provider, so I decided to just call my function through an XS Job:

{
    "description": "Perform geocoding",
    "action": "workspace.master.service:job.xsjs::batch",
    "schedules": [{
        "xscron": "* * * * * * */5"
    }]
}

Roughly speaking, the following code is responsible for calling the Google API:

/**
 * Calls the google geocoding API.
 * @param   {string}    sApiKey     The Google Maps API key.
 * @param   {string}    sAddress    The address which should be geocoded.
 * @returns {object|null}   The response body returned by the service.
 * Null if the request failed.
 */
function callGeocodingApi(sApiKey, sAddress) {
    try {
        var sUrl = "/geocode/json?address=" + encodeURIComponent(sAddress) + "&key=" + sApiKey,
            oDestination = $.net.http.readDestination("workspace.master.service", "googleApi"),
            oClient = new $.net.http.Client(),
            oRequest = new $.net.http.Request($.net.http.GET, sUrl),
            oResponse = oClient.request(oRequest, oDestination).getResponse(),
            oData = JSON.parse(oResponse.body.asString());
        return oData;
    }
    catch (e) {
        $.trace.error(e);
        return null;
    }
}

To track which addresses were already processed, I also added an objectGeocoded flag to the CDS entity.
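The job then needs to pull the coordinates out of the Google response before writing them back as an ST_POINT and setting the flag. The response shape (a status field plus a results array with geometry.location) is documented by Google; the helper below is my own sketch of just the extraction step, with the database UPDATE omitted:

```javascript
/**
 * Extracts the first result's coordinates from a parsed Google Geocoding
 * API response body. Returns null when the lookup failed or had no results.
 * @param   {object|null} oData  Parsed JSON body from callGeocodingApi.
 * @returns {object|null} An object with lat and lng properties, or null.
 */
function extractCoordinates(oData) {
    if (!oData || oData.status !== "OK" ||
            !oData.results || oData.results.length === 0) {
        return null;
    }
    var oLocation = oData.results[0].geometry.location;
    return { lat: oLocation.lat, lng: oLocation.lng };
}
```

The job can then update each row with something along the lines of NEW ST_POINT('POINT(' || lng || ' ' || lat || ')', 4326) and set objectGeocoded to 1.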

Visualisation


To view the points on a simple Google Map, we need to decompose the ST_POINT objects into the latitude and longitude components and expose them through an OData service.

For this we can create a simple CDS view:

    // the expanded view (expands the objectLocation into coords)
    // contains only the points which were geocoded successfully
    view vExpandedInsurance as select from eInsurance{
        policyNumber,
        personName,
        insuredSum,
        objectAddress,
        objectLocation.ST_Y() as objectLatitude,
        objectLocation.ST_X() as objectLongitude
    } where eInsurance.objectGeocoded = 1;

And then add it to the xsodata service:

"workspace.master.data::ctxSpatial.vExpandedInsurance" as "ExpandedInsurances"
   keys ("policyNumber")
   create forbidden
   update forbidden
   delete forbidden;

Using a simple UI5 app, we get a fairly nice view of the points:


As a bonus, if we click on a point, we get more details about the insurance policy behind it.
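Conceptually, the UI5 controller just maps each entity from the ExpandedInsurances OData set onto a map marker. A stripped-down sketch of that mapping step (the field names come from the vExpandedInsurance view above; the actual app additionally wires click handlers for the detail popup):

```javascript
/**
 * Maps entities from the ExpandedInsurances OData result set onto plain
 * marker descriptors ({position, title}) for a map control.
 * @param   {object[]} aEntities  Entries from the OData result set.
 * @returns {object[]} One marker descriptor per geocoded policy.
 */
function toMarkers(aEntities) {
    return aEntities.map(function(oEntity) {
        return {
            position: {
                lat: oEntity.objectLatitude,
                lng: oEntity.objectLongitude
            },
            title: oEntity.policyNumber + " (" + oEntity.personName + ")"
        };
    });
}
```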

Data acquisition


We don’t want our customer to always use the CSV import. It should just be there for the initial “migration” of the data.

When new policies are created, we would like to help our customer in getting decent-quality data, so we build a small UI which shows exactly where an address is when creating the policy:


Technically, this UI just reuses the existing geocoding functionality from before.

Clustering


One very interesting use case would be to cluster the insured objects. Think about natural disasters: they can affect a large number of houses located in close proximity. For an insurance company, having a large lump of risk in the same area can easily spell disaster.

Because we are not sure how the customer would like to have it, we build three different prototypes supporting this.

UI-based clustering


The first one simply uses Google’s Marker Clustering functionality. We don’t really like it though: it surely won’t scale, it doesn’t really use anything from HANA, and we don’t get to know the total sum insured in each cluster (just the number of houses).


Static clustering


The second one assumes that the user does not need to input the number of clusters, but we can decide on it. Truth be told, this variant only exists because, back when I created the applications, using a calculation view input parameter in place of the hard-coded cluster count would result in an error.

We write a fairly simple calculation view for this and then expose it through OData:

var_out = SELECT ST_ClusterID() AS "id", SUM("insuredSum") AS "totalSum", 
COUNT("insuredSum") AS "objectCount", ST_ClusterCentroid().ST_Y() AS "centerLatitude",
ST_ClusterCentroid().ST_X() AS "centerLongitude"
FROM (
SELECT "insuredSum", "objectLocation".ST_Transform(1000004326) AS "objectLocation" 
FROM "WORKSPACE_MASTER_SPATIAL"."workspace.master.data::ctxSpatial.eInsurance" 
WHERE "objectLocation" IS NOT NULL
)
GROUP CLUSTER BY "objectLocation"
USING KMEANS CLUSTERS 5;

Dynamic clustering


We don’t really like this either, because the cluster count is hard-coded. To work around the calculation view activation error, we move the code from the view into a simple XSJS script (with basically the same content, but with the cluster count parameterized). In the end, we obtain something that looks like this:

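A sketch of how the cluster count can be parameterized in the XSJS script: the query text is the same as in the calculation view above, and because the count has to be concatenated into the SQL string, it is validated first so no arbitrary SQL can be injected. The helper name is my own:

```javascript
/**
 * Builds the k-means clustering statement with a caller-supplied cluster
 * count. Throws when the count is not a small positive integer, since the
 * value is concatenated directly into the SQL text.
 * @param   {number} iClusterCount  Requested number of clusters.
 * @returns {string} The SQL statement to execute.
 */
function buildClusterStatement(iClusterCount) {
    if (typeof iClusterCount !== "number" || iClusterCount % 1 !== 0 ||
            iClusterCount < 1 || iClusterCount > 100) {
        throw new Error("Invalid cluster count: " + iClusterCount);
    }
    return 'SELECT ST_ClusterID() AS "id", SUM("insuredSum") AS "totalSum", '
        + 'COUNT("insuredSum") AS "objectCount", '
        + 'ST_ClusterCentroid().ST_Y() AS "centerLatitude", '
        + 'ST_ClusterCentroid().ST_X() AS "centerLongitude" '
        + 'FROM (SELECT "insuredSum", '
        + '"objectLocation".ST_Transform(1000004326) AS "objectLocation" '
        + 'FROM "WORKSPACE_MASTER_SPATIAL"."workspace.master.data::ctxSpatial.eInsurance" '
        + 'WHERE "objectLocation" IS NOT NULL) '
        + 'GROUP CLUSTER BY "objectLocation" '
        + 'USING KMEANS CLUSTERS ' + iClusterCount;
}
```

The script then executes the returned statement with the connection's prepareStatement and serializes the result set as JSON.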

Aggregation


The customer might also want to select manually how to group the points together. To support this kind of scenario, we also create two different prototypes. For both prototypes, we want to:

◈ Split all the points into two groups based on some kind of user input.
◈ Display which points are in each group.
◈ Aggregate the total sum insured per group.

Radius-based aggregation


In this prototype, we split the points into two groups based on their distance to a user selected point. If they are inside a circle of given radius and center, then they are in the “inside” group, otherwise they are “outside”.

For determining this relationship, we use the following query that makes use of the ST_DISTANCE method:

 var_out = SELECT "policyNumber", 
"objectLocation".ST_Y() AS "objectLatitude", 
"objectLocation".ST_X() AS "objectLongitude", 
CASE WHEN "objectLocation".ST_DISTANCE(NEW ST_POINT(
    'POINT(' || :IV_LNG || ' ' || :IV_LAT || ')', 4326), 'meter') < :iv_radius 
THEN 1 ELSE 0 END AS "group"
FROM "WORKSPACE_MASTER_SPATIAL"."workspace.master.data::ctxSpatial.eInsurance"
WHERE "objectLocation" IS NOT NULL;

Similarly, we build a view for the aggregated values. Here we can take into consideration the fact that we will always have two groups (“inside” and “outside”):

var_out = SELECT 1 AS "group", IFNULL(SUM("insuredSum"), 0) AS "totalSum", COUNT("policyNumber") AS "objectCount"
FROM "WORKSPACE_MASTER_SPATIAL"."workspace.master.data::ctxSpatial.eInsurance"
WHERE "objectLocation" IS NOT NULL AND "objectLocation".ST_DISTANCE(
NEW ST_POINT('POINT(' || :IV_LNG || ' ' || :IV_LAT || ')', 4326), 'meter') < :iv_radius
UNION ALL
SELECT 0 AS "group", IFNULL(SUM("insuredSum"), 0) AS "totalSum", COUNT("policyNumber") AS "objectCount"
FROM "WORKSPACE_MASTER_SPATIAL"."workspace.master.data::ctxSpatial.eInsurance" 
WHERE "objectLocation" IS NOT NULL AND "objectLocation".ST_DISTANCE(
NEW ST_POINT('POINT(' || :IV_LNG || ' ' || :IV_LAT || ')', 4326), 'meter') >= :iv_radius;
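Both queries assemble the WKT point literal by string concatenation, and WKT requires a space between the longitude and latitude values (POINT(lng lat)), which is easy to drop. A tiny helper for building the literal, sketched in JavaScript to match the XSJS code elsewhere in the post (the helper name is my own):

```javascript
/**
 * Builds a WKT point literal from longitude and latitude.
 * Note the mandatory space between the two coordinate values.
 * @param   {number} fLng  Longitude in degrees.
 * @param   {number} fLat  Latitude in degrees.
 * @returns {string} The WKT literal, e.g. "POINT(8.64 49.29)".
 */
function wktPoint(fLng, fLat) {
    return "POINT(" + fLng + " " + fLat + ")";
}
```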

Then we just display these two sets of data on a simple “picker” UI:


Region-based aggregation


The second version actually lets the user draw the region as a polygon. Thankfully, we can use the built-in ST_CoveredBy predicate to work this out:

var_out = SELECT "policyNumber", "objectLocation".ST_Y() AS "objectLatitude", 
"objectLocation".ST_X() AS "objectLongitude",
"objectLocation".ST_CoveredBy(NEW ST_POLYGON(:iv_polygon, 4326)) AS "group"
FROM "WORKSPACE_MASTER_SPATIAL"."workspace.master.data::ctxSpatial.eInsurance"
WHERE "objectLocation" IS NOT NULL;

The polygon itself is an input variable to this calculation view. The map includes a set of drawing tools which also support polygons. Using this functionality we can build a simple UI for this prototype as well:


Make your HANA Express impersonate Cloud Foundry

When I help partners set up their HANA Express servers for development use, I want to configure them to impersonate the behavior of Cloud Foundry so that they get used to how the XSA generated URLs work.  This approach also tends to promote a more deployment target agnostic behavior and leads to projects that deploy into both HANA XSA on-prem and Cloud Foundry environments without any code differences.

Note: In this post, I’m using the fully qualified domain name for my server’s name as hxe.local.com.  You will want to replace this name throughout the example commands in this post with your choice for a server name.

The primary problem is that the HANA Express (HXE) installer assumes ports routing mode. In ports routing mode, applications provided by the XSA system get URLs like this:

https://hxe.local.com:53075

URLs in Cloud Foundry (CF) look like this:

https://webide.cfapps.us10.hana.ondemand.com

While I can’t (currently) cause apps deployed into CF to use my own domain name, I can configure HXE to create URLs in a similar way by installing it with hostnames routing mode, which will result in a URL like this:

https://webide.hxe.local.com

Note that while I’m making an HXE install behave more like CF, I’m not duplicating other aspects of CF like stubbing up all the same services that CF makes available.  I’ll save that exercise for another blog post.

Another aspect that I can configure in HXE to make it behave more like CF is to define a trust relationship with an external Identity Provider (IdP) and use it as the source of the application’s users.  By default, HXE will use the local HANA DB as the store of application users; in fact, this is how the XSA_ADMIN, XSA_DEV, and other utility users are defined.  However, I try to promote the use of an external IdP, and HANA SPS03 now allows you to configure your app’s xsuaa service instance to skip the normal (DB-defined user) login page and redirect the user directly to the external IdP’s login page seamlessly.  Again, this is fodder for another blog post, so I won’t cover it here.

So the main point of this post is to force the HXE installer into hostnames routing mode and, as a bonus, show you how to use real (or at least your own) certificates so that your browser doesn’t complain constantly about your HXE server.

These instructions assume that you’ll be installing on a Linux X86 based VM and compatible OS distribution.  You can use AWS, or Azure, or GCP, or your own laptop to provide this, so I won’t go into detail.  I will advise you to use a 32GB image or hardware that has 32GB of RAM or larger.  You may be able to get by with 16GB, but if you try to do it in 16GB, I’d advise to make sure you’ve also got 16GB of swap available and that you install things a little at a time (by breaking up the installer script) and turning things off as you go.

Also, I’m going to be assuming OpenSuSe with the Leap 42.3 repos.  YMMV.

With this in mind go get the binary installer files by signing up for HXE and using the latest downloader.

Note: This post is specific to the HXE installer distributed in April, 2018 with version 2.00.030…


Once you’ve got the tgz files, use SCP to move them to your VM or server.

SSH into your VM as root (or become root) and do some updating before we start installing anything.

Make sure the system libs are up to date.
zypper repos
zypper refresh
zypper update

Run update again to be sure.

zypper update
Loading repository data...

Reading installed packages...

Nothing to do.

Reboot to let the new versions of shared libs take hold.

shutdown -r now

Check that /etc/hosts has a Fully Qualified Domain Name (FQDN) for this host pointing to its loopback address.

cat /etc/hosts | grep hxe.local.com
127.0.0.1    localhost hxe.local.com

The DNS resolution mechanism needs to be able to resolve ANY hostname in your machine’s domain.  This can be accomplished in two ways.  If this server is on the public internet and you have control over the administration of your domain, you can use your domain provider’s tools to set up both an “A” record and a “CNAME” record to accomplish this.

hxe    IN A      123.45.67.123

*.hxe  IN CNAME  hxe.local.com

As long as requests made to the server’s external IP address get redirected back to its internal address, this will work.

If however, you are installing within a VM on real hardware or Docker or on a server that is on a NAT protected sub-network, you will need to find a way to locally override the external DNS resolution and provide for wildcard DNS matching for any hostname in your machine’s domain.
The easiest tool I’ve found to accomplish this is dnsmasq.

Install dnsmasq using the SuSE package manager.

zypper in dnsmasq

Edit the DNSMasq config file to add an address entry for your host’s domain name.

vi /etc/dnsmasq.conf

Find the right section by searching for “double-click”, then duplicate, uncomment, and adjust the address line (don’t forget the leading “.”).
...

address=/.hxe.local.com/127.0.0.1

...

Save your file (with VI, esc + :wq)

Restart dnsmasq to effect the change.

service dnsmasq restart

Verify with dig pointing to the local DNS server (dnsmasq):

dig hxe.local.com @127.0.0.1
...
;; ANSWER SECTION:
hxe.local.com. 0 IN A 127.0.0.1
...
Verify other wildcard variations.

dig abc.hxe.local.com @127.0.0.1

Adjust the host’s DNS resolution order to use the local server first before others.

vi /etc/sysconfig/network/config
...

NETCONFIG_DNS_STATIC_SERVERS="127.0.0.1"

...

Trigger a rebuild of the /etc/resolv.conf file.

netconfig update -f

Double check that the /etc/resolv.conf file got generated properly.  The first nameserver line should be 127.0.0.1

cat /etc/resolv.conf

...

search us-west-1.compute.internal


nameserver 127.0.0.1

nameserver xxx.xx.xx.xxx

Test the default DNS search order by not specifying a server with dig.

dig hxe.local.com
...
;; ANSWER SECTION:

hxe.local.com. 0 IN A 127.0.0.1

...

And other variations.

dig abc.hxe.local.com
dig xyz.hxe.local.com

They should all resolve to the same server address as before.

Verify with a ping to be sure the search order is being followed.

ping hxe.local.com
PING localhost (127.0.0.1) 56(84) bytes of data.

64 bytes from localhost (127.0.0.1): icmp_seq=1 ttl=64 time=0.016 ms

64 bytes from localhost (127.0.0.1): icmp_seq=2 ttl=64 time=0.021 ms

64 bytes from localhost (127.0.0.1): icmp_seq=3 ttl=64 time=0.019 ms

^C
ping abc.hxe.local.com

Enable the dnsmasq daemon to start at system startup.

chkconfig dnsmasq on

Created symlink from /etc/systemd/system/multi-user.target.wants/dnsmasq.service to /usr/lib/systemd/system/dnsmasq.service.

systemctl enable dnsmasq.service
systemctl start dnsmasq.service

Verify it’s set to start on startup.
systemctl list-unit-files | grep dnsmasq
dnsmasq.service       enabled

REALLY check that it works after a reboot!

shutdown -r now

After your server comes back up and you can ssh in as root.

ping anything.hxe.local.com

OK, now that we’ve proven our server can resolve any hostname in our FQDN’s domain, we can proceed to the installation.  If you haven’t proven that this is the case, DO NOT CONTINUE!  Go back and correct the DNS resolution until it’s working, or the installation WILL fail.

Based on where you copied the installation files on your server, you will need to unpack them in a convenient place.  I like to use a top-level folder called “install”.  If you use a different place or name, adjust these commands as needed.  I also copied the files on my server into the /var/tmp folder, so again, adjust as needed for your situation.

Unpack the HXE SPS03 installer files:

cd /install
tar xzvf /var/tmp/hxe.tgz
tar xzvf /var/tmp/hxexsa.tgz

The HXE installer scripts are hard-coded to use ports routing mode.  Since we want to use hostnames routing mode, I created a script that adjusts the stock HXE installer scripts so that hostnames mode is specified.  It also allows you to change the org and development space names, which by default get created as HANAExpress and development.

Git clone this script repo into the /install folder.

git clone https://github.com/alundesap/hxe_installer_scripts.git

Move the script(s) to the current folder

mv hxe_installer_scripts/* .

Remove the repo folder

rm -rf hxe_installer_scripts/

Run the prep4hostnames.sh script and enter your specifics

./prep4hostnames.sh

Enter fully qualified host name: hxe.local.com

Enter organization name: MyHXE

Enter development space name: DEV

Use the screen command because the install takes a while and will otherwise fail if you lose your connection.

screen

Ctrl-A + d to disconnect from screen
screen -R to reconnect

Run the setup script.

./setup_hxe.sh

Use defaults except…

instance number = 00

local host name = hxe.local.com

password = <secret>

When prompted with “Enter component to install:”, select server.  I’ve found that the installer will sometimes fail when all is selected.  It’s better to let the server-only installation finish and sit for about 15 minutes before re-running the installation script.

Expect this process to take around 15 minutes to complete.  Once completed, wait about another 15 minutes for the machine to settle before re-running the install script and installing XSA (don’t ask me why).

Rerun the install script.  Answer the prompts with the same values you did the first time.

./setup_hxe.sh

When you get to this prompt answer yes.

Detected server instance HXE without Extended Services + apps (XSA)

Do you want to install XSA? (Y/N): Y

The XSA install takes on the order of 90 minutes to complete.  This is a good time to walk the dog, get lunch/dinner, etc.

You can monitor which processes are listening on which ports with this command in another shell (as root):

while sleep 5; do clear; date; lsof -n -i -P | grep LISTEN | grep -v 127.0.0.1 | sort --key=9.3,9.7; done

Once you see a process running on port 30033, XSA has started up and is installing content.

sapwebdis  1467            hxeadm   12u  IPv4 294103      0t0  TCP *:30033 (LISTEN)

Once you see this line of output, you can open a new console, become hxeadm, log in as XSA_ADMIN, and watch the remaining install of the XSA components.

[37] XS Controller API available at 'https://api.hxe.local.com:30033'. Creating a connection to this URL using the XS Commandline Client should get you started.

su - hxeadm
xs api https://api.hxe.local.com:30033/  --skip-ssl-validation
xs login -u XSA_ADMIN -p <secret> -o MyHXE -s SAP

If you list all the apps running in the SAP space you’ll see that they all now use port 30033.

xs apps

This is the default port when using hostnames routing mode.  You can change this to the default SSL port, 443, so that you won’t have to specify a port at all.  The next set of steps accomplishes this.

Note that you should check that no other webserver process is already using port 443 before continuing.

Be sure that you are running as hxeadm.

The following are HANA SQL command-line invocations, each executing a single statement.  The first one just shows the contents of the xscontroller.ini configuration.  We need to change some settings and then verify they are correct in the database before continuing.

hdbsql -u SYSTEM -p <secret> -i 00 -d SYSTEMDB "SELECT * FROM M_INIFILE_CONTENTS WHERE FILE_NAME='xscontroller.ini'"
FILE_NAME,LAYER_NAME,TENANT_NAME,HOST,SECTION,KEY,VALUE

...

"xscontroller.ini","SYSTEM","","","communication","api_url","https://api.hxe.local.com:30033"

"xscontroller.ini","SYSTEM","","","communication","default_domain","hxe.local.com"

"xscontroller.ini","SYSTEM","","","communication","internal_https","true"

"xscontroller.ini","SYSTEM","","","communication","router_https","true"

"xscontroller.ini","SYSTEM","","","communication","routing_mode","hostnames"

...
Press "q" to exit.

Set the router_port to 443.

hdbsql -u SYSTEM -p <secret> -i 00 -d SYSTEMDB "alter system alter configuration('xscontroller.ini','SYSTEM') SET ('communication','router_port') = '443' with reconfigure"

Set the listen_port to 443.

hdbsql -u SYSTEM -p <secret> -i 00 -d SYSTEMDB "alter system alter configuration('xscontroller.ini','SYSTEM') SET ('communication','listen_port') = '443' with reconfigure"

Set the api_url to include 443.

hdbsql -u SYSTEM -p <secret> -i 00 -d SYSTEMDB "alter system alter configuration('xscontroller.ini','SYSTEM') SET ('communication','api_url') = 'https://api.hxe.local.com:443' with reconfigure"

Review all the settings to make sure they are correct.

hdbsql -u SYSTEM -p <secret> -i 00 -d SYSTEMDB "SELECT * FROM M_INIFILE_CONTENTS WHERE FILE_NAME='xscontroller.ini'"
FILE_NAME,LAYER_NAME,TENANT_NAME,HOST,SECTION,KEY,VALUE

...

"xscontroller.ini","SYSTEM","","","communication","api_url","https://api.hxe.local.com:443"


"xscontroller.ini","SYSTEM","","","communication","default_domain","hxe.local.com"


"xscontroller.ini","SYSTEM","","","communication","internal_https","true"

"xscontroller.ini","SYSTEM","","","communication","listen_port","443"

"xscontroller.ini","SYSTEM","","","communication","router_https","true"

"xscontroller.ini","SYSTEM","","","communication","router_port","443"

"xscontroller.ini","SYSTEM","","","communication","routing_mode","hostnames"

"xscontroller.ini","SYSTEM","","","communication","single_port","true"

"xscontroller.ini","SYSTEM","","","general","embedded_execution_agent","false"

"xscontroller.ini","SYSTEM","","","persistence","hana_blobstore","true"

Press "q" to exit

Stop the HANA processes.

./HDB stop

Drop out of hxeadm back to root user.

exit

Now, as root, complete these steps.  What we’re doing here is replacing the icmbnd executable with one that can bind to a privileged port; ports below 1024, such as 443, require root permission to bind.

cd /hana/shared/HXE/xs/router/webdispatcher
cp icmbnd.new icmbnd
chown root:sapsys icmbnd
chmod 4750 icmbnd
ls -al icmbnd

The last line should show the icmbnd file with these permissions.

-rwsr-x--- 1 root sapsys 2066240 May  4 04:03 icmbnd

Completely reboot the server so that all the processes change from 30033 to 443.

shutdown -r now

After the reboot, verify that things are now correct.

If you want to watch server processes and the ports they bind to, use this command loop as root (Ctrl-C to break the loop).

while sleep 5; do clear; date; lsof -n -i -P | grep LISTEN | grep -v 127.0.0.1 | sort --key=9.3,9.7; done

You should notice that there is no longer a process binding to port 30033, but now one on 443.

sapwebdis  4764    hxeadm   14u  IPv4  56676      0t0  TCP *:443 (LISTEN)

Break the above loop with Ctrl-C.

Now check how things look with the xs command.  First become the hxeadm user.

su - hxeadm

Reset the xs api endpoint, since it will still think it needs to be using port 30033.

xs api https://api.hxe.local.com:443/ --cacert=/hana/shared/HXE/xs/controller_data/controller/ssl-pub/router/default.root.crt.pem

Re-login as XSA_ADMIN.

xs login -u XSA_ADMIN -p <secret> -o MyHXE -s SAP

You should now see that the apps running in the SAP space no longer show port 30033; in fact, they show no port at all, because 443 is the default SSL port.

Here is the line for the WebIDE app.

webide   STARTED           1/1         512 MB    <unlimited>   https://webide.hxe.local.com

If you browse to one of the app URLs, you will be greeted with the dreaded invalid certificate authority warning.  This is because, during the installation, the installer used a self-signed certificate that isn’t known to any of the browser’s built-in certificate authorities.

https://webide.hxe.local.com/


Even if you choose to create an exception for this certificate in your browser, you’ll find that a few things just don’t work properly.  It’s best to use a certificate that your local system really trusts.  This can be accomplished in one of two ways.

The easiest, but most costly, way is to go to an SSL certificate provider and purchase a wildcard SSL certificate for your server’s fully qualified domain name.  In the US, I use GoDaddy, and a quick check of their website shows that a certificate like this will cost me at least $300 per year.


The biggest advantage of this approach is that ANY browser will already have GoDaddy’s certificate authority certs loaded and trusted, so this purchased cert will be immediately recognized as secure.  You will have to do this for production servers anyway.

The disadvantages are the cost, proving you are the rightful owner of the domain name, the time it takes to purchase and install, and the fact that it will expire, forcing you to renew and re-install an updated certificate in time.

Another way to go is to proclaim yourself a valid and trusted certificate authority in your own right!  That way, you can generate your own certificates as you need them, you can set them to expire a long time from now, and it costs you nothing more than your time and effort.

The downside is that nobody trusts you.  However, you can establish that trust manually by installing your certificate authority (CA) certificate and your intermediate certificate on each client machine.  Since you’re likely to have only a handful of client machines for development and testing (if this server is intended for production use, you’ll have to pay for a real certificate), you will only need to trust your CA in a few places.  And since it’s you doing it, you trust yourself not to be malicious, don’t you?

I’ve created a set of scripts that you’ll need to run as root to accomplish this.  I’m not going to get into the details of what they are doing.  If you’re interested, just inspect the scripts themselves.  I want to give Jamie Nguyen props for his excellent info that inspired these scripts.

OpenSSL Certificate Authority

Shell into your server and be sure to be the root user.

We will create a folder in the /root folder so change into that folder.

cd /root

Now git clone this repo.

git clone https://alunde@bitbucket.org/alunde/ca.git

This will create a folder inside /root called ca.  Change into that folder.

cd ca

We need to create the top-level certificate authority certificate first.  When you are prompted to create a passphrase, use something non-trivial and make a note of it in a secure place.

./1_gen-root-key-cert

You’ll be prompted for information about your company, location, etc.  Just fill in reasonable values that you’ll recognize when you inspect the certificate.

You use this certificate to “sign” an intermediate certificate, so let’s create that one with the next script.  Again, be careful when you’re asked to provide the passphrase for the CA certificate you just created, and when you’re asked to create a passphrase for the intermediate certificate.  I like to use the same passphrase for the intermediate certificate, but this is up to you.  As before, keep this passphrase super secret.

./2_gen-intermediate-key-cert

Again you’ll be prompted for info about your company, etc.  Enter values as before.

Now change into the intermediate folder for the real work of creating the certificate that we’ll use for our server.

cd intermediate

Now run the script to generate the SSL certificate for the server.

./gen-xsa-ca-ssl-cert

You’ll be prompted to enter your server’s fully qualified host name.

Enter fully qualified host name: hxe.local.com

The script then reminds you that, when you are prompted for the Common Name, you should add an asterisk and dot (*.) to the front of your server’s host name.  This is very important to do correctly or the certificate won’t be generated properly.

When prompted for Common Name []: use this *.hxe.local.com

You are also reminded to use the passphrase you created when you created the intermediate certificate in step #2 above.

Enter “Y” to continue.
----
Country Name (2 letter code) [US]:

State or Province Name [State]:

Locality Name []:

Organization Name [My Corp, Inc.]:

Organizational Unit Name []:

Common Name []:*.hxe.local.com

Email Address [info@local.com]:

You’ll see a lot of output describing the steps the script is taking, along with the results of inspecting the certificate so that you can verify it was created properly.  The resulting certificate is stored in the certs sub-folder, and the key file that goes with it is stored in the private sub-folder.  The script also hints at the steps needed to install the certificate into the XSA system.  Since we have already shifted our server from port 30033 to port 443, you’ll need to adjust the command slightly.

cd /root/ca/intermediate/
/hana/shared/HXE/xs/bin/xs api https://api.hxe.local.com:443 --skip-ssl-validation
/hana/shared/HXE/xs/bin/xs login -u XSA_ADMIN -p <secret> -s SAP
/hana/shared/HXE/xs/bin/xs set-certificate hxe.local.com -k private/hxe.local.com.key -c certs/hxe.local.com.pem

We need to restart the HANA system for the new certificate to take hold.

While we’re waiting, we can also replace the self-signed certificate that the XS classic web dispatcher uses.  This isn’t critical unless you’re using the older XS scripting engine.

Here are the steps I use to accomplish this.

As the root user, copy the certs to the sec folder.

cd /hana/shared/HXE/HDB00/hxe.local.com/sec
cp SAPSSLS.pse SAPSSLS_pse.bak
cp /root/ca/intermediate/certs/ca-chain.cert.pem .
cp /root/ca/intermediate/certs/hxe.local.com.cert.pem .
cp /root/ca/intermediate/private/hxe.local.com.key .

Now become the hxeadm user and generate the new pse file.  Make sure to enter a blank passphrase.

su - hxeadm
cd /hana/shared/HXE/HDB00/hxe.local.com/sec
sapgenpse import_p8 -p hana2sp03.pse -r ca-chain.cert.pem -c hxe.local.com.cert.pem hxe.local.com.key

Copy the newly generated one over the one the system uses.  Then restart HANA.

cp hana2sp03.pse SAPSSLS.pse
cd /usr/sap/HXE/HDB00
./HDB stop ; ./HDB start

When the HDB start command finishes, it’s important to remember that most of the XSA applications are started in the background.  This means it can take some time before they are all fully functional.  If you start interacting with the system before then, you will likely see issues, so it’s best to wait for everything to come up completely.

How long, you might ask?  Well, you can watch things to be sure.  First, as root, watch to see when port 443 is bound to a process.  We did this above, but I’ll put the command here again.  Press Ctrl-C to break out of the loop.

while sleep 5; do clear; date; lsof -n -i -P | grep LISTEN | grep -v 127.0.0.1 | sort --key=9.3,9.7; done

Once you see that, become the hxeadm user and set the api to use the new certificate and port 443 and then login and check the state of the apps running in the SAP space.

su - hxeadm
xs api https://api.hxe.local.com:443 --cacert /hana/shared/HXE/xs/controller_data/controller/ssl-pub/router/default.root.crt.pem
xs login -u XSA_ADMIN -p <secret> -s SAP​

Now check the running apps.

xs apps

The second column will be either STARTED or STOPPED for each app.  This is deceiving until you notice the column header is “requested state”, not “actual state”.  The apps that have actually started completely are those where the number of instances started equals the number of instances requested.  So initially, in the third column, you’ll see a lot of 0/1 entries.  Keep running the xs apps command until those with STARTED show 1/1.  Again, this will take some time, but keep checking back.

webide    STARTED   1/1   512 MB    <unlimited>   https://webide.hxe.local.com
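Rather than scanning the whole list by eye, you can let awk flag the stragglers. The sample output below is fabricated for illustration; the column positions match the listing above, but double-check them against your own xs apps output.

```shell
# Sample xs apps output (fabricated for illustration)
cat > /tmp/xs_apps.txt <<'EOF'
webide     STARTED   1/1   512 MB   <unlimited>   https://webide.hxe.local.com
di-core    STARTED   0/1   768 MB   <unlimited>   https://di-core.hxe.local.com
hrtt-core  STOPPED   0/1   512 MB   <unlimited>   https://hrtt-core.hxe.local.com
EOF
# Print apps whose requested state is STARTED but whose actual instance
# count has not yet reached the requested count.
awk '$2 == "STARTED" { split($3, n, "/"); if (n[1] != n[2]) print $1 }' /tmp/xs_apps.txt
# prints: di-core
```

When this prints nothing, everything that was requested to start has started.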

While we are waiting, we can check the info screen of the api endpoint in our browser.  Browse to this URL.

https://api.hxe.local.com/v2/info

But wait a second you say!  My browser is still showing that it’s not secure!

SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides, SAP HANA Learning

If you’re using the Chrome browser, fire up the developer tools window and select the Security tab, where you can inspect the details of the certificate.

SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides, SAP HANA Learning

SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides, SAP HANA Learning

These (blurred) values should match what you entered when creating the certificate authority certificate above.

But it’s still not trusted…

That’s because we haven’t yet told your local system to trust certificates created with our own CA.  To do this, we must first copy the CA certificate and the intermediate certificate to our local system.

Copy the ca.cert.pem and intermediate.cert.pem files from your server to the local system.  I’ll use scp, but if that isn’t available, you can always dump the contents of the pem files with the cat command and then just cut/paste them into text files on your local system.

scp hxe.local.com:/root/ca/certs/ca.cert.pem .
scp hxe.local.com:/root/ca/intermediate/certs/intermediate.cert.pem .

Now that you have local copies, import them into your local system’s trust store.  This varies by operating system.  On Mac, use the Keychain Access application and set the CA cert to “Always Trust”.  On Windows, use the Certificates button on the Content tab of the Internet Options system tool to import them into Trusted Root Certification Authorities and Intermediate Certification Authorities respectively.  You’ll need local admin rights to do this; if your system is locked down by IT, you’ll have to get them to do it for you or you’re just plain out of luck.
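If you prefer the command line over the GUI tools, both platforms expose the same import via CLI. Treat the commands below as a sketch: the keychain path and store names are assumptions to verify against your environment, and both require admin rights, so they are shown as comments rather than executed.

```shell
# macOS (sketch, needs sudo): trust the root CA system-wide
#   sudo security add-trusted-cert -d -r trustRoot \
#        -k /Library/Keychains/System.keychain ca.cert.pem
# Windows (sketch, elevated prompt): import root and intermediate certs
#   certutil -addstore Root ca.cert.pem
#   certutil -addstore CA intermediate.cert.pem
echo "imported: ca.cert.pem intermediate.cert.pem"
```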

Once this is done, it’s time to test.  Load the api URL as before and see that the location bar has a happy green lock icon now.  If it doesn’t, you may have to quit your browser and restart it as some browsers hang onto certificates for a while even when a new one is available.

SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides, SAP HANA Learning

You should now be able to get to the WebIDE without any issues of the browser complaining (assuming it’s had enough time to get started by now).

Note: Remember to login with the XSA_DEV user not the XSA_ADMIN user when accessing the WebIDE as the XSA_DEV user has the proper role collections assigned to it.

https://webide.hxe.local.com

Let’s also confirm that the XS classic web dispatcher has its certificate correctly set.  Browse to this URL and check that it also shows a happy green lock icon in the browser’s location bar.

https://hxe.local.com:4300/

You should see this screen.

SAP HANA Certifications, SAP HANA Studio, SAP HANA Guides, SAP HANA Learning

HANA SLT(System Landscape Transformation) Archived Files(SARA) Replication to HANA

In this blog we are going to discuss how to replicate data that has been archived from the source system into SARA files.

In almost every project, archiving is performed based on a business agreement, for example keeping only the last 12 or 24 months of data online.

If the archive-user exclusion setting has been in place on the SLT system since the initial phase of the project, we are fine. In some cases, however, we may have to reload tables from the source system, and if those tables have already been archived, we can use the steps below to replicate the archived data directly from the SARA files without first loading it back into the source system.

Steps to Replicate Archived Data to HANA Directly from SARA files.


1. First, check the HANA table whose data has been archived, to confirm that no record counts are shown for the affected periods.
Example: The table below has no data for 2014 because it has been archived, so you won’t find any records on the source system either.
HANA:

SAP HANA Tutorials and Materials, SAP HANA Certifications, SAP HANA Replication, SAP HANA Learning

Source System Records for 2014:

SAP HANA Tutorials and Materials, SAP HANA Certifications, SAP HANA Replication, SAP HANA Learning

2. Go to the SLT system and use SE38 to execute the program IUUC_CREATE_ARCHIVE_OBJECT.

SAP HANA Tutorials and Materials, SAP HANA Certifications, SAP HANA Replication, SAP HANA Learning

If the archive object is not correct, you won’t see any options for the tables archived through that archive file name, so make sure to enter the correct name.

Leave the portion size at its default of 8,000.

Use the archive date from/to fields if the source system was archived multiple times but you only want to retrieve data for specific periods; enter the dates between which the relevant archive job was executed.

Keep the write behavior at 3 so that replication uses array modify.

When you execute the program, and the archive object name is correct, the screen below is shown, where you select the tables for which you want to replicate data from the SARA file.

SAP HANA Tutorials and Materials, SAP HANA Certifications, SAP HANA Replication, SAP HANA Learning

After selecting the tables, use the Enter button shown below to proceed.

SAP HANA Tutorials and Materials, SAP HANA Certifications, SAP HANA Replication, SAP HANA Learning

As soon as you press the Enter button, the migration object for replication is created and the success screen below is displayed with the generated names.

SAP HANA Tutorials and Materials, SAP HANA Certifications, SAP HANA Replication, SAP HANA Learning

Press Enter to see the next success message.

SAP HANA Tutorials and Materials, SAP HANA Certifications, SAP HANA Replication, SAP HANA Learning

3. Go to the LTRC transaction on the SLT system to check whether the migration object was created with the name used in step 2.

SAP HANA Tutorials and Materials, SAP HANA Certifications, SAP HANA Replication, SAP HANA Learning

4. If the migration object shows as loaded in the step above, next check the HANA table record counts for the same table and period we checked earlier.

SAP HANA Tutorials and Materials, SAP HANA Certifications, SAP HANA Replication, SAP HANA Learning

With this, we have successfully replicated data from the SARA files directly to HANA without involving the source system.

Generating and Deploying HANA CDS Wrapper Views


Overview of Data Replication in S/4HANA Cloud


In the S/4HANA Cloud environment, data can be replicated with these three options:

◈ Whitelisted APIs
◈ Legacy APIs
◈ Data Models

Whitelisted APIs

SAP provides whitelisted APIs (on the SAP API Business Hub) that can be consumed either as OData or SOAP services. Data replication is a pull mechanism in which a user triggers the replication. Detailed documentation is available to facilitate the search for the required APIs.
Legacy APIs

SAP delivers legacy APIs using BAPIs and IDOCs. These APIs are primarily used for SAP to SAP based integrations.

Data Models

Data from S/4HANA systems can be replicated to SAP Cloud Platform systems via an automatic or manual provision using CDS views.

The Replicate CDS Views application is based on data model replication and can be used to replicate data from an S/4HANA system to an SAP Cloud Platform system using CDS views. This gives a customer access to data on the distributed systems in the central SAP Cloud Platform system. This facilitates business analysis of  data mart scenarios.

You can also create analytical applications by developing an OData service based on the replicated CDS views. For analytics, a key user can create Calculation views to manipulate data based on the replicated CDS views. An OData service can be used to project the data in the UI.

Why do we need HANA CDS Wrapper views?


Calculation views that are based on replicated CDS views often become invalid when fields are extended. Field level extensibility such as a change in the data type or length of a field results in an invalid Calculation view and the generated reports do not reflect correct data.

To overcome this, a HANA CDS view can be created as a wrapper for the replicated CDS view (router view). The HANA CDS wrapper view abstracts the underlying router view so that the Calculation view is unaffected by fields that are extended as it is based on the wrapper view.

A HANA CDS wrapper view is generated using the deploy.js program (which is a Node.js program). The deploy.js app uses the metadata of a replicated CDS view and generates the definition for a HANA wrapper view and a synonym file.

These sections detail the steps required to generate a HANA CDS Wrapper view.

Generating a HANA CDS wrapper View


These files are generated by the deploy.js program

◈ HANA Database Data Definition (.hdbdd) file
◈ Synonym file (.sql) using the metadata of a replicated CDS view

Note: You must download the metadata of a replicated CDS view using the Replicate CDS Views app.

To generate the definition and synonym files for a router view, follow these steps:

◈ Download and install the Node.js application.
◈ Open the Node.js command prompt.
◈ Download the ZIP file attached to the SAP note 2589286.

The WrapperGeneratorDeploy folder contains the deploy.js application that generates the definition and synonym files for a router view.

◈ Download the metadata (JSON) file using the Replicate CDS Views app and copy the downloaded file to the Input folder.

◈ At the prompt, run the cd command to change to the WrapperGeneratorDeploy folder. For example: Prompt>cd <drive>:<path>\WrapperGeneratorDeploy

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

◈ Run the deploy app using this syntax:
node deploy <source schema> <target schema> <CDS view name>
where
<source schema> is the schema in which the router views are created in SAP Cloud Platform
<target schema> is the schema in which the HANA CDS wrapper views are created in SAP Cloud Platform
<CDS view name> is the name of the replicated CDS view

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

◈ A confirmation message stating the successful creation of the HANA CDS view definition and synonym file is displayed.

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

◈ The HDBDD (HANA CDS view definition) file is available in the Output folder and the SQL (synonym) file is available in the Synonym folder.

Deploying a HANA CDS wrapper view


To deploy a HANA CDS wrapper view on the target schema, follow these steps:

◈ Create a package for a HANA CDS wrapper view
◈ Create a target schema
◈ Create a synonym file in the target schema
◈ Create a definition file in the target schema

To create a package for a HANA CDS wrapper view

◈ Log on to the SAP Cloud Platform system with your credentials.
◈ On the home screen, choose CATALOG.
◈ Expand the Views folder in the replicated view schema.

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

◈ To see the data in a view, open the view and choose the Open Content option.

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

◈ To create a package for the HANA CDS wrapper view, choose EDITOR on the home screen.
◈ From the Content context menu, choose New > Package
The Create Package dialog opens.
◈ Enter a package name and choose Create.

To create a target schema

◈ Choose CATALOG on the home screen. In the left side panel, right click anywhere and choose New > Schema.

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

The Create new Schema dialog opens.

◈ Enter a schema name and choose OK.

To create a synonym file in the target schema

◈ On the home screen, choose Editor.

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

◈ From the context menu of the replicated target schema (XXXX_WRAPPER_TEST), choose a new file or import a file.
◈ Enter a name for the new SQL file
For example, I_COSTCENTER.sql
◈ Copy the content of the generated synonym file (output of the deploy.js program) to this new file, I_COSTCENTER.sql

Note: If you import the generated synonym file, you can skip this step.

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

◈ Save and choose <name of the action button> to run the synonym file. The synonym file is generated in the Synonyms folder in the target schema XXXX_WRAPPER_TEST.

To create a definition (HDBDD) file in the target schema

◈ From the context menu of the target schema, create a new file or import the generated HDBDD file (output of the deploy.js program).
◈ Enter a name for the new HDBDD file and choose Create. For example, I_COSTCENTER.hdbdd

Note: If you import the generated definition file, you can skip this step.

◈ Enter the following information in the HDBDD file:

1. namespace (the package created in the target schema)
2. schema (schema name of the HANA CDS wrapper view created in the target schema)
3. context (same name as the HDBDD file)

◈ Copy the generated HANA CDS wrapper definition (output of the deploy.js program) and paste it inside the context braces

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

◈ Save and choose <name of the action button> to run the HDBDD file.
The wrapper view is generated in the Views folder in the target schema XXXX_WRAPPER_TEST.

◈ Open the wrapper view and choose Open Content to see the data.

SAP HANA CDS, SAP HANA Guides, SAP HANA Certifications, SAP HANA System

We have successfully deployed a CDS View in the SCP System and also have seen its content. Hence, by creating a HANA CDS Wrapper View, we can preserve the consistency of the replicated CDS Views in the SCP System.

HANA Partitioning – 2 billion rows limitation – Part I: BWoH / BW/4HANA

Some of you may already know the limitation of 2 billion entries in a single table / partition.
Another hint regarding this limitation is alert ID 17, ‘<table> contains xxx records’. The default threshold of this alert is 300,000,000 rows. The limitation applies to both BSoH / S/4HANA and BWoH / BW/4HANA; it is a general HANA limitation. But each has its own distribution rules, so I will split this blog into two parts: BWoH and BSoH.

First of all, you have to know that you must execute manual steps to create new partitions and distribute the data! I have often heard customers assume this is an automatic task.
There is one exception, dynamic range partitioning, but even that must be set up manually.

Test Environment:

VMware 6.0
11xCPU: Intel(R) Xeon(R) CPU E7-8880 v3 @ 2.30GHz
22vCPUs
SLES for SAP 12 SP1
HANA 1.0 SPS12 Rev. 122.11

Type   SAP Note
BWoH   1908075 – BW on SAP HANA: Table placement and landscape redistribution
BWoH   2334091 – BW/4HANA: Table Placement and Landscape Redistribution
BWoH   2019973 – Handling Very Large Data Volumes in SAP BW on SAP HANA
BSoH   1899817 – Business Suite on SAP HANA database: table placement

If you face such an issue in a BW system, first check whether you can solve it in the application layer, i.e. whether you can use the partitioning options in the backend system described in SAP note 2019973.
If that is no longer possible, then you have to perform the following manual steps.

1. Check table distribution rules / table placement


Follow the steps in SAP note 1908075 for BWoH / 2334091 for BW/4HANA.

Download the attachment and choose the script that matches your HANA revision and topology.
In this example I have chosen the most widespread variant: HANA 1.0 SPS12 scale-up.
Normally this step is already performed at the time of installation or migration, but some thresholds may have changed over time. So first check the current parameters and thresholds:

ALTER SYSTEM ALTER CONFIGURATION ('global.ini','system') 
SET ('table_placement','same_num_partitions') = 'true' WITH RECONFIGURE;
ALTER SYSTEM ALTER CONFIGURATION ('global.ini','system') 
SET ('table_placement','max_rows_per_partition') = '1500000000' WITH RECONFIGURE;

SELECT 
SCHEMA_NAME,GROUP_TYPE,MIN_ROWS_FOR_PARTITIONING, 
INITIAL_PARTITIONS,REPARTITIONING_THRESHOLDS, 
LOCATION,DYNAMIC_RANGE_PARTITIONING 
FROM
"SYS"."TABLE_PLACEMENT";

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning, SAP HANA BW/4HANA

If all the values match the values from the downloaded SQL script, there is nothing for you to do.
Otherwise, replace $$PLACEHOLDER with the SAP schema and execute the script with a user that has sufficient privileges.

The current rules are also explained in the PDF attached in the same folder as your SQL script:
InfoCube fact tables are ROUNDROBIN partitioned on first level (sap.bw.cube).
DSO tables are HASH partitioned on first level (sap.bw.dso). They have 1 partition on first level regardless of the number of records in the tables – except for tables with more than 1.5 billion records, see remarks below. Tables for InfoCubes and DSOs are located on the master node.
InfoObjects tables (sap.bw.iobj) are not partitioned, except for InfoObjects with high cardinality. Those tables are HASH partitioned on first level. All other InfoObject tables are not partitioned i.e. they do not have a partitioning specification. InfoObject tables are located on the master node.
PSA tables (sap.bw.psa) and errorstack tables (sap.bw.dtp) are HASH partitioned on first level. They have 1 partition on first level regardless of the number of records in the tables – except for tables with more than 1.5 billion records, see remarks below. PSA and errorstack tables are located on the master node.

SAP HANA DataSources (sap.bw.dsrc) can have an insert and an upsert table. The insert table is dynamic range and the upsert table is HASH partitioned on first level.

Temporary BW tables (sap.bw.temp, sap.bw.trex) and OpenHub tables (sap.bw.openhub) are not partitioned i.e. they do not have a partitioning specification. They are located on the master node.

The number of partitions on first level according to the rules above is only set when the table is initially created; the number of partitions on first level is not adapted dynamically.

The number of first level partitions of a table does not exceed the number of nodes that are potential valid locations for that table. This rule is disregarded if a higher number of first level partitions is required to avoid first level partitions with more than 1.5 billion records (global.ini, section [table_placement], max_rows_per_partition = 1500000000).

=> This means a split will not be performed before the table reaches 1.5 billion rows.
=> In a scale-up system, all tables are created without partitioning, unless the table was already above 1.5 billion records at the time of the migration; SUM takes care of that.
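The arithmetic behind this threshold is simple ceiling division; here is a quick sketch in shell arithmetic, using the default threshold and a 1.8-billion-row table like the one split later in this post:

```shell
ROWS=1800000000            # e.g. the 1.8-billion-row table split later in this post
MAX_PER_PART=1500000000    # global.ini [table_placement] max_rows_per_partition
# Ceiling division: the smallest partition count that keeps every
# first-level partition below the configured cap
PARTS=$(( (ROWS + MAX_PER_PART - 1) / MAX_PER_PART ))
echo "partitions needed: $PARTS"   # prints: partitions needed: 2
```

With the default threshold this yields 2; the 8 partitions shown later in this post result from deliberately lowering max_rows_per_partition for the demo.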

2. Check table grouping


If you have an old release or your migration was not completed correctly, some tables may have a missing table group type.

select * from "SYS"."TABLE_GROUPS" where TABLE_NAME like '%<TABLE_NAME>%';
If the group type is missing, you have to implement the latest notes for the report RSDU_TABLE_CONSISTENCY and execute it to classify the tables.

3. Repartitioning plan


In the SQL select statement above you checked the table group type. In the output you will also find the group name.
We have to use this in the reorg plan; the steps can be found at the end of SAP note 2019973.
Note: In the case of a scale-out system you can skip this step and execute a table redistribution instead.

Make sure you execute these steps in a single SQL session as the schema owner or a user with sufficient privileges on the tables and content!

1. Generate a reorg plan


call reorg_generate(6,'GROUP_NAME=><your-BW-group-name>');

2. Check the generated plan


select * from reorg_plan;

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning, SAP HANA BW/4HANA

If there is no output, check whether your table is above the defined threshold (max_rows_per_partition). I have adjusted the value so that my plan results in 8 partitions (currently: 1).

3. Execute the plan


call reorg_execute(?);

Check the free resources of the system (data and log disk space, CPU, memory) and the configured parallelism.

Adjust the parameters or execute it only in a time frame without high business load.
indexserver.ini => partitioning => split_threads => 16 (default)
indexserver.ini => partitioning => bulk_load_threads => 4 (default)

4. Check the status of the plan execution


select * from reorg_steps 
where reorg_id = <value returned by reorg_execute before>;
select * from reorg_steps where reorg_id = 1;

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning, SAP HANA BW/4HANA

Normally reorg_id should be 1. Now you can monitor the progress via the statement above and the running sessions.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Learning, SAP HANA BW/4HANA

You will see the splitting per column, as configured via split_threads. The number of ‘ZipResultJob’ threads correlates with the value of the parameter bulk_load_threads.

In my example, the splitting operation (steps 1-8) took 50 minutes for 1.8 billion rows (23 GB of data).

Note: Be aware of this limitation and of the tasks needed to split the table. If you don’t do this, no more inserts are possible once you reach the limit, and you will get a lot of errors in the backend.

SAP Fiori App on HANA XS/SAP Cloud Platform/Mobile Devices, Consuming SAP Business One HANA Service Layer


Objective


Utilize SAP tools to ease development efforts in packaging and deploying SAP UI5 applications on mobile devices, SAP Cloud Platform, and the SAP HANA platform.

Show a very simple, loosely coupled, end-to-end hybrid app solution utilizing the SAP Business One Service Layer.

Duration

90 minutes (not including setting up prerequisites)

Difficulty Level

Advanced

Challenge

Setting up various plugins & IDEs & Connecting the dots.

What you will achieve?


Hybrid Application deployed on mobile devices (iOS / Android), packaged by SAP Hybrid Application Toolkit (SAP HAT)

Consuming SAP Business One Service Layer (RESTful Web API), ONLY available on SAP Business One version for HANA

Developing using SAP HANA XS Engine (XS Application)

Overview:

Here’s a quick overview of methods, tools & technologies involved.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

Prerequisites


The following prerequisites are required.

1. SAP Cloud Platform Trial Account

Required for utilizing SAP Web IDE to download SAP UI5 templates and deploy the application.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

2. SAP HANA Studio or SAP HANA Web-based Development Workbench

Required for XS Applications Development

3. SAP B1 Hana Service Layer

Required for performing backend operations and pushing data to front end.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

Make sure you’ve installed Service Layer as part of the SAP Business One HANA Server Components installation. Once you’ve installed, you may check the following URLs & configuration files to verify that your Service Layer is setup successfully.

Navigate to your SAP Business One HANA Service Layer API endpoint (either HTTP/HTTPS):

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

https://<hana_ip>:50000/b1s/v1
http://<hana_ip>:50001/b1s/v1

4. SAP B1 Hana Database

For this demo, we will be using the demo company database SBODEMOUS.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

5. SAP HAT

Required for packaging UI Applications for mobile device (Android/IOS)

After setting it up successfully, you can run the Hybrid App Toolkit on your local machine before “hooking” it to SAP Cloud Platform’s Web IDE.

This is an important step required later.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

Use-Case


For this demonstration, we will use a specific use case: viewing the Purchase Requests created by a user.

Start Building Your Application


In the back-end section, you’ll learn how to implement control flow logic using SAP HANA XS and the Service Layer. It will be heavy (more manual, hands-on work), but it will benefit you a lot once you understand the mechanics.

In the front-end section, you’ll learn how to exploit the SAP Web IDE UI5 app template, pointing it to your OData endpoint and deploying it to your device directly, or exporting it to a local server for further development.

Option 1: SAP HANA Extended Application Services (XS)


Step 1: Create a new project from a template by selecting the UI5 application template.

Step 2: Right-click on the project and click Export to export the application.
Step 2: Right click on project click on Export to export the application.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

Step 3: Open HANA Studio, go to Window -> Perspective -> Open Perspective, and choose SAP HANA Development.

Step 4: To create a new project, click File -> New and select XS Project.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

Step 5: Extract the project exported from the Web IDE and copy all application folders and files into the XS application in SAP HANA Studio.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

Step 6: Add the .xsaccess and .xssqlcc files to the project.

1. public.xssqlcc:

This is a SQL connection configuration file that specifies the details of a database connection, enabling the execution of SQL statements from inside a server-side (XS) JavaScript application with credentials different from those of the requesting user.

In short, you want to expose the application without any login prompt > anonymous_connection.

New > Other > SAP HANA > Application Development > SQL Configuration File > public.xssqlcc

{
    "description" : "Public Open Connection"
}

After adding the script Activate the Project to apply the changes.

In your Chrome Browser > Go To > http://<hana_ip>:8000/sap/hana/xs/admin/

Navigate to public.xssqlcc > Edit the details as accordingly (Username & Password)

2. xsaccess:

SAP HANA App Access File – basically, you define the authorization and authentication configuration in this file. It is the entry point of every XS app. Without this file plus .xsapp, your app will not fire up in the browser / device.

Make sure the value of anonymous_connection points to the correct location of your .xssqlcc file. Of course, the .xssqlcc file has to be activated on the server first; otherwise, activating your .xsaccess file will give you an error.

By default, you should already have this.

New > Other > SAP HANA > Application Development > XS Application Access File > .xsaccess

{
    "anonymous_connection" : "PR_Demo1.1::public",
    "exposed" : true,
    "authentication" : null
}

After adding the script Activate the Project to apply the changes.

Step 7: To connect the application to the SAP B1 Service Layer, we need to establish a session.

For this, we refer to the Service Layer API reference. Open https://XXXXXXXX:50000; on this page there is an API Reference link. Click on it and search for Login. Expand Login and click on POST, which shows you the request format with its payload.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning
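For reference, the Login request body has this shape; the credentials here are placeholders, and the company database is the SBODEMOUS demo database from the prerequisites. The curl call is left as a comment since it needs a reachable Service Layer:

```shell
# Hypothetical credentials; CompanyDB/UserName/Password are the Login fields
# shown in the API reference.
PAYLOAD='{"CompanyDB":"SBODEMOUS","UserName":"manager","Password":"secret"}'
echo "$PAYLOAD"
# curl -sk -X POST "https://<hana_ip>:50000/b1s/v1/Login" \
#      -H "Content-Type: application/json" -d "$PAYLOAD" -c /tmp/b1session.txt
# The returned B1SESSION cookie authenticates subsequent Service Layer calls.
```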

Step 8: To consume any entity in SAPUI5, we create an AJAX call to the Login entity with a POST operation. This establishes a session with SAP B1 using the credentials and database. (I have only shared the CRUD operation code as a sample, not the complete application code.)

$.ajax({
    url: "https://XXXXXXXX:50000/b1s/v1/Login",
    xhrFields: {
        withCredentials: true
    },
    data: jData, // jData: JSON string with the login payload (see the API reference)
    type: "POST",
    dataType: "json",
    async: false,
    success: function(json) {
        // session established
    },
    error: function(xhr, status, errorThrown) {
        var msg;
        if (xhr.status === 401) {
            msg = "Invalid Username/Password";
        } else {
            var a = xhr.responseText;
            var b = JSON.parse(a);
            msg = b.error.message.value;
        }
        sap.m.MessageToast.show(msg);
    },
    complete: function(xhr, status) {
    }
});

After completing the changes, run the project.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

Step 9: After the login POST, we can perform CRUD operations on all the entities listed in the API reference.

$.ajax({
    url: "https://XXXXXX:50000/b1s/v1/PurchaseRequests",
    xhrFields: {
        withCredentials: true
    },
    type: "GET",
    dataType: "json",
    async: false,
    success: function(json) {
        result = json.value; // the "value" array holds the purchase requests
    },
    error: function(xhr, status, errorThrown) {
        console.log("Error: " + errorThrown);
    },
    complete: function(xhr, status) {
    }
});

The output array contains the purchase requests, which we can bind to UI tables.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

Option 2: SAP Cloud Platform Web IDE


Step 1: Login into SCP with your Account Details

Step 2: Copy the modified HANA Studio project into the SAP Web IDE template project.

Step 3: Right-click and select the Deploy option, then Deploy to SAP Cloud Platform.

Step 4: For a mobile device, select HAT > Prepare Hybrid Project.

SAP HANA Tutorials and Materials, SAP HANA Guides, SAP HANA Certifications, SAP HANA XS/SAP, SAP HANA Learning

Step 5: After the deployment completes, go to the deployed project path shown below and open a command prompt. Then run the command “cordova build android” to package the application.


Step 6: To package the application for iOS, run the same process on a Mac. During packaging you have to follow some additional steps, such as creating a provisioning profile to allow the application to be packaged.

Setup S/4HANA 1610 IDES On-Premise – Part I

We installed our first SAP S/4HANA system – IDES S/4HANA 1610 – on premise. To set up this environment we purchased two servers with the following configuration.

◈ DELL PowerEdge R730
◈ Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.20GHz
◈ SUSE Linux Enterprise Server 12 SP3
◈ 2.5 TB Hard Disk allocated for SAP Application Server and 5.5 TB for SAP HANA DB
◈ 256 GB RAM in SAP Application Host and 512 GB RAM in SAP DB Host
◈ VMware ESXi 5.5 SP3 on SAP Application Host
We installed the SAP ABAP instance on a virtual host and the SAP HANA DB on a physical host.

The OS installation was performed with the SUSE Linux Enterprise Server for SAP Applications option.

Sizing


SAP has provided sizing considerations for IDES S/4HANA (SAP Note 2445594):

– HANA Appliance, RAM 256 GB minimum (Comfortable Sizing)

◈ data                 3 x Memory
◈ log                  1 x Memory
◈ shared/trace         1 x Memory

– HANA TDI, RAM 256 GB minimum (Reduced Sizing)

◈ data                 1-2 x Memory
◈ log                  0.5 x Memory
◈ shared/trace         1 x Memory

However, I sized both systems on the higher side so that we can create multiple SAP instances in the future for our learning.


SAP Media Download


You have to download the following packages:


You can either download the packages locally and move them to the server, or set up the download directly in your Linux environment, which reduces the time spent moving the files across the network.

Prepare Linux for SAP solution with saptune (SAP Note 2205917)


The tuned profile “sap-hana”, which is provided by SUSE as part of SLES for SAP 12, contains many of the settings mentioned below.

1. List available solutions – saptune solution list   


2. Choose a solution: saptune solution apply S4HANA-APPSERVER


3. Start daemon: saptune daemon start


4. Enable the daemon: systemctl enable tuned


5. Apply individual SAP Notes: saptune note list / saptune note apply XXXX


SAP HANA DB 2.0 SP02 Installation


Log in to the DB host and run ./hdblcmgui


1. System Type – Single Host
2. Set the SID and Instance number
3. Define passwords for respective users


HANA DB Installation completed successfully


SAP HANA Cockpit 2.0 is an independent application for managing all SAP HANA databases on release 2.0 and above. I installed it on the same host; installation of the HANA Cockpit takes a minimum of 45 minutes.

Final Disk space occupied by the key directories


Setup S/4HANA 1610 IDES On-Premise – Part II

As we successfully completed the OS setup and HANA DB installation in Part I, we will now go through the installation steps to build the S/4HANA ABAP instance.


You may get the error "Cannot resolve host 'host.qnx.corp' by name" – please update the hosts file with the FQDN beforehand to resolve it.

I may not be able to include all the screens here since many of them are common to any SAP Installation.


We encountered an error while defining the kernel package path:


2465430 – Individual package is still missing at kernel specification step

We tried multiple patches of Kernel 749 and then Kernel 745; however, it still couldn't detect the IGSEXE.SAR file. Hence we downloaded SAP Kernel 7.22, and it accepted the IGS*SAR files.

Since Kernel 7.22 worked, I then copied the 7.22 IGS*SAR to the 749 kernel location and proceeded (although it wasn't needed).


I know a few images are hard to read, but that is how the new SWPM UI has been built, with a very light font and background colors.


Choose EXPORT_11/LABEL.ASC for Export 2 of S/4HANA 1610


Define the SYSTEM user password and set the new ABAP schema


Declustering / Depooling option – enable for all ABAP tables

HANA Table Placement – do not use a parameter file

Under "Additional components to be included in the ASCS Instance", install a gateway integrated in the ASCS instance

The installation got stuck again while starting the instance


An error occurred while processing option SAP S/4HANA 1610 > SAP S/4HANA Server > SAP HANA Database > Installation > Application Server ABAP > Standard System > Standard System (Last error reported by the step: ABAP processes of instance QS4/D00 [ABAP: ACTIVE] did not start after 10:10 minutes. Giving up). You can now:

Log files are written to /tmp/sapinst_instdir/S4HANA1610/CORE/HDB/INSTALL/STD/ABAP.

igswd_mt, IGS Watchdog, GRAY, Stopped in sapinst_dev.log


2535340 – ABAP processes of instance [ABAP: UNKNOWN] did not start after 10:10 minutes. with 28: No space left on device in dispatcher trace file while installing additional dialog

1936475 – System is in an unsound state – igswd_mt IGS Watchdog GRAY Stopped

These notes helped us understand that we needed to update the IGSEXE.SAR, since we had used the 7.22 kernel version during the initial setup.


The data of the IDES model company can be found in client 400. In client 000 you can log on with DDIC and SAP* using the 'Master-Password' you defined at the beginning of the installation, in client 400 with the password xxx / xxx, and in all clients with IDADMIN / xxxx.

Post Installation


After the installation, the S/4HANA DB size was approximately 75 GB, which I believe is fully on the SAP HANA simplified model. However, after the SGEN run, you can notice an increase of 40-50 GB within two weeks.

Soon after the system was functional, we started getting huge dumps due to already-scheduled jobs, hence we suspended the jobs causing the dumps.


2543179 – DBSQL_TABLE_UNKNOWN after clean installation of S/4HANA 

Launching the SAP Fiori Launchpad (FLP) in S/4HANA


https://<server>:<port>/sap/bc/ui5_ui5/ui2/ushell/shells/abap/FioriLaunchpad.html

Learnings


◈ The OS installation and the post-installation tuning of OS parameters for the SAP environment need to be done properly.
◈ The Software Provisioning Manager has a bug that causes kernel components not to be recognized properly most of the time, so be ready for workarounds.
◈ After the S/4HANA installation you will notice that a lot of tables/CDS views are inactive and need explicit attention.
◈ With SAP HANA Cockpit we could mainly manage the tenant DBs, not the system DB individually: when we defined the SystemDB along with its SAP Control credentials, it kept giving us the error "Failed to open SAP Control connection".

Use JDBC to connect to HANA database instance in SAP Cloud Platform

This blog is divided into two parts.

Part 1

1. Introduce how to create a HANA database instance in SAP Cloud Platform (called SCP for short in the remainder of this blog)

2. Develop a Java web application and deploy it to SCP. The application uses JDBC to access the HANA database instance created in the previous step.

Part 2

Develop a Java web application and deploy it to an On-Premise system inside a corporate network. This Java application gains access to the HANA database instance in SCP by leveraging SAP Cloud Connector.

The source code used in this blog can be found in my GitHub repository. You can also get it via the Neo SDK.

Create a HANA database instance in SCP


1. Create a new instance in SCP Cockpit:


Specify the ID, user and password. The database ID will be used later.


Once created, the status turns to STARTED. You can also add the URL of the Development Tools to your favorites, as we will use it later as well.


Now this database instance is ready for use. Next step is to develop a Java application using JDBC to access it.

Access HANA database instance via JDBC


Import the application from my GitHub repository into Eclipse. There are three major Java files:


Person.java

It defines the Person model, which consists of three member attributes (id, firstName and lastName) with the corresponding getter and setter methods. This class is a so-called POJO (Plain Old Java Object), which means no specific constraints or frameworks (for example EJB) are applied to it.

PersonDAO.java

As its name implies, it is a DAO (Data Access Object). It is responsible for connecting to the HANA database instance via JDBC, creating a database table named T_PERSONS, and inserting records into the created table.

PersistenceWithJDBCServlet.java

Implements a simple UI to get user input and calls PersonDAO to insert a record into the connected HANA database table.

The JDBC DataSource instance is obtained via JNDI. This instance is passed into the DAO constructor as an importing parameter; all subsequent JDBC operations are done through it.


The configuration for DefaultDB in line 28 above is done in web.xml:


Deploy this application to SCP with the name jerryjdbc.


Until now, the Java application does not know which database instance in SCP it should connect to.
As a result, we have to bind the application to the HANA instance hana01 created previously. The binding can be established in the UI below:


Once the binding is done, we can insert a Person record "Jerry Wang" into the HANA instance via the servlet UI:


Access the Java application from a mobile phone, and the record "Jerry Wang" inserted just now can be found:


Connecting to the HANA database instance in the cloud from an On-Premise system


Now we try it the other way round. I deploy the Java application to a local server running in the SAP Chengdu office inside the corporate network. It connects to the HANA database instance in SCP with the help of the Cloud Connector.


Cloud Connector Configuration for connection from On-Premise to Cloud


I created a new HANA instance jerrydemo for this scenario. Log on to the Cloud Connector, click the "On-Premise to Cloud" tab, and create a new Service Channel:


Assign the newly created HANA instance jerrydemo to this channel:


Write down the Channel port number 32215.


Change the Java server settings in the On-Premise system to point to the HANA instance in SCP instead.
No code change is needed in the Java application; only the server settings have to change. Make the changes in the file connection.properties:


The idea is to point database connection of this Server to the Service Channel just created in Cloud Connector, so that the HANA instance in SCP behind Cloud Connector could be reached.

◈ javax.persistence.jdbc.url

The URL localhost:32215 points to the Service Channel just created in the Cloud Connector; the HANA instance jerrydemo is assigned to this channel. The URL fragment "currentschema=SYSTEM" means the database table created by JDBC will be stored under the SYSTEM schema.

◈ javax.persistence.jdbc.user / password

The user and password for the database instance jerrydemo.
With that, all configuration is done.
Launch the localhost web server in the On-Premise system and insert two records:


Open the SAP Cloud Platform HANA Development Tool, check the SYSTEM schema, and you can observe the two records inserted by the web application running in the On-Premise system.

A short overview of What’s New for the SAP HANA Performance Management Tools in SAP HANA 2.0 SPS03


SAP HANA capture and replay


Testing database workload is still a large effort for administrators, testers, developers and consultants alike when looking to upgrade from one release to another. Especially in larger-scale scenarios, the amount of work can grow exponentially.

Initially released with SAP HANA 1.0 SPS12, SAP HANA capture and replay offers semi-automated support for integrated testing in the context of SAP HANA. The goal is to simplify the manual effort needed for creating tests and performing more accurate replays.


In SAP HANA 2.0 SPS03, SAP HANA capture and replay has received a number of improvements in several areas.

◈ Down port of several enhancements into SAP HANA 1.0 SPS12

As introduced in my previous blog, many of the enhancements to functionality and stability of SAP HANA capture and replay have been added to the latest revisions of SAP HANA 1.0 SPS12 so customers on the long-term maintenance release can benefit.

◈ Re-design of the overview page and replay report, as well as other usability enhancements

The new design of the overview page and several usability enhancements related to the overall navigation of the replay report help users determine problematic statements faster and more efficiently.

◈ New features for SAP HANA capture and replay

New features include the option to automatically trigger a database backup from the capture configuration screen, thus facilitating the backup creation for end users. Also, SQL execution parameters are now being displayed for users to easily re-execute statements with the used parameters. Users can now import and export replay reports from the UI to share them with others or analyze reports not available in their system. A new feature for resetting passwords for database users during replays has been added to make it simpler when dealing with large amounts of users that need to be authenticated when configuring a replay.


SAP HANA workload analyzer


Analyzing performance issues in SAP HANA can be complex and challenging. Multiple analysis steps might be necessary for the user to find the correct monitoring views or develop custom SQL queries to retrieve the needed information from SAP HANA.

SAP HANA workload analyzer, also released in SAP HANA 1.0 SPS12, offers deeper insights into current system workload by analyzing thread samples and engine instrumentation data. The sampling-based workload analyzer uses thread samples that are taken continuously and offer a real-time view on what is going on in the customer’s system.


The instrumentation-based workload analyzer focuses on analyzing the engine instrumentation of captured workloads. This tool can be used to load existing capture files to gain a better understanding of the workload that was captured.


In SAP HANA 2.0 SPS03, SAP HANA workload analyzer has received a number of improvements in several areas.

◈ Sampling-based workload analyzer

A new overview page with top SQL statements makes it easier to focus on the analysis of specific statements of interest. Global filters and variants have been added to improve usability when filtering and navigating data. The overall layout of the tool has been re-worked to provide a better, user-friendly tab-based navigation. Improvements to the chart visuals help users better understand the data displayed.

◈ Instrumentation-based workload analyzer

A new overview page now displays the most important information about loaded capture files. Similar to the sampling-based workload analyzer, global filters and variants have been added, and chart visuals have been improved.

SAP HANA SQL analyzer


Query-level analysis is generally the lowest level of depth in the process of performance analysis and, as such, also the most complex one. It requires deep knowledge of SAP HANA and plenty of experience with queries and data models. When analyzing the performance of single SQL statements, users can investigate the query execution plan to identify long-running parts of the query.

Since SAP HANA 2.0 SPS00, the SAP HANA SQL analyzer has been available for use. The tool is related to the "PlanViz" perspective in SAP HANA Studio and is used for analyzing query-level performance.


In SAP HANA 2.0 SPS03, SAP HANA SQL analyzer has received a number of improvements in several areas.

◈ Plan Stability UI

A UI to support the newly released Plan Stability feature in SAP HANA has been added. The feature can be used to capture, store and use abstract SQL plans to safeguard query performance for highly optimized queries when upgrading from one SAP HANA SPS to another.

◈ New features for SAP HANA SQL analyzer

Newly added support for SQL Script V3 framework brings value to customers when investigating SQL Script calls. It provides better visualization of control flow logic and procedure execution. A new view for table accesses complements the view of tables used during query execution with additional information on details such as number of entries processed and conditions of individual table accesses. Support for execution parameters ensures that users can re-execute parametrized statements easily.

◈ Overall UI enhancements

A new landing page helps users navigate through saved plans. Several access points to the SAP HANA SQL analyzer have been added in SAP HANA Cockpit, such as the SQL plan cache view and the statement hint view. In addition, the overall layout of the SAP HANA SQL analyzer has been enhanced for more usability, more flexibility and better responsiveness.