
SAP HANA Cloud Platform @ AribaLive 2016


Introduction

 


While not my first time in Las Vegas, this was my first time at AribaLive. From various emails and YouTube videos, I was excited about attending the event, and even more excited about the opportunity to evangelize the SAP HANA Cloud Platform (HCP) story with a new group of people and to learn more about the Ariba Business Network. This opportunity was afforded to me by Jeramy Ivener and Michael Bain, the HCP leads for our go-to-market and center of excellence teams, respectively. I've always held the belief that in order to solve a business problem (actually, any problem for that matter), you must first understand the business. Ariba was a completely new area for me, so being completely immersed in this topic for the next several days was going to be awesome.

 

Event Organization

 


The event ran for 3 days and took over the Cosmopolitan conference center. The first day consisted of various workshops designed to let businesses network with their peers and learn best practices, with separate workshops focused on the buyer, seller, and partner sides of the house. Many, if not all, of these required pre-registration, so I was not able to attend and cannot offer you more details on them. Towards the latter half of the day, into the early evening, a welcome reception introduced all attendees to the "Commerce Pavilion." If you've attended other conferences, this is the equivalent of a showroom floor, consisting of various partner booths and Ariba demos. The next two full days each kicked off with breakfast in the Commerce Pavilion, leading into daily keynotes presented by Ariba executives and invited industry leaders. The remainder of each day was filled with well-organized breakout sessions that made it easy to create an agenda and follow the key topics you wanted to learn about; if you were interested in a particular learning track, e.g. as a buyer or as a seller, scheduling conflicts were minimized. Tuesday was capped off with a huge party at the Marquee day/night club, which allowed attendees to further mingle, build up their networks, and socialize.

 

AribaLive!

 


The content of the conference centered around a few key themes that were mirrored in the keynote presentations and breakout sessions. Aside from what you'd expect from the keynotes, namely customers sharing stories of how the Ariba Network has shaped their business and contributed to their success, this year there was a strong message about digitization and the digital economy. Many of us in technology take it for granted that working with a computer or staying connected 24/7 is just like the air we breathe, and sometimes assume that this cannot be incorporated into other industries such as agriculture or construction. But the trends show that digitization has not only penetrated these industries, it is transforming the way they do business, making them operate more efficiently and faster. The common overtone about digitization was certainly speed: allowing businesses to be agile and adapt to changes very quickly, all facilitated by digital information.

 


Another effect of digital transformation is the need for customization, specifically adapting solutions to the needs of individual industries. It only makes sense that as more and more users adopt the system, their needs diverge from the existing base, and opportunities arise to make the solutions fit their specific requirements. This might take the shape of questionnaires tailored to their needs, or perhaps a simplified interaction interface for a user group that just wants to order something simple. These needs are now supported by new features in Ariba, including customizable forms.

 


With the increased demand for customization and extensibility of solutions comes the need for an ecosystem to create these features and functions tailored to specific business needs. Ariba is therefore adopting a strategy that is quite new for them: openness. Later this year, they will be opening up more APIs, beyond their current cXML-based services, that will allow the partner ecosystem to build new extension applications to fulfill these additional requirements for their customers.

 

SAP HANA Cloud Platform



This made the event a perfect fit for SAP HANA Cloud Platform, as HCP is the platform that brings together the tools and services needed to help those partners and customers build such extensions. We were present in the Commerce Pavilion, and we also made our presence felt in the digital world by interacting with attendees on Twitter throughout the conference. Lastly, Jeramy, Michael, and I had a breakout session towards the end of the conference, by which point people were primed to learn more about how and where they could build these extensions. Although we held the last session of the last day, it ended up being standing room only, with a good mixture of Aribians, customers, and partners in attendance. In this breakout session, we introduced SAP's Platform as a Service, discussed the use cases, and demonstrated sample solutions and scenarios that could be built using Ariba and SAP HANA Cloud Platform. Among these was a deeper dive into one of the demos that Alex Atzberger had presented onstage, showing how IoT and predictive maintenance can help businesses stay up and running. The scenario monitored machinery, in this case a dump truck with sensors tracking the temperature of its engine. Predictive maintenance showed that over the course of a few days the operating temperature had trended upwards, with failure predicted to occur within the next 5 days. An alert is given to the operator, who never has to exit the system and is presented with a cause and a course of action. The action offered is to locate the replacement parts and place the order with a supplier through the Ariba Network; the order is confirmed, a delivery date is given, and the part arrives in time before the machine fails. If you think about all the assets in the SAP portfolio, the scenario could be extended even further: perhaps the expertise to repair the machinery has to be sourced, which we could find through SAP Fieldglass, and the travel arrangements for that worker could be handled with SAP Concur. The opportunities are quite numerous, given that SAP HCP has the native capability to integrate with these SAP solutions (and others).


The Impressions

 

My three takeaways from this event were: I loved the way it was organized; at the ground level, many inside Ariba still do not understand SAP HANA Cloud Platform; and with Ariba opening up APIs, there will be a pull for information on how these APIs can be used to build out scenarios.

 

While the event was only 3 days, it was non-stop and jam-packed with content. Somehow, though, it was very nicely arranged. As mentioned, I liked the fact that the tracks were well organized and that overlap was minimized. Another great thing was that the sessions all started and ended at the same time, so you didn't have to worry about missing part of a session. Breakfast and lunch were both served in the Commerce Pavilion, and with no breakout sessions scheduled for that time, everyone ate together and actually had a chance to network and socialize.

 

I spoke to quite a few Aribians, and it was clear that many had heard of HANA, HANA Database, HANA Platform, HANA Enterprise Cloud, and HANA Cloud Platform; with all these HANAs, as they put it, the story of what each one is was a little confusing. I think there is an opportunity for HCP, the HANA Cloud Platform, to help clarify the story and distinguish our capabilities from the other HANA family of products. A first step is sharing a concept that has been voiced by Steve Lucas and many in leadership: there is HANA the brand, and there are HANA the products, including the database, the platform, and of course us, the HANA Cloud Platform, SAP's cloud Platform as a Service.

 

The timing couldn't be more perfect. As Ariba adopts an open API strategy, customers and partners are already asking where they should build these extensions. Being here at the ground floor, HCP can provide many of the answers to the fresh questions many will have. If customers are already using SAP products, why wouldn't they consider SAP HANA Cloud Platform? There is an opportunity for both Ariba and HCP to help customers and partners build solutions that solve business problems end-to-end. HCP is close to Ariba and will help customers maximize the value of Ariba's new open ecosystem.

 

You can find many more of my impressions in the tweets I posted during the event; you can join me on Twitter at https://twitter.com/thesapmic

 



Principal Propagation between HTML5- or Java-based applications and SAP HANA XS on SAP HANA Cloud Platform


Introduction

Although there is no standardized definition of the term "Principal Propagation", it is commonly understood as the ability of a system to securely forward or propagate the authenticated user (principal) from a sender to a receiver in a way that the forwarded user information is kept confidential and, even more importantly, cannot be changed during transit. Based on a pre-established trust relationship with the sender, the receiver uses this information to log the user on without asking her again for credentials.


Principal propagation plays an important role in many scenarios on SAP HANA Cloud Platform (HCP), e.g. when an application has to pass the logged-on user in the Cloud to an on-premise system via the SAP HANA Cloud Connector. More information on this scenario can be found here. The following picture illustrates another very common scenario for principal propagation, where an application on HCP consists of two components: The user interface (UI) is developed and deployed as an HTML5- or Java-application on HCP which consumes an API implemented as a RESTful service from an SAP HANA instance running on HCP. The API requires an authenticated user and exposes the user's data via SAP HANA Extended Application Services (XS).

figure1.jpg

On HCP, the user usually authenticates against an identity provider (IdP) which is configured for the account where the application is deployed to. In HCP trial accounts for example, this is the SAP ID Service by default, which is a free-of-charge public identity provider from SAP, managing the SAP Community Network users, SAP Service Marketplace users and the users of several other SAP sites. To delegate user authentication to the IdP, HCP uses the SAML 2.0 protocol. Upon successful authentication at the IdP, the HTML5 application on HCP receives a SAML Response from the IdP, which is a message digitally signed by the IdP. It must contain at least the unique logon name of the user, and may also include additional information about the user, such as the user's first and last name, e-mail address etc.


HTML5 applications usually rely on on-premise or on-demand RESTful services. When a RESTful service is called from an HTML5 application, a new connection is initiated by the central HTML5 dispatcher on HCP to the service that is defined in a corresponding HTTP destination. If this call requires the user to authenticate at the service, the HTML5 dispatcher should propagate the authenticated user or login context rather than prompting the user again for credentials to access the service.

figure2.jpg

There are two authentication mechanisms available for an HTTP destination to propagate the logged-in user to a RESTful service running on SAP HANA XS: SAP Assertion SSO or Application-to-Application SSO (AppToAppSSO). The first one uses SAP Assertion Tickets to transfer the logged-on user information, the latter uses a SAML Assertion. Compared to SAP Assertion SSO, AppToAppSSO has two advantages:

  • The propagated user information can contain more than just the user's login name: additional user attributes are also forwarded with the SAML Assertion. SAP Assertion Tickets only forward the user's login name.
  • SAP HANA XS can dynamically create a new DB user based on the forwarded information. A database user is required to successfully log the user on to the SAP HANA instance. With SAP Assertion Tickets, this mechanism, sometimes referred to as "just-in-time (user) provisioning", is not supported, and the users have to be created in advance. This is sometimes not feasible, e.g. if a large number of users access the service.

In this blog you will go step-by-step through a scenario using AppToAppSSO. Common to both mechanisms is that the recipient (XS) must trust the sender (HTML5 dispatcher) to accept the propagated principal. For AppToAppSSO, this trust relationship is set up in XS similarly to other SAML-based IdPs. Therefore, the SAP HANA instance must be properly set up for SAML-based authentication, which is one of the prerequisites listed below.

 

Note: Although an HTML5 application is used to implement the UI, a Java-based application could have been used as well for the scenario. AppToAppSSO works for both application runtimes to propagate the authenticated user to SAP HANA XS.

 

Prerequisites

 

The scenario in this blog uses an SAP HANA Multitenant Database Container (MDC) on the HCP trial landscape. Before getting started, please check that you meet the following prerequisites:

 

  • You have an HCP trial account, which can be created at no charge from here.
  • You have created an MDC in your trial account. Please follow Ekaterina Mitova's instructions in this blog to create one.
  • You have setup the SAML Service Provider in the MDC. Please follow the sections Creating Your Service Provider Certificate and Complete the Service Provider Settings in the blog “Play It Again, SAML” from Oliver Goetz.
  • You have installed Eclipse with the SAP HANA Cloud Platform Tools and SAP HANA Tools following the instructions on the SAP HANA Tools site
  • You have installed OpenSSL, which will be used in the first step to generate the signing key pair and certificate for your HTML5 SAML Service Provider.

 

Step 1: Configuring the Local Service Provider for HTML5 apps

AppToAppSSO uses a SAML Assertion as the security token format to propagate the logged-on user. Therefore, your HCP (trial) account must be set up with a custom SAML Service Provider key pair which is used to digitally sign the SAML Assertion. Based on this signature, XS will verify that the user information has been propagated from a trustworthy system, i.e. your HTML5 application, or more precisely, your account's subscription to the central HTML5 dispatcher. Log in to the Cloud Cockpit on the HCP trial landscape and open the Trust settings of your account. Click on the Edit button and switch the Configuration Type from "Default" to "Custom".

figure3.jpg

If you have never done this before, you will see empty text fields for the Signing Key and Signing Certificate. Those need to be filled in this step as they identify your HTML5 application to the service running on XS. Unfortunately, you cannot use the "Generate Key Pair" button in this scenario, as XS will not be able to import the certificate. This bug will be fixed in HANA in one of the next revisions. For now, you have to create the key pair using a tool like OpenSSL by issuing the following command:

openssl req -x509 -nodes -days 365 -sha256 -subj "/CN=https:\/\/hanatrial.ondemand.com\/<your account name>" -newkey rsa:2048 -keyout spkey.pem -out spcert.pem

Please replace "<your account name>" in the command with your trial account name, e.g. "p<some number>trial". As a result, two files are generated: spkey.pem and spcert.pem. You need to convert the private key file spkey.pem into the unencrypted PKCS#8 format before pasting it into the text field using the following command:

openssl pkcs8 -nocrypt -topk8 -inform PEM -outform PEM -in spkey.pem -out spkey.pk8

Now open the files spkey.pk8 and spcert.pem in a text editor and strip off the tags "-----BEGIN PRIVATE KEY-----", "-----END PRIVATE KEY-----", "-----BEGIN CERTIFICATE-----" and "-----END CERTIFICATE-----". Copy the remaining content from spkey.pk8 in the text field with the label Signing Key, and spcert.pem into Signing Certificate. After clicking on Save you should get a message that you can proceed with the configuring of your trusted identity provider settings, and see a Local Service Provider configuration like shown in the following screenshot:

figure4.jpg

 

Click on the Get Metadata link to export the Local Service Provider configuration in a standardized metadata format, which will be used in the next step to import the trust settings in XS.

 

With the Configuration Type "Custom" you are now able to configure your own trusted identity providers, e.g. a corporate IdP. For the scenario in this blog you will continue to use SAP ID Service as your IdP to authenticate the users. Therefore, switch back by clicking on the Edit button and reverting the Configuration Type from "Custom" to "Default". Click on Save.

 

Note: By switching back to "Default", your "Custom" settings are not lost, and will be used for signing the SAML Assertion sent by the HTTP destination using AppToAppSSO principal propagation.

 

Step 2: Setup Trust in XS to the HTML5 Local Service Provider

 

Open the SAML Identity Provider list of your trial MDC with the XS Admin tool using your account-specific URL https://<mdcname><account name>.hanatrial.ondemand.com/sap/hana/xs/admin, and log in with the SYSTEM user. If the SYSTEM user does not yet have the required roles to access the XS Admin tool, add all roles in SAP HANA Studio containing "xs.admin" in the name, as shown in the following screenshot:

figure5.jpg

On the SAML Identity Provider list, click on Add ("+") to create a new trust relationship to your HCP account's Local Service Provider which has been configured in the previous step. In the Metadata field, copy and paste the content of the SAML Metadata file you exported from the Cloud Cockpit using the Get Metadata link.

figure6.jpg

 

When you click on Save, the fields in the form will be updated based on the values from the metadata file. The only fields left blank are "SingleSignOn URL (RedirectBinding)" and "SingleSignOn URL (PostBinding)", because you've actually imported a metadata file of a service provider, and not of an identity provider. Therefore add some dummy values, e.g. "/saml2/sso". Also make sure that the checkbox "Dynamic User Creation" is activated. This ensures that for new users a corresponding HANA user is created. Click on Save again to store your settings.


With HANA SPS10, certificates are managed differently, and you need to issue a few SQL commands to store the Local Service Provider's certificate in HANA. First, verify that the destination for the new IdP was stored in HANA by checking in SAP HANA Studio the _SYS_XS.HTTP_DESTINATIONS table using the command

SELECT * FROM _SYS_XS.HTTP_DESTINATIONS

 

You should see the destination in the result list:

figure7.jpg

Next, store the certificate from file spcert.pem, including the "-----BEGIN..." and "-----END..." tags, with the following SQL command:

CREATE CERTIFICATE FROM '-----BEGIN CERTIFICATE-----

MIIDiDCCAnCgAwIBAgIJAM3+kppT633nMA0GCSqGSIb3DQEBCwUAMDYxNDAyBgNV

BAMTK2h0dHBzOi8vaGFuYXRyaWFsLm9uZGVtYW5kLmNvbS9kMDQ0NzI0dHJpYWww

...

hj65U8pdxfugQlhrnewfMrAYf6uqpe0Fbbz6e3Ig2o6lHdLRyLaZiffoVEc=

-----END CERTIFICATE-----';

 

Verify that the certificate has been successfully stored using the following SQL command:

SELECT * FROM SYS.CERTIFICATES

The new certificate is shown at the end of the list:

figure8.jpg

Finally you need to add the new IdP certificate to the PSE container for SAML which has been created during the SAML configuration of your XS following the blog “Play It Again, SAML” (see steps Create a PSE Container and Assign the PSE Store For SAML Use). To add the certificate, issue the SQL command

ALTER PSE <your PSE name> ADD CERTIFICATE <your certificate id>


Note: You have to replace the PSE name and certificate ID with your values, e.g. "SAMLTRUST" and 155581 (taking the generated certificate ID from the screenshot above as an example).


This concludes the trust setup in HANA XS to your HTML5 application as a trustworthy system to propagate the authenticated user. Next you will configure the destination of your HTML5 application.

 

Step 3: Configure HTTP Destination for AppToAppSSO

The sample HTML5 application used in this blog is a project management application, which retrieves a user's project data from a REST service running on XS. The complete code of the HTML5 application can be downloaded from here and can be imported into SAP Web IDE on HCP.

 

As a user, you log in to the application via the IdP and then see a list of the projects you are assigned to. Therefore, the logged-on user must be propagated securely to XS, which uses the propagated user ID to query the database for the projects where the user is assigned as the project lead. In addition, the user's attributes, such as first and last name, are used to set the user's name in the list of projects returned from XS to the HTML5 application.

 

The actual invocation of the service in XS is done in Project.controller.js of the HTML5 application:

figure9.jpg

In the JSON model, the data is loaded from the URL /api/projects, which is mapped in the HTML5 application's neo-app.json descriptor file to the HTTP destination with name "xsprojectdata":

figure10.jpg
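Since the screenshots cannot be reproduced here, the following is a minimal sketch of what such a controller call and route definition typically look like. The controller namespace, model name, and exact route attributes are my own illustrative assumptions, not necessarily the downloadable sample's code.

// Project.controller.js (sketch): load the project list via the relative path /api/projects.
// At runtime, the HTML5 dispatcher rewrites this path to the "xsprojectdata" destination.
sap.ui.define([
  "sap/ui/core/mvc/Controller",
  "sap/ui/model/json/JSONModel"
], function (Controller, JSONModel) {
  "use strict";
  return Controller.extend("xproject.controller.Project", {
    onInit: function () {
      var oModel = new JSONModel();
      oModel.loadData("/api/projects"); // relative URL, resolved via neo-app.json
      this.getView().setModel(oModel, "projects");
    }
  });
});

// Corresponding route in neo-app.json (illustrative):
// "routes": [{
//   "path": "/api",
//   "target": { "type": "destination", "name": "xsprojectdata" },
//   "description": "XS project data service"
// }]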

Let's have a look at the destination configuration in the Cloud Cockpit. The two most important settings are highlighted in the following screenshot:

figure11.jpg

  • The Authentication method is set to AppToAppSSO
  • An additional property with the name "saml2_audience" and the value "I1700" is set for the destination

 

The property sets an important value in the SAML Assertion which is used to propagate the user. This value, the SAML audience,

"contain[s] the unique identifier URI from a SAML name identifier that describes a system entity" and "evaluates to Valid if and only if the SAML relying party is a member of one or more of the audiences specified." (Assertions and Protocols for the OASIS Security Assertion Markup Language (SAML) V2.0, page 23)

In other words: XS would reject the SAML Assertion with the propagated user if the audience is not set correctly to its own SAML name identifier. By default, an HTTP destination configured for AppToAppSSO sets the audience to the name of the SAML local service provider (aka "relying party") configured in the Cloud Cockpit. For a trial account, this would be "https://hanatrial.ondemand.com/<your account name>" if you haven't changed it. However, your MDC container is configured to a different SAML service provider name. Mine got the name identifier "I1700" which can be looked up in the XS Admin Tool under "SAML Service Provider":

 

figure12.jpg

 

Last but not least, configure the URL of the destination according to your service location. You can download the XS code from here. I've deployed the service in a package sample.xproject in the HANA repository, so the resulting URL is https://<MDC_name><account_name>.hanatrial.ondemand.com/sample/xproject/xproject.xsjs.
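Putting these settings together, the complete destination in my setup looks roughly like the following listing. This is a sketch of the configuration rather than an export, and the URL and audience value will of course differ in your account:

Name: xsprojectdata
Type: HTTP
URL: https://<MDC_name><account_name>.hanatrial.ondemand.com/sample/xproject/xproject.xsjs
Proxy Type: Internet
Authentication: AppToAppSSO
Additional Properties: saml2_audience = I1700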

 

Note: Please change the file data/projects.csv before you activate it, and replace the two placeholders <your user id> in the file with your SAP ID Service user ID. This file imports some sample data into the PROJECT table which is used later for testing the scenario.

 

Step 4: Configure the default role of dynamically created users in the XS Service

The xproject.xsjs file implements the XS service that retrieves the propagated user's projects from the database. The function getProject() retrieves the user's unique logon name and queries the database for the projects where the user is set as the project lead. The result is returned in JSON format. The PROJECT table can only be accessed by users with the role "projectmember", which is defined in the file projectmember.hdbrole. Therefore, new HANA DB users created dynamically according to the new IdP's settings should automatically be assigned this role. To set this default role, you first need to create a run-time role by opening the Security folder of your system in the "Systems" view in SAP HANA Studio. There, right-click the Roles element and select New Role from the context menu. For the Role Name, enter a value such as "DEFAULT_ROLE_FOR_PROJECT_MEMBERS", and click on the "+" in the Granted Roles tab to add your design-time role "sample.xproject::projectmember" to it. Press Ctrl+S to save your new run-time role.

figure16.jpg

Next, double-click on your system in SAP HANA Studio to open the Administration. Select the Configuration tab and filter for "saml". Right-click on the saml section in the search results and select Add Parameter from the context menu. The Add Parameter Wizard opens. Leave the default selection ("Database") for the scope and click Next. For the key name, enter "defaultrole", and for the value the name of the newly created run-time role ("DEFAULT_ROLE_FOR_PROJECT_MEMBERS"). Click Finish to save the new parameter.

 

figure15.jpg


Step 5: Configure SAML for the XS Service

Before you can test the scenario, the XS service must also be protected with SAML. In the XS Admin Tool, select "XS Artifact Administration" from the menu. Go to the package "sample.xproject" and click on Edit. In the Security & Authentication tab, activate SAML and select the newly created IdP in the dropdown box, starting with "HTTPS__HANATRIAL_...". Deactivate any other authentication methods and click on Save.

 

figure13.jpg

 

 

Step 6: Testing the Scenario

Now it is time to test the scenario: Go back to Cloud Cockpit and open the Overview page of your xproject HTML5 application. Right-click on the Application URL and open the application in a new private/incognito browser window to obtain a new session.

figure17.jpg

You will see the landing page of the xproject application. Click on Login.

figure18.jpg

Based on your trial account's trust settings, you will be redirected to SAP ID Service as the default IdP. Upon successful logon with your SAP ID Service credentials, your browser is redirected back to the application. The project overview page retrieves its data from the XS service, which uses the AppToAppSSO destination to propagate your user. Based on the configuration settings from the previous steps, only the projects for the currently logged-in user are retrieved by getting the username from the XS session object with

var username = $.session.getUsername();

 

in line 20 of the xproject.xsjs file, and appending it to the SQL statement which queries the application's PROJECT table. In addition, the federated user attributes for the first and last name of the logged-in user are used to return the user's display name. They are accessible in XS under the same names as in HTML5 or Java; for SAP ID Service, these are firstname and lastname, read with the following API:

var displayName = $.session.samlUserInfo.firstname+" "+ $.session.samlUserInfo.lastname;
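To see how these pieces fit together on the server side, here is a minimal sketch of such an XSJS handler. The schema, table, and column names (e.g. "XPROJECT"."PROJECT", PROJECT_LEAD) and the response layout are my own assumptions for illustration, not necessarily what the downloadable sample uses:

// xproject.xsjs (sketch): return the propagated user's projects as JSON
function getProject() {
    var username = $.session.getUsername();                      // propagated logon name
    var displayName = $.session.samlUserInfo.firstname + " " +   // federated SAML attributes
                      $.session.samlUserInfo.lastname;

    var conn = $.db.getConnection();
    try {
        // parameterized query: only projects led by the logged-in user
        var pstmt = conn.prepareStatement(
            'SELECT PROJECT_ID, NAME FROM "XPROJECT"."PROJECT" WHERE PROJECT_LEAD = ?');
        pstmt.setString(1, username);
        var rs = pstmt.executeQuery();

        var projects = [];
        while (rs.next()) {
            projects.push({ id: rs.getString(1), name: rs.getString(2), lead: displayName });
        }

        $.response.contentType = "application/json";
        $.response.setBody(JSON.stringify({ projects: projects }));
        $.response.status = $.net.http.OK;
    } finally {
        conn.close();
    }
}

getProject();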

Depending on your table data and user name, the list may look like this in the web browser, only showing two out of three projects in total:

figure19.jpg

This step concludes the scenario and I hope this is of help if you are implementing a similar scenario on SAP HANA Cloud Platform.

New SAP HANA Cloud Platform Essentials course on openSAP - also available in Japanese!


Since its inception in 2012, SAP HANA Cloud Platform has grown in a variety of ways. SAP HANA Cloud Platform allows you to focus on developing your software and use an enterprise-ready, preinstalled environment in the SAP Cloud to deploy and run your applications in a secure and reliable way.

openSAP learners have enjoyed learning how to develop on SAP HANA Cloud Platform, from an introduction through to next steps and extending SAP products to SAP HANA Cloud Platform. In addition to openSAP courses, learners have also enjoyed updates via the SAP HANA Cloud Platform podcast series. Today, we are happy to announce a new openSAP course, SAP HANA Cloud Platform Essentials. This course begins May 10 and is for new users of SAP HANA Cloud Platform, as well as experienced users interested in learning about the new and improved functionality. For the first time, learners can also register for the Japanese version of this course and learn in Japanese. The platform interface will also be available in Japanese.

 

As a developer, you will learn about the essential elements of SAP HANA Cloud Platform and how you can make the most of the variety of services it provides. Even if you’ve joined the previous SAP HANA Cloud Platform courses on openSAP, you will learn about the newly introduced services such as Internet of Things (IoT), gamification, mobile documents, and much more. Rui Nogueira and his team are ready to take you through SAP HANA Cloud Platform.

 

To take part, you should have basic Java programming skills as well as a basic knowledge of how to use the Eclipse development environment. You will have access to a trial SAP HANA Cloud Platform account to perform hands-on exercises. As always, all you need to sign up is a valid email address. The course starts on May 10 and runs over a six-week period.

 

Sign up today!

SAP HANA Cloud Platform Essentials (English)

SAP HANA Cloud Platform Essentials (Japanese)

 

 

As with all openSAP courses, registration, enrollment, and learning content are provided free of charge.

Other upcoming and current courses include:

Build Your Own SAP Fiori App in the Cloud – 2016 Edition

Software Development on SAP HANA (Delta SPS 11)

Implementation of SAP S/4HANA

Implementation Made Simple for SAP SuccessFactors Solutions

Digital Transformation Across the Extended Supply Chain

SAP Business Warehouse powered by SAP HANA (Update Q2/2016)

Sustainability Through Digital Transformation

High Availability and Disaster Recovery with SAP HANA Platform

Mobile app development and registration using WebIDE and HCPMS


In this article, we will see how to develop a hybrid mobile application in SAP Web IDE using the Hybrid App Toolkit (HAT) and then register it with SAP HANA Cloud Platform mobile services (HCPMS).

 

Please note that there are certain prerequisites that need to be fulfilled before we can create a hybrid mobile app. These are explained very nicely here:

https://open.sap.com/courses/mobile2/items/09VyUvkiTYy78ePyRLqDBp

 

 

Step 1 - Creation of Hybrid Mobile App on WebIDE



Ceate.jpg

 

Step 2 - Device Settings

 

Right-click on the project name and select Project Settings. Then select Device Settings to configure the device properties.

Provide App Name, App ID, Description, and Version. Make a note of the App ID, as this will be used later in HCPMS.

Under Build Options, select Debug Mode for our purpose.

Under Platforms, select your desired platform(s) on which you intend to deploy the app.

Under Plugins, go to Kapsel Plugins and select Logon Manager. If this is not selected, the device will not register on HCPMS, as this plugin contains the required code for registration.

Now save and close.


Step 3 - App Configuration on HCPMS


Open the HCPMS service from your HANA Cloud Platform cockpit. Go to Applications and click on Create. You will see a screen similar to what is shown below:


HCPMS1.jpg

Fill in the values as above. The Application ID should be the same as the one that was entered in step 2. Security Configuration can be set to None. After saving, the application can be seen in the list. Click the settings icon against the application and click on the Back End tab. Set the configuration as shown below. Please note that I have used Northwind for OData and hence mentioned it here against Back-End URL. You can enter your applicable back-end URL here.



HCPMS2.jpg


 

Step 4 - Deploying the app to Device/Emulator

 

In the Web IDE, right-click on the name of the application and click Run -> Run On -> Device/Emulator.

 

HCPMS3.jpg

 

The hybrid app will now be built and deployed to your device/emulator. This generally takes around 5-7 minutes for me and could take about the same time for you as well.

 

Step 5 - Opening the app on the device

 

You will see the following once the app deploys on your device/emulator

 

Dev1.jpg

 

Once you have filled in the user name and password, click on Register to see the following screen.

Screenshot_20160323-211527.png

I decided to disable the passcode and then clicked on Submit to be taken to my app.

Dev3.jpg


At this point, the Kapsel Logon plugin will have run and the device will have registered itself with HCPMS. We can see the device entry by clicking on Registration and Users in HCPMS. You might have to adjust the date filters to see all devices.


HCPMS4.jpg
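For reference, the registration performed by the Logon Manager plugin boils down to a small JavaScript call that the Web IDE/HAT build wires up for you. A rough sketch is shown below; the app ID, server host, and context values are assumptions for a trial landscape and will differ in your setup, and the exact signature may vary with the Kapsel SDK version.

// Sketch of the Kapsel Logon registration call (all values are assumptions)
var appId = "com.test.hcpms.demo";            // must match the Application ID configured in HCPMS
var context = {
    "serverHost": "hcpms-<your account>trial.hanatrial.ondemand.com",  // HCPMS host (assumption)
    "serverPort": "443",
    "https": "true"
};

function logonSuccess(result) {
    console.log("Device registered with HCPMS");
    // from here on, requests made through the app can use the stored registration
}

function logonError(error) {
    console.log("Registration failed: " + JSON.stringify(error));
}

// sap.Logon is contributed by the Kapsel Logon Manager plugin selected in step 2
sap.Logon.init(logonSuccess, logonError, appId, context);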

InnoJam++ at CeBIT 2016 by SAP and Volkswagen a Success!


Hi everyone,

 

Last week during CeBIT 2016 SAP hosted in collaboration with Volkswagen the programming competition InnoJam++ focusing on the Internet of Things and the mobility of the future powered by SAP HANA Cloud Platform.

 


 

As Stephan Brand, one of our judges, explained in his announcement post, nearly 100 students from 16 countries would be competing, with the main part being a 30-hour nonstop coding session from Tuesday morning through the night to Wednesday evening. On Monday, Design Thinking was used in the ideation phase, and on Thursday the presentations were prepared and delivered on the corporate stage.

 

The students developed many interesting applications, e.g. for drivers, fleet managers, and driving schools, over the course of the week and showed again how easy it is to get started with SAP HANA Cloud Platform, SAP HANA, the Internet of Things services, and SAPUI5. Some of the teams used the Internet of Things services directly, while others were provided with previously collected data by the SAP Vehicle Insights team or the Volkswagen team to feed and test their applications with real data.

 

The winning team was a group of students from the Hochschule Karlsruhe, who developed an app for driving schools based on the SAP HANA Cloud Platform. Volkswagen awarded team ‘Pendler’ with a special prize for their SAP HANA Cloud Platform based smart commuting app. We congratulate again all the teams for their successful participation in the competition!

 


 

For those interested, tweets covering the event can be found in my Twitter feed between 2016-03-14 and 2016-03-17, and more photos covering the event can be found in my Facebook photo album.

 

All the best,

Sven

Creating Cloud Extensions – Part 3b: Setting up the connection, when the automatic binding didn’t work


Setting up the connection, when the automatic binding didn’t work

 


 

Large corporate enterprises typically span the globe, with offices in multiple locations. It is therefore not unusual to find that your cloud tenants are also scattered around the world, perhaps with a SuccessFactors license based in the US and your HANA Cloud Platform account based in Europe.

 

In this case, what happens when you want to perform an automatic binding?

Automatic binding will result in you receiving a new tenant, based on a SuccessFactors trial license.

 

Annoying, right? So, what do we do now? How do we start consuming Data from our SuccessFactors installation when the 2 systems are standing in separate corners of the room, with their arms crossed – occasionally sending each other angry stares?

 


 

I have been struggling with this myself, and am here today to share my experience with setting up the connection so data can be consumed from SuccessFactors in your HANA Cloud Platform. It's not simple, but it can certainly be done!

 

For this guide, you will need:

 

  • The URL to your SuccessFactors tenant
  • Provisioning access to your SuccessFactors tenant
  • A HANA Cloud platform tenant of your own (not HANA Trial)
  • An SAP Cloud Identity with access to configure and administrate your HANA Cloud Platform

 

In this guide, we will focus on pulling data from SuccessFactors through a technical user, since we won’t do the integration in the other direction just yet (See a later guide for this). To keep the integration as simple as possible, we shall do it this way for now.

 

Setting up Successfactors to trust your HANA Cloud Platform

 

First we must prepare SuccessFactors to share its data with HANA Cloud platform.

 

This consists of 2 steps:

 

  1. Creating an Admin user, and setting some basic properties.
  2. Preparing the trust relationship with your HANA Cloud Platform.

 

Create Admin user, and setting some basic properties

 


 

Firstly, log on to your SuccessFactors Provisioning section and go to the company you wish to connect to. Note down the Company ID (in the second column on the Provisioning overview page) – you will need it a lot for this configuration guide.

 

Also, note down the URL to the SuccessFactors tenant you are connecting to. This could be (for example) https://performancemanager5.successfactors.eu - or any of the other tenants in the Successfactors cloud.

 

Now, under the company which you want to connect to, go to Company Settings, and create yourself an Administrative User (search for "Create Admin").

While you are here, you can also enable the "SF Web Service" service.

 

Next, log into the SuccessFactors tenant as the Admin user you created, and assign roles to yourself. The more roles, the more data you can access in SuccessFactors.

 

  • Your complete list of OData Services can be found in SuccessFactors admin center, under Company Settings -> OData API Data Dictionary. Keep this list for later, you will need it once you start developing.

 

Prepare trust relationship with HANA Cloud Platform

 

In order to initiate the trust with HANA Cloud Platform, the individual apps under HCP must be trusted. This is accomplished in SuccessFactors Provisioning, under the menu link “Authorized SP Assertion Consumer Service Settings”.

 

Click on "Add another Service Provider ACS" and add the following information:

Assertion Consumer Service: https://webide-<your HCP Tenant ID>.hana.ondemand.com/portal/extensions/sfsf/index.html

Logout URL: https://authn.hana.ondemand.com/saml2/sp/slo/<your HCP Tenant ID>/<your HCP Tenant ID> - (yes, it must be there twice)

Audience URL: https://hana.ondemand.com/<your HCP Tenant ID>

 

This information can also be found by logging on to your HANA Cloud Platform account and selecting "Trust" in the left-hand navigation. Under Local Service Provider, you can download the metadata file, and the paths can be found under the properties ns3:SingleLogoutService and ns3:AssertionConsumerService.

 

  • The URL to your Web IDE can be found by going to HANA Cloud Platform and clicking Subscriptions -> sapwebide (under Subscribed HTML5 Applications); the URL is at the top of the screen, under "Application URL".

 

Now that you are here (SuccessFactors Provisioning / Service Provider ACS), you can also prepare your SuccessFactors tenant for consuming apps from your HANA Cloud Platform by adding another Service Provider ACS with the URL https://cloudnwcportal-<your HCP Tenant ID>.hana.ondemand.com/portal/extensions/sfsf/index.html for the Assertion Consumer Service, as this will allow calls from HANA Cloud Portal, which is used to connect to apps from SuccessFactors.

 

Setting up HANA Cloud Platform to consume data from Successfactors

 


 

Next, we must set up HANA Cloud Platform so it will trust SuccessFactors to send data to it. This consists of 2 steps:

 

  1. Establishing trust
  2. Setting up the destination

 

Establishing trust

 

Firstly, download the SAML 2.0 Metadata file from this URL:

https://<SF URL>/idp/samlmetadata?company=<company ID>

 

From your HANA Cloud Cockpit, Click Trust, and then click “Add Trusted Identity Provider”.

In the screen that pops up, upload the Metadata file you downloaded before, and the result should look somewhat like this:

 

Name: https://<SF URL>/sf/idp/SAML2/company/<company ID>

Description: Successfactors IDP

Assertion Consumer Service: Application Root (default)

Single Sign-on URL: https://<SF URL>/sf/idp/SAML2/SSO/POST/company/<company ID>

Single Sign-on Binding: HTTP-POST

Single Logout URL: https://<SF URL>/sf/idp/SAML2/slo/POST

Single Logout Binding: HTTP-POST

Signature Algorithm: SHA-1

Signing Certificate: <a long incomprehensible string of scrambled encryption key>

User-ID Source: Subject

Source Value: <blank>

User ID Prefix: <blank>

User ID Suffix: <blank>

Enabled: <checked>

Only for IDP-Initiated SSO: <blank>

 

Remember to save the results.

 

Setting up the destination

From your HANA Cloud Cockpit, click Destinations, and then click “New Destination”, then fill out the parameters as follows:

 

Name: sap_hcmcloud_core_odata

Type: HTTP

Description: SuccessFactors Core OData API

URL: https://<SF API URL>/odata/v2/ - You can find the corresponding API URL to your SuccessFactors tenant by going here: http://help.sap.com/saphelpiis_cloud4hr/EN/SuccessFactors_HCM_Suite_OData_API_Reference_en/frameset.htm?03e1fc3791684367a6a76a614a2916de.html

Proxy Type: Internet

Authentication: (For this simple development scenario, we shall choose) BasicAuthentication

User: <the user you set up in SuccessFactors>

Password: <password of the user you set up in SuccessFactors>

Additional Properties:

TrustAll = true

WebIDEEnabled = true

WebIDESystem = sap_hcmcloud_core_odata

WebIDEUsage = odata_abap,dev_abap,ui5_execute_abap

 

Then click save, and your destination will be ready to be consumed from HTML5 and Java Apps.
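As a quick preview of the consumption side (ahead of Part 4), a neo-app.json route pointing at this destination plus a UI5 OData model is already enough to read data from an HTML5 app. The route path below and the use of the User entity set are my own assumptions for this sketch, not something the steps above create for you:

// neo-app.json route (illustrative) forwarding /sfodata to the destination created above:
// {
//   "path": "/sfodata",
//   "target": { "type": "destination", "name": "sap_hcmcloud_core_odata" }
// }

// In the HTML5 app, read SuccessFactors data through that route:
sap.ui.require(["sap/ui/model/odata/v2/ODataModel"], function (ODataModel) {
    "use strict";
    var oModel = new ODataModel("/sfodata");     // proxied to <SF API URL>/odata/v2/
    oModel.read("/User", {
        urlParameters: { "$top": "5" },          // just a few records as a smoke test
        success: function (oData) { console.log(oData.results); },
        error: function (oError) { console.log("Read failed", oError); }
    });
});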

So, there you have it - the trial account you received, while trying to do the automatic configuration, can be safely ignored!

 

In the next part (Part 4) of the series, I will explain how to start consuming data from Successfactors, by creating your first Hello World Application, which will be showing data from Successfactors.

 

You can go back to part 3 of my series here:

Creating Cloud Extensions – Part 3: Experiences with HANA Cloud Platform in one region, Successfactors in a different re…

HANA Cloud Portal meets Cloud for Analytics


In one of the latest blog posts by my colleague Nash Gajic, we found out how to set up a HANA Cloud Platform multitenant database as a remote data source and use it with SAP Cloud for Analytics to visualize and share the insights with other users.

 

I have decided to take this a step further and describe how SAP Cloud for Analytics can be used along with HANA Cloud Portal. The main focus of this post is on how quickly the IT department, or better yet, the business users themselves, can turn critical analytics data into a self-service portal for customers and partners.

 

                             

                                                                                                (Image courtesy of Nash Gajic)

 

In this tutorial, you will learn how to set up HANA Cloud Portal for an external-facing analytics and collaborative B2B experience. I will be using my trial account to activate the HANA Cloud Portal service and prepare a site to launch the C4A visualizations (created in the previous post).

 

 

Activate the Cloud Portal Service in your trial account


Log in to the HANA Cloud Platform cockpit with your trial account and add a subscription


 

Select "trial" as the Provider Account and "flpportal" as the Application.

 

 

Navigate to Services in the menu and activate the service “HANA Cloud Portal”

 

 

 

Configure the Portal Service


Click on the configure SAP HANA Cloud Portal link to launch the configuration section of the Cloud Portal


 

In the roles section ensure your user is assigned to the role “TENANT_ADMIN” as shown below. If not, you will have to manually assign it.

 

 

Create a Portal Site


Launch the Cloud Portal by clicking on the “Go to Service” link


 

From the site directory menu, click on the + icon.

 

 

Select the “SAP Fiori Launchpad“ as a template and click on create button

 

 

 

In the Get Started section, click on “Create App Tile”

 

 

Ensure that the App Type is URL and provide the URL of the Story created in C4A

 

 

Click on “Next” to provide values for the Navigation tab. Assign the tile to a predefined Sample Group.

 

 

Click on the Site Preview at the top right hand corner

 

 

This will open the site in a new tab

 

 

When you click on the tile "Sales Report", it will launch the C4A visualizations. It will prompt for a user/password as we have not set up any authentication between the two applications. After providing the login details, you will see the same C4A visualizations which you previewed earlier.

 

 

Using the Cloud Portal is one way to expose your Visualizations/reports. Based on your requirements, you can hook up these reports within other Cloud solutions or even to your on-premise solutions.

 

This concludes part 2. In the next part, we'll take a closer look at some of the additional things that we can do using SAP HANA Cloud Platform and Cloud for Analytics.

SAP #innotakeoff was a big hit


SAP organized the #innotakeoff competition final showdown event last week at the SAP campus in Palo Alto. The competition was powered by the CIO Center for Digital Leadership. The event was led by Carsten Linz (Head of the CIO Center for Digital Leadership), Phani Bhushan Dhar, and Christian Hastedt-Marckwardt.




The main purpose of the competition was to bring up bright ideas, share competencies, and encourage the innovation spirit within and across the organization. By participating in such competitions, the teams can convert their ideas into working prototypes and then into live solutions within just a few months. There were 3 tracks in the competition: Students, Startups, and SAP Employees.


This year, the main focus was on the Internet of Things and Big Data. In the first week of March, all the participants submitted their prototypes with a video explaining the main use case and usage scenario, and 13 finalists were selected for the final showdown event. Out of these 13 teams, there were 5 student teams, 5 employee teams, and 3 startup teams. Most of the teams used HANA Cloud Platform as the development platform and HCP IoT Services for gathering data from the sensors and devices, and then stored the data in HANA in the cloud. For the UI part, they used SAP Fiori in the cloud.


The final showdown event was organized at the SAP campus in Palo Alto. It was a 48-hour hackathon in which the finalist teams went the last mile to push their prototype and pitch deck forward, with the help of extensive coaching whenever required. Participants flew in from different places around the world, such as Germany, France, and Israel. There were around 20 coaches and mentors who joined the event to provide guidance to the teams. The teams were provided with HCP accounts and all the prerequisite materials and devices to get them onboarded easily and quickly. The event started with a delicious breakfast that acted as an energy booster for participants as well as coaches.



Then Phani welcomed all the teams and gave a brief introduction to the event agenda. After that, Alexander Vonnemann from the CDL team talked about user experience and the best practices for creating an engaging user experience. Then I explained the services offered in HANA Cloud Platform and specifically gave an overview of the HCP IoT services.


I also gave a cool demo showing them how the data gathered from sensors can be stored in a HANA database in the cloud using the IoT services, and how we can use that data to develop a UI5-based web application or integrate it with Amazon Echo to give some useful insights to the end user. The audience was pretty impressed with the demo, as it gave them some ideas as to how the IoT services would help them in developing their prototypes. After this session, Aaron Williams and Alvaro Tejada from the SAP d-shop explained the different sensors and devices available and how to use these devices and sensors while building the prototypes.



After the initial session was done, coaches were assigned to each team. There were coaches from the business side, the UX side, and the technology side for each team. As coaches, we went to each of the assigned teams to understand and help refine the solution, the target audience, the value proposition, the technologies used in the project, and where they stood as far as a working prototype was concerned. We offered them some advice as to where they could enhance their solutions and how they should prepare the final pitch presentations.




My colleague Jin Wong and I were coaches from the technology side for HCP, and we helped many teams whenever they got stuck while developing on HANA Cloud Platform or using the IoT services.


Here is a testimonial from Jin about this competition:


“This is my first time participating in the three-day #innotakeoff event. I was given the role of being one of the coaches to support the “xProto” team.  xProto consists of 5 team members from Chico State University.  The “xProto” idea was about creating prototyping tool for developers, but after hearing the use case on the first day, we convinced them to change the use case because it needs to be related to the “IoT and HCP” topic.

“xProto” was the team’s new idea. The idea was to help shoppers better utilize their time in stores such as supermarkets, retail shops, and etc.   Shoppers will login into their online accounts and build a shopping list.  Once at the store, a sensor will send an alert to their smart phones if they are near the products on that shopping list.

On the second day, we helped the team prepare a 6 minutes’ elevator pitch.   I took the team to COIL and put the students on center stage to practice the presentation.   We modified the presentation and demo to be exactly 6 minutes after 3 tries. 

On the final day, we were excited to see the “xProto” team win 1st place in the student competition after 3 days of hard work! “


After the introductory coaching, some teams began working with devices while others were working on the code to do some enhancements. All coaches were there all day to help the teams whenever needed. Teams worked hard overnight. The following day, the teams started preparing for their final pitches. As coaches, we went on to see the dry run of their presentations and gave some advice as to how to stick with the presentation time limit, how to anticipate questions, and what topics to focus on.




To judge the final presentations, the jury for the event consisted of Dr. Carsten Linz (Global Head of the CIO Center for Digital Leadership), Chris Mark (Executive Director of User Experience & Design), Anamarie Franc (Global Head of UX Strategy & GTM, User Experience & Design), Denis Browne (Senior Vice President of Imagineering), and Jonathan Becher (Chief Digital Officer).


Final pitches started with the Student groups first. There were teams from different universities such as San Francisco State University, Notre Dame University, and Chico State University. You can find more information about student group presentations here.


After the presentations by student groups, there were the presentations from SAP Employee groups who participated. There were 5 participating teams. You can find more information about employee group presentations here.


At the end, we had presentations from the startups. There were some cool ideas presented. 3 startups participated in this competition. You can find more information about their projects here.


Once all presentations were done, the jury selected the top 3 groups from each track. The top 3 student groups were xProto, MintPlug, and Emotiv. In the employees track, 'Path to Wellness', 'IoTa', and 'Economic indicator running on Big Data platform' made it to the top 3. And among the startups, the top 3 contenders were Preventive Farming, Bloom, and Disaster Visual Helpline. At the end of the day, there was a nice party organized at a Palo Alto restaurant, which provided participants a well-earned opportunity to relax after 48 hours of continuous coding. On day 3, the top 3 groups presented their demos for all the visitors. People were excited after seeing these innovative ideas.




And then came the final moment. In the afternoon, the jury announced a winner in each track. Project 'Bloom' won the award in the startup track, which earned them a $10,000 check. In the student track, project 'xProto' grabbed first place with the prize of a $2,500 Amazon voucher. And project 'Path to Wellness' won in the 'Employees' track; each team member received some cool gadgets like a Parrot Bebop 2, a Skycontroller, a Myo Gesture Control, and a Fatshark Dominator HD. Sam Yen, Managing Director of Silicon Valley, presented the awards to all the winners.



Overall, this was a great experience for the participants as well as for the coaches and jury members. A lot of innovative ideas came through and hopefully we will see some live solutions come out of these ideas in the near future. If you want to check out some of the ideas of last year's winning teams, you can find them here.


If you have not yet gotten hands-on with HCP, welcome to the cloud! You can start building an application with HANA Cloud Platform by signing up for a free HCP developer trial account. You can learn more about HANA Cloud Platform at the following resources:


SAP HANA Cloud Platform Tutorials

SAP HANA Cloud Platform IoT Services Tutorials

SAP HANA Cloud Platform Developer Center SCN page

SAP HANA Cloud Platform SCN Page

SAP HANA Cloud Platform Podcast Series

HANA Academy video series on HANA Cloud Platform

You can also reach out to us via Twitter (@saphcp)


Students presented innovative ideas at #innotakeoff


As mentioned in my overview blog about #innotakeoff, there were 3 tracks in the competition: Students, Startups, and SAP Employees. In this blog I'd like to give an overview of each of the projects from the student teams. For the students, it was beneficial to get in touch with SAP experts and be able to translate their ideas into working prototypes using the latest gadgets and technologies.


There were 5 groups in this track with the following projects:


1. SmartCan: A group from San Francisco State University created a project called SmartCan. The idea was to put sensors in the trash can which identify the trash as recyclable, landfill, or liquid waste, for example, and separate it out accordingly. The end user does not have to worry about the separation. The SmartCan helps society as a whole by minimizing litter, educating users, and sorting trash in an effective way. Each persona experiences different benefits: the Environmentalist will be at ease knowing that he or she is making the right bin choice and that those around them are doing the same; the Average User will be happy knowing that they are making the right disposal choice without having to exert any additional effort; and the "Don't Know! Don't Care!" group will still be able to quickly dispose of their trash and will unknowingly be making the right choice for the environment.




2. xProto: This idea was presented by students from Chico State University. The idea was to provide a smart shopping experience to shoppers. The app helps shoppers better utilize their time in stores such as supermarkets and retail shops. Shoppers log in to their online accounts and build a shopping list. Once at the store, a sensor sends an alert to their smart phones if they are near products on that shopping list.


xShop.jpg


3. Faint Alert: Students from the Hasso Plattner Institute presented this idea. The project would help people who have previously fainted due to epilepsy or who suffer from cardiovascular diseases. The person wears a sensing device around the wrist; if he or she faints, the device detects it based on position, heart rate, and other sensor data and sends an alert to a friend or family member on record as well as to the emergency services. This avoids delays in getting help and can prevent more serious harm.

 

4. MintPlug: This project was presented by students from Notre Dame University. The main idea was to build a multi-domain, globally aware intelligence platform providing stacked services such as: 1. rich data analysis for structured and unstructured data, global environmental data, and data gathered from IoT devices; 2. cognitive processing capabilities for query compilation; 3. white-label, customizable boilerplate dashboards; and 4. integration with plug-and-play voice interaction devices.


DSC_8979.jpg


5. Emotiv: Students from Notre Dame University also presented this idea. Today, 45% of physicians suffer from being overworked, leading to a 22% increase in the risk of error in patient care and contributing to a projected shortage of physicians in the near future. The idea was to provide a management tool and wellness tracker that assists medical care managers who are dissatisfied with antiquated scheduling and toxic workplace environments and who face that looming staffing deficit. The tool simplifies employee management and improves workplace wellness, which decreases physician turnover, burnout, and fatigue and reduces the risk of error in patient care.


In the final round, xProto, Emotiv and MintPlug were the top 3 finalists, and xProto came out as the winner. The winning team was presented with a $2,500 Amazon voucher by Sam Yen, Managing Director of Silicon Valley.


Winner3.jpg


Overall it was a great learning experience for the students, and the audience was thrilled to see the innovative ideas that came from these bright minds. You can also find the projects presented in the employee track here and the projects presented by startups here.


If you yourself have not yet gotten hands-on with HCP, welcome to the cloud! You can start building an application with HANA Cloud Platform by signing up for a free HCP developer trial account. You can learn more about HANA Cloud Platform at the following resources:


SAP HANA Cloud Platform Tutorials

SAP HANA Cloud Platform IoT Services Tutorials

SAP HANA Cloud Platform Developer Center SCN page

SAP HANA Cloud Platform SCN Page

SAP HANA Cloud Platform Podcast Series

HANA Academy video series on HANA Cloud Platform

You can also reach out to us via Twitter (@saphcp)

 


Startups presented innovative ideas at #innotakeoff


As mentioned in my overview blog about #innotakeoff, there were 3 tracks in the competition: Students, Startups and SAP Employees. In this blog I’d like to give an overview of each of the projects from the startup teams. The startups found it useful because they got in touch with potential customers and partners at the C-level and also got feedback from business experts.


There were 3 groups in this track, with the following projects:

 

1. Bloom: Businesses do not adequately capture data related to goal completion and employee accomplishments, and even when data is collected, it is often mismanaged and poorly optimized. The idea behind this project was to provide a solution for professionals who are dissatisfied with annual performance reviews as a way to track their careers and for managers who struggle to understand the state of their organization. Bloom is a cloud-based performance management solution that moves organizations from annual performance cycles to predictable, delivery-based appraisals with the swipe of a finger. The app consistently tracks the steps taken to complete goals, highlights the business areas that need attention, and provides managers a transparent interface through which they can plan next steps and give feedback to employees.

 

Bloom2.jpgBloom3.jpg


2. Preventive Farming: Conventional farming treats its problems with brute force. Using more and more fertilizers and pesticides to attack different issues creates a negative feedback loop for the soil and plants, and farmers end up paying ever higher costs as their use increases. This preventive farming solution is for farmers and growers who are dissatisfied with conventional farming. It is based on drone imaging and an expert system that gives growers early alerts to symptoms of disease and sub-optimal growth. SAP will be the backbone for all the imaging, analyzing the data and providing early alerts to farmers, limiting the excessive use of fertilizers and pesticides. This eventually reduces costs and crop loss and increases sustainability in farming.


PreventiveFarming.pngPreventiveFarming2.jpg

3. Disaster Visual Helpline: When a disaster such as an earthquake or flood strikes and your loved ones are stuck in the disaster zone, you will understandably be worried about their whereabouts. This solution allows various sources to upload data and information about the situation from different locations using IoT, and the app then publishes messages to the public on the internet. People on the other side can upload pictures of other people or location details, and anyone can search the database with pictures to understand the situation and look for information about friends and family stuck in that area.


DisasterHelpline.pngDSC_8877.jpg

 

Out of these 3 startups, Bloom was selected as the winner. Sam Yen, Managing Director of Silicon Valley, presented them with a $10,000 check. They also got the chance to develop their idea further as a member of the SAP HANA startup focus program.

Winner2.jpg

Overall it was a great experience for the startups as well as the coaches and the audience, who got to see some upcoming innovative ideas. Hopefully some of these ideas will turn into live projects in the future. If you want to find out more about the projects presented by the groups in the employee track, please visit this link, and to find out more about the projects presented in the student track, please visit here.


If you yourself have not yet gotten hands-on with HCP, welcome to the cloud! You can start building an application with HANA Cloud Platform by signing up for a free HCP developer trial account. You can learn more about HANA Cloud Platform at the following resources:


SAP HANA Cloud Platform Tutorials

SAP HANA Cloud Platform IoT Services Tutorials

SAP HANA Cloud Platform Developer Center SCN page

SAP HANA Cloud Platform SCN Page

SAP HANA Cloud Platform Podcast Series

HANA Academy video series on HANA Cloud Platform

You can also reach out to us via Twitter (@saphcp)

Employee teams presented innovative ideas at #innotakeoff


As mentioned in my overview blog about #innotakeoff, there were 3 tracks in the competition: Students, Startups and SAP Employees. In this blog I’d like to give an overview of each of the projects from the SAP Employee teams.  SAP employees benefited from participating in the #innotakeoff chiefly by gaining upper management awareness about their project and extending their network.


There were a total of 5 employee teams, with the following projects:


1. IoT application builder: It helps developers, partners and customers efficiently create IoT applications within a few hours using standard templates built by this team. It allows a tremendous cost reduction in development and maintenance for developers, partners and customers in the IoT space. Furthermore, it offers a coherent, state-of-the-art IDE tailored for IoT purposes. The team showed a very cool demo of data gathered from a Star Wars BB-8 droid and an application, built with the IoT application builder, showing insights from that data.

 

iotapp1.png

2. Zelda: Employees spend more time than necessary locating a place or a resource on campus, so they end up taking multiple steps to get a simple task done. A virtual assistant, Zelda, helps locate the needed resource or answer a question to accomplish simple tasks. SAP can improve employee productivity through Zelda so that employees can focus on building great products for SAP end users. It is an application that acts as a virtual assistant for a smart campus, helping employees find the nearest meeting room, parking spot, printer location and so on at the convenience of their fingertips.


Zelda2.png

3. Path to Wellness: Its main objective is to take care of the physical, social and mental wellness of employees. It gathers data from IoT wearables and uses that aggregate-level data to analyze the effectiveness of wellness programs, perform health risk assessments based on daily health checks and periodic surveys, and provide recommendations. It also gives employees that extra push to stay active and creates healthy competition among co-workers. This means the HR wellness manager can engage, empower, and improve the health of the workforce by rolling out cost-effective, customized wellness programs and initiatives that benefit the organization as well.


Ptw1.jpgPtw2.jpg

4. IoTa: For employees who need to collaborate on an ad-hoc basis, IoTa is a versatile and easily expandable cloud-based infrastructure microservice. It allows events, based on people’s interactions with things, to be managed generically and to trigger follow-up actions in subscribing client applications. It breaks down IoT silos such as healthcare, travel, and productivity, brings all events together, and helps you plan and combine them into recommendations for next steps. For example, if it is raining and you can’t bike to work, the app will automatically ask you if you want to book a ride on Uber. If you reply no, it will ask if you want to send a message to a friend to see if he or she wants to share a ride with you. If your friend replies yes, he or she gets directions to pick you up, and once you arrive at the office, it alerts you about an available nearby parking spot. All of this was developed using HCP IoT services.


iota.jpg

5. Economic indicator running on a big data platform: It showcases the great value that lies within the data of the Ariba Pay platform and builds an economic indicator on Hadoop / HANA Vora, paving the way for many more such projects on a big data platform. The new intelligence can be consumed through APIs or through native HANA apps on HCP.

 

Among these ideas, Path to Wellness, IoTa and the Economic indicator running on a Big Data platform were chosen as the top 3 contenders. From those, the jury announced the ‘Path to Wellness’ project group as the final winner. And I am happy because I was the coach for this team and my team won.

 

Winner3.jpg

Each member of the winning team was presented with cool gadgets like the Parrot Bebop 2, Skycontroller, Myo Gesture Control, and Fatshark Dominator HD. The awards were handed out by Sam Yen, Managing Director of Silicon Valley. Overall it was a fun event, with very cool and innovative ideas presented by SAP employees. You can also find the projects presented in the student track here and the projects presented by startups here.


If you yourself have not yet gotten hands-on with HCP, welcome to the cloud! You can start building an application with HANA Cloud Platform by signing up for a free HCP developer trial account. You can learn more about HANA Cloud Platform at the following resources:


SAP HANA Cloud Platform Tutorials

SAP HANA Cloud Platform IoT Services Tutorials

SAP HANA Cloud Platform Developer Center SCN page

SAP HANA Cloud Platform SCN Page

SAP HANA Cloud Platform Podcast Series

HANA Academy video series on HANA Cloud Platform

You can also reach out to us via Twitter (@saphcp)

My Intent to Create Maps and QRcodes on the HCP (Trial Account)


Background

The inspiration for blogging about my mapping application on the trial HCP comes from an old blog I published back in 2011. That blog covered ideas for thematic mapping with SAP systems and focused on mapping the worldwide user base of SCN, using each country's total SCN points as the theme of the map. My original intention for that blog was to cover thematic mapping of a particular country with location based data. The old blog was broken, with missing images and formatting, in the move to the SCN Jive platform. As I have now found that I can run the open source software I used in my original setup on the HCP, I decided to revisit the ideas on mapping location data for a particular place. This time out I used the town where I live (Solihull, UK) as the location, with the intention of combining a variety of SAP technologies that I have used over the intervening years to create either maps or QRcodes. To begin with, though, I have recreated a missing image from the original blog, which shows my general intention in how I use maps and QRcodes.

https://neogeo28p1248461150trial.hanatrial.ondemand.com/neogeo28/wms/reflect?layers=cite:japanregions&env=R1%3AFF0000%3BR2%3AFF0000%3BR3%3A00FF00%3BR4%3A00FF00%3BR5%3AFFFF00%3BR6%3AFF0000%3BR7%3AFF0000%3BR8%3AFFFF00%3BR9%3AFF0000&format=image/png&format_options=layout:scn


 


The above image is generated entirely with GeoServer (an open source server for sharing geospatial data) running in the HCP trial account. It is based on the original image in my broken blog, where I was trying out the ideas of thematic maps for Japan. It was while I was living in Japan that I picked up an interest in topics such as GIS (maps) and QRcodes/barcodes. I have found these topics useful over the years as a use case/personal project to try to integrate them with, and learn more about, SAP technology. The QRcode in the above image is a static PNG file which is combined with the map by GeoServer to produce the final image. I use another open source program, ZXing, running on the HCP to dynamically create QRcodes in my application. I will cover how I use GeoServer and ZXing in the following sections, but I start with an image of my final application below. It shows a choropleth map of Solihull next to a cluster map of crime data. The data is in HANA and I use XS to generate the choropleth and cluster maps, with GeoServer providing the base map and ZXing used to create the QRcode.


      1_solCrime.png

Over the next few sections I will cover each part of the application running in the HCP.

Neo GeoServer

I originally deployed geoserver into the NetWeaver cloud platform (it also had an alternate name of Neo) in 2012. Using previous versions of GeoServer I found myself hacking Java JAR files of the GeoServer package to get it running with the NetWeaver cloud Java runtime.

Since GeoServer moved to Java 7 with some of the latest releases and the now named HANA Cloud Platform (HCP) made a Tomcat Java runtime available, deploying GeoServer to the HCP is a lot more straightforward.

The way I deployed GeoServer is described in this document (link). Some of the highlights and issues are extracted below.

The key step for me was to explicitly pick the Tomcat server in Eclipse, as in my experience the automatic choice did not always select Tomcat. (I found Tomcat was the automatic choice when deploying from the command line with the Tomcat SDK.)

 

 

     2_tomcat.png


One issue I faced was the server timeout while deploying to the cloud. The way I use the application meant I had a rather large GeoServer data directory to upload, so I extended the timeout to allow the server to start.


      3_timeout.png

GeoServer also contains the SLF4J (a Java logging framework) JAR files, and the Tomcat runtime has these for logging as well. Once deployed, I had a multiple bindings error in the logs. According to my interpretation of the SLF4J page on the issue, the JVM should still be able to pick a binding (at random!), so I took no further action on this. http://www.slf4j.org/codes.html#multiple_bindings


Running GeoServer in the HCP considerations

GeoServer can connect to spatially enabled databases and as HANA is spatially enabled it should be in theory possible to make a connection. There is no standard connection for HANA from the Geotools/GeoServer download sites directly. However as GeoServer is open source software it is possible to create your own. I personally do not have the Java skills to do this (yet, maybe never, or never say never...erm).

I did find Marcel Akkerman’s github repo, and with a HANA JDBC driver it could be used with GeoServer. As I only ever hack example code, I had to try it out; I got some way, but sadly not to a working example. I never attempted to use it in the actual HCP but via a tunnel method, with GeoServer running on my local Ubuntu virtual machine. By changing a line in Marcel’s github repo and building the project with Maven, I managed to get GeoServer initially connected to HANA via the tunnel. The screenshot below shows how far I got by defining a GeoServer store and selecting the tables in my HCP HANA database (it falls over pretty quickly after that). I did try to contact Marcel via Twitter to check if he had made progress on his implementation, as the github repo is now a few years old. Marcel didn’t reply, but if by chance you read this, Marcel, I would be interested in your experiences with HANA and GeoServer.



      4_geoHANA.png


Connecting GeoServer to HANA directly would be my ideal approach, but as that is a step too far for now (I have a feeling I’ll come back to it in the future), I have alternate ways to use GeoServer in the HCP cloud as a standalone, runtime-only type of installation. I use my local GeoServer as a working machine and, when I am ready, use bash scripts to publish to the cloud. As GeoServer has a REST interface, I found it easier to use this interface to upload a lot of Ordnance Survey opendata shapefiles to the cloud; a rough sketch of such an upload follows below. Now that I mention Ordnance Survey opendata, this brings me onto open data.
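As a rough illustration (expressed here in Python rather than my actual bash scripts), the snippet below shows the kind of REST call involved: a zipped shapefile is PUT to GeoServer's REST interface to create or update a datastore. The host, credentials, workspace and store names are all placeholders you would replace with your own.

import requests

# Sketch only: upload a zipped shapefile (.shp/.dbf/.shx/.prj in one zip) to GeoServer's REST interface.
# Host, credentials, workspace ("cite") and datastore ("os_opendata") are placeholder values.
GEOSERVER_REST = "https://<your-geoserver-app>.hanatrial.ondemand.com/<context>/rest"
AUTH = ("admin", "<password>")

with open("os_opendata_layer.zip", "rb") as shp_zip:
    resp = requests.put(
        GEOSERVER_REST + "/workspaces/cite/datastores/os_opendata/file.shp",
        data=shp_zip,
        headers={"Content-type": "application/zip"},
        auth=AUTH,
    )
print(resp.status_code, resp.text)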

Open Data and open source geospatial software

While I was living in Japan I did find out that there was an open data strategy in the UK with the UK government opening up large amounts of data. I have tried out various sources of open geographic data such as the Ordnance Survey data and the Office for National Statistics geoportal was key for me to create the choropleth maps based on its geographical reference data.

If you are interested in starting out with or checking out open source geospatial software, then I would recommend the collection of software found in the OSGeo Live collection, as I mentioned in my original blog in 2011. This collection of software was how I got a head start into GIS systems when I lived in Japan. The latest edition is OSGeo-Live 9.0 and is a bootable DVD, USB or virtual machine. The quick start guides were great for me to get up and running with GeoServer and QGIS (which I cover later on) and it is straightforward to set up (if you check the manual and help :)). For GeoServer in particular, I have also found the documentation on the GeoServer site has pretty much covered all the items I wanted to achieve, plus a couple of key items found in the GeoServer training material offered by http://www.geo-solutions.it/

That covers my use of GeoServer, now onto QRcodes.

A Zebra Crossing in the cloud

I covered a method to create QRcodes (in November 2010) in one of my previous blogs, based on the open source project called ZXing and running the app in an SAP NetWeaver Java Application Server. I used the NWDS to create my project and made an EAR (Enterprise ARchive) file available for download in that blog (the EAR file can be downloaded by following this link). I found that with a bit of manipulation of that EAR file I could get this running in the HCP. And now, in 2016, two Java apps can run on the HCP at the same time!! So this allows me to create maps and QRcodes with my application. At the very start of the NetWeaver cloud (as it was known) trial, accounts had some limitations and only one Java app was allowed to run, which reminds me….

I did deploy ZXing for QRcodes to the NetWeaver cloud as a demo during the limited trial access to the cloud back then. Recalling this prompts me to mention and thank Matthias Steiner for his help and assistance over the time I have been using the HCP. Also a thank you to the SAP cloud team; even when I asked a n00bie (shooting-myself-in-the-foot type) question on the forums, there was an understanding and positive answer from the HCP team. Last time out with ZXing I tweeted a QRcoded message to Matthias asking why there was only a one-off 90 day trial. Times have changed, and it seems a long time since the 90 day limits of NetWeaver cloud; there is now an amazing amount of possibilities with the HCP, although it is difficult to know where to start next! So I’ll carry on with maps or a QRcode as a reason to hack around with the HCP. Below is a recreation of that tweeted QRcoded message asking why there was a limitation for the NetWeaver cloud. The message is way out of date now, as there is ongoing access to the HCP trial, but the QRcode is generated by ZXing running in the HCP.

 

https://neoqrcodep1248461150trial.hanatrial.ondemand.com/QRplutus/servlet/QRimg?t=why%20only%2090%20days%20one%20off%20trial%20of%20netweaver%20cloud,%20please%20extend%20as%20per%20ABAP%20trial%20systems%20and%20rolling%2090%20day%20license
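For reference, a QRcode request like the one above is just a GET to the ZXing servlet with the text URL-encoded in the t parameter. A minimal sketch of assembling such a URL is below; only the host is a placeholder, while the servlet path and parameter name come from the example URL above.

from urllib.parse import quote

# Minimal sketch: build a request URL for the ZXing-based QR servlet shown above.
# Only the host is a placeholder; the QRplutus/servlet/QRimg path and the 't' parameter match the example.
host = "https://<your-account>.hanatrial.ondemand.com"
text = "why only 90 days one off trial of netweaver cloud, please extend as per ABAP trial systems"
qr_url = host + "/QRplutus/servlet/QRimg?t=" + quote(text)
print(qr_url)  # opening this URL in a browser returns the generated QRcode image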




Deploying a Zebra Crossing

I extracted the WAR file from the EAR file I had previously created, as I didn’t need all the enterprise packaging that an EAR file contains.

Importing the WAR file into Eclipse and then deploying to the cloud caused an issue where the deployed servlet could not be found. After googling the subject, I added a fix to the web.xml to link back to the servlet; also, when I changed the project name, I updated the QRcode.jsp to reflect the name change. The details of how I did this can be found in this document.

Limits of trial

I did have some issues deploying GeoServer to the cloud, mainly due to how I use the application on the HCP. There is a difference between structured and unstructured data: the data limits apply within the HANA and MaxDB databases, and not, as I had assumed, to 1 GB of space no matter where the data is located. Another consideration for GeoServer is permanent file creation at the operating system level on the HCP trial. GeoServer configuration is stored in a data directory, and any changes are not persisted across a restart in the HCP. I use GeoServer's REST interface to work around this on the HCP trial and will cover that later on in this blog.

I enjoy using the HCP and it’s great that I can use the platform and all the applications on the trial for free (and not limited to 90 days as before :)). So I have limited the use of the GeoServer cache in my trial; however, mapping applications can use a large amount of cached data to help map performance. (I have not tried it, as I don't want to pay, but there are options to use AWS S3 for GeoWebCache.)

At the moment I am checking out the Java monitoring and JMX monitoring capabilities of the HCP to assist me in monitoring my application.

Application setup

Now that I had my base Java applications for maps and QRcodes running in the HCP, it was time to adapt an application to use them. As I had already loaded 18 million crimes from 2011-2013 into HANA on the HCP, it was an ideal dataset to use with its location based data. The data set in my HCP trial account covers crime and policing in England, Wales and Northern Ireland and was downloaded from http://data.police.uk/data/. The data is made available under the UK Open Government Licence.

Solihull Metropolitan Borough Council Crime Stats

Back in 2013 I used Lumira to connect to the trial HCP to map England and Wales crime data, and I went on to create a Lumira extension to map Solihull’s crime data; an example screenshot is below. The method I used to create the maps with Lumira is covered in my blog post from 2014 (link). The same Lumira extension I created, but based on Japan, can be found in the github open source collection of Lumira extensions here.


When I first created a map of Solihull in Lumira (below), I noticed one area which, according to the National Statistics site, is registered as a Lower Super Output Area (LSOA) with the code E01010109. LSOAs are a geographic hierarchy designed to improve the reporting of small area statistics.


     5_blankSolihull.png


The above did not show any town names or identifiable features, so I used the openstreetmap based site http://overpass-turbo.eu/ to extract place names to show on the map.


     6 soliPlaceChoro.png


The LSOA E01010109 area is somewhat lost in the above image, which also suggests that there are no local towns or villages in that area. The area is wide, and I was curious to find out whether there was any particular cluster of crime locations in LSOA E01010109.

I know LSOAs have a mean population of around 1,500, so what could be in the E01010109 LSOA area if there appear to be no towns or villages? This is where the Ordnance Survey open map data could come in useful for adding some more detail to my maps.

How I used the Ordnance Survey Open Data with GeoServer is in this document here.

There are many mapping services on the internet, some free and others not so free. One thing with most of these services is that it is their styled map, and some have usage limits or require payment. Something that would be useful with maps is the ability to change a map or mark a feature more prominently to highlight a story or area of interest.

My styling skills are limited, and I did mention in my original mapping blog that when I tried some colour coordination, I struggled to match my tie with my shirt. In the intervening years I have managed to lose the ties, but the styling skills are still somewhat lacking. However, using the great resource that is the Ordnance Survey data and style sheets, I have had a go at altering the map's appearance, with the objective of making a point.

That point being Birmingham Airport. I am picking on Birmingham Airport from the E01010109 LSOA area, but there are more commercial areas in that particular LSOA. The map below shows Birmingham Airport in the E01010109 LSOA area. The map is styled and the (PNG) image created by GeoServer running in the HCP, using GeoServer's CQL filter capability to highlight the LSOA area.


      https://neogeo28p1248461150trial.hanatrial.ondemand.com/neogeo28/cite/wms?service=WMS&version=1.1.0&request=GetMap&layers=cite:SP,sall,solihull&CQL_FILTER=INCLUDE;INCLUDE;LSOA11CD=%27E01010109%27&styles=&bbox=408813.38462,272194.17965,427732.57921,290850.25285&width=668&height=606&srs=EPSG:27700&format=image%2Fpng

 

Simply by changing my CQL filter I could highlight another area; as an example, if you follow this link another LSOA area of Solihull MBC will be highlighted. A sketch of how such a request is assembled is below.
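To make that concrete, here is a small sketch of how the GetMap request can be built up; the parameters mirror the example URL above (host is a placeholder), and only the LSOA code in the CQL filter needs to change to highlight a different area.

from urllib.parse import urlencode

# Sketch of the WMS GetMap request used above; swap the LSOA code in the CQL filter
# to highlight a different area. Layer names and bbox mirror the example URL.
base = "https://<your-geoserver-app>.hanatrial.ondemand.com/neogeo28/cite/wms"
params = {
    "service": "WMS", "version": "1.1.0", "request": "GetMap",
    "layers": "cite:SP,sall,solihull",
    "CQL_FILTER": "INCLUDE;INCLUDE;LSOA11CD='E01010109'",
    "styles": "",
    "bbox": "408813.38462,272194.17965,427732.57921,290850.25285",
    "width": "668", "height": "606",
    "srs": "EPSG:27700",
    "format": "image/png",
}
print(base + "?" + urlencode(params))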


Back to my HCP crime application, with the E01010109 LSOA selected from the choropleth map on the left. The cluster map on the right indicates crime is located around Birmingham Airport; however, there are other clusters spread around the LSOA.


     7_BhamAirport.png

HANA Spatial Projections

I needed a way to convert the National Statistics site's open data shapefiles to projections that can be used with HANA and my XS based choropleth map code in the HCP trial. Background on map projections can be found on Wikipedia. The map projections available straight out of the box/cloud in the HCP are rather limited (there appear to be ways to create your own in a non-trial HANA system, but in my experience the cloud-based HANA spatial projections are limited).


    8_HCPspatial.png


I decided to work with these initial projections shown above. However, to get the National Statistics shapefiles loaded into the cloud, some form of transformation was required, as they were in different map projections.

SQL Anywhere & QGIS

I have used SQL Anywhere for a while now, think it is a genius database, and am impressed by its capabilities. Back in 2014 I mapped the crime data with SQL Anywhere and blogged about it here.

As stated previously, I do use QGIS (a free and open source Geographic Information System), and up to a certain release it could be connected to SQL Anywhere. The latest QGIS releases are no longer compatible with SQL Anywhere. However, using the latest version of Ubuntu Linux, a base install of QGIS will fetch version 2.0.1, which is still compatible with SQL Anywhere. You can install the QGIS SQL Anywhere plugin with sudo apt-get install qgis-sqlanywhere and then select the plugin via the plugin manager in QGIS.

Using SQL Anywhere’s spatial ST_TRANSFORM it is possible to convert/transform map projections; QGIS can also convert map projections. Below is an image of my installation of QGIS reading the LSOA shapefile I downloaded from the National Statistics site. The key step for me was to enrich the data with the reference tables from the National Statistics site using SQL Anywhere, which allowed me to select only Solihull's LSOA areas in my application. A rough sketch of the transform is shown after the image.

      9_sqlAnyQgis.png
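As a rough sketch of the SQL Anywhere route (not my exact code; the connection details, table name and column names here are made up for illustration), the transform itself is a one-liner on the geometry column:

import sqlanydb  # SQL Anywhere Python driver

# Illustrative sketch: re-project geometries with SQL Anywhere's ST_Transform.
# Connection parameters, the lsoa_boundaries table and its columns are placeholder names.
conn = sqlanydb.connect(uid="dba", pwd="sql", eng="demo", dbn="demo")
cur = conn.cursor()
cur.execute(
    "SELECT lsoa_code, shape.ST_Transform(4326).ST_AsText() FROM lsoa_boundaries"
)
for lsoa_code, wkt in cur.fetchall():
    print(lsoa_code, wkt[:60])
cur.close()
conn.close()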


With QGIS I can convert the enriched SQL Anywhere tables back to ESRI shapefiles with SRID 4326, which allows me to import them into the HCP and leads me on to the XS application.


XS Application

I had the base XS services to produce choropleth maps when I created the SCN world application and blogged about that in 2015. The SCN world of points shapefile contained both the map and the SCN point values I wanted to map. How I loaded this into the HCP is covered in my blog here. With my crime map I did want to separate the data from the actual map shapefiles.

As you have made it this far through my blog, I will share the actual link to the application below. As I learnt from my SCN World application, I can open the application up to all SCN users with a valid SAP ID logon account, and I have done so. I would value your comments if you have time to check out the application at the link below.


https://s5hanaxs.hanatrial.ondemand.com/p1248461150trial/hihanaxs/neogeocrime/xs/solihullcrime.html

I do use Stoyen Manchev’s 8 easy steps to develop an XS app blog as a base or reference for everything I try with XS. As a result of using this base XS code, most of my XS services use a sales order variable even though there ain’t no sales order on my site. The code is based on examples from Stoyen and also Trinoy and Kevin, who I thanked in my SCN world blog as well. Follow this link for the code to generate the maps with XS services; the JavaScript source code can be accessed via the link to my app above.



Putting it all together on a Leaflet

I used a great open source JavaScript library, Leaflet.js, to bring all aspects of my application together on a map. Using Leaflet I can make XS service calls to generate the choropleth and cluster maps. While the choropleth is useful for identifying areas of high crime, it does not provide the ability to drill down into the individual crime locations. A Leaflet plugin called PruneCluster allows me to show a large dataset via a cluster of crime locations.

I have also chosen to use overpass turbo on my site to highlight certain aspects of the location related to a crime type. I have previously used overpass turbo with Lumira, when I highlighted the University of Oxford buildings/sites and linked this to the location of bike theft. This time I used a Leaflet.js plugin/extension, leaflet-layerJSON, to show various points of interest on a map where I have related them to the crime. So, for example, a shopping trolley icon will be displayed for any shops if shoplifting is the selected crime type, and bike crime and vehicle crime will show parking areas for these types of crimes.

I call the overpass turbo site via the HTTP destination configuration set up in the HCP. I have an xshttpdest file and use an XS service to call a HANA spatial query (using SHAPE.ST_ENVELOPE) to calculate a bounding box of the LSOA to pass to the overpass turbo site. The code I used for the HTTP destination is available via this link; a rough sketch of the idea follows.
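The real code is an XS service (see the link above), but the idea is simple enough to sketch: take the LSOA shape's envelope from HANA and hand its corners to the Overpass API as a bounding box. Everything below (schema, table and column names, coordinates) is illustrative only.

# Illustrative only - the production code is an XS service; this just shows the shape of the idea.
envelope_sql = """
SELECT "SHAPE".ST_ENVELOPE().ST_AsText()
FROM "NEOGEO"."LSOA_BOUNDARIES"
WHERE "LSOA11CD" = 'E01010109'
"""  # hypothetical schema/table/column names
print(envelope_sql.strip())

# The envelope's min/max coordinates then become the Overpass bbox (south, west, north, east).
south, west, north, east = 52.42, -1.77, 52.48, -1.70   # example values only
overpass_bbox = "({0},{1},{2},{3})".format(south, west, north, east)
print(overpass_bbox)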

An example screenshot below highlights the methods described above. The map on the left is the aggregated crime data shown in a choropleth map, and the map on the right is the cluster of crime locations. The choropleth map controls the cluster map and is kept in sync with another Leaflet plugin called leaflet.sync. The bike icons are bicycle parking locations in Solihull from the overpass query; this is my attempt to highlight bicycle crime next to bicycle parks.

     10_SolihullBikeCrime.png


Scanning with Intent part II

The way I use the QRcodes in my app is to start a navigation app on my phone. Scanning a code automatically starts the navigation on my phone, and it works as follows.

In the same blog where I created QRcodes with ZXing, I also tried out scanning the QRcode back into SAP via a ZXing-based app on my HTC Android phone. As a final part of this blog, I will cover my attempts to revisit the scanning of QRcodes, now with my Nexus 5 Android phone. I reused the intent code I shared in the original blog together with template code from the Android SDK and checked that it worked. I then modified the code to use a navigation intent with the results of the barcode intent.

https://developers.google.com/maps/documentation/android-api/intents#intent_requests

https://github.com/zxing/zxing/wiki/Scanning-Via-Intent


     11_AndroidSDKintents.png


The image above is a screenshot of the final app in the Android SDK.

I had complained in my original QRcode scanning blog about how difficult it had been with my original Android phone to capture screenshots in comparison to the Apple products available at the time. Screenshots are now a whole lot easier on my Nexus Android phone and can be generated with two key presses. Moreover, as I have the SDK installed (again), I can now quite easily capture a video of my phone in action. A video of my apps scanning a QRcode and starting navigation via the map apps installed on my phone is below (video recorded direct to the phone via adb shell screenrecord /sdcard/Download/sapqrmap.mp4).

 

An Intent to use HAT

Before I cover how I used the Hybrid Application Toolkit (HAT), I will thank Simmaco Ferriero for his blogs about installing HAT on a Mac and how to use the Kapsel barcode scanner. With my own Mac I had started to install all the component parts from independent downloads; following Simmaco’s installation blog for the Mac, I had success with the Homebrew (the missing package manager for OS X) method of installation. I followed Simmaco’s barcode scanner blog only for the barcode scanner parts and not the integration to Gateway. My intention was to use a third party Cordova plugin for Android-style intents, though the plugin is stated to work for multiple devices; I only use an Android phone in my examples.


I used Simmaco’s Kapsel barcode scanning blog as a guide but chose the starter application (icon shown below - I also used this icon for the app on the phone). I added the Kapsel barcode scanner as per the blog to the device configuration but also added the geolocation option from the Cordova options page. I added the code from the blog to add the barcode button to the view and the code for the javascript controller and deployed to my running local HAT instance.



     12_ICON.png


I then chose to deploy to my phone to check that the base application would at least start the barcode scanner on my phone.

I built the Android APK with

cordova build

This created an APK called android-debug.apk

I then manually transferred the apk to my phone and tested this successfully, as the barcode scanner started.

I then added the custom plugin to call the navigation.

I had installed plugman (link to a reference for plugman in another of Simmaco’s blogs).

The third party plugin I wanted was uk.co.workingedge.phonegap.plugin.launchnavigator and I installed it with the following command.


plugman install --plugin=uk.co.workingedge.phonegap.plugin.launchnavigator --platform=android --project=QRcodeNavi/hybrid/platforms/android --plugins_dir=/Users/robert/SAPHybrid/plugins

Installing "uk.co.workingedge.phonegap.plugin.launchnavigator" for android

I then repeated the HAT setup but added the custom plugin path

13_CustomPlugin4HAR.png



HAT has to be ready and running (run.sh) and it will use the directory setup in the config of HAT.


./run.sh

Input your HAT Connector certificate password: **********


===================================================================

Hybrid App Toolkit (v1.12.5) Connector is listening on port 9010.


I created a new WEB IDE project from the starter application and added the same barcode and geolocation plugins as before. However this time I added the WEB IDE setup for custom plugins to pick up the navigation plugin.

    14_addCustomPluginToWebIDE.png


I updated the controller code to call the launchnavigator plugin with the results of the barcode scan. I used a Javascript substring command to get the lat,lon details only from the URL shared on my crime site. This way I can use both the native Android app and the SAP Mobile app with my site.


      15_customPluginControllerCode.png

 


 

I then built the APK with the cordova command and copied the APK to a temporary directory.

I used the following command for the first Installation onto my phone via a USB connection

adb install /var/tmp/QRapk/android-debug.apk

If I changed the code and wanted to push the same APK to my phone then the installation failed with  [INSTALL_FAILED_ALREADY_EXISTS]

To get around that I now use the -r flag

adb install -r /var/tmp/QRapk/android-debug.apk






A video for both mobile apps for scanning and navigation with QRcodes.




Thank you for reading and I end with some credits.



Credits


Crime Data


As covered in my previous blog (link above), I use the crime data from http://data.police.uk/data/. The data is made available under the UK Open Government Licence.


Census Data


I use the official labour market statistics site for the census data which can be found at the following web address

https://www.nomisweb.co.uk/census/2011


Choropleth Map Data


For the choropleth maps on my site I used the Office for National Statistics to download the geographical reference data, at this link: ONS Geo Portal


Contains Ordnance Survey data © Crown copyright and database right 2015

Contains National Statistics data © Crown copyright and database right 2015


Base Map Data

The base map data is from Ordnance Survey Open Data shapefiles and available with the Open Government License. https://www.ordnancesurvey.co.uk/business-and-government/products/opendata-products.html


Contains OS data © Crown copyright [and database right] (2016)


Leaflet - Javascript library for maps


My application relies on Leaflet.js to bring all the contents together in one place on a map. Thank you, Vladimir. Also thanks to the authors of the Leaflet extensions I use: leaflet.sync, PruneCluster and leaflet-layerJSON.


OverpassTurbo


Overpass-turbo.eu: I use the API from this site to display OpenStreetMap data on my map. The license link for the API is here. There are many options for extracting the data; the way I use it is to highlight certain features on a map that correspond to a chosen crime type.









HCP IoT HANA Car v1.0

Description | Hyperlink
Overview | HCP IoT HANA Car v1.0
Internet of Things configuration | HANA Car v1.0 - Internet of Things configuration
UI5 Java Webapp | HANA Car v1.0 – Java Webapp
The Car with Raspberry Pi | HANA Car v1.0 – Raspberry Pi

Introduction

Once upon a time when I was still in high school I was dreaming of building my own car with remote control. At that time there was a great James Bond movie where James Bond controlled his car from his Nokia phone! This was incredible at that time! Here you have a scene of the James Bond movie where he’s using the remote control:

 

https://youtu.be/meY1R43fJIQ

 

This was my inspiration to build my own remote controlled car someday. That day has finally come! Okay, building a remote controlled car isn’t that big of a deal; it’s the underlying technology that makes this one special. This remote controlled car uses the Internet of Things service on top of the HANA Cloud Platform! Thanks to this service I can control the car from anywhere, as long as the car is connected to a Wifi network. (This could be improved with a 3G module.) All the data of the car will be stored in HANA; that’s why it’s called the “HANA Car”.

 

The Big Picture

I created a car with a Raspberry Pi which I can control from a UI5 application. The UI5 app sends messages to a Java servlet in HCP, the Java servlet sends messages to the IoT service in HCP, and the IoT service sends messages to the car (Raspberry Pi).

full picture.png

 

Car

I started from an old “Radio Remote Controlled Toy Car” like this:

car.jpg

 

First I removed the electronic components of the car and left only its two motors: one motor for going forward and backward, one for turning left and right. Then I connected the motors to L293D motor controller ICs; each motor has its own motor controller. Each motor controller is then connected to output ports on a Raspberry Pi 2 Model B:

raspberry.jpg

The Raspberry Pi has a large battery and a Wifi adapter so the car can drive wirelessly. On the Raspberry Pi I created a small script that listens to the Internet of Things service of HCP. Every time the script receives a message, it starts the right motor in the right direction depending on the content of the message. You can find a Python example of this in the IoT starter kit:

 

iot-starterkit/src/code-snippets/python/hcp-iot-services/wss at master · SAP/iot-starterkit · GitHub

 

More details about the raspberry pi: HANA Car v1.0 – Raspberry Pi

 

HANA Cloud Platform – Internet of Things

In the IoT service I configured one device and one message type. For the direction of the message type I’ve chosen “bidirectional”. This is important if you want to use the websocket for the communication. With a websocket you don’t have to use polling to get new messages; a websocket will automatically inform you when there’s a new message for that device. In the case of the HANA car the response time is very important, and for that reason I’ve chosen to use the websocket of the IoT service.

 

More information about the IoT part: HANA Car v1.0 - Internet of Things configuration

 

Webapp

For controlling the car I created a UI5 app. With JavaScript it isn’t possible to communicate with the IoT websocket directly; I suppose this is due to security. To work around this I created a Java Servlet, which I actually downloaded from the IoT starter kit:

 

iot-starterkit/src/apps/java/consumption at master · SAP/iot-starterkit · GitHub

 

I replaced the existing UI in the Java project with a simple joystick for controlling the car. Each move of the joystick sends the direction to the Java Servlet. The Java Servlet in turn forwards the message to the IoT service, and the IoT service sends the message to the Raspberry Pi, which starts the right motor.

 

This is what the webapp looks like (thanks to Jonas Vanderkelen for the UI5 app):


pic2.png

 

More details about the Java Webapp:  HANA Car v1.0 – Java Webapp

 

Demo

Together with a friend/colleague I’ve presented the HANA Car at SAP Inside Track Frankfurt. Thanks a lot to Jérémy Coppey for making the beautiful presentation! You can find the presentation at Hana Car by Jérémy Coppey on Prezi


The organization of sitFRA also created a small movie of the car:


Tweet: https://twitter.com/CBasis/status/708661853279416322


Thanks Christian Braukmüller for the video!


The HANA Car:


pic3.png

 

Next steps:

I'm planning to add a camera to the car so I can drive it without being in the same place and still see where I'm driving.

I also want to add additional sensors to gather more information, scan the environment around the car, and store it all in HCP.

 

Hope this blog will inspire you to do more stuff with IoT on HCP

 

Kind regards,

Wouter

HANA Car v1.0 - Internet of Things configuration

Description | Hyperlink
Overview | HCP IoT HANA Car v1.0
Internet of Things configuration | HANA Car v1.0 - Internet of Things configuration
UI5 Java Webapp | HANA Car v1.0 – Java Webapp
The Car with Raspberry Pi | HANA Car v1.0 – Raspberry Pi

 

In this blog I explain the steps of the IoT configuration of the HANA car

pic1.png

Activate IoT

 

First I activated the Internet of Things service by following the documentation:

 

https://help.hana.ondemand.com/iot/frameset.htm?53ad6006e50f4b0ca02402daa6da5bb5.html

pic2.png

 

Once IoT is activated, you can open the configuration from subscriptions or java applications:

 

pic3.png

 

Configuration

 

In the IoT cockpit I’ve configured a Device Type:

 

pic4.png

 

Based on that device type I’ve created a device:

 

pic5.png

 

After that I created a message type. A message type is not connected to a device. The device and the message type are connected to a device type. That way you can reuse message types for other devices.

 

I only created one message type, “Motor”; the others (hidden ones) are just for testing purposes. The device type is the same as for the device. The direction is “Bidirectional”, which is required to use the websocket inside IoT. With a websocket you don’t have to use polling to get new messages; a websocket will automatically inform you when there’s a new message for the device. In the case of the HANA car the response time is very important, and for that reason I’ve chosen to use the websocket of the IoT service.

 

pic6.png

 

The messagetype will only contain a timestamp and an action:

 

pic7.png

 

The action will contain the direction of the car.

 

 

Testing

 

Now we can already test our service. Go to “Send and receive messages through Websockets”.

 

pic8.png

Change the deviceid, messagetypeid and message content. After you click on send you’ll see a reply from the server.

pic9.png
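For reference, a test message for the “Motor” message type is just a small JSON document like the sketch below (the same structure the car's Python client sends in the Raspberry Pi part of this series); replace <messagetypeid> with your own message type id.

import json, time

# Sample payload for the websocket test page; "f" is the "drive forward" action used by the car.
message = {
    "mode": "async",
    "messageType": "<messagetypeid>",
    "messages": [
        {"timestamp": int(time.time()), "action": "f"}
    ],
}
print(json.dumps(message))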

 

IoT configuration is done!

 

Kind regards,

Wouter

HANA Car v1.0 – Java Webapp

Description | Hyperlink
Overview | HCP IoT HANA Car v1.0
Internet of Things configuration | HANA Car v1.0 - Internet of Things configuration
UI5 Java Webapp | HANA Car v1.0 – Java Webapp
The Car with Raspberry Pi | HANA Car v1.0 – Raspberry Pi

 

 

Before I created a Java webapp, I tried to access the IoT service from JavaScript directly. With JavaScript it's possible to connect to a WebSocket, but it's not possible to use a websocket with bearer authentication. Pushing HTTP messages to the IoT service also didn't work; I got cross-domain errors. For these reasons I created a Java application with UI5 as web content and a Java Servlet to communicate with the IoT service. For this Java Servlet I started from an example in the IoT starter kit:

 

https://github.com/SAP/iot-starterkit/tree/master/src/apps/java/consumption

 

pic1.png

User Interface

For the User Interface of the Fiori app I had some help from a friend, Jonas Vanderkelen. Thanks a lot!

 

The User Interface contains an on/off switch. When it’s on, a joystick appears. Moving the joystick sends directions to the Java Servlet. These are all the directions:

 

Joystick up = “f”

 

Joystick down = “b”

 

Joystick left = “l”

 

Joystick right = “r”

 

Release joystick = “n”

 

The joystick always starts by sending forward or backward when it moves up or down (even if it is also moved left or right). After that it sends left or right if you hold the joystick that way. This is how I developed the car and it can be changed; after some tests this gave the best handling for me.

 

This is what the UI looks like:

pic2.png

 

Java Project

 

The Java project contains a Servlet and a UI5 app:

pic3.png

I reused almost all of the JavaScript functions from the IoT starter kit but changed one thing: when sending the message, we changed the method from “http” to “ws” because we’re using the websocket!

pic4.png

 

I also added our deviceid and messagetype:

 

pic5.png

 

I didn’t change a thing in the Java Servlet.

 

You can find our project in the attachment (remove the .txt extension).

 

Kind regards,

Wouter


HANA Car v1.0 – Raspberry Pi

Description | Hyperlink
Overview | HCP IoT HANA Car v1.0
Internet of Things configuration | HANA Car v1.0 - Internet of Things configuration
UI5 Java Webapp | HANA Car v1.0 – Java Webapp
The Car with Raspberry Pi | HANA Car v1.0 – Raspberry Pi

 

 

For controlling the motors of HANA Car I’m using a Raspberry PI 2 model B, a strong battery and a wifi adapter.

 

The Raspberry Pi connects to the IoT service and listens for directions. Every time it receives a message/direction, it starts the right motor in the right direction.

 

pic1.png

 

First I started without the connection to IoT; I just wanted to make the car drive by developing a Python script. Before I could develop anything I had to connect the motors to the Raspberry Pi, for which I followed this tutorial:

https://learn.adafruit.com/adafruit-raspberry-pi-lesson-9-controlling-a-dc-motor/hardware


I used the L293D IC for controlling the motors; each motor has its own L293D. Once everything was connected, I could start developing on the Raspberry Pi.


I used this script for testing the motors:

http://computers.tutsplus.com/tutorials/controlling-dc-motors-using-python-with-a-raspberry-pi--cms-20051


import RPi.GPIO as GPIO
from time import sleep
# Pin numbers (BOARD numbering) wired to the L293D motor controller
GPIO.setmode(GPIO.BOARD)
Motor1A = 16
Motor1B = 18
Motor1E = 22
GPIO.setup(Motor1A,GPIO.OUT)
GPIO.setup(Motor1B,GPIO.OUT)
GPIO.setup(Motor1E,GPIO.OUT)
print "Turning motor on"
GPIO.output(Motor1A,GPIO.HIGH)  # set the direction
GPIO.output(Motor1B,GPIO.LOW)
GPIO.output(Motor1E,GPIO.HIGH)  # enable the motor
sleep(2)
print "Stopping motor"
GPIO.output(Motor1E,GPIO.LOW)   # disable the motor again
GPIO.cleanup()

I don’t put the two motors on one controller because I want to send different actions to each motor.

This was the result after my first script:

 

 

 

You’re probably wondering how I do the steering. For that I use a small motor, which was already in the car and works like this:

 

 

After all that worked, I added the code for listening to the IoT service. For that I started from this script in the IoT starter kit:

 

https://github.com/SAP/iot-starterkit/tree/master/src/code-snippets/python/hcp-iot-services/wss

 

I combined this code with the code for controlling the motors and ended up with the following:

  • setWheel
    • Function to turn the front wheels to left or right
  • stopWheel
    • Function to stop the motor for the steering and put the wheels straight
  • setMotor
    • Function to drive the motor forward or backward, or to stop it.
  • IoTServicesClientProtocol
    • this is the implementation of the websocket
    • onOpen function
      • I use this function to create a second thread and keep the connection alive between HCP and the raspberry pi. I noticed that after 60 seconds without communication the websocket will lose the connection. Therefore I send a message from the raspberry pi to IoT every 30 seconds.
    • onMessage
      • Every message from HCP will be caught by the websocket in this function. This is also the place where I activate the right motor.
        • F --> forward and stop steering
        • B --> backward and stop steering
        • L --> left (will still go forward or backward depending on the previous message)
        • R --> right (will still go forward or backward depending on the previous message)
        • N --> disable all engines
  • Other code is used to start the websocket

 

import RPi.GPIO as GPIO
import time
import simplejson as json
import sys
import thread
from optparse import OptionParser
from twisted.python import log
from twisted.internet import reactor, ssl
from autobahn.twisted.websocket import WebSocketClientFactory, \
    WebSocketClientProtocol, \
    connectWS
from base64 import b64encode

GPIO.setmode(GPIO.BCM)
enable_pin = 12
coil_A_1_pin = 5
coil_A_2_pin = 6
enable_pin2 = 16
coil_B_1_pin = 20
coil_B_2_pin = 21
GPIO.setup(enable_pin, GPIO.OUT)
GPIO.setup(coil_A_1_pin, GPIO.OUT)
GPIO.setup(coil_A_2_pin, GPIO.OUT)
GPIO.setup(enable_pin2, GPIO.OUT)
GPIO.setup(coil_B_1_pin, GPIO.OUT)
GPIO.setup(coil_B_2_pin, GPIO.OUT)
GPIO.output(enable_pin, 1)
GPIO.output(enable_pin2, 1)

def setWheel(m1, m2):
    GPIO.output(coil_B_1_pin, m1)
    GPIO.output(coil_B_2_pin, m2)

def stopWheel():
    GPIO.output(coil_B_1_pin, 0)
    GPIO.output(coil_B_2_pin, 0)

def setMotor(m1, m2):
    GPIO.output(coil_A_1_pin, m1)
    GPIO.output(coil_A_2_pin, m2)

def keepAlive(websocket):
    while True:
        print("Keep Alive")
        websocket.sendToHCP()
        time.sleep(30)

class IoTServicesClientProtocol(WebSocketClientProtocol):

    def sendToHCP(self):
        # send message of Message Type 1 and the corresponding payload layout that you defined in the IoT Services Cockpit
        self.sendMessage('{"mode":"async", "messageType":"<messagetypeid>", "messages":[{"timestamp":1413191650,"action":"d"}]}'.encode('utf8'))
        print("keep alive")

    def onOpen(self):
        print("connection open")
        try:
            print("start thread")
            thread.start_new_thread(keepAlive, (self, ))
        except:
            print("Error starting thread")

    def onMessage(self, payload, isBinary):
        if not isBinary:
            data = ""
            data = json.loads(format(payload.decode('utf8')))
            try:
                print(data['messages'][0]['action'])
                action = data['messages'][0]['action']
                if action == "f":
                    setMotor(0, 1)
                    stopWheel()
                    #time.sleep(1)
                    #setMotor(0,0)
                elif action == "b":
                    setMotor(1, 0)
                    stopWheel()
                    #time.sleep(1)
                    #setMotor(0,0)
                elif action == "l":
                    setWheel(0, 1)
                elif action == "r":
                    setWheel(1, 0)
                else:
                    setMotor(0, 0)
                    stopWheel()
            except:
                print("alive")

if __name__ == '__main__':
    log.startLogging(sys.stdout)
    parser = OptionParser()
    # interaction for a specific Device instance - replace 1 with your specific Device ID
    parser.add_option("-u", "--url", dest="url", help="The WebSocket URL",
                      default="wss://iotmmsptrial.hanatrial.ondemand.com/com.sap.iotservices.mms/v1/api/ws/data/<deviceid>")
    (options, args) = parser.parse_args()
    # create a WS server factory with our protocol
    factory = WebSocketClientFactory(options.url, debug=False)
    headers = {'Authorization': 'Bearer ' + '<bearertokenid>'}
    # print(headers)
    factory = WebSocketClientFactory(options.url, headers=headers, debug=False)
    factory.protocol = IoTServicesClientProtocol
    # SSL client context: default
    if factory.isSecure:
        contextFactory = ssl.ClientContextFactory()
    else:
        contextFactory = None
    instance = connectWS(factory, contextFactory)
    reactor.run()

I have added the full code as an attachment; just remove ".txt".

 

Demo

 

 

 

Hope you enjoyed the blogs

 

Kind regards,

Wouter

How to use RDMS APIs to create types on HCP IoT Services


Note: I assume basic knowledge of HANA Cloud Platform IoT Services on your part. If you are very new to the concept, this blog series by Aaron Williams - Using the SAP HCP IoT Services should get you started.

 


HANA Cloud Platform IoT services provide an RDMS API to register device types, devices and message types. This RDMS API can be called from any client which supports REST calls over HTTPS. The IoT services also provide a cockpit for creating IoT types (device types, message types and devices), but I have always felt that if we have to create a large number of these types, it is better to do so via APIs.

 

In this post, I have included 2 ways to trigger the RDMS APIs. The first is the simplest: use a REST client of your choice and call the respective URLs with parameters. The second is more exciting: a Java application running on HCP that calls the REST APIs and creates the types.

 

You need an account on HANA Cloud Platform to run these steps. You can get a free developer account on HCP for demo and training purposes. Once the account is created, go to the cockpit: https://account.hanatrial.ondemand.com/cockpit -> Services and enable IoT Services.


Table of Contents

PART 1: Use a REST Client to call RDMS APIs
PART 2: Use a Java Application to call RDMS APIs

PART 1: Use a REST Client to call RDMS APIs

 

The simplest way to call the RDMS API is a REST client in the browser. I used Postman on Google Chrome, and these are the steps I followed:

  1. Open Google Chrome (to install it, see https://www.google.co.in/chrome/browser/desktop/).
  2. Install the Postman app in Chrome via the Chrome Web Store. You can of course use any other client.
  3. Once installed, go to chrome://apps/ and launch Postman.
  4. In the Builder tab, select POST as the request type.
  5. The URL should be the endpoint URL for device types. This can be confusing for HCP application developers, as you have encountered many URLs by this point (HCP cockpit URL, IoT Services cockpit URL, RDMS API URL, ...). There are two ways to get the URL:
    1. The URL looks like this: https://iotrdmsiotservices-<accountname>.hanatrial.ondemand.com/com.sap.iotservices.dms/api/devicetypes. Replace the account name and the host name. This is the URL for my trial account; for a customer/partner account, use the appropriate host.
    2. If you want to copy the URL, go to HCP cockpit -> Services -> Internet of Things Services -> Go to Service. This opens the IoT Services cockpit. Click on the arrow next to the user name and click on About. The section on the Device Management API gives you the required URLs.
  6. Create 2 headers:
    1. Content-Type: application/json
    2. Authorization: Basic Auth with your username and password. Postman provides a separate tab (just below the URL) to input the username and password, encodes this information using Base64, and includes it as part of the header. Don't include the username and password in headers without encoding! (The request won't work anyway, since the IoT service expects the encoded form, and you would also be seriously compromising security.)
  7. Insert the body text below and press Send.

 

{
    "name": "Coffee Machine Device Type"
}

If everything goes fine, you will see a 200 response code with a JSON body that contains the name of the device type, the id of the newly created device type, and a token for future authorization.
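If you prefer scripting this instead of clicking through a REST client, the same call can be made with a few lines of Python. This is only a minimal sketch using the requests library; the URL placeholder, user, and password are assumptions you have to replace with your own trial account values, exactly as described in the steps above.

import requests

# RDMS endpoint for device types - replace <accountname> with your HCP trial account name
url = "https://iotrdmsiotservices-<accountname>.hanatrial.ondemand.com/com.sap.iotservices.dms/api/devicetypes"

# requests builds the Base64-encoded Basic Authorization header from the HCP user and password
response = requests.post(url, json={"name": "Coffee Machine Device Type"}, auth=("<hcp_user>", "<hcp_password>"))

print(response.status_code)   # expect 200
print(response.json())        # contains the device type name, its newly created id and a token for later calls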

 

You can repeat the same steps for creating a message type and device.

 

Parameters for creating Message Types:

  1. URL: https://iotrdmsiotservices-<accountname>.hanatrial.ondemand.com/com.sap.iotservices.dms/api/messagetypes
  2. Headers: You can use Basic Auth for Authorization (as used for creating device types) or use the Authorization token received while creating the device type.
    1. Content-Type: application/json
    2. Authorization: Basic Auth and give username and password.  Or Authorization: Bearer <token>
  3. Body

 

{    "device_type": "<device type id received in response while creating device type>",      "name": "Raw Material Levels",    "direction": "fromDevice",    "fields": [          {              "position": 1,              "name": "MilkLevel",                "type": "double"          },          {                "position": 2,                "name": "WaterLevel",                "type": "double"            }    ]
}

 

Parameters for creating Device:

  1. URL: https://iotrdmsiotservices-<accountname>.hanatrial.ondemand.com/com.sap.iotservices.dms/api/devices
  2. Headers: You can use Basic Auth for Authorization (as used for creating device types) or use the Authorization token received while creating the device type.
    1. Content-Type: application/json
    2. Authorization: Basic Auth and give username and password.  Or Authorization: Bearer <token>
  3. Body:

 

{      "name": "Coffee Machine Device 1",      "device_type": "<device type id received in response while creating device type>"
}

Voila! Let's go to the cockpit and check whether the types were created:

https://iotcockpitiotservices-<accountname>.hanatrial.ondemand.com/com.sap.iotservices.cockpit/#

 

Now you have used the RDMS APIs at least once. But this is still cumbersome if you have to create a large number of device types, message types and devices. So we come to part 2 of the blog post.

 

PART 2: Use a Java Application to call RDMS APIs

 

I created a Java application that uses this API, and the following section shows how the application is built. You can modify the code to insert details for your own device types, devices, and message types, and then run the application.

 

Note & Disclaimer

This is sample code and should not be used productively. It only shows how the APIs can be used and is intended for demo purposes.

 

I have purposely limited the code to calling the RDMS APIs in the background and have not included any fancy UI, for two reasons: to keep the code at its bare minimum while still making it understandable and extendable, and because a fancy UI just adds more clicks, which is what we are trying to get away from (the IoT services already provide a very good UI).

 

Note: The code snippets for all the classes are attached to the blog. Download the attached file, rename it to class_code.zip and extract the zip. The zip contains 4 files, one for each class.

 

Prerequisites:

 

  1. Install Eclipse tools for application development on HCP: https://tools.hana.ondemand.com/#cloud
  2. Download and extract SAP HANA Cloud Platform SDK for Java EE 6 Web Profile: https://tools.hana.ondemand.com/#cloud
  3. Download javax.json.jar to satisfy JSON dependency in the project code: http://www.java2s.com/Code/Jar/j/Downloadjavaxjson10jar.htm
  4. Configure Eclipse workspace for development: https://help.hana.ondemand.com/help/frameset.htm?e815ca4cbb5710148376c549fd74c0db.html This set-up should include these steps:
    1. Set correct JRE (Should point to JVM or JDK)
    2. Set Server Runtime Environment to point to HCP SDK folder
    3. Proxy settings (if required)
    4. Create a server for your HCP account

 

Step 1: Create a Dynamic Web project

 

In case you do not want to start from scratch and are OK with a workaround, import the hello-world application from the samples folder of the HCP SDK and modify the HelloWorldServlet.java class.


We start by creating an empty Dynamic Web project. Give it a project name and select the appropriate Target Runtime (see prerequisite 4.b). Keep the remaining settings at their defaults in the New Project wizard.

 

Step 2: Add library for JSON dependency

 

Once the project is created or imported, create a folder named lib under the WEB-INF folder. Copy javax.json.jar to this lib folder and add it to the build path of the project for local compilation. All JARs from the WEB-INF/lib folder are included during deployment of the application.

 

Step 3: Create a destination

 

Create a folder named destinations under the project root. In it, create a file named iotrdms with no file extension. This file stores the parameters of the destination required for this project.

 

Content of iotrdms file:

 

Description=IoT RDMS API
Type=HTTP
Authentication=BasicAuthentication
Name=iotrdms
CloudConnectorVersion=2
ProxyType=Internet
URL=https://iotrdmsiotservices-<accountname>.<hostname>/com.sap.iotservices.dms/api
User=

Set the user name and URL correctly. For example, for a user P123456789 on a trial account (where the account name is normally p123456789trial), the parameters would look something like this:
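Description=IoT RDMS API
Type=HTTP
Authentication=BasicAuthentication
Name=iotrdms
CloudConnectorVersion=2
ProxyType=Internet
URL=https://iotrdmsiotservices-p123456789trial.hanatrial.ondemand.com/com.sap.iotservices.dms/api
User=P123456789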

 

Step 4: Create Servlet Class: IoTRDMSServlet.java

 

Create a Java class in the source folder with the name IoTRDMSServlet and the package com.sap.cloud.iot.rdms, and copy the code from IoTRDMSServlet.java.txt into the class.

 

Code for the IoTRDMSServlet.java file:

 

package com.sap.cloud.iot.rdms;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.StringReader;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.ArrayList;
import java.util.Iterator;
import javax.json.Json;
import javax.json.JsonObject;
import javax.json.JsonReader;
import javax.json.JsonValue;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.sap.cloud.iot.rdms.model.Device;
import com.sap.cloud.iot.rdms.model.DeviceType;
import com.sap.cloud.iot.rdms.model.MessageType;
import com.sap.core.connectivity.api.configuration.ConnectivityConfiguration;
import com.sap.core.connectivity.api.configuration.DestinationConfiguration;
public class IoTRDMSServlet extends HttpServlet {

    private static final long serialVersionUID = 1L;
    private static final String MESSAGE_TYPE_DIRECTION_FROM_DEVICE = "fromDevice";
    private static final String DOUBLE_TYPE = "double";
    private static final String STRING_TYPE = "string";

    private static ArrayList<DeviceType> deviceTypes = new ArrayList<DeviceType>();
    private static ArrayList<Device> devices = new ArrayList<Device>();
    private static ArrayList<MessageType> messageTypes = new ArrayList<MessageType>();

    private void initData() {
        // 1st device type
        {
            DeviceType dt = new DeviceType("Coffee Machine Type");
            deviceTypes.add(dt);

            MessageType mt1 = new MessageType(dt, "Raw Material Levels", MESSAGE_TYPE_DIRECTION_FROM_DEVICE);
            mt1.createField("1", "MilkLevel", DOUBLE_TYPE);
            mt1.createField("2", "WaterLevel", DOUBLE_TYPE);
            messageTypes.add(mt1);

            MessageType mt2 = new MessageType(dt, "Power Status", MESSAGE_TYPE_DIRECTION_FROM_DEVICE);
            mt2.createField("1", "PowerOn", STRING_TYPE);
            messageTypes.add(mt2);

            devices.add(new Device(dt, "Coffee Machine 1"));
            devices.add(new Device(dt, "Coffee Machine 2"));
            devices.add(new Device(dt, "Coffee Machine 3"));
            devices.add(new Device(dt, "Coffee Machine 4"));
            devices.add(new Device(dt, "Coffee Machine 5"));
        }
        // 2nd device type
        {
            DeviceType dt = new DeviceType("Mars Rover Type");
            deviceTypes.add(dt);

            MessageType mt1 = new MessageType(dt, "Position", MESSAGE_TYPE_DIRECTION_FROM_DEVICE);
            mt1.createField("1", "Latitude", DOUBLE_TYPE);
            mt1.createField("2", "Longitude", DOUBLE_TYPE);
            messageTypes.add(mt1);

            devices.add(new Device(dt, "Mars Rover 1"));
        }
        // and you can create as many types as you want, but be careful to use the
        // correct device type while creating message types and devices
    }

    /** {@inheritDoc} */
    @Override
    public void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        initData();
        response.getWriter().println("<p>initializing configuration!</p>");
        DestinationConfiguration destinationConfiguration = getDestinationConfiguration();
        if (destinationConfiguration == null) {
            throw new ServletException("Failed to establish connectivity to the destination.");
        }
        registerDeviceTypes(response, destinationConfiguration);
        registerMessageTypes(response, destinationConfiguration);
        registerDevices(response, destinationConfiguration);
    }

    private void registerMessageTypes(HttpServletResponse response, DestinationConfiguration destinationConfiguration)
            throws IOException {
        String destinationURL = destinationConfiguration.getProperty("URL");
        URL messageTypesURL;
        try {
            messageTypesURL = new URL(destinationURL + "/messagetypes");
        } catch (MalformedURLException e) {
            throw new MalformedURLException("Failed to build a HTTP URL for creating message types request.");
        }
        for (Iterator<MessageType> iterator = messageTypes.iterator(); iterator.hasNext();) {
            MessageType mtobject = (MessageType) iterator.next();
            String payload = mtobject.getJsonFormattedString();
            sendRestRequest(response, destinationConfiguration, payload, messageTypesURL);
            response.getWriter().println("<p>" + payload + " json body sent</p>");
            response.getWriter().println("<p>" + mtobject.getName() + " created successfully</p>");
        }
    }

    private void registerDeviceTypes(HttpServletResponse response, DestinationConfiguration destinationConfiguration)
            throws IOException {
        String destinationURL = destinationConfiguration.getProperty("URL");
        URL deviceTypesURL;
        try {
            deviceTypesURL = new URL(destinationURL + "/devicetypes");
        } catch (MalformedURLException e) {
            throw new MalformedURLException("Failed to build a HTTP URL for creating device types request.");
        }
        for (Iterator<DeviceType> iterator = deviceTypes.iterator(); iterator.hasNext();) {
            DeviceType dtobject = (DeviceType) iterator.next();
            String payload = dtobject.getJsonFormattedString();
            response.getWriter().println("<p>" + payload + " json body sent</p>");
            String restResponse = sendRestRequest(response, destinationConfiguration, payload, deviceTypesURL);
            response.getWriter().println("<p>" + dtobject.getName() + " created successfully</p>");
            response.getWriter().println(restResponse);
            // remember the generated id; it is needed when creating message types and devices
            JsonReader reader = Json.createReader(new StringReader(restResponse));
            JsonObject readObject = reader.readObject();
            JsonValue deviceTypeId = readObject.get("id");
            dtobject.setId(deviceTypeId.toString());
        }
    }

    private void registerDevices(HttpServletResponse response, DestinationConfiguration destinationConfiguration)
            throws IOException {
        String destinationURL = destinationConfiguration.getProperty("URL");
        URL devicesURL;
        try {
            devicesURL = new URL(destinationURL + "/devices");
        } catch (MalformedURLException e) {
            throw new MalformedURLException("Failed to build a HTTP URL for creating devices request.");
        }
        for (Iterator<Device> iterator = devices.iterator(); iterator.hasNext();) {
            Device device = (Device) iterator.next();
            String payload = device.getJsonFormattedString();
            sendRestRequest(response, destinationConfiguration, payload, devicesURL);
            response.getWriter().println("<p>" + payload + " json body sent</p>");
            response.getWriter().println("<p>" + device.getName() + " created successfully</p>");
        }
    }

    private String sendRestRequest(HttpServletResponse response, DestinationConfiguration destinationConfiguration,
            String payload, URL requestURL) throws IOException {
        HttpURLConnection urlConnection;
        try {
            urlConnection = (HttpURLConnection) requestURL.openConnection();
        } catch (IOException e) {
            throw new IOException("Failed to open a HTTP URL connection to the destination.", e);
        }
        // Basic authentication with the credentials maintained in the destination
        String user = destinationConfiguration.getProperty("User");
        String password = destinationConfiguration.getProperty("Password");
        String base64 = new sun.misc.BASE64Encoder().encode((user + ":" + password).getBytes());
        urlConnection.setRequestProperty("Authorization", "Basic " + base64);
        // prepare for HTTP POST
        urlConnection.setDoOutput(true);
        urlConnection.setDoInput(true);
        urlConnection.setUseCaches(false);
        urlConnection.setRequestMethod("POST");
        urlConnection.setRequestProperty("Content-Type", "application/json" + ";charset=UTF-8");
        OutputStream outputStream = urlConnection.getOutputStream();
        OutputStreamWriter osw = new OutputStreamWriter(outputStream);
        BufferedWriter bw = new BufferedWriter(osw);
        bw.write(payload);
        bw.flush();
        bw.close();
        String responseMessage = urlConnection.getResponseMessage();
        InputStream inputStream = urlConnection.getInputStream();
        InputStreamReader isr = new InputStreamReader(inputStream);
        BufferedReader br = new BufferedReader(isr);
        responseMessage = br.readLine();
        urlConnection.disconnect();
        return responseMessage;
    }

    private DestinationConfiguration getDestinationConfiguration() throws ServletException {
        DestinationConfiguration destinationConfiguration = null;
        try {
            InitialContext initialContext = new InitialContext();
            ConnectivityConfiguration connectivityConfiguration = (ConnectivityConfiguration) initialContext
                    .lookup("java:comp/env/connectivityConfiguration");
            destinationConfiguration = connectivityConfiguration.getConfiguration("iotrdms");
        } catch (NamingException e) {
            throw new ServletException("connectivity configuration failed");
        }
        return destinationConfiguration;
    }
}

 

 

Step 5: Create Data classes

 

Create a new package named com.sap.cloud.iot.rdms.model and copy these 3 classes into this package – Device.java, DeviceType.java, MessageType.java.


These classes are used to initialize the data before triggering REST APIs to create these types on the cloud.


Code for MessageType.java:


package com.sap.cloud.iot.rdms.model;
import java.util.ArrayList;
import java.util.Iterator;
import javax.json.Json;
import javax.json.JsonArrayBuilder;
import javax.json.JsonObject;
import javax.json.JsonObjectBuilder;
public class MessageType {

    private DeviceType deviceType;
    private String name;
    private String direction;
    private ArrayList<MessageTypeField> fieldArray = new ArrayList<MessageTypeField>();

    public MessageType(DeviceType deviceType, String name, String direction) {
        super();
        this.deviceType = deviceType;
        this.name = name;
        this.direction = direction;
    }

    public void createField(String position, String name, String type) {
        MessageTypeField messageTypeField = new MessageTypeField(position, name, type);
        fieldArray.add(messageTypeField);
    }

    public String getJsonFormattedString() {
        JsonObjectBuilder objBuilder = Json.createObjectBuilder();
        objBuilder.add("device_type", deviceType.getId());
        objBuilder.add("name", name);
        objBuilder.add("direction", direction);
        JsonArrayBuilder arrayBuilder = Json.createArrayBuilder();
        for (Iterator<MessageTypeField> iterator = fieldArray.iterator(); iterator.hasNext();) {
            MessageTypeField messageTypeField = (MessageTypeField) iterator.next();
            arrayBuilder.add(messageTypeField.getJsonObject());
        }
        objBuilder.add("fields", arrayBuilder);
        JsonObject build = objBuilder.build();
        String payload = build.toString();
        return payload;
    }

    private class MessageTypeField {

        private String position;
        private String name;
        private String type;

        MessageTypeField(String position, String name, String type) {
            super();
            this.position = position;
            this.name = name;
            this.type = type;
        }

        public JsonObject getJsonObject() {
            JsonObjectBuilder objBuilder = Json.createObjectBuilder();
            objBuilder.add("position", position);
            objBuilder.add("name", name);
            objBuilder.add("type", type);
            JsonObject build = objBuilder.build();
            return build;
        }
    }

    public String getName() {
        return name;
    }
}


Code for DeviceType.java:


package com.sap.cloud.iot.rdms.model;
import javax.json.Json;
import javax.json.JsonObject;
import javax.json.JsonObjectBuilder;
public class DeviceType {

    private String id;
    private String name;

    public DeviceType(String name) {
        super();
        this.name = name;
    }

    public String getJsonFormattedString() {
        JsonObjectBuilder objBuilder = Json.createObjectBuilder();
        objBuilder.add("name", name);
        JsonObject build = objBuilder.build();
        String payload = build.toString();
        return payload;
    }

    public String getId() {
        return id;
    }

    public String getName() {
        return name;
    }

    public void setId(String deviceTypeId) {
        id = deviceTypeId;
    }
}

Code for Device.java:

 

package com.sap.cloud.iot.rdms.model;
import javax.json.Json;
import javax.json.JsonObject;
import javax.json.JsonObjectBuilder;
public class Device {

    private String name;
    private DeviceType deviceType;

    public Device(DeviceType deviceType, String name) {
        super();
        this.deviceType = deviceType;
        this.name = name;
    }

    public String getJsonFormattedString() {
        JsonObjectBuilder objBuilder = Json.createObjectBuilder();
        objBuilder.add("device_type", deviceType.getId());
        objBuilder.add("name", name);
        JsonObject build = objBuilder.build();
        String payload = build.toString();
        return payload;
    }

    public String getName() {
        return name;
    }
}

Step 6: Create deployment descriptor file: web.xml

 

Create web.xml file under <Project> -> WebContent -> WEB-INF folder.

 

Content for web.xml file:

 

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://java.sun.com/xml/ns/javaee"
    xmlns:web="http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd"
    xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd"
    id="WebApp_ID" version="2.5">
    <servlet>
        <servlet-name>IoTRDMSServlet</servlet-name>
        <servlet-class>com.sap.cloud.iot.rdms.IoTRDMSServlet</servlet-class>
    </servlet>
    <servlet-mapping>
        <servlet-name>IoTRDMSServlet</servlet-name>
        <url-pattern>/</url-pattern>
    </servlet-mapping>
    <resource-ref>
        <res-ref-name>connectivityConfiguration</res-ref-name>
        <res-type>com.sap.core.connectivity.api.configuration.ConnectivityConfiguration</res-type>
    </resource-ref>
</web-app>

 

Step 7: Deploy the application

 

Now deploy the application on the server that we defined in prerequisite 4. If you are unsure how to do this, refer to the Java Hello World tutorial section on Deploy, Run and Test the Application in the Cloud.


Once the application is successfully deployed, import the destination that we defined in Step 3. More information on what destinations are and how to configure them via Eclipse, the cockpit, and the console is available here.

 

If the application was already started before you created the destination, stop and start the application again.

Step 8: Run the application

 

Once the application starts, open the application URL from the cockpit. If everything is specified correctly, you will see a log of the device types, devices, and message types that were created. You can also check the IoT Services Cockpit to see whether the types were created correctly.

 

If you see some errors or some types do not get generated, then check if you have followed all the steps mentioned above.

 

Step 9: Extend the application for your data

 

The servlet class creates 2 device types, 3 message types, and 6 devices by default. If you need to create different or additional types, modify the initData() method and run the application again.

 

Hope this post helps you in creating your own application!

Use Cloud for Analytics to get Real-Time Insights from IoT Devices


The Internet of Things (IoT) is revolutionizing business, generating revenue and improving customer experiences by creating a connected world. SAP HANA Cloud Platform IoT Services makes it easier to connect any sensor or device with any app or business process in a company or business network and to perform real-time predictive analysis to improve intelligence and decision-making. According to Gartner, the widespread adoption of the Internet of Things is driving the utilization of PaaS.

 

 

If you are looking to quickly set up and test an IoT service on your HCP trial account using the sensors within your phone, I would refer you to this good blog by Patrick Colucci. It took me only two minutes to set it up and stream sensor data to my HCP trial account.

 

There is also a good series of blog posts by Rui Nogueira where he outlines the steps to enable a Raspberry Pi to send sensor data to your HCP trial account. I would highly recommend these blogs to get started with experimenting with the IoT service on your trial account for free.

 

In this blog, I am going to show how we can use Cloud for Analytics to create great visualizations based on the IoT sensor data. This blog is not a step-by-step guide like my previous ones. It is intended to stir your thoughts by showing how you can leverage other components of HCP to mash up your IoT data with your business data and create great visualizations, thereby improving your business processes.

Create a new HANA DB

 

Once you are logged into your HCP account, select "Database & Schemas" from the cockpit menu. Click the "New" button to create a new MDC (multitenant database container) based HANA database.

 

 

Provide the Database ID as "hcpta" and select HANA MDC as the database system type. It will take a few seconds to create the database.

 

 

Once your HANA DB is ready and started, click on the below link to navigate to the HANA Cockpit with the system user and explore the system information. You may create user accounts and assign proper privileges which can be used later for other activities.

 

 

Update the bindings for the IoT service


In the start-up blogs referenced above, you would have deployed your MMS. The Message Management Service (MMS) is the component that is responsible for receiving data from devices and sending messages to devices.

 

Navigate to the Java application and click on “iotmms”

 

 

Under "Data source binding" you will see a HANA <shared> schema assigned. This shared schema is used to store all the data that the MMS receives. It is not a full-blown HANA instance, as it doesn't give us the capability to develop XS applications on top of it.

 

 

Delete the binding. Note that all the tables used earlier will no longer be accessible from the IoT services; you will have a fresh DB assigned to your IoT service. Assign the new HANA MDC database as shown below.

 

 

Now you have a full-fledged HANA DB assigned to your IoT service. You can now go ahead and start building your HANA XS applications on top of it.

 

Model your HANA Views from heterogeneous sources


Quite often we see requirements where the data captured in IoT tables is not of much value unless you augment it with other business data that is available in your system of record. This is where the HANA Cloud Platform comes into play by providing additional services that enable customers to bring their data from on-premise or cloud solutions to the HANA Cloud Platform and create mashups.

 

 

 

 

The HANA Cloud Connector is an important piece when it comes to connecting the on-premise SAP backend system with HCP. The Cloud Connector runs as an on-premise agent in a secured network and acts as a reverse invoke proxy between the on-premise network and SAP HANA Cloud Platform. Customers can use HCI-DS (HANA Cloud Integration – Data Services) or SDI (Smart Data Integration) to provision the HANA tables which are needed for reporting purposes. This could be master data or other transactional data required for your reports.

 

In the next step, I am going to show how you can access these IoT tables (available on HCP) using Eclipse/HANA Studio. I am using Eclipse Luna and have installed all the relevant HANA plugins.

 

Add a new Cloud system as shown below

 

 

Provide the Account name, User name and your password

 

 

The next screen will ask you to select a schema or database. Select the database you created earlier as shown below and provide the SYSTEM credentials.

 

 

You will be able to see all the Schemas available in the new database.

 

 

After you have set up and configured the IoT service, you will be able to stream data to the HCP IoT service. You will be able to see the data stored in individual tables in the MMS service as shown below.

 

 

You can now view the same tables within Eclipse too. If your IoT tables are growing large, you can use SQL commands here to delete their contents.

 

 

In the image below, I have shown an example where there is another schema called "ERP" which holds tables related to the trucks used for the fish import/export business (based on the scenario described by Rui in his blog posts). As mentioned earlier, we could use replication techniques like HCI-DS or SDI to replicate the required data to these tables from the backend system. We can then build views on top of these tables (both from IoT and the backend ERP) which can be used for reporting purposes.

 

 

There is another blog posted by Nash Gajic where he explains how to create HANA tables and views and consume them in Cloud for Analytics.

 

Create Mashups using Cloud for Analytics


I am going to take Rui's fish export/import business and show how we can create visualizations for the IoT data. Staff at the head office need to be able to monitor the temperature of the fish boxes in each truck constantly: the earlier they detect a problem, the higher the chances that they can still use the fish. Below is a report which shows where each of their trucks is and uses color coding to instantly tell which trucks have a problem. According to the report below, two trucks seem to have a problem – one in the state of Victoria and the other in Queensland.

 

 

The staff can then drill down to view more details of the particular truck that is reported to have a problem. This is where the data gets mashed up with the ERP system to give more information, such as who the truck driver is today and what the truck's load is. The graph below shows the temperature of the fish box over the last five hours and also has details on the outside temperature. In this case, the staff would be able to see that the temperature of the fish box rose from -15 degrees to 2 degrees in the last two hours.

 

They can also access all the information required to contact the truck driver and fix the problem instantly.

 

 

They can also view reports that show the overall operating cost of the refrigeration system that is having the problem. If the costs of repair and maintenance are too high, it is time to replace it.

 

 

With the complete maintenance data for the truck and refrigeration system in the backend ERP system, it is now possible to build reports in Cloud for Analytics which can be used to predict failures. By now you should have a good understanding of how you can use SAP HANA Cloud Platform's IoT services and Cloud for Analytics to create great and meaningful visualizations.

HANA Cloud Platform Security makes Amazon Echo enterprise ready


Hi all HCP enthusiasts and tech geeks, how are you doing? Hope you all are rocking the technology world!

 

We, as part of the regional HCP team in Singapore, were experimenting with HANA Cloud Platform, and we leveraged HCP's world-class interoperability to integrate with other technology vendors.

 

Our team has built a 'technology demonstrator' to integrate health APIs from leading fitness wearable vendors. After homogenization of the data, we added gamification concepts and an HTML5 dashboard for users. This has been a key showcase of how we can use HCP as a data aggregator and as a presentation and analytics layer. It was also a perfect example of how personal devices can be integrated for enterprise use. We call this internal program FitSAP, and it has been adopted as part of our corporate wellness program. More on this later.

 

 

After the success of the HCP-based prototype as a data consumer, we wanted to showcase the security and UX capabilities of SAP HCP. For inspiration, we found a cool integration between SAP HANA and Amazon Echo, shown during TechEd 2015 and in this blog: Amazon Alexa and SAP HANA.

 

Cool Apps should not make a breach in Enterprise Security

 

I have also seen several cool demos and PoCs involving HCP and Amazon Echo. All of them are great as concepts, but they are not 'enterprise ready'. Once you have linked your Amazon account with an HCP/SAP/any enterprise account, anybody can impersonate the user to extract enterprise information from Amazon Echo. This security loophole is a major roadblock for Amazon Echo being used as the next-generation UX centerpiece. This vulnerability has been highlighted in other scenarios as well; watch this YouTube video to make yourself aware: Compromising Connected Home Security with Amazon Echo and the Insteon ISY-994i - YouTube


To counter this limitation of Amazon Echo in an enterprise context, we developed a security framework. This framework manages the lifecycle of every interaction from Amazon to our SAP service providers. The framework is called (for this blog) 'SecureHCPListener4Alexa' and has been built using the toolsets provided by HCP. SecureHCPListener4Alexa introduces two-factor authentication on top of the usual OAuth-based security model.



Account Linking: Basics of Amazon to HCP integration


An Amazon Alexa skill (a set of voice intents to be handled via Amazon Echo) was created on the Amazon developer site. On this Alexa skill we deployed the utterances required for our use case.

AlexaSkill.png


The next step was to link the Amazon account (US based). The process can be found at Linking an Alexa User with a User in Your System - Amazon Apps & Services Developer Portal.


To link the Amazon and HCP accounts, we needed to provide an OAuth token from HCP to Amazon. This is a user-specific action, as it authorizes the user's Amazon account to access resources protected by the user's HCP account. We configured an OAuth provider in HCP and provided the necessary OAuth endpoints in the Amazon skill settings. The OAuth endpoints and client configuration can be maintained in the HCP cockpit on the OAuth tab. As you know, the UI and token endpoints are authenticated by Cloud Identity, hence the account linking procedure is also secured.

 

If you do not know where to get the OAuth token in HCP, go to the OAuth tab in the HCP cockpit. For your benefit, I am adding the OAuth settings screenshots.

 

Oauth-URL.png

OAuth-client.png

 

During the account linking process, the Alexa app will invoke the OAuth token UI, which looks something like this. It is possible for the end user to delete a token once it has been generated.

 

OAuth UI.png

After the OAuth token has been acquired by the Amazon Alexa service, every inbound request from Amazon Alexa to the HCP endpoint will contain this OAuth token. OAuth tokens are user specific and authorize Amazon Alexa to invoke HCP services on behalf of the user's HCP account. As the OAuth token generation UI is itself authenticated by Cloud Identity, the one-time token generation process is also very secure.

 

The user interaction of the account linking process has been captured on the back of our 'enterprise' napkin:


doodle1.png

Introducing Cloud Identity based  2FA: Enterprise readiness of Amazon Echo

Once the account has been linked, Amazon Echo notifies the user about the success of the 'account linking'. This is a critical step for any Amazon Echo <-> HCP interaction and has to be completed in any demo of Amazon Echo with HCP.

 

The OAuth token exchange between HCP and the Amazon Alexa services is the authentication mechanism used by all HCP+Amazon demos. We believe this is not secure enough, as Amazon cannot distinguish between the voice of a 'true' user and that of an impersonator.

 

So to make voice-based interaction through Amazon Echo 'enterprise ready', we implemented two-factor authentication (2FA) in our 'SecureHCPListener4Alexa'.

After successful account linking, users can trigger HCP by uttering "Ask HCP". SecureHCPListener4Alexa monitors the lifecycle of the user's session. If the user is initiating a session for the first time, or after a specified duration of idle time, the service prompts the user, through Amazon Echo, to provide a time-based one-time password (t-OTP). We have built an HTML5 app to deliver the OTP to the user. As this app is also protected by SAP Cloud Identity, users need to be authenticated to access the user-dependent t-OTP generator via mobile or desktop.

 

OTP.png

The interaction between Users, Amazon Echo, HCP and Mobile app is described in the following interaction diagram.

doodle2.png


As you can see, we are security-paranoid people, so we did not just stop here. Not only do we validate the t-OTP in SecureHCPListener4Alexa, we also check whether the user has provided a wrong OTP several times in a row. In case of successive wrong OTPs, the service can block the user for a certain period of time.
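To make the idea a bit more concrete, here is a minimal sketch of how such a time-based OTP check with a retry limit could look on the listener side. It uses the pyotp library, and all names and limits (verify_otp, MAX_ATTEMPTS, the in-memory dictionaries) are purely illustrative assumptions; the actual SecureHCPListener4Alexa implementation is not published here.

import time
import pyotp

MAX_ATTEMPTS = 3         # illustrative values, not the ones used in the real service
LOCKOUT_SECONDS = 300

failed_attempts = {}     # user id -> number of consecutive wrong OTPs (in-memory for this sketch only)
locked_until = {}        # user id -> timestamp until which the user stays blocked

def verify_otp(user_id, spoken_code, shared_secret):
    """Return True if the spoken OTP is valid and the user is not locked out."""
    now = time.time()
    if locked_until.get(user_id, 0) > now:
        return False                                   # still blocked after too many wrong codes

    totp = pyotp.TOTP(shared_secret)
    if totp.verify(spoken_code, valid_window=1):       # tolerate one 30-second step of clock drift
        failed_attempts[user_id] = 0
        return True

    failed_attempts[user_id] = failed_attempts.get(user_id, 0) + 1
    if failed_attempts[user_id] >= MAX_ATTEMPTS:
        locked_until[user_id] = now + LOCKOUT_SECONDS  # block the user for a while
    return False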

 

There is also a provision for users to 'log out' from HCP services via Amazon Echo. In that case, SecureHCPListener4Alexa invalidates the existing session and prompts the user for an OTP during the next session.

 

We have made a good-guy/bad-guy doodle to illustrate this.

 

doodle3.png

 

 

When the valid user returns, he can restart an authenticated session using the OTP app on his mobile device.

 

For an impersonator without the secure mobile device, logging in will be impossible.

 

doodle4.png

 

Once users are authenticated, the SecureHCPListener4Alexa service allows them to consume enterprise data from multiple data providers. The data providers can be on-premise or cloud-based systems, and HCP supports both SAP and non-SAP data providers.

 

Architecture

For your reference, you can use the following architecture to build similar services -

Alexa_Archit.png

 

With this scalable and extendable application architecture, we could securely access various on-premise and cloud-based business systems (e.g. CRM, the HCP-based FitSAP, etc.).

 

Hope this helps, and that you will keep it in mind when you build your voice-based UX on enterprise systems.


Acknowledgements: The developers who put their hearts behind this demonstration are Gunter Gliem and Jonathan Kuch.

Develop Visualizations Using SAP HANA Cloud Platform And SAP Lumira


Introduction


This document will brief us on how to develop a native XS application using SAP HANA Cloud Platform. We will also see how to connect SAP Lumira Desktop with SAP HANA Cloud Platform and develop visualizations on the data stored in SAP HANA Cloud.


In this document, we will go through the following steps to create visualizations:


Steps:


  1. Connect to SAP HANA Cloud Platform from the Eclipse IDE.
  2. Create an XS project and store it in the SAP HANA repository.
  3. Create a calculation view on the SAP HANA Cloud Platform data using the SAP HANA Modeler.
  4. Create a DB tunnel and connect SAP Lumira Desktop to SAP HANA Cloud Platform using the DB tunnel credentials.
  5. Once SAP Lumira is connected to SAP HANA Cloud, analyze the HANA Cloud Platform data by creating visualizations.


First, we need to install the Eclipse IDE (version Mars or Luna) and the SAP HANA Tools for Eclipse.


Procedure to install SAP HANA Tools for Eclipse.

    

     1.    Open Eclipse IDE. From the menu bar, select Help->Install New Software.

     2.    Select the whole group SAP HANA Tools.

     3.    Click Next and Finish the wizard.


screenshot.jpg

 


s21.jpg


 

Create a SAP HANA instance

 

     1.     First we have to register and login to SAP HANA Cloud Cockpit using the given link.

          https://account.hanatrial.ondemand.com/cockpit

    2.    We need to select "Database and Schemas" to create a new trial instance of the SAP HANA database.

    3.    Give the Schema ID name and select Database System and save the entries.


Connect to SAP HANA Cloud DB from Eclipse IDE


  1.   Open SAP HANA Development Perspective and choose Add Cloud System.
  2.   We need to enter our SAP HANA Cloud trial account details.
  3.   Select the same Trial Instance of SAP HANA DB.

 

s22.jpg


s24.jpg

Create an XS Project in Eclipse IDE


     1.    Open the Project Explorer and select File->New Project. Select XS Project.

     2.    Select “Share Project in SAP Repository” checkbox.

     3.    Then, create Repository Workspace that will also hold our XS project.


s25.jpg


 

Reading CSV File in SAP HANA Cloud Platform


     1.    In the SAP HANA Cloud Platform cockpit, select Database and Schemas, select the schema ID you created, and go to Development Tools: SAP HANA Web-based Development Workbench.

     2.    In the project explorer, create a new package and select File->New->File. Enter the file name .xsapp and choose Finish.

     3.    Create one more file with the name .xsaccess. Write the following code in the file and save it.

{
    "exposed" : true,
    "default_file" : "hello.xsjs"
}

 

     4.    Create hello.xsjs file and write the following code.

 

$.response.contentType = "text/html";
var output = "Hello, " + $.session.getUsername() + " <br><br>";
var conn = $.db.getConnection();
var pstmt = conn.prepareStatement("SELECT CURRENT_USER FROM DUMMY");
var rs = pstmt.executeQuery();
if (!rs.next()) {
    $.response.setBody("Failed to retrieve data");
    $.response.status = $.net.http.INTERNAL_SERVER_ERROR;
} else {
    output = output + "This is the response from my SQL. The current user is: " + rs.getString(1);
}
rs.close();
pstmt.close();
conn.close();
$.response.setBody(output);

 

     5.    Right-click on the package and go to File->Import. Import the CSV file, e.g. bus.csv.

     6.    Create a new file with the name bus.hdbti and write the following code in it:

 

import = [
{
    table = "s0009779955trial.hana::mymodel.busfinal";
    schema = "_SYS_BIC";
    file = "s0009779955trial.hana:bus.csv";
    header = false;
}];

 

     7.    Create one more file with the name mymodel.hdbdd and write the following code in it:

namespace s0009779955trial.hana;

@Schema: '_SYS_BIC'
context mymodel {

    type SString : String(60);

    @Catalog.tableType: #COLUMN
    @nokey Entity busfinal {
        busno       : Integer;
        source      : SString;
        destination : SString;
        arrival     : SString;
        departure   : SString;
        distance    : Integer;
        route       : SString;
        Seatno      : Integer;
    };
}

 

The column names in the code above should be the same as those used in the CSV file.

 

     8.    We need to activate the files by selecting Team->Activate.

 

Note: Use Quick fix to change the encoding of individual files to UTF-8.


s27.jpg

 

 

Create Calculation View

 

     1.    In the repository package, create a calculation view of type "Graphical".

     2.    In the scenario editor, add a Join node.

     3.    In the Join node, select and drop the tables from the schema that you want to join.

     4.    Select the columns that you want to show in the calculation view and choose "Add to Output".

     5.    In the scenario editor, link the Join node with the Aggregation node.

     6.    Select the Aggregation node and add all the columns to the output.

     7.    Select the Semantics node and deselect the "Enable Analytic Privilege" checkbox in the Details pane.

     8.    Save and activate the view.

 

 

The user needs to be granted SELECT privileges. This is done by calling a procedure with the following statement in the SQL console:

CALL "HCP"."HCP_GRANT_SELECT_ON_ACTIVATED_OBJECTS"


Refresh the Catalog folder; the generated calculation view can be seen in the _SYS_BIC schema.


We can connect “SAP Lumira Desktop” with SAP HANA Cloud Platform by creating a DB tunnel.


We need to run the following command at the command prompt:


neo open-db-tunnel -a <account_name> -h <landscape_host> -u <user> -i <schema_ID>

<landscape_host>   = hanatrial.ondemand.com


This command will give us the password to connect to SAP HANA Cloud Platform.
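Before starting Lumira, you can optionally verify the tunnel with a small script. This is only a sketch under a few assumptions: the hdbcli Python client for SAP HANA is installed, the tunnel is listening on localhost with the port printed by the neo command (often 30015), and <user>/<password> are the credentials the tunnel returned.

from hdbcli import dbapi

# connect through the open DB tunnel (host and port as printed by "neo open-db-tunnel")
conn = dbapi.connect(address="localhost", port=30015, user="<user>", password="<password>")

cursor = conn.cursor()
cursor.execute("SELECT VIEW_NAME FROM SYS.VIEWS WHERE SCHEMA_NAME = '_SYS_BIC'")
for row in cursor.fetchall():
    print(row[0])          # the activated calculation view should show up in this list

cursor.close()
conn.close()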


Once SAP Lumira is started, we can connect to SAP HANA Cloud Platform using the DB tunnel credentials and see the calculation view created in SAP HANA Cloud.


We can then build visualizations in SAP Lumira using the SAP HANA Cloud Platform data and analyze it accordingly.


Hope you liked reading my blog.


Please do not forget to provide your valuable feedback and responses.


Thanks & Regards,

Saurabh Raheja

