
[Video] Getting up and running with HPE Haven OnDemand


In the video above, Getting up and running with HPE Haven OnDemand, we covered an overview of the platform, the APIs, and where to get help. This article focuses on the coding tutorial, so you can easily follow along and create the app yourself, and provides links to our GitHub pages and social media accounts.


Our GitHub Page and Wrappers

Check out our GitHub page here, which houses all of our wrappers.


We have wrappers for all of the major languages and platforms - iOS, Android, Windows Mobile, Ruby, Node.js, PHP, and Python - for rapid integration into your project with minimal coding.


Coding Tutorial - using Node.js

Completed code here.


Essentials you’ll need to create this app:

  1. Node.js installed on your computer
  2. Haven OnDemand account - to perform analysis of text messages
  3. Twilio account - to deliver text messages to the webhook
  4. ngrok - to receive POST requests on local computer from external APIs


To help illustrate how Haven OnDemand’s powerful Text Analysis APIs can be used, we’re going to create a Node.js app from scratch. It will receive a text message via Twilio’s webhook service, analyze the sentiment using our Analyze Sentiment API, extract any key concepts using our Concept Extraction API, extract any entities (famous people, notable places, companies, organizations) using our Entity Extraction API, and then print all of this information to the console.


First, open up your terminal and create a new directory called ‘video_workshop’:


mkdir video_workshop


and ‘cd’ into it:


cd video_workshop


Then, go ahead and install express, a Node.js web framework, body-parser, a library to extract values from POST requests to the server, and Haven OnDemand’s Node.js client library using the following commands:


npm install --save express
npm install --save body-parser
npm install --save havenondemand
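

After those installs, the dependencies section of your package.json should list all three modules. The exact version numbers will vary; the ones below are placeholders:

```
"dependencies": {
  "body-parser": "1.x",
  "express": "4.x",
  "havenondemand": "1.x"
}
```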


Now, let’s create the main file in which we will write all of our code:


touch index.js


Go ahead and open up the directory in your favorite text editor. I use Atom in this example:


atom .


Open up the ‘index.js’ file and let’s start coding!


First, let’s include express and the built-in ‘http’ module to create the server:


express = require('express')
app = express()
http = require('http').Server(app)


Next, let’s include the body-parser to extract values from POST requests which will come from the text messages:


bodyParser = require('body-parser')


Because the API that delivers the contents of the text message sends it as form-urlencoded data, add the following line below the previous one:


urlencoded = bodyParser.urlencoded({extended: false})
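

For context, Twilio’s webhook POSTs the message as standard form-urlencoded fields; a simplified, hypothetical payload looks like this, where ‘Body’ is the field our endpoint will read (phone numbers are made up):

```
Body=Hello%20world&From=%2B15550001111&To=%2B15552223333
```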


Then, let’s include the meat and potatoes of the app, Haven OnDemand, filling in your API key - which can be found here - where it says ‘API_KEY’:


havenondemand = require('havenondemand')
client = new havenondemand.HODClient('', 'API_KEY')


Next, let’s add the port so the app knows which one to listen to:


port = process.env.PORT || 5000


The contents of the text message will come in through a POST request, so let’s set up an endpoint at ‘/text_processor’ to accept it:


app.post('/text_processor', urlencoded, function(req, res) {
  // stuff will go here soon!
})


Now, when a text message is sent, Twilio will POST its contents to this endpoint. When it comes in, let’s store what’s written in the SMS in a ‘text’ variable and create some objects which will be used when we POST to Haven OnDemand using the client library. Amend the previous code to this:


app.post('/text_processor', urlencoded, function(req, res) {
  var text = req.body["Body"]
  var data1 = {text: text}
  var data2 = {text: text, entity_type: ['people_eng', 'places_eng', 'companies_eng', 'organizations']}
})


The ‘data1’ object will be used when POSTing to the Sentiment Analysis API and the Concept Extraction API. The ‘data2’ object will be used when POSTing to the Entity Extraction API, because that API requires us to specify which entity types we wish to look for (i.e. famous people, places, companies, organizations).


Now, let’s start POSTing to Haven OnDemand and storing the results in variables. First we’ll analyze the sentiment, then extract any concepts, then extract any entities. Amend the previous code again to this:


app.post('/text_processor', urlencoded, function(req, res) {
  var text = req.body["Body"]
  var data1 = {text: text}
  var data2 = {text: text, entity_type: ['people_eng', 'places_eng', 'companies_eng', 'organizations']}
  client.post('analyzesentiment', data1, function(err1, resp1, body1) {
    var sentiment = resp1.body.aggregate.sentiment
    var score = resp1.body.aggregate.score
    client.post('extractconcepts', data1, function(err2, resp2, body2) {
      var concepts = resp2.body.concepts
      client.post('extractentities', data2, function(err3, resp3, body3) {
        var entities = resp3.body.entities
        console.log(text + " | " + sentiment + " | " + score)
        printStuff("Concepts", concepts)
        printStuff("Entities", entities)
      })
    })
  })
})


Unlike the aggregate sentiment, the concepts and the entities come back as arrays, so we’ve passed them to functions, which we haven’t created yet, that print them out. Let’s go ahead and create those functions which we will place below the previous snippet:


printStuff = function(string, arr) {
  console.log(string + ": ")
  for (var i = 0; i < arr.length; i++) {
    console.log(arr[i])
  }
}

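
To see what printStuff does in isolation, here is the same helper run against a small sample array (sample strings for illustration, not real API output; in the app, the array elements are objects returned by Haven OnDemand):

```javascript
// Same helper as above: prints a label, then each array element on its own line.
var printStuff = function(string, arr) {
  console.log(string + ": ")
  for (var i = 0; i < arr.length; i++) {
    console.log(arr[i])
  }
}

printStuff("Concepts", ["machine learning", "cloud computing"])
```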

Next, let’s have the app listen to the port we specified in the beginning so it can accept the POST requests from the text messages:


http.listen(port, function() {
  console.log("listening on port: " + port)
})


And that’s it! Again, for the complete code, look here.


Go ahead and run


node index.js


and you should see it say “listening on port: 5000”. Now, head over to where you have ngrok saved and run


./ngrok http 5000


to allow the text messages to be POSTed to your local computer running the Node.js server. Remember to change the URL on your Twilio phone number (here) to the one ngrok is running on; otherwise, no text messages will come through.
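

Concretely, the Twilio messaging webhook should point at your ngrok forwarding address plus the endpoint path. It will look something like this, where the subdomain is the placeholder for whatever ngrok assigns when it starts:

```
https://<your-ngrok-subdomain>.ngrok.io/text_processor
```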


Go ahead and send a text message with some thoughtful, insightful text to the phone number you have with Twilio and watch as the sentiment, concepts, and entities are analyzed, extracted, and printed to the console.


Sign up now!

If you’ve made it this far through the post and haven’t signed up for an account yet, what are you waiting for? Go here and follow the instructions to sign up! Shortly after signing up, you’ll receive an email at the address you provided. Open that email and click the activation link, and you’ll be all set!


User benefits

We have awesome benefits for all of our users. We provide free monthly quota with generous limits; you can index and analyze your data directly via the web browser without writing any code; you get technical support via the community; and you can receive notifications about the platform and product updates.


Resources and support to accelerate your project


Social media

Engage with us on social media! We’re on most of the major sites where we post our latest news, updates, events we’re going to, and projects we’re working on!



Check out the HPE Haven OnDemand Developers group on LinkedIn for professionals building solutions with our platform.



Follow @HavenOnDemand for our latest news and announcements.



Check out the HPE Haven OnDemand YouTube playlist on the HPE for Developers channel.


. . .


Thank you for participating. We hope you found the workshop useful. Please let us know what you think and send us your feedback via the comments below.


We’d love to see what you build yourself, and when you’re ready, remember that you can publish your app in our showcase and marketplace.
