
Gesture Based Search on Android with IDOL OnDemand

Please note that HP IDOL OnDemand is now HPE Haven OnDemand. The API endpoints have changed to Haven OnDemand. Please see the API documentation for more details.




IDOL OnDemand can process, analyze and search all your unstructured data, such as text and images, through a set of very simple RESTful APIs. Along with its rich set of powerful data analytics APIs, IDOL OnDemand inherits the advantages intrinsic to REST: it is lightweight, stateless, scalable, language and platform agnostic, and enables rapid application development.


In other words, you can easily create very powerful mobile apps with IDOL OnDemand, making your Android smartphone intelligent. The app described here extracts entities from the text in a screenshot, and the screenshot is taken using Android's built-in motion gestures.


It can be a hassle to copy and paste text on your screen and then open a browser or widget to search for more information. Instead, we can automate finding the most relevant topics in the article using IDOL OnDemand.


Enable Device Gestures


Gestures are a relatively new device feature; on Android you can bind specific actions to specific gestures. For this app we will enable the "Palm motion" gesture to trigger the "Capture screen" action. On Google Glass, for instance, you can even use eye blinks to trigger an activity.


First, activate Android's gesture-triggered "Capture screen" feature. Go to

Settings > My Device > Gestures and Motions > Palm motion


[Screenshots: My Device settings; Motions and Gestures settings]


Turn on Palm motion and turn on "Capture screen". This will "capture screen by swiping it from the right to left or vice versa with the side of your hand."




Now you can use the side of your hand to swipe across your screen, which triggers the "Capture screen" event, taking a picture of your current screen and saving it to the default "Screenshots" folder in your image gallery.


Try this by opening your favorite news application and, from the list of news items, selecting the news article you'd like additional information about.


[Screenshots: news item list; news article]




While reading the news article, swipe the side of your hand across the touch screen; a screenshot is then saved to the Screenshots folder of your image gallery. This alone, however, will not yet run our application to extract additional information from the article.


Create an Intent Filter to trigger our App


To start the IDOL OnDemand application, I used what is called an implicit Intent on Android. An implicit Intent is defined in the app’s manifest within the intent filter of an activity definition.


<activity android:name="com.idolondemand.ourApp.MainActivity"
          android:label="@string/app_name" >
    <intent-filter>
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <data android:mimeType="image/*" />
    </intent-filter>
</activity>


In the manifest fragment above, the intent filter lets the system start the app's main Activity whenever the device tries to open a URI of MIME type "image/*".


Now, when our Palm motion gesture captures the screen and saves the image, Android automatically suggests running our app's main activity, which runs a set of analysis APIs from IDOL OnDemand.




Now let’s add IDOL OnDemand Analytics


Now Android suggests running our application whenever a file of MIME type image is opened. When the main activity starts, the Android Activity life cycle runs the onCreate callback, where you get the image from the Intent. I then create an OcrTask instance, a subclass of AsyncTask, which lets us call IDOL OnDemand on a background thread.


OcrTask ocrTask = new OcrTask(this);  // AsyncTask subclass that calls the OCR API
Intent intent = getIntent();          // the implicit Intent that started this Activity
Uri image = intent.getData();         // URI of the captured screenshot
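AsyncTask's split between background work (doInBackground) and result delivery (onPostExecute) can be illustrated in plain Java with an ExecutorService. This is a sketch for illustration only; the method name and return value below are made up, and in the app itself the OcrTask subclass does the equivalent on Android:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BackgroundSketch {
    // Illustration of the AsyncTask pattern with a plain ExecutorService.
    // On Android you would subclass AsyncTask, as the article does with OcrTask.
    static String runOcrInBackground(String imagePath) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            // Stand-in for doInBackground: the real task POSTs the image at
            // imagePath to the OCR API and returns the recognized text.
            Future<String> result = pool.submit(() -> "text from " + imagePath);
            return result.get(); // stand-in for onPostExecute
        } finally {
            pool.shutdown();
        }
    }
}
```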


When the launched Activity finishes, Android invokes the onActivityResult callback with the result, and there we execute the created OcrTask instance. This runs a background process that sends the captured screenshot in a request to the OCR API from IDOL OnDemand.




protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == 1) {
        ocrTask.execute(image);  // run the OCR request on a background thread
    }
}



The OcrTask runs the following code, which uses Apache HttpClient to send a multipart POST request.


DefaultHttpClient httpClient = new DefaultHttpClient();
HttpPost httpPost = new HttpPost(url);
MultipartEntityBuilder entity = MultipartEntityBuilder.create();
entity.addPart("file", new FileBody(new File(imagePath)));
entity.addPart("apikey", new StringBody(apikey, ContentType.TEXT_PLAIN));
httpPost.setEntity(entity.build());  // attach the multipart body to the request
HttpResponse httpResponse = httpClient.execute(httpPost);
HttpEntity httpEntity = httpResponse.getEntity();
String response = EntityUtils.toString(httpEntity);
JSONObject json = new JSONObject(response);


The OCR API from IDOL OnDemand returns the text found in the image. That text is then sent, using the same multipart request code above, to the Entity Extraction API with the entity_type parameter set to extract the types 'people_eng', 'companies_eng' and 'places_eng', in other words the people, companies and places found in the text.
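The entity_type parameter is simply repeated once per type in the request. As a sketch in plain Java, here is how the query string for a synchronous Entity Extraction call could be assembled; the helper class and the exact endpoint path below are my assumptions (check the API documentation for the current URL):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class EntityQueryBuilder {
    // Assumed Haven OnDemand endpoint path; verify against the API docs.
    static final String BASE =
        "https://api.havenondemand.com/1/api/sync/extractentities/v1";

    // Builds a GET-style query string with one entity_type parameter per type.
    static String buildUrl(String text, String apikey, String... entityTypes)
            throws UnsupportedEncodingException {
        StringBuilder sb = new StringBuilder(BASE);
        sb.append("?text=").append(URLEncoder.encode(text, "UTF-8"));
        for (String type : entityTypes) {
            sb.append("&entity_type=").append(type); // repeated parameter
        }
        sb.append("&apikey=").append(apikey);
        return sb.toString();
    }
}
```

The app itself sends the text as a multipart POST part instead, using the HttpClient code shown earlier; the repeated entity_type parts work the same way there.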


[Screenshots: entity extraction threads; search results]


Each of these requests to the Entity Extraction API is wrapped in a separate thread, so as not to block the UI.


class PeopleThread extends Thread {
    public void run() {
        // call the multipart POST request to the Entity Extraction API here
    }
}


When each of these threads runs, it also calls the Query Text Index API to search the wiki_eng and news_eng indexes, creating English Wikipedia links and English news links for each of the entities. These Wikipedia and news queries are likewise defined in separate threads, like the one above.
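The fan-out over the three entity types can be sketched in plain Java as follows. The class and method names are hypothetical, and the run() bodies are stand-ins for the multipart POST described above; the point is starting one thread per type and joining them so all result sets are available before the UI is updated:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class EntityThreads {
    // Sketch only: one worker thread per entity type, joined at the end.
    static Map<String, String> extractAll(String ocrText) throws InterruptedException {
        Map<String, String> results = new ConcurrentHashMap<>();
        String[] types = { "people_eng", "companies_eng", "places_eng" };
        Thread[] threads = new Thread[types.length];
        for (int i = 0; i < types.length; i++) {
            final String type = types[i];
            threads[i] = new Thread(() -> {
                // In the app: POST ocrText to the Entity Extraction API with
                // entity_type=type, parse the JSON, then query wiki_eng and
                // news_eng for each entity found.
                results.put(type, "entities of type " + type);
            });
            threads[i].start();
        }
        for (Thread t : threads) {
            t.join(); // wait until all three requests have finished
        }
        return results;
    }
}
```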


