- Part 1 - an Overview: SAP HANA Text Analysis on Documents uploaded by an end-user
- Part 2 - Hands on: Building the backend for a SAP HANA Text Analysis application
- Part 3 - Presenting: A UI5 front-end to upload documents and explore SAP HANA Text Analytics features
- Part 4 - Deep dive: How to upload documents with OData in a UI5 Application
The application itself is a very simple UI5 application. Its code, along with the back-end code, is available on github, and instructions to install the application on your own SAP HANA system are provided as well.
The UI5 Application: Functional Overview
Here's an overview of the UI5 demo application:

The application features a single page, which is split vertically. On the left-hand side of the splitter is the list of uploaded files, which shows all rows from the CT_FILE database table. On the right-hand side of the splitter is the list of text analysis results, which shows rows from the $TA_ database table.
In the screenshot, only the FILE_NAME column is visible, but you can reveal the other columns by choosing them from the column header menu, which pops up when you right-click a column header:
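If you prefer to inspect the same data directly in the database, the two lists correspond roughly to queries like the ones below. This is only a sketch: it uses the schema and object names that appear in the installation instructions later in this post, and it assumes FILE_NAME is the key column that links the two tables; the names will differ on your system.

    -- Uploaded files: backs the list on the left of the splitter
    SELECT *
    FROM "RBOUMAN"."system-local.public.rbouman.ta.db::CT_FILE";

    -- Text analysis results for one file: backs the list on the right of the splitter
    -- ('some-uploaded-file.pdf' is just a placeholder file name)
    SELECT *
    FROM "RBOUMAN"."$TA_system-local.public.rbouman.ta.db::CT_FILE.FT_IDX_CT_FILE"
    WHERE "FILE_NAME" = 'some-uploaded-file.pdf';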
Since we haven't uploaded any files yet, both lists are currently empty. So, let's upload a file to see it in action! To upload a file, hit the button on the top left side of the application toolbar (1):
After clicking the "Upload File for Text Analysis" toolbar button, a dialog appears that lets you browse for files to upload. Hit the "Browse File..." button in the dialog to open a file explorer (2). Use the file explorer to choose a file (3). Note that this demo project's github repository provides a number of sample files in the sample-docs folder.
After choosing a file in the file explorer, the file name appears in the dialog:
To actually upload the chosen file, confirm the dialog by clicking the "Upload" button at the bottom of the dialog. The file will then appear in the file list on the left of the splitter, where it is automatically selected.
Whenever the selection in the file list changes, the text analysis results in the list on the right of the splitter are updated to match the selected item. As we mentioned in the previous post, the collection of text analysis results is asynchronous, so after uploading a new file there is a possibility that the text analysis results have not yet arrived. Unfortunately, there is not much the application can do about that at this point.
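If you want to check whether HANA is still busy processing uploaded documents, you can query the fulltext index queue monitoring view from a SQL console. This is just a diagnostic sketch, assuming the 'RBOUMAN' schema used in this series; the exact columns returned may vary between HANA revisions:

    -- Documents still waiting in the asynchronous fulltext index queue
    SELECT *
    FROM "SYS"."M_FULLTEXT_QUEUES"
    WHERE "SCHEMA_NAME" = 'RBOUMAN';

Once the queue for the CT_FILE table is drained, re-selecting the file in the file list should reveal its text analysis results.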
You can now browse, filter, and sort the list of analysis results to explore the results of the text analysis. Obviously, by itself this is not very useful, but the point of this app is to make it very easy to inspect the actual raw text analysis results. Hopefully, it will give you some ideas on how you could use this type of information to build actual real-world applications.
Once you're done with a particular file, you can also remove it using this application: in the file list, simply hit the trashbin icon to remove that particular file. A dialog will appear where you need to confirm the deletion of that file. When you confirm the dialog, the file will be deleted from the CT_FILE table.
Note that any corresponding analysis results from the $TA_ table will not be removed by this demo application, unless you manually added a foreign key constraint on the $TA_ table that cascades deletes from the CT_FILE table.
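If you do want deletes to cascade, a foreign key constraint along the following lines could be added manually. This is only a sketch: it assumes FILE_NAME is the key column of CT_FILE, it uses the schema and object names from this series (adjust them to your own system), and FK_TA_CT_FILE is just an example constraint name:

    -- Make deletes from CT_FILE cascade to the generated $TA_ table
    ALTER TABLE "RBOUMAN"."$TA_system-local.public.rbouman.ta.db::CT_FILE.FT_IDX_CT_FILE"
      ADD CONSTRAINT "FK_TA_CT_FILE"
      FOREIGN KEY ("FILE_NAME")
      REFERENCES "RBOUMAN"."system-local.public.rbouman.ta.db::CT_FILE" ("FILE_NAME")
      ON DELETE CASCADE;

Since the $TA_ table is generated when the fulltext index is created, you would have to re-add such a constraint if the index is ever dropped and recreated.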
Installing this application on your own HANA System
Front-end and back-end code for this application is available on github and licensed as open source software under the terms and conditions of the Apache 2.0 software license. The remainder of this post provides the installation instructions.

Obtaining the source and placing it in a destination package on your HANA system
- Create a package with your favorite IDE for SAP HANA (Web IDE, SAP HANA Studio, Eclipse with SAP HANA Developer Tools)
- Download an archive of the github repository
- Unzip the archive and transfer its contents to the HANA package you just created.
Updating Package and Schema names
- With db/CT_FILE.hdbdd:
  - Update the namespace: change the package identifier from "system-local"."public"."rbouman"."ta" to the name of the package you just created.
  - Modify the @Schema annotation from 'RBOUMAN' to whatever schema you want to use. (Create a schema yourself if you don't already have one.)
  - Activate db/CT_FILE.hdbdd. The table should now be present in the database catalog, and HANA should have created a corresponding $TA_ table as well (see the verification sketch after this list).
- With service/ta.xsodata:
  - In the first entity definition, update the table repository object identifier "system-local.public.rbouman.ta.db::CT_FILE" so it matches the location of the table on your system.
  - In the second entity definition, update the catalog table identifier "RBOUMAN"."$TA_system-local.public.rbouman.ta.db::CT_FILE.FT_IDX_CT_FILE" so it matches the database schema and catalog table name on your system.
  - Activate service/ta.xsodata.
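To verify the activation results mentioned above, you can check the database catalog from a SQL console. A quick sketch, again assuming the 'RBOUMAN' schema; substitute your own schema name:

    -- Both the activated CT_FILE table and its generated $TA_ counterpart should show up here
    SELECT "TABLE_NAME"
    FROM "SYS"."TABLES"
    WHERE "SCHEMA_NAME" = 'RBOUMAN'
      AND "TABLE_NAME" LIKE '%CT_FILE%';

    -- The fulltext index that feeds the $TA_ table
    SELECT "TABLE_NAME", "INDEX_NAME"
    FROM "SYS"."FULLTEXT_INDEXES"
    WHERE "SCHEMA_NAME" = 'RBOUMAN';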
Activation
You can now activate the package you created to activate all remaining objects, such as the .xsapp and .xsaccess files, as well as the web subpackage and all its contents.
Running the application
After installation, you should be able to open the web application by navigating to:

http://yourhanahost:yourxsport/path/to/your/package/web/index.html

where:

- yourhanahost is the hostname or IP address of your SAP HANA system
- yourxsport is the port where your HANA's xs engine is running. Typically this is 80 followed by your two-digit HANA instance number, for example 8000 for instance 00.
- path/to/your/package is the name of the package where you installed the app, but using slashes (/) instead of dots (.) as the separator character.
Summary
In this blog post we finally got to use the backend we built previously by installing and running the UI5 app. You can use the app to explore the SAP HANA Text Analysis results and to experiment with different document formats. If you're also interested in how the actual upload process works and how it is implemented in the UI5 app, you can read all about it in the next and final installment of this series.