Streaming Cloud Analytics
Our streaming cloud analytics solution quickly processes high-velocity data for real-time queries in the cloud. This demonstration ingests complex XML files, each containing 5,000-7,000 ship reports, and displays the latest ship positions in real time. In addition, users can select individual ships to retrieve historical location data, including the most recent position. The system processes over 1,000 XML files, ingesting one file every five seconds.
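To give a feel for the ingest step, here is a minimal sketch of parsing ship reports out of one XML file. The element and attribute names (`ship`, `position`, `lat`, `lon`, `timestamp`) are hypothetical stand-ins, since the actual SRO schema is not shown here:

```python
import xml.etree.ElementTree as ET

# Hypothetical report format -- the real SRO schema differs.
SAMPLE = """\
<reports>
  <ship id="IMO9074729">
    <position lat="39.29" lon="-76.61" timestamp="2013-06-01T12:00:05Z"/>
  </ship>
  <ship id="IMO9241061">
    <position lat="40.70" lon="-74.01" timestamp="2013-06-01T12:00:07Z"/>
  </ship>
</reports>
"""

def parse_reports(xml_text):
    """Yield (ship_id, lat, lon, timestamp) for each report in one file."""
    root = ET.fromstring(xml_text)
    for ship in root.findall("ship"):
        pos = ship.find("position")
        yield (ship.get("id"),
               float(pos.get("lat")),
               float(pos.get("lon")),
               pos.get("timestamp"))

reports = list(parse_reports(SAMPLE))
```

In the demo this parsing happens inside the speed layer, once per incoming file.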
To handle this workload, we've built a system implementing our iNovex Lambda architecture. This architecture combines stream processing and cloud storage to quickly publish data for end-user consumption. The stream processing and cloud storage systems scale horizontally to cope with big data requirements.
The iNovex Lambda architecture is composed of three layers: batch, speed and serving.
- The batch layer stores all data entering our system. For this, Apache Hadoop is the obvious choice for fault-tolerant, scalable storage of data.
- The speed layer ingests, processes, and emits new data as it enters the system. As a streaming data engine, it performs these tasks as fast as possible while guaranteeing no data is lost. In our implementation, we use Storm, the open source stream-processing project developed by Twitter, to process our XML data. We use Apache Cassandra to provide short-term storage and retrieval of the very latest processed data.
- The serving layer provides views of data stored by the batch layer. We implement HBase in our system, which holds the time-indexed location data for each ship.
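Because HBase stores rows in sorted order by key, time-indexed data like ours is typically served with a composite row key of ship ID plus a reversed timestamp, so a prefix scan returns the newest positions first. The following sketch illustrates that pattern with an in-memory sorted list standing in for an HBase table; the key layout and constants are illustrative assumptions, not our exact schema:

```python
import bisect

MAX_TS = 10**10  # epoch seconds safely past any report time (assumption)

def row_key(ship_id, epoch_seconds):
    # Reversed timestamp: newer rows get smaller suffixes and sort first.
    return f"{ship_id}:{MAX_TS - epoch_seconds:010d}"

table = []  # sorted (row_key, value) pairs, standing in for an HBase table

def put(ship_id, epoch_seconds, lat, lon):
    bisect.insort(table, (row_key(ship_id, epoch_seconds), (lat, lon)))

def scan_history(ship_id, limit=10):
    """Prefix scan: positions for one ship, most recent first."""
    prefix = ship_id + ":"
    start = bisect.bisect_left(table, (prefix,))
    out = []
    for key, value in table[start:]:
        if not key.startswith(prefix) or len(out) >= limit:
            break
        out.append(value)
    return out

put("IMO9074729", 1370088000, 39.29, -76.61)
put("IMO9074729", 1370088060, 39.30, -76.60)  # a newer report
history = scan_history("IMO9074729")
```

The same prefix scan serves both the "latest position" and "full history" queries, which is why the layout works well for this demo's access pattern.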
All end-user requests come through a set of REST endpoints served by Apache Tomcat. A web application built on top of this layer provides a map view that's refreshed every few seconds.
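A map view polling a REST endpoint can be sketched as a small WSGI handler; the `/ships/latest` path and the in-memory store below are hypothetical placeholders for our Tomcat endpoints and Cassandra, respectively:

```python
import json

# Stand-in for the latest-position store (Cassandra in the demo).
latest_positions = {
    "IMO9074729": {"lat": 39.30, "lon": -76.60, "ts": "2013-06-01T12:01:00Z"},
}

def app(environ, start_response):
    """Minimal WSGI sketch of a GET /ships/latest endpoint (hypothetical path)."""
    if environ.get("PATH_INFO") == "/ships/latest":
        body = json.dumps(latest_positions).encode("utf-8")
        start_response("200 OK", [("Content-Type", "application/json")])
    else:
        body = b"not found"
        start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [body]
```

The browser-side map simply re-requests this endpoint every few seconds and redraws the markers.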
The data we're using today has been provided by SRO, which aggregates, enhances, and cleans many data sources into a small set of polished data products. Because of this processing, all data files share a common schema, which significantly simplifies parsing within the speed layer.