Enterprise Geospatial Solutions from Exelis Inc.
CLOUD-BASED DATA ANALYSIS IS THE FUTURE
Over the past decade, the proliferation of Geographic Information Systems (GIS) in both military and commercial applications has been increasingly driven by developments in web technology. The ability to search, discover, and consume geospatial data and information over a network has made web mapping applications a mainstay of our lives. The increasing value being placed on geospatial information is evident in the growth of these applications and can be seen everywhere from the front lines to our daily commute.
In the same way that location-based information and geographic information systems have become part of the larger information industry, image analysis and data fusion have a large role to play in expanding the types of information that can be used to help us make decisions. As interoperable, web-based data analysis software packages become more accessible, the information derived with such software will be fused with traditional GIS data to increase the accuracy of the information being fed to the user.
From a geospatial standpoint, data analysis is aimed at deriving information from diverse types of data sources and providing insight into a situation based on the information contained within those sources. In an enterprise environment, this can mean many different users remotely accessing data and analysis from a centralized server. This concept has been advanced by companies that are developing robust data content management and analysis capabilities that can be accessed from front-end clients such as web or mobile applications.
Web-based data analysis and fusion allows an end user to gain near real-time insight from data and analytics in ways that are not possible with traditional analysis techniques. It does this by allowing information from very different data sources to be combined and analyzed in an effort to draw conclusions based on previously unknown correlations between those data types.
An example of this could be an environmental responder getting real-time updates on rescue efforts while in the field, or a deployed soldier getting updates on enemy troop movements. Information uploaded by assets in the field provides current insight into on-the-ground conditions, while analytics derived from remotely sensed data can provide context as to what is happening or what has changed in a specific location. To take it one step further, users can be tipped off to an event, or to a series of factors whose occurrence could imply the potential occurrence of an event. Examples include severe weather warnings going out to disaster response teams, or even an assessment of civil unrest within a region. In any situation, the ability to quickly and easily assess relationships between disparate data types allows the user to make more informed decisions.
INTEROPERABILITY IS THE KEY
In any system designed to consume and disseminate information, the ability for the various pieces of the system to communicate with one another is critical. Interoperability occurs at many levels in a data cataloguing and analysis system, including the data level, the services level, and the interface level. A truly interoperable platform should be able to interact with any other similarly architected system, existing as part of a larger network of data and services offered within the cloud or within the organization's intranet.
- Data – Data comes from many different sources. The key to a successful system is having components that can easily read and analyze many different types of information. This includes not only traditional GIS data such as multispectral or hyperspectral images, LiDAR, RADAR, or Full Motion Video (FMV), but also non-traditional data such as Twitter feeds, RSS notifications, and others.
- Services – The real driver behind cloud-based data fusion is open web service standards. Many of these standards are adopted and managed by the Open Geospatial Consortium (OGC), a group of more than 400 organizations worldwide that provide input into the standards that should be adopted. Standards such as Web Map Service, Web Feature Service, and the soon-to-be-adopted GeoServices REST specification are all protocols by which enterprise architectures can speak to each other.
- Interface – Finally, it is important for front-end clients to be developed using architectural styles that can interact with the OGC protocols. REST and other HTTP-based architectural styles allow user interfaces to pass requests to a server and to receive information back once the analytics have been run. These architectures allow back-end servers to be called by a number of different clients, servers, or software packages.
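To make the services level concrete, an OGC Web Map Service request is simply an HTTP GET with standardized parameters. The sketch below builds a WMS 1.3.0 GetMap URL using only the Python standard library; the server URL and layer name are hypothetical placeholders, not an actual Exelis endpoint.

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint; substitute your own server's URL.
WMS_ENDPOINT = "https://example.com/geoserver/wms"

# Standard OGC WMS 1.3.0 GetMap parameters.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "imagery:scene_42",      # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "38.9,-77.1,39.0,-77.0",   # lat/lon axis order in WMS 1.3.0
    "WIDTH": "512",
    "HEIGHT": "512",
    "FORMAT": "image/png",
}

url = WMS_ENDPOINT + "?" + urlencode(params)
print(url)
```

Because the request is nothing more than a URL, any standards-aware client, from a desktop GIS to a mobile app, can issue it against any conforming server, which is exactly the interoperability the OGC standards are designed to provide.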
ENTERPRISE SOLUTIONS COMPONENTS
Exelis Inc. develops interoperable, enterprise-level geospatial solutions that leverage open standards to provide robust data cataloguing and dissemination capabilities along with proven, accurate data analysis. Each of the components that comprise the overall solution is designed to be leveraged independently, or as part of a larger solution incorporating data, interfaces, and services. This flexibility allows data analysis systems to be built solely on Exelis technology, or allows pieces of the technology to be incorporated into a pre-existing architecture.
The basic components of any data analysis architecture are defined by the utility they provide within the workflow. This could be an interface through which a user interacts with the data, a cataloguing component for search and discovery, or a back-end server that analyzes data and fuses information together. For any system, the components and the standards by which they interact are similar. In general, there are three main components to a data analysis system: a user interface, a data catalog and dissemination component, and a processing component.
A user interface can be any type of program that is capable of requesting information from the other components of the system. This could be a desktop program, a web-based thin client, a mobile application, or any other program that a user might leverage to request information from a server. These interfaces can be delivered as an off-the-shelf utility or as a custom interface designed by the user. In any event, the interface must leverage certain standards in order to communicate with the other pieces of the system.
In many cases, the catalog and dissemination component of the system will also have an interface by which a user interacts with the middleware. Such is the case with the Exelis Jagwire product. Jagwire contains a freely distributable, lightweight, web-based client that provides an interface into the capabilities of the greater Jagwire offering. Minor configuration allows the enterprise to quickly catalog and prepare its data for further analysis and dissemination, and interoperability standards allow the interface to be customized and configured for ENVI tasks.
Data Catalog / Dissemination
This component is often described as middleware and supplies a number of different options for managing and supplying raw data and derived products to the user. The catalog consists of information that is hosted locally or remotely, or that is consumed as a web service from the cloud. Managing this data allows a user to query and visualize data, and to ingest, compress, and serve raw data and derived products over the internet.
Jagwire from Exelis is a web-based software system that is specifically designed for ingest, storage, management, discovery, and delivery of geospatial full motion video (FMV), imagery, and derived products with near real-time access. It reduces latency from data collection to decision making in the field, is customizable, and can be integrated into existing IT infrastructures due to its standards-based design. Jagwire provides FMV viewing capabilities in near real time, allowing users to see exactly what their assets are seeing from the air, which promotes more efficient tasking of aerial assets. It can be implemented at the enterprise level, at the ground-station level, or even at a mobile level, leveraging JPIP streaming and other web standards to serve important data over constrained bandwidths.
The interface provides a way for the end user to discover, visualize, and request analysis on data, while the catalog hosts the various datasets in a way that is easily discoverable by the user interface. Once users have found the data they are interested in and decided what type of analysis they would like to run, they initiate a call from the user interface to the analysis engine. A back-end analytics engine such as the ENVI Services Engine is then able to locate the specified dataset within the catalog and run the requested analysis algorithm on that data using the parameters provided by the client. Analysis in this case may consist of feature extraction, spectral identification, vegetative health, a custom function, or even a series of functions that have been chained together to extract a specific piece of information. Once the analytics have been run, the derived dataset can be passed back directly to the client and ingested into the catalog to be discovered and utilized by other users of the system.
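The client-to-engine handoff described above can be sketched as a simple JSON request that names a catalogued dataset, a task, and its parameters. Everything in this sketch — the task name, the dataset identifier, and the JSON field names — is a hypothetical stand-in for the pattern, not the actual ENVI Services Engine API.

```python
import json

def build_task_request(dataset_id, task_name, parameters):
    """Assemble a JSON body asking the analysis engine to run a
    named task against a dataset already registered in the catalog.
    All field names here are illustrative, not a real API contract."""
    return json.dumps({
        "task": task_name,                # e.g. a spectral index task
        "input": {"catalogId": dataset_id},
        "parameters": parameters,
    })

# The client discovers a dataset in the catalog, then requests analysis.
body = build_task_request(
    dataset_id="scene_2013_06_14",        # hypothetical catalog entry
    task_name="VegetationIndex",          # hypothetical task name
    parameters={"index": "NDVI"},
)
print(body)
```

In a running system, this body would be POSTed over HTTP to the engine, and the derived product returned in the response would be ingested back into the catalog, making it discoverable by every other client in the enterprise.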
Regardless of whether you have an existing interface, want to build your own interface on the ENVI Services Engine, or prefer to use the out-of-the-box functionality provided by the Jagwire viewer, the back-end catalog, visualization, and analysis functionality provided by Exelis technology is interoperable and allows for seamless execution of your analysis tasks within any framework.
THE FUTURE IS NOW
As the benefits of incorporating image analysis as a data fusion component are realized by more and more industries, the ability of companies like Exelis to provide off-the-shelf, interoperable components for cloud-based geospatial systems will facilitate the integration of this technology into the mainstream. Exelis products like Jagwire and the ENVI Services Engine are specifically designed to be integrated into existing IT infrastructures while leveraging open standards to facilitate collaboration among diverse data feeds and web-based services. All of this aims to benefit the enterprise by reducing software and hardware costs, and ultimately providing the information that people need in a reliable, timely fashion.