A fraction of a second is the difference between success and failure. Whether you’re performing cyber and kinetic threat detection; processing analytics in denied, degraded, intermittent, or limited (DDIL) environments; predicting the weather for aerial, surface, or waterborne unmanned vehicles; or making a snap decision to increase warfighter advantage, data is the backbone of intelligence and decisions. In these scenarios, fast data isn’t fast enough.

Mission-critical situations require real-time data. A well-equipped database can perform 200 million reads per second. Moving decisions from fast to real time provides the advantage needed to take the swiftest, most efficient action.

The role of AI

The U.S. supports responsible military use of artificial intelligence and autonomous systems. Databases ingest and analyze massive datasets, combining historical and streaming data analysis, whether running on the tactical edge or across any public cloud. That data feeds AI and machine learning (ML) algorithms rapidly and precisely in critical situations, at all times. Feeding more, faster, and better data into a mission-critical application enables continuous learning and real-time decisioning. Both quality and quantity are essential for sound decision-making, which is where ML thrives.

AI/ML models run better with more data and more iterations. The more training, tuning, and validation that happens, the better the results. The challenges lie in data preparation, model creation, and tuning as models constantly evolve. This is further complicated when an online system must persistently ingest signal data from disparate sources and still make an inference in milliseconds. Modern data architecture needs to support both the online and offline feeding of machine learning systems. Faster, higher-volume feeding shortens the time between model iterations and improves model accuracy.
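As a rough illustration only (the names, the latency budget, and the stand-in scoring function below are hypothetical, not drawn from any particular system), the sketch shows the two paths such an architecture has to serve: a streaming path that keeps the latest signal features in a low-latency store while retaining full history for offline retraining, and an inference path that reads those features and scores them within a millisecond-scale budget.

```python
import time
from collections import deque

# Stand-in low-latency store: the latest features per signal source (online path).
online_features = {}
# Stand-in history: every observation retained for offline retraining (offline path).
offline_history = deque()

def ingest(source_id, features):
    """Streaming path: update the online view and append to the offline history."""
    online_features[source_id] = features
    offline_history.append((time.time(), source_id, features))

def score(features):
    """Stand-in model: a weighted sum in place of a trained ML model."""
    return 0.7 * features.get("signal_strength", 0.0) + 0.3 * features.get("anomaly", 0.0)

def infer(source_id, budget_ms=5.0):
    """Inference path: read the freshest features and score within a latency budget."""
    start = time.perf_counter()
    features = online_features.get(source_id)
    if features is None:
        return None
    result = score(features)
    elapsed_ms = (time.perf_counter() - start) * 1000
    if elapsed_ms > budget_ms:
        print(f"warning: inference took {elapsed_ms:.2f} ms, over budget")
    return result

ingest("radio-17", {"signal_strength": 0.9, "anomaly": 0.2})
print(infer("radio-17"))
```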

Moving data from the edge to the tactical cloud, despite challenges

A database needs to be able to process the data for any AI/ML engine, and it needs to run anywhere and everywhere. Real-time data enablement must be possible on-premises or in a hybrid-cloud, multi-cloud, or cross-cloud environment.

Data processing typically occurs remotely, often far from where decisions are made. High-speed processing spans the tactical edge to the tactical cloud — from an unmanned vehicle in one country to the command center in another, for example — to produce a Common Operating Picture (COP) for any Area of Responsibility (AoR), along with a global view.

Mission teams need to transmit data despite the constraints of low bandwidth and DDIL edge environments. During replication between the edge and the tactical cloud, there must be virtually zero delay, whether processing gigabytes or petabytes of compressed change data.
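One common way to live within those constraints is to ship only compressed change records rather than full snapshots. The sketch below is illustrative only — the record format and function names are hypothetical: it batches the changes captured since the last replication cycle, serializes and compresses them for the constrained link, and reports the size reduction.

```python
import json
import zlib

# Hypothetical change records captured at the edge since the last replication cycle.
changes = [
    {"key": f"track-{i}", "op": "update", "lat": 34.05 + i * 0.001, "lon": -117.2, "ts": 1700000000 + i}
    for i in range(1000)
]

def pack_changes(records):
    """Serialize a batch of change records and compress it for a low-bandwidth link."""
    raw = json.dumps(records, separators=(",", ":")).encode("utf-8")
    return raw, zlib.compress(raw, level=9)

def unpack_changes(payload):
    """Decompress and deserialize on the tactical-cloud side before applying."""
    return json.loads(zlib.decompress(payload).decode("utf-8"))

raw, payload = pack_changes(changes)
applied = unpack_changes(payload)
print(f"{len(raw)} bytes raw -> {len(payload)} bytes compressed, {len(applied)} records applied")
```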

Modern data architecture serves the depth and diversity of real-time data

Data itself isn’t neat and tidy. It comes from tens of thousands of sources all over the world, so modern data architecture must support multi-model capabilities. Databases need to ingest and analyze large volumes of structured, semi-structured, and unstructured data — streaming signals from radios, wireless access points, SATCOM/MILSAT, tactical data links, and hidden links — into a multi-model database. That single database then meshes different database models into one integrated engine that services a wider range of data processing and retrieval tasks and use cases.

Most database management systems are organized around a single data model (relational, document, graph, etc.) that determines how data is organized, stored, and manipulated. A multi-model database can accommodate key-value, document (JSON documents, complex data types such as maps and lists), GeoJSON, graph, and object-oriented models. In short, the ability to store a broad set of data types, maps, and lists lets a database address a broad range of use cases.
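To make the distinction concrete, the snippet below sketches what one platform might hold side by side: a key-value record, a JSON-style document with nested maps and lists, a GeoJSON point, and a simple graph edge. The keys, field names, and in-memory store are hypothetical stand-ins, not any particular database's API.

```python
# A stand-in for a multi-model store: one keyspace holding several data shapes.
store = {}

# Key-value: a small opaque value under a simple key.
store[("telemetry", "uav-42:last_heartbeat")] = "2024-05-01T12:00:03Z"

# Document: nested maps and lists, as in a JSON document model.
store[("tracks", "track-7")] = {
    "source": "tactical-data-link",
    "classification": "unclassified",
    "waypoints": [{"lat": 34.05, "lon": -117.20}, {"lat": 34.06, "lon": -117.18}],
}

# GeoJSON: a point geometry usable for location-intelligence queries.
store[("geo", "sensor-3")] = {"type": "Point", "coordinates": [-117.20, 34.05]}

# Graph: an edge relating two entities, as a graph model would.
store[("graph", "edge-1")] = {"from": "uav-42", "to": "command-post-1", "relation": "reports_to"}

for key, value in store.items():
    print(key, "->", value)
```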

Applying real-time data to real contexts

Processing all of this data requires a next-generation Common Operating Database (COD) that can ingest and analyze massive datasets, combining historical and streaming data analysis with powerful location intelligence. A COD must provide data observability to track and monitor data provenance and lineage, and must perform data tagging to differentiate between enterprise, mission, and contract data at multiple levels of security (unclassified, confidential, secret, and top secret). A COD is a shared-nothing, multi-threaded, multi-model data platform designed to operate efficiently on a cluster of server nodes, exploiting modern hardware and network technologies to drive reliably fast performance at sub-millisecond speeds across petabytes of data.
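A hypothetical illustration of what record-level tagging and lineage could look like follows; the levels, field names, and filter are illustrative only, not a real security implementation. Each record carries provenance metadata and a classification label, and a reader receives only records at or below its clearance.

```python
# Illustrative classification ordering; real systems enforce this in hardened layers.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def tag(record, source, classification, lineage):
    """Attach provenance and classification metadata to a data record."""
    return {
        "data": record,
        "source": source,                  # where the data originated
        "classification": classification,  # handling label for the record
        "lineage": lineage,                # processing steps applied so far
    }

def releasable(records, clearance):
    """Return only records at or below the reader's clearance level."""
    return [r for r in records if LEVELS[r["classification"]] <= LEVELS[clearance]]

records = [
    tag({"track": "t-1"}, "satcom-feed", "secret", ["ingest", "geo-enrich"]),
    tag({"track": "t-2"}, "open-weather", "unclassified", ["ingest"]),
]
print(releasable(records, "confidential"))  # only the unclassified record is returned
```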

The COD should allow for a variety of deployment form factors (Kubernetes, virtualized, bare-metal, cloud) to be dictated by the mission, not the architecture. Processing data at the point of need enables data-driven insights resulting in decisions at the speed of conflict.

Edge computing, alongside 5G and streaming IoT data, has created the opportunity for more efficient decisioning at the point of events. Regardless of physical location, the edge is where milliseconds matter. By collecting and processing data closer to the network edge, less data needs to travel across the network, which means less data at risk and less required network bandwidth. Data can be continuously ingested and analyzed at real-time speeds, accelerating decisions that impact mission outcomes and ultimately providing the upper hand.
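The bandwidth point can be sketched with a toy example; the sensor name and summarization rule below are hypothetical. Instead of forwarding every raw reading across the network, the edge node aggregates readings locally and forwards only a compact summary.

```python
import statistics

# Hypothetical raw readings collected at the edge over one reporting interval.
raw_readings = [{"sensor": "acoustic-9", "value": 0.5 + 0.001 * i} for i in range(10_000)]

def summarize(readings):
    """Aggregate raw readings into a compact summary to send over the network."""
    values = [r["value"] for r in readings]
    return {
        "sensor": readings[0]["sensor"],
        "count": len(values),
        "mean": statistics.fmean(values),
        "max": max(values),
    }

summary = summarize(raw_readings)
print(f"forwarding 1 summary instead of {len(raw_readings)} raw readings: {summary}")
```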

Although modern data architecture itself is complex, its result is simple. It enables decisions to happen seamlessly, and leveraging real-time data can elevate military strategy. Bottom line: when fast isn’t fast enough, real-time data processing functions in the milliseconds that matter.

Cuong Nguyen is VP, Public Sector at Aerospike
