June 15, 2024

The World on a String of Data

Every two years, the North Atlantic Treaty Organization (NATO)—one of the most important political-military alliances in the world—produces a report projecting the future security environment. The Strategic Foresight Analysis (SFA) and its visualization of the world's geopolitical future are critical to planning, for NATO and for its member nations.

Earlier this year, NATO’s Allied Command Transformation office in Norfolk, Virginia, asked for help from industry. They wanted technology that would improve the process of analyzing global data and inform decision-makers about potential risks, trends, patterns and other aspects of the future global landscape.

Aveshka answered that call.

Partnering with Google Cloud and Boston-based firm Quantiphi, Aveshka developed an artificial intelligence and machine learning platform that uses big data analytics to help forecast future concerns for NATO and improve the SFA.

“Some call it predicting the future; we call it assisted decision-making,” said Shannon Vaughn, Aveshka’s Chief Innovation Officer. “What used to take months to get the relevant information can now be done in minutes.”

Using a big data engine powered by AI and machine learning, analysts can quickly extract key data points, whether about individuals or groups of interest, specific locations or entities, or particular topics, ideas and themes.

“You can use the tool to extract out data using high-end natural language processing technology. That allows us to speed time to insight,” Vaughn said.
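The article does not describe the platform's internals, so as a rough illustration only, the idea of surfacing key terms from raw text can be sketched with a deliberately simplified frequency heuristic. The function name, stopword list and scoring are assumptions for this sketch; the real system uses far more sophisticated natural language processing.

```python
import re
from collections import Counter

# Minimal stopword list for the sketch; a production NLP pipeline
# would use a much richer model than simple frequency counts.
STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "on", "for", "is", "that"}

def extract_key_terms(text: str, top_n: int = 5) -> list[str]:
    """Rank terms by frequency after dropping common stopwords."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [term for term, _ in counts.most_common(top_n)]
```

Even this toy version captures the shape of the task: turn unstructured text into a ranked list of candidate topics that an analyst can act on.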

How exactly could this work?

Consider the amount of information on a given current-events topic that could be found in written materials: news articles, white papers, scholarly journals, blogs and more. Those materials alone could run to hundreds of thousands of documents.

It would be essentially impossible for a person to read all of those documents and glean meaningful information. But through machine learning and AI, the tool can ingest and sort them all, then filter the results by key topics or terms. Those 100,000+ documents become a handful of pertinent ones, which are then broken into 100-word chunks that highlight exactly where each document addresses the topic at hand.
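The filter-then-chunk step described above can be sketched in a few lines, assuming a simple keyword match as the filter. The function names and the matching heuristic are illustrative assumptions for this sketch, not Aveshka's implementation.

```python
def filter_documents(docs: list[str], key_terms: list[str]) -> list[str]:
    """Keep only documents that mention at least one key term."""
    terms = [t.lower() for t in key_terms]
    return [d for d in docs if any(t in d.lower() for t in terms)]

def chunk_document(doc: str, topic: str, chunk_words: int = 100) -> list[str]:
    """Split a document into fixed-size word windows, keeping only
    the chunks that actually mention the topic at hand."""
    words = doc.split()
    windows = (
        " ".join(words[i : i + chunk_words])
        for i in range(0, len(words), chunk_words)
    )
    return [w for w in windows if topic.lower() in w.lower()]
```

The payoff is the same one the article describes: an analyst reviews a few 100-word excerpts instead of a corpus of full documents.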

This, in turn, yields far more manageable insight into trends, patterns and consensus that can inform the SFA. The tool also produces dynamic visualizations, including knowledge graphs and link-analysis diagrams that present key information clearly.
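A knowledge graph of the kind mentioned above can be represented, at its simplest, as an adjacency map linking entities that co-occur in the same text chunk. This is a minimal sketch under that assumption; the platform's actual graph construction is not described in the article.

```python
from collections import defaultdict
from itertools import combinations

def build_knowledge_graph(chunks: list[list[str]]) -> dict[str, set[str]]:
    """Link entities that co-occur in the same chunk; the result is an
    adjacency map a visualization layer could render as a graph."""
    graph: dict[str, set[str]] = defaultdict(set)
    for entities in chunks:
        for a, b in combinations(sorted(set(entities)), 2):
            graph[a].add(b)
            graph[b].add(a)
    return dict(graph)
```

Link-analysis diagrams then amount to walking this map: two entities that share many neighbors, or connect otherwise separate clusters, stand out to an analyst at a glance.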

The fast-tracked prototype underscores the need to respond quickly to evolving requirements and to build technologies adaptable to varying applications.

“Everyone has the same problem—they need to get smart quickly. The problem set is big data and speeding time to insight. But this technology is so forward-leaning; it hasn’t been done before like this,” Vaughn said. “We’ve all talked for years about the coming wave of big data and high technology. That ‘coming wave’ has finally arrived and is crashing. As a forward-leaning, smaller technology company, we’re able to be agile and capitalize on opportunities like this.”
