MistNet, founded by former Juniper employees, moves AI processing to the network edge to build distributed detection and analysis models for security.

AI-based startup MistNet is moving intelligence to the edge of the network in an attempt to speed recognition of malicious and suspicious activity and reduce the amount of data that has to be moved from edge to cloud for analysis, storage, and forensics. This week’s closing of a $7 million Series A funding round will help it put that intelligence into the field.

MistNet, founded by a team that met while working at Juniper Networks, dubs the technology “mist computing” and its application in its products “CyberMist.” CyberMist uses a distributed analytical mesh in which artificial intelligence (AI)-based analysis occurs at the edge of the network under the control of a central, cloud-based manager.

CyberMist will typically be used to deliver information to security analysts for their work, according to the company. Although integration tools are available to link CyberMist to remediation systems, “We don’t want to be the automation end of a SOAR [security orchestration, automation, and response solution]. We have integrations with the major SOARs, and we can do automatic remediation on that basis,” says MistNet president and CEO Geoffrey Mattson.

Mattson says traditional hub-and-spoke architectures make it difficult to use data from a wide variety (and large number) of sensors because of the sheer volume of data that must flow from those sensors to a central processor.

“They usually tap the network and look at the raw network data,” Mattson explains. “They often have agents that allow them to look at specific users’ behavior, and they tend to focus on that rather than the output of all the various security appliances.” And that narrow focus is just one of the issues he sees coming from the limitations on how much data most monitoring systems can scan in real time.

“Technically, it’s very difficult to have a separate overlay network to stream very large amounts of data in real time,” he says. “By the time you actually get it to the data center, you’ve lost a lot of the context. You lose spatial and temporal locality that can be very helpful in putting pieces of the puzzle together.”

One of the characteristics of mist computing, Mattson says, is that the edge nodes share a single, sharded, geographically distributed database. They also continually share modeling information so that each edge node has global awareness of conditions and activities on the network.
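MistNet has not published the internals of its sharded, geographically distributed database, but a common way to build one is consistent hashing: each edge node owns a slice of the key space, so any node can compute where a record lives without moving the data. The sketch below is purely illustrative, and all node names and key formats are hypothetical.

```python
import hashlib
from bisect import bisect_right

# Hypothetical sketch of consistent-hash sharding across edge nodes.
# MistNet's actual design is not public; this only illustrates the idea
# that data can stay local while every node knows which node owns a key.

class ShardRing:
    def __init__(self, nodes, vnodes=64):
        # Place several virtual points per node on a hash ring so that
        # keys spread evenly across the edge nodes.
        self.ring = sorted(
            (self._hash(f"{n}#{i}"), n) for n in nodes for i in range(vnodes)
        )
        self.points = [p for p, _ in self.ring]

    @staticmethod
    def _hash(key):
        return int(hashlib.sha256(key.encode()).hexdigest(), 16)

    def owner(self, key):
        # Walk clockwise to the first ring point at or after the key's hash.
        idx = bisect_right(self.points, self._hash(key)) % len(self.ring)
        return self.ring[idx][1]

ring = ShardRing(["edge-us-east", "edge-eu-west", "edge-ap-south"])
print(ring.owner("flow:10.0.0.5->8.8.8.8:443"))  # one of the three node names
```

Because ownership is a pure function of the key, every node reaches the same answer independently, which is what lets "hot" data be called up instantly without first copying it to a central repository.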

“We can keep hot data without moving it,” Mattson says. “You can call it up instantly, but we don’t have to move it back to a central repository.” The result is that customers can have real-time access for their own investigations or exploration of events that are occurring, while the MistNet system retains real-time access to do its own modeling and AI processing. 

MistNet calls its distributed AI-modeling technology “TensorMist-AI” and has applied for a patent on it. According to the company, TensorMist-AI leverages Google TensorFlow and Apache Spark, deployed in a mist computing architecture.

The edge nodes each contain sensor and compute functions in the mist computing architecture. In most cases, the product of the modeling run in those edge nodes — not the raw data — will be sent back to a central controlling and storage facility where more complex AI models are created and used for processing. Customers that want the raw edge data stored for potential forensic analysis have an option to do so, Mattson says.
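The ship-model-products-not-raw-data pattern described above resembles federated aggregation: each edge node trains on its local events and sends only its updated parameters upstream, where the central controller combines them. TensorMist-AI's actual mechanics are not public, so the toy linear model and function names below are assumptions for illustration only.

```python
import numpy as np

# Illustrative federated-averaging sketch (FedAvg-style), not MistNet's code:
# edge nodes compute local updates to a shared model; only the resulting
# weights -- never the raw event data -- travel to the central controller.

def edge_update(weights, local_events, lr=0.01):
    """One local gradient step of a toy linear scorer on edge-resident data."""
    X, y = local_events                     # raw data never leaves the node
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad              # only these weights go upstream

def central_average(updates, counts):
    """Weight each edge's update by how many events it saw."""
    total = sum(counts)
    return sum(w * (n / total) for w, n in zip(updates, counts))

rng = np.random.default_rng(0)
global_w = np.zeros(4)
edges = [(rng.normal(size=(100, 4)), rng.normal(size=100)) for _ in range(3)]
updates = [edge_update(global_w, e) for e in edges]
global_w = central_average(updates, [len(e[1]) for e in edges])
print(global_w.shape)  # (4,)
```

The bandwidth saved is the point: three nodes here hold 300 events but transmit only three 4-element weight vectors, which is why the raw edge data is retained centrally only as an opt-in for forensics.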


Curtis Franklin Jr. is Senior Editor at Dark Reading. In this role he focuses on product and technology coverage for the publication. In addition he works on audio and video programming for Dark Reading and contributes to activities at Interop ITX, Black Hat, INsecurity, and …
