The proliferation of data continues to outstrip our ability to manage and secure it. The gap is growing and alarming, especially given the explosion of non-traditional smart devices generating, storing, and sharing information. As edge computing grows, more devices are generating and transmitting data than there are human beings walking the planet.
High-speed data generation is here to stay. Are we equipped as people, as organizations, and as a global community to handle all this information? Current evidence suggests not. The International Data Corporation (IDC) predicted in its study, Data Age 2025, that enterprises will need to rely on machine learning, automation, and machine-to-machine technologies to stay ahead of the information tsunami, while efficiently identifying and iterating on high-value data at the source to drive sound business decisions.
That sounds reasonable, but many well-known names in the industry are trying – and failing – to solve this problem. The struggle lies in the pivot from "big data" to "fast data": the ability to extract meaningful, actionable intelligence from a sea of information, and to do it quickly. Most of the solutions available are either prohibitively expensive, not scalable, or both.
In this episode of CyberWire-X, guests discuss the present and future threats posed by an unmanageable data avalanche, as well as emerging technologies that may lead public- and private-sector efforts through the developing crisis. Don Welch of Penn State University and Steve Winterfeld of Akamai share their insights with Rick Howard, and Egon Rinderer from sponsor Tanium offers his thoughts with Dave Bittner.