Get ready for the coming era of fog computing

By Lynne Canavan, Director of Ecosystems, Real-Time Innovations (RTI)

September 6, 2017

Cloud-only approaches can no longer keep up with the requirements of real-time communication. Fog computing and networking address these challenges.

The IT pendulum is again in motion as we shift away from the cloud toward a more distributed computing model. Gartner says that by 2022, 75 percent of enterprise-generated data will be created and processed outside the data center or cloud. This movement is driven by massive amounts of data from IoT and other digital applications, now measured in petabytes, that dwarf all previous networking demands. Given the real-time communication flows required throughout and beyond IoT, 5G, and AI applications, it’s clear that cloud-only approaches can no longer meet the volume, latency, mobility, reliability, security, privacy, and network bandwidth challenges involved.

Fog computing and networking address these challenges by extending the traditional cloud-based computing model so that architecture implementations can reside at multiple layers of a network’s topology. Fog is a system-level horizontal architecture that distributes resources and services of computing, storage, control, and networking closer to users, enabling faster processing and lower network costs.

Fog provides a standard, universal way to manage, orchestrate, and distribute the necessary resources and services. For a historical comparison, look at TCP/IP, which provided a standard communication protocol and led to the rapid growth of the Internet. Fog is likewise “TCP/IPing” the protocols needed to enable functionality across the digital world, bridging the continuum from cloud to things.

Fog scales both horizontally (fog node to fog node) and vertically (between fog layers), and can dynamically respond to network load changes and failures more effectively than a simple extension of the cloud. For the data-dense IoT world, this is a welcome and necessary architectural approach.
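As a rough illustration of that horizontal and vertical scaling, here is a minimal sketch of how a client might dispatch work to the lowest-latency healthy fog node and fail over to higher layers, and ultimately the cloud, when a node becomes unreachable. The node names, layers, latencies, and functions shown are hypothetical and illustrative only; they are not drawn from any OpenFog specification.

```python
import random
import time

# Hypothetical fog topology: node names, layers, and round-trip latencies (ms)
# are illustrative only.
FOG_NODES = [
    {"name": "gateway-1", "layer": "edge", "latency_ms": 4, "healthy": True},
    {"name": "neighborhood-agg", "layer": "aggregation", "latency_ms": 18, "healthy": True},
    {"name": "regional-dc", "layer": "regional", "latency_ms": 45, "healthy": True},
    {"name": "cloud", "layer": "cloud", "latency_ms": 120, "healthy": True},
]

def select_node(nodes):
    """Pick the lowest-latency healthy node, which naturally prefers lower layers."""
    healthy = [n for n in nodes if n["healthy"]]
    if not healthy:
        raise RuntimeError("no compute resources reachable")
    return min(healthy, key=lambda n: n["latency_ms"])

def process(task, node):
    """Stand-in for the offloaded computation; occasionally simulates an outage."""
    if random.random() < 0.1:
        raise ConnectionError(node["name"])
    time.sleep(node["latency_ms"] / 1000.0)
    return f"{task} handled by {node['name']} ({node['layer']} layer)"

def dispatch(task, nodes):
    """Try the closest node first; on failure, mark it down and retry further up."""
    while True:
        node = select_node(nodes)
        try:
            return process(task, node)
        except ConnectionError:
            node["healthy"] = False  # fail over vertically to the next layer

if __name__ == "__main__":
    for i in range(3):
        print(dispatch(f"sensor-reading-{i}", FOG_NODES))
```

The point of the sketch is the selection policy, not the plumbing: work lands on the nearest healthy resource, and failures push it up the hierarchy rather than stranding it.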

The market is now shifting from fog-as-a-concept to fog-as-reality as companies start to roll out fog-based products and services. Over the next 18 months, we’ll see new use cases and the emergence of standards and interoperability requirements for these systems. These systems must address the core attributes the OpenFog Consortium calls SCALE: Security, Cognition, Agility, Latency, and Efficiency.

  • Security – Fog addresses IoT security challenges through its distributed architecture: fog nodes can quickly identify unusual activity and mitigate threats or attacks before they propagate through the system, without disrupting service.
  • Cognition – The fog architecture can determine where along the cloud-to-thing continuum to carry out computing, storage, and control functions. Decisions can be made on the device or at a nearby fog node, avoiding the need to transport data to the cloud solely for decision making.
  • Agility – The scale of data generated by automated systems and sensors makes it unlikely that humans alone can make the necessary operational decisions, and the architecture and connectivity must be flexible. Fog works over and inside wireline/optical and wireless networks to enable that flexibility and local decision making.
  • Latency – While a one-second delay is annoying in gaming, it’s an eternity and a pending disaster for a fast-moving drone or autonomous vehicle. Analyzing data closer to where it’s collected minimizes latency (see the sketch after this list).
  • Efficiency – Fog takes an “immersive distribution” approach that pools resources, so applications can leverage idle computing, storage, and networking capacity to improve overall system efficiency and performance. If a network segment goes down, fog nodes can immediately switch to other resources available throughout the network.
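To make the latency and efficiency points concrete, here is a minimal sketch of a fog node filtering a hypothetical temperature-monitoring stream locally, so that time-critical alerts are raised at the edge and only compact summaries travel upstream. The window size, threshold, and function names are assumptions for illustration, not part of the OpenFog Reference Architecture.

```python
from statistics import mean

# Hypothetical settings: a real deployment would tune these per application.
WINDOW_SIZE = 10          # raw samples aggregated per summary
ALERT_THRESHOLD = 90.0    # e.g. an over-temperature limit

def fog_node_filter(readings):
    """Summarize raw readings locally; forward only alerts and compact summaries.

    Time-critical decisions (the alerts) are made at the fog node with
    millisecond latency, while the cloud receives one summary per window
    instead of every raw sample.
    """
    alerts, summaries = [], []
    for start in range(0, len(readings), WINDOW_SIZE):
        window = readings[start:start + WINDOW_SIZE]
        window_id = start // WINDOW_SIZE
        if max(window) > ALERT_THRESHOLD:
            alerts.append({"window": window_id, "peak": max(window)})
        summaries.append({"window": window_id, "avg": round(mean(window), 2)})
    return alerts, summaries

if __name__ == "__main__":
    raw = [70 + (i % 25) for i in range(50)]   # 50 raw samples never leave the edge
    alerts, summaries = fog_node_filter(raw)
    print(f"{len(raw)} raw samples -> {len(summaries)} summaries, {len(alerts)} alerts sent upstream")
```

In this toy run, 50 raw samples collapse into 5 summaries and a couple of alerts, which is the kind of reduction that keeps backhaul bandwidth and round-trip latency out of the critical control path.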

There are eight pillars of fog, representing the fundamental elements of an open, interoperable fog computing architecture: Security, Scalability, Openness, Autonomy, RAS (Reliability, Availability, Serviceability), Agility, Hierarchy, and Programmability. The attributes of these pillars are laid out in the OpenFog Reference Architecture and will be addressed at the upcoming Fog World Congress event, Oct. 30 to Nov. 1 in Santa Clara, where 75+ speakers in 55 sessions will share tips for developers, insight into fog-based security, and use cases of fog deployments in energy, smart cities, manufacturing, autonomous cars, and more. A hackathon and an interactive workshop will enable hands-on learning.

Lynne Canavan is with the OpenFog Consortium and is executive co-Chair of Fog World Congress, the largest industry conference on fog computing and networking. OpenFog is the global organization driving the adoption of fog computing through an open, interoperable architecture. Previously, Lynne was Vice President of Program Management for the Industrial Internet Consortium. Before that, she spent 17 years at IBM.
