Seeing through the fog (computing)

August 20, 2015 OpenSystems Media

For most, the vision of fog computing is, well, a bit foggy. At IoT Evolution Developers Conference in Las Vegas, Biren Gandhi, Principal Strategist at Cisco, helped attendees see through the first atmospheric anomaly to touch the tech industry since the cloud.

Beginning with a survey showing the prevalence of IoT technology today, Gandhi shared that while 50 percent of companies are either planning to integrate or have already integrated elements of an IoT solution into their business, a range of implementation challenges still exists. Many of these roadblocks are related to siloed, vertical architectures that rely on a backend cloud to perform most of the heavy data processing in an IoT deployment. This model creates several bottlenecks for IoT systems, which Gandhi specifically identified as:

  • Data size/sources (velocity, variety, volume)
  • Convergence of IT and OT
  • System-wide view
  • Cohesive operations
  • Data processing and analytics
  • Reliability/availability
  • Complex greenfield/brownfield deployment
  • Interplay with the cloud

Northbound/southbound cloud frameworks have typified the IoT to date, but this sort of architecture requires that large amounts of data be sent back and forth across the network to a datacenter before being analyzed and acted upon, which for many real-time applications is a non-starter.

However, Gandhi stressed that fog computing is not a replacement for the cloud, but rather “a distributed computing paradigm that extends cloud computing closer to the edge of the network to enable a new wave of applications and services in IoT land.” This means that in the vision of fog computing, an additional network layer is added between IoT end points and the cloud that provides virtualized resources such as compute and storage, allowing for additional service delivery and mobility. At a high level, this affords fog computing networks the ability to:

  • Enable low-latency, near-real-time compute capabilities closer to the edge
  • Tame the data deluge and effectively utilize costly bandwidth
  • Leverage the proven cloud ecosystem of tools, technologies, best practices, and developer APIs to support multi-agency orchestration and management

As a result of this added decision-making power at the edge, data flows no longer have to be restricted to typical North/South traffic, but can also move in East/West orientations. In scenarios such as smart traffic management, where adjacent streetlights may need to receive updated information in real time or near real time, the low latency and low cost of the fog computing paradigm become a huge enabler, Gandhi explained.
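The streetlight scenario can be sketched in a few lines of code. This is a minimal, hypothetical illustration of the pattern Gandhi described, not any actual Cisco implementation: each fog node acts on sensor data locally (no cloud round trip), pushes urgent updates East/West to adjacent nodes, and sends only an aggregate North to the cloud. All class and method names here are invented for illustration.

```python
# Hypothetical sketch of the fog pattern described above: local decisions,
# East/West peer updates, and aggregated Northbound traffic only.

class FogNode:
    """A streetlight controller acting as a fog node (illustrative only)."""

    def __init__(self, node_id, neighbors=None):
        self.node_id = node_id
        self.neighbors = neighbors or []   # adjacent streetlights (East/West peers)
        self.readings = []                 # raw sensor data stays at the edge

    def ingest(self, vehicle_count):
        """Handle a sensor reading locally, with no cloud round trip."""
        self.readings.append(vehicle_count)
        if vehicle_count > 20:             # low-latency decision made at the edge
            self.broadcast_east_west({"from": self.node_id,
                                      "event": "congestion",
                                      "count": vehicle_count})

    def broadcast_east_west(self, event):
        """Push an update directly to adjacent nodes, bypassing the cloud."""
        for peer in self.neighbors:
            peer.receive(event)

    def receive(self, event):
        print(f"{self.node_id} adjusting timing: {event['event']} "
              f"at {event['from']}")

    def summarize_northbound(self):
        """Send only an aggregate upstream, taming the data deluge."""
        if not self.readings:
            return {"node": self.node_id, "avg": 0}
        return {"node": self.node_id,
                "avg": sum(self.readings) / len(self.readings)}


light_b = FogNode("light-B")
light_a = FogNode("light-A", neighbors=[light_b])
for count in (5, 12, 27):                  # the third reading trips the threshold
    light_a.ingest(count)
print(light_a.summarize_northbound())      # only this aggregate goes North
```

The design choice worth noticing is that raw readings never leave the node; only the congestion event (East/West) and the summary (North) cross the network, which is where the bandwidth and latency savings come from.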

In short, Gandhi and his colleagues at Cisco view fog computing as the development of a sophisticated app store for heterogeneous IoT infrastructure, in which policies and processes can be pushed at any time to specific end nodes based on the needs of the application. As a result, organizations are able to more efficiently take advantage of the data currency being unlocked within the Internet of Things.

Cisco is currently in the process of defining an open source effort around fog, so stay tuned. To find out more about fog computing, access Biren’s slides on the IoT Evolution website after the show.

Brandon Lewis, Technology Editor