It is not enough to know that an Air Conditioning unit has failed; you need to know where it is, what model it is, when it was installed, whether it is under a Service Contract, and so on, even when manually deciding on the appropriate action. Automating the response to an event makes it even more necessary to ensure the completeness of data about the Device itself in the context of its situation. Gathering and successfully digitizing an Asset for use by Event Hubs and Event Engines is a critical success factor in scaling up and developing any IoT project.

Research report now available: The Foundational Elements for the Internet of Things (IoT)

The concept of the 'Final Mile' has been widely used in the Telecoms industry to describe the challenge of connecting a massive number of end users, at multiple locations, through disparate types of technologies and protocols. Today the term is back in use to describe the similar challenge of connecting the equally disparate environment of the Internet of Things (IoT), with its wide range of Devices, connection types, protocols and data flow management issues.

While it is logical to focus on the business value an IoT deployment will deliver, the challenges of the IoT Final Mile will come as a nasty surprise to many IoT project owners. The time and cost of gathering and creating contextual data to describe an IoT Device, or Asset, as well as establishing when and where to stream data flows, can be considerable.

A preceding post on Event Engines and Complex Event Processing (CEP) used as an example how different combinations of input values from three sensors on a car enabled different output conclusions. Two of the outcomes, a warning of a slow puncture and a dangerous tire blow-out, have added value beyond the immediate warnings to the driver if forwarded to external Event Data Hubs and Event Processing Engines.

The emergency services and the car service management center will need additional data on the location of the event, the type of car, wheel and tire, and the service management agreement. Feeding this additional contextual data into the car service management Enterprise's Event Engine, as an example, makes it possible to tell the driver where to get a puncture mended, or a new tire fitted, on their service plan.
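As a rough illustration of that enrichment step, the sketch below combines sensed values into a conclusion and then attaches contextual Asset data before the event is forwarded. The names (asset_registry, classify, enrich_event) and the threshold rules are illustrative assumptions, not the logic of any specific CEP product.

```python
# Minimal sketch: combine raw tire sensor values into a conclusion, then
# enrich the event with contextual Asset data so downstream Event Engines
# (e.g. the car service management center) can act on it.
asset_registry = {
    "car-4711": {                       # digitized Asset record for one vehicle
        "model": "Example Model X",
        "tire_spec": "205/55 R16",
        "service_plan": "Gold",
        "service_center": "Downtown Garage",
    }
}

def classify(pressure_drop_rate: float) -> str:
    """Turn sensed values into an event conclusion (simplified, assumed rules)."""
    if pressure_drop_rate > 50:         # very rapid loss of pressure
        return "tire_blow_out"
    if pressure_drop_rate > 1:          # slow but steady loss of pressure
        return "slow_puncture"
    return "normal"

def enrich_event(asset_id: str, drop_rate: float, location: str) -> dict:
    """Attach contextual Asset data to the sensed event before forwarding it."""
    context = asset_registry.get(asset_id, {})
    return {
        "asset_id": asset_id,
        "conclusion": classify(drop_rate),
        "location": location,
        **context,                      # model, tire spec, service plan, etc.
    }

print(enrich_event("car-4711", 3.2, "51.5074,-0.1278"))
```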

It is the broader ability to make use of contextual data aligned to sensed events to deliver complex insights and optimized actions that illustrates why IoT heralds a new era of Business capabilities. IoT is much more than the addition of sensors to existing Enterprise IT applications. IoT can transform simple online Services into high-value Business Services that offer a highly disruptive competitive advantage in a market.

Asset Digitization refers to building a complete digital picture of connected Assets, of any type, to provide the contextual background data necessary for complex insights and actionable outcomes to be processed. As this is the reason for deploying IoT sensors, any project that scales beyond a handful of locally contained, connected and manually supervised sensors will need to include an Asset Digitization program.

Asset Digitization usually requires the assembly of data from a variety of sources in multiple, and sometimes unfamiliar, formats. As an example, CAD drawings may have to be used not just to gain individual sensor location information, but also to prepare a GUI presentation of a site or building. The collation and management of Asset digital data requires a specialized tool (see footnotes for an example) that not only captures and stores the initial deployment details, but is also used to maintain ongoing updates in respect of changes, such as service history updates.
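As an illustration of the kind of record such a tool maintains, the sketch below shows one hypothetical shape for a digitized Asset assembled from several sources (installation records, CAD drawings, service history). The DigitizedAsset class and its field names are assumptions for this example, not a standard schema or any specific product's format.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DigitizedAsset:
    """Hypothetical digitized Asset record, assembled from multiple sources."""
    asset_id: str
    asset_type: str                      # e.g. "air_conditioning_unit"
    model: str
    installed_on: str                    # ISO date of installation
    site: str                            # building / site identifier
    location: str                        # position within the site, e.g. from CAD
    under_service_contract: bool
    service_history: list[str] = field(default_factory=list)
    cad_reference: Optional[str] = None  # drawing used for the GUI site view

ac_unit = DigitizedAsset(
    asset_id="AC-0042",
    asset_type="air_conditioning_unit",
    model="Example 3kW split unit",
    installed_on="2015-06-01",
    site="HQ-Building-2",
    location="Roof, grid ref D4 on CAD sheet 7",
    under_service_contract=True,
)
ac_unit.service_history.append("2016-05-12: filter replaced")
```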

It should not be assumed that this is necessarily a centralized, Cloud-based service stored in the Event Hub/Event Engine combination. There are two major considerations favoring a local Data store: the first is the latency imposed by network transit times, and the second is the need to share the data with multiple Event Hubs/Event Engines, possibly operated by different Enterprises.

In the case of the car example, a quasi real-time response is required to the tire blow-out event, and this illustrates the principle of Fog Computing, where a localized, or edge, cloud of linked interactive IoT devices with a small Event Processing capability is required. A car, with its many different sensors, probably hard-wired, in a self-contained environment, is a perfect example of an IoT Fog Computing deployment (more information on Fog Computing).
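A minimal sketch of that idea, assuming a simple in-car processing loop and a placeholder forward_to_hub uplink function, might look like the following; the thresholds and names are illustrative only, not a real Fog Computing framework.

```python
# Minimal sketch of edge/Fog processing: react locally to each reading with no
# network round trip, and forward only significant conclusions to remote hubs.
def forward_to_hub(event: dict) -> None:
    print("uplink ->", event)           # stand-in for the real wireless/satellite uplink

def local_event_loop(drop_rate_readings, asset_id: str) -> None:
    """Process readings locally; escalate only puncture and blow-out events."""
    for drop_rate in drop_rate_readings:
        if drop_rate > 50:
            print("DRIVER WARNING: tire blow-out, pull over safely")  # immediate local action
            forward_to_hub({"asset_id": asset_id, "conclusion": "tire_blow_out"})
        elif drop_rate > 1:
            print("Driver notice: slow puncture detected")
            forward_to_hub({"asset_id": asset_id, "conclusion": "slow_puncture"})

local_event_loop([0.2, 0.3, 2.5, 60.0], "car-4711")
```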

Continuing with the car example illustrates remote event reporting and sharing through two further IoT Final Mile architectural elements: connectivity service types and data flow, or switching, management. Both the tire puncture and the tire blow-out events should be reported externally, via wireless or satellite, to supporting Service Providers so that a comprehensive and cohesive response to the event can be completed.

The big question is not just how to provide the right connection type, but how to manage the data flow across that connection to reach the defined next layer of Event Hubs and Event Engines operated by Enterprises such as the Car Manufacturer, Emergency Services, Traffic Management, etc. Data Flow management defines the circumstances under which event data streams are switched and directed to reach the next level of IoT Service Providers via their own IoT Event Hubs and Event Engines.

A Data Flow Engine has to master multiple rule sets, with its own form of Complex Event Processing, to decide where an event data stream should be forwarded across the Internet. Frequently some, or all, of an event data stream will need to be forwarded simultaneously to multiple Event Hubs and Engines in different Enterprises. This requires decisions made with reference to the source, the context and the event data itself, and is a specialized task that currently usually requires a specialized product (see footnotes for an example).
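A minimal sketch of such rule-driven fan-out, assuming illustrative hub names and routing rules rather than any real Data Flow product's API, could look like this.

```python
# Minimal sketch of Data Flow management: rule-driven fan-out of one event to
# several Event Hubs operated by different Enterprises. Hub names and rules
# are assumptions for illustration only.
def route(event: dict) -> list[str]:
    """Decide which downstream Event Hubs should receive this event."""
    targets = []
    if event.get("conclusion") == "tire_blow_out":
        targets += ["emergency_services", "traffic_management"]   # safety-critical fan-out
    if event.get("conclusion") in ("tire_blow_out", "slow_puncture"):
        targets.append("service_management")                      # driver needs a repair
    if event.get("under_warranty"):
        targets.append("car_manufacturer")                        # contractual reporting
    return targets

def dispatch(event: dict) -> None:
    for hub in route(event):
        print(f"forwarding to {hub}: {event}")  # stand-in for the real transport

dispatch({"asset_id": "car-4711", "conclusion": "tire_blow_out", "under_warranty": True})
```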

The challenge of Final Mile connectivity does not present itself in small-scale local pilots within a single enterprise using its own network infrastructure and manual interpretation of the events. These types of deployments might be better termed an Intranet of Things, but as the pilot stage moves into full-scale production deployment, automation and the use of Complex Event Processing become a necessity.

Excellent results can be obtained from Intranet of Things deployments that connect to and extend an Enterprise Application, with some Application vendors providing a package to introduce quasi real-time updating. In this type of deployment the difficulty of sensor protocols and data formats has already been addressed, with the data rendered suitable to empower the enterprise application. Equally, many 'shop floor' manufacturing machine Operational Technology (OT) systems have been successfully deployed for many years as closed, Ethernet-based deployments.

Increasingly, industrial and consumer devices will come with built-in sensors, and the manufacturer will have made the choice as to the type of sensor, protocol and data format they support, just as with Smart Phones today. Some Industrial Manufacturers, such as Siemens, have produced their own Event Hubs and Event Engines on their own form of localized, on-premises Fog Cloud and/or centralized Cloud. An output Data Flow via ETL (Extract, Transform, Load) is then provided for onward integration to an Enterprise Application, or to an IT vendor's IoT Event Hub and Event Engine. Siemens and SAP provide a good example of how this can function with their mutual integration capabilities.
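As a rough sketch of that onward integration step, the following extract-transform-load example reshapes hub events into records an enterprise application might expect; none of the function or field names reflect Siemens' or SAP's actual interfaces.

```python
# Minimal ETL sketch: pull events from a (hypothetical) manufacturer Event Hub,
# reshape them into the enterprise application's expected record format, load them.
def extract(hub_events: list[dict]) -> list[dict]:
    """Keep only the events worth passing on to the enterprise application."""
    return [e for e in hub_events if e.get("conclusion") != "normal"]

def transform(event: dict) -> dict:
    """Map hub field names onto assumed enterprise application field names."""
    return {
        "EquipmentID": event["asset_id"],
        "EventType": event["conclusion"].upper(),
        "ReportedAt": event.get("timestamp", "unknown"),
    }

def load(records: list[dict]) -> None:
    for record in records:
        print("posting to enterprise application:", record)  # stand-in for an API call

load([transform(e) for e in extract([
    {"asset_id": "car-4711", "conclusion": "tire_blow_out", "timestamp": "2017-01-01T12:00:00Z"},
    {"asset_id": "car-4712", "conclusion": "normal"},
])])
```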

IoT Protocols are the last layer of the Final Mile in the emerging common approach to multi-layer, scaled-up IoT architecture, and are required to be more than just a suitable connection protocol. Industrial IoT developers are focused on how IoT protocols should carry specific data definitions for different sensing types and functions, and even for different industry sector requirements.

It is instructive to recall the very expensive loss of NASA's Mars Climate Orbiter, caused by a failure to ascertain which measurement standard for crucial metrics was in use in the data integration between the teams involved. Similar challenges exist for scaled-up use of IoT sensors and Devices when they are integrated into large-scale interactive networks that share data streams. The Digitized Asset will also need to be able to self-describe the data it provides to ensure successful integration occurs in Complex Event Processing at Event Hubs and Engines.
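A minimal sketch of such a self-describing payload, with an assumed schema in which each reading declares its own unit and the receiving hub validates it before use, is shown below; the schema and field names are illustrative only.

```python
# Minimal sketch: the payload carries its own measurement type and unit, and the
# receiving Event Hub checks them rather than assuming a measurement standard.
EXPECTED_UNITS = {"tire_pressure": "kPa", "wheel_speed": "rpm"}

def validate(payload: dict) -> float:
    """Reject readings whose declared units do not match what the hub expects."""
    expected = EXPECTED_UNITS.get(payload["measurement"])
    if payload["unit"] != expected:
        raise ValueError(
            f"{payload['measurement']} reported in {payload['unit']}, expected {expected}"
        )
    return payload["value"]

reading = {
    "asset_id": "car-4711",
    "measurement": "tire_pressure",
    "value": 220.0,
    "unit": "kPa",                      # the payload declares its own unit
}
print(validate(reading))                # 220.0; a psi reading would be rejected
```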

Resource

The Foundational Elements for the Internet of Things (IoT)

Footnotes

Asset Digitization Tool example: https://www.youtube.com/watch?v=w4O6wtCDO7I

Data Stream Management example: http://cityos.io/tutorial/1032/Setting-up-Flowthings.io-flows-and-tracks
