Industrial & Medical Technology

Optimal Analysis Algorithms are IoT’s Big Opportunity

12 January 2015

Hyperbole aside, the laser-like focus on the Internet of Things (IoT) at this week’s Consumer Electronics Show (CES) in Las Vegas was fitting, given that consumer applications will flood the data pipes while industrial and smart-city IoT concepts trickle along their respective paths.

In the process, these myriad consumer gadgets and applications will create lonely petabytes of streaming data waiting to be paired with the optimal analysis algorithm: one that extracts the most actionable data for the user, at the lowest power and cost to the provider, while also ensuring maximum scalability.

These analysis requirements have implications across the entire IoT food chain, from the design of sensors at the very edge to how data is presented to the user at the application layer, and they affect all the analog signal-conditioning circuits, MCUs, communications and data processors in between.

Meanwhile, analysts and pundits are working hard to identify the key IoT applications that will rise to take full advantage of this analysis.

IIoT and smart cities on slow track

For some time, the Industrial IoT (IIoT) has been heralded as the big growth opportunity. It might be, but according to Bill Morelli, analyst and associate director of M2M and IoT at IHS, we shouldn't hold our breath waiting.

“While [industrial automation and manufacturing] is potentially huge,” he said, “IT guys don’t appreciate the number of obstacles.” These obstacles include the decades it takes to replace industrial equipment, versus 15 months in consumer applications, and the fact that the equipment already in place isn't IP-addressable. “And there aren't a lot of ‘greenfield’ factories [being constructed].”

In addition, “Predictive maintenance sounds good, but you have to get the heuristics right, versus the guy who can put his hand on a motor and tell it’s about to go,” he added.

As for smart cities, “they have their own unique set of challenges,” said Morelli, ranging from fluctuating budgets to disparate, incompatible and isolated implementations, similar to the issues Wi-Fi faced when it was first deployed across municipalities. And, of course, there are also the ever-looming security risks, he added.

That said, once silos start integrating, the opportunities are exciting, said Morelli. “What we’re trying to get to is improved efficiency and ROI [return on investment],” he said. A good example would be where factory energy-use data can be used by the power grid to perform ‘power shaping’ to cut system loading and improve overall efficiency.

Near term, consumers are in the driver’s seat with respect to IoT, tightly coupled as they are to the connected car and the smart home, said Morelli. As with the smartphone, consumers may well lead the adoption of IoT technology in the enterprise, too.

Figure 1: The plethora of IoT devices announced at CES 2015 will add to the mass of data being generated, but the algorithms required to make quick sense of that data and extract unique value from it are still a ‘work in progress’.

Data wheat versus chaff

As these applications, consumer or otherwise, gain traction, the amount of data gathered and poured into the cloud grows exponentially. That growth demands more discerning choices about which data is gathered, processed and communicated, not only to save time, power and cost at the edge, but also to enable more efficient processing at the core, or in decentralized processing nodes, depending on the architecture, so that the most useful data reaches the user quickly.

The algorithm “needs to separate the wheat from the chaff to give me data I can use,” said Morelli.

This type of data analysis would seem to play to Google’s strengths, particularly in the mobile space with Android, “but Google has yet to play its cards,” said Morelli.

Meanwhile, the required analytics have created opportunities for ‘big data’ companies such as AGT International (Zurich, Switzerland) and Vitria Technology Inc. (Sunnyvale, CA), which focus on performing the complex analysis that delivers the most value, and the most unique insights, to their clients and users.

It has also shone a spotlight on fundamental research into the data-stream analysis required for IoT-type data, which differs fundamentally from classic, database-oriented analysis of stored data sets.

Figure 2: AGT uses the DIKW model to convert raw, unstructured data into useful information for business decision makers in markets ranging from oil and gas to law enforcement.

AGT International’s approach to providing actionable data for decision makers is based on the DIKW (data, information, knowledge, wisdom + decisions) pyramid, in which raw, unstructured data, such as that from sensors, is turned into structured data with greater context and more usefulness for critical decision makers in markets that include oil, transportation, law enforcement and energy.
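As a rough illustration of the pyramid's lowest transition, from raw data to structured information, the sketch below promotes an unstructured sensor reading into a contextualized record. The field names, units and alarm threshold are hypothetical, not AGT's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Raw layer: an unstructured reading as it might arrive from a field sensor.
raw = {"id": "pump-07", "v": 4.82, "t": 1421064000}

@dataclass
class Observation:
    """Information layer: the same reading with units, context and meaning."""
    asset: str
    site: str
    vibration_mm_s: float     # vibration velocity in mm/s (a common severity metric)
    timestamp: datetime
    status: str               # derived label a decision maker can act on

def contextualize(reading: dict, site: str, alarm_mm_s: float = 7.1) -> Observation:
    """Turn a raw data point into structured information (data -> information).

    The 7.1 mm/s alarm level is illustrative only.
    """
    v = float(reading["v"])
    return Observation(
        asset=reading["id"],
        site=site,
        vibration_mm_s=v,
        timestamp=datetime.fromtimestamp(reading["t"], tz=timezone.utc),
        status="alarm" if v >= alarm_mm_s else "normal",
    )

print(contextualize(raw, site="refinery-A"))
```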

To accelerate the deployment and adoption of IoT-based applications, AGT announced its IoTA platform last October at the Internet of Things World Forum. Designed for developers, IoTA is a modular, open, cloud-based platform with IoT-specific analytics, data management and visualization capabilities.

For its part, Vitria distinguishes itself by providing ‘operational intelligence’: rather than focusing on historical analysis of data, it enables continuous, real-time analysis of both streaming and stored data, combined with the ability to take immediate action on discovered insights through automated processes and guided workflows.
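A generic version of that pattern, combining a baseline computed from stored data with continuous analysis of arriving values and an automated response, might look like the sketch below. This is illustrative code only, not Vitria's API:

```python
import statistics

# Stored data: a historical baseline computed from past readings.
history = [3.9, 4.1, 4.0, 4.3, 3.8, 4.2]
baseline = statistics.mean(history)
threshold = baseline + 3 * statistics.stdev(history)

def automated_action(value: float) -> None:
    """Stand-in for a guided workflow: open a ticket, throttle a machine, etc."""
    print(f"anomaly at {value:.2f} (threshold {threshold:.2f}) -> dispatching work order")

def process(stream):
    """Streaming data: act on each element as it arrives, with no batch step."""
    for value in stream:
        if value > threshold:
            automated_action(value)

process(iter([4.0, 4.2, 9.7, 4.1]))
```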

It's all a cluster

This all sounds good, but beneath it lies fundamental research into data streams, which are distinct from data sets in three respects (a classic algorithm that works within these constraints is sketched after the list):

  • The data points can only be accessed in the order in which they arrive.
  • Random access to the data is not allowed.
  • Memory is assumed to be small relative to the number of points, so only a limited amount of information can be stored.
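
Reservoir sampling is a textbook algorithm that honors all three constraints: it maintains a fixed-size, uniformly random summary of a stream in a single pass. A minimal Python sketch:

```python
import random

def reservoir_sample(stream, k: int) -> list:
    """Keep a uniform random sample of size k from a stream in one pass.

    Honors the data-stream constraints above: points are read strictly in
    arrival order, never revisited, and memory is fixed at k elements.
    """
    sample = []
    for i, point in enumerate(stream):
        if i < k:
            sample.append(point)
        else:
            # Replace an existing element with probability k / (i + 1).
            j = random.randint(0, i)
            if j < k:
                sample[j] = point
    return sample

# Example: summarize a million-point stream with just 5 retained values.
print(reservoir_sample(range(1_000_000), k=5))
```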

These data streams come from sensors and images and need to be corralled, smartly bucketed, sliced and analyzed at the source to extract only the points that are of use. For example, instead of sending 20 minutes of sensor data from a vibrating motor, the algorithm sends only the min/max vibration data, and only when the motor exceeds those thresholds.
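In code, such an edge filter can be as simple as the following sketch; the thresholds, field names and transmit stand-in are illustrative:

```python
# A minimal edge-filtering sketch: track min/max locally and transmit only
# when a reading breaches configured limits (limits are arbitrary here).
VIB_MIN, VIB_MAX = 0.2, 7.1   # acceptable vibration band, illustrative units

def transmit(payload: dict) -> None:
    print("uplink:", payload)   # stand-in for the radio/cloud uplink

def filter_at_edge(readings):
    lo, hi = float("inf"), float("-inf")
    for v in readings:
        lo, hi = min(lo, v), max(hi, v)
        if v < VIB_MIN or v > VIB_MAX:
            # Send only the summary, not the 20 minutes of raw samples.
            transmit({"min": lo, "max": hi, "breach": v})

filter_at_edge([4.0, 4.1, 3.9, 8.3, 4.0])
```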

Figure 3: In a data-stream management system, any number of streams can enter the system, with each stream delivering elements on its own schedule, at its own rate and with its own data types. That variability is part of what makes IoT data analysis so difficult. To minimize storage requirements, only ‘summaries’ of the data are typically stored.

How this data is collected and processed is the subject of research into mining data streams at Stanford, as well as of the paper ‘Streaming-Data Algorithms for High-Quality Clustering’ by Liadan O'Callaghan, Nina Mishra, Adam Meyerson, Sudipto Guha and Rajeev Motwani.
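The clustering paper's central idea, often described as chunk-and-recluster, is to cluster the stream one memory-sized chunk at a time, retain only weighted cluster centers, and finally recluster those centers. The toy sketch below follows that structure, substituting plain k-means for the paper's k-median routine:

```python
import random

def kmeans_1d(points, weights, k, iters=20):
    """Weighted k-means on 1-D points; returns (centers, center_weights)."""
    centers = random.sample(list(points), k)
    for _ in range(iters):
        sums = [0.0] * k
        wsum = [0.0] * k
        for p, w in zip(points, weights):
            j = min(range(k), key=lambda c: abs(p - centers[c]))
            sums[j] += w * p
            wsum[j] += w
        centers = [sums[j] / wsum[j] if wsum[j] else centers[j] for j in range(k)]
    return centers, wsum

def stream_cluster(stream, k=2, chunk_size=100):
    """Chunk-and-recluster sketch: cluster each chunk, keep weighted centers,
    then recluster the retained centers into the final k centers."""
    kept_centers, kept_weights, chunk = [], [], []
    for point in stream:
        chunk.append(point)
        if len(chunk) == chunk_size:
            c, w = kmeans_1d(chunk, [1.0] * len(chunk), k)
            kept_centers += c
            kept_weights += w
            chunk = []
    if chunk:
        c, w = kmeans_1d(chunk, [1.0] * len(chunk), k)
        kept_centers += c
        kept_weights += w
    return kmeans_1d(kept_centers, kept_weights, k)[0]

# Two well-separated groups; the stream is never held in memory all at once.
data = [random.gauss(0, 1) for _ in range(500)] + [random.gauss(10, 1) for _ in range(500)]
random.shuffle(data)
print(stream_cluster(data))
```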

While the ultimate goal of truly useful, predictive analysis is still a long way off, the accelerating deployment of IoT technology and the research into data-stream analysis techniques will push IoT and its proponents rapidly up the learning curve.

Questions or comments on this story? Contact dylan.mcgrath@ihs.com

Related links:

IHS Connectivity & IoT

News articles:

MediaTek Rolls SoCs for Android Wearables, TVs

IoT a Boon for MEMS

IDEs for the IoT

8-bit Microprocessor Opens Path to Organic IoT

Intel Rolls IoT Reference Model

MegaChips, IMEC Develop Short-Range Radio for IoT


