Conceptualizing the 2020’s: The Decade of the Internet of Things

2020 heralds the decade in which the Internet of Things surges from the fringe to the fore of the data ecosystem.

This year begins with several lofty predictions about the number of connected devices in the IoT; many developments directly impacting its adoption rates will unfold in the coming 10 years, rendering it the premier expression of data management.

A number of trends in edge computing, 5G, cyber security, Artificial Intelligence, and digital twins will significantly alter what the IoT means to enterprises.

Opportunities for monetization will proliferate as, perhaps, will the potential for misuse.

Exactly which of these trajectories will dominate remains to be seen.

“The creators of IoT apps and the whole platform, including hardware and software, are coming up with ways we never envisioned for what IoT systems could do,” observed Cybera President Cliff Duffey.

“We’re kind of at the beginning of a big evolution cycle where we see more and more creative ways to use all of this collected data.”

Inverting the Edge

Some of the most innovative manifestations of the IoT—and the most beneficial to consumers—involve edge applications, which are being dramatically reshaped by a paradigm shift.

Conventionally, the endpoint devices at the IoT’s edge were considered simplified, “dumb” transmitters continuously emitting data to a centralized cloud for aggregation and analytics.

Today, however, that concept is being inverted: “When you want, [it’s] not for the remote device to send data, but instead you want the cloud to be able to reach in and, from the cloud side, either push data or pull, and talk directly with devices [at the edge],” Duffey remarked.

This model gives organizations more control over edge devices while offering an array of user benefits, particularly in retail.

Duffey referenced a use case in which consumers at gas stations access apps on their smart phones prior to purchasing fuel.

On demand, those apps access a centralized cloud that transmits the purchase information to the pump in use, so consumers receive credit for their purchases or discounts as part of the retailer’s rewards program.

The provider obtains detailed information about consumer habits for analysis, which can be monetized numerous ways.
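The cloud-initiated flow Duffey describes can be sketched as a toy in-memory model, with the cloud pushing offer data directly to a registered pump rather than waiting for the device to call home. The class and method names here are illustrative assumptions, not part of any real Cybera API.

```python
import json

class Pump:
    """An edge device the cloud can reach into directly (hypothetical)."""
    def __init__(self, pump_id):
        self.pump_id = pump_id
        self.last_message = None

    def receive(self, payload):
        # The pump acts on data pushed from the cloud side.
        self.last_message = json.loads(payload)
        return f"pump {self.pump_id}: applied {self.last_message['discount']} discount"

class Cloud:
    """Cloud-initiated flow: the cloud pushes to a device on demand."""
    def __init__(self):
        self.devices = {}

    def register(self, device):
        self.devices[device.pump_id] = device

    def push_offer(self, pump_id, customer, discount):
        # Inversion of the classic model: the cloud calls the edge,
        # rather than waiting for the edge to phone home.
        payload = json.dumps({"customer": customer, "discount": discount})
        return self.devices[pump_id].receive(payload)

cloud = Cloud()
cloud.register(Pump("7"))
print(cloud.push_offer("7", customer="alice", discount="5%"))
```

In a real deployment the push would travel over a broker or tunnel rather than a direct method call, but the control flow (cloud initiates, device reacts) is the same.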

“For the last 50 years, payment with magnetic stripe cards always began with swiping a card and a terminal reached out to request authorization,” Duffey reflected.

“Now, the payment flow starts on the cloud side trying to call the terminal.”

Artificial Intelligence of Things (AIoT)

This architectural shift toward sending data from the cloud to the edge, rather than only receiving it, is part of an overall movement bolstering edge computing.

Machine learning, the statistical branch of AI, is an integral aspect of furthering this movement, making edge computing much more viable for the enterprise.

 Edge computing is critical for making the IoT feasible for organizations, particularly when considering the massive quantities of uninterrupted data transmissions the IoT involves.

By processing that data at the edge, organizations are effectively “reducing the latency in data transfer, reducing IT costs, [and] saving network bandwidth and server space,” commented Juthika Khargharia, SAS Principal Solutions Architect, IoT Division.

Dynamic machine learning models are useful for analyzing data as they are generated at the edge and can incorporate the results of real-time IoT data via:

Incoming Training: According to Khargharia, with this approach organizations can evaluate streaming data for models and “also score it online, which means I am basically training the data, I’m updating my models, and scoring it as it is coming in.”

Offline Training: Described by Khargharia as a “multi-phase AI approach,” this method involves “model training offline on your historical data, and the intelligence from that model training is actually fed into a real-time streaming analytics engine that can do the data processing or the scoring in real time.”

Filtering: One of the basic requirements of using streaming data for AI models is the capacity for “intelligent filtering, intelligent pattern matching, intelligent data aggregation and data quality on the data as it is coming in streams,” Khargharia revealed.
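The incoming-training and filtering ideas above can be illustrated with a minimal, self-contained sketch: a scorer that updates a running mean and variance (Welford’s online algorithm) while flagging outliers, behind a simple validity filter. This is an illustrative stand-in, not SAS’s actual streaming analytics engine.

```python
import math

class StreamingScorer:
    """Trains and scores in a single pass over streaming sensor readings,
    a stand-in for the 'incoming training' approach described above."""
    def __init__(self, z_threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations (Welford)
        self.z_threshold = z_threshold

    def score_and_update(self, x):
        # Score first, against the model as it stood before this reading...
        anomalous = False
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.z_threshold:
                anomalous = True
        # ...then fold the reading into the model (online training).
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

def valid(reading):
    """Intelligent filtering: drop malformed readings before modeling."""
    return reading is not None and -50.0 <= reading <= 150.0

scorer = StreamingScorer()
stream = [20.1, 20.3, 19.8, 20.0, None, 20.2, 19.9, 20.1, 20.0, 95.0]
flags = [scorer.score_and_update(r) for r in stream if valid(r)]
print(flags)  # the 95.0 reading stands out as anomalous
```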

These methods are useful for any number of edge applications.

For example, they can help organizations leverage computer vision techniques to identify anomalies in manufacturing assembly lines.

Security Micro-Segmentation

Regardless of the use case, all IoT deployments require security frameworks that protect the endpoint devices and nexuses along the network.

Micro-segmentation is a time-honored approach that has become particularly relevant for the IoT.

According to Duffey, micro-segmentation “provides a unique, separate path: kind of like a dedicated LAN path for that device.” Thus, data traveling from endpoint devices to central locations isn’t part of a larger network that, once infiltrated, enables intruders to access the entirety of an organization’s IT resources.

Software defined perimeters are a means of micro-segmenting individual devices and central locations at the application level.

DH2i CEO Don Boxley explained that this option “improves the security of data flows between devices by removing an IoT device’s network presence.” Aided by various forms of encryption, transmissions secured by software-defined micro-segmentation “eliminate any potential attack surfaces,” Boxley said.

These approaches are gaining traction throughout the IoT space because they diminish opportunities for denial-of-service attacks. Were endpoint devices somehow compromised, this method all but eliminates the possibility of “creating a lateral network attack surface,” Boxley pointed out.

Micro-segmenting IoT networks makes endpoint devices much less of a weakness than they otherwise are.
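A software-defined perimeter of the kind Boxley describes can be reduced to a toy policy check: each device is permitted exactly one dedicated path, so a compromised device gains no lateral reach. The device names, hostnames, and policy shape below are hypothetical.

```python
# Toy software-defined perimeter: each device gets its own permitted path,
# so a compromised device cannot reach laterally into other segments.
# All identifiers are illustrative, not from any real DH2i product.

ALLOWED_PATHS = {
    "pump-7":   {"payments.example.internal"},
    "camera-2": {"video-ingest.example.internal"},
}

def authorize(device, destination):
    """Permit traffic only along a device's dedicated path."""
    return destination in ALLOWED_PATHS.get(device, set())

print(authorize("pump-7", "payments.example.internal"))      # own path: allowed
print(authorize("pump-7", "video-ingest.example.internal"))  # lateral move: denied
```

Real software-defined perimeters enforce this with per-device encrypted tunnels rather than a lookup table, but the effect is the same: no network presence beyond the authorized path.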

IoT Optimization

The enduring appeal of the IoT is that it enables organizations to gain real-time visibility into data-related assets.

When pairing the IoT with cognitive computing applications, organizations get stunningly accurate predictions of the needs of individual assets.

With digital twins, and predictive digital twins in particular, organizations can not only model an entire production environment, but also obtain predictions for a host of factors impacting it several weeks in advance.

The result is an ability to achieve optimization across a diversity of contextual considerations that maximizes the IoT’s value.
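As a minimal sketch of a predictive digital twin, consider a model of a single machine that mirrors its measured wear and projects that trajectory forward to estimate days until maintenance. The constant-wear model and all parameter names are illustrative assumptions, far simpler than a production ensemble-AI pipeline.

```python
class MachineTwin:
    """A toy predictive digital twin of one machine: it mirrors the
    asset's measured state and projects wear forward in time.
    The linear wear model is an illustrative assumption."""
    def __init__(self, wear=0.0, wear_per_day=0.06, failure_at=1.0):
        self.wear = wear
        self.wear_per_day = wear_per_day
        self.failure_at = failure_at

    def sync(self, sensor_wear):
        """Mirror the physical asset's latest measured state."""
        self.wear = sensor_wear

    def days_until_maintenance(self):
        """Project the current trajectory forward, rounded to whole days."""
        remaining = self.failure_at - self.wear
        return max(0, round(remaining / self.wear_per_day))

twin = MachineTwin()
twin.sync(sensor_wear=0.40)
print(twin.days_until_maintenance())  # headroom in days at the current rate
```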

Future digital twins empower organizations by “predicting the future context [of operational environments] through the week, of how that context will evolve through the course of the week, and providing a plan that will take that into account,” maintained Smartest Systems Principal Consultant Julian Loren.

Implicit in utilizing digital twins for intelligent predictions is the deployment of what Loren characterized as “ensemble AI,” in which methods from a variety of statistical cognitive computing approaches are amalgamated, including “basic algorithms that would be used for machine learning. Also, other types of approaches: topological data analysis, [and] sometimes we might have to do a branch imbalance solve as a mini-solve to basically have an algorithm that figures out that it’s gotten so bad there’s no point in actually exploring along this path.”

Digital twins are critical to the IoT’s future because they’re no longer solely confined to manufacturing; they’re rapidly spreading to other verticals like health care and hospitality.

The 5G Effect

According to EWSolutions CEO David Marco, the promise of 5G is “the biggest change in technology which no one is talking about.” Its purported benefits are perhaps best understood in comparison to 4G’s wireless connectivity.

Because the band of 5G is considerably broader than that of 4G, the former offers “very low latency, huge amounts of data, energy savings, cost reductions, and all these other things,” Marco mentioned.
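The tradeoff between carrier frequency and range behind these comparisons can be made concrete with the standard free-space path-loss formula, which shows how much more power a higher-frequency signal loses over the same distance. The specific bands chosen here (1.9 GHz for 4G, 28 GHz for millimeter-wave 5G) are illustrative examples, not figures from the article.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB:
    FSPL = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# Same 500 m path: a 1.9 GHz 4G band vs. a 28 GHz millimeter-wave 5G band.
loss_4g = fspl_db(500, 1.9e9)
loss_5g = fspl_db(500, 28e9)
print(f"4G: {loss_4g:.1f} dB, 5G: {loss_5g:.1f} dB, "
      f"extra loss: {loss_5g - loss_4g:.1f} dB")
```

The higher band loses roughly 23 dB more over the same free-space path, which is why millimeter-wave deployments need far denser infrastructure (and why obstacles hit them harder).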

Nonetheless, these same characteristics of 5G—specifically the width of its band—are responsible for some very pressing issues related to:

Structural Penetration: Whereas the band of 4G pervades solid structures with minimal difficulty, “5G does not penetrate solid structures like walls and windows nearly as well as 4G,” Marco divulged.

Infrastructure: Largely due to the breadth of its band’s transmissions, “in 5G those decay at such a faster rate than 4G, so they can’t shoot those nearly as far without having another structure waiting for it to boost it up,” Marco disclosed. “We’re talking billions, if not trillions, of dollars of infrastructure.”

Health Concerns: According to Marco, the health repercussions of 5G are largely unknown.

“There are strong concerns that even in a 4G world that is causing some of the health issues people are experiencing,” Marco said. “What happens when we quintuple the amount of equipment sitting out there, and we’re projecting these huge capacities of waves…flying [through people’s bodies]? What are the health concerns?”

Maturation

The IoT is set to reach a level of maturation in this decade that realizes its vision of a distributed data network of continuous connectivity.

Bilateral communication to and from the edge is fortifying this reality, as is the abundance of ways in which cognitive computing underpins edge processing.

Security measures are strengthening through micro-segmentation approaches, while digital twins offer unprecedented insight into predictive means for optimizing IoT productivity.

If implemented properly, the onset of 5G could transform science projects like autonomous driving into quotidian realities.

Viewed from this perspective, the IoT will not only represent the most innovative aspects of data management, but also the most productive.

About the Author

Jelani Harper is an editorial consultant serving the information technology market.

He specializes in data-driven applications focused on semantic technologies, data governance and analytics.
