
How edge computing is accelerating innovation across hardware, software and service provider domains




An increasing number of enterprises are placing more emphasis on edge computing. According to a report from AT&T Cybersecurity, 75% of security leaders are planning an edge use case, are in the process of deploying one, or have fully deployed one. This momentum is largely attributed to the technology’s capacity to conserve bandwidth, speed up response times and enable data to be processed with fewer restrictions. In fact, the Linux Foundation’s State of the Edge report predicts that enterprise use of edge computing will expand substantially by 2028.
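Those bandwidth and latency benefits come from doing work close to where data is produced. As a minimal sketch of the idea, an edge node might pre-aggregate a raw sensor stream locally and forward only compact summaries upstream; the EdgeAggregator class and forward_to_cloud function below are hypothetical names invented for illustration, not part of any product mentioned in this article.

```python
# Illustrative sketch: an edge node buffers raw readings and sends only
# a small summary upstream, conserving backhaul bandwidth.
# All names here are hypothetical, invented for this example.
from statistics import mean


class EdgeAggregator:
    def __init__(self, window_size: int = 100):
        self.window_size = window_size
        self.buffer: list[float] = []

    def ingest(self, reading: float) -> dict | None:
        """Buffer a raw reading; emit one compact summary per window."""
        self.buffer.append(reading)
        if len(self.buffer) < self.window_size:
            return None  # nothing sent upstream yet
        summary = {
            "count": len(self.buffer),
            "mean": mean(self.buffer),
            "min": min(self.buffer),
            "max": max(self.buffer),
        }
        self.buffer.clear()
        return summary  # one small payload instead of 100 raw readings


def forward_to_cloud(summary: dict) -> None:
    print(f"uplink payload: {summary}")  # stand-in for a real uplink call


agg = EdgeAggregator()
for i in range(250):  # simulated sensor stream
    if (s := agg.ingest(20.0 + (i % 7) * 0.1)) is not None:
        forward_to_cloud(s)
```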

During VentureBeat’s Transform 2022 virtual conference, David Shacochis, vice president of product strategy for enterprise at Lumen, moderated a panel discussion on how edge computing is transforming use cases and strategies for some of the giants of the industry across hardware, software and service provider domains.

The discussion also featured Shacochis’s colleague Chip Swisher, who runs the internet of things (IoT) practice for Lumen; Rick Lievano, CTO for the worldwide telecommunications industry at Microsoft; and Dan O’Brien, general manager for HTC Vive.

Technology evolutionary cycles

Shacochis said computing power has gone through evolutionary cycles that oscillate between centralized and distributed models. Looking across periods of technological achievement, he said steam power enabled mass-production industries, while electrical distribution fueled the modern industrial economy that brought about the dawn of computing power in microprocessing. This, he said, has now led to the present day and what is being called the Fourth Industrial Revolution.

He further noted that computing power dawned with the mainframe, distributed out to client-server models, then consolidated back toward the cloud, bringing business logic into more centralized postures.

“Now we’re seeing this explosion of all the different sources of data, the different ways to process that data, the different kinds of sensor actuator cycles that can really add a lot of value to customer experiences and industrial efficiency,” Shacochis said. “All these different kinds of business outcomes from the many different ways to leverage the edge. So, those industrial cycles occurring across decades, the computing cycles occurring across even smaller periods of years, have really led us to this exciting time in the industry.”

The Fourth Industrial Revolution

Examining the Fourth Industrial Revolution era from a hardware perspective, O’Brien said HTC started out as an original design manufacturer (ODM), making motherboards and chipsets for other companies’ products and PCs. He added that the company moved quickly into application-specific integrated circuit (ASIC) chips and GPUs, which evolved into smartphone technology.

O’Brien noted that “many people don’t realize that was the dawn of what we see today in the extended reality [XR] world, building these new types of immersive products. It actually evolved from so much of the chipsets and evolved so much from the smartphones. What’s in modern virtual reality [VR] headsets and displays is a smartphone panel that was powered by the need to have higher visual quality and fidelity inside of a smartphone.”

“Now we’re seeing where we need even more processing power,” he continued, “We need even more visual quality and performance inside of VR headsets for an XR headset and an augmented reality [AR] type of solution. We’re seeing this increase in terms of the demand and the overall performance needs. The additional products require large PCs and GPUs to make this stuff all work. Now, we’re actually moving all of this into a cloud environment.”

He added that artificial intelligence (AI) and machine learning (ML) will now also optimize the processes behind all the virtual content and interactions.

Additionally, Lievano said the cloud has changed everything, and the edge is an extension of the cloud. He noted that Microsoft talks quite a bit about this notion of the intelligent cloud and intelligent edge, which he believes is a way to deliver applications across the entire computing canvas, to wherever they’re needed.

“As a developer, you like to think that you can build an app once,” Lievano said. “You want to use the latest technologies, which right now is cloud-native principles, and you want to be able to deploy it anywhere, whether it’s a public cloud off in the sky, or an edge location. So, this vision that we have of intelligent cloud and intelligent edge is largely dependent on our telco partners because at the end of the day, they provide that connectivity — the connective tissue that’s required for this vision to become a reality. But the cloud needs to connect to the edge. And without telcos like Lumen, there’s no intelligent edge.”

According to Lievano, this is unlike the move from mainframe to client-server, where each environment had its own development model and its own governance. The cloud-native capabilities are the same whether they’re available in Azure or at the edge, he said.

He also noted that on the edge, you may have a subset of those cloud capabilities because of scale, but the programming model, devops model, management portals, management interfaces and APIs are all the same. The edge, he said, simply becomes another cloud region for a developer to deploy applications to, and that’s a huge difference from the mainframe and client-server eras.
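A minimal sketch of that “edge is just another cloud region” idea follows, assuming a generic deployment interface: the same spec is applied to a public-cloud region or a telco edge site purely by changing the target name. The DeploymentSpec class, deploy function and region names are all hypothetical, not Azure or Lumen APIs.

```python
# Sketch: one cloud-native deployment spec, many targets. The edge site
# is addressed exactly like a public-cloud region. Hypothetical names only.
from dataclasses import dataclass


@dataclass(frozen=True)
class DeploymentSpec:
    app: str
    image: str
    replicas: int


def deploy(spec: DeploymentSpec, region: str) -> None:
    # A real platform would call the provider's API here; the point is
    # that the spec and tooling are identical for every target.
    print(f"deploying {spec.app} ({spec.image}, x{spec.replicas}) to {region}")


spec = DeploymentSpec(
    app="inventory-api",
    image="registry.example/inventory:1.4",
    replicas=3,
)

# Public-cloud region and metro edge site, same workflow.
for region in ["cloud-east-1", "metro-edge-denver"]:
    deploy(spec, region)
```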

“Again, as a developer, I’m amazed at the advances in tooling, especially in the last few years,” Lievano said. “AI, for example, has had an incredible influence not only in the applications that we create as developers, but also in how we write and develop those applications. So, the cloud gives you limitless compute capabilities [that are] really at your fingertips. Again, scale is not an issue, but features like serverless computing, for example, enable you to take your applications to the next level. In essence, you will be able to create and deploy complex applications using microservices.”
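The serverless pattern Lievano mentions can be illustrated with a short, platform-agnostic sketch: a small, stateless handler that the platform invokes and scales on demand. The event shape and handle function below are generic placeholders, not any one provider’s actual API.

```python
# Sketch of a serverless-style function: stateless, event-in/response-out,
# with no servers to provision. Generic shapes, not a specific provider's API.
import json


def handle(event: dict) -> dict:
    """Stateless request handler the platform would invoke per event."""
    name = event.get("name", "world")
    body = {"message": f"hello, {name}"}
    return {"statusCode": 200, "body": json.dumps(body)}


# Local invocation for illustration; in production the platform calls handle().
print(handle({"name": "edge"}))
```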

Evolution of IoT

From a solutions and service provider perspective, Shacochis said the cloud and some of its tools make some things easier, but the opportunities and customer expectations make things more complex. Swisher, speaking from his specialty in IoT, said that while some call IoT a new concept, it has in reality been around for more than 20 years. At its core, he said, IoT describes the ability to take data off machines and devices and perform operations with it.

“I’ve experienced the wave of what I call IoT 2.0, where you may have had, on a factory floor, a localized production line control machine that was doing processing there locally,” Swisher noted. “Then we saw the advent of moving that out to the cloud, and different stovepipe cloud providers providing centralized end-to-end solutions in that space. Now we’re really seeing the need for integration on the IoT 2.0, where we’re starting to see cross-stovepipe use cases, having data coming from multiple different IoT infrastructures and IoT paradigms and being able to bring all that data together into a single view.”
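One way to picture the “single view” Swisher describes: telemetry arrives from different stovepipes in different shapes and is normalized into one record format before analysis. Both vendor payload shapes in the sketch below are invented for illustration.

```python
# Sketch: normalizing cross-stovepipe IoT telemetry into one schema.
# Both vendor payload formats are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class Reading:
    source: str
    device_id: str
    metric: str
    value: float


def from_vendor_a(payload: dict) -> Reading:
    # Vendor A nests values under a "data" object.
    d = payload["data"]
    return Reading("vendor_a", payload["deviceId"], d["type"], d["value"])


def from_vendor_b(payload: dict) -> Reading:
    # Vendor B uses flat, differently named fields.
    return Reading("vendor_b", payload["id"], payload["metric"], payload["reading"])


unified = [
    from_vendor_a({"deviceId": "press-07", "data": {"type": "temp_c", "value": 71.2}}),
    from_vendor_b({"id": "agv-21", "metric": "battery_pct", "reading": 64.0}),
]
for r in unified:
    print(r)  # one schema, regardless of the originating stovepipe
```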

Swisher added that machine learning is the next evolution: bringing that data together to gain full visibility across everything going on across the city, plant, warehouse and distribution operations.

He noted that IoT 2.0 “creates new challenges both from a compute standpoint and network integration and services standpoint, where there’s a need to compute even closer to those aspects because building all those things together, we really need the ability to have that happen even more in real time to be able to adjust as we need it. The concept of using compute on-premises, you know, compute in a metro edge or a near-edge capability, as well as the cloud, and being able to have all those out there to bring all of those pieces together and be able to move compute around those different locations really has become critical.”
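One way to picture moving compute among those locations is a simple placement rule keyed to a workload’s latency bound. The tiers and round-trip numbers in the sketch below are illustrative assumptions, not measured figures from the panel.

```python
# Sketch: pick where a workload runs (on-premises, metro/near edge, or
# cloud) from its latency requirement. Latency figures are assumed.
TIERS = [
    ("on-premises", 2),     # approx. round-trip latency in ms (assumed)
    ("metro-edge", 15),
    ("cloud-region", 60),
]


def place(workload: str, max_latency_ms: int) -> str:
    """Return the most centralized tier that still meets the latency bound."""
    for tier, rtt in reversed(TIERS):  # prefer the cloud when latency allows
        if rtt <= max_latency_ms:
            return tier
    raise ValueError(f"{workload}: no tier meets {max_latency_ms} ms")


print(place("robot-control-loop", 5))   # -> on-premises
print(place("video-analytics", 20))     # -> metro-edge
print(place("batch-reporting", 500))    # -> cloud-region
```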

Don’t miss the full discussion of how edge computing is transforming use cases and strategies for some of the giants of the industry across hardware, software and service provider domains.

VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Learn more about membership.
