The Unintended Consequences of the IoT’s Dramatic Growth

    There is another side to the boundless opportunities associated with today’s rapidly growing IoT. Are we nearing a point where existing infrastructure can no longer handle the weight of the connected devices that rely on it, and if so, where do we go from here?

    By Charles Yeomans, founder and CEO of Atombeam.

    In the not-too-distant past, operators of CDMA and GSM networks wondered whether the rapid proliferation of smartphones, and more specifically the data associated with them, would overwhelm the mobile networks of the day. Thankfully for the then-nascent IoT, those fears never materialized, as providers made the capital investments needed to scale their networks rapidly.

    Today, we are witnessing a similar scenario unfold, but on a far greater scale. Not only have now-ubiquitous smartphones become more powerful and more data-hungry, but the sheer number and variety of connected devices continue to increase exponentially. According to Parks Associates, the average U.S. household now has 17 connected devices, everything from smart televisions to connected appliances, security systems and more.

    Importantly, even those devices which have the smallest footprint, such as a temperature sensor for a connected thermostat, have an impact on networks as frequent pings add up and generate data that must be moved, used and sometimes secured. But all connected devices are, of course, not equal when it comes to the impact they have on the wired, wireless, cellular and satellite networks that make connectivity possible.
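    The cumulative impact of those small, frequent pings can be sketched with simple arithmetic. The per-household device count below is the Parks Associates figure cited above; the household count, per-ping size and ping frequency are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope estimate of aggregate traffic from small-device pings.
# HOUSEHOLDS, PING_BYTES and PINGS_PER_DAY are illustrative assumptions.
HOUSEHOLDS = 130_000_000        # rough U.S. household count (assumption)
DEVICES_PER_HOUSEHOLD = 17      # Parks Associates figure cited above
PING_BYTES = 512                # assumed size of one status ping
PINGS_PER_DAY = 288             # assumed one ping every five minutes

daily_bytes = HOUSEHOLDS * DEVICES_PER_HOUSEHOLD * PING_BYTES * PINGS_PER_DAY
daily_terabytes = daily_bytes / 1e12
print(f"{daily_terabytes:,.0f} TB per day from status pings alone")
```

    Even under these modest assumptions, status traffic alone lands in the hundreds of terabytes per day, before any payload data moves at all.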

    That fact becomes painfully clear when you look at emerging and increasingly mainstream IoT use cases in which machines generate and rely on massive data volumes. Cars are but one example.

    In 2022, sales of connected cars eclipsed those of unconnected ones for the first time. And while the amount of data they generate and use varies, an estimated 50% to 70% of that information is ultimately sent to the cloud. The amount of data used climbs as more advanced features are added, everything from sensors that monitor engine health to entertainment systems. Even the most rudimentary connectivity is estimated to create 25GB of data per hour of a vehicle's operation.

    Those numbers increase dramatically with the addition of autonomous features. A Waymo taxi, for example, has 29 cameras and uses powerful AI to analyze and act on the environment around it – a process that is estimated to create at least a terabyte of data per hour, with some putting the estimation much higher.

    All of this is noteworthy for a simple reason. While not all of the data created will be transmitted beyond the car or stored independently, connected and autonomous cars represent a single use case that, on its own, could dramatically weigh down existing networks. There are, after all, nearly 300 million cars and trucks on the road in the U.S. alone, a reality that will increasingly strain networks as more connected cars, and more cars with autonomous features, including full self-driving capabilities, hit our roads.
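    The scale of the automotive use case can be put in rough numbers using the figures above. The 25GB-per-hour baseline, the 50% to 70% cloud share, and the 300 million vehicle count come from the article; the assumption of one hour of driving per vehicle per day is purely illustrative.

```python
# Rough fleet-level data estimate; driving time per day is an assumption.
VEHICLES = 300_000_000          # cars and trucks on U.S. roads (cited above)
HOURS_PER_DAY = 1.0             # assumed average hours driven per vehicle
GB_PER_HOUR = 25                # baseline connected-car figure cited above
CLOUD_SHARE_LOW, CLOUD_SHARE_HIGH = 0.50, 0.70  # share sent to the cloud

generated_gb = VEHICLES * HOURS_PER_DAY * GB_PER_HOUR
to_cloud_low = generated_gb * CLOUD_SHARE_LOW
to_cloud_high = generated_gb * CLOUD_SHARE_HIGH
print(f"generated: {generated_gb / 1e9:.1f} exabytes per day")
print(f"to cloud:  {to_cloud_low / 1e9:.2f} to {to_cloud_high / 1e9:.2f} EB per day")
```

    Even at baseline connectivity, that works out to several exabytes per day headed upstream, before any autonomous features multiply the per-vehicle figure.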

    But needless to say, cars are not the only issue. Drones, too, are increasingly in use for everything from military applications to package delivery. And like their automotive counterparts, they create and use massive amounts of data.

    Then there are the workloads that, while not specifically associated with the IoT, still use the same data centers and pipes. The most visible of these, generative AI, already threatens to upend networks as we know them. That prospect led Google’s former CEO Eric Schmidt to note in recent congressional testimony that data center energy consumption could quickly grow from 3% of the power generated today to 99% of it, with data centers requiring 67 additional gigawatts of power by 2030.

    More broadly we are also beginning to get a better sense of just how much data is involved with, and a byproduct of, the IoT. Although estimates vary widely, one thing is certain: With more connected devices, a greater reliance on applications of AI within them, and the rapid growth of edge computing, data volumes are increasing dramatically. IDC predicts that the billions of IoT devices already online will create 90 zettabytes of data in 2025.

    We are also beginning to see the impact of dramatic increases in data volume. This brings to mind some very important questions that will need to be addressed not just by those immersed in the IoT, but also the larger IT ecosystem that makes the IoT possible.

    • Have we finally reached a true data deluge? Are we running out of infrastructure capacity? While it can be argued that the growth of the IoT in and of itself would not upend networks as we know them, the relatively sudden growth of AI and the use of large language models (LLMs) have already tipped the balance. Recent research from McKinsey & Company estimates that to avoid a deficit in overall network capacity by 2030, twice the capacity built since 2000 will need to be put into operation in a quarter of the time. McKinsey estimates that, as a capital expense, this will require “$6.7 trillion worldwide to keep pace with the demand for compute power” by 2030.

      But even were this to occur, there is then the issue of power. Hyperscalers continue to look to nuclear power, with Microsoft planning to restart Three Mile Island and others hoping to roll out small reactors at their data centers, approaches that introduce significant security and safety considerations of their own. Whether such measures would suffice is also subject to debate.

    • Will our standard response to more data suffice? For decades, our collective answer to transformative computing trends that create more data has been to address them with still more innovation, specifically faster chips and more powerful processors. The current trajectory of IoT and AI infrastructure upends that dynamic. Moore’s Law, which held relatively true and enabled us to effectively address new and even greater data demands, no longer applies. We simply can’t build hardware that is powerful enough to address the shortfalls in capacity we are now seeing, particularly as it relates to AI.
    • And what about security? The intersection of the IoT and AI presents significant challenges beyond the issues of network capacity and power consumption. AI also threatens to make networks harder to shore up and safeguard in the face of AI-powered efforts to discover new vulnerabilities and produce deepfakes. And then there is the longstanding issue of IoT device security, with recent research by Forescout finding that IoT device vulnerabilities have increased 136% since 2023. Perhaps most concerning, many low-powered and lightweight devices such as sensors have no encryption at all, despite the fact that they sit at the edge of networks and present bad actors with a literal open door. The very real risks associated with this reality threaten to impact enterprises in much the same way as the network capacity shortfall we are heading toward.
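    On the device-security point, even sensors too constrained for full payload encryption can at least authenticate what they send. The sketch below is a hypothetical illustration using Python's standard-library hmac module, not any particular vendor's stack; it shows how a shared-key message authentication code lets a network reject forged or tampered readings.

```python
import hashlib
import hmac
import json

# Hypothetical per-device key; in practice it would be provisioned securely.
SHARED_KEY = b"provisioned-per-device-secret"

def sign_reading(reading: dict) -> dict:
    """Attach an HMAC-SHA256 tag to a sensor reading."""
    payload = json.dumps(reading, sort_keys=True)
    tag = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_reading(message: dict) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(SHARED_KEY, message["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_reading({"sensor": "thermostat-17", "temp_c": 21.5})
assert verify_reading(msg)                       # genuine reading accepted

tampered = dict(msg, payload=msg["payload"].replace("21.5", "99.9"))
assert not verify_reading(tampered)              # altered reading rejected
```

    Authentication alone does not hide the data, but it closes the open door of a device whose messages could otherwise be forged by anyone on the network.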

    Perhaps most importantly, there is the question of where we go from here. We must collectively accept that the cumulative, compounding impacts of IoT growth and AI adoption will not be addressed by a single response or solution. It is clear that our longstanding focus on hardware will not suffice, and quantum computing is not yet mature enough to fill the gap.

    The current trajectory of IoT and AI is also unsustainable and plagued by architectural inefficiency. Current AI workloads using LLMs, for example, recompute context from scratch with every interaction rather than reusing work already done.
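    The general principle, reusing work instead of recomputing it, can be illustrated with a simple memoization sketch. This is an analogy only; production LLM serving stacks apply the same idea by caching attention key/value states rather than Python function results.

```python
from functools import lru_cache

calls = {"count": 0}

@lru_cache(maxsize=None)
def embed(token: str) -> int:
    """Stand-in for an expensive per-token computation (illustrative)."""
    calls["count"] += 1
    return sum(token.encode())

prompt = ["the", "quick", "brown", "fox"]
for _ in range(100):                 # one hundred interactions, same context
    results = [embed(t) for t in prompt]

print(calls["count"])  # 4: each token computed once, then served from cache
```

    Without the cache the stand-in computation would run 400 times; with it, each repeated input is computed exactly once, which is the kind of reuse current architectures often forgo.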

    For these reasons, the path forward must be one in which we look beyond the usual suspects. That includes reimagining the very nature of data itself – merely transitioning away from binary code promises significant gains – and considering how we can rearchitect AI infrastructure in fundamentally new ways. The time for bold innovation is now.

    About the author: Charles Yeomans is the founder and CEO of Atombeam, whose Data-as-Codewords technology fundamentally changes how computers communicate while simultaneously decreasing the size of data by 75% and increasing available bandwidth by 4x or more.

    iotbusinessnews.com (Article Sourced Website)
