Technology is constantly evolving and maturing. It moves from a nascent, undefined entity to one that is widely accepted and becomes so commonplace that we wonder how we ever functioned without it. Mobile and cloud are two prominent examples of such a journey. From businesses to individual consumers, everyone is connected to everyone, everywhere, all the time. The Internet of Things has been THE hot topic for the past couple of years, and that trend continues into 2016 and beyond. More and more companies are working to give appliances, automobiles, and even lightbulbs an Internet connection and the ability to be controlled remotely. Efforts to solve the problems that have held the IoT back from realizing its full potential are among the most compelling technology trends to watch in 2016. Here are some of the specifics to pay attention to over the next year.
More Internet of Things & Fog Computing
Early iterations of IoT introduced security concerns, and many of those remain. The prospect of filling homes with many different types of connected devices also exposed data-processing and network-bandwidth shortcomings that stretched the limits of the mobile devices end users rely on to control IoT networks. Simply put, 3G and 4G cellular networks have not kept up with the volume of data that IoT networks need to send, receive, and process. Both the consumer and business markets are fully invested in working out the bugs in the IoT rather than scrapping the idea as a pipe dream. One of the most promising solutions, at least at the consumer level, is the wireless micro-network approach originally dubbed ‘Fog Computing’ by Cisco’s marketing department. Cisco, IBM, and others are launching new lines of routers equipped to handle small amounts of data storage and processing on top of providing the Internet connection for the IoT network. Those routers would become the main hub of a home or office IoT network, improving data processing while eliminating the need for many IoT devices to connect to the Internet at all. The router would handle it instead.
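To make the fog-computing idea concrete, here is a minimal sketch of the pattern described above. All names (`EdgeHub`, `ingest`, `flush`) are invented for illustration, not any vendor's actual API: an edge hub such as a router collects raw readings from local devices, processes them on-site, and forwards only a small summary upstream, so most traffic never leaves the local network.

```python
from statistics import mean

class EdgeHub:
    """Hypothetical fog-computing hub: raw IoT readings stay local;
    only aggregated summaries are sent over the wide-area network."""

    def __init__(self, upload_threshold=10):
        self.buffer = []                       # raw readings, kept locally
        self.upload_threshold = upload_threshold

    def ingest(self, device_id, reading):
        """Receive a raw reading from a local device."""
        self.buffer.append((device_id, reading))
        if len(self.buffer) >= self.upload_threshold:
            return self.flush()
        return None                            # nothing sent upstream yet

    def flush(self):
        """Aggregate locally, then hand only the summary to the cloud."""
        summary = {
            "count": len(self.buffer),
            "avg": mean(r for _, r in self.buffer),
        }
        self.buffer.clear()
        return summary                         # the only data the WAN sees

hub = EdgeHub(upload_threshold=3)
hub.ingest("thermostat", 20.0)                 # stays local
hub.ingest("thermostat", 22.0)                 # stays local
summary = hub.ingest("sensor-2", 21.0)         # threshold hit: one summary out
```

The design choice this illustrates is the bandwidth one from the paragraph above: three raw readings produce a single small upload, and the ratio only improves as the threshold grows.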
Device Mesh & Ambient User Experience
The expanding set of endpoints—smartphones, wearables, home electronics, automotive devices, and the sensors in IoT devices—that users employ to access network applications and data, or to interact with other people, communities, and entities, is being termed a ‘device mesh.’ These devices are all connected to network back-ends to perform commands ranging from adjusting a thermostat to interfacing with BPM software, but they typically operate unilaterally, separate from one another, each in its own silo. Designing next-generation devices to interact with the other devices in the mesh, however, would create more seamless and ambient user experiences. Gartner, the technology research and consulting firm, expects to see device capabilities expanded in this way.
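A small sketch may help contrast the silo model with the mesh model the article describes. Everything here is hypothetical (the `DeviceMesh` class, the `user/awake` topic): the point is simply that devices publish events onto a shared bus that any other device can subscribe to, instead of each talking only to its own back-end.

```python
from collections import defaultdict

class DeviceMesh:
    """Toy publish/subscribe bus standing in for a device mesh."""

    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> handlers

    def subscribe(self, topic, handler):
        """A device registers interest in a kind of event."""
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        """A device announces an event; every subscriber reacts."""
        for handler in self.subscribers[topic]:
            handler(event)

mesh = DeviceMesh()
log = []

# A wearable detects the user waking; the thermostat reacts. Two devices
# from different vendors coordinate through the mesh rather than in silos.
mesh.subscribe("user/awake", lambda e: log.append(f"thermostat: set {e['temp']}C"))
mesh.publish("user/awake", {"temp": 21})
```

In the silo model the wearable's data would go only to the wearable vendor's cloud; here any subscribed endpoint can act on it, which is the ambient-experience effect described above.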
Adaptive Security Architecture
The hacker industry is growing in pretty much every possible dimension. In fact, the U.S. Justice Department reports that a significant majority of the warfare happening in the world right now is cyberwarfare. Today’s business organizations face an expanded threat surface and hackers of every stripe. Relying on the old model of perimeter-based defense and rule-based security will become more and more inadequate as businesses look to solve their data storage and processing problems with cloud and hybrid-cloud architectures. IoT devices and the device mesh also introduce new network-architecture demands that must be met to make them viable and worthwhile solutions. Hardware manufacturers are turning to cognitive computing—computers that mimic the human brain—for solutions. IBM, for example, recently invested $3 billion in research and development in synthetic brains and quantum computing.
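The shift from rule-based to adaptive security can be sketched in a few lines. This is a deliberately simplified illustration, not any vendor's product: instead of a fixed allow/deny rule, the monitor learns a baseline of normal activity and flags large deviations from it, so it can catch behavior no static rule anticipated.

```python
from statistics import mean, stdev

class AdaptiveMonitor:
    """Toy behavior-based monitor: learn what 'normal' looks like,
    then flag activity that deviates far from that baseline."""

    def __init__(self, baseline, z_threshold=3.0):
        self.mu = mean(baseline)               # learned normal level
        self.sigma = stdev(baseline)           # learned normal variation
        self.z_threshold = z_threshold

    def is_anomalous(self, value):
        """True when the value sits far outside the learned baseline."""
        if self.sigma == 0:
            return value != self.mu
        return abs(value - self.mu) / self.sigma > self.z_threshold

# Baseline: typical requests per minute observed from one device.
monitor = AdaptiveMonitor([48, 52, 50, 49, 51])
```

A static rule like "block above 1,000 requests per minute" would miss a device that normally sends 50 and suddenly sends 500; the learned baseline catches it, which is the inadequacy of rule-based security the paragraph above points to.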
The Internet of Things, and the structural modifications to the computer networks that connect all those things, are spawning massive shifts in the way businesses operate. ‘Things,’ so to speak, will be very interesting to watch as the computing technology that controls them approaches maturity.