A brief history and future of the Internet of Things
6:55am Five minutes before Lyle is scheduled to wake up. Wrist monitors check his pulse to determine the best time to stimulate him awake. Good, he has been asleep for at least eight hours, and his heart rate and breathing are almost optimal. A quick traffic check confirms there is no need to wake him early. His water heater starts up for his daily morning shower, and the bathroom thermostat rises in anticipation of when he gets out.
7:05am Lyle’s coffee maker turns on and starts brewing a fresh cup of joe. His fridge checks that he has his usual breakfast ingredients (orange juice, eggs, yogurt, and a banana) and orders more eggs for the next week.
7:35am Lyle leaves his house on time, refreshed by his shower and breakfast and ready for the day ahead.
All of this has become possible because of a recent paradigm shift in technology known as the Internet of Things, or, as it is most commonly called in tech circles and articles, the IoT.
In 1999, Kevin Ashton, a British technologist who helped found the Auto-ID Center at the Massachusetts Institute of Technology, coined the term ‘Internet of Things,’ but the idea of devices connecting with each other dates back to the creation of the internet itself. The dawn of the internet age kickstarted an era of simultaneous growing and shrinking. The amount of information that could be created, stored, and shared grew exponentially with the ability to gather it from across the world (or at least from wherever servers existed at the time). At the same time, the world shrank: places and people that once seemed far away and beyond one’s reach could now be contacted and interacted with on a more personal level.
Unfortunately, the interactions the early internet allowed were limited to the few scholarly elites and academic institutions that invested in it. Connecting devices was even more restricted: wireless technology did not yet exist, so every connection to the internet required a wired Ethernet link. As a result, machine-to-machine (M2M) interactions were nearly impossible, and M2M links over long distances were unheard of; internet interaction happened solely between a computer and a human.
How did the Internet of Things come to be then?
Futurist and technologist Richard Yonck, who has written extensively about the IoT, explained the proliferation of devices connected to the internet and to each other:
"If you think about it, the IoT is a fairly natural evolution of processing and communications technologies. Computers have continued to become smaller and cheaper over the decades. As they continue doing so, where will they go and how can we use them? Throughout our environment, naturally!"
The first internet-capable machines do not seem like much today. In the early 1980s, programmers and engineers at Carnegie Mellon University developed the first appliance connected to the internet: they rigged a Coca-Cola machine to send status updates about the availability of a cold can of Coke so that a trip to the snack area would not be in vain.
Other similarly sized projects became the norm for bored or experimental college students with enough resources and time, but none of these devices became commercially viable, and the Internet of Things remained a topic confined to academia.
It wasn’t until the late 1990s and early 2000s that the concept of having a network of interconnected devices became popular and drew interest from corporations and consumers. Kevin Ashton led the movement at his Auto-ID Center at MIT with research into the field of radio-frequency identification, or RFID.
Bill Joy supplemented Ashton’s research with his proposal of a “Six Webs” framework. Joy graduated from the University of California, Berkeley with a Master of Science degree in electrical engineering and computer science and went on to co-found Sun Microsystems. His early thinking on a standardized system (a Web) paved the way for M2M interactions conducted through shared protocols and syntaxes. It was the combination of Joy’s framework and Ashton’s research, however, that made a truly useful and pervasive Internet of Things possible.
Although Ashton and Joy began laying the groundwork for a standardized system of communication and interaction at the beginning of the millennium, the Internet of Things remains a fragmented and developing field. Since the internet Coke machine, many more devices have been created by university researchers and commercial companies, but most have remained proprietary and do not share data with devices from other manufacturers.
Some of the most prominent products today represent huge technological leaps from even ten years ago, but the Internet of Things is still being accepted into society only gradually. Many media outlets predict that 2014 will be hailed as “The Year of the IoT,” but few care to specify by what measure or through which methods. In fact, even the tech pundits who believe this is the year point out that the IoT will grow slowly. Brian Proffitt of ReadWrite, a prominent online technology magazine, argues that “the Internet of Things won’t see any big splashes in 2014, just steady and incremental progress toward automating … everything.”
Currently, the biggest problem facing the IoT is the lack of standards for communication. Without a “common communication method,” devices can talk only to others of their own brand, which severely limits the usefulness of connected machines. For example, sleep monitors currently report results only to phones, for users to analyze themselves. Imagine a future where sleep monitors could send results to doctors, alert users to abnormal or unhealthy sleeping patterns, and suggest fixes that could in turn be carried out through communication with a coffee machine (if caffeine is suspected of hurting the user) or a thermostat (if temperature could be harming the user’s sleep).
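To make the interoperability gap concrete, here is a minimal sketch of what a common communication method would allow: a toy in-process message bus over which devices from different vendors exchange JSON messages. Everything here (the bus, the topic names, the device payloads, the restlessness threshold) is invented for illustration and does not represent any real IoT protocol or product API.

```python
import json
from collections import defaultdict

class MessageBus:
    """A toy shared bus: devices publish JSON payloads to named topics,
    and any subscribed device receives them, regardless of brand."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Serialize to a shared wire format (JSON here) so every
        # listener parses the same structure.
        message = json.dumps(payload)
        for handler in self._subscribers[topic]:
            handler(json.loads(message))

bus = MessageBus()
adjustments = []

# A thermostat from one vendor reacts to readings published by another
# vendor's sleep monitor -- the cross-brand scenario described above.
def thermostat_handler(reading):
    if reading["restlessness"] > 0.5:  # arbitrary illustrative threshold
        adjustments.append({"device": "thermostat", "set_point_delta": -1})

bus.subscribe("bedroom/sleep", thermostat_handler)
bus.publish("bedroom/sleep", {"device": "sleep_monitor", "restlessness": 0.7})
```

The point of the sketch is that the value lies in the shared topic names and message schema, not in the bus itself: once two vendors agree on those, either side can be swapped out without changing the other.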
To remedy this situation, Intel, Cisco, GE, and IBM have come together to form the Industrial Internet Consortium, a nonprofit with the goal of improving interoperability standards among devices connected to the internet.
As a thriving industry, M2M has proved that people are willing to allow more and more technology into their lives. In an article last year, Yonck discussed how innovations are adopted into mainstream culture and the process a prototype undergoes to become a product:
"Consider that in order to move all the way from concept to prototype to marketable product, every idea has to pass through a succession of filters. Is the idea possible within the laws of physics as they’re currently understood? Then forget retro-causality (time machines), perpetual motion, faster than light travel/communication, etc. Do our existing, or soon to be existing, engineering capabilities, materials, tolerances, etc., allow us to realize the idea or will it remain on the drawing board for centuries, as did Leonardo da Vinci’s flying machines or Charles Babbage’s Difference Engine? Can a need be established? That is, can consumers, corporations, or the military be convinced this is something they must have? Because without a perceived need, it will surely go the way of the [Ford] Edsel.
"And what of other institutions? Regulatory bodies, insurers, political organizations and others must be persuaded to support or at least tolerate and accept the new tech. And ultimately is this an idea that is right for its time? An invention must fit within the established mores, accepted behaviors and realities of user understanding and functionality. Without all of these, the idea will die stillborn. Given all this, it may seem a miracle any new tech ever comes to life and gets the opportunity to walk the earth, even if only for a few years."
The Internet of Things meets all of these criteria and has begun to see real commercial success.
In fact, according to a Business Insider Intelligence report on the future of the internet:
"The IoT will account for an increasingly huge number of connections: 1.9 billion devices today, and 9 billion by 2018. That year, it will be roughly equal to the number of smartphones, smart TVs, tablets, wearable computers, and PCs combined."
And Cisco CEO John Chambers predicts that the Internet of Everything (as he refers to it) could be worth $19 trillion in the near future: a future in which objects everywhere, from the house to the airport, will know people’s preferences and adjust themselves to best suit the individual.
However, the Internet of Things caters to a growing category of what people enjoy calling “first world problems.” The technology developed in pursuit of the IoT is innovative, and nothing should take away from its promise of advancement. Unfortunately, much of it has not been created with developing countries and economies in mind. A thermostat that optimizes the temperature of your floor and shower to the degree may seem necessary to someone who dreads stepping into a cold bathroom after a hot shower, but to a farmer in Africa or an artisan in India it is useless.
As more and more companies connect their products to the internet and the markets of industrialized countries become saturated, devices will hopefully be created to help the people who truly struggle and who could benefit from a system of interconnected machines. Imagine, for instance, an irrigation system that interacts with a weather prediction service and a local water storage facility, supplying baths and showers for citizens as well as drinking water for local livestock. With better water management, farmers could optimize crop yields and sell to others through an online system that tracks grain production. The application of such devices to different environments is inevitable; it is just a matter of when companies will recognize the opportunity in building an IoT for those countries.
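The water-management idea above can be sketched as a simple decision rule: skip irrigation when rain is forecast, and never dip into the water reserved for people and livestock. The function name, thresholds, and quantities below are all hypothetical, invented purely to illustrate the kind of logic such a connected system might run.

```python
def litres_to_irrigate(forecast_rain_mm, storage_litres,
                       reserve_litres, crop_need_litres):
    """Decide how much stored water to release to the fields today.

    forecast_rain_mm  -- rainfall predicted by the weather service
    storage_litres    -- current level of the local storage facility
    reserve_litres    -- amount protected for drinking water and livestock
    crop_need_litres  -- what the crops would ideally receive
    (All names and the 5 mm threshold are illustrative assumptions.)
    """
    if forecast_rain_mm >= 5:
        return 0  # enough rain expected: let the sky do the watering
    available = max(storage_litres - reserve_litres, 0)  # protect the reserve
    return min(crop_need_litres, available)

# Example: little rain forecast, 1000 L stored, 300 L reserved, crops need 500 L.
today = litres_to_irrigate(forecast_rain_mm=1, storage_litres=1000,
                           reserve_litres=300, crop_need_litres=500)
```

Even a rule this simple shows why the devices must talk to each other: the irrigation controller is useless without the forecast from the weather service and the level reading from the storage facility.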
Yonck, as a futurist, tracks current trends in technology and predicts where they are headed:
"As it develops, the future of IoT is to basically make our world more intelligent. Technology everywhere will literally have the ability to sense it’s environment and respond to it. While this may not result in direct physical action on the particular device’s part, it will be capable of relaying data to servers elsewhere that will potentially cause other devices to respond."
In the information age we live in, technology regularly reshapes daily life. In the 1970s it was mainframe computing. In the ’80s it was the PC. The 2000s saw the rise of social media. Today, the Internet of Things is revolutionizing the way we live.
Techie and entrepreneur with a passion for soccer and a distaste for chocolate, Dylan Steele dabbles in a little bit of everything, including that new crypto-currency/property. This post first appeared on Medium.