Article from Forbes.com
By Shel Israel, Contributor
I’m writing a book with Robert Scoble. The working title is Age of Context: How It Will Change Your Work and Life. Robert and I contend that five rapidly converging forces are forming a new contextual era: social media, mobile devices, sensor networks, Big Data and mapping.
In researching the book, I’m up to the section on sensors, where we will cover several very promising apps currently under development. Nearly all come from startups you probably don’t know about today, but their apps may be changing your life in the next year or two.
There’s a mystery to sensors. They are among the freaky factors that make people uncomfortable with mobile applications that seem to know you better than your closest friend does.
To get a sense of what sensors are and what they can do, I spoke with Melanie Martella, executive editor of New Hampshire-based Sensors magazine. She told me her publication originally bore the tagline, “the journal for machine perception,” which sounded a bit odd to me, because machines don’t perceive. People do, or so it seemed to me.
Martella explained that over the second half of the twentieth century, sensors came into widespread industrial use. They were used to measure and report small changes: on the assembly line, where labels might be pasted on crooked, bottles not completely filled or caps not fastened properly; in agriculture, where sensors used in irrigation and fertilizer application can note changes in crops caused by water levels, disease or infestation; and in security, where motion and anomalies can be remotely detected and reported.
“Sensors give machines feedback like our sense of touch gives people feedback,” she told me, helping me understand that sensors actually perceive on their own.
In the coming age of context, sensors will use their perceptions of who we are, what we are doing and what we will need next to serve us in many ways. They will work with one or more of the other four forces to create a new kind of relationship between people and their devices, one that may seem uncannily human at times.
Sensors alone have no context, but used in applications that also include social media, Big Data or mapping, they start to give you highly personalized results and the sensation that your mobile device is a personal assistant working in your interest 24/7, even when you are blissfully sleeping.
Very few of the contextual apps would work if sensors hadn’t started making great leaps forward after the year 2000. Until then, they were fairly boring things. Most people noticed them only when neighborhood streetlights came on at dusk, smoke detectors sounded alarms or trucks beeped as they backed up.
As the new millennium began, engineers figured out how to make sensors wireless, greatly cutting the cost of deploying them in industrial environments. Once sensors were untethered, engineers saw the power of using them collectively and started building Wireless Sensor Networks [WSNs], which have become a force to reckon with.
Today, we may be talking about how sensors will let Google’s Project Glass know whether you are skydiving or strolling on a beach. If you are on the beach, sensors may warn you through an app on your mobile device that sharks are lurking just offshore.
Originally, according to Martella, sensor networks were created because individual sensors are vulnerable to noise, and industrial environments produce a great deal of noise that can cause them to fail.
The cost of one sensor may be minuscule, but the damage its failure can cause on an assembly line can be huge. In a hazardous environment such as a nuclear power plant, the damage caused by a single, un-networked sensor could be catastrophic, for two reasons. First, the sensor failure itself could be the problem. Second, if the sensor’s job is to report on, say, radioactivity levels in the plant, then failing to communicate that reading to the control room could cause an even greater disaster.
So, according to Martella, WSNs were set up so that vital information would not be lost when one sensor malfunctioned. Wireless mesh networks, made up of a large number of tiny nodes, were set up in a specific area. Each node contained a small processor, a sensor and a low-powered radio with just enough power to talk to adjacent nodes, and the network was just smart enough to reconfigure itself, rerouting data whenever one node got damaged.
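For the technically curious, that self-healing behavior can be sketched in a few lines of Python. The five-node mesh below is invented for illustration; real WSN routing protocols are far more sophisticated, but the core idea is the same: when a node dies, the network simply finds another path to the sink.

```python
from collections import deque

def route(links, source, sink, failed=frozenset()):
    """Breadth-first search for a path from a sensor node to the sink
    (the control room), skipping any failed nodes. `links` maps each
    node to the neighbors within its radio range."""
    queue = deque([[source]])
    visited = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == sink:
            return path
        for neighbor in links.get(node, []):
            if neighbor not in visited and neighbor not in failed:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route survives the failure

# A toy mesh: each node can only talk to its adjacent nodes.
mesh = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "SINK"],
}
print(route(mesh, "A", "SINK"))                # goes via B
print(route(mesh, "A", "SINK", failed={"B"}))  # reroutes via C
```

If both B and C fail, the function returns `None`: the point at which even a mesh loses data, which is why real deployments use many more redundant nodes than this sketch.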
The network became smarter than the sum of its sensors. It’s similar to how flocks of birds operate. One little bird isn’t all that smart. It knows just enough to fly a few inches from other birds in any direction. If one straggles from the flock it becomes easy prey for larger, smarter predators such as hawks. But when the flock stays intact, the hawk has little choice but to dive into the center of the swarming birds.
It almost always comes up with an empty beak.
Somehow the flock is smarter than the sum of its birds. It’s even smarter than the hawk. Wireless Sensor Networks are smarter than the sum of their sensors. This becomes very, very important as sensors start being placed into more and more inanimate objects, such as lamps, sprinklers, running shoes, thermostats, toll booths, vending machines, shark-detecting buoys and just about everything else.
Martella walked me through numerous examples of how wireless sensor networks operate. Researchers working at Caltech for NASA’s Jet Propulsion Lab [JPL] in southern California created “sensor pods” to measure changes in the Sierra mountain snowpack and growth in botanical gardens. The WSNs “paint an extremely granular picture,” she said.
A decade later, sensor technology had evolved sufficiently for JPL to create the sensors used on the Curiosity rover as it meandered over Mars, looking for minerals, signs of past or present life and whatever else it could find. Martella mentioned that Washington state uses optical fibers as sensors to detect the sources of water feeding into the state’s waterways.
There’s a bevy of modern sensor applications, such as attaching a tiny pressure sensor, the size of a grain of sand, to a contact lens to detect glaucoma. Cars are swarms unto themselves, with sensors that will safely park the car or run windshield wipers at the right speed. In health, there are sensors in wearable devices that check your pulse while you run, and researchers are studying digestible sensors embedded in a pill that report on internal issues ranging from ulcers to aortic blood flow. Martella mentioned “smart knee implants” that can measure stress on the joint, and sensors on body armor and military vehicles that can detect environmental changes as well as motion caused by enemy combatants or bullets.
Today’s mobile devices contain an average of seven sensors, and in the future there are likely to be more. Martella said that phone sensors today can track you as you move about the world, but an emerging technique called “sensor fusion” stitches together information from multiple sensors to give your phone an understanding of its location and environment.
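One classic sensor-fusion recipe, offered here as an illustration rather than anything from the book, is the complementary filter: trust the gyroscope for fast changes (it responds quickly but drifts over time) and the accelerometer for the long-term reference (it doesn’t drift but jitters). The numbers below are made up for the example.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One step of a complementary filter, a common sensor-fusion
    technique. The fused tilt estimate leans on the gyro for quick
    motion and on the accelerometer to correct long-term drift."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Fuse 100 readings taken every 10 ms while the phone sits at ~30 degrees.
# The gyro reports a small spurious drift of 0.5 deg/s; the accelerometer
# keeps pulling the estimate back toward the true 30-degree tilt.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=30.0, dt=0.01)
# angle converges toward roughly 30 degrees despite the gyro drift
```

Neither sensor alone gives a stable answer; blended, they do, which is the whole point of fusion.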
Sensor fusion might let your phone know more than you want it to know. It already knows what building you are in; soon it will know what floor you are on, what room you are in and what types of motion are going on. Understanding that may discourage clandestine in-office affairs and office-supply pilfering.
While mobile is moving forward with great momentum, there is a sea anchor slowing it down: batteries. They seem to be progressing slower than almost every other component in the contextual puzzle. While we’ll discuss some exciting improvements in battery technology, the innovation still seems a bit off in the fuzzy distance of our vision.
Less so with sensors. First off, each sensor uses relatively little power. However, in wireless sensor networks with many thousands of components, the consumption adds up quickly. This is being offset, in part, by Bluetooth 4.0, the version of the specification adopted in 2010 that introduced Bluetooth Low Energy and significantly reduces energy needs.
Martella pointed out a second way to cut battery needs: sensors go to sleep. She predicted that sensors in your mobile device will soon “see” when you are not moving and go dormant, then spring back to life the moment you need them, even when you cannot act for yourself. For example, if you are incapacitated, the sensors in your mobile device may soon be able to contact 911 on your behalf, even if you are unconscious.
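The sleep-and-wake pattern Martella describes boils down to a small state machine. The sketch below is a hypothetical illustration, with invented thresholds: the sensor dozes off after a stretch of stillness and wakes instantly on motion.

```python
class MotionGatedSensor:
    """Sketch of a power-saving sensor: go dormant after several
    still readings, wake the moment motion resumes. Thresholds are
    illustrative, not taken from any real device."""
    def __init__(self, still_readings_before_sleep=3, motion_threshold=0.2):
        self.quiet_limit = still_readings_before_sleep
        self.threshold = motion_threshold
        self.quiet_count = 0
        self.awake = True

    def feed(self, acceleration_delta):
        """Feed one accelerometer delta; return the current power state."""
        if abs(acceleration_delta) > self.threshold:
            self.quiet_count = 0
            self.awake = True            # motion detected: wake instantly
        else:
            self.quiet_count += 1
            if self.quiet_count >= self.quiet_limit:
                self.awake = False       # still long enough: go dormant
        return "awake" if self.awake else "dormant"

sensor = MotionGatedSensor()
readings = [0.5, 0.0, 0.0, 0.0, 0.6]  # movement, stillness, movement again
states = [sensor.feed(r) for r in readings]
# states: awake, awake, awake, dormant, awake
```

The asymmetry is the battery win: sleeping takes patience (several quiet readings), waking takes one jolt.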
Sensors, Martella asserted, are essential to the contextual web, and the applications are abundant. As a potential application in the near future, your phone will know when you are going to the airport. It knows your calendar and when your flight is scheduled to depart.
But it also watches for other factors while you are driving to the airport. If your flight gets delayed, your phone will see it while you drive and understand it’s important enough to alert you. A sensor-driven mobile app may be able to book a different flight for you. If there’s an accident en route, it will help you reroute. And if the new route is longer and you may need gas, the sensors will help you locate the closest or cheapest station.
These examples, of course, braid sensors with mobile, Big Data, social media and mapping: each of our other forces of context. Almost by definition, no contextual app will incorporate just one of our five forces.
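To see how little “magic” such an app needs, here is a toy rule engine braiding the same inputs: calendar (departure time), mapping (drive time), a flight-status feed and a fuel sensor. Every input, rule and threshold below is invented for illustration.

```python
from datetime import datetime, timedelta

def airport_advice(departure, now, drive_minutes, flight_status, fuel_low):
    """Hypothetical contextual assistant: combine calendar, mapping,
    flight data and a fuel sensor into actionable alerts."""
    alerts = []
    if flight_status == "delayed":
        alerts.append("Flight delayed: consider rebooking.")
    estimated_arrival = now + timedelta(minutes=drive_minutes)
    if estimated_arrival > departure - timedelta(minutes=45):
        alerts.append("Leave now or reroute: you may miss check-in.")
    if fuel_low:
        alerts.append("Fuel low: nearest station added to route.")
    return alerts

now = datetime(2012, 9, 16, 14, 0)
advice = airport_advice(departure=now + timedelta(hours=1), now=now,
                        drive_minutes=30, flight_status="delayed",
                        fuel_low=True)
# all three rules fire in this scenario
```

Each rule alone is trivial; the contextual feel comes from the forces being consulted together, at the right moment.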
The world is already filled with small and scattered wireless sensor networks covering local areas in factories and farms, homes and offices and increasingly cars and people. These scattered WSNs however are starting to be joined by “smart grid” projects that run across municipalities.
Where many feel we are headed in the next five years is into one global, humungous WSN.
It is called the Internet of All Things and it has already started to change your work and your life.
[NOTE: I am currently looking for cool companies and stories related to sensors, Big Data and mapping. They must be mobile and they must contribute to the contextual web. Please email me.]
Source: This article is available online at http://www.forbes.com/sites/shelisrael/2012/09/16/how-sensor-networks-add-to-context/