Sensors as a Foundation for the Next Technology Revolution

March 29, 2013 | By: Hamid Farzaneh, SensoPlex LLC
In the past decade venture investors have, by and large, overlooked hardware partly because of the perception that hardware is hard and expensive, but also because fewer active VCs nowadays have a hardware-savvy background. Beginning with the dot-com boom, software has been the sexy sector, and has gradually absorbed the lion's share of capital allocation. However, below the surface, things have been changing.
Investors are starting to realize that sustainable value comes from "hardsoft" tech—a combination of software and hardware that delivers a user experience that achieves "stickiness." Apple is a prime example; how many Mac users have moved away to Windows platforms? Apple's success gave a big boost to this notion, and giants such as Google and Microsoft, both of which have software-centric roots, are following in its footsteps.
More recently, grass-roots funding concepts such as Kickstarter have shown the market's appetite for new hardsoft products such as the Pebble ($10M raised against a target of $100,000), TikTok ($940,000 vs. a goal of $15,000), or Ouya ($8.5M vs. a goal of $950,000). Considering current trends, we expect that the next decade will involve an even stronger turn to hardware tightly combined with software in a way that encourages "stickiness," making it more difficult and less appealing for customers to switch away to a different solution. Sensors will be a key mover behind this trend because the next step in technology is toward smoother and more intuitive control, and greater integration between our physical selves and our environment. This ever-greater personalization will marry our physical and digital worlds, and allow us to operate more smoothly and effectively.
To get to this next level of human-machine interaction, we turn to sensors. Sensors can be used to effectively map and track the activity, bio-function, and characteristics of humans and their environments. This helps us to measure variables that affect our lives and wellbeing and to formulate a desirable response or change of behavior within ourselves or through machines. Greater integration of sensors will be integral to creating both more natural and intuitive interfaces and the closed-loop mechanisms necessary for two-way interfaces, in which the environment determines how the device behaves; for example, holding a smartphone near the head causes the ring tone or its volume to change in response.
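The closed-loop idea above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual API: the `ring_volume` function and its distance thresholds are assumptions, and on a real phone the proximity value would come from a hardware driver rather than a function argument.

```python
# Minimal sketch of a closed-loop sensor response: a proximity reading
# drives the ring volume, so the environment shapes device behavior.
# Thresholds (5 cm, 50 cm) are illustrative assumptions.

def ring_volume(proximity_cm, max_volume=10):
    """Scale ring volume down as the phone approaches the user's head.

    proximity_cm: distance reported by a (hypothetical) proximity sensor.
    Returns an integer volume level between 1 and max_volume.
    """
    if proximity_cm <= 5:      # held against the head: quietest setting
        return 1
    if proximity_cm >= 50:     # at arm's length or farther: full volume
        return max_volume
    # Linear ramp between the two thresholds.
    span = (proximity_cm - 5) / 45.0
    return max(1, round(span * max_volume))
```

The key point is the feedback loop: the sensor reading feeds a policy, and the policy changes device behavior without any explicit user input.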
Looking to the next five to ten years, sensing devices that interface with big data and analytics centers promise to enhance users' experience and control. User interfaces between humans and computing devices will evolve to adopt more kinetic and ergonomic methods such as gestures, eye tracking, facial expression, and glance control. Using integrated sensors or networked peripheral sensors, our personal devices will soon measure, analyze, and make sense of the physical and environmental realities that impact our short- and long-term lifestyles, health, safety, security, and transportation, as well as our personal and professional effectiveness.
Using recent innovations by sensor makers such as InvenSense, Bosch, ST, and Kionix, to name a few, and by system-level enablers such as Movea, SensoPlex, and Sensor Platforms, we can postulate some exciting tech innovations for the next few years:
- 24/7 wearable devices that monitor our activity, posture and vital signs (temperature, pulse, blood pressure, respiratory rate, blood sugar). We see them already in a number of forms (Up, Fuelband, Fitbit, Lark, Glooko, and Basis)—but they'll be getting a whole lot better and smarter.
- Consumer applications that link the wearable devices to smartphones or computers and provide real-time feedback and advice.
- Big Data applied to medical databases will offer better access to objective information and statistics and will allow doctors to better diagnose and treat patients. Doctors will be able to proactively follow the state of their patients.
- Natural ergonomic interfaces will extend the user interface to integrate eye tracking, 3D imaging, motion, and sound to allow us to use hand motion, glances, and facial expressions to connect with computing devices. Think of a more advanced and mobile version of the interfaces in "Minority Report"—we're already seeing this with technology like the Kinect and Leap.
- Gaming suits with wearable sensors and actuators that not only serve to control and play the game, but also tell team members what our heart and respiratory rates are (are we sweating it?) and, through vibration or mild electrical discharges, provide physical feedback for hits and effort. The Oculus Rift was one of the coolest gadgets at CES this year, and this is v1. Gaming suits and headsets are just the beginning; as in Ender's Game, this knowledge will transfer to military, space, and industrial applications.
- Self-driving cars are not far off, as we've seen with Google's ambitious projects in this arena. The improved efficiency, speed, and safety of car travel have the potential to transform our society and to spur economic growth and opportunity through the work hours previously lost to commuting.
- Mobile payment security will be substantially enhanced with integrated sensors enabling the matching of password, picture, voice, and/or gesture signature. Mobile payments players like Card.io have already started down this path.
- Contextual technology. For artificial intelligence to make better recommendations about real-world actions, it must be fundamentally linked to the real world through sensors. Futurist Ray Kurzweil has it right that semantic language is the key to AI—but looking toward a more holistic future, gestures and body sensors will have to supplement language as potentially more sensitive and discreet variations of real-time machine communication and learning. Siri-like personal assistants in 2050 will know exactly what you need and when you need it: when you're sick, the best home remedies, and when to schedule an appointment.
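The consumer-app idea in the list above (a wearable streaming vital signs to a phone that responds with real-time advice) can be reduced to a small sketch. This is a toy illustration only: the function name, the heart-rate thresholds, and the advice strings are all assumptions for the example, not medical guidance or any product's actual logic.

```python
# Sketch of the wearable-to-app feedback loop: turn a stream of
# heart-rate samples (beats per minute) into coarse real-time advice.
# Thresholds and messages are illustrative assumptions.

def heart_rate_advice(samples_bpm):
    """Return a coarse advice string from recent heart-rate samples."""
    if not samples_bpm:
        return "no data"
    avg = sum(samples_bpm) / len(samples_bpm)
    if avg < 60:
        return "resting"
    if avg < 120:
        return "moderate activity"
    return "high exertion: consider slowing down"
```

A real product would replace the threshold rules with models personalized to the user's baseline, which is exactly where the big-data link described earlier comes in.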
This ongoing revolution has been enabled by the rapid innovations that have made sensors exponentially more capable, flexible, efficient, and inexpensive. At the center of this revolution, and its key catalyst, is the smartphone. The 3-in-1 device that Steve Jobs famously unveiled in 2007, which brought together computing, sensors, and communication, was the spark that began this journey of integrated hardware and open application software. Now we look at a century that leverages this technology, as well as an exponentially better understanding of our humanness—a century of Augmented Sense.
ABOUT THE AUTHOR
Hamid Farzaneh is a serial entrepreneur, CEO, and Silicon Valley tech veteran of over 30 years. He is currently the CEO of SensoPlex LLC, Palo Alto, CA. He can be reached at 650-283-5005, firstname.lastname@example.org.