Ever since artificial intelligence and robotics became a glimmer in the eye of engineers and science-fiction writers, one question has persisted: can humans and machines interact with each other as equals? Well, humans have controlled machines since day two. They can turn them on and off, operate them, repair them, and, most importantly, design new ones when necessary or when they just need a few extra bucks.
On the other side of the tracks, the concept of humans interacting with machines and other non-living devices on an equal and natural footing spurs some strong, sometimes vehement debates. Someone always points out the risk of humans becoming friends with, or even lovers of, their machines, which are no more than tools for completing tasks, whatever those tasks may be. Until some machine runs for public office or some social media maven marries a robot, we can reserve that debate for another century (hopefully two).
On a somewhat different level, humans are quickly admitting a wide range of autonomous devices into daily life, devices and systems that operate on their own based on human instruction and interaction. The IoT is filling up with them, particularly devices that comprehend and react to human speech. Some can even detect the emotional component of the speaker and react accordingly.
Whether one likes or dislikes this situation, or is involved in or disassociated from the technologies at hand, does not matter. These machines and devices are here, they are proliferating, and humans need to get on board with the program. I'd say sink or swim, but robotic lifeguards are being tested as we speak. More important still, the demand for devices that interact easily and naturally with humans is escalating, and designers and engineers need to get a handle on this technology.
For those who choose to swim, Sensors Expo & Conference 2019 in San Jose, CA, offers a golden opportunity: on Thursday, June 27 from 10 am to 10:50 am PST, you can learn how contextual awareness with smart sensing technology is replicating the human senses. And you will be learning from one of the experts spearheading this exciting, and quite sensor-centric, field of technology.
Mr. David Jones is the Head of Marketing & Business Development for Intuitive Sensing at Infineon Technologies. In this role, Mr. Jones is responsible for developing new sensor applications that combine artificial intelligence with new sensor-fusion techniques to improve human-machine interfaces and make interaction more intuitive.
Before arriving at Infineon, David was at Caruma Technologies Inc., an artificial intelligence (AI) based connected-vehicle platform start-up, where he headed customer engineering, product management, and business development. Prior to Caruma, he served as VP of Marketing and GM for Smart Machines at IntelliVision Technology Group, an AI and computer-vision software company.
Prior to joining IntelliVision, David worked for several semiconductor companies, including ViXS Systems Inc., Conexant Systems, Motorola Semiconductors, and LSI Logic, in Canada, the USA, Germany, and the UK. David is a chartered engineer and member of the Institution of Engineering and Technology (IET) and holds a BSc with honors in Electrical and Electronic Engineering from Heriot-Watt University in Scotland.
Mr. Jones is no stranger to Sensors Expo. When asked if he has attended past events, he enthusiastically replied, “Yes, good show and unique for specializing in the latest trends in the sensor industry.”
David is also quite enthusiastic about his upcoming presentation, “How Contextual Awareness & Smart Sensing Devices can Now Interact More Naturally with Humans”, where he will discuss how “sensors are becoming so sophisticated that there are now one or more sensors that replicate the human senses. However, having sensors alone that replicate the human senses still leaves a gap: building intelligence into a machine that can leverage these machine ‘senses’ the way a human uses their senses to interact with other human beings. Big steps in artificial intelligence, and pushing it to the edge, are helping machines make decisions, but the next step, making machines do this intuitively, is far from ready today. This presentation will show how things can change, how these changes can make interaction with machines much more intuitive, and what relevance this will have for the way humans interface with machines in the next few years.”

The single most important idea Mr. Jones wants his audience to take away with them is “that no one sensor technology does everything, and even with today’s voice-activated devices we are still a long way from making our interactions with machines intuitive.”
In addition, Infineon will be on the show floor. David says, “Yes, we will be exhibiting our sensor technologies, for example MEMS-based and 3D sensing such as radar and time-of-flight (ToF) camera technology. We will also be demonstrating our Human Machine Interface use case examples that combine sensor technology with AI processing in complete systems. Our booth number is 1110.” And he will be attending some of the other educational sessions available at Sensors Expo & Conference, “hoping to get insights into market trends and advances in sensor technology.”
If you are involved in any area of technology, it can only be to your advantage to understand how sensors, and the various sensor types, relate to the intuitive interfacing of humans and machines. And there’s a very simple path you can follow that does not involve yellow bricks and ruby slippers. First, register for Sensors Expo, and second, attend David Jones’ session, “How Contextual Awareness & Smart Sensing Devices can Now Interact More Naturally with Humans”, Thursday, June 27 from 10 am to 10:50 am PST. And you’re still here because why? ~MD