I don't know how many of you have heard about RoboEarth, a research project in which academic and industrial roboticists seek to create a sort of World Wide Web for robots. Bad comparisons to Terminator movies aside, the real aim is to mobilize knowledge of Approaches That Work and share them with other robots, so that they aren't stuck reinventing the wheel.
Robots, fascinating beasts that they are, work well in controlled environments, and that's why they're so incredibly useful if, for example, you're assembling cars or doing other complicated, repetitive tasks that require precision and consistency. Put a robot in an uncontrolled environment, however, and it suddenly becomes much, much harder for the robot to operate successfully. Uneven terrain, rapidly changing environmental conditions, intrusive animals, other people: all of these things can mess with a robot's ability to complete a given task in a new, uncontrolled environment. Ask the robot to navigate that environment on its own, and the problem becomes harder still.
The project was launched in December 2009 and has another three years still to go. The motivation behind the RoboEarth project is to create a networked repository/database of information for robots, to help speed up their ability to learn and operate successfully in a variety of unstructured environments, whether they have been specifically programmed to do so or not. A robot that knows how to navigate a specific environment could upload its method to the database, and another robot, elsewhere in the world and faced with a similar environment, could download the method and apply it to its own problem. (Or at least that's my understanding.)
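In spirit, that upload/download loop might look something like the toy sketch below. To be clear, everything here (the class, the tag-matching scheme, the recipe format) is my own invention for illustration, and has nothing to do with RoboEarth's actual design:

```python
# Toy sketch of RoboEarth-style knowledge sharing. All names and the
# matching scheme are hypothetical, not the project's actual API.

class KnowledgeBase:
    """Shared repository mapping environment tags to action recipes."""

    def __init__(self):
        self._recipes = {}  # frozenset of tags -> list of action steps

    def upload(self, tags, recipe):
        """A robot shares a method that worked, keyed by its environment."""
        self._recipes[frozenset(tags)] = list(recipe)

    def lookup(self, tags):
        """Return the recipe whose tags best overlap the query, if any."""
        tags = set(tags)
        best, best_score = None, 0
        for key, recipe in self._recipes.items():
            score = len(key & tags)
            if score > best_score:
                best, best_score = recipe, score
        return best

# Robot A learns to cross a cluttered corridor and shares the method.
kb = KnowledgeBase()
kb.upload({"indoor", "corridor", "cluttered"},
          ["scan obstacles", "plan path", "move slowly"])

# Robot B faces a similar (not identical) environment and reuses it.
plan = kb.lookup({"indoor", "corridor", "dim-light"})
print(plan)  # → ['scan obstacles', 'plan path', 'move slowly']
```

The hard part, of course, is everything this sketch waves away: deciding when two environments are "similar enough," and expressing a "method" in a form that a differently built robot can actually execute.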
Key to the endeavor is the need to create some standardized way of communicating information from robots to the database, where other robots can then use it. Sounds funky, right? Honestly, what it reminds me of is the Open Geospatial Consortium's work on Sensor Web Enablement: creating a framework and a suite of standards to enable real-time data from sensor networks to be incorporated, via the Web and Web services, into other types of information systems.
Is it an interesting project? Of course it is! But I also suspect that the devil is in the details. Part of the reason why we don't all have robotic cars yet is that there's just so much variation: in robot hardware, in the types and numbers of sensors that the robots use, in the way sensor data are used to influence robot behavior, in how fast the robot needs to react to its environment, in how the robot learns, and that's before you've even added in the environmental factors. And, frankly, any standards development process is (as far as I can tell) just plain painful. It's worthwhile, but not particularly easy or straightforward.