Thursday, September 02, 2004
USATODAY.com - Domestic bliss through mechanical marvels?
An interesting article about how aging baby boomers might drive home-robotics advances.
Based on the performance of my Roomba, however, predictions like the one made by Sebastian Thrun of the Stanford AI lab that in 5 years there will be robots that "pick up dishes from the table and put them in the dishwasher" seem a little optimistic.
There seems to be an excessive focus on visual recognition in the article, which states "a homebot needs to recognize a person coming down the hall so it can get out of the way" as a reason that visual systems will be required.
This feels like over-engineering to me. Sonar and other non-visual technologies are already used by robots to detect and avoid objects, both stationary and moving, and I don't see any reason vision would be required for a good number of the tasks people describe. At some point visual recognition will probably take over some of the functions of non-visual sensors, as we develop better algorithms for spotting items of interest and those algorithms can be more readily (read: cheaply) integrated into consumer-class robots, but relying on vision in the short term will probably just get in the way.
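To illustrate how far a single non-visual sensor can get you, here is a minimal sketch assuming a hypothetical sonar that reports the distance to the nearest obstacle in centimeters; the function name and threshold are made up for the example:

```python
STOP_DISTANCE_CM = 30  # back off or turn when an obstacle is this close (illustrative value)

def avoid_obstacle(distance_cm: float) -> str:
    """Decide a motion command from a single sonar reading."""
    if distance_cm < STOP_DISTANCE_CM:
        return "turn"    # something (or someone) is in the way
    return "forward"     # path is clear, keep going

# A person coming down the hall shows up simply as a shrinking
# distance reading -- the robot can yield without recognizing anyone.
print(avoid_obstacle(120.0))  # forward
print(avoid_obstacle(15.0))   # turn
```

The point is that "get out of the way" reduces to a distance threshold; no camera or recognition pipeline is involved.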
The Roomba is a good example of this. It could use visual recognition to work its way around a room, but bumper-based detection works just as well. The Roomba does have other problems, largely its inability to remember anything and what seems like a poor room-navigation algorithm, but nothing I think can't be fixed more simply by tweaking its existing, relatively simple algorithms than by adding a complex visual recognition system.
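The memoryless behavior described above can be sketched as a simple bump-and-turn rule; this is an illustrative guess at the style of algorithm, not iRobot's actual code:

```python
import random

def bump_and_turn(bumped: bool, rng: random.Random) -> float:
    """Return a heading change in degrees for one control step.

    No map, no memory: drive straight until the bumper fires,
    then turn away by a random amount and drive straight again.
    """
    if not bumped:
        return 0.0                       # keep heading
    return rng.uniform(90.0, 180.0)      # back off and turn randomly

rng = random.Random(7)
print(bump_and_turn(False, rng))  # 0.0
print(90.0 <= bump_and_turn(True, rng) <= 180.0)  # True
```

Adding even a short memory of recent bump headings to bias the turns would be a small tweak to this loop, which is the kind of incremental fix that seems far cheaper than bolting on a vision system.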
On another note, there is mention of how so-called "carebots" could be used to monitor blood pressure, dispense pills, and call 911 for people who need special care.
These all feel like things that could be handled more effectively by building something specific to each problem rather than trying to build a generalized "carebot".
Specifically, I think on-board health-monitoring systems are a better bet for identifying potential 911 emergencies. An undergarment that could track your respiration, pulse, and body temperature, for example, and funnel that data wirelessly back to a nearby computing device, seems like it would be much cheaper to develop, easier to use, and more accurate, and it could have implications beyond just those who need immediate care.
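To make the idea concrete, here is a hedged sketch of the kind of passive threshold check such a monitor might run on the receiving device; the field names and ranges are assumptions for illustration, not medical guidance:

```python
# Illustrative "normal" ranges for the vitals the undergarment reports.
NORMAL_RANGES = {
    "pulse_bpm": (40, 130),
    "resp_per_min": (8, 25),
    "temp_c": (35.0, 39.0),
}

def check_vitals(sample: dict) -> list:
    """Return the names of any out-of-range vitals in one sample.

    A non-empty result is what would trigger escalation
    (e.g., alerting a caregiver or calling 911).
    """
    alerts = []
    for name, (lo, hi) in NORMAL_RANGES.items():
        value = sample.get(name)
        if value is not None and not (lo <= value <= hi):
            alerts.append(name)
    return alerts

print(check_vitals({"pulse_bpm": 72, "resp_per_min": 14, "temp_c": 36.8}))   # []
print(check_vitals({"pulse_bpm": 180, "resp_per_min": 14, "temp_c": 36.8}))  # ['pulse_bpm']
```

Note there is no actuation anywhere in this loop: it is pure passive monitoring, which is exactly why it doesn't need to be a robot.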
Of course I'm biased on this because this kind of monitoring has been on my list of where things are going for a while (see the entry regarding Smart Pillows). I even have crude drawings that I'll try and upload when I get a chance so we can all have a laugh.
Robotics is, by its nature, an active system with primary emphasis on manipulating or navigating the environment. Too many of the things mentioned in the article could just as easily be done by passive computing systems in a much more efficient way.
If you're looking for a good idea for "carebots", how about a robot that can administer a defibrillator to a person who has been detected going into cardiac arrest? That seems like a problem worth the robotics challenge, and one that probably requires vision processing as well.