Sunday, January 18, 2009


Regardless of your feelings about the quality of the writing or the storyline, you have to admit that the TV program “Knight Rider” helped plant the concept of a self-driving car in the public’s consciousness. Now that KITT, the self-driving, talking car, has returned to the screen in 2008, and with the successes of robot cars in the DARPA Urban Challenge (see Ken Berry's report in our Spring 2008 issue), autonomous cars seem far more plausible, and indeed inevitable. As a baseline of where we are today versus the ideal of KITT’s intelligence, we can compare the smart-aleck Mustang GT500 to a current production unmanned ground vehicle (UGV), the U.S. Army’s MULE, or Multifunctional Utility/Logistics and Equipment vehicle, which is nearing testing and production. The MULE is being built under contract by Lockheed Martin in Grand Prairie, Texas, a company better known for building fighter jets.

Any robot sees the world through its sensors. UGVs need to sense a wide variety of objects—cars, terrain, trees, people, buildings, weaponry—and they carry an array of sensors to do the job. They also need to understand where they are and where they are going.
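
To make that concrete, here is a toy sketch in Python of how readings from several sensors might be merged into one picture of nearby objects. The names (Detection, merge_detections) and the one-meter matching rule are invented for illustration; no real UGV does it this simply.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "person", "tree", "vehicle"
    x: float           # meters ahead of the vehicle
    y: float           # meters left (+) or right (-) of center
    confidence: float  # 0.0 to 1.0

def merge_detections(per_sensor: dict[str, list[Detection]]) -> list[Detection]:
    """Merge detections from all sensors, keeping the most confident
    report when two sensors appear to see the same object (within 1 m)."""
    merged: list[Detection] = []
    for detections in per_sensor.values():
        for d in detections:
            for i, m in enumerate(merged):
                if abs(d.x - m.x) < 1.0 and abs(d.y - m.y) < 1.0:
                    if d.confidence > m.confidence:
                        merged[i] = d  # same object, better reading wins
                    break
            else:
                merged.append(d)       # a new object
    return merged

if __name__ == "__main__":
    readings = {
        "lidar":  [Detection("obstacle", 12.0, -1.5, 0.9)],
        "camera": [Detection("person",   12.3, -1.2, 0.7),
                   Detection("tree",     30.0,  4.0, 0.8)],
    }
    for obj in merge_detections(readings):
        print(obj)
```

A production system would use probabilistic data association and tracking over time rather than this crude nearest-neighbor match, but the shape of the problem is the same: many sensors, one model of the world.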

First of all, virtually every UGV uses some form of the Global Positioning System (GPS) and carries on board some type of mapping system that understands roads and terrain, both to position the vehicle and to plan the route ahead. The operators of the UGV assign it “waypoints,” or spots on the map, that together describe the route it is to follow. But while a robot aircraft or unmanned aerial vehicle (UAV) can blindly follow GPS steering instructions, a ground vehicle obviously cannot.
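
Here is a similarly simplified sketch of waypoint following: steer toward the next waypoint, and move on to the one after it once you are close enough. The two-meter arrival radius and the flat, local coordinate frame are assumptions for the example, not anything from the MULE’s actual software.

```python
import math

ARRIVAL_RADIUS_M = 2.0  # assumed: how close counts as "reached"

def heading_to(pos, waypoint):
    """Bearing in degrees (0 = north) from the current position to the
    waypoint; positions are (east_m, north_m) in a local frame."""
    dx = waypoint[0] - pos[0]
    dy = waypoint[1] - pos[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def next_waypoint_index(pos, waypoints, current):
    """Skip past every waypoint we have already reached."""
    while (current < len(waypoints)
           and math.dist(pos, waypoints[current]) < ARRIVAL_RADIUS_M):
        current += 1
    return current

if __name__ == "__main__":
    route = [(0.0, 50.0), (40.0, 80.0), (40.0, 150.0)]
    position = (0.0, 48.5)  # 1.5 m short of the first waypoint
    i = next_waypoint_index(position, route, 0)
    print(f"steering toward waypoint {i} at bearing "
          f"{heading_to(position, route[i]):.1f} degrees")
```

A UAV could fly these bearings directly; the point of the paragraph above is that a ground vehicle must layer obstacle detection and terrain reasoning on top of this simple loop.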
