
Tuesday, 19 February 2008

Bug Robots

I don't feature many people on this site, but there is no discussion of bug robots without Rodney Brooks. He pretty much started the bug robot idea. There is so much information online about Rodney, his work at MIT and with iRobot that I am not even going to bother with that. I will tell you about a time I had lunch with him in the mid-1990s just off the Stanford University campus. I'm sure he doesn't remember. It was a group of ten or twenty people presenting research work, and I was lucky enough to sit at the same table as Rodney. We talked about our kids, our wives, the weather, and not at all about robots. He's a nice guy, with an unassuming, down-to-earth personality and an Australian accent. I do find it interesting that Rodney's robotics research has progressed from bugs, to horses, to dogs and now on to humanoid robots. A kind of evolution in a lifetime.

The basic idea behind the bug robots was to try to understand very simple biological creatures and create corresponding robots before trying to build highly complex robots that mimic human reasoning. Even a fruit fly, with fewer than twenty neural connections, can fly, avoid obstacles, find food and mate (however it is that fruit flies mate). This idea makes a lot of sense, but in practice building very tiny robots is quite difficult. It seems like the people who work on bug robots spend more time developing techniques for building tiny robots than they do studying bug behaviors and ways to mimic them.

If you want to give bug robots a try, you might consider the little BugBrain by Yost Engineering, shown on the left. Those big whiskers on the front give the bug the ability to sense contact with objects in its environment, and you can program the onboard computer to make decisions about how to react. That's a true robot. You can add other sensors to it too. Maybe a phototransistor so the bug can "run for the shadows" like real bugs do? According to the manufacturer you can also add wireless RF and an ultrasonic range finder to the bug. That could give it sensing and decision-making capabilities on par with university research robots.
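To give a feel for what "deciding how to react" amounts to, here is a minimal sketch of a whisker-driven control loop. It is not the BugBrain's actual firmware or API; the read_whisker and set_motors helpers are hypothetical placeholders (simulated here so the loop runs as-is) that you would swap for your own board's I/O calls.

```python
import random
import time

def read_whisker(side):
    # Placeholder: on real hardware this would read a digital input pin.
    # Here we simulate occasional contacts so the loop can be run as-is.
    return random.random() < 0.1

def set_motors(left_speed, right_speed):
    # Placeholder: on real hardware this would command the motor driver.
    print(f"motors L={left_speed:+.1f} R={right_speed:+.1f}")

for _ in range(50):                      # a short simulated run
    left_hit, right_hit = read_whisker("left"), read_whisker("right")
    if left_hit and right_hit:           # head-on contact: back up
        set_motors(-0.5, -0.5)
    elif left_hit:                       # obstacle on the left: veer right
        set_motors(0.5, -0.5)
    elif right_hit:                      # obstacle on the right: veer left
        set_motors(-0.5, 0.5)
    else:                                # clear path: drive forward
        set_motors(0.7, 0.7)
    time.sleep(0.05)                     # roughly a 20 Hz reaction loop
```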

The exploration of Mars is one application that has been proposed for bug robots. Instead of sending one or two big robots, send one or two thousand bug robots equipped with small cameras and chemical sensors. One of the advantages of this approach is fault tolerance: if a few of the bugs break or get lost, it is no big deal. Another is the ability of the small bugs to get into small places such as cracks or fissures in rocks. Of course, the small size of the robots also limits the scale of tasks they can accomplish. For example, even a thousand bugs working together are not going to drill a core sample ten feet into the Martian crust.

Source: www.learnaboutrobots.com

Wednesday, 02 January 2008

Robot Vision

If you're in an industrial setting using machine vision, you will probably find an Adept robot at work. The company has spent many years interfacing its robots with vision-based tools to allow for identification and assembly of parts. One of the most basic problems in industrial assembly is a process known as parts feeding. In this scenario, the parts required for product assembly are contained in a large bin, and the assembly process requires a single part to be isolated from that bin. Adept has pioneered the use of vision in solving this part-picking problem.


One of the most fundamental tasks that vision is very useful for is the recognition of objects (be they machine parts, light bulbs, DVDs, or your next-door neighbor!). Evolution Robotics introduced a significant milestone in the near-real-time recognition of objects based on SIFT points. The software identifies points in an image that look the same even if the object is moved, rotated or scaled by some small degree. Matching these points to previously seen image points allows the software to 'understand' what it is looking at even if it does not see exactly the same image.
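To make the idea concrete, here is a rough sketch of keypoint-based recognition using OpenCV's SIFT implementation rather than Evolution Robotics' own software; the image file names and the match-count threshold are placeholder example values.

```python
import cv2

sift = cv2.SIFT_create()

obj = cv2.imread("object.png", cv2.IMREAD_GRAYSCALE)    # reference view of the object
scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)   # new image to search

kp_obj, des_obj = sift.detectAndCompute(obj, None)
kp_scene, des_scene = sift.detectAndCompute(scene, None)

# Match descriptors and keep only matches that pass Lowe's ratio test,
# which discards ambiguous correspondences.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des_obj, des_scene, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

# A healthy number of surviving matches suggests the object is present,
# even if it has been moved, rotated, or mildly rescaled.
print(f"{len(good)} good matches")
if len(good) > 20:   # threshold chosen arbitrarily for illustration
    print("Object likely recognized in the scene")
```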

As the hobbyist robotics market rapidly grows, so does the range of machine vision choices that the hobbyist has at their disposal. The CMUCam (initially created at Carnegie Mellon) is by far the most popular vision camera; it can track an object based on its color and, if it is mounted on servos (small motors), even move the camera to follow the object. Thanks to its low price and ease of use, it has become very widely used by hobby and academic roboticists.
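As a rough illustration of the kind of color tracking a CMUCam performs onboard, the sketch below does a similar job on a PC with a USB camera and OpenCV; the HSV color bounds are example values that would need tuning for your target and lighting, and the servo move is left as a comment.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                       # first USB camera
lower = np.array([0, 120, 120])                 # example HSV lower bound (reddish target)
upper = np.array([10, 255, 255])                # example HSV upper bound

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)       # pixels matching the target color
    moments = cv2.moments(mask)
    if moments["m00"] > 0:
        cx = int(moments["m10"] / moments["m00"])   # blob centroid (x)
        cy = int(moments["m01"] / moments["m00"])   # blob centroid (y)
        cv2.circle(frame, (cx, cy), 8, (0, 255, 0), 2)
        # A servo-mounted camera would nudge its pan/tilt here to keep
        # (cx, cy) near the center of the image.
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```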

Also known as the "great desert race", the DARPA Grand Challenge was a race proposed by the Defense Advanced Research Projects Agency that required unmanned robotic automobiles to race from California to Nevada. The race attracted the best that mobile robotics had to offer and spanned many fields of robotics, including vision. Because of the ground speed required of the vehicles (an average of about 30 mph), many of the racing vehicles used vision as a forward-looking sensor to help estimate the direction the car should be going long before it got there. The winner, Stanley from Stanford, used such a technique combined with shorter-distance LIDAR systems.

RoboRealm attempts to reduce the complexity of using machine vision in robotics by providing a comprehensive user interface for experimenting with different vision filters. Using RoboRealm you can add vision capabilities to your robot with inexpensive USB cameras and the PC you already have. With interfaces for extending the application with your own custom filters, and modules for connecting to most of the popular servo controllers, RoboRealm is quickly becoming the vision software to use with your robot.



Source: www.learnaboutrobots.com

Industrial Robots

Modern industrial robots are true marvels of engineering. A robot the size of a person can easily carry a load of over one hundred pounds and move it very quickly with a repeatability of +/-0.006 inches. Furthermore, these robots can do that 24 hours a day for years on end with no failures whatsoever. Though they are reprogrammable, in many applications (particularly those in the auto industry) they are programmed once and then repeat that exact same task for years.

A six-axis robot like the yellow one below costs about $60,000. What I find interesting is that deploying the robot costs another $200,000, so the robot itself accounts for less than a quarter of the total system cost. The tooling the robot uses, combined with the cost of programming the robot, makes up the bulk of the expense. That's why robots in the auto industry are rarely reprogrammed. If a company is going to go to the expense of deploying a robot for another task, it may as well use a new robot.

This is pretty much the typical machine people think of when they think of industrial robots. Fanuc makes this particular robot. Fanuc is the largest maker of this type of robot in the world, and their robots are almost always yellow. This robot has six independent joints, also called six degrees of freedom. The reason for this is that arbitrarily placing a solid body in space requires six parameters: three to specify the location (x, y, z for example) and three to specify the orientation (roll, pitch, yaw for example).
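As a quick illustration of why six numbers are enough, the sketch below packs a position (x, y, z) and an orientation (roll, pitch, yaw) into the 4x4 homogeneous transform commonly used in robot kinematics; the pose values passed in at the end are arbitrary examples.

```python
import numpy as np

def pose_to_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a position plus roll/pitch/yaw (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Rotation composed as Rz(yaw) @ Ry(pitch) @ Rx(roll)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # upper-left 3x3 block: orientation
    T[:3, 3] = [x, y, z]       # last column: position
    return T

print(pose_to_matrix(0.5, 0.2, 1.0, 0.0, np.pi / 6, np.pi / 2))
```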

If you look closely you will see two cylindrical pistons on the side of the robot. These cylinders contain "anti-gravity" springs that are a big part of the reason robots like these can carry such heavy loads. The springs counterbalance gravity, similar to the way the springs on a garage door make it much easier for a person to lift.

You will see robots like these welding, painting and handling materials.

The robot shown at right is made by an American company, Adept Technology. Adept is America's largest robot company and the world's leading producer of SCARA robots. This is actually the most common industrial robot. SCARA stands for Selective Compliance Articulated (though some folks use Assembly here) Robot Arm. The robot has three joints in the horizontal plane that give it x-y positioning and orientation parallel to the plane, and one linear joint that supplies the z positioning. This is the typical "pick and place" robot. When combined with a vision system it can move product from conveyor belt to package at very high speed (think "Lucy and the candies" but way faster).
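A minimal forward-kinematics sketch for a generic SCARA-style arm (not Adept's own controller software) shows how this joint layout maps to x-y position, in-plane orientation and z; the link lengths and joint values below are made-up examples.

```python
import math

def scara_forward(theta1, theta2, theta3, d, L1=0.4, L2=0.3):
    """Return (x, y, z, orientation) of the tool for the given joint values."""
    # Two revolute joints in the horizontal plane set the x-y position...
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    # ...the prismatic joint sets z (extending downward here)...
    z = -d
    # ...and the wrist joint sets the tool's orientation parallel to the plane.
    orientation = theta1 + theta2 + theta3
    return x, y, z, orientation

x, y, z, yaw = scara_forward(math.radians(30), math.radians(45), math.radians(-15), 0.05)
print(f"tool at x={x:.3f} m, y={y:.3f} m, z={z:.3f} m, yaw={math.degrees(yaw):.1f} deg")
```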

The robot's joint structure allows it to be compliant (or soft) to forces in the horizontal plane. This is important for "peg in hole" type applications where the robot will actually flex to make up for inaccuracies and allow very tight part fits.

The machine at left can be called a Cartesian robot, though calling this machine a robot is really stretching the definition. It is Cartesian because it allows x-y-z positioning: three linear joints provide the three axes of motion along the x, y and z directions. This robot is suited to pick and place applications where either there are no orientation requirements or the parts can be pre-oriented before the robot picks them up (such as surface-mount circuit board assembly).



Source: www.learnaboutrobots.com

Tuesday, 01 January 2008

Airborne Robots

The little device at the left is a mock-up of an ambitious project at UC Berkeley to develop an artificial fly. If you ask me, they don't have a chance of succeeding. The challenges are just too great. They need to get the tiny wings flapping at 150 times per second, there needs to be some means of keeping the system stable in the air and somehow it has to navigate. And all this on something the size of a dime. They have gotten one wing to flap fast enough that, if they mount it on a little wire boom, it will generate some thrust. In other words they are nowhere close after years of work. This may be the type of system that can only be developed via evolution.

At right we see a little robot blimp made with a polymer balloon. These blimps are available as R/C controlled toys. They can be modified to add sensors and computational hardware which can transform them into robots. I think they are a great way to experiment with obstacle avoidance and machine-based decision making. You can go straight to the machine intelligence and skip the engineering of a mobility platform. Well, unless you think engineering the mobility platform is the fun part.

I love this little robot plane developed by the Navy. They call it the "Silver Fox" and it really does use an engine from the world of R/C planes. This is no R/C plane though. It is capable of fully autonomous flight and is designed for reconnaissance, intelligence, surveillance and target acquisition by small military units. The current model carries commercially available sensors. The goal is to give the Silver Fox, which is also known as the Smart Warfighter Array of Reconfigurable Modules (SWARM), 24-hour endurance, a 1,500-mile range and a maximum altitude of 10,000 feet. The idea of 100 of these things filled with explosives flying 1,000 miles and then closing on an enemy target like a swarm of mad bees is truly terrifying.

Source: www.learnaboutrobots.com