Have you seen the Spider-Man movie in which Dr. Octopus becomes one of the hero's most dangerous and recurring enemies? As you probably know, Dr. Octopus has four tentacles made of adamantium. They are telescopic, so the tentacles can stretch over 20 feet.
Festo has already built something like this!
The Bionic Handling Assistant is a light, free-moving “third hand” system.
The structure and functioning of the Bionic Handling Assistant was inspired by the elephant’s trunk. The trunk segments are littered with resistance sensors that help it to be aware of contact with people and objects around it. A four-fingered gripper at the end is also designed to need little force to grasp a range of objects.
This multifunctional hand can be applied to a range of industrial applications, household tasks, an extra set of hands in the lab, and many other uses that help humans. So Festo set out to create an arm that is both strong and dexterous, as well as safe for the humans working alongside it.
SmartBird is a flying robot made by Festo that operates on the same principle as the herring gull. SmartBird uses a single electric motor and four servos: two interconnected rods dictate the wing-flapping movement, a servo located at each wingtip actively twists the wing, and two more servos control the tail and head.
Its wings follow a periodic reciprocating pattern that on its own creates the lift and thrust needed for flight.
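The coordination between flapping and active wing twist can be pictured with a small sketch (all parameters and function names here are hypothetical illustrations; Festo's actual control law is not public): the wingtip servo's twist command is driven as a phase-shifted copy of the flap cycle, so the wing's angle of attack differs between upstroke and downstroke.

```python
import math

FLAP_AMPLITUDE_DEG = 40.0       # peak flap angle (assumed value)
TWIST_AMPLITUDE_DEG = 15.0      # peak wingtip twist (assumed value)
TWIST_PHASE_LEAD = math.pi / 2  # twist leads the flap by a quarter cycle

def flap_angle(phase: float) -> float:
    """Flap angle (degrees) at a point in the wingbeat cycle [0, 2*pi)."""
    return FLAP_AMPLITUDE_DEG * math.sin(phase)

def wingtip_twist(phase: float) -> float:
    """Wingtip servo command: a phase-shifted copy of the flap motion,
    so the wing pitches differently on upstroke and downstroke,
    producing thrust as well as lift."""
    return TWIST_AMPLITUDE_DEG * math.sin(phase + TWIST_PHASE_LEAD)

# At mid-downstroke (phase = pi/2) the flap is at its peak while the
# twist passes through zero, a quarter cycle out of phase.
print(round(flap_angle(math.pi / 2), 1))  # 40.0
```

The quarter-cycle phase lead is the standard way ornithopters turn a pure up-and-down flap into forward thrust; the exact offset on the real aircraft would be tuned in flight.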
Another "ornithopter," the Nano-Hummingbird, was developed by the company AeroVironment Inc. The miniature spybot looks like a hummingbird, complete with flapping wings, and is only slightly larger and heavier than most hummingbirds, but smaller than the largest species.
The ornithopter can fly into buildings under the control of an operator flying the spybot with the help of a feed from its tiny video camera. The prototype is capable of flying at speeds of up to 18 km/h (11 mph) and weighs 19 grams, which is about the same as an AA battery.
Manager of the project, Matt Keennon, said it had been a challenge to design and build the spybot because it “pushes the limitations of aerodynamics.” The specifications given to the firm by the Pentagon included being able to hover in an 8 km/h wind gust and being able to fly in and out of buildings via a normal door.
The spybot was developed for the US military’s research arm, the Defense Advanced Research Projects Agency (DARPA). The hummingbird appearance is intended to disguise the bot, although it would look decidedly out of place and would attract attention in most places in the world since hummingbirds are not found outside of the Americas.
DARPA's head of the Nano Air Vehicles (NAV) program, Dr. Todd Hylton, said the successful flight tests pave the way for new vehicles that resemble small birds and match their agility. The new drone is a departure from existing NAVs, which in the past have always resembled helicopters or planes.
Japanese researchers have developed a robotic bird that can move freely in the air with rapid wing movements. The robot bird is similar in size to a real hummingbird and is equipped with micro motors and four wings that flap 30 times per second, said Hiroshi Liu, a researcher at Chiba University, east of Tokyo.
Professor Hiroshi Liu of Japan's Chiba University displays his flying robot, which flaps its wings 30 times per second like a hummingbird, at his laboratory in Chiba city, suburban Tokyo, December 28.
This unique robot weighs 2.6 grams and can fly eight times more stably than a propeller-driven helicopter. The robot bird is controlled via infrared sensors and can move up, down, right or left.
Liu plans to make it hover in place in the air, and to equip it with a micro camera in March 2011.
Development of the robot cost 200 million yen (US$2.1 million); it could be used to help rescue people trapped in damaged buildings, to search for criminals, or even to operate as a probe vehicle on Mars.
RoboBees is a collaboration among investigators at the Harvard University School of Engineering and Applied Sciences, the Wyss Institute for Biologically Inspired Engineering, and Centeye.
This project integrates approaches at the body, brain and colony level. Inspired by the biology of a bee and the insect's hive behavior, we aim to push advances in miniature robotics and the design of compact high-energy power sources; spur innovations in ultra-low-power computing and electronic "smart" sensors; and refine coordination algorithms to manage multiple, independent machines.
Anatomy of the Robobee
- Brain: Simple circuits handle basic functions, including balance and hovering. A microprocessor runs the bee’s high-level functions, such as processing data from sensors.
- Eyes: Ultraviolet sensors scan for natural UV patterns on flowers. Digital cameras track objects below the bee to determine how fast and far it’s flying. Light sensors follow the sun to tell if the bee is flying north or south.
- Wings: An actuator flaps two lightweight carbon-fiber wings.
- Antennae: The antennae beam data between bees and could act like whiskers to prevent the ’bot from bumping into things.
- Feet: In the hive, three-pronged feet lock the bee into a docking station to recharge its micro fuel cell and upload sensor data to a computer. The feet could also help grab pollen from flowers.
Credit: Graham Murdoch
How RoboBees Pollinate an Orchard
STEP 1: Establish Home Base. A farmer sets up a mobile RoboBee "hive." In the future, an autonomous robot could haul the hive from field to field.
STEP 2: Survey the Landscape. Scout RoboBees leave the hive first and use their ultraviolet sensors to locate the same UV patterns on flower petals that real bees look for. Cameras on the bee's head record landmarks underneath the bee to give it a sense of where and how far it has traveled.
STEP 3: Make a Map. The scouts return to the hive to recharge and upload flower locations to a central computer, which maps the entire orchard as more scouts report in.
STEP 4: Get Pollinating. Worker bees, outfitted with fewer sensors and bigger batteries for longer trips, head directly for the flowers, picking up pollen from one and delivering it to others.
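The scout/map/worker workflow above can be sketched as a toy simulation (all names and data here are hypothetical illustrations; the real RoboBee software is not public):

```python
# Toy simulation of the scout -> map -> worker pollination workflow.
flowers = {"A": (2, 3), "B": (5, 1), "C": (4, 4)}  # flower id -> orchard position

def scout_survey(field):
    """STEP 2: scouts locate flowers (here the field is simply handed to us)."""
    return list(field.items())

def build_map(reports):
    """STEP 3: the hive computer merges scout reports into a single map."""
    return dict(reports)

def pollinate(orchard_map):
    """STEP 4: workers visit every mapped flower, carrying pollen from
    each flower to the next one on the route."""
    deliveries = []
    carrying = None
    for fid in sorted(orchard_map):
        if carrying is not None:
            deliveries.append((carrying, fid))  # pollen moved carrying -> fid
        carrying = fid
    return deliveries

print(pollinate(build_map(scout_survey(flowers))))  # [('A', 'B'), ('B', 'C')]
```

The key division of labor survives even in this sketch: scouts only report positions, the hive computer owns the map, and workers only consume it.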
Researchers at the Robotics and Mechanisms Laboratory at Virginia Tech have designed a series of serpentine robots that are able to climb poles and inspect structures too dangerous or inaccessible for humans. The robots coil themselves around a beam and roll upward using an oscillating joint motion, gathering important structural data with cameras and sensors.
A 2006 US Bureau of Labor Statistics report listed 809 fatal falls from raised structures and scaffolding. The RoMeLa team hopes that by increasing the use of autonomous robots in construction, humans can work in safer conditions. The HyDRAS models (Hyper-redundant Discrete Robotic Articulated Serpentine for climbing) use electric motors, while the CIRCA (Climbing Inspection Robot with Compressed Air) uses a compressed-air muscle. Currently the robots are tethered to laptops, but future designs will incorporate a microprocessor and power source, allowing them to operate independently. All robots in the series are roughly three feet long, though the CIRCA is lighter than the HyDRAS.
Dennis Hong, director of Virginia Tech's Robotics and Mechanisms Laboratory, said, "The use of compressed air makes this approach feasible by enabling it to be lightweight, providing compliant actuation force for generating the gripping force for traction, and allowing it to use a simple discrete control scheme to activate the muscles in a predetermined sequence."
"These are really wicked cool robots," Hong said. "Unlike the inchworm-type gaits often being developed for serpentine robot locomotion, this novel climbing gait requires the serpentine robot to wrap around the structure in a helical shape, and twist its whole body to climb or descend by rolling up or down the structure."
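The "simple discrete control scheme" Hong describes can be pictured as cycling a fixed activation pattern along the robot's air muscles (a hypothetical sketch with made-up muscle counts; the actual CIRCA controller is not detailed here): inflating a contiguous band of muscles that rotates one position per step makes the helically wrapped body roll around the pole.

```python
NUM_MUSCLES = 8  # air muscles spaced along the body (assumed count)

def activation_pattern(step: int, width: int = 3):
    """Return which muscles are inflated at a given step of the
    predetermined sequence: a contiguous band of `width` muscles that
    rotates one position per step, rolling the helix up the pole."""
    return [(step + k) % NUM_MUSCLES for k in range(width)]

# Over one full cycle every muscle fires the same number of times,
# so the rolling motion is steady.
for step in range(4):
    print(step, activation_pattern(step))
# 0 [0, 1, 2]
# 1 [1, 2, 3]
# 2 [2, 3, 4]
# 3 [3, 4, 5]
```

The appeal of such a scheme is that each muscle is simply on or off in a fixed order, so no per-joint feedback control is needed to produce the climbing motion.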
The HyDRAS-Ascent, HyDRAS-Ascent II, and CIRCA recently earned recognition at the 2008 International Symposium on Educational Excellence.
About Robonaut Robonaut is a humanoid robot designed by the Robot Systems Technology Branch at NASA’s Johnson Space Center in a collaborative effort with DARPA. The Robonaut project seeks to develop and demonstrate a robotic system that can function as an EVA astronaut equivalent. Robonaut jumps generations ahead by eliminating the robotic scars (e.g., special robotic grapples and targets) and specialized robotic tools of traditional on-orbit robotics. However, it still keeps the human operator in the control loop through its telepresence control system. Robonaut is designed to be used for “EVA” tasks, i.e., those which were not specifically designed for robots.
The goal is to build machines that can help humans work and explore in space. Working side by side with humans, or going where the risks are too great for people, machines like Robonaut will expand our ability for construction and discovery. Central to that effort is a capability we call dexterous manipulation, embodied by an ability to use one's hands to do work, and our challenge has been to build machines with dexterity that exceeds that of a suited astronaut. The resulting robotic system, called Robonaut, is the product of NASA and DARPA collaboration, supporting the hard work of many JSC engineers who are determined to meet these goals.
Robonaut uses a humanoid shape to meet NASA's increasing requirements for Extravehicular Activity (EVA, or spacewalks). Over the past five decades, space flight hardware has been designed for human servicing. Space walks are planned for most of the assembly missions for the International Space Station, and they are a key contingency for resolving on-orbit failures. Combined with our substantial investment in EVA tools, this accumulation of equipment requiring a humanoid shape and an assumed level of human performance presents a unique opportunity for a humanoid system.
While the depth and breadth of human performance is beyond the current state of the art in robotics, NASA targeted the reduced dexterity and performance of a suited astronaut as Robonaut’s design goals, specifically using the work envelope, ranges of motion, strength and endurance capabilities of space walking humans. This website describes the design effort for the entire Robonaut system, including mechanisms, avionics, computational architecture and telepresence control.
Robotic hands have been around for decades but they usually bear little more than a passing resemblance to the real thing. Now NASA researchers have raised the bar with a robotic hand that closely mimics the inner workings of the human hand.
The hand, part of the ongoing Robonaut project, is designed to use the tools and handholds astronauts use during space walks. This purpose, more than aesthetics, led the researchers to copy the human hand as closely as they did, said Chris S. Lovchik, an engineer at NASA’s Johnson Space Center in Houston.
"The more you begin to look at tool use, [you find that different tools] involve different portions of the hand," he said. For example, the palm of the Robonaut hand had to be accurately modeled in order for the hand to grasp a screwdriver in alignment with the roll of the arm, he said. The device is a right hand attached to a wrist and forearm. It has 12 controlled degrees of motion and 42 sensors for tracking the position and velocity of the hand's moving parts. The researchers are adding tactile sensors. "It's one of the best [robotic hands] that I've seen," said Reid Simmons, a senior research scientist at the Robotics Institute at Carnegie Mellon University. "It's really quite an amazing piece of work. It's got very good dexterity. It's amazing how compact it all is."
The Robonaut system, which will have a torso, two arms and a head, is designed to be controlled by a human operator. "The overall objective is essentially to create a surrogate for the astronauts," Lovchik said. Researchers are programming primitives, or sets of commands for simple actions, that make the hand easier for the operators to use. For instance, you don't think about how to draw a circle because your brain learned the primitives for drawing a circle in early childhood. The researchers plan to automate simple tasks like grasping and could eventually make the hand fully automated, according to Lovchik. Fully automating the hand will be a major project, according to CMU's Simmons. "A lot of what [humans] do very well is very fine force feedback control," Simmons said. "If you're putting a nut on a bolt you can feel when it's getting stuck and when it's too tight, and you can compensate for that. That type of [control] is beyond current state-of-the-art."
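The idea of a primitive, a canned sequence of low-level commands that the operator triggers as a single action, can be sketched like this (the names, joint layout, and angles are hypothetical illustrations, not the actual Robonaut API):

```python
# A primitive is a named sequence of low-level joint targets that the
# operator triggers as one command, instead of steering every joint by hand.
PRIMITIVES = {
    # joint-angle waypoints in degrees: (thumb, index, middle, ring)
    "open_hand":   [(0, 0, 0, 0)],
    "power_grasp": [(0, 0, 0, 0), (30, 45, 45, 45), (60, 90, 90, 90)],
}

def run_primitive(name):
    """Step through the waypoints of one primitive, yielding each
    joint-target tuple in order."""
    for waypoint in PRIMITIVES[name]:
        yield waypoint  # a real controller would command the motors here

steps = list(run_primitive("power_grasp"))
print(len(steps), steps[-1])  # 3 (60, 90, 90, 90)
```

The operator's job then shrinks to choosing which primitive to fire and when, which is exactly the kind of high-level teleoperation the article describes.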
Robonaut could be ready for space missions in five years, according to Lovchik. Funding for the project comes from NASA and the Department of Energy.
Overall Design Description
Robonaut will have a humanoid design in order to mimic the movements of a real person. Robots aren't new to the space program. Robotic probes and rovers have been traveling to Mars since before man stepped foot on the moon. In 1965, the Mariner IV planetary probe sent back the first images of the red planet at close range. In 1997, the Pathfinder rover provided scientists with unprecedented detail of the Martian atmosphere and surface. What's different about the latest robotic astronaut is that it has a humanoid design with a head, two eyes, arms and five-digit hands. Let's take a look at the individual parts that make up the Robonaut:
Head — Two small color video cameras mounted in the head piece deliver stereo vision to the astronaut operating the Robonaut. Stereolithography was used to make an epoxy-resin helmet to cover and protect the head piece. The neck is jointed to allow the head to turn side to side and up and down.
Torso — The torso provides a central unit for connecting the peripheral arm, head and leg attachments. It also houses the control system.
Leg — The one part of the Robonaut's design that deviates from the humanoid look is that it has only one leg. The leg's only function is to provide support when the hands are unable to.
Arms — Just like its human counterparts, the Robonaut will have two arms that can move in many directions and have a greater range than our own arms. The arms will be equipped with more than 150 sensors each and will be densely packed with joints. Space-rated motors, harmonic drives and fail-safe brakes will be integrated into each arm.
Hands — Perhaps the most impressive parts of the Robonaut are its hands. Its hands are the closest to the size and ability of human hands inside a space suit. The jointed hand may even exceed the movements of a suited human hand. The fourteen brushless motors that power each hand are housed inside the eight-inch-long forearm. The hand has four fingers and an opposable thumb. The hand was designed with five digits so that it would be compatible with tools designed for humans. Researchers have demonstrated the Robonaut's ability to pick up a small metal washer with tweezers. Together, the arm and hand unit can lift 21 pounds (9.5 kg), which doesn't sound like much, but in a weightless environment it's plenty of strength. The Robonaut is an ongoing project at Johnson Space Center (JSC). NASA has spent about $3 million and three years to develop this advanced space robot. However, Robonaut is unlikely to visit space in the next five years. Here are the current specifications for Robonaut:
- Height: 6.23 ft. (1.9 m)
- Weight: 410 lbs. (182 kg)
- Materials: Mostly aluminum with Kevlar and Teflon padding to protect it from fire and debris.
A newly created robot improves upon a gecko’s sticking power.
Climbing the walls: As the robot’s motor turns, its tail presses against the surface, and its triangular legs rotate forward (a). As its front feet come into contact with the surface, the motor torque caused by the tail’s contact with the surface presses the front feet against the surface while pulling away the rear feet (b). When the force acting on the rear foot reaches a critical point, it peels away from the surface, and the robot steps forward (c). Credit: Courtesy Michael P. Murphy and Metin Sitti
Researchers have created a robot that can run up a wall as smooth as glass and onto the ceiling at a rate of six centimeters a second. The robot currently uses a dry elastomer adhesive, but the research group is testing a new geckolike, ultrasticky fiber on its feet that should make it up to five times stickier.
It's not the first robot to use fiberlike dry adhesives to stick to surfaces, says Metin Sitti, an assistant professor of mechanical engineering, who led the research at the Robotics Institute at Carnegie Mellon University (CMU), in Pittsburgh. But this robot should prove to have far greater sticking power, thanks to fibers that are twice as adhesive as those used by geckos.
Such robots could, among other applications, be used to inspect the hulls of spacecraft for damage, their stickiness ensuring that they would stay attached.
In addition to its sticky feet, the robot uses two triangular wheel-like legs, each with three foot pads, and a tail to enable it to move with considerable agility compared with other robots, says Sitti. Not only can it turn very sharply, but its novel design allows it to transfer from floor to wall and wall to ceiling with great ease.
“It is very compact and has great maneuverability,” says Mark Cutkosky, a professor of mechanical engineering and codirector of the Center for Design Research at California’s Stanford University. “It is a practical solution for climbing.”
Geckos are able to stick to surfaces thanks to very fine hairlike structures on their feet called setae. These angled fibers split into even finer fibers toward their tips, giving the gecko’s foot a spatula-like appearance. These end fibers have incredibly weak intermolecular forces to thank for their adhesiveness: the attractive forces act between the fiber tips and the surface they are sticking to. Individually, the forces are negligible, but because the setae form such high areas of contact with surfaces, the forces add up.
In the past few years, a number of research groups have fabricated fiber structures designed to emulate setae. But Sitti's group has tried to improve upon the gecko's design. Using microfabrication techniques, Sitti and his colleagues created fibers just four micrometers in diameter, two orders of magnitude smaller than those used in any other robots. "This size difference makes a significant difference," says Sitti. This is because scaling down the fibers increases their surface contact and hence enhances adhesion.
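The scaling argument can be made concrete with the standard contact-splitting result from adhesion theory (which matches what Sitti describes, though the numbers below are illustrative): if a pad of fixed total area is split into n self-similar fibers, the total pull-off force grows roughly as the square root of n, because JKR contact theory predicts per-fiber pull-off force scales with fiber radius.

```python
import math

def pulloff_gain(n_fibers: int) -> float:
    """Contact splitting: dividing one circular contact of fixed total
    area into n self-similar fibers multiplies the total pull-off force
    by sqrt(n), since each fiber's pull-off force scales with its radius
    (JKR theory) and fiber radius shrinks only as 1/sqrt(n)."""
    return math.sqrt(n_fibers)

# Shrinking fiber diameter by two orders of magnitude at fixed pad area
# packs in 100^2 = 10,000x more fibers, for roughly 100x the adhesion
# of a single unsplit pad.
n = 100 ** 2
print(pulloff_gain(n))  # 100.0
```

This is why the jump from hundred-micrometer to four-micrometer fibers is "a significant difference": the gain compounds with every halving of fiber size.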
Using the commercial elastomer adhesives, the robot can already climb far more nimbly than any other robot. But the fibers should make it possible for the robot to climb even rough surfaces, says Sitti. However, having only just integrated them into the robot, the researchers have yet to demonstrate this.
One of the challenges in making a robot stick to walls lies in finding a way to apply sufficient pressure to make them stick. The new CMU robot handles this using a tail. At any one moment, at least two of its six foot pads are in contact with the surface, as is the tail, which is spring-loaded so that it will always push against the surface, even when on the ceiling.
However, in developing these materials, the researchers still need to resolve some issues, says Andre Geim, a professor of condensed-matter physics at the University of Manchester, in the United Kingdom, who has also fabricated setaelike structures. “No one has yet explained why geckos can first run on a dirt road picking up dust and then somehow climb up walls,” he says. “This is a major obstacle.”
Cutkosky agrees that more research needs to be done into the self-cleaning abilities of geckos. “The world is dirty, and robots cannot be stopping to wash their feet every few meters,” he says.
Massachusetts Institute of Technology student Huan Liu of Shanghai, China, positions a robot gardener near a tomato plant while demonstrating its capabilities in the Artificial Intelligence Laboratory on the school's campus in Cambridge, Mass. (AP Photo/Steven Senne)
A class of undergraduates at the Massachusetts Institute of Technology has created a set of robots that can water, harvest and pollinate cherry tomato plants.
The small, $3,000 robots, which move through the garden on a base similar to a Roomba vacuum, are networked to the plants. When the plants indicate they need water, the robots can sprinkle them from a water pump. When the plants have a ripe tomato, the machines use their arms to pluck the fruit.
Even though robots have made few inroads into agriculture, these robots’ creators hope their technology eventually could be used by farmers to reduce the natural resources and the difficult labor needed to tend crops.
Last spring, Daniela Rus, a professor who runs the Distributed Robotics Lab at MIT, began a two-part course. In the first semester, the students learned the basics of creating and using robots. By the fall, the students were ready to have robots tackle a real-world problem. Rus and Nikolaus Correll, a postdoctoral assistant in Rus’ lab, challenged the students to create a “distributed robotic garden” by the end of the semester.
The 12 students broke into groups, each tasked with solving a different problem, such as creating the mechanical arm needed to harvest the tomatoes or perfecting the network that let the plants and robots share information.
By the end of the fall term, the “garden” inside Rus’ lab was green and growing.
Now there are four cherry tomato plants nestled into a plywood base covered in fake grass. Next to each pot is a gray docking station for the robots.
Each plant and robot is connected to a computer network. The plants, through sensors in their soil, can tell the network when they need water or fertilizer, while the robots use a camera to inventory the plants’ fruit. The robots also are programmed with a rudimentary growth model of the cherry tomato plants, which tells them roughly when a tomato will be ripe enough to be picked.
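The plant-and-robot network can be sketched as a simple request/dispatch loop (the class names, thresholds, and readings below are hypothetical illustrations; the students' actual software is not shown in the article):

```python
# Minimal sketch of the networked garden: each plant publishes its
# soil-sensor reading, and the network builds a visit list for a robot.
class Plant:
    def __init__(self, pid: str, moisture: float):
        self.pid = pid
        self.moisture = moisture  # soil-sensor reading, 0.0 (dry) to 1.0 (wet)

    def needs_water(self, threshold: float = 0.3) -> bool:
        return self.moisture < threshold

def dispatch(plants):
    """Collect watering requests from every plant's soil sensor and
    return the list of plant ids a robot should visit."""
    return [p.pid for p in plants if p.needs_water()]

garden = [Plant("tomato-1", 0.8), Plant("tomato-2", 0.2), Plant("tomato-3", 0.25)]
print(dispatch(garden))  # ['tomato-2', 'tomato-3']
```

The inversion is the interesting design choice: the plants initiate service requests through their sensors, and the robots merely respond, rather than watering on a fixed schedule.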
But the students quickly encountered challenges, both robotic and biologic.
Huan Liu, a 21-year-old computer science major, said designing the robot to pick the delicate tomatoes was made more difficult because the fruit would grow in unreachable places, such as behind stems or above where the robot’s arm could reach.
“The tomatoes, they come out of nowhere, or just in weird places,” Liu said.
Robots have made factory assembly lines more efficient and are being developed for in-home purposes, such as serving as health care aides. Yet there hasn’t been much use for robotics in agriculture, partly because of the challenge of getting machines to work in unpredictable environments.
There have been attempts to get robots to replace humans at farm tasks, from thinning apple trees to picking asparagus, but none of the machines “have sufficient capacity to compete with human beings,” said Tony Grift, an associate professor in the Department of Agricultural and Biological Engineering at the University of Illinois.
Even when technology has proven to be useful in agriculture, such as on tractors equipped with satellite imagery of fields, it often is prohibitively expensive.
Rus and Correll hope to conquer those kinds of challenges and get robots to work in farms.
“Agriculture contributes a lot of damage to the land, the soil, the water and the environment,” Rus said. “So if we can figure out a way of using robots and automation to deliver nutrients to plants – pesticides, fertilizers, water when it’s needed – instead of sort of mass spreading them, then we hope we would have an impact on the environment.”