
Miniature Autonomous Robotic Vehicle (MARV)

National Museum of American History
In 1996 researchers at Sandia National Laboratories, Albuquerque, N.M., developed tiny robots to investigate the miniaturization of mechanical systems. They sought to demonstrate the feasibility and learn the limitations of using commercially available components to assemble tiny autonomous mobile vehicles. About one cubic inch in volume, MARV housed all necessary power, sensors, computers and controls on board. It was the first robot of its kind made at Sandia and among the smallest autonomous vehicles anywhere.

On a custom track, the four-wheeled MARV detects and then follows a buried wire carrying a fixed radiofrequency (a 96 kHz signal). To accomplish this, the robot employs two Sandia-designed sensors to measure the relative strength of the radio signal. Based on the signal, the on-board computer decides where to move and directs two drive motors to steer toward the signal. Approximately 300 lines of computer code control the vehicle.
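The steering logic described above — compare two signal strengths, turn toward the stronger one — can be sketched as a simple differential-drive rule. This is an illustrative sketch, not Sandia's actual code; the function names and the gain constant are assumptions for the example.

```python
# Illustrative sketch of differential line-following: steer a two-motor
# vehicle by comparing signal strength from a left and a right sensor.
# The gain and base speed are invented for the example.

def steer(left_signal: float, right_signal: float,
          base_speed: float = 1.0, gain: float = 0.5) -> tuple[float, float]:
    """Return (left_motor, right_motor) speeds.

    If the right sensor reads a stronger signal, the buried wire lies to
    the right, so speed up the left wheel and slow the right wheel to
    turn toward it (and vice versa).
    """
    error = right_signal - left_signal       # > 0 means wire is to the right
    left_motor = base_speed + gain * error   # faster left wheel turns right
    right_motor = base_speed - gain * error  # slower right wheel turns right
    return left_motor, right_motor
```

With equal readings the vehicle drives straight; an imbalance steers it back over the wire, which is the whole control loop a few hundred lines of code can comfortably express.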

MARV’s main developers were Barry Spletzer, Thomas Weber, Jon Bryan, and Michael Martinez.

Tin Toy, Robot, Space Tank

National Air and Space Museum
This battery-operated space-themed toy tank is a tin toy that was manufactured in China, most likely in the 1960s, for export. It illustrates how toy makers tapped into the contemporary fascination with space exploration by adding a space theme to a toy even when the basic form of the toy was not space-related. Except for the label "Space Tank" and a few small rockets lithographed onto the sides of the metal toy, there is nothing inherently space-y about this toy tank. The toy would have been created to compete in a tin toy marketplace that by the late 1950s had come to be dominated by creatively designed, complex Japanese tin toys with moving parts and/or lights.

The Museum accessioned this toy into its collection in 2001.

Underwater Robot Labs Monitor Toxins

Smithsonian Magazine

Almost exactly three years ago, in August 2014, residents of Toledo, Ohio were told to immediately stop drinking their city water. The “do not drink” advisory lasted three days, and sent residents across state lines in search of bottled water. Nearly half a million people were affected.

The culprit? A blue-green algae called cyanobacteria in Lake Erie, the city’s water supply. When conditions are right, cyanobacteria blooms into large, sludgy mats. These blooms can produce a toxin called microcystin, which causes a number of health effects in humans, ranging from rashes and diarrhea to liver damage. Due to climate change and human impacts like agricultural runoff, these toxic blooms are becoming more common.

“The problem is really worldwide,” says aquatic ecologist Tom Johengen, associate director of the Cooperative Institute for Great Lakes Research at the University of Michigan.

Johengen and his colleagues hope Lake Erie, one of the worst-affected lakes in America, may be one of the first to benefit from a new solution. They’re experimenting with a new technology – a lake-bottom “robotic lab” – to test water and give information and early warnings about pollution.

The technology is called an environmental sample processor, or ESP, and is positioned on the lake bottom four miles from the water intake for the Toledo municipal water supply. Looking rather like an industrial garbage compactor, the ESP is sometimes described as a "lab in a can." The fully automated ESP tests the water once or twice a day, and sends the results wirelessly to researchers.

This is much faster than the traditional process, which involves researchers traveling by boat to various locations, collecting, filtering and extracting water samples, then analyzing them for toxins. That can take up to two days. And while water treatment plants monitor their supply for toxins as well, they test the water at the point of intake. This means if they find something, it’s already essentially inside the water treatment plant. The lab-in-a-can could give up to a day of warning of approaching algal toxins.

Lake Erie’s ESP is the first of its kind to be used in a freshwater system. There are similar labs off the coasts of Maine and Washington, as well as other locations, used mainly to monitor for toxins that might affect shellfish. Research from Stanford has shown ESPs can help give early warning to fishermen and recreational boaters in a saltwater setting, letting them know the water and fish within it might be contaminated. But as cyanobacteria blooms get worse, researchers say ESPs will likely become more common in freshwater.

Climate change is going to exacerbate the problem for two reasons, Johengen says. The first is warming waters. Cyanobacteria like warmth, and thrive in temperatures above 68 degrees Fahrenheit. Other algae prefer cooler temperatures, so when waters get warm enough, cyanobacteria begin to outcompete them and take over large areas. The second reason is runoff. Climate change alters weather patterns and produces more intense storms. Heavy rainfalls generate a lot of agricultural runoff, draining fertilizers from farms into the water supply. Cyanobacteria devour and thrive off these nutrients.

“The combination of warmer waters and high inputs from runoff can really spark these blooms,” Johengen says.

The researchers hope to use the ESP data in conjunction with computer models to understand exactly how the cyanobacteria blooms behave. They plan to track bloom movement both horizontally and vertically within the water, using information about currents and wind. This is important because the location and movement of a bloom can predict how it might affect humans. A surface bloom might only affect water recreation, meaning swimmers and boaters should be cautioned. But a bloom being driven deep by currents can affect water supply, as treatment plants generally intake their water from close to the bottom. Ultimately, the researchers hope to use the data to help prevent blooms as much as possible.

“Bloom eradication is likely never going to happen, but we can absolutely reduce the size and impact of these blooms,” says Tim Davis, an ecologist formerly of the National Oceanic and Atmospheric Administration’s (NOAA) Great Lakes Environmental Research Laboratory.  

The project, a collaboration between the Cooperative Institute for Great Lakes Research, NOAA's Great Lakes Environmental Research Laboratory in Ann Arbor, NOAA's National Centers for Coastal Ocean Science and the Monterey Bay Aquarium Research Institute, plans to bring two more ESPs to Lake Erie. Two will be deployed at all times, and a third can be rotated in on an as-needed basis.

The ESPs aren’t a “silver bullet,” Davis says. Researchers will still do weekly monitoring to get a greater variety of information about the water in multiple locations, not just where the ESP is deployed. But he and his colleagues believe similar technologies will become more common as they become smaller and cheaper. Right now an ESP weighs about 1,000 pounds and costs $375,000.

Some eleven million people live on the shores of Lake Erie, the shallowest and therefore warmest and most algae-prone of the Great Lakes. All stand to be affected by increasing toxic blooms. So do residents near many other American lakes, including enormous bodies of water such as Lake Okeechobee in Florida and Utah Lake, near Provo. Budget cuts and relaxation of environmental regulations under the current administration may scuttle plans for water cleanup, leaving lakes even more prone to toxins. With ESPs, perhaps residents may at least get a warning before the toxins arrive in their drinking water.

This Robot Will Make You Dinner

Smithsonian Magazine

The disembodied robot arms look like they’re conducting an orchestra as they glide back and forth over the stove top, waving their articulated fingers. But the robot isn’t making music, it’s making dinner.

Mounted above a small counter, stove and sink, the two arms are part of a robotic kitchen, developed by UK-based Moley Robotics, that prepares meals from digital recipes. Users select the meal they want from an online database, enter the number of people that are eating and then set out pre-prepped ingredients. They tell the robot when to start, and, sure enough, it makes shrimp risotto, say, or eggplant parmigiana. The unit has an attached fridge and cabinet, which the robot can access, and a built-in dishwasher, so it can clean up after itself. 

Computer scientist Mark Oleynik dreamed up Moley's robotic kitchen in 2014, when he was sick of eating out and wanted good food at home. He had worked in public health: before Moley, he founded a company called Medstarnet, which helped hospitals acquire medical devices. Ultimately, Oleynik's goal is to make eating fresh, healthy food effortless, and he decided that handing the work of getting food on the table over to a robot was a way to do that.

Oleynik worked with the London-based Shadow Robot Company, which also makes robotic hands for NASA's Robonaut program, to develop the cooking robot. The hands are made of 20 motors, 24 joints and 129 sensors. According to Rich Walker, Shadow Robot's managing director, they replicate the fine movements of human hands. They’re deft enough that they can deal with a whisk or a blender, although they’re not yet programmed for chopping. Moley Robotics worked with Shadow Robot and a team from Stanford to develop an algorithm for the robot to follow, so it knows when to add ingredients and how to incorporate them.

The robot has learned 50 recipes by mimicking human chefs who, for the sake of Moley's recipe database, wore motion sensors on their hands as they cooked. Tim Anderson, the 2011 winner of the BBC’s MasterChef competition and owner of Japanese soul food restaurant Nanban in London, came up with the first batch of recipes—crab bisque, for example, and cod with pesto sauce, all with nutritional information included. Moley is recruiting other chefs to add recipes. Eventually, users may be able to upload videos of themselves preparing family recipes. The robot could then learn the recipes from these videos and take over in the making of Grandma's marinara.

In addition to the touch screen on the unit, Moley Robotics is developing an app, so that owners of the kitchen can select a meal from the iTunes-like recipe library, even when they are away from home. The robot will start making dinner just as they're leaving work. 


Moley debuted the chefbot at Hannover Messe, an industrial trade show in Germany in April. In May, it won the “Best of the Best” award at the Consumer Electronics Show Asia.

Oleynik and his team are still building the app and working out the kinks, like how to teach the robot to chop, but they suspect the robotic kitchens could be available in 2018 for about $35,000. A pretty penny, though Oleynik argues the cost is on par with an average kitchen remodel.

This Robot Is Powered by Pee

Smithsonian Magazine

In their still-brief history, robots have, for the most part, been far removed from the organic world—they don’t exist in the realm of life and death, or hunger, food and waste. Robots’ existences are clean. They’re plugged in or recharged, and they work until they need a boost. But now some scientists are pushing to integrate robots into the rest of the food chain.

At the Bristol Robotics Laboratory, researchers are working on a robot scavenger, the EcoBot, a contraption that, one day, will hunt down its fuel—human urine—out in the field. The bot itself is a bit of a cyborg, an organic-metallic blend that uses bacteria, harnessed in microbial fuel cells, to consume human waste and convert it into electricity. Since not all of the urine can be consumed, the EcoBot, too, will produce its own waste. (Can robot-only bathrooms be far away?)

So far, the Bristol team have a robot that can move—slowly—and their fuel cell technology, running on pee, has been used to power a cell phone.

EcoBot is still a long way from cruising the streets and cleaning up after late-night revelers. But new research published today by the EcoBot team shows that progress is being made.

This isn’t the Bristol lab’s first foray into hungry robots. Another bot, known as EATR, fed on bugs and plants, while in South Korea researchers have built a robotic Venus flytrap.

More from Smithsonian.com:

Robots Get Their Own Internet

Adventures in collecting: Kenneth Salisbury's robot hand

National Museum of American History

A visitor to Kenneth Salisbury's Stanford University office can't miss the evidence of his life-long fascination with hands.

Kenneth Salisbury in his office at Stanford University. Salisbury is Professor Emeritus in the departments of Computer Science and Surgery-Anatomy.

On every horizontal surface there are robot fingers, joysticks, touch-based instruments. There's even a human hand rendered in bright red resin, looking a lot like the Addams Family relative Thing. Salisbury recalls that as a kid he was always playing with something in his hands—trying out magic tricks, tying knots, playing musical instruments. He even built a robot finger when he was six years old. Later, he was inspired to help his father, a stroke victim, regain the use of his hand. It's not entirely surprising, then, that he designed and built a robot hand for his PhD project in mechanical engineering at Stanford in 1982, formed a company to manufacture them for select customers, and stuck with the subject for the rest of his career.

Does a robot hand conjure images of C-3PO's golden fingers? Or the Terminator's silvery endoskeleton digits? These movie props, fashioned after human hands, are more familiar than those of actual robots emerging from industrial and academic labs. According to engineers working in robotics today, designing a robot hand that functions almost like a human's is one of the most difficult research problems they face.

That's because the human hand is a wonder of dexterity. The exquisite complexity and sensitivity of our hands, in concert with our brain, enable us to perform an extraordinary range of tasks in different situations—to wield a screwdriver, say, or stroke a puppy. Not so with robotic imitations. Before Salisbury's invention, at the end of robot arms were tools called "end-effectors," not even close to looking or working like a human hand. They had difficulty interacting with objects of differing shapes, sizes, and materials. Even today, practical versions of robot hands are still at their best when doing one well-defined task repeatedly, like assembling a computer chip or spot welding a car body. But with the advent of Salisbury's invention, robot hands began to approach the skill level of human hands. 

The museum recently acquired this Salisbury robot hand.

We've just added one of Kenneth Salisbury's fascinating robot hands from the 1980s to the museum's robot collections, a century-spanning group of objects that documents historical robots and other automatic machinery in industry, research, and entertainment. An intriguing piece of machinery, the Salisbury hand has three identical fingers, each of which has three joints. To actuate the hand, a total of a dozen tension cables are pulled upon by an equal number of electric motors. The motors move the fingers with controlled forces and motions. Sensors in the base of each robot finger permit control of the tendon tensions and finger forces, enabling the fingertips to grip and manipulate objects of varying sizes and shapes. In the early 1980s, this design provided a whole new way to coordinate three fingers so that the robot hand could not only grasp objects but also simultaneously move them within the grasp. For example, this ability enabled the hand to grasp a shaft and fit it into a hole by pushing and wiggling until the hand detected that the shaft was fully inserted. The action is analogous to what a human does when assembling objects. This improvement in machine-grasping versatility served as the foundation for future efforts to devise defter robot hands.

A human-scale programmable robot arm typically supported the hand and moved it through its workspace, and both were controlled by a computer. Devising a robot arm was a feat in its own right, the brainchild of Vic Scheinman, another Stanford mechanical engineering PhD student, who preceded Salisbury by about a decade. (The museum also has a prototype of Scheinman's invention, developed with Unimation and General Motors and known as the PUMA, or Programmable Universal Machine for Assembly.) 

The museum's robot hand is one of about 20 made in the 1980s at Salisbury Robotics Inc., a manufacturing firm Salisbury set up in Palo Alto, California. Sandia National Laboratories in Albuquerque, New Mexico, acquired this robot hand in 1984 to experiment with its potential for handling hazardous materials and working in hazardous places, among other things. Sandia researcher Cliff Loucks remembers the machine had a "sweet" design and a hefty price tag—$32,265.00.

NASA also provided funds for refining the hand, a potential tool for the space station and other applications. Although some industrial firms had secret robot experiments underway at the time, there was no other commercially available robot hand except Salisbury's.

Salisbury continued to work on machine-based hands and touch-related projects after his success with the robot hand. At the Massachusetts Institute of Technology, in the Mechanical Engineering Department and Artificial Intelligence Lab, he and his students developed more useful machines. One, developed with student William Townsend, was a robot arm now in worldwide use, the Barrett WAM arm (WAM stands for Whole Arm Manipulation). Another, the Black Falcon, was the first remote-controlled surgical robot, and still another enabled users to literally feel virtual objects through the haptic (or touch) interface they developed. Salisbury moved on to Intuitive Surgical, where he continued investigations in enhancing the surgeon's skills with robots and contributed to the da Vinci Surgical System. (The museum has a da Vinci surgical robot on display in Many Voices One Nation.) He then returned to Stanford and focused his research on medical robotics, surgical simulation, and designing robots for interaction with humans.

Kenneth Salisbury, in his office at Stanford University, demonstrates a PHANToM haptic interface, an instrument he and student Thomas Massie designed at MIT to enable feeling virtual objects.

In that office full of robot parts, it's easy to picture Salisbury inspiring a steady stream of brilliant students and advising on their projects. With his open-handed warmth and generosity, he's a natural teacher. He continues to follow his passion for investigating robotics, human-machine interfaces, wearable robotics, and, of course, hands.

Carlene Stephens is a curator in the Division of Work and Industry. She has also blogged about Stanley, a self-driving car.

The Salisbury hand is part of the National Museum of American History's collection of robots and other automatic machinery from industry, research, fantasy, and entertainment. Highlights of the collection include self-driving vehicles from the DARPA Grand Challenges (2005), a Unimate industrial robot (1961), a photosensitive "tortoise" designed by brain researcher W. Grey Walter (1952), R2-D2 and C-3PO costumes from Return of the Jedi (1983), a talking doll from Thomas Edison's phonographic works (1890), and a Renaissance automaton of a friar that walks and prays (1560). You can see the Salisbury hand in action.

Posted Date: 
Friday, April 13, 2018 - 17:45

This Robot Librarian Locates Haphazardly Placed Books

Smithsonian Magazine

Organization rules in the library stacks, but patrons can easily thwart the system by haphazardly returning books to the shelves. Librarians spend many hours searching for these wandering tomes, but robots could soon help them out. A new librarian robot locates misplaced books, helping to return them to their rightful place, Coby McDonald writes for Popular Science.

Over the years, automation has slowly crept into libraries around the world. Digital databases replaced card catalogs, and some libraries use robots to file, sort and retrieve books for patrons. But most local libraries lack the space and resources for such complex systems.

Enter AuRoSS, the robot librarian.

A group of researchers at Singapore’s Agency for Science, Technology and Research (A*STAR) have developed a robot that can wander among the stacks at night, scanning shelves for misplaced books. When the Autonomous Robotic Shelf Scanning system (AuRoSS) finds one, it flags it so a librarian can go back later to grab the book and return it to where it belongs.

In order to identify and keep track of the books, AuRoSS relies on Radio Frequency Identification (RFID) tags. These little chips are used in everything from office key cards to passports. In recent years, libraries have begun using them to help keep track of books, scanning the spines with hand-held devices. But AuRoSS can trundle around the stacks all on its own, continually scanning the little tags, according to McDonald.

The stacks, however, can become a complex labyrinth, challenging for human navigation, let alone robotic navigation. In order for AuRoSS to successfully scan RFID tags, it has to stay at just the right distance from the shelves. “Too far and we lose the RFID signals, but too close and the antenna hits the shelf,” project leader Renjun Li says in a statement.

At the same time, library maps are often too low-resolution to be useful to robots. A basic map can point a patron from the history section to the sci-fi/fantasy shelves, but robots require extremely precise detail and directions for everything they do.

So Li’s team programmed AuRoSS to detect the surface of the bookshelves when planning out its route. By attaching the RFID-detecting antenna and a set of ultrasonic scanners to a robotic arm, AuRoSS can keep its sensors close enough to detect the books and know when it needs to change direction to continue scanning shelves.
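The distance-keeping behavior described above can be sketched as a simple proportional controller. This is a hedged illustration, not A*STAR's actual control code: the 5 cm standoff, the gain, and the sign convention are all invented for the example.

```python
# Illustrative sketch of shelf-following: keep the scanning arm at a
# target standoff from the shelf using the ultrasonic range reading.
# Target distance and gain are assumptions for this example.

TARGET_DISTANCE_CM = 5.0  # assumed standoff: close enough to read RFID
                          # tags, far enough to avoid striking the shelf
GAIN = 0.3                # assumed proportional gain

def arm_correction(measured_distance_cm: float) -> float:
    """Return an arm adjustment: positive moves the antenna away from
    the shelf, negative moves it closer, zero holds position."""
    error = TARGET_DISTANCE_CM - measured_distance_cm
    return GAIN * error
```

Run in a loop against the ultrasonic readings, a rule like this keeps the antenna inside the narrow band where the RFID signal stays readable without collisions.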

During a recent demonstration at Singapore’s Pasir Ris Public Library, Li’s team found that AuRoSS could navigate the library and detect misfiled books with 99 percent accuracy. While AuRoSS still requires some refining, it has the potential to take on some of a librarian's most tedious tasks.

This "Psychic Robot" Can Read Your Mind

Smithsonian Magazine

Researchers at the University of Illinois at Chicago have created a “psychic robot.” The robot is based on an algorithm that can understand the intention behind a movement—you intended to turn the steering wheel, you intended to take a step, you intended to push the red button—even when that movement is interrupted. 

While this may sound like the next step in cyborg world domination, it has actually been developed primarily to help brain-injured patients move better.

This kind of prediction is possible, explains Justin Horowitz, a graduate research assistant in bioengineering, because the human nervous system works so slowly.

“Humans have to plan ahead, because there’s so much delay between eye and brain and hand,” he says.

So when a movement is interrupted, it takes at least a tenth of a second for a human brain to realize it.

The psychic robot takes advantage of this delay to “correct” the movement. So if you intend to drive straight down a road but accidentally jerk the steering wheel to the left, the robot could understand and correct the swerve.

Horowitz developed the algorithm by studying participants as they held on to a robot arm. The participants would attempt to reach for a target, but the robot arm would knock them off course. The robot arm would measure the participants’ motions as they attempted to correct the movement.

Horowitz and the rest of the team used the data from the trials to create an algorithm to predict the subjects’ intents. This algorithm incorporates a number of complex factors, such as arm length and joint stiffness. From this, the team created the “psychic” software. While a person needs a fraction of a second to correct a movement, a machine can correct much faster. The software based on the algorithm knows how to put your hand back on the right course before you even realize you’ve been bumped.
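The published algorithm models arm dynamics such as arm length and joint stiffness, but the core intuition — that motion just before a disturbance still encodes the original intent, because the nervous system cannot react for roughly a tenth of a second — can be shown with a toy extrapolation. Everything below is an assumption for illustration, not the team's software.

```python
# Toy illustration of the prediction idea: the hand's velocity just
# before a disturbance still reflects the intended motion, so
# extrapolating that velocity estimates where the hand meant to go.

def predict_intended_position(pos: tuple[float, float],
                              vel: tuple[float, float],
                              dt: float) -> tuple[float, float]:
    """Extrapolate an (x, y) position forward by dt seconds at the
    pre-disturbance velocity, giving the intended (undisturbed) point."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
```

The real system replaces this straight-line guess with a biomechanical model, but the principle is the same: the machine acts on the intent it infers before the human brain has even registered the bump.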

The findings, a culmination of five years of work, were recently published in the journal PLoS ONE. Horowitz was the first author on the study, while professor of bioengineering James Patton was the principal investigator.


Patton says the psychic robot concept could be put to a variety of uses beyond medical therapy. It could be useful for pilots trying to fly in turbulence, for example – even if the pilot’s hands shook as he or she tried to turn the yoke, the robot would “know” what he or she really meant to do. It could be a training tool for musicians or athletes. It could aid surgeons. All these uses would require different mechanical interfaces, something the team is actively studying.

“All of these things involve human-machine interactions and can be enhanced by what we’re trying to do,” he says.

A person who has suffered a stroke or a traumatic brain injury might be able to use a “smart” prosthesis based on the algorithm. When the wearer attempts to take a step or reach for a cup, but is interrupted by muscle spasms or tremors, the prosthesis could correct the action, leading to smoother movement. In the case of planes and cars, the interface might be a series of sensors built into the dashboard.

“[Humans are] slow, and because of that we have to have something that predicts the future,” Patton says. “That’s the fascinating part to me.” 

A Brief History of Robot Birds

Smithsonian Magazine

Robot Car Stanley is on the Move

Smithsonian Magazine

Wind-up toy (Robot YM-3) with original box

Smithsonian American Art Museum

Toy, Tin Toy, Robot, Astronaut, Rosko, Red

National Air and Space Museum
This red Rosko robot spaceman toy is a tin toy produced in Japan for export to the American market. In the 1950s and 1960s, its manufacturer, Nomura, specialized in producing robot-themed metal toys (including several versions of an unlicensed "Robby" robot from the film "Forbidden Planet"). In post-WWII Japan, producing these metal toys began as a way to tap into an international market for "penny toys," or cheap playthings, but developed by the late 1950s into an industry manufacturing creatively designed, complex toys with moving parts and/or lights that competed successfully with Western toymakers. This Rosko battery-powered astronaut, which can be viewed as either a human space traveler or a futuristic robot, blended the American fascination with outer space with the Japanese fad for robots.

Using “kanei-kogyo,” or family industries, many Japanese tin toy companies distributed preprinted metal sheets to home-based shops, where families worked together to stamp, shape, and wholly or partially assemble them. The distributor paid these family shops by the piece and shipped the completed toys overseas. Because space themes sold well, many toys received space-age designs or packaging.

The Gewirz family donated this toy to the Museum in 2006.

Robot Sculptures: Metal Works by Clayton Bailey

Archives of American Art
Exhibition Catalog : 1 v. : ill. ; 28 x 22 cm.

This Robot Always Wins Rock-Paper-Scissors

Smithsonian Magazine

The beauty of rock-paper-scissors is that it equalizes the odds of success among the players, like a coin toss, but still provides the illusion that there’s some agency involved. (Your rock-paper-scissors strategy is the best strategy, of course.) But what if someone rigged the system and cheated by somehow knowing an opponent’s every hand draw?

A robotic hand built in a University of Tokyo lab does just this. It has demonstrated 100 percent accuracy at beating a human opponent in rock-paper-scissors (which in Japan is called janken). High-speed cameras allow the robotic hand to recognize whether its opponent is forming a rock, paper or scissors hand shape before that shape is completely formed, and quickly compensate by forming the superior gesture.

To do this, the robot takes advantage of humans' comparatively slow visual processing time. It takes a person about 60 milliseconds to change his or her hand position, and humans can follow visual events on the order of 30 to 60 milliseconds. The robot, however, squeezes in just below that cutoff, recognizing the human opponent's gesture and flashing its winning motion in about 20 milliseconds.
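The decision step itself is trivial once the camera has classified the forming gesture; the hard part is doing the vision and actuation inside that 20-millisecond window. A minimal sketch of the counter-move lookup (illustrative only, not the lab's code):

```python
# The winning response to each recognized gesture is a fixed lookup:
# paper beats rock, scissors beats paper, rock beats scissors.
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

def counter(recognized_gesture: str) -> str:
    """Return the gesture that beats the opponent's recognized gesture."""
    return BEATS[recognized_gesture]
```

Because the lookup is instantaneous, the robot's entire 20-millisecond budget goes to recognizing the half-formed hand shape and driving the motors.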

More from Smithsonian.com:

What Rock-Paper-Scissors Can Tell Us About Decision-Making 
The Lizards That Live Rock-Paper-Scissors 

NASA's Opportunity Rover Has Developed Robot Dementia

Smithsonian Magazine

If NASA's Curiosity rover is the plucky new kid who is just super excited to be on Mars, the now 11-year-old Opportunity rover is the grandparent struggling as it copes with the harsh reality of its golden years.

Opportunity has a problem with its memory hardware, says the BBC, which has caused the rover to develop what sounds an awful lot like robot dementia.

Opportunity keeps getting lost, says the BBC, and getting hit with bouts of what project scientists are calling “amnesia.” The robot can only hold information in its temporary memory, similar to RAM, rather than saving it to long-term storage. This means that every time Opportunity goes to sleep, it forgets where it is. Sometimes, Opportunity stops talking to NASA scientists back on Earth. Sometimes it just puts itself to sleep.

According to Discovery News, NASA scientists think they may be able to implement a workaround.

Either way, Opportunity was originally supposed to spend only three months on Mars. It's been there for more than 10 years, so one way or another it's had a good run.

This Robot Can Pour You a Drink

Smithsonian Channel
Instead of eyes, the Bionic Man is equipped with a 3D camera from a game console, which allows him to perform tasks like pouring a drink. From: THE INCREDIBLE BIONIC MAN http://bit.ly/1Arb2Ar

Deep-Sea Robot Spies Ghostly, Unknown Octopus

Smithsonian Magazine

No matter how deep scientists venture, the ocean always seems to be full of surprises. In late February, researchers from the National Oceanic and Atmospheric Administration (NOAA) took a deep-sea robot for a spin near Hawaii, and they stumbled across a single, small octopus unlike any they'd ever seen before.

For a few years, NOAA has dispatched the ship Okeanos Explorer to oceans all over the world to explore with its deep-diving robot, the Deep Discoverer. For the first dive of the year, the researchers sent the robot to examine the ocean floor northeast of Hawaii’s Necker Island. As it trawled around about two-and-a-half miles below the surface, the Deep Discoverer came across a tiny, ghost-like octopus hanging out on a large, flat rock all by itself, Sarah Laskow reports for Atlas Obscura.

“This octopus is now confusing several of our shore-based scientists who have never seen anything like this,” one of the researchers can be heard saying on a video taken during the dive.

While the octopus resembles some common species of shallow-water octopi, it has some differences that set it apart, the first being its ghostly color. Most octopi have chromatophore pigments, which allow them to change color. But the mysterious little octopus appears to be missing them, which explains its ghostly, iridescent appearance. Researchers also note that it has only a single row of suckers along each tentacle instead of two, Maddie Stone reports for Gizmodo.

“It is almost certainly an undescribed species and may not belong to any described genus,” Michael Vecchione, director of the NOAA Fisheries National Systematics Laboratory, wrote in a statement.

The Deep Discoverer didn’t set out on this dive to search for new species, but this isn’t the first time the robot has come across all sorts of strange and adorable undersea animals. In the past, it has captured everything from a dumbo octopus curling up its tentacles to tiny jellyfish swimming against a current, Rose Pastore wrote for Popular Science.

The octopus has not been named yet, but according to Vecchione, people on social media are already comparing the little eight-legged cutie to Casper the Friendly Ghost.

The Deep Discoverer came across a ghostly new species of octopus hanging out on a flat rock deep beneath the Pacific Ocean. (NOAA Office of Ocean Exploration and Research)

This Antique Polar Bear Robot Blows Bubbles

Smithsonian Magazine

Polar bears are some of the most charismatic megafauna we’ve got, and even early robot makers, it turns out, were charmed by them. This delightful little polar bear that blows bubbles is an early automaton from 1905:

The bear is featured on the Douglas Fisher Antique Automata website, which explains:

When bubble mixture present in chalice and automaton wound by the key and bayonet start/stop rod actuated, the polar bear turns his head to the left as his right hand holding bubble hoop scoops up mixture from the chalice held in left hand and lifts hoop in front of face, pausing for a moment then suddenly blowing to exude a stream of many bubbles which blow around the room. He then quickly turns his head to the left and opens his mouth to clearly reflect on the fun of the action, scooping more liquid for the next of many bubbles blown in repeating sequence again and again.

This isn’t the only polar bear automaton that Douglas Fisher has, in fact. The site also features this circus bear that balances a ball on its nose.

These might not be as technologically advanced as, say, the 3D polar bears of The Lion, The Witch and the Wardrobe, but they’re far more whimsical.

More from Smithsonian.com:

Before Robots, Japan Had Tiny Dolls That Tumbled Down Stairs And Served Tea

There Used to Be an Entire Museum Full of Weird, Old Robots, And You Can Still Take a Video Tour

Toy, Tin Toy, Robot, X-27 Astronaut

National Air and Space Museum
This metal X-27 Astronaut robot is a "tin toy" manufactured in Japan for export to Western markets. Its maker, Yonezawa Toys, Co., Ltd., was one of the biggest manufacturers of such toys. In post-WWII Japan, producing these metal toys began as a way to tap into an international market for "penny toys" or cheap playthings, but developed by the late 1950s into an industry manufacturing creatively designed, complex toys with moving parts and/or lights that competed successfully with Western toymakers. Toys like this one, which can be viewed as either a human space traveler or a futuristic robot, blended the American fascination with outer space with the Japanese fad for robots.

Using “kanei-kogyo,” or family industries, many Japanese tin toy companies distributed preprinted metal sheets to home-based shops, where families worked together to stamp, shape, and wholly or partially assemble them. The distributor paid these family shops by the piece and shipped the completed toys overseas. Because space themes sold well, many toys received space-age designs or packaging.

The Gewirz family donated this toy to the Museum in 2006.

Bailter Space: Robot World, Matador Records

Cooper Hewitt, Smithsonian Design Museum
Poster advertises musical group Bailter Space. Design consists of a photographic image of four industrial rivets, with series of text elements superimposed. Starting at top, across upper center, is "BAILTER SPACE" in yellow. Each word appears echoed in black on following line. At bottom left, "ROBOT WORLD". Matador Records logo in black and yellow, in lower right corner.

Design for Furniture in Robot Form

Cooper Hewitt, Smithsonian Design Museum
At lower right of sheet, robot-like form stands on two wire legs with pod-like feet and two cones between feet. Upper section comprises a series of vertical and horizontal rectangles with an extending, circular arm or mirror at right; on top a bell-like form, a square, and an arrow pointing up.