Robots

Killer Robots? Nah, These Ones Just Mop Your Floors



 First, the good news: robots are finally coming. The bad news: they won’t be the sexy humanoids imagined by anime creators.

That was the message from Eugene Izhikevich, founder and CEO of Brain Corp., who was in Tokyo last week to speak at SoftBank’s annual robot conference about one of his creations — an 881-pound autonomous floor cleaner.

The machine, which looks like a cross between a Zamboni and a motorized wheelchair, was originally designed to be operated by a human. Equipped with Brain’s software and an array of sensors typically found in a self-driving car, it mops floors on its own even when customers are around.

“Anything you see that has wheels, we can turn into a robot,” Izhikevich, a neuroscientist, said in an interview, speaking with a trace of a Russian accent. “Wouldn’t it be nice to have 100 different robots three, five years from now?”

Though the industrial robot population has risen to 1.8 million since GM first put one on an assembly line in 1961, growth has been limited because most are variations on the original theme: a big claw on a metal limb. Attempts to use humanoid machines as companions and assistants have also fallen flat. Brain has found a way to get the machines out of their metal cages by finding niche business applications for recent advances in machine vision.

Brain’s robots already clean floors at Wal-Mart, Costco, Lowe’s and multiple airports in the U.S., and sales will begin in Japan next summer. Izhikevich said his order book is full until February.

The sensor package to convert a manual cleaner into a robot includes a laser range finder; Brain charges $500 a month for the service. That’s still a bargain considering Wal-Mart spends on average $1,500 a month to mop up a single store.

Floor cleaning is just the start; the company sees robots in the near future helping with security patrols and personal mobility. SoftBank’s founder Masayoshi Son is on board with that vision and in July led a $114 million investment round in Brain via his $93 billion Vision Fund. Qualcomm is also a backer.

Unlike the billionaire’s other robotics bets, Brain doesn’t actually make the machines. The company is developing a software platform that will let other robots move independently in closed environments. Much as Google did with its Android operating system, Izhikevich plans to give access to Brain OS to outside developers, a move that might happen as soon as next year.

“You want robots everywhere and you don’t want to wait until you are 90,” Izhikevich said. “If I’m the one who has to build those robots, it will take me 50 years. Instead, I can partner with 50 and 100 companies who are already building manual equipment and provide them the brains.”


New robotic hand named after Luke Skywalker helps amputee touch and feel again

© University of Utah Using a robotic arm that allowed him to feel objects again, Keven Walgamott was able to pick a grape without crushing it.
Keven Walgamott wasn’t sure what to expect when scientists first hooked up what was left of his arm to a computer.

Last year — 14 years after he lost his hand and part of his arm in an electrical accident — he heard about a team at the University of Utah working on an experimental robotic arm. The prosthetic hand and fingers would be controlled by an amputee’s own nerves. Even more challenging, researchers were trying to restore the sense of touch to amputees through that robotic hand.

Walgamott volunteered for the experimental program. A few weeks after surgeons implanted electrodes into the nerves of his arm last year, he found himself hooked up to a computer getting ready to touch something with his left hand for the first time in more than a decade.

The Utah researchers had created a computer program to simulate the feel of touching a virtual wall — an early test to prepare Walgamott for the robotic arm.

As Walgamott moved his arm, a virtual hand on the computer screen before him moved as well, plunking down the ridges of the corrugated wall.

“It was stunning. I could actually feel the wall. I could feel the bumps along it,” he said. “It almost brought tears to my eyes.”


© University of Utah The “Luke” arm, a robotic prosthetic created by DEKA and named after the sci-fi robotic hand wielded by Luke Skywalker.

Then researchers attached the robotic arm itself, putting Walgamott through a battery of tests over 14 months that had him touch and manipulate objects with it.

“When I went to grab something, I could feel myself grabbing it. When I thought about moving this or that finger, it would move almost right away,” he said. “I don’t know how to describe it except that it was like I had a hand again.”

At the Society for Neuroscience conference in Washington on Tuesday, the University of Utah team presented part of their work on adding the sense of touch and movement to prostheses — the latest step in the rapidly developing field of neuroprosthetics.

Over the course of the past year, while working with Walgamott as their key subject, they have found adding touch to prostheses markedly improves motor skills of amputees compared with robotic prostheses on the market. Adding the sense of touch to prosthetic hands also appears to reduce a painful feeling many amputees experience called phantom pain, and it creates a sense of ownership over the device, researchers said.

“By adding sensory feedback, it becomes a closed-loop system that mimics biology,” said Jacob George, a bioengineering PhD student at the University of Utah and lead author of Tuesday’s study. The goal, he explained, is to get prosthetic technology to a point where someone using a prosthesis wouldn’t have to think through every movement to pick up a cup. They wouldn’t even have to look at the cup. They would simply move the hand toward it using their brain and existing nervous system, feel it and pick it up.

© University of Utah University of Utah researchers have developed technology that allows users to feel through this robotic arm. In one experiment, they were able to use the hand to distinguish soft foam from hard plastic.

The most cutting-edge prosthetic hands available can make sophisticated movements, but they require complicated — and often imprecise — methods of operation. Some rely on tilt motions by the user’s foot and others on movements by the muscles remaining in a user’s arm.

The Utah research group’s approach, however, relies on a device called the Utah Slanted Electrode Array. The device is implanted directly into the nerves in a subject’s arm. The USEA, along with electrodes implanted in muscles, allows amputees to control a robotic hand as if they were flexing or moving their original hand. The approach also allows signals like sensation to be transmitted back to the subject’s nervous system, creating a “looped system” — like in a human limb — where the hand’s feeling and movements inform each other.
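The closed-loop idea described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (not the Utah team’s actual software or algorithms): decoded nerve signals drive the hand, and the hand’s pressure readings are encoded back into nerve stimulation, so on every cycle movement and sensation inform each other. All function names and the proportional mappings are invented for illustration.

```python
def decode_intent(nerve_signal: float) -> float:
    """Map a (simulated) nerve-signal amplitude to a grip-force command."""
    return max(0.0, min(1.0, nerve_signal))  # clamp command to [0, 1]

def encode_feedback(pressure: float) -> float:
    """Map a (simulated) fingertip pressure reading to a stimulation level."""
    return pressure * 0.8  # simple proportional encoding, purely illustrative

def control_step(nerve_signal: float, object_stiffness: float):
    """One cycle of the loop: intent -> movement -> sensing -> feedback."""
    grip = decode_intent(nerve_signal)
    pressure = grip * object_stiffness       # hand squeezes; sensor reads pressure
    stimulation = encode_feedback(pressure)  # sensation returned to the nerves
    return grip, pressure, stimulation

# A soft object (low stiffness) produces light feedback, letting the user
# ease off before crushing it — the "picking a grape" scenario.
grip, pressure, stim = control_step(nerve_signal=0.9, object_stiffness=0.3)
```

In an open-loop prosthesis only the first half of `control_step` exists; the user must watch the hand to judge grip force. Closing the loop with the feedback path is what the researchers say makes the device feel like a hand again.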

“We often think of touch as one thing, but it’s more than that. It’s pressure, vibration, temperature, pain,” said Gregory Clark, the bioengineering professor leading the Utah research team. Because of that, it has taken painstakingly slow work from a multidisciplinary team of experts — over the course of years — to build those sensations into the robotic arm, figure out which spot on the hand corresponds with which nerve fiber in the arm, and develop the algorithms required to send touch signals back into the nervous system.

Clark’s team is part of a larger effort funded by the U.S. military’s Defense Advanced Research Projects Agency. DARPA launched its neuroprosthetic program — called HAPTIX — in 2014, with the goal of developing within years an advanced robotic arm that would help amputees feel and move intuitively. The researchers received additional funding from the National Science Foundation.

The robotic arm the Utah researchers have been working with was developed under the HAPTIX program by the company DEKA (the company founded by Segway inventor Dean Kamen). The state-of-the-art robotic limb was dubbed the “Luke” arm by its makers, after the advanced prosthesis wielded by Luke Skywalker in “Star Wars.”

The results of the Utah group’s experimental tests so far have been both gratifying and inspiring, the researchers said.

Walgamott, a real estate agent in Utah, described the joy of being able to do everyday mundane tasks again with his left hand — like picking up an egg without crushing it, clasping his hands together and holding his wife’s hand.

But the highlight of his entire 14 months in the experimental program, he said, was being able to put a pillow into a pillowcase on his own.

“When you have just one hand, you learn to adapt,” he said, describing the infuriatingly slow process he usually uses for pillowcases, pulling them on inch by inch on each side, rotating the whole time. “To just take a pillow in one hand and put the pillowcase on with the other. I know it sounds simple, but it’s amazing.”


New Video Shows a Creepily Human-Like Robot Doing a Backflip

A new version of a humanoid disaster robot, called Atlas, can do half-turns in the air and even a backflip.

A new video shows a robot performing amazing acrobatic feats, from backflips to half-turn jumps.

The eerily humanoid robot, called Atlas, is 4.9 feet (1.5 meters) tall and weighs 165 pounds (75 kilograms), and uses lidar and stereo vision to navigate its surroundings, according to Boston Dynamics, which makes the robot. Atlas is designed to take on emergency situations where human life would normally be put at risk, such as entering buildings that have crumbled after an earthquake, or dealing with patients who have deadly, highly infectious diseases, according to the Defense Advanced Research Projects Agency (DARPA).

In the video, the newest version of the humanoid does a kind of jump training called plyometrics: leaping between raised platforms, doing a 180-degree turn in the air and performing a backflip off a platform. Though it may not give American gymnast Simone Biles a run for her money right now, the robot does manage to stick the landing. [Machine Dreams: 22 Human-Like Androids from Sci-Fi]

Other videos show the robot stacking boxes on a shelf, ambling on a walk in the snow with a human “friend” and chasing after, and picking up, a box that’s deliberately moved out of its reach. According to the Boston Dynamics website, Atlas can carry payloads up to 24 lbs. (11 kg).

Atlas has other human-like abilities, such as a sense of balance, so it resists toppling when pushed, and can get back up after a fierce shove.

The current version of Atlas isn’t yet as agile as the average human; when it walks, it uses an awkward gait resembling a person who really, really has to get to a bathroom. And though it can travel over rough terrain, video seems to show it stumbling where a human might be fine.

Still, the current version of Atlas is a dramatic improvement over its ancestors: In 2013, when it first debuted at the DARPA Robotics Challenge, Atlas weighed 330 lbs. (150 kg) and required a cord for power, Technology Review reported at the time.
