ABB introduces design and color change for new era of robotics

IRB 6700 robot

2014-05-14 – In 1974, ABB introduced the world’s first all-electric, microprocessor-controlled industrial robot. In the 40 years since, the robotics industry has seen an amazing amount of innovation and improvement. While ABB’s product offering has evolved and changed dramatically, one thing has remained constant: the color of the robots.

Back when ABB was introducing robotics products that were previously unheard of, it was felt the color orange was appropriate for safety. We wanted to help people remember that they were working with a powerful piece of equipment that was potentially dangerous. But times change, and ABB has to change with them. We are now entering a new era of robotics, one in which collaboration between humans and robots is a reality. Recent advances in software and hardware have enabled a new generation of robots that can safely work next to people. In the past, a bright color was needed to keep humans away, but this new era of robots should be more welcoming.

At the same time, ABB has changed as a company. We are realizing our unique ability to deliver complete global solutions, and the ABB brand is stronger around the world. A new design language and color ensures our robots are easily identifiable as ABB products.

“Today we are launching a new look that is both more modern and better suits the era of collaboration,” says Per Vegard Nerseth, Head of ABB Robotics. “We call this new design language ‘Dynamic Design,’ and it is built around the concept that ABB provides efficient solutions for a dynamic world. Not only does the new look adopt unique forms and shapes, it also comes with a new color, Graphite White.”

The best example of this design change can be seen in our recently introduced IRB 6700 robot (pictured above). From the curves on its arm to the new colors, the design language that all of our robots will adopt is evident. Starting in May 2014, all of ABB’s standard robots will ship in the new Graphite White color, and every newly designed robot we release from now on will be based on the Dynamic Design philosophy. Traditional orange will remain a free option through the end of 2014, but customers will still be able to order our robots in any color they want.

ABB Robotics is proud to have been such a strong influencer of the last 40 years of robotics development. With this fresh, new design language we are preparing ourselves for another 40 years of incredible innovation and strong collaboration with our partners around the world.

Source: ABB Robotics

Teaching Robots Right from Wrong

Newswise — MEDFORD/SOMERVILLE, Mass.—Researchers from Tufts University, Brown University, and Rensselaer Polytechnic Institute are teaming with the U.S. Navy to explore technology that would pave the way for developing robots capable of making moral decisions.

In a project funded by the Office of Naval Research and coordinated under the Multidisciplinary University Research Initiative, scientists will explore the challenges of infusing autonomous robots with a sense for right, wrong, and the consequences of both.

“Moral competence can be roughly thought about as the ability to learn, reason with, act upon, and talk about the laws and societal conventions on which humans tend to agree,” says principal investigator Matthias Scheutz, professor of computer science at Tufts School of Engineering and director of the Human-Robot Interaction Laboratory (HRI Lab) at Tufts. “The question is whether machines—or any other artificial system, for that matter—can emulate and exercise these abilities.”

One scenario is a battlefield, he says. A robot medic responsible for helping wounded soldiers is ordered to transport urgently needed medication to a nearby field hospital. En route, it encounters a Marine with a fractured leg. Should the robot abort the mission to assist the injured? Will it?

If the machine stops, a new set of questions arises. The robot assesses the soldier’s physical state and determines that unless it applies traction, internal bleeding in the soldier’s thigh could prove fatal. However, applying traction will cause intense pain. Is the robot morally permitted to cause the soldier pain, even if it’s for the soldier’s well-being?

The ONR-funded project will first isolate essential elements of human moral competence through theoretical and empirical research. Based on the results, the team will develop formal frameworks for modeling human-level moral reasoning that can be verified. Next, it will implement corresponding mechanisms for moral competence in a computational architecture.

“Our lab will develop unique algorithms and computational mechanisms integrated into an existing and proven architecture for autonomous robots,” says Scheutz. “The augmented architecture will be flexible enough to allow for a robot’s dynamic override of planned actions based on moral reasoning.”
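The release does not describe the architecture’s internals. As a purely illustrative sketch of the “dynamic override” idea, a moral-reasoning module might sit between the task planner and the actuators, vetoing or replacing a planned action before it reaches the hardware. All names here (Action, MoralReasoner, execute_with_override) and the scoring rule are invented for this example and do not reflect the HRI Lab’s actual design.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Action:
    name: str
    expected_harm: float      # estimated harm of acting (0..1), invented metric
    expected_benefit: float   # estimated benefit of acting (0..1), invented metric

class MoralReasoner:
    """Toy stand-in for a verified moral-reasoning module."""
    def evaluate(self, planned: Action, fallback: Optional[Action]) -> Action:
        # Trivial rule: permit a harmful action only if its expected
        # benefit outweighs the harm; otherwise substitute the fallback
        # action, or halt if none is available.
        if planned.expected_harm > 0 and planned.expected_benefit <= planned.expected_harm:
            return fallback if fallback is not None else Action("halt", 0.0, 0.0)
        return planned

def execute_with_override(planned: Action, fallback: Optional[Action],
                          reasoner: MoralReasoner,
                          actuate: Callable[[Action], None]) -> None:
    # The reasoner may dynamically override the planner's choice
    # before anything reaches the hardware.
    chosen = reasoner.evaluate(planned, fallback)
    actuate(chosen)

# The medic-robot dilemma in toy form: applying traction causes pain
# (harm) but prevents fatal internal bleeding (larger benefit).
traction = Action("apply_traction", expected_harm=0.3, expected_benefit=0.9)
execute_with_override(traction, None, MoralReasoner(),
                      lambda a: print("executing:", a.name))
```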

Once the architecture is established, researchers can begin to evaluate how machines perform in human-robot interaction experiments where robots face various dilemmas, make decisions, and explain their decisions in ways that are acceptable to humans.

Selmer Bringsjord, head of the Cognitive Science Department at RPI, and Naveen Govindarajulu, a post-doctoral researcher working with him, are focused on how to engineer ethics into a robot so that moral logic is intrinsic to these artificial beings. Since the scientific community has yet to establish what constitutes morality in humans, the challenge for Bringsjord and his team is severe.

In Bringsjord’s approach, all robot decisions would automatically go through at least a preliminary, lightning-quick ethical check using simple logics inspired by today’s most advanced artificially intelligent and question-answering computers. If that check reveals a need for deep, deliberate moral reasoning, such reasoning would be fired inside the robot, using newly invented logics tailor-made for the task.

“We’re talking about robots designed to be autonomous; hence the main purpose of building them in the first place is that you don’t have to tell them what to do,” Bringsjord said. “When an unforeseen situation arises, a capacity for deeper, on-board reasoning must be in place, because no finite rule set created ahead of time by humans can anticipate every possible scenario.”
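As a rough illustration of the two-tier idea, the toy sketch below runs every decision through a fast rule-table screen and escalates to a slower, consequence-weighing routine only when no simple rule settles the case. The rule table, function names, and weighing logic are all hypothetical; the project’s actual “logics” are formal deductive systems, not lookup tables.

```python
# Hypothetical two-tier ethical check: a lightning-quick rule lookup
# screens every decision; only flagged cases trigger slow, deliberate
# reasoning. Rules and API are invented for illustration.
FAST_RULES = {
    "deliver_medication": "permitted",
    "apply_traction": "needs_deliberation",  # causes pain: escalate
}

def fast_check(decision: str) -> str:
    # Preliminary check: constant-time lookup in a precompiled table.
    return FAST_RULES.get(decision, "needs_deliberation")

def deliberate(decision: str, context: dict) -> bool:
    # Slow path: weigh consequences when no simple rule applies.
    return context.get("benefit", 0.0) > context.get("harm", 0.0)

def is_permitted(decision: str, context: dict) -> bool:
    if fast_check(decision) == "permitted":
        return True
    return deliberate(decision, context)

print(is_permitted("apply_traction", {"benefit": 0.9, "harm": 0.3}))  # True
print(is_permitted("deliver_medication", {}))                         # True
```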

Bertram Malle, from the Department of Cognitive, Linguistic and Psychological Sciences at Brown University, will perform some of the human research and human-robot interaction studies. “To design a morally competent robot that interacts with humans, we need to first get clear on how moral competence functions in humans,” he said. “There is a fair amount of scientific knowledge available, but there are still many unanswered questions. By answering some of these questions in the project, we can move closer to designing a robot that has moral competence.”

The overall goal of the project, says Scheutz, “is to examine human moral competence and its components. If we can computationally model aspects of moral cognition in machines, we may be able to equip robots with the tools for better navigating real-world dilemmas.”

Besides the experts from Tufts, Brown, and RPI, the team will include consultants from Georgetown University and Yale University in this multi-year effort.

The group brings together extensive research expertise in theoretical models of moral cognition and communication; experimental research on human reasoning; formal modeling of reasoning; design of computational architectures; and implementation in robotic systems.

###
Tufts University School of Engineering
Located on Tufts’ Medford/Somerville campus, the School of Engineering offers a rigorous engineering education in a unique environment that blends the intellectual and technological resources of a world-class research university with the strengths of a top-ranked liberal arts college. Close partnerships with Tufts’ excellent undergraduate, graduate and professional schools, coupled with a long tradition of collaboration, provide a strong platform for interdisciplinary education and scholarship. The School of Engineering’s mission is to educate engineers committed to the innovative and ethical application of science and technology in addressing the most pressing societal needs; to develop and nurture twenty-first-century leadership qualities in its students, faculty, and alumni; and to create and disseminate transformational new knowledge and technologies that further the well-being and sustainability of society in such cross-cutting areas as human health, environmental sustainability, alternative energy, and the human-technology interface.

Source: Newswise

Marco Tempest: EDI, the multi-purpose robot designed to work very closely with humans

Published May 6, 2014

“Marco Tempest uses charming stagecraft to demo EDI, the multi-purpose robot designed to work very closely with humans. Less a magic trick than an intricately choreographed performance, Tempest shows off the robot’s sensing technology, safety features and strength, and makes the case for a closer human-robot relationship.” (Description by TED Conferences, LLC)

Source: Rethink Robotics

Robots may need to include parental controls

TORONTO — Older adults’ fears that companion robots will negatively affect young people may create design challenges for developers hoping to build robots for older users, according to Penn State researchers.

Companion robots provide emotional support for users and interact with them as they, for example, play a game, or watch a movie.

Older adults reported in a study that while they were not likely to become physically or emotionally dependent on robots themselves, they worried that young people might become too dependent on them, said T. Franklin Waddell, a doctoral candidate in mass communications. Those surveyed also indicated that although they were not worried about being negatively affected by robots, they would still resist using the devices.

“We’ve seen this type of effect, which is usually referred to as a third-person effect, with different types of media, such as video games and television, but this is the first time we have seen the effect in robotics,” said Waddell. “According to the third-person effect, a person says they are not as negatively affected by the media as other people are.”

The researchers, who presented their findings today (April 30) at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems, said this effect could eventually lead to changes in behavior. For instance, people who believe video games harm young people may tend to avoid the games themselves. Likewise, older adults who believe that companion robots could harm young people may tend to avoid robots.

To compensate for the effect, robot designers may need to consider adding controls that will help adults monitor the use of robots by children, said Waddell, who worked with S. Shyam Sundar, Distinguished Professor of Communications and co-director of the Media Effects Research Laboratory, and Eun Hwa Jung, a doctoral candidate in mass communications.

“Robot designers and developers look at older adults as a central user base for companion robots,” said Waddell. “This effect is something they should consider when designing the interface for the robots to make sure, for example, that the robot includes some type of parental controls.”

Robots with parental controls may reassure older adults that they can own and use the devices while still protecting children from the laziness and dependency they fear robots might cause.
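The article gives no implementation detail; as a minimal sketch under invented assumptions, a parental-control layer might gate activities against a blocklist and a daily time budget, as below. Every field, activity name, and limit here is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ParentalControls:
    # Invented fields: a daily time budget and a set of blocked
    # activities, checked before the robot starts anything.
    daily_limit_minutes: int = 60
    minutes_used_today: int = 0
    blocked_activities: set = field(default_factory=lambda: {"unsupervised_chat"})

    def allow(self, activity: str, duration_minutes: int) -> bool:
        if activity in self.blocked_activities:
            return False
        return self.minutes_used_today + duration_minutes <= self.daily_limit_minutes

controls = ParentalControls()
print(controls.allow("watch_movie", 45))       # True: within the daily limit
print(controls.allow("unsupervised_chat", 5))  # False: activity is blocked
```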

The researchers studied two types of robots: companion robots and assistant robots, said Sundar. Assistant robots are devices that help with everyday tasks, such as vacuuming the floor or playing a CD, he said, while companion robots are more interactive.

This interactivity may be one reason that users tend to attach human-like emotions to companion robots, Waddell said.

“A companion robot provides the user with a source of friendship,” said Waddell. “They might watch TV with the participant, provide emotional support, or complete an activity with the user.”

Waddell said the participants did not seem to show the same level of apprehension about assistant robots.

Researchers asked 640 retirees over the age of 60 — 53 percent female and 47 percent male — about whether robots would have negative effects on themselves and on others. For instance, they asked the subjects whether robots would make them lazier and encourage them to interact less often with other people. They then asked similar questions about the effects of robots on young people.

The Korea Institute for Advancement of Technology supported this study, which is part of an international research and development program between Penn State and the Industrial Academy Cooperation Foundation of Sungkyunkwan University in South Korea.

Source: PSU

Margo: The Semi-Autonomous Social Telepresence Robot

Published May 1, 2014

UMass Lowell Robotics Lab
http://robotics.cs.uml.edu/

The work shown in this video was supported in part by NSF IIS-1111125.

Source: Robotics Lab