
Computer scientists’ interactive program aids motion planning for environments with obstacles — ScienceDaily


Like us, robots can’t see through walls. Sometimes they need a little help to get where they’re going.

Rice University engineers have developed a technique that allows humans to help robots “see” their surroundings and perform tasks.

The strategy, called Bayesian Learning in the Dark (BLIND, for short), is a novel solution to the long-standing problem of motion planning for robots that work in environments where not everything is clearly visible all the time.

The peer-reviewed study, led by computer scientists Lydia Kavraki and Vaibhav Unhelkar with co-lead authors Carlos Quintero-Peña and Constantinos Chamzas of Rice’s George R. Brown School of Engineering, was presented at the Institute of Electrical and Electronics Engineers’ International Conference on Robotics and Automation in late May.

The algorithm, developed primarily by Quintero-Peña and Chamzas, both graduate students working with Kavraki, keeps a human in the loop to “augment the robot’s perception and, importantly, prevent the execution of unsafe motion,” according to the study.

To do this, they combined Bayesian inverse reinforcement learning (by which a system learns from continually updated information and experience) with established motion-planning techniques to assist robots that have “high degrees of freedom,” that is, many ways to move.
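The Bayesian-updating idea behind this approach can be illustrated with a minimal sketch. This is not the BLIND implementation, just a toy example under a simplifying assumption: treat a path segment’s acceptability to the human as an unknown probability and maintain a Beta-distribution belief over it, refining the belief with each binary critique. All function names here are hypothetical.

```python
# Toy illustration of Bayesian updating from binary human feedback.
# A hypothetical sketch only -- not the actual BLIND algorithm.

def update_beta(alpha, beta, approved):
    """Update a Beta(alpha, beta) belief about a segment's acceptability
    after one binary critique (approved is True or False)."""
    return (alpha + 1, beta) if approved else (alpha, beta + 1)

def posterior_mean(alpha, beta):
    """Expected acceptability under the current Beta belief."""
    return alpha / (alpha + beta)

# Start from a uniform prior Beta(1, 1) and fold in a stream of critiques.
belief = (1, 1)
for critique in [True, True, False, True]:  # human approvals/rejections
    belief = update_beta(*belief, critique)

print(posterior_mean(*belief))
```

After three approvals and one rejection, the belief shifts toward “acceptable” without ever requiring the human to write down a mathematical description of their preferences, which is the qualitative point the researchers make below.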

To test BLIND, the Rice lab directed a Fetch robot, an articulated arm with seven joints, to take a small cylinder from one table and move it to another, but in doing so it had to pass a barrier.

“If you have a lot of joints, the robot’s instructions are complicated,” Quintero-Peña said. “If you’re managing someone, you can just say, ‘Raise your hand.'”

But robot programmers need to be specific about the movement of each joint at every point in its path, especially if obstacles prevent the machine from “seeing” its target.

Instead of programming a trajectory up front, BLIND inserts a human into the middle of the process to refine the choreographed options, or best guesses, suggested by the robot’s algorithm. “BLIND allows us to take information from the human’s head and compute our trajectories in this high-degree-of-freedom space,” Quintero-Peña said.

“We use a specific form of feedback called critique, basically a binary form of feedback where the human is asked to give labels on pieces of the trajectory,” he said.

These labels appear as connected green dots that represent possible paths. As BLIND steps from dot to dot, the human approves or rejects each movement to refine the path, avoiding obstacles as efficiently as possible.

“It’s an easy interface for people to use, because we can say, ‘I like this’ or ‘I don’t like that,’ and the robot uses this information to plan,” Chamzas said. Once rewarded with an approved set of movements, the robot can carry out its task, he said.
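The approve/reject loop described above can be sketched as a simple interactive filter. This is a hypothetical illustration, not the paper’s method: a stand-in planner proposes candidate next waypoints on a grid, and a stand-in for the human critique rejects any move into a known obstacle. All names (`propose_waypoints`, `human_approves`, `plan_with_critique`) are invented for this sketch.

```python
# Hypothetical sketch of a step-by-step critique loop: a planner proposes
# candidate next waypoints, and a human oracle accepts or rejects each one.
# These names and this grid world are illustrative, not BLIND's actual API.

def propose_waypoints(current):
    """Stand-in planner: offer a few candidate next grid positions."""
    x, y = current
    return [(x + 1, y), (x, y + 1), (x + 1, y + 1)]

def human_approves(waypoint, obstacles):
    """Stand-in for the binary human critique: reject moves into obstacles
    that the human can see but the robot cannot."""
    return waypoint not in obstacles

def plan_with_critique(start, goal, obstacles, max_steps=20):
    """Build a path waypoint by waypoint, keeping only approved moves."""
    path = [start]
    while path[-1] != goal and len(path) < max_steps:
        for candidate in propose_waypoints(path[-1]):
            if human_approves(candidate, obstacles):
                path.append(candidate)
                break
        else:
            break  # no candidate was approved; give up
    return path

obstacles = {(1, 0), (2, 1)}  # cells only the "human" knows are blocked
path = plan_with_critique((0, 0), (2, 2), obstacles)
print(path)
```

The resulting path reaches the goal while skirting both blocked cells, even though the planner itself never sees the obstacles; only the binary approvals carry that information, which mirrors the division of labor the article describes.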

“One of the most important things here is that human preferences are hard to describe with a mathematical formula,” Quintero-Peña said. “Our work simplifies human-robot relationships by incorporating human preferences. That’s how I think applications will get the most benefit from this work.”

“This work uniquely demonstrates how a little, but targeted, human intervention can enhance the capabilities of robots to perform complex tasks in environments where some features are known to the human but not to the robot,” said Kavraki, a pioneer in robotics whose resume includes advanced programming for NASA’s humanoid Robonaut aboard the International Space Station.

“It shows how methods for human-robot interaction, the research topic of my colleague Professor Unhelkar, and the automated planning pioneered over many years in my laboratory can be combined to deliver reliable solutions that also respect human preferences.”

Rice undergraduate alumna Zhanyi Sun and Unhelkar, an assistant professor of computer science, are co-authors of the paper. Kavraki is the Noah Harding Professor of Computer Science, a professor of bioengineering, electrical and computer engineering, and mechanical engineering, and director of the Ken Kennedy Institute.

The National Science Foundation (2008720, 1718487) and an NSF Graduate Research Fellowship Program grant (1842494) supported the research.

Video: https://youtu.be/RbDDiApQhNo


