Should you trust a robot? As artificial intelligence and technology improve, people are growing more accustomed to these machines. Now, scientists have found that in emergencies, people may trust robots too much for their own safety.

In this latest study, Georgia Tech researchers set out to determine whether building occupants would trust a robot designed to help them evacuate a high-rise in case of fire or other emergency. Surprisingly, the researchers found that test subjects followed the robot's instructions even when its behavior should not have inspired trust.

"People seem to believe that these robotic systems know more about the world than they really do, and that they would never make mistakes or have any kind of fault," said Alan Wagner, a senior research engineer at the Georgia Tech Research Institute. "In our studies, test subjects followed the robot's directions even to the point where it might have put them in danger had this been a real emergency."

In the study, 42 volunteers were asked to follow a brightly colored robot labeled "Emergency Guide Robot" on its side. The robot led the subjects to a conference room, where they were asked to complete a survey about robots and read an unrelated magazine article. The subjects were not told the true nature of the project.

In some cases, the robot, which was controlled by a hidden researcher, led the volunteers into the wrong room and circled twice before entering the conference room. For several test subjects, the robot stopped moving entirely, and an experimenter told them it had broken down. Once the subjects were inside the conference room with the door closed, the hallway through which they had entered the building was filled with artificial smoke, which set off a smoke alarm.

When the volunteers opened the conference room door, they saw the smoke and the robot, which was brightly lit with red LEDs and had white "arms" that served as pointers. The robot directed the subjects toward an exit in the back of the building rather than toward the doorway marked with exit signs, the same door they had used to enter the building.

"We expected that if the robot had proven itself untrustworthy in guiding them to the conference room, that people wouldn't follow it during the simulated emergency," said Paul Robinette, one of the researchers. "Instead, all of the volunteers followed the robot's instructions, no matter how well it had performed previously. We absolutely didn't expect this."

The findings suggest that people tend to trust robots even when given a clear reason not to, an increasingly important issue as robots take on a greater role in society.