
To make robots more effective, you have to make them fear their own death.

by ace

Image credit: StockSnap / Pixabay (CC0 / Public Domain)

Scientists at the University of Southern California suggest that the ideal strategy for making robots work better is to program them to fear death.

Artificial intelligence is certainly advancing, but taking it to the next level may require a more radical approach: giving AI a sense of danger and of the fragility of its own existence.

A team of researchers at the university suggests that building robots to act out of self-preservation produces better results, making them more effective. The scientists suspect that fear of death may be an important step on the road to true artificial intelligence.

The goal is to build robots and artificial intelligence systems that can evaluate their own behavior. Once they can learn which actions could lead to their death, robots can learn to exercise restraint when appropriate, according to the scientific paper, published in Nature Machine Intelligence.
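The paper itself is conceptual rather than algorithmic, but the idea can be illustrated with a toy reinforcement-learning sketch. The example below is purely our own illustration, not the authors' method: an agent whose reward includes a heavy penalty for "dying" learns to avoid risky actions when its remaining integrity is low. All names and parameters (RiskyWorld, integrity, DEATH_PENALTY, the damage probabilities) are assumptions made for the sake of the example.

```python
# Illustrative sketch only: NOT the USC team's method, just a toy example of how a
# "self-preservation" signal could be folded into a reinforcement-learning reward.
import random
from collections import defaultdict

DEATH_PENALTY = -10.0  # large negative reward when the agent "dies" (integrity reaches 0)

class RiskyWorld:
    """Toy environment: the agent picks a 'safe' or 'risky' action each step.
    Risky actions pay more task reward but may damage the agent's integrity."""
    def __init__(self, max_integrity=3, horizon=20):
        self.max_integrity = max_integrity
        self.horizon = horizon

    def reset(self):
        self.integrity = self.max_integrity
        self.t = 0
        return self.integrity  # the state is simply the remaining integrity

    def step(self, action):  # action: 0 = safe, 1 = risky
        self.t += 1
        reward = 1.0 if action == 0 else 2.0        # risky pays more task reward...
        if action == 1 and random.random() < 0.5:   # ...but may cause damage
            self.integrity -= 1
        if self.integrity <= 0:                     # "death": episode ends, big penalty
            return self.integrity, reward + DEATH_PENALTY, True
        return self.integrity, reward, self.t >= self.horizon

def train(episodes=5000, alpha=0.1, gamma=0.95, epsilon=0.1):
    """Tabular Q-learning over integrity levels."""
    env = RiskyWorld()
    q = defaultdict(lambda: [0.0, 0.0])
    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            action = (random.randrange(2) if random.random() < epsilon
                      else max(range(2), key=lambda a: q[state][a]))
            next_state, reward, done = env.step(action)
            target = reward + (0.0 if done else gamma * max(q[next_state]))
            q[state][action] += alpha * (target - q[state][action])
            state = next_state
    return q

if __name__ == "__main__":
    q = train()
    for integrity in sorted(q):
        best = "safe" if q[integrity][0] >= q[integrity][1] else "risky"
        print(f"integrity={integrity}: prefers the {best} action")
```

Under these toy assumptions the agent typically learns to take the safe action when its integrity is nearly exhausted, which is the kind of self-protective restraint the article describes.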

That would lead to simulated feelings, or at least the robotic equivalent of our feelings, the scientists point out. The team argues that the best way to make robots resilient is not to make them impenetrably strong, but to make them vulnerable.

Giving feelings to robots would also give scientists a platform for investigating the very nature of feelings and human consciousness. The researchers believe that, given the advances being made in robotics, the idea of a self-aware robot may not be so far-fetched.

In addition, scientists argue that making AI more humanlike – whether with feelings or the ability to dream – may be necessary to make these systems even more useful.
