Human Rights Watch warns killer robots could end humanity


Human Rights Watch has warned that killer robots and AI technology could end humanity.

A report by Human Rights Watch states that the loss of human control over the use of force is a serious threat to humanity and that, like climate change, it demands immediate action.

In the report, experts say that countries around the world must ban the production of fully autonomous weapons to prevent the creation of killer robots.

Human Rights Watch said that 30 countries have expressed their desire to introduce international agreements aimed at maintaining human control over the use of force.

The report also named Russia and the United States among the countries whose positions will shape any international agreement on maintaining human control over such technology.

In a related development, a government office in the Russian city of Perm last month put a humanoid robot to work as a clerk, tasked with issuing certificates to members of the public after verifying their documents.

According to reports, the robot clerk handles applicants' personal data, runs criminal record checks, and verifies documents before issuing the clearance certificates that applicants need for legal matters.

The robot is modeled on a Russian woman: the company that designed it says it used artificial intelligence to analyze the faces of thousands of women and gave the robot the appearance of an average Russian woman.

Like a human government clerk, the robot asks applicants questions, is connected to a scanner and printer, and has access to sensitive documents.

2024-08-07 01:28:00
