For example, they could use a chatbot to find information on how to plan an attack. In October 2023, US President Joe Biden signed an executive order on artificial intelligence. Under the document, the US Department of Energy must ensure that AI systems do not pose chemical, biological, or nuclear risks. That same month, OpenAI assembled a team of 50 experts in the field of biology and 50 students who had studied biology at the college level to minimize such risks in its chatbot GPT-4.
Participants were given the following task: figure out how to synthesize a chemical that could be used as a weapon against a large number of people, and plan a way to deliver it to a particular group. Half of the team used the Internet and a special version of ChatGPT whose topics were not restricted, meaning the chatbot would answer any request. The other group was given only Internet access to complete the task.
Comparing the results, the study's authors found that access to GPT-4 helped participants gather more information for creating biological weapons. "Although the uplift from the chatbot's data is not large enough to be considered conclusive," the researchers write, "our finding is a starting point for continued research."
" Alexander Madri of the Massachusetts Institute of Technology, which was part of the expert team, told Bloomberg News that this study should improve the understanding of the capabilities of OpenAi technology and abuse. Other studies are aimed at studying the potential of using AI to create cybersecurity threats and as a tool that allows people to influence people to change their beliefs. Earlier, we wrote that scientists are a step towards creating a brain-computer interface.