Anthropic CEO encourages caution about the safety of AI models
The post Anthropic CEO encourages caution about the safety of AI models appeared on BitcoinEthereumNews.com.
Dario Amodei, the CEO of Anthropic, recently said that artificial intelligence companies, including his own firm, should be subject to mandatory testing requirements to ensure their technologies are safe for the general public before release.

Amodei was answering a question at an AI safety summit recently held in San Francisco by the US Departments of Commerce and State. In his response, the CEO stated, “I think we absolutely have to make the testing mandatory, but we also need to be really careful about how we do it.”

His remarks follow a release by the UK and US AI Safety Institutes containing results of their testing of Anthropic’s Claude 3.5 Sonnet model, conducted across various categories including biological and cybersecurity applications. Both OpenAI and Anthropic had previously agreed to submit their models to government agencies.

Major companies like Anthropic follow self-imposed safety guidelines

Amodei noted that major companies have voluntarily adopted certain self-imposed guidelines, including OpenAI’s preparedness framework and Anthropic’s responsible scaling policy. However, he added that more work is needed to ensure safety. “There’s nothing to really verify or ensure the companies are really following those plans in letter or spirit. They just said they will,” Amodei said. He added, “I think just public attention and the fact that employees care has created some pressure, but I do ultimately think it won’t be enough.”

Amodei believes that powerful AI systems capable of outperforming the smartest humans could become available by 2026. He added that AI companies are still testing for certain catastrophic harms and biological threats, none of which are currently real, but warned that these hypothetical risks could become reality much sooner than expected. He also…
Filed under: News - @ November 21, 2024 3:28 pm