ChatGPT safety systems can be bypassed to get weapons instructions

NBC News found that OpenAI’s models repeatedly provided answers about how to make chemical and biological weapons.
