Posted 2/1/2024, 10:34:00 PM
- OpenAI study finds GPT-4 gives only "mild uplift" to users trying to create bioweapons
- Study participants were given access to GPT-4 to gather info on making bioweapons
- Those with GPT-4 access scored slightly higher on accuracy and completeness
- But GPT-4 didn't provide info to evade safeguards or make weapons more dangerous
- OpenAI says more research is needed, but the results suggest any increase in bioweapon risk is small