The National Institute of Standards and Technology (NIST) has launched Dioptra, an open-source tool to help test the safety and security of AI and machine learning models. Dioptra helps identify how adversarial attacks could decrease the performance of AI models, thus enabling users to understand when and why the system might fail. NIST also issued draft guidance on managing misuse risks of dual-use foundation models and published documents on AI safety.
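To make the idea of an adversarial attack concrete, the sketch below shows a generic Fast Gradient Sign Method (FGSM) perturbation in PyTorch, the kind of input manipulation a tool like Dioptra is designed to evaluate. This is an illustrative example only, not Dioptra's API; the `model`, `images`, `labels`, and `epsilon` objects are assumed.

```python
# Illustrative sketch: FGSM adversarial perturbation in PyTorch.
# Not Dioptra code; model, images, and labels are assumed to exist.
import torch
import torch.nn.functional as F

def fgsm_attack(model, images, labels, epsilon=0.03):
    """Return adversarially perturbed copies of `images`."""
    images = images.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(images), labels)
    loss.backward()
    # Step in the direction that increases the loss, bounded by epsilon.
    perturbed = images + epsilon * images.grad.sign()
    return perturbed.clamp(0, 1).detach()

# Usage (assumed objects): compare clean vs. adversarial accuracy.
# adv = fgsm_attack(model, images, labels)
# clean_acc = (model(images).argmax(1) == labels).float().mean()
# adv_acc = (model(adv).argmax(1) == labels).float().mean()
```

Comparing accuracy on clean and perturbed inputs, as in the commented usage lines, is the basic measurement that reveals when and why a model starts to fail under attack.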
The US proposes stricter cybersecurity rules to protect healthcare data
The US Department of Health and Human Services (HHS) has proposed new cybersecurity measures to protect sensitive patient information. The proposed rules would require healthcare providers to encrypt patient data, adopt multifactor authentication, and segment their networks to limit the impact of breaches.