
Microsoft Worker Says AI Tool Tends To Create "Sexually Objectified" Images
NDTV
Shane Jones said he expressed his concerns to the company several times over the past three months.
A Microsoft Corp. software engineer sent letters to the company's board, lawmakers and the Federal Trade Commission warning that the tech giant is not doing enough to safeguard its AI image generation tool, Copilot Designer, from creating abusive and violent content.
Shane Jones said he discovered a security vulnerability in OpenAI's latest DALL-E image generation model that allowed him to bypass the guardrails designed to prevent the tool from creating harmful images. The DALL-E model is embedded in many of Microsoft's AI tools, including Copilot Designer.
Jones said he reported the findings to Microsoft and "repeatedly urged" the Redmond, Washington-based company to "remove Copilot Designer from public use until better safeguards could be put in place," according to a letter sent to the FTC on Wednesday that was reviewed by Bloomberg.