Toner also studied China's AI industry.[3] She later worked as a research affiliate at the University of Oxford's Centre for the Governance of AI, before becoming director of strategy and foundational research grants at Georgetown's Center for Security and Emerging Technology.[3][6] She has co-written articles in Foreign Affairs.[7][8]
In late 2021, Toner was appointed to the board of OpenAI.[3] Investors including Microsoft hold stakes in OpenAI's for-profit subsidiary, but the organization has retained its nonprofit governance structure, making board members accountable to the organization's altruistic mission rather than to shareholders.[9]
In October 2023, she published the report "Decoding Intentions: Artificial Intelligence and Costly Signals" with two co-authors, writing:[10]
OpenAI has also drawn criticism for many other safety and ethics issues related to the launches of ChatGPT and GPT-4, including regarding copyright issues, labor conditions for data annotators, and the susceptibility of their products to “jailbreaks” that allow users to bypass safety controls.
After the paper's publication, Altman attempted to push Toner off the board because he considered the paper critical of the company.[4]
On November 17, 2023, Toner, along with three other board members, voted to remove Sam Altman as CEO of OpenAI. The board's stated reason was that Altman was "not consistently candid in his communications" with the board;[11] the decision was also influenced by perceptions that Altman was manipulating board members for his own gain.[12] Four days later, the decision was reversed and she was removed from the board of directors.[13]
In May 2024, she explained the board's rationale for firing Altman in a podcast interview.[14]