A super PAC with deep ties to OpenAI CEO Sam Altman launched an AI chatbot this week in a bid to deny President Joe Biden a second term in office, The Washington Post reports.
The group, We Deserve Better, launched the Dean.Bot as an AI stand-in for Dean Phillips, a Minnesota congressman who’s running a quixotic campaign to challenge Biden in the Democratic primary.
Phillips has attracted the attention of Altman and other Silicon Valley elites, who say they’re concerned about Biden’s recent low poll numbers and his chances against former President Donald Trump, the likely winner of the Republican primary.
But tellingly, the super PAC supporting Phillips was formed in early December, soon after Biden signed a sweeping executive order that seeks to regulate AI tech.
Altman and other AI proponents have said in the past that they want regulation and safeguards on AI, but that stance often changes when proposals become reality. For example, Altman earlier threatened that OpenAI would quit operating in Europe if certain AI regulations came to pass, and has also lobbied hard to water down those same regulations.
So it’s interesting that a group with deep ties to Altman would launch an AI-powered chatbot to try to unseat Biden, who wants to curb AI technology. One of the founders of the super PAC, Matt Krisiloff, used to be an employee of OpenAI and apparently dated Altman.
Strikingly, OpenAI’s ChatGPT once powered the chatbot, but now it relies on open source models, according to WaPo.
The Dean.Bot is clearly labeled as an AI chatbot and has disclaimers such as: “Feel free to ask it anything, but please take answers with a grain of salt!”
If you ask it about any recent campaign controversy, the Dean.Bot, like any seasoned politician, passes the buck.
For example, the Phillips campaign quietly changed a header on its website that once read “Diversity, Equity, and Inclusion,” The New York Times reports. It now reads instead “Equity and Restorative Justice.”
This change comes after the Phillips campaign received an infusion of $1 million from Bill Ackman, the billionaire hedge fund manager who’s mounted a recent assault on DEI initiatives.
When you ask the Dean.Bot about the recent removal of DEI from the Phillips campaign website, the chatbot responds: "I must clarify that as a digital clone, I don't control the content of websites or make decisions about what is posted or removed."
“However, I can tell you that my commitment to diversity, equity, and inclusion is unwavering,” the chatbot continues. “My actions and policies consistently reflect the importance of these principles in creating a fair and just society for all. If there have been changes to my website, they would be made with the intent to best represent my current focus and priorities in serving the people.”
As it stands, the Biden campaign probably shouldn't worry too much about the chatbot. It's still pretty clunky compared to talking with a real flesh-and-blood human being, making it more an election-year curiosity than anything else.
Beyond being an avatar for a moneyed class that wants to influence elections, though, the chatbot does feel like a prototype of how AI and tech may change politics in the future.
Late last year, the campaign of Pennsylvania Democratic politician Shamaine Daniels ran a chatbot called Ashley to help with her phone banking efforts.
Both Dean.Bot and Ashley have been clearly labeled as AI chatbots, but they could be laying the groundwork for a reality in which bad actors release chatbots that lie about or obfuscate rival politicians' policy platforms.
“I see this as a Pandora’s box problem,” New York University Stern Center for Business and Human Rights’ deputy director Paul Barrett told WaPo. “Once we have AI versions of candidates chatting up voters, it’s a short step to bots used by political opponents to fool voters into thinking that politicians are saying things they never said. And soon, everyone gets so cynical about all of this fake communication that no one believes anything anyone is saying.”
Source: The Byte