Google fights scammers using Bard hype to spread malware

Google is suing scammers who are trying to use the hype around generative AI to trick people into downloading malware, the company has announced. In a lawsuit filed today in California, the company says individuals believed to be based in Vietnam are setting up social media pages and running ads that encourage users to “download” its generative AI service Bard. The download actually delivers malware that steals victims’ social media credentials for the scammers to use.

“Defendants are three individuals whose identities are unknown who claim to provide, among other things, ‘the latest version’ of Google Bard for download,” the lawsuit reads. “Defendants are not affiliated with Google in any way, though they pretend to be. They have used Google trademarks, including Google, Google AI, and Bard to lure unsuspecting victims into downloading malware onto their computers.” The lawsuit notes that the scammers have specifically used promoted Facebook posts in an attempt to distribute the malware.

A screenshot from Google’s filing showing one of the scams. Image: Google

As with crypto scams, the lawsuit highlights how interest in an emerging technology can be weaponized against people who may not fully understand how it operates. For example, the scammers in this case imply that Bard is a paid service or app that users need to download, when it’s actually available free of charge at bard.google.com.

Google’s blog post notes that it has already submitted around 300 takedown requests in relation to these scammers, but the company also wants to block them from setting up future malicious domains and to have those domains disabled with US domain registrars. “Lawsuits are an effective tool for establishing a legal precedent, disrupting the tools used by scammers, and raising the consequences for bad actors,” Google’s general counsel Halimah DeLaine Prado writes in the company’s blog post.
