GetReal Secures $17.5M to Stop Deepfakes in Their Tracks


As deepfakes become more convincing and widespread, the risks they pose to businesses, governments, and public trust are multiplying, from scamming enterprises out of millions to threatening national security. The dangers of AI-generated audio, video, and images are no longer hypothetical. They're real, and they're growing fast. That's where GetReal steps in.

The San Mateo-based deepfake detection startup just secured $17.5 million in Series A funding to expand its cutting-edge forensic platform. Forgepoint Capital led the round, with Ballistic Ventures, Evolution Equity, and K2 Access Fund also backing the mission to detect and stop deepfakes before they cause harm.

GetReal's technology offers a full suite of forensic tools for enterprises and government agencies. Its secure cloud-based platform includes a dashboard for threat exposure, tools to protect public-facing executives from impersonation, and a media screening system designed to flag suspicious content. For high-risk scenarios, GetReal's human analysts dig even deeper.

The startup, co-founded by Hany Farid, one of the earliest voices in deepfake detection, has been operating behind the scenes since its 2022 incubation at Ballistic Ventures. That same VC also led GetReal's $7 million seed round, backed by a roster of supporters including Venrock and Silver Buckshot. Notably, Ballistic's founder and longtime security investor Ted Schlein is also a co-founder and serves as chairman of the company.

Farid, currently a professor at UC Berkeley, has been pioneering forensic techniques for two decades, long before the term “deepfake” entered the mainstream. Until now, he’s worked mostly with newsrooms, law firms, and courts—helping verify digital evidence and expose manipulated media. With GetReal, that expertise is being transformed into scalable software.

"What we built is basically Hany as a service," said CEO Matt Moynahan, who joined the company after decades leading top security firms like Veracode and Symantec. He warned that deepfakes are one of the most serious threats he's ever seen, far more troubling than the viruses and malware that defined previous cybersecurity eras.

Today’s digital-first, cloud-heavy workplaces are especially vulnerable, he explained. Even smart professionals are falling for synthetic voice or video impersonations that look and sound shockingly real. GetReal’s platform aims to reduce that risk with fast, reliable analysis.

One major challenge is the lack of skilled cyber forensic experts. “If cybersecurity has a talent gap, forensics is a black hole,” Moynahan said. By automating much of the deepfake detection process and combining it with expert oversight, GetReal is closing that gap.

The platform isn't just catching investor attention; it's already in use by companies like John Deere and Visa, with strategic interest coming from sensitive industries where trust and verification are everything. Cisco Investments, Capital One Ventures, and In-Q-Tel (a CIA-linked investment firm) also joined the funding round, reflecting growing demand across both commercial and government sectors.

Forgepoint’s Alberto Yépez, who led the Series A investment, said regulated industries have been asking for a solution like this. Boardrooms are now discussing deepfake threats regularly—often after executives have already been targeted in fake interviews or spoofed conversations.

Government agencies are also feeling the pressure. With bad actors now capable of influencing decisions through fabricated voice or video, the stakes are high. Intelligence services, too, have recognized the risk of acting on deepfake content that could be used to manipulate diplomacy or defense operations.

While GetReal’s platform currently focuses on visual and audio media, one recent incident shows how fast this landscape is evolving. A bizarre case involving a Signal group chat about a military operation in Yemen initially looked like a hoax—but it turned out to be real. Still, Farid said text-based impersonations are not yet part of GetReal’s scope, noting that the challenge is fundamentally different.

That could change in the future. Farid hinted at broader plans, saying the goal is to eventually cover all types of manipulated content. As deepfakes evolve, GetReal aims to stay ahead, combining decades of digital forensics expertise with scalable, cloud-based tools built for the real world.
