Meta Secretly Profits Big from Llama AI Models

Last July, Meta CEO Mark Zuckerberg made it clear that selling access to the company's Llama AI models wasn't part of its business strategy, writing in a blog post that "selling access isn't our business model." However, newly unsealed court filings from the ongoing Kadrey v. Meta copyright lawsuit tell a slightly different story.

According to the filing, Meta does, in fact, generate revenue from Llama — though indirectly. The document reveals that Meta has revenue-sharing agreements with companies that host its Llama models. Whenever users tap into these models through certain platforms, Meta receives a share of that income.

Notably, the filing didn't disclose which partners are actively paying Meta. Still, the tech giant has previously published a list of official Llama hosts, which includes big players like AWS, Google Cloud, Microsoft Azure, Nvidia, Databricks, Groq, Dell, and Snowflake.

For developers, using a hosting partner isn’t mandatory. Llama models can be freely downloaded, fine-tuned, and run on a variety of systems. Yet, many choose these hosts because they offer extra tools, support, and resources that make deployment simpler and faster.

Hints of monetization plans for Llama surfaced as early as April last year. During Meta’s earnings call, Zuckerberg acknowledged that large cloud providers reselling Llama-based services should share revenue with Meta.

“If you’re someone like Microsoft or Amazon or Google reselling these services, we believe we should get a portion of the revenue,” Zuckerberg noted. “That’s the kind of deal we intend to pursue, and we’ve started doing it.”

Beyond those deals, Zuckerberg also floated future monetization ideas, like integrating Llama models into business messaging tools or running ads within AI-driven interactions. However, specifics on these plans remain vague.

More recently, Zuckerberg doubled down on the value Meta gains from keeping Llama open. He stressed that allowing the AI research community to work with Llama leads to better models — which, in turn, enhances Meta’s own products like Meta AI, the company’s AI assistant.

“I think it’s good business for us to do this in an open way,” Zuckerberg added during Meta’s Q3 2024 earnings call. “It makes our products better rather than building in isolation without industry standardization.”

Still, these revenue-sharing arrangements are drawing renewed scrutiny as the Kadrey v. Meta lawsuit unfolds. The plaintiffs claim that Meta trained Llama models using vast datasets of pirated e-books — reportedly obtained through torrenting. They argue that by participating in torrent networks, Meta not only downloaded these works but also distributed them back into the network, enabling further copyright infringement.

This fresh detail, suggesting Meta directly profits from Llama, could strengthen the plaintiffs’ case. They argue that Meta not only used pirated content to build its models but also created an ecosystem that benefits financially from those models — a serious accusation in copyright law.

Meanwhile, Meta's AI ambitions are rapidly expanding. The company recently announced plans to nearly double its capital expenditures, projecting spending between $60 billion and $80 billion in 2025. A large portion of that budget will go toward building new data centers and scaling up AI development teams.

Reports suggest Meta is also exploring new ways to monetize its AI push. One idea on the table is a premium subscription service for Meta AI. Though details are scarce, the offering could include advanced features or capabilities for power users, helping the company recover some of its soaring AI-related costs.

For now, Zuckerberg maintains that Llama’s open approach is a long-term bet — one that builds goodwill with developers while keeping Meta’s AI ecosystem competitive. Still, with lawsuits looming and massive investments underway, Meta is walking a fine line between open-source ideals and the realities of monetizing cutting-edge AI.
