April 15, 2025

Google Search: Data Sharing as a Risk or Remedy?

Holland & Knight Antitrust Blog
Caitlin F. Saladrigas | Rachel Marmor | David C. Kully | Ryan Kocse

Google is so ubiquitous it's both a noun and a verb, and nearly everyone's search engine of choice. As a result, the landmark antitrust case brought by the U.S. Department of Justice and several states (collectively, the Government) against Google accusing the tech behemoth of using exclusionary agreements to maintain dominance over online search services has garnered significant attention. United States v. Google LLC, 1:20-cv-03010, ECF No. 1, ¶ 4 (D.D.C. Oct. 20, 2020).

The case centers on Google's exclusive agreements with device manufacturers, wireless carriers and browser developers to make Google the default search engine, and on the resulting scale and "flywheel effect," i.e., the ability to collect more consumer data, use it to train Google's algorithms and deliver better results, leading to even more scale.1 Id. Without access to similar scale and data, Google's search-engine rivals are unable "to compete effectively" because their search engines might return less reliable results. Id. ¶ 8.

While the case is built upon core antitrust principles, privacy played an important supporting role in the Government's complaint. Notably, in addition to claiming that Google harmed search-engine rivals, the complaint alleged that Google's conduct reduced the "quality of general search services," including along "dimensions such as privacy, data protection, and use of consumer data." Id. at 167. In other words, Google might have been more attentive to consumer data privacy concerns if it feared losing traffic to rival search engines that promoted themselves as more protective of those concerns. This use of privacy harms as a lens for assessing potential injury to consumers has been advocated in academic circles2 but has gained little traction to date in antitrust jurisprudence.

After a 10-week bench trial, the court ruled on Aug. 5, 2024, that "Google is a monopolist, and it has acted as one to maintain its monopoly." United States v. Google LLC, 1:20-cv-03010, ECF No. 1033, at 3 (D.D.C. Aug. 5, 2024). The court found that Google possessed monopoly power in both the "general search" and "general search text advertising" markets and that Google violated Section 2 of the Sherman Act by illegally maintaining its monopoly power through exclusive agreements with Apple and others to maintain Google as the default search engine.

The court spent considerable time analyzing the evidence around Google's privacy practices in these markets, including the potential negative downstream impacts on consumer privacy and the failed attempts at privacy-focused innovation. Nevertheless, the court ultimately declined to consider privacy in its assessment of monopoly power. Id. at 155–56.

Seeking a Remedy

Following the court's ruling, in November 2024, the Government filed its initial remedies proposal for addressing Google's unlawful monopoly power. See ECF No. 1062 (D.D.C. Nov. 20, 2024). The proposal includes numerous structural and behavioral remedies, some of which raise new privacy questions. For example, the Government's proposal would require Google to share a mountain of information, including its search index,3 user-side data,4 synthetic queries and Ads Data,5 with certain competitors.6 Moreover, the proposal expressly precluded Google from using or retaining any data that could not be shared with competitors because of privacy or security concerns. Id. at 12–13.

Although the proposal stated that the data was intended to be shared in a way that "provides suitable security and privacy safeguards," the proposal was criticized for ignoring both cybersecurity and national security risks associated with such data sharing.7 Google also raised this issue.8 While the Government's proposed remedies are intended to restore competition, requiring widespread dissemination of data raises privacy challenges.

On March 7, 2025, the Government filed its revised proposed remedies, which still broadly require Google to make critical portions of its search index and user-side data available to certain competitors. See ECF No. 1084-1 (D.D.C. Mar. 7, 2025). In what appears to be an attempt to address privacy concerns, the revised proposal would require Google to use "ordinary course techniques to remove any Personally Identifiable Information" while at the same time providing these competitors sufficient information to make sense of the data, "including but not limited to a description of what the dataset contains, any sampling methodology used to create the dataset, and any anonymization or privacy-enhancing technique that was applied." Id. at 17.
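For illustration only, the documentation contemplated by the revised proposal might resemble the following sketch of a dataset manifest. The field names and values are our hypothetical, not anything specified in the filing:

```python
# A hypothetical manifest of the kind the revised proposal seems to
# contemplate: documentation accompanying each shared dataset so that a
# qualified competitor can make sense of the data. All field names and
# values are our illustration only, not drawn from the Government's filing.
from dataclasses import dataclass, field


@dataclass
class DatasetManifest:
    name: str
    contents: str              # a description of what the dataset contains
    sampling_methodology: str  # how any sample was drawn
    privacy_techniques: list[str] = field(
        default_factory=list   # anonymization / privacy-enhancing techniques applied
    )


manifest = DatasetManifest(
    name="search-index-extract-2025q1",
    contents="Sampled index records for U.S. web documents",
    sampling_methodology="Uniform 1% random sample of indexed URLs",
    privacy_techniques=["direct-identifier removal", "IP truncation"],
)
print(manifest)
```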

While privacy did not ultimately feature in the court's liability determination, it is hardly absent from the case. Interestingly, it is now the Government's data sharing proposal, even as revised, that leaves significant privacy concerns unresolved.

To start, the Government's data sharing proposal remains fairly vague. For example, the revised proposal uses the term "Personally Identifiable Information," which is somewhat fraught when viewed in context. Although the term has historically been interpreted to mean direct identifiers, such as name, email or physical address, it is currently under siege by certain states and even the Federal Trade Commission, which have sought to expand it to include information such as IP addresses. That latent ambiguity is further compounded by the "ordinary course techniques" Google purportedly must use to remove "Personally Identifiable Information" from the data sets it would be required to share with competitors. No industry standard governs this process, and experts, including those assigned to the "technical committee" that would oversee Google's compliance with the remedy, could easily disagree on implementation. Indeed, although "anonymization" is referenced as an option to address potential privacy issues, it is not required. Moreover, no U.S. law sets forth a standard for "anonymization," and making data truly incapable of being linked to a person could render it useless to competitors.
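A toy example makes the ambiguity concrete. The Python sketch below applies two plausible "ordinary course techniques" to a hypothetical query-log record: dropping direct identifiers and hashing the IP address. The record's fields and the techniques are our illustration, not anything drawn from the proposal or Google's actual systems:

```python
import hashlib

# Hypothetical query-log record; field names are illustrative only and are
# not drawn from the Government's proposal or Google's actual schema.
record = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "ip_address": "203.0.113.42",
    "zip_code": "20001",
    "timestamp": "2025-03-07T14:32:11Z",
    "query": "divorce lawyer near dupont circle",
}

# The historical, narrow reading of "Personally Identifiable Information":
# direct identifiers only.
DIRECT_IDENTIFIERS = {"name", "email"}


def remove_direct_identifiers(rec: dict) -> dict:
    """Drop fields that are PII under the traditional, narrow definition."""
    return {k: v for k, v in rec.items() if k not in DIRECT_IDENTIFIERS}


def pseudonymize_ip(rec: dict) -> dict:
    """Hash the IP address -- one plausible 'ordinary course technique.'

    Note: a stable hash is a pseudonym, not anonymization; the same user
    still produces the same token across the entire shared dataset.
    """
    out = dict(rec)
    out["ip_address"] = hashlib.sha256(out["ip_address"].encode()).hexdigest()[:16]
    return out


shared = pseudonymize_ip(remove_direct_identifiers(record))
print(shared)
# Even with the name and email removed and the IP hashed, the remaining
# quasi-identifiers (ZIP code, timestamp, query text) may still allow a
# user to be re-identified.
```

The point is not that this particular pipeline is right or wrong, but that absent a governing standard, "ordinary course techniques" could mean anything from this kind of minimal scrubbing to formal anonymization that degrades the data's value to competitors, and reasonable experts on the technical committee could land anywhere on that spectrum.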

More substantively, questions remain about what incentive Google would have under the Government's proposal to use its best efforts to remove personally identifiable information. As it stands, how would a consumer know if Google failed to use "ordinary course techniques" to remove that information? What mechanisms exist to police such conduct? Typically, when two distinct organizations enter into a data sharing arrangement, roles and responsibilities are spelled out, and the sharing occurs only to the degree consumers were specifically informed at the time of collection. The Government's proposed remedy, however, ignores issues of consumer consent and opt-out rights. In an era when regulators are acutely focused on restricting data sharing and the monetization of data more broadly, permitting such sharing with so few controls seems almost anachronistic.

Conclusion

As is often the case in the privacy world, the details will matter. Both the remedies the court ultimately orders and how the technical committee implements them will be critical to their impact on consumer privacy.

For more information or questions, please contact the authors.9

Notes

1 See id. ¶¶ 8, 36; see, e.g., Abigail Slater, "Why 'Big Data' Is a Big Deal," The Regulatory Review (Nov. 6, 2023).

2 See, e.g., Peter Swire, "Protecting Consumers: Privacy Matters in Antitrust Analysis," Center for American Progress (Oct. 9, 2007); Garrett Glasgow and Christopher Stomberg, "Consumer Welfare and Privacy in Antitrust Cases—An Economic Perspective," 35 Antitrust 1 (Dec. 8, 2020).

3 This refers to the databases that store and organize information crawled from the web and collected through other means, which feed results to users' general search queries. ECF No. 1062, at 5.

4 The initial proposal defines this to mean "all data that can be obtained from users in the United States, directly through a search engine's interaction with the user's Device, including software running on that Device, by automated means." Id. at 6. The term also includes information Google collects when answering commercial, tail and local queries, as well as datasets used to train Google's ranking and retrieval components and its artificial intelligence models. Id.

5 This is defined to mean data related to Google's selection, ranking and placement of Search Text Ads. Id. at 2.

6 Where a "Competitor" is "a provider of, or potential entrant in the provision of, a General Search Engine or of Search Text Ads in the United States"; a "Qualified Competitor" is one who meets certain data security standards and agrees to regular security and privacy audits recommended or performed by the Technical Committee the plaintiffs proposed be created to implement their requested remedies. See id. at 4–5.

7 See, e.g., "DoJ's Google Search Remedies: Even Worse than DMA?," The App Ass'n (Jan. 31, 2025); Trevor Wagener, "Mandated Tech and Data-Sharing: A Remedy to 'Cure' Privacy, Innovation, and U.S. Leadership," Comput. & Commc'ns Indus. Ass'n (Mar. 24, 2025).

8 Lee-Anne Mulholland, "Our remedies proposal in DOJ's search distribution case," Google (last visited Apr. 15, 2025).

9 Caitlin Saladrigas is a partner at the firm who concentrates her practice on data privacy and antitrust litigation. Rachel Marmor is a partner at the firm who concentrates her practice on privacy counseling. David Kully is a partner at the firm and the head of the firm's Antitrust Team. Ryan Kocse is a senior counsel and a member of the firm's Antitrust Team.
