Criticism of Google FLoC is mounting – What do critics think of the cookie alternative?


When Google comes up with new measures that would supposedly better protect users' privacy, some skepticism is warranted. Privacy for whom is the first question to ask. That is exactly the discussion going on around a new initiative from the internet giant: Privacy Sandbox. Privacy advocates, consumer groups and regulators have fallen over each other in recent weeks and months to voice their criticism. But what exactly is that criticism about?

Privacy Sandbox is a Google initiative that dates back to 2019. The company has been experimenting with it for some time, but it is now starting to gain momentum. One term plays a leading role here: FLoC. To understand the criticism of FLoC and Privacy Sandbox, we first have to go back to what those terms mean. The privacy discussion is mainly about FLoC, but that is only a small part of a much bigger picture.

FLoC

FLoC stands for Federated Learning of Cohorts. It is part of a broader program called Privacy Sandbox. In an earlier article we described what Privacy Sandbox does, how it works technically and how FLoC fits into it. In brief: Privacy Sandbox consists of a broad set of APIs intended to replace the various technologies that currently rely on tracking cookies.

Advertisers use tracking cookies for a variety of reasons. There is a difference between cookies used to measure metrics such as site visits and cookies that follow users across different websites to build an interest profile. Privacy Sandbox consists of multiple APIs: some replace the former, and FLoC is the API meant to do the latter. As the name suggests, Federated Learning of Cohorts divides users into 'cohorts' based on their browsing behavior. For example, if you visit Tweakers ten times a day and often watch Linus on YouTube, there is a good chance you will be grouped with other tech enthusiasts in a cohort of 'people who love technology'.
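For a concrete idea of what a website sees, this is roughly how a page read the cohort during Chrome's origin trial: a single document.interestCohort() call. A minimal sketch, assuming the API shape from that trial; the example values are illustrative:

```typescript
// Minimal sketch of the interestCohort() call as exposed during Chrome's
// FLoC origin trial; the promise rejects when FLoC is disabled or blocked
// for the page. The example values below are illustrative.
declare global {
  interface Document {
    interestCohort?: () => Promise<{ id: string; version: string }>;
  }
}

async function readCohort(): Promise<void> {
  if (!document.interestCohort) {
    console.log("FLoC is not available in this browser.");
    return;
  }
  try {
    // e.g. { id: "14159", version: "chrome.2.1" }
    const { id, version } = await document.interestCohort();
    console.log(`Cohort ${id} (algorithm version ${version})`);
  } catch {
    console.log("FLoC is disabled or blocked for this page.");
  }
}

readCohort();

export {};
```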

Theory

In theory, this is a privacy-friendly way for advertisers to reach potential customers. Users are no longer tracked individually with a tracking cookie; instead, the advertiser targets large groups of nameless users. Google has also built in a few important safeguards to make the method even more privacy-friendly. You are assigned to a cohort locally, on your own system, by an algorithm that does all its work client-side, and the identification number is sent to Google encrypted.

Criticism

Despite this, FLoC has drawn plenty of criticism since its introduction. It comes from multiple, often predictable angles. One of the first prominent voices was the American Electronic Frontier Foundation, which at the beginning of this year called FLoC 'a terrible idea'. Mozilla conducted research with predictable conclusions, and the ad industry is, unsurprisingly, wary.
Criticism also comes from within. FLoC will be part of Chromium and thus of Chrome, but most other major Chromium-based browsers are already blocking the technology. Microsoft does so with Edge, as do Brave and Vivaldi. To be fair, those browser makers naturally have their own interests here: Microsoft is working on a FLoC alternative called Parakeet, about which little is known at the moment; Brave and Vivaldi market themselves as privacy-friendly browsers; and Mozilla… well, that goes without saying. But one thing should be clear: resistance to FLoC is growing.

Authorities have also taken notice. In the United Kingdom, the competition watchdog launched an investigation into the initiative. That has nothing to do with the privacy aspects of FLoC, but with the dominant position it may give Google.

Dominance

That last point is part of what complicates the discussion: different groups criticize FLoC for different reasons. On the one hand, there is the debate about competition. Google addressed this last week, making several commitments about FLoC to the British CMA to win back some trust, though privacy was hardly discussed. The company says it will "remain in open, constructive and continuous dialogue with the CMA" and will "not favor itself" when it comes to displaying ads. Those commitments to the CMA are legally binding, but for now they remain promises, and the CMA continues to investigate the dominance that FLoC may give Google.

Related to this is the fact that Google presents FLoC as an alternative to tracking cookies without asking whether users want it at all. Many tweakers might not be quick to admit it, but in some cases you might not mind being tracked by a website, for example because you trust it and want to support it without paying. And among the millions of internet users, there are bound to be some who simply prefer relevant, tracking-based ads over non-personalized ones. That group may be a minority, but with FLoC they no longer have any say in the matter.

Out of the privacy frying pan, into the fire

In addition to the possible abuse of market power, critics take issue with other points. Contrary to what Google promises, they do not think FLoC is more privacy-friendly at all. Or, more accurately: FLoC does not so much eliminate problems as replace them with other potential privacy risks.

First of all, there is the way Google is pushing FLoC. You can safely call it pushing, because the figures support that, with the caveat that the technology is still early in its testing phase. Users of Canary versions of Chrome recently got a toggle to enable or disable FLoC in the browser. During the test, that toggle is off by default for one group but on for another, and it is not known what effect that has, for example on who actually flips it.

Website administrators can also block FLoC by adding a policy to their HTTP response headers. Hardly anyone does. Google describes the option explicitly in the FLoC documentation, but so far with little uptake: a study by Adalytics shows that only 10 of the 100,000 most popular websites on the internet currently send the header. Those include the websites of Brave and DuckDuckGo, as well as the newspaper The Guardian. For now, blocking FLoC does not seem to be catching on among websites, although you could attribute that to the fact that it is still in beta.
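The opt-out itself is a single Permissions-Policy response header. A minimal sketch of a server sending it; the header value is the one Google documents, while the use of Express here is purely an illustrative assumption:

```typescript
import express from "express";

const app = express();

// Chrome treats this Permissions-Policy value as an opt-out: visits to
// pages served with it are left out of users' FLoC cohort calculations.
// Express is used purely for illustration; any server can send the header.
app.use((_req, res, next) => {
  res.setHeader("Permissions-Policy", "interest-cohort=()");
  next();
});

app.get("/", (_req, res) => {
  res.send("This site opts out of FLoC.");
});

app.listen(3000);
```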

Sensitive information

Another problem is that of 'sensitive cohorts'. In its documentation, Google is adamant on this point: it blocks them. In practice, that means certain subjects, such as porn or gambling, do not get cohorts. So if you visit pornographic websites in Chrome, Google does not group users who share certain kinks or a gambling addiction into one cohort; when users go to such a site, their FLoC ID changes automatically. But, some critics note, that is not watertight at the moment. Firstly, it is not clear what exactly counts as 'sensitive' and what does not. Google refers to a page with information about ad categories, but on a page about FLoC it also readily admits that it 'cannot prevent sensitive information from being leaked'. "Some people may or may not be sensitive to certain categories," the company says, adding that "there is no widely recognized definition of sensitive categories."
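Google's public write-ups describe the filtering as a check on whole cohorts rather than on individual users: a cohort in which visits to sensitive-category sites are clearly over-represented gets blocked. A hedged sketch of that idea, with all names, rates and thresholds assumed for illustration:

```typescript
// Hedged sketch of a cohort-level sensitivity check: a cohort is blocked
// outright when visits to sensitive-category sites are over-represented
// in it compared to the general population. All names, rates and
// thresholds below are assumptions for illustration, not Google's values.
interface CohortStats {
  id: string;
  sensitiveVisitRate: number; // fraction of the cohort's page loads on sensitive sites
}

const BASELINE_RATE = 0.01;       // assumed population-wide rate
const MAX_OVERREPRESENTATION = 2; // assumed threshold

function cohortIsBlocked(cohort: CohortStats): boolean {
  return cohort.sensitiveVisitRate > BASELINE_RATE * MAX_OVERREPRESENTATION;
}

console.log(cohortIsBlocked({ id: "14159", sensitiveVisitRate: 0.05 }));  // true: filtered out
console.log(cohortIsBlocked({ id: "27182", sensitiveVisitRate: 0.008 })); // false: exposed
```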

Browser maker Brave takes issue with the fact that Google thereby becomes the arbiter. Fundamentally, Brave argues, the very idea of establishing a global list of 'sensitive categories' is illogical and immoral. As an example, Brave points to searches about pregnancy: for an adult that might be perfectly normal, but for a teenager it might not be at all. "In general, interests can be very banal for one person, but sensitive, private or even dangerous for another," Brave says.

In addition, Google has not yet resolved certain technical issues surrounding those sensitive topics. Privacy activist Don Marti notes, for example, that there is not yet a good solution for when users move from a sensitive cohort to a non-sensitive one, and vice versa.

Fingerprinting

But the biggest problem with FLoC seems to be that it does not remove tracking at all; in fact, it could make tracking easier. That works via the IDs assigned to users: each cohort gets a unique identification number, and users can see in their Chrome settings what that number is, and thus which cohort they are in.

It is not known how many users will end up in a cohort, and that is a delicate balance. Too large a cohort is too general and not targeted enough for advertisers, but too small a cohort makes it very easy to identify individual users. That is where the risk lies. Google speaks only generally of "several thousand" and says there will be a minimum of 2,000 users per cohort, but in all likelihood the group size will be experimented with during the beta. After all, you want to be able to optimize effectiveness.
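That floor of 2,000 users is essentially a k-anonymity threshold: a cohort ID is only worth exposing once enough browsers share it. A minimal sketch of the idea; the 2,000 figure comes from Google, the names are ours:

```typescript
// Sketch of the k-anonymity floor described above: a cohort id is only
// served once enough browsers share it. The 2,000 figure comes from
// Google; the function and variable names are ours, for illustration.
const MIN_COHORT_SIZE = 2000;

function cohortIsExposable(userCount: number): boolean {
  // Below the floor, individuals would be too easy to single out.
  return userCount >= MIN_COHORT_SIZE;
}

console.log(cohortIsExposable(1500)); // false: cohort id stays hidden
console.log(cohortIsExposable(4200)); // true: cohort id may be served
```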

Nor is it known how many cohorts there are. Initially, SimHash was used to divide users into cohorts, but since Chrome 89 Google has moved to a proprietary technique it calls PrefixLSH, which it says "resembles the SimHash variant SortingLSH". In the first test this resulted in 33,872 possible cohorts, of which 792 were filtered out because they fell into the aforementioned sensitive categories. But, again, it is all experimental for now.
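To make that hashing step less abstract, below is a toy SimHash over visited domains. It is purely illustrative: Chrome's PrefixLSH differs, and the FNV-1a hash used here is our own stand-in:

```typescript
// Toy SimHash over visited domains, purely to illustrate the family of
// locality-sensitive hashes FLoC started with. Chrome's PrefixLSH
// differs, and the FNV-1a hash below is our own stand-in.
function hashDomain(domain: string): number {
  let h = 0x811c9dc5; // FNV-1a offset basis
  for (const ch of domain) {
    h ^= ch.charCodeAt(0);
    h = Math.imul(h, 0x01000193); // FNV-1a prime
  }
  return h >>> 0;
}

function simHash(domains: string[], bits = 16): number {
  const votes = new Array(bits).fill(0);
  for (const domain of domains) {
    const h = hashDomain(domain);
    for (let i = 0; i < bits; i++) {
      votes[i] += (h >>> i) & 1 ? 1 : -1; // each domain votes per bit
    }
  }
  // The sign of each accumulated vote becomes one output bit, so similar
  // histories land on nearby (often identical) values, i.e. one cohort.
  return votes.reduce((acc, v, i) => (v > 0 ? acc | (1 << i) : acc), 0);
}

// Two largely overlapping histories tend to collide into the same id:
console.log(simHash(["tweakers.net", "youtube.com", "github.com"]));
console.log(simHash(["tweakers.net", "youtube.com", "nlnet.nl"]));
```

Because each domain only nudges individual bits, two users with largely overlapping histories usually end up with the same value, which is exactly the grouping effect a cohort needs.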


Those numbers may support what critics warn against: that FLoC's pseudonymous cohort IDs can be traced back to individual users. It is one of the EFF's biggest criticisms: FLoC enables fingerprinting.

Fingerprinting means that a piece of software collects so many data points about a user that it can be certain it is looking at exactly one person. If you have someone's specific screen size, user agent, browser and browser version, plus a bit more of that kind of information, you can quickly distill a specific user out of a large group and then bombard them with targeted advertisements. In a cohort of 2,000 users, you should not need much information to find an individual user that way.
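The arithmetic behind that claim is straightforward: picking one user out of 2,000 requires about log2(2000) ≈ 11 bits of additional information, and common fingerprinting signals easily carry more. A back-of-the-envelope sketch, with per-signal entropy figures that are assumed ballpark values rather than measurements:

```typescript
// Back-of-the-envelope arithmetic for the fingerprinting risk: singling
// out one user in a cohort of 2,000 takes about log2(2000) bits of extra
// information. The per-signal entropy figures are assumed ballpark
// values for illustration, not measurements.
const cohortSize = 2000;
const bitsNeeded = Math.log2(cohortSize); // ≈ 11.0

const assumedSignalBits = {
  screenSize: 4.8,
  userAgent: 10.0,
  timezone: 3.0,
};
const bitsAvailable = Object.values(assumedSignalBits).reduce((a, b) => a + b, 0);

console.log(`bits needed: ${bitsNeeded.toFixed(1)}`);       // "bits needed: 11.0"
console.log(`bits available: ${bitsAvailable.toFixed(1)}`); // "bits available: 17.8", more than enough
```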

The EFF, along with Brave, Vivaldi and other critics, warns that some advertisers may abuse FLoC cohorts to fingerprint individual users. "In a FLoC cohort, a browser only needs to be distinguished from a few thousand others, rather than from a few hundred million," the privacy organization's Bennett Cyphers wrote at the time.

That fear is not unfounded, as emerged earlier this month, when it became clear that several large advertising companies were already collecting FLoC IDs and combining them with other data such as IP addresses, visited URLs and timestamps. "The idea that FLoC IDs are an additional dimension to how you figure out identities is certainly correct," one of them told Digiday.

Besides fingerprinting, there are other ways to identify individual users within a cohort. Users who make multiple visits to a website, for example, can be traced. Google also still allows some cookies in Privacy Sandbox under certain conditions, for example to offer single sign-on. That information, too, can be misused for individual tracking.

Countermeasures

Now, that fingerprinting problem is not as black and white as you might think. Google has thought about this for a long time, and the solution, unsurprisingly, also lies in Privacy Sandbox. The sandbox is set to include multiple APIs specifically aimed at blocking fingerprinting. There is, for example, the User Agent Client Hints API, which replaces the information-rich user agent string. Even more interesting is Privacy Budget, under which advertisers can collect a limited amount of user information and only receive meaningless data once that budget 'runs out'.
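User-Agent Client Hints can already be read in Chromium-based browsers. A minimal sketch of how a page queries them, where low-entropy values come for free and higher-entropy ones must be requested explicitly; the TypeScript interfaces are hand-written approximations of the API shape:

```typescript
// Sketch of reading User-Agent Client Hints in a page. Low-entropy
// values (brands, mobile) are exposed by default; higher-entropy values
// must be requested explicitly, and the browser may decline or coarsen
// them. The interfaces are hand-written approximations of the API shape.
interface NavigatorUAData {
  brands: { brand: string; version: string }[];
  mobile: boolean;
  getHighEntropyValues(hints: string[]): Promise<Record<string, unknown>>;
}

const uaData = (navigator as Navigator & { userAgentData?: NavigatorUAData })
  .userAgentData;

if (uaData) {
  console.log(uaData.brands, uaData.mobile); // available to every page
  uaData
    .getHighEntropyValues(["platformVersion", "model"])
    .then((values) => console.log(values));
} else {
  console.log("User-Agent Client Hints not supported here.");
}
```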

The problem is that User Agent Client Hints, Privacy Budget and the other methods are far from ready. In fact, the text of several proposals has not been updated for months or even years. FLoC is currently one of the few Privacy Sandbox APIs under active testing, while the countermeasures against the problems it introduces are still a long way off.

Conclusion

FLoC itself may be in testing, but the initiative is still in its infancy, and it has been criticized since the day it was announced. That criticism is partly justified and partly motivated by fear of an even stronger position for Google in the advertising market. FLoC does something that, in theory, should be good for all internet users: getting rid of invasive tracking cookies. Part of the advertising industry agrees, because that sector also realizes that users are more than tired of tracking and that a privacy-friendly alternative is needed. But you can ask yourself whether FLoC really solves problems, or mainly replaces them with new ones. The advertisers already abusing FLoC show that the industry still has a long way to go.
