Cookies have been viewed as the enemy for quite some time, with the result that 3rd party cookies are (quite rightly) being treated with high levels of suspicion.
Unfortunately, focusing on cookies, rather than on the tracking and profiling they enable, has left an opening for the unscrupulous to offer cookie-less alternatives.
Enter Google, who a while back announced they were building something called Federated Learning of Cohorts (FLoC) into Chrome. The basic underlying idea of FLoC is that it assigns the browser a cohort ID - grouping it in with other browsers who have a similar browsing history.
The browser's history never leaves the browser: the cohort ID is calculated locally (updating once per week, based on the previous week's browsing). Websites can then query the browser for its cohort ID (by calling document.interestCohort()) and serve ads based on the ID returned.
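The lookup itself is trivial for a site to perform. A minimal sketch of how a page might read the cohort ID, assuming a browser that implements the draft spec (only document.interestCohort() comes from the spec; the wrapper function, its name and its null-on-failure behaviour are my own illustrative assumptions):

```typescript
// Sketch: reading the FLoC cohort ID where the browser exposes it.
// The wrapper and its null-on-failure behaviour are assumptions;
// only document.interestCohort() is from the draft FLoC spec.
async function getCohortId(doc: any): Promise<string | null> {
  if (typeof doc.interestCohort !== "function") {
    return null; // FLoC not implemented (or removed) in this browser
  }
  try {
    // Per the draft spec, resolves to an object carrying the cohort,
    // e.g. { id: "12345", version: "chrome.2.1" }
    const cohort = await doc.interestCohort();
    return cohort.id;
  } catch {
    // The promise rejects when interest-cohort is disallowed,
    // e.g. via a Permissions-Policy header
    return null;
  }
}
```

Note that any script running on the page can make this call, which is why the returned ID being stable across every site you visit matters so much.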
However, deeper inspection has shown that rather than solving privacy issues, FLoC simply presents new ones - in fact there's an obvious vector in the paragraph above - your cohort ID is the same across all sites you visit...
Plus, although I say new, some of these issues were highlighted in 2019 and remain unaddressed.
Multiple groups have identified that FLoC can be used in fingerprinting, for example:
- A site that a user logs into can link their credentials and cohort ID
- A government site may identify a cohort ID that commonly contains dissidents and can link this ID to the IP of any cohort member who visits a government site
- Users with a specific medical condition may get grouped into a cohort - while it may not be possible to identify the users it's fairly likely they wouldn't consent to being targeted based on that condition
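To put rough numbers on the fingerprinting concern: a cohort ID shared across every site hands each of them a chunk of identifying entropy for free, which can be combined with other signals to narrow a visitor down. A back-of-envelope sketch (the cohort count below is an assumed round number for illustration, not a figure published by Google):

```typescript
// Identifying bits contributed by a signal with N equally-likely values.
// The cohort count is an illustrative assumption, not an official figure.
const bitsOfEntropy = (states: number): number => Math.log2(states);

const cohortBits = bitsOfEntropy(32768); // ~32k cohorts -> 15 bits
const timezoneBits = bitsOfEntropy(24);  // coarse timezone -> ~4.6 bits

// Each independent signal adds its bits, multiplying the number of
// distinguishable users a site can tell apart:
const combinedBits = cohortBits + timezoneBits;
```

Fifteen-odd bits on its own won't identify an individual, but stacked on top of the signals sites can already observe, it shortens the distance to a unique fingerprint considerably.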
There are many, many writeups on the issues with FLoC (many linked to from here) that do a better job of covering this than I can here.
To summarise, though, Google's only defence is to present a false dichotomy - they argue that FLoC is better for privacy than 3rd party cookie based tracking. This rather ignores that that tracking is already being killed off by browsers - we could instead opt for a world without either (the not-so-secret option c).
The bit I want to focus on though, is one of consent (both publisher and user).
Currently, if your page view is being profiled by Google's advertising systems, it's because you've hit a page that contains Google ads (or some other Google resource - analytics etc). Although you as a user may not have consented to this, the person who authored/published the page you're viewing took an active decision to use a Google resource on that page, so Google can claim some form of consent.
FLoC represents a huge step change away from that, as the original trial profiled visits to any page whether they use Google resources or not:
All sites with publicly routable IP addresses that the user visits when not in incognito mode will be included in the POC cohort calculation.
So, despite having actively chosen to remove all Google resources from my sites - they now want to use my content to help calculate an advertising identifier for visitors.
They no longer rely on the goodwill of publishers - they've built tracking into the tool you use to browse the web...
Presumably, Google have identified that there are some consent issues with this approach, as they've opted not to use FLoC for users in countries that are covered by GDPR (aside from the wider issues with FLoC, this new antifeature is opt-out not opt-in).
I don't and can't consent to my content being used in a way that erodes user privacy - that's part of why I removed ads in the first place.
The FLoC spec introduces a new value for the Permissions-Policy header which, for now, will cause FLoC not to function.
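The opt-out takes the form of an empty allowlist for the new interest-cohort feature, i.e. a response header of:

```http
Permissions-Policy: interest-cohort=()
```

How you add it will depend on your server or CDN; the empty parentheses mean no origin (including the page's own) is allowed to use the feature.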
As correctly noted in this blog post, adding this header is not a magic bullet - it may not need to be added at all, and adding it but calling in untrusted resources would still be problematic.
What it misses, in my opinion, is that adding the header acts as an additional line of safety. During the Origin Trial, FLoC triggered on sites that contained ads or ad-related resources. What this means in practice is that if Chrome's ad-tagging threw a false positive on your site, it was included.
We don't know what Google's future criteria for triggering FLoC will be (and the policy is only likely to get more, not less, liberal), so I'd rather err on the side of caution - it's simply an additional entry in the Permissions-Policy header.
A small positive side-effect too, is that it means an extra set of sites explicitly telling Google to FLoC off when people like Scott Helme do analysis of response headers.
My site will now include interest-cohort in the Permissions-Policy header (as well as denying access to a bunch of other APIs). Most of these won't make any difference to www.bentasker.co.uk (as I don't include many, if any, 3rd party resources any more), but they should provide additional privacy protection on some of my subdomains.
ben@milleniumfalcon:~$ curl -v -o/dev/null https://www.bentasker.co.uk 2>&1 | grep permiss
< permissions-policy: interest-cohort=(), payment=(), camera=(), geolocation=(), microphone=(), usb=()
Avoiding FLoC as a user is fairly simple, if initially disruptive:
Don't use Chrome.
There are multiple browser vendors, including Chromium derivatives such as Brave and Vivaldi, as well as Mozilla with Firefox, who have said they won't be implementing FLoC.
There's a wide variety of choice there, and a good few retain extension compatibility with Chrome, so a move to a more privacy friendly browser needn't necessarily mean losing all your extensions.
Other than that, we're in effect left just hoping that this goes the way of most other Google projects - killed off with little fanfare in a couple of years.