Channel: Max Schrems | TechCrunch

EU puts out final guidance on data transfers to third countries

The European Data Protection Board (EDPB) published its final recommendations yesterday, setting out guidance on how to make transfers of personal data to third countries comply with EU data protection rules in light of last summer’s landmark CJEU ruling (aka Schrems II).

The long and short of these recommendations — which are fairly long themselves, running to 48 pages — is that some data transfers to third countries will simply not be possible to (legally) carry out, despite the continued existence of legal mechanisms that can, in theory, be used to make such transfers (like Standard Contractual Clauses, a transfer tool that was recently updated by the Commission).

However it’s up to the data controller to assess the viability of each transfer, on a case by case basis, to determine whether data can legally flow in that particular case. (Which may mean, for example, a business making complex assessments about foreign government surveillance regimes and how they impinge upon its specific operations.)

Companies that routinely take EU users’ data outside the bloc for processing in third countries (like the US), which do not have data adequacy arrangements with the EU, face substantial cost and challenge in attaining compliance — in a best case scenario.

Those that can’t apply viable ‘special measures’ to ensure transferred data is safe are duty bound to suspend data flows — with the risk, should they fail to do that, of being ordered to by a data protection authority (which could also apply additional sanctions).

One alternative option could be for such a firm to store and process EU users’ data locally — within the EU. But clearly that won’t be viable for every company.

Law firms are likely to be very happy with this outcome since there will be increased demand for legal advice as companies grapple with how to structure their data flows and adapt to a post-Schrems II world.

In some EU jurisdictions (such as Germany) data protection agencies are now actively carrying out compliance checks — so orders to suspend transfers are bound to follow.

Meanwhile the European Data Protection Supervisor is busy scrutinizing EU institutions’ own use of US cloud services, to see whether high-level arrangements with tech giants like AWS and Microsoft pass muster.

Last summer the CJEU struck down the EU-US Privacy Shield — only a few years after the flagship adequacy arrangement was inked. The same core legal issues did for its predecessor, ‘Safe Harbor‘, though that had stood for some fifteen years. And since the demise of Privacy Shield the Commission has repeatedly warned there will be no quick fix replacement this time; nothing short of major reform of US surveillance law is likely to be required.

US and EU lawmakers remain in negotiations over a replacement EU-US data flows deal, but a viable outcome — one that can stand up to legal challenge, as the prior two agreements could not — may well require years of work, not months.

And that means EU-US data flows are facing legal uncertainty for the foreseeable future.

The UK, meanwhile, has just squeezed a data adequacy agreement out of the Commission — despite some loudly enunciated post-Brexit plans for regulatory divergence in the area of data protection.

If the UK follows through in ripping up key tenets of its inherited EU legal framework there’s a high chance it will also lose adequacy status in the coming years — meaning it too could face crippling barriers to EU data flows. (But for now it seems to have dodged that bullet.)

Data flows to other third countries that also lack an EU adequacy agreement — such as China and India — face the same ongoing legal uncertainty.

The backstory to the EU’s international data flows issues originates with a complaint made by the eponymous Max Schrems — in the wake of NSA whistleblower Edward Snowden’s revelations about government mass surveillance programs, so more than seven years ago — over what he argued were unsafe EU-US data flows.

His complaint specifically targeted Facebook’s business, calling on the Irish Data Protection Commission (DPC) to use its enforcement powers and suspend Facebook’s EU-US data flows.

A regulatory dance of indecision followed which finally saw legal questions referred to Europe’s top court and — ultimately — the demise of the EU-US Privacy Shield. The CJEU ruling also put it beyond legal doubt that Member States’ DPAs must step in and act when they suspect data is flowing to a location where the information is at risk.

Following the Schrems II ruling, the DPC (finally) sent Facebook a preliminary order to suspend its EU-US data flows last fall. Facebook immediately challenged the order in the Irish courts — seeking to block the move. But that challenge failed. And Facebook’s EU-US data flows are now very much operating on borrowed time.

As one of the platforms subject to Section 702 of the US’ FISA law, its options for applying ‘special measures’ to supplement its EU data transfers look, well, limited to say the least.

It can’t — for example — encrypt the data in a way that ensures it has no access to it (zero access encryption) since that’s not how Facebook’s advertising empire functions. And Schrems has previously suggested Facebook will have to federate its service — and store EU users’ information inside the EU — to fix its data transfer problem.

Safe to say, the costs and complexity of compliance for certain businesses like Facebook look massive.

But there will be compliance costs and complexity for thousands of businesses in the wake of the CJEU ruling. And in a recent open letter to lawmakers ahead of an EU-US summit earlier this month, startup associations on both sides of the Atlantic urged policymakers to find ways to partner on regulatory standards alignment — writing that recent developments in the digital sphere, such as the invalidation of Privacy Shield, “threaten to leave our ecosystems at a disadvantage in tough globally competitive markets”. 

Discussing the concerns with TechCrunch, Benedikt Blomeyer, director of EU policy for Allied for Startups, added: “Startups are global from day one and as such, a US startup has much to offer for EU consumers. Why is it that despite increasingly interconnected markets, and more and more data protection laws coming into force, increasingly trade barriers are arising in the digital economy?”

Though, asked whether the startups backing the call for the EU and US to work towards reducing regulatory divergence are lobbying for anything as specific as US surveillance law reform at this stage of their campaign, Blomeyer declined to comment for now.

Commenting on the EDPB’s adoption of final recommendations, chair Andrea Jelinek said: “The impact of Schrems II cannot be underestimated: Already international data flows are subject to much closer scrutiny from the supervisory authorities who are conducting investigations at their respective levels. The goal of the EDPB Recommendations is to guide exporters in lawfully transferring personal data to third countries while guaranteeing that the data transferred is afforded a level of protection essentially equivalent to that guaranteed within the European Economic Area.

“By clarifying some doubts expressed by stakeholders, and in particular the importance of examining the practices of public authorities in third countries, we want to make it easier for data exporters to know how to assess their transfers to third countries and to identify and implement effective supplementary measures where they are needed. The EDPB will continue considering the effects of the Schrems II ruling and the comments received from stakeholders in its future guidance.”

The EDPB put out earlier guidance on Schrems II compliance last year.

It said the main modifications between that earlier advice and its final recommendations include: “The emphasis on the importance of examining the practices of third country public authorities in the exporters’ legal assessment to determine whether the legislation and/or practices of the third country impinge — in practice — on the effectiveness of the Art. 46 GDPR transfer tool; the possibility that the exporter considers in its assessment the practical experience of the importer, among other elements and with certain caveats; and the clarification that the legislation of the third country of destination allowing its authorities to access the data transferred, even without the importer’s intervention, may also impinge on the effectiveness of the transfer tool”.

Commenting on the EDPB’s recommendations in a statement, law firm Linklaters dubbed the guidance “strict” — warning over the looming impact on businesses.

“There is little evidence of a pragmatic approach to these transfers and the EDPB seems entirely content if the conclusion is that the data must remain in the EU,” said Peter Church, a Counsel at the global law firm. “For example, before transferring personal data to [a] third country (without adequate data protection laws) businesses must consider not only its law but how its law enforcement and national security agencies operate in practice. Given these activities are typically secretive and opaque, this type of analysis is likely to cost tens of thousands of euros and take time. It appears this analysis is needed even for relatively innocuous transfers.”

“It is not clear how SMEs can be expected to comply with these requirements,” he added. “Given we now operate in a globalised society the EDPB, like King Canute, should consider the practical limitations on its power. The guidance will not turn back the tides of data washing back and forth across the world, but many businesses will really struggle to comply with these new requirements.”

This report was updated with additional comment.

 


German government bodies urged to remove their Facebook Pages before next year

Germany’s federal information commissioner has run out of patience with Facebook.

Last month, Ulrich Kelber wrote to government agencies “strongly recommend[ing]” they close down their official Facebook Pages because of ongoing data protection compliance problems and the tech giant’s failure to fix the issue.

In the letter, Kelber warns the government bodies that he intends to start taking enforcement action from January 2022 — essentially giving them a deadline of next year to pull their pages from Facebook.

So expect official Facebook Pages of German government bodies to disappear in the coming months.

While Kelber’s own agency, the BfDi, does not appear to have a Facebook Page (although Facebook’s algorithms appear to generate an artificial stub if you try searching for one) plenty of other German federal bodies do — such as the Ministry of Health, whose public page has more than 760,000 followers.

The only alternative to such pages vanishing from Facebook’s platform by Christmas — or else being ordered to be taken down early next year by Kelber — seems to be for the tech giant to make more substantial changes to how its platform operates than it has offered so far, allowing the Pages to be run in Germany in a way that complies with EU law.

However Facebook has a long history of ignoring privacy expectations and data protection laws.

It has also, very recently, shown itself more than willing to reduce the quality of information available to users — if doing so furthers its business interests (such as to lobby against a media code law, as users in Australia can attest).

So it looks rather more likely that German government agencies will be the ones having to quietly bow off the platform soon…

Kelber says he’s avoided taking action over the ministries’ Facebook Pages until now on account of the public bodies arguing that their Facebook Pages are an important way for them to reach citizens.

However his letter points out that government bodies must be “role models” in matters of legal compliance — and therefore have “a particular duty” to comply with data protection law. (The EDPS is taking a similar tack by reviewing EU institutions’ use of US cloud services giants.)

Per his assessment, an “addendum” provided by Facebook in 2019 does not rectify the compliance problem and he concludes that Facebook has made no changes to its data processing operations to enable Page operators to comply with requirements set out in the EU’s General Data Protection Regulation.

A ruling by Europe’s top court, back in June 2018, is especially relevant here — as it held that the administrator of a fan page on Facebook is jointly responsible with Facebook for the processing of the data of visitors to the page.

That means that the operators of such pages also face data protection compliance obligations, and cannot simply assume that Facebook’s T&Cs provide them with legal cover for the data processing the tech giant undertakes.

The problem, in a nutshell, is that Facebook does not provide Page operators with enough information or assurances about how it processes users’ data — meaning they’re unable to comply with GDPR principles of accountability and transparency because, for example, they’re unable to adequately inform followers of their Facebook Page what is being done with their data.

There is also no way for Facebook Page operators to switch off (or otherwise block) Facebook’s wider processing of their Page followers’ data — even if they don’t make use of any of the analytics features Facebook provides to Page operators.

The processing still happens.

This is because Facebook operates a take-it-or-leave it ‘data maximizing’ model — to feed its ad-targeting engines.

But it’s an approach that could backfire if it ends up permanently reducing the quality of the information available on its network because of a mass migration of key services off its platform — if, for example, every government agency in the EU deleted its Facebook Page.

A related blog post on the BfDi’s website also holds out the hope that “data protection-compliant social networks” might develop in the Facebook compliance vacuum.

Certainly there could be a competitive opportunity for alternative platforms that seek to sell services based on respecting users’ rights.

The German Federal Ministry of Health’s verified Facebook Page (Screengrab: TechCrunch/Natasha Lomas)

Discussing the BfDi’s intervention, Luca Tosoni, a research fellow at the University of Oslo’s Norwegian Research Center for Computers and Law, told TechCrunch: “This development is strictly connected to recent CJEU case law on joint controllership. In particular, it takes into account the Wirtschaftsakademie ruling, which found that the administrator of a Facebook page should be considered a joint controller with Facebook in respect of processing the personal data of the visitors of the page.

“This does not mean that the page administrator and Facebook share equal responsibility for all stages of the data processing activities linked to the use of the Facebook page. However, they must have an agreement in place with a clear allocation of roles and responsibilities. According to the German Federal Commissioner for Data Protection and Freedom of Information, Facebook’s current data protection ‘Addendum’ would not seem to be sufficient to meet the latter requirement.”

“It is worth noting that, in its Fashion ID ruling, the CJEU has taken the view that the GDPR’s obligations for joint controllers are commensurate with those data processing stages in which they actually exercise control,” Tosoni added. “This means that the data protection obligations a Facebook page administrator [faces] would normally tend to be quite limited.”

Warnings for other social media services

This particular compliance issue affects Facebook in Germany — and potentially any other EU market. But other social media services may face similar problems too.

For example, Kelber’s letter flags an ongoing audit of Instagram, TikTok and Clubhouse — warning of “deficits” in the level of data protection they offer too.

He goes on to recommend that agencies avoid using the three apps on business devices.  

In an earlier, 2019 assessment of government bodies’ use of social media services, the BfDi suggested usage of Twitter could — by contrast — be compliant with data protection rules. At least if privacy settings were fully enabled and analytics disabled, for example.

At the time the BfDi also warned that Facebook-owned Instagram faced similar compliance problems to Facebook, being subject to the same “abusive” approach to consent he said was taken by the whole group.

Reached for comment on Kelber’s latest recommendations to government agencies, Facebook did not engage with our specific questions — sending us this generic statement instead:

“At the end of 2019, we updated the Page Insights addendum and clarified the responsibilities of Facebook and Page administrators, for which we took questions regarding transparency of data processing into account. It is important to us that also federal agencies can use Facebook Pages to communicate with people on our platform in a privacy-compliant manner.”

An additional complication for Facebook has arisen in the wake of the legal uncertainty following last summer’s Schrems II ruling by the CJEU.

Europe’s top court invalidated the EU-US Privacy Shield arrangement, which had allowed companies to self-certify an adequate level of data protection, removing the easiest route for transferring EU users’ personal data over to the US. And while the court did not outlaw international transfers of EU users’ personal data altogether it made it clear that data protection agencies must intervene and suspend data flows if they suspect information is being moved to a place, and in such a way, that it’s put at risk.

Following Schrems II, transfers to the US are clearly problematic where the data is being processed by a US company that’s subject to FISA 702, as is the case with Facebook.

Indeed, Facebook’s EU-to-US data transfers were the original target of the complaint in the Schrems II case (brought by the eponymous Max Schrems). And a decision remains pending on whether the tech giant’s lead EU data supervisor will follow through on a preliminary order, issued last year, that it should suspend its EU data flows — due in the coming months.

Even ahead of that long-anticipated reckoning in Ireland, other EU DPAs are now stepping in to take action — and Kelber’s letter references the Schrems II ruling as another issue of concern.

Tosoni agrees that GDPR enforcement is finally stepping up a gear. But he also suggested that compliance with the Schrems II ruling comes with plenty of nuance, given that each data flow must be assessed on a case by case basis — with a range of supplementary measures that controllers may be able to apply.

“This development also shows that European data protection authorities are getting serious about enforcing the GDPR data transfer requirements as interpreted by the CJEU in Schrems II, as the German Federal Commissioner for Data Protection and Freedom flagged this as another pain point,” he said.

“However, the German Federal Commissioner sent out his letter on the use of Facebook pages a few days before the EDPB adopted the final version of its recommendations on supplementary measures for international data transfers following the CJEU Schrems II ruling. Therefore, it remains to be seen how German data protection authorities will take these new recommendations into account in the context of their future assessment of the GDPR compliance of the use of Facebook pages by German public authorities.

“Such recommendations do not establish a blanket ban on data transfers to the US but impose the adoption of stringent safeguards, which will need to be followed to keep on transferring the data of German visitors of Facebook pages to the US.”

Another recent judgment by the CJEU reaffirmed that EU data protection agencies can, in certain circumstances, take action when they are not the lead data supervisor for a specific company under the GDPR’s one-stop-shop mechanism — expanding the possibility for litigation by watchdogs in Member States if a local agency believes there’s an urgent need to act.

Although, in the case of the German government bodies’ use of Facebook Pages, the earlier CJEU ruling on joint controllership means the BfDi already has clear jurisdiction to target these agencies’ Facebook Pages itself.

 

Dutch court will hear another Facebook privacy lawsuit

Privacy litigation that’s being brought against Facebook by two not-for-profits in the Netherlands can go ahead, an Amsterdam court has ruled. The case will be heard in October.

Since 2019, the Amsterdam-based Data Privacy Foundation (DPS) has been seeking to bring a case against Facebook over its rampant collection of internet users’ data — arguing the company does not have a proper legal basis for the processing.

It has been joined in the action by the Dutch consumer protection not-for-profit, Consumentenbond.

The pair are seeking redress for Facebook users in the Netherlands for alleged violations of their privacy rights — both by suing for compensation for individuals and by calling for Facebook to end the privacy-hostile practices.

European Union law allows for collective redress across a number of areas, including data protection rights, enabling qualified entities to bring representative actions on behalf of rights holders. And the provision looks like an increasingly important tool for furthering privacy enforcement in the bloc, given how European data protection regulators have continued to lack uniform vigor in upholding rights set out in legislation such as the General Data Protection Regulation (which, despite coming into application in 2018, has yet to be seriously applied against platform giants like Facebook).

Returning to the Dutch litigation, Facebook denies any abuse and claims it respects user privacy and provides people with “meaningful control” over how their data gets exploited.

But it has fought the litigation by seeking to block it on procedural grounds — arguing for the suit to be tossed by claiming the DPS does not fit the criteria for bringing a privacy claim on behalf of others and that the Amsterdam court has no jurisdiction as its European business is subject to Irish, rather than Dutch, law.

However the Amsterdam District Court rejected its arguments, clearing the way for the litigation to proceed.

Contacted for comment on the ruling, a Facebook spokesperson told us:

We are currently reviewing the Court’s decision. The ruling was about the procedural part of the case, not a finding on the merits of the action, and we will continue to defend our position in court. We care about our users in the Netherlands and protecting their privacy is important to us. We build products to help people connect with people and content they care about while honoring their privacy choices. Users have meaningful control over the data that they share on Facebook and we provide transparency around how their data is used. We also offer people tools to access, download, and delete their information and we are committed to the principles of GDPR.

In a statement today, the Consumentenbond‘s director, Sandra Molenaar, described the ruling as “a big boost for the more than 10 million victims” of Facebook’s practices in the country.

“Facebook has tried to throw up all kinds of legal hurdles and to delay this case as much as possible but fortunately the company has not succeeded. Now we can really get to work and ensure that consumers get what they are entitled to,” she added in the written remarks (translated from Dutch with Google Translate).

In another supporting statement, Dick Bouma, chairman of DPS, added: “This is a nice and important first step for the court. The ruling shows that it pays to take a collective stand against tech giants that violate privacy rights.”

The two not-for-profits are urging Facebook users in the Netherlands to sign up to be part of the representative action (and potentially receive compensation) — saying more than 185,000 people have registered so far.

The suit argues that Facebook users are “paying” for the “free” service with their data — contending the tech giant does not have a valid legal basis to process people’s information because it has not provided users with comprehensive information about the data it is gathering from and on them, nor what it does with it.

So — in essence — the argument is that Facebook’s tracking and targeting is in breach of EU privacy law.

The legal challenge follows an earlier investigation (back in 2014) of Facebook’s business by the Dutch data protection authority that identified problems with its privacy policy and — in a 2017 report — found the company to be processing users’ data without their knowledge or consent.

However, since 2018, Europe’s GDPR has been in application and a “one-stop-shop” mechanism baked into the regulation — to streamline the handling of cross-border cases — has meant complaints against Facebook have been funnelled through Ireland’s Data Protection Commission. The Irish DPC has yet to issue a single decision against Facebook despite receiving scores of complaints. (And it’s notable that “forced consent” complaints were filed against Facebook the day the GDPR began being applied — yet still remain undecided by Ireland.)

The GDPR’s enforcement bottleneck makes collective redress actions, such as this one in the Netherlands, a potentially important route for Europeans to get rights relief against powerful platforms that seek to shrink the risk of regulatory enforcement via forum shopping.

Although national rules — and courts’ interpretations of them — can vary. So the chance of litigation succeeding is not uniform.

In this case, the Amsterdam court allowed the suit to proceed on the grounds that the Facebook data subjects in question reside in the Netherlands.

It also took the view that a local Facebook corporate entity in the Netherlands is an establishment of Facebook Ireland, among other reasons for rejecting Facebook’s arguments.

How Facebook will seek to press a case against the substance of the Dutch privacy litigation remains to be seen. It may well have other procedural strategies up its sleeve.

The tech giant has used similar stalling tactics against far longer-running privacy litigation in Austria, for example.

In that case, brought by privacy campaigner Max Schrems and his not-for-profit noyb, Facebook has sought to claim that the GDPR’s consent requirements do not apply to its advertising business because it now includes “personalized advertising” in its T&Cs — and therefore has a “duty” to provide privacy-hostile ads to users — seeking to bypass the GDPR by claiming it must process users’ data because it’s “necessary for the performance of a contract,” as noyb explains here.

A court in Vienna accepted this “GDPR consent bypass” sleight of hand, dealing a blow to European privacy campaigners.

But an appeal reached the Austrian Supreme Court in March — and a referral could be made to Europe’s top court.

If that happens it would then be up to the CJEU to weigh in on whether such a massive loophole in the EU’s flagship data protection framework should really be allowed to stand. But that process could still take a year or longer.

In the short term, the result is yet more delay for Europeans trying to exercise their rights against platform giants and their in-house armies of lawyers.

In a more positive development for privacy rights, a recent ruling by the CJEU bolstered the case for data protection agencies across the EU to bring actions against tech giants if they see an urgent threat to users — and believe a lead supervisor is failing to act.

That ruling could help unblock some GDPR enforcement against the most powerful tech companies at the regulatory level, potentially reducing the blockages created by bottlenecks such as Ireland.

Facebook’s EU to U.S. data flows are also now facing the possibility of a suspension order in a matter of months — related to another piece of litigation brought by Schrems that hinges on the conflict between EU fundamental rights and U.S. surveillance law.

The CJEU weighed in on that last summer with a judgment that requires regulators like Ireland to act when user data is at risk. (Germany’s federal data protection commissioner, for instance, has warned government bodies to shut their official Facebook pages ahead of planned enforcement action at the start of next year.)

So while Facebook has been spectacularly successful at kicking Europe’s privacy rights claims down the road, for well over a decade, its strategy of legal delay tactics to shield a privacy-hostile business model could finally hit a geopolitical brick wall.

The tech giant has sought to lobby against this threat to its business by suggesting it might switch off its service in Europe if the regulator follows through on a preliminary suspension order last year.

But it has also publicly denied it would actually follow through and close service in Europe.

How might Facebook actually comply if ordered to cut off EU data flows? Schrems has argued it may need to federate its service and store European users’ data inside the EU in order to comply with the eponymous Schrems II CJEU ruling.

That said, Facebook has certainly shown itself adept at exploiting the gaps between Europeans’ on-paper rights, national case law and the various EU and member state institutions involved in oversight and enforcement as a tactic to defend its commercial priorities — playing different players and pushing agendas to further its business interests. So whether any single piece of EU privacy litigation will prove to be the silver bullet that forces a reboot of its privacy-hostile business model very much remains to be seen.

A perhaps more likely scenario is that each of these cases further erodes user trust in Facebook’s services — reducing people’s appetite to use its apps and expanding opportunities for rights-respecting competitors to poach custom by offering something better. 

 

Ireland probes TikTok’s handling of kids’ data and transfers to China

Ireland’s Data Protection Commission (DPC) has yet another Big Tech GDPR probe to add to its pile: The regulator said yesterday it has opened two investigations into video sharing platform TikTok.

The first covers how TikTok handles children’s data, and whether it complies with Europe’s General Data Protection Regulation.

The DPC also said it will examine TikTok’s transfers of personal data to China, where its parent entity is based — looking to see if the company meets requirements set out in the regulation covering personal data transfers to third countries.

TikTok was contacted for comment on the DPC’s investigation.

A spokesperson told us:

The privacy and safety of the TikTok community, particularly our youngest members, is a top priority. We’ve implemented extensive policies and controls to safeguard user data and rely on approved methods for data being transferred from Europe, such as standard contractual clauses. We intend to fully cooperate with the DPC.

The Irish regulator’s announcement of two “own volition” enquiries follows pressure from other EU data protection authorities and consumer protection groups, which have raised concerns about how TikTok handles user data generally and children’s information specifically.

In Italy this January, TikTok was ordered to recheck the age of every user in the country after the data protection watchdog instigated an emergency procedure, using GDPR powers, following child safety concerns.

TikTok went on to comply with the order — removing more than half a million accounts where it could not verify the users were not children.

This year European consumer protection groups have also raised a number of child safety and privacy concerns about the platform. And, in May, EU lawmakers said they would review the company’s terms of service.

On children’s data, the GDPR sets limits on how kids’ information can be processed, putting an age threshold on children’s ability to consent to their data being used. The age limit varies per EU member state: the regulation sets a default of 16 but allows member states to lower it to no less than 13.

In response to the announcement of the DPC’s enquiry, TikTok pointed to its use of age-gating technology and other strategies it said it uses to detect and remove underage users from its platform.

It also flagged a number of recent changes it’s made around children’s accounts and data — such as flipping the default settings to make younger teens’ accounts private by default and limiting their exposure to certain features that encourage interaction with other TikTok users.

On international data transfers it claims to use “approved methods.” However the picture is rather less straightforward than TikTok’s statement implies: there is no EU data adequacy agreement in place with China, which complicates any transfer of Europeans’ data there.

In TikTok’s case, that means, for any personal data transfers to China to be lawful, it needs to have additional “appropriate safeguards” in place to protect the information to the required EU standard.

When there is no adequacy arrangement in place, data controllers can, potentially, rely on mechanisms like Standard Contractual Clauses (SCCs) or binding corporate rules (BCRs) — and TikTok’s statement notes it uses SCCs.

But — crucially — personal data transfers out of the EU to third countries have faced significant legal uncertainty and added scrutiny since a landmark CJEU ruling last year. That ruling invalidated a flagship data transfer arrangement between the U.S. and the EU, and made it clear that DPAs (such as Ireland’s DPC) have a duty to step in and suspend transfers if they suspect people’s data is flowing to a third country where it might be at risk.

So while the CJEU did not invalidate mechanisms like SCCs entirely, the court essentially said all international transfers to third countries must be assessed on a case-by-case basis — and, where a DPA has concerns, it must step in and suspend those non-secure data flows.

The CJEU ruling means just the fact of using a mechanism like SCCs doesn’t mean anything on its own re: the legality of a particular data transfer. It also amps up the pressure on EU agencies like Ireland’s DPC to be proactive about assessing risky data flows.

Final guidance put out by the European Data Protection Board, earlier this year, provides details on the so-called “special measures” that a data controller may be able to apply in order to increase the level of protection around their specific transfer so the information can be legally taken to a third country.

But these steps can include technical measures like strong encryption — and it’s not clear how a social media company like TikTok would be able to apply such a fix, given how its platform and algorithms are continuously mining users’ data to customize the content they see and in order to keep them engaged with TikTok’s ad platform.

In another recent development, China has just passed its first data protection law.

But, again, this is unlikely to change much for EU transfers. The Communist Party regime’s ongoing appropriation of personal data, through the application of sweeping digital surveillance laws, means it would be all but impossible for China to meet the EU’s stringent requirements for data adequacy. (And if the U.S. can’t get EU adequacy it would be “interesting” geopolitical optics, to put it politely, were the coveted status to be granted to China.)

One factor TikTok can take heart from is that it does likely have time on its side when it comes to the EU’s enforcement of its data protection rules.

The Irish DPC has a huge backlog of cross-border GDPR investigations into a number of tech giants.

It was only earlier this month that the Irish regulator finally issued its first decision against a Facebook-owned company — announcing a $267 million fine against WhatsApp for breaching GDPR transparency rules (and doing so only years after the first complaints had been lodged).

The DPC’s first decision in a cross-border GDPR case pertaining to Big Tech came at the end of last year — when it fined Twitter $550,000 over a data breach dating back to 2018, the year the GDPR technically began applying.

The Irish regulator still has scores of undecided cases on its desk — against tech giants including Apple and Facebook. That means that the new TikTok probes join the back of a much criticized bottleneck. And a decision on these probes isn’t likely for years.

On children’s data, TikTok may face swifter scrutiny elsewhere in Europe: The U.K. added some “gold-plating” to its version of the EU GDPR in the area of children’s data — and, from this month, has said it expects platforms to meet its recommended standards.

It has warned that platforms that don’t fully engage with its Age Appropriate Design Code could face penalties under the U.K.’s GDPR. The U.K.’s code has been credited with encouraging a number of recent changes by social media platforms over how they handle kids’ data and accounts.

Ireland’s draft GDPR decision against Facebook branded a joke


Facebook’s lead data protection regulator in the European Union is inching toward making its first decision on a complaint against Facebook itself. And it looks like it’s a doozy.

Privacy campaign not-for-profit noyb today published a draft decision by the Irish Data Protection Commission (DPC) on a complaint made under the EU’s General Data Protection Regulation (GDPR).

The DPC’s draft decision proposes to fine Facebook $36 million — a financial penalty that would take the adtech giant just over two and a half hours to earn in revenue, based on its second quarter earnings (of $29 billion).

Yeah, we lol’d too.

But even more worrying for privacy advocates is the apparent willingness of the DPC to allow Facebook to simply bypass the regulation by claiming users are giving it their data because they’re in a contract with it to get, er, targeted ads.

In a summary of its findings, the DPC writes: “There is no obligation on Facebook to seek to rely solely on consent for the purposes of legitimising personal data processing where it is offering a contract to a user which some users might assess as one that primarily concerns the processing of personal data. Nor has Facebook purported to rely on consent under the GDPR.”

“I find the Complainant’s case is not made out that the GDPR does not permit the reliance by Facebook on 6(1)(b) GDPR in the context of its offering of Terms of Service,” the DPC also writes, suggesting it’s totally bona fide for Facebook to claim a legal right to process people’s information for ad targeting because it’s now suggesting users actually signed up for a contract with it to deliver them ads.

Yet — simultaneously — the DPC’s draft decision does find that Facebook infringed GDPR transparency requirements — specifically: Articles 5(1)(a), 12(1) and 13(1)(c) — meaning that users were unlikely to have understood they were signing up for a Facebook ad contract when they clicked “I agree” on Facebook’s T&Cs.

So the tl;dr here is that Facebook’s public-facing marketing — which claims its service “helps you connect and share with the people in your life” — appears to be missing a few critical details about the advertising contract it’s actually asking you to enter into, or something.

Insert your own facepalm emoji right here.

Mind the enforcement gap

The GDPR came into application across the EU back in May 2018 — ostensibly to cement and strengthen long-standing privacy rules in the region which had historically suffered from a lack of enforcement, by adding new provisions such as supersized fines (of up to 4% of global turnover).

However EU privacy rules have also suffered from a lack of universally vigorous enforcement since the GDPR update. And those penalties that have been issued — including a handful against Big Tech — have been far lower than that theoretical maximum. Nor has enforcement led to an obvious retooling of privacy hostile business models — yet.

So the reboot hasn’t exactly gone as privacy advocates hoped.

Adtech giants especially have managed to avoid a serious reckoning in Europe over their surveillance-based business models despite the existence of the GDPR — through the use of forum shopping and cynical delay tactics.

So while there is no shortage of GDPR complaints being filed against adtech, complaints over the lack of regulatory enforcement in this area are equally stacking up.

And complainants are now also resorting to legal action.

The issue is, under GDPR’s one-stop-shop mechanism, cross-border complaints and investigations, such as those targeted at major tech platforms, are led by a single agency — typically where the company in question has its legal base in the EU.

And in Facebook’s case (and many other tech giants’) that’s Ireland.

The Irish authority has long been accused of being a bottleneck to effective enforcement of the GDPR, with critics pointing to a glacial pace of enforcement, scores of complaints simply dropped without any discernible activity and — in instances where the complaints aren’t totally ignored — underwhelming decisions eventually popping out the other end.

One such series of adtech-related GDPR complaints was filed by noyb the moment the regulation came into application three years ago — targeting a number of adtech giants (including Facebook) over what noyb called “forced consent.” And these complaints of course ended up on the DPC’s desk.

noyb’s complaint against Facebook argues that the tech giant does not collect consent legally because it does not offer users a free choice to consent to their data being processed for advertising.

This is because under EU law consent must be freely given, specific (i.e., not bundled) and informed in order to be valid. So the substance of the complaint is not exactly as complicated as rocket science.

Yet a decision on noyb’s complaint has taken years to emerge from the DPC’s desk — and even now, in dilute draft form, it looks entirely underwhelming.

Per noyb, the Irish DPC has decided to accept what the campaign group dubs Facebook’s “trick” to bypass the GDPR — in which the company claims it switched away from relying on consent from users as a legal basis for processing people’s data for ad targeting to claiming users are actually in a contract with it to get ads injected into their eyeballs the very moment the GDPR came into force.

“It is painfully obvious that Facebook simply tries to bypass the clear rules of the GDPR by relabeling the agreement on data use as a ‘contract,'” said noyb founder and chair, Max Schrems, in a statement which goes on to warn that were such a basic wheeze allowed to stand it would undermine the whole regulation. Talk about a cunning plan!

“If this would be accepted, any company could just write the processing of data into a contract and thereby legitimize any use of customer data without consent. This is absolutely against the intentions of the GDPR, that explicitly prohibits to hide consent agreements in terms and conditions.”

“It is neither innovative nor smart to claim that an agreement is something that it is not to bypass the law,” he adds. “Since Roman times, the Courts have not accepted such ‘relabeling’ of agreements. You can’t bypass drug laws by simply writing ‘white powder’ on a bill, when you clearly sell cocaine. Only the Irish DPC seems to fall for this trick.”

Ireland has only issued two GDPR decisions in complaints against Big Tech thus far: Last year in a case against a Twitter security breach ($550,000 fine); and earlier this year in an investigation into the transparency of (Facebook-owned) WhatsApp T&Cs ($267 million fine).

Under the GDPR, a decision on these types of cross-border complaints must go through a collective review process — where other DPAs get a chance to object. It’s a check and balance against any one agency getting too cosy with business and failing to enforce the law.

And in both the aforementioned cases objections were raised on the DPC drafts that ended up increasing the penalties.

So it is highly likely that Ireland’s Facebook decision will face plenty of objections that end in a tougher penalty for Facebook.

noyb also points to guidelines put out by the European Data Protection Board (EDPB) — which it says make clear that relabeling consent as a contract isn’t a legal way to bypass the GDPR, and that such processing must be treated as consent. But it quotes the Irish DPC saying it is “simply not persuaded” by the view of its European colleagues — and suggests the EDPB will therefore have to step in yet again.

“Our hope lies with the other European authorities. If they do not take action, companies can simply move consent into terms and thereby bypass the GDPR for good,” says Schrems.

noyb has plenty more barbs for the DPC — accusing the Irish authority of holding “secret meetings” with Facebook on its “consent bypass” (not for the first time); and of withholding documents it requested — going on to denounce the regulator as acting like a “‘Big Tech’ adviser” (not, y’know, a law enforcer).

“We have cases before many authorities, but the DPC is not even remotely running a fair procedure,” adds Schrems. “Documents are withheld, hearings are denied and submitted arguments and facts are simply not reflected in the decision. The [Facebook] decision itself is lengthy, but most sections just end with a ‘view’ of the DPC, not an objective assessment of the law.”

We reached out to the DPC for comment on noyb’s assertions — but a spokesperson declined, citing an “ongoing process.”

One thing is beyond doubt at this point, over three years into Europe’s flagship data protection reboot: There will be even more delay in any GDPR enforcement against Facebook.

The GDPR’s one-stop-shop mechanism — of review plus the chance for other DPAs to file objections — already added multiple months to the two earlier DPC Big Tech decisions. So a weak DPC draft decision on a late-running investigation looks like becoming a standard procedural lever for decelerating the pace of GDPR enforcement across the EU.

This will only increase pressure for EU lawmakers to agree alternative enforcement structures for the bloc’s growing suite of digital regulations.

In the meanwhile, as DPAs fight it out to try to hit Facebook with a penalty Mark Zuckerberg can’t just laugh off, Facebook gets to continue its lucrative data-mining business as usual — while EU citizens are left asking where are my rights?

Facebook’s lead EU privacy supervisor hit with corruption complaint


Facebook’s problems with European privacy law could be about to get a whole lot worse. But ahead of what may soon be a major (and long overdue) regulatory showdown over the legality of its surveillance-based business model, Ireland’s Data Protection Commission (DPC) is facing a Facebook-shaped problem of its own: It’s now the subject of a criminal complaint alleging corruption and even bribery in the service of covering its own backside (we paraphrase) and shrinking public understanding of the regulatory problems facing Facebook’s business.

European privacy campaign group noyb has filed the criminal complaint against the Irish DPC, which is Facebook’s lead regulator in the EU for data protection.

noyb is making the complaint under Austrian law — reporting the Irish regulator to the Austrian Office for the Prosecution of Corruption (aka WKStA) after the DPC sought to use what noyb terms “procedural blackmail” to try to gag it and prevent it from publishing documents related to General Data Protection Regulation (GDPR) complaints against Facebook.

The not-for-profit alleges that the Irish regulator sought to pressure it to sign an “illegal” non-disclosure agreement (NDA) in relation to a public procedure — its complaint argues there is no legal basis for such a requirement — accusing the DPC of seeking to coerce it into silence, as Facebook would surely wish, by threatening not to comply with its regulatory duty to hear the complainant unless noyb signed the NDA. Which is quite the (alleged) quid pro quo.

noyb has published the letter the DPC sent it (redacting the name/s of the DPC officer/s who put their name/s to the demand). In it, the regulator seeks an agreement to maintain the confidentiality of all material relating to objections by other DPAs — as well as any associated observations by the data controller (Facebook), complainant (noyb et al), DPC or other EU supervisory authorities — vis-a-vis a draft decision on a complaint against Facebook that’s undergoing an active dispute resolution procedure. The DPC’s stated grounds are that “such arrangements are necessary to preserve/maintain free and frank exchanges” and to ensure that “interim views” are not aired, in order to “preserve the confidentiality and integrity of the co-decision-making procedure” — a somewhat circular demand.

“The DPC acknowledges that it has a legal duty to hear us but it now engaged in a form of ‘procedural coercion,'” said noyb chair, Max Schrems, in a statement. “The right to be heard was made conditional on us signing an agreement, to the benefit of the DPC and Facebook. It is nothing but an authority demanding to give up the freedom of speech in exchange for procedural rights.”

The regulator has also demanded noyb remove documents it has previously made public — related to the DPC’s draft decision of a GDPR complaint against Facebook — again without clarifying what legal basis it has to make such a demand.

As noyb points out, it is based in Austria, not Ireland — so is subject to Austrian law, not Irish law. But, regardless, even under Irish law it argues there’s no legal duty for parties to keep documents confidential — pointing out that Section 26 of the Irish Data Protection Act, which was cited by the DPC in this matter, only applies to DPC staff (“relevant person”), not to parties.

“Generally we have very good and professional relationships with authorities. We have not taken this step lightly, but the conduct of the DPC has finally crossed all red lines. They basically deny us all our rights to a fair procedure unless we agree to shut up,” added Schrems.

He went on to warn that “Austrian corruption laws are far-reaching” — and to further emphasize: “When an official requests the slightest benefit to conduct a legal duty, the corruption provisions may be triggered. Legally there is no difference between demanding an unlawful agreement or a bottle of wine.”

All of which looks exceptionally awkward for the Irish regulator. Which already, let’s not forget — at the literal start of this year — agreed to “swiftly” finalize another fractious complaint made by Schrems, this one relating to Facebook’s EU-U.S. data transfers, and which dates all the way back to 2013, following noyb bringing a legal procedure.

(But of course there’s still no sign of a DPC resolution of that Facebook complaint either … So, uhhh, “Siri: Show me regulatory capture” … )

Last month noyb published a draft decision by the DPC in relation to another (slightly less vintage) complaint against Facebook — which suggested the tech giant’s lead EU data regulator intended not to challenge Facebook’s attempt to use an opaque legal switch to bypass EU rules (by claiming that users are actually in a contract with it to receive targeted ads, ergo GDPR consent requirements do not apply).

The DPC had furthermore suggested a wrist-slap penalty of $36 million — for Facebook failing transparency requirements over the aforementioned “ad contract.”

That decision remains to be finalized because — under the GDPR’s one-stop-shop mechanism for deciding cross-border complaints — other EU DPAs have a right to object to a lead supervisor’s preliminary decision and can force a different outcome. Which is what noyb is suggesting may be about to happen vis-a-vis this particular Facebook complaint saga.

Winding back slightly, despite the EU’s GDPR being well over three years old (in technical application terms), the DPC has yet to make a single final finding against Facebook proper.

So far it’s only managed one decision against Facebook-owned WhatsApp — which resulted in an inflated financial penalty for transparency failures by the messaging platform after other EU DPAs intervened to object to a (similarly) low-ball draft sanction Ireland had initially suggested. In the end WhatsApp was hit with a fine of $267 million — also for breaching GDPR transparency obligations. A notable increase on the DPC’s offer of a fine of up to $56 million.

The tech giant is appealing that penalty — but has also said it will be tweaking its privacy policy in Europe in the meanwhile. So it’s a (hard won) win for European privacy advocates — for now.

The WhatsApp GDPR complaint is just the tip, of course. The DPC has been sitting, hen-like, on a raft of data protection complaints against Facebook and other Facebook-owned platforms — including several filed by noyb on the very day the regulation came into technical application all the way back in May 2018.

These “forced consent” complaints by noyb strike at the heart of the headlock Facebook applies to users by not offering them an opt-out from tracking-based advertising. Instead the “deal” Facebook (now known as Meta) offers is a take-it-or-leave-it “choice” — either accept ads or delete your account — despite the GDPR setting a robust standard for what can legally constitute consent that states it must be specific, informed and freely given.

Arm twisting is not allowed. Yet Facebook has been twisting European arms before and since the GDPR all the same.

So the “forced consent” complaints — if they do ever actually get enforced — have the potential to purge the tech giant’s surveillance-based business model once and for all. As, perhaps, does the vintage EU-U.S. data transfers issue. (Certainly it would crank up Facebook’s operational costs if it had to federate its service so that Europeans’ data was stored and processed within the EU to fix the risk of U.S. government mass surveillance.)

However, per the draft DPC decision on the forced consent issue, published (by noyb) last month, the Irish regulator appeared to be preparing to (at best) sidestep the crux question of the legality of Facebook’s data mining, writing in a summary:

There is no obligation on Facebook to seek to rely solely on consent for the purposes of legitimising personal data processing where it is offering a contract to a user which some users might assess as one that primarily concerns the processing of personal data. Nor has Facebook purported to rely on consent under the GDPR.

noyb has previously accused the DPC of holding secret meetings with Facebook around the time it came up with the claimed consent bypass and just as the GDPR was about come into application — implying the regulator was seeking to support Facebook in finding a workaround for EU law.

The not-for-profit also warned last month that if Facebook’s relabelling “trick” (i.e., switching a claim of “consent” to a claim of “contract”) were to be accepted by EU regulators it would undermine the whole of the GDPR — making the much lauded data protection regime trivially easy for data-mining giants to bypass.

Likewise, noyb argues, had it signed the DPC’s demanded NDA it would have “greatly benefited Facebook.”

It would also have helped the DPC by keeping a lid on the awkward detail of lengthy and labyrinthine proceedings — at a time when the regulator is facing rising heat over its inaction against Big Tech, including from lawmakers on home soil. (Some of whom are now pushing for reform of the DPC — including the suggestion that more commissioners should be appointed to remove sole decision-making power from the current incumbent, Helen Dixon.)

“The DPC is continuously under fire by other DPAs, in public inquiries and the media. If an NDA would hinder noyb’s freedom of speech, the DPC’s reputational damage could be limited,” noyb suggests in a press release, before going on to note that had it been granted a benefit by signing an NDA (“in direct exchange for the DPC to conduct its legal duties”) its own staff could have potentially committed a crime under the Austrian Criminal Act.

The not-for-profit instead opted to dial up publicity — and threaten a little disinfecting sunlight — by filing a criminal complaint with the Austrian Office for the Prosecution of Corruption.

It’s essentially telling the DPC to put up a legal defence of its procedural gagging attempts — or, well, shut up.

Here’s Schrems again:

We very much hope that Facebook or the DPC will file legal proceedings against us, to finally clarify that freedom of speech prevails over the scare tactics of a multinational and its taxpayer-funded minion. Unfortunately we must expect that they know themselves that they have no legal basis to take any action, which is why they reverted to procedural blackmail in the first place.

Nor is noyb alone in receiving correspondence from the DPC that’s seeking to apply swingeing confidentiality clauses to complainants.

Following publication of noyb’s criminal complaint, Johnny Ryan, a fellow at the Irish Council for Civil Liberties, tweeted that the ICCL received a “confidentiality demand” from the DPC in relation to a GDPR complaint raised against Google’s adtech — suggesting the regulator is applying the same silence-or-be-removed-from-the-proceeding threat to another complainant against Big Tech.

“Everything I and my lawyers read would be tracked in a ‘data room.’ Otherwise, DPC withholds all materials from us (including Google docs that are already public),” he wrote.

TechCrunch has also reviewed correspondence sent to the Irish regulator earlier this fall by (yet) another complainant — who writes to query its legal basis for a request to gag disclosure of correspondence and draft reports.

Despite repeated requests for clarification by the complainant, the DPC appears to have entirely failed — over the course of more than a month — to reply to the request for its legal basis for making such a gag request.

This suggests noyb’s experience of threats and scare tactics lacking legal substance is not unique — it looks rather more like modus operandi — backing up noyb’s claim that the DPC has questions to answer about “how it conducts its office.”

We’ve reached out to the DPC for comment on the allegations it’s facing.

Update 1: The DPC has responded at length to what it says have been multiple media queries related to noyb’s action. In its remarks, which are written in a Q&A style — ostensibly responding to a number of specific questions it suggests were received from numerous media outlets (none of which were asked by this media outlet, however) — the regulator claims that information related to an ongoing procedure must be kept confidential in order to ensure “fairness to all parties”, which it further describes as a “Constitutional obligation”.

It also writes that it “must balance its obligations to protect confidential information against the complainant’s and [emphasis its] the data controller’s rights to fair procedures.” But it does not specify the legal basis for this claimed ‘balancing obligation’.

On the question of what legal basis the DPC is relying on to demand confidentiality, it writes vaguely that it “draws on its obligations under the GDPR, the Irish Data Protection Act 2018 and its Constitutional obligation to apply fair procedures”.

Later it reiterates its claim that Section 26 of the Data Protection Act “provides that the DPC may designate [emphasis its] information as being confidential so that it must be kept confidential while the inquiry is ongoing” for a number of reasons it already claimed in its letter to noyb — such as wanting to preserve a free and frank exchange; avoid parallel exchanges around an ongoing procedure; and avoid the publication of material that “may reasonably be considered likely to compromise the decision-making process and/or give rise to procedural unfairness and/or cause harm to the interests of the complainant and/or controller”.

Section 26 of the Irish Data Protection Act does deal with prohibitions on the disclosure of confidential information — where clause (1) states: “A relevant person shall not disclose confidential information obtained by him or her while performing functions under this Act or the Data Protection Regulation unless he or she is required or permitted by law, or duly authorised by the Commission, to do so.”

However — as Schrems/noyb already pointed out — the Act goes on to define “a relevant person” as either a Commissioner; a member of staff of the Commission; an authorized officer; any other person engaged under a contract for services by the Commission or a member of the staff of such a person; or a person who has acted in any of those capacities — none of which describes Schrems or noyb.

So here the DPC appears to be attempting to deflect the crux legal question — i.e. on what lawful grounds is the regulator demanding confidentiality around the procedure? — by not engaging with the substance of the legal critique.

In short, it looks like copy-paste bluster.

The DPC’s response includes further misdirection when it makes a reference to the Austria data protection authority — where the GDPR complaint in question was originally filed, before being referred to the Irish DPC under the regulation’s one-stop-shop mechanism for dealing with cross-border cases — writing that the Austrian DPA “held that Mr Schrems was not entitled to sight of documents exchanged between the DPC and its fellow data protection authorities”.

But given this section of the procedure is being undertaken in Ireland by the Irish regulator, not in Austria by the Austrian regulator, it’s not clear what relevance Austrian procedural decisions vis-a-vis process openness have here.

TechCrunch asked Schrems about this point — and he described it as “typical ‘reframing’” by the DPC, accusing the regulator of “deliberately” mixing up two separate issues. Aka, whether a part of the procedure is open to the parties in general (as is the case in Ireland; but not in Austria); and “details about documents within a procedure”.

“The Austrian DPA takes the view that the entire [GDPR] Article 60 cooperation procedure is not open to the parties at all (neither Facebook nor noyb), but only among DPAs. That’s arguable, even when I disagree personally,” Schrems explained. “The DPC takes the view it is open to the parties.

“We urged them to coordinate, but they didn’t. Now we have one DPA that sees the procedure to be open (to Facebook in Ireland) and the other DPA to be closed (that’s ours in Austria).

“IF it would be open to the parties, the Austrians would provide the documents (no doubt about it), they even made USB drives for us. So they [the DPC] deliberately mix up two things: If a part of the procedure is open to the parties in general — and — the details about documents within a procedure… ”

NB: noyb has now responded in detail to the DPC’s claims — see update (1)b below for their full commentary. 

The DPC’s 1,339-word response (which we’ve pasted below in full for reference; see update (1)a) also does not directly address the question of the fairness of removing noyb from the procedure — which its earlier letter threatens to do unless noyb agrees to the confidentiality demand.

Instead the regulator opts to pose the question of “what will happen to the case if Mr Schrems declines to give an undertaking — actionable in the Irish courts — that there will be no more publication of documents”; and whether “the objections phase and the final decision can proceed without him/noyb/the complainant receiving documents”.

“The objections phase at least will proceed as planned,” the DPC writes on that, before equivocating an answer to what will happen after. “What happens at any later stage will depend on a number of factors to include the outcome of the consultation process as between the DPC and the other data protection authorities, but also on whether Mr Schrems’ maintains his present position that he must be given access to all materials on the basis that it will be for Mr Schrems alone to decide what (if anything) he may publish or use, and retaining the right to change his position at his sole election and at any time of his choosing.”

The DPC concludes its screed by observing: “Ultimately, noyb will also have a right of appeal against the final decision delivered at the end of the co-decision-making procedure” — which does not in any way compensate for an unfair procedure.

But it does, perhaps, sound like a regulator that’s very comfortable with legal challenges — and may even be anticipating an additional layer of court action down the line, i.e. related to the DPC withholding documents from the complainant (when, presumably, it’s not withholding the same stuff from Facebook… Fairness eh!).

TechCrunch’s view, after examining the DPC’s response, is that there is nothing here to prevent a reasonable observer concluding that the bulk of the regulator’s ‘sweating toil’ is actually aimed at generating obfuscating friction (and even suggestive fictions) — which in turn is only likely to build in fresh delays that slow down regulatory procedures and, ultimately, further retard enforcement against tech giants like Facebook. But do take a good 15-20 minutes of your own time for a close reading of the DPC’s remarks (below) to make up your own mind.

But what about Facebook? noyb’s press release goes on to predict a “tremendous commercial problem” looming for the data-mining giant — as it says DPC correspondence “shows that other European DPAs have submitted ‘relevant and reasoned objections’ and oppose the DPC’s view” [i.e., in the consent bypass complaint against Facebook].

“If the other DPAs have a majority and ultimately overturn the DPC’s draft decision, Facebook could face a legal disaster, as most commercial use of personal data in the EU since 2018 would be retroactively declared illegal,” noyb suggests, adding: “Given that the other DPAs passed Guidelines in 2019 that are very unfavourable to Facebook’s position, such a scenario is highly likely.”

The not-for-profit has more awkward revelations for the DPC and Facebook in the pipe, too.

It says it’s preparing fresh document releases in the coming weeks — related to correspondence from the DPC and/or Facebook — as a “protest” against attempts to gag it and to silence democratic debate about public procedures.

“On each Sunday in advent, noyb will publish another document, together with a video explaining the documents and an analysis why the use of these documents is fully compliant with all applicable laws,” it notes, adding that what it’s billing as the “advent reading” will be published on noyb.eu: “so tune in!”

So looks like the next batch of “Facebook Papers” that Meta would really rather you didn’t see will be dropping soon.


Update (1)a: Here’s the DPC’s response to our request for comment in full (NB: We’ve redacted the name of the DPC officer which was appended to the end of the text):

“Thank you for your recent media query to this office. The DPC has had numerous queries on the matter and so we have prepared a compilation of the answers in the hope that the information will be of assistance to you.

1. There seems to be a standoff between the original Austrian complainant and the DPC over confidentiality of documents. In the one-stop shop arrangements for GDPR, in a dispute like this, which jurisdiction has primacy: where the case was filed, or where it is being processed?

Under the GDPR, because the relevant data controller (in this case, Facebook Ireland) has its main establishment in Ireland, the Irish DPC is what is called the “lead supervisory authority” and so has the obligation to investigate and make a preliminary decision about the issues raised in the complaint. The Austrian data protection supervisory authority referred the complaint to the Irish DPC on this basis. Once we have reached a “draft decision” (which is how our proposed decision is referred to under Article 60 of the GDPR), it is then sent to and considered by our colleagues in the data protection authorities in the other EU member states as part of a co-decision-making procedure. Following this process, the Irish DPC reaches a final decision on the complaint reflecting either the consensus achieved amongst data protection authorities or, where differences arise between them which cannot be reconciled, a decision of the European Data Protection Board following a dispute resolution procedure.

The Irish DPC is obliged to follow Irish fair procedures law as part of our decision-making process. These fair procedures obligations have been confirmed on several occasions by the Irish courts, including the Supreme Court.

One of the considerations here is that, as a matter of fairness to all parties, the integrity of the inquiry process should be respected and the confidentiality of information exchanged between the parties upheld. What we mean by this is that it would be unfair to any party under investigation by a regulator (not just the DPC) if the materials that they provide to that regulator, and the regulator’s queries to and correspondence with them, should be made public before any decision is reached in relation to the matters that are under investigation. This would effectively mean an investigation against anybody would be turned into an open, public process before any decision is reached against them, and this is not fair nor has it ever been a feature of regulation in Ireland up to now.

Reflecting these sorts of considerations, Section 26 provides that the DPC may designate information as being confidential so that it must be kept confidential while the inquiry is ongoing. The reasons why information is designated as confidential include the following:

– to preserve/maintain free and frank exchanges between the DPC and each of the complainant and the controller, facilitating the kind of dialogue (and associated information flows) necessary to ensure that all of the issues under examination can be fully and effectively explored, and positions advanced by relevant parties fully and properly tested;

– to ensure that the issues under examination can be addressed within the confines of the decision-making process itself, and to reduce the scope for parallel exchanges taking place outside that process; and,

– to avoid the publication (or other disclosure to third parties) of exchanges identifying interim views and/or positions that remain under consideration by the DPC and which, if disclosed prior to the conclusion of the decision-making process, may reasonably be considered likely to compromise the decision-making process and/or give rise to procedural unfairness and/or cause harm to the interests of the complainant and/or controller, as the case may be.

It is of note here that both the Irish and Austrian data protection authorities agree that neither the complainant nor the controller have a right to participate in the consultation process that forms a key part of the co-decision-making procedure described above. From there, the Austrian DPA held that Mr Schrems was not entitled to sight of documents exchanged between the DPC and its fellow data protection authorities.

For its part, the DPC believes that the parties should be given sight of such materials, provided only that they agree to treat them as confidential within the decision-making process

2. According to noyb/Schrems, the Austrian DPA says there is no confidentiality clause covering such procedural documents. You say in your letters to noyb that there is a confidentiality clause.

As noted, the Austrian SA has made it clear on two separate occasions now that it did not consider that Mr Schrems was entitled to sight of documents exchanged between the DPC and its fellow data protection authorities in the course of the co-decision-making procedure.

It has also expressed the view to the DPC that Mr Schrems would not have been entitled to the draft decision and accordingly its publication on foot of the equivalent Austrian process could not arise.

The DPC’s position is as outlined under point 1.

a. Does the DPC draw on legislation outside the 2018 data protection act regarding confidentiality of procedures? And, if so, where?

The Irish DPC draws on its obligations under the GDPR, the Irish Data Protection Act 2018 and its Constitutional obligation to apply fair procedures (as set out above).

b. NOYB says the paragraphs of the 2018 act the DPC cites apply only to a “relevant person” which includes DPC employees and contractors. Is this correct, or is there another section of the act that applies to parties in a complaint, too?

One of the legal obligations on the Irish DPC is under Section 26 of the Data Protection Act 2018. This requires that “relevant persons” (which include officers of the DPC) must not disclose confidential information, unless this is required (for example, by fair procedures obligations, as explained above) or is permitted by law.

Even then, however, the DPC must balance its obligations to protect confidential information against the complainant’s and the data controller’s rights to fair procedures.

In practical terms, the DPC is bound to take all reasonable steps to ensure that the confidentiality of such material is upheld in its own hands but also when it passes to the hands of a third party.

To put it another way, the DPC can’t comply with its obligation to protect the confidentiality of material in its own hands, if it then passes that same material to a third party, without restriction, knowing or reasonably believing there is a strong likelihood the third party will publish it

3. What happens to the case if Mr Schrems declines to give an undertaking – actionable in the Irish courts – that there will be no more publication of documents? Can the objections phase and the final decision proceed without him/NOYB/the complainant receiving documents

As flagged above, neither the complainant nor the controller are afforded an active role in the co-decision-making procedure described briefly above, save to the extent that, for reasons derived from Irish procedural law, the DPC takes steps to afford the complainant and controller a right to see the objections and to make written observations if any adjustments are proposed to the current iteration of the draft decision. As such, the objections phase at least will proceed as planned. What happens at any later stage will depend on a number of factors to include the outcome of the consultation process as between the DPC and the other data protection authorities, but also on whether Mr Schrems’ maintains his present position that he must be given access to all materials on the basis that it will be for Mr Schrems alone to decide what (if anything) he may publish or use, and retaining the right to change his position at his sole election and at any time of his choosing.

Ultimately, NOYB will also have a right of appeal against the final decision delivered at the end of the co-decision-making procedure.”

Update (1)b: Noyb has now sent a detailed rebuttal of the DPC’s response — which we’re also publishing in full below.

NB: Here the DPC’s source text is presented in quotation marks and formatted in italics; while noyb’s responses are presented below in bold to distinguish between them. Further note: noyb did not respond to the DPC’s response to question 3 — so we have excluded repeating that chunk of text:

DPC: “1. There seems to be a standoff between the original Austrian complainant and the DPC over confidentiality of documents. In the one-stop shop arrangements for GDPR, in a dispute like this, which jurisdiction has primacy: where the case was filed, or where it is being processed?”

Note from noyb: There is nothing in the answers below that answers the question in the headline about applicable procedural laws.

DPC: “Under the GDPR, because the relevant data controller (in this case, Facebook Ireland) has its main establishment in Ireland, the Irish DPC is what is called the “lead supervisory authority” and so has the obligation to investigate and make a preliminary decision about the issues raised in the complaint.  The Austrian data protection supervisory authority referred the complaint to the Irish DPC on this basis.” 

Note from noyb: This is correct, but what is left out, is that the DPAs have to “coordinate” under Article 60(1) GDPR and that each DPA applies its own procedural law in such a case. So there is a “One Stop Shop” for the controller and the complainant, in their local language and under the local procedure.

DPC: “Once we have reached a “draft decision” (which is how our proposed decision is referred to under Article 60 of the GDPR), it is then sent to and considered by our colleagues in the data protection authorities in the other EU member states as part of a co-decision-making procedure.  Following this process, the Irish DPC reaches a final decision on the complaint reflecting either the consensus achieved amongst data protection authorities or, where differences arise between them which cannot be reconciled, a decision of the European Data Protection Board following a dispute resolution procedure.

The Irish DPC is obliged to follow Irish fair procedures law as part of our decision-making process.  These fair procedures obligations have been confirmed on several occasions by the Irish courts, including the Supreme Court.” 

Note from noyb: This is correct, but there is no mention about what “fair procedure obligations” exactly were confirmed by what court case. In fact there is not a single case that would provide for confidentiality before the DPC. We have asked for a legal basis in the law or in case law, but the DPC is silent on this. Just saying “some court said something about fair procedures” is not a basis to demand NDAs from parties or kick them out of the procedure (in fact it’s the opposite of a “fair procedure”).

DPC: “One of the considerations here is that, as a matter of fairness to all parties, the integrity of the inquiry process should be respected and the confidentiality of information exchanged between the parties upheld.  What we mean by this is that it would be unfair to any party under investigation by a regulator (not just the DPC) if the materials that they provide to that regulator, and the regulator’s queries to and correspondence with them, should be made public before any decision is reached in relation to the matters that are under investigation.  This would effectively mean an investigation against anybody would be turned into an open, public process before any decision is reached against them, and this is not fair nor has it ever been a feature of regulation in Ireland up to now.”

Note from noyb: This is Facebook’s position, but in fact public debate and criticism (especially when it comes to the data protection right of millions) in a democratic society cannot be limited to after a decision is made. In fact, it is crucial that parties and the public can form an opinion during a decision process. As a default political, regulatory or court procedures are therefore open to the public – unless there are serious grounds to limit information. The DPC take the view that by default the public and the parties may not voice concerns or just get informed about a procedure before it is too late. What comes in addition to that, is that the DPC is extremely complicated and slow in the decision process. The pending case lasts for about 3.5 years by now. Usually such decisions are shorter and the room for public debate is therefore more limited. In the “EU-US data transfer” case, the investigation is ongoing for more than 8 years. The public would never have been informed about the background of two CJEU decisions, if such “fairness” rules would have continuously applied since 2013.

DPC: “Reflecting these sorts of considerations, Section 26 provides that the DPC may designate information as being confidential so that it must be kept confidential while the inquiry is ongoing. The reasons why information is designated as confidential include the following:”

Note from noyb: This is incorrect. Section 26 does not have the word “designate” in it. It does not allow the DPC to (one-sidedly) just decide what is “confidential” or not. Instead there is an objective test to be applied, which may be contested by the parties, because the DPC’s view may go too far or not far enough. It is not an absolute right by the DPC to just “declare” things to be confidential.

DPC: “– to preserve/maintain free and frank exchanges between the DPC and each of the complainant and the controller, facilitating the kind of dialogue (and associated information flows) necessary to ensure that all of the issues under examination can be fully and effectively explored, and positions advanced by relevant parties fully and properly tested;

– to ensure that the issues under examination can be addressed within the confines of the decision-making process itself, and to reduce the scope for parallel exchanges taking place outside that process; and,

– to avoid the publication (or other disclosure to third parties) of exchanges identifying interim views and/or positions that remain under consideration by the DPC and which, if disclosed prior to the conclusion of the decision-making process, may reasonably be considered likely to compromise the decision-making process and/or give rise to procedural unfairness and/or cause harm to the interests of the complainant and/or controller, as the case may be.”

Note from noyb: This is incorrect. Section 26 does not name any of these elements. They are completely made up by the DPC.

DPC: “It is of note here that both the Irish and Austrian data protection authorities agree that neither the complainant nor the controller have a right to participate in the consultation process that forms a key part of the co-decision-making procedure described above. From there, the Austrian DPA held that Mr Schrems was not entitled to sight of documents exchanged between the DPC and its fellow data protection authorities.”

Note from noyb: this is misleading – the Austrian DPA in fact only takes the view that the cooperation process under Article 60(3) to (5) GDRP is not open the (both) parties. The DPC instead explicitly says that both parties have a right to be heard in its letters. We urged both DPAs to come to consensus, but it seems they were unable to reach such a consensus. There is now a situation where the Irish DPA takes the view that there is a role for the parties, but that documents are secret and the Austrian DPA takes the view that there is no role for the parties, but if there would be a role, § 17 AVG make the documents useable for anyone. Bottom line is: Facebook will be heard and noyb will not.

DPC: “For its part, the DPC believes that the parties should be given sight of such materials, provided only that they agree to treat them as confidential within the decision-making process”

Note from noyb: There is no basis for such a conclusion. In fact, the DPC itself may violate Section 26 if it shares “confidential” documents with the parties, as Section 26 is absolute in the consequences. The reality is that Section 26 is binary: If it is “confidential” it has to stay within the DPC, if it is not “confidential” it may be shared with external parties, who are themselves not subject to Section 26.

DPC: “2. According to noyb/Schrems, the Austrian DPA says there is no confidentiality clause covering such procedural documents. You say in your letters to noyb that there is a confidentiality clause.
As noted, the Austrian SA has made it clear on two separate occasions now that it did not consider that Mr Schrems was entitled to sight of documents exchanged between the DPC and its fellow data protection authorities in the course of the co-decision-making procedure.”

Note from noyb: This is incorrect and/or misleading. The Austrian DPA took the view that this entire process is not open to the parties (neither the complainant nor Facebook), so it does not fall under the right to access to documents (independent of the documents being confidential or not). The DPC take the opposite view, that the process is open to the parties, but the documents are confidential. The DPAs were unable to agree on a joint position.

DPC: “It has also expressed the view to the DPC that Mr Schrems would not have been entitled to the draft decision and accordingly its publication on foot of the equivalent Austrian process could not arise.”

Note from noyb: This is absolutely incorrect. The Austrian DPA never said that. They even provided us with a USB drive with all the documents of the procedure. § 17 AVG is binary: Once you get the documents, they are free. See for example the Austrian Supreme Administrative Court (VwGH 22. 10. 2013, 2012/10/0002; VwGH 21. 2. 2005, 2004/17/0173; Rz 5).

DPC: “The DPC’s position is as outlined under point 1.
a. Does the DPC draw on legislation outside the 2018 data protection act regarding confidentiality of procedures? And, if so, where?
The Irish DPC draws on its obligations under the GDPR, the Irish Data Protection Act 2018 and its Constitutional obligation to apply fair procedures (as set out above).”

Note from noyb: The GDPR has 99 Articles, the Irish Data Protection Act has hundreds of Sections and “Constitutional obligations” are not any clear framework for such a specific question. In fact the DPC cannot point to any specific provision, because there are none.

DPC: “b. NOYB says the paragraphs of the 2018 act the DPC cites apply only to a “relevant person” which includes DPC employees and contractors. Is this correct, or is there another section of the act that applies to parties in a complaint, too?

One of the legal obligations on the Irish DPC is under Section 26 of the Data Protection Act 2018.  This requires that “relevant persons” (which include officers of the DPC) must not disclose confidential information, unless this is required (for example, by fair procedures obligations, as explained above) or is permitted by law. Even then, however, the DPC must balance its obligations to protect confidential information against the complainant’s and the data controller’s rights to fair procedures.”

Note from noyb: This is not in the law or any case law and just made up.

DPC: “In practical terms, the DPC is bound to take all reasonable steps to ensure that the confidentiality of such material is upheld in its own hands but also when it passes to the hands of a third party.”

Note from noyb: This is not in the law or any case law and just made up.

DPC: “To put it another way, the DPC can’t comply with its obligation to protect the confidentiality of material in its own hands, if it then passes that same material to a third party, without restriction, knowing or reasonably believing there is a strong likelihood the third party will publish it”

Note from noyb: This “conflict” is not really existing. The DPC has in fact blackened any documents that it considered “sensitive” or somehow protected. The rest is simply not falling under Section 26 and therefore there is no need to “balance”. The conflict that the DPC tries to generate here, is just because it declares even the most trivial email as “confidential”.”

This report has been updated with a link to the DPC’s letter to noyb; with Johnny Ryan’s confirmation of another confidentiality demand by the regulator in its complaint against Google’s adtech; with comment from the DPC and our analysis of its claims, including additional comment from Schrems; and with noyb’s detailed rebuttal of the DPC’s commentary

EU warns adtech giants over ‘legal tricks’ as it moots changes to centralize privacy oversight


The European Commission has given its clearest signal yet that it’s prepared to intervene over weak enforcement of the EU’s data protection rules against big tech.

Today the bloc’s executive also had a warning for adtech giants Google and Facebook — accusing them of choosing “legal tricks” over true compliance with the EU’s standard of “privacy by design” — and emphasizing the imperative for them to take data protection “seriously”.

Speaking at a privacy conference this morning, Vera Jourová, the EU’s commissioner for values and transparency, said enforcement of the General Data Protection Regulation (GDPR) at a national level must buck up — and become “effective” — or else it “will have to change”, warning that any “potential changes” will move toward centralized enforcement.

“When I was looking at existing enforcement decisions and pending cases, I also came to another conclusion,” she also said.  “So, we have penalties or decisions against Google, Facebook, WhatsApp.

“To me this means that clearly there is a problem with compliance culture among those companies that live off our personal data. Despite the fact that they have the best legal teams, presence in Brussels and spent countless hours discussing with us the GDPR. Sadly, I fear this is not privacy by design.

“I think it is high time for those companies to take protection of personal data seriously. I want to see full compliance, not legal tricks. It’s time not to hide behind small print, but tackle the challenges head on.”

In parallel, an influential advisor to the bloc’s top court today published an opinion stating that EU law does not preclude consumer protection agencies from bringing representative actions at a national level, following a referral by a German court in a case against Facebook Ireland. If the CJEU’s judges agree, that could open up a fresh wave of challenges to tech giants’ misuse of people’s data — without the need to funnel complaints through the single point of failure of gatekeeper regulators like Ireland’s Data Protection Commission (DPC).

Towards centralized privacy oversight?

On paper, EU law provides people in the region with a suite of rights and protections attached to their data. And while the regulation has attracted huge international attention, as other regions grapple with how to protect people in an age of data-mining giants, the problem for many GDPR critics, as it stands, is that the law decentralizes oversight of these rules and rights to a patchwork of supervisory agencies at the EU Member State level.

While this can work well for cases involving locally bounded services, major problems arise where complaints span borders within the EU — as is always the case with tech giants’ (global) services. This is because a one-stop-shop (OSS) mechanism kicks in, ostensibly to reduce the administrative burden for businesses.

But it also enables a huge get-out clause for tech giants, allowing them to forum shop for a ‘friendly’ regulator through their choice of where to locate their regional HQ. And working from a local EU base, corporate giants can use investment and job creation in that Member State as a lever to work against and erode national political will to press for vigorous oversight of their European business at the local authority level.

“In my view, it does take too long to address the key questions around processing of personal data for big tech,” said Jourová giving a keynote speech to the Forum Europe data protection & privacy conference. “Yes, I understand the lack of resources. I understand there is no pan-European procedural law to help the cross-border cases. I understand that the first cases need to be rock-solid because they will be challenged in court.

“But I want to be honest — we are in the crunch time now. Either we will all collectively show that GDPR enforcement is effective or it will have to change. And there is no way back to decentralised model that was there before the GDPR. Any potential changes will go towards more centralisation, bigger role of the EDPB [European Data Protection Board] or Commission.”

Jourová added that the “pressure” to make enforcement effective “is already here” — pointing to debate around incoming legislation that will update the EU’s rules around ecommerce, and emphasizing that, on the Digital Services Act, Member States have been advocating for enforcement change — and “want to see more central role of the European Commission”.

Point being that if there’s political will for structural changes to centralize EU enforcement among Member States, the Commission has the powers to propose the necessary amendments — and will hardly turn its nose up at being asked to take on more responsibility itself.

Jourová’s remarks are a notable escalation from her approach to the thorny issue of GDPR enforcement back in summer 2020 — when, at the two-year review mark of the regulation entering into application, she was still talking about the need to properly resource DPAs — in order that they could “step up their work” and deliver “vigorous but uniform enforcement”, as she put it then.

Now, in the dying days of 2021 — with a still massive backlog of decisions yet to be issued around cross-border cases, some of which are highly strategic, targeting adtech platforms’ core surveillance business model (Jourová’s speech, for example, noted that 809 procedures related to the OSS have been triggered but only 290 Final Decisions have been issued) — the Commission appears to be signalling that it’s finally running out of patience on enforcement.

And that it is already eyeing a Plan B to make the GDPR truly effective.

Criticism of weak enforcement against tech giants has been a rising chorus in Europe for years. Most recently, frustration with regulatory inaction led privacy campaigner Max Schrems’ not-for-profit, noyb, to file a complaint of criminal corruption against the GDPR’s most infamous bottleneck: Ireland’s DPC, accusing the regulator of engaging in “procedural blackmail” which it suggested would help Facebook by keeping key developments out of the public eye, among other eyebrow-raising charges.

The Irish regulator has faced the strongest criticism of all the EU DPAs over its role in hampering effective GDPR enforcement.

It is not the only authority to be accused of creating a bottleneck, though — letting major complaints pile up on its desk and taking an age to investigate them and issue decisions (assuming it opens an investigation at all).

The UK’s ICO — when the country was still in the EU — did nothing about complaints against real-time bidding’s abuse of people’s data, for example, despite sounding a public warning over behavioral ads’ unlawfulness as early as 2019. Belgium’s DPA, meanwhile, has taken an inordinate amount of time to issue a final decision on the failure of IAB Europe’s TCF to comply with the GDPR. But Ireland’s central role in regulating most of big tech means it attracts the most flak.

The sheer number of tech giants that have converged on Ireland — wooed by low corporate tax rates (likely with the added cherry of business-friendly data oversight) — gives it an outsized role in overseeing what’s done with Europeans’ data.

Hence Ireland has open investigations into Apple, Google, Facebook and many others — yet has only issued two final decisions on cross-border cases so far (Twitter last year; and WhatsApp this year).

Both of those decisions went through a dispute mechanism that’s also baked into the GDPR — which kicks in when other EU DPAs don’t agree with a draft decision by the lead authority.

That mechanism further slowed down the DPC’s enforcement in those cases — but substantially cranked up the intervention the two companies ultimately faced. Ireland had wanted to be far more lenient than the collective verdict that emerged once all of the bloc’s oversight bodies had had their say.

That too, critics say, demonstrates the DPC’s regulatory capture by platform power.

An opinion piece in yesterday’s Washington Post skewered the DPC as “the wrong privacy watchdog for Europe” — citing a study by the Irish Council for Civil Liberties that found it had only published decisions on about 2% of the 164 cross border cases it has taken on.

The number of complaints the DPC has chosen to entirely ignore — i.e. by not opening a formal investigation — or else to quietly shutter (“resolve”) without issuing a decision or taking any enforcement action is likely considerably higher. 

The agency is shielded by a very narrow application of Freedom of Information law, which applies only in relation to DPC records pertaining to the “general administration” of its office. So when TechCrunch asked the DPC, last December, how many times it had used GDPR powers such as the ability to order a ban on processing, it declined to respond to our FOIs — arguing the information did not fall under Ireland’s implementation of the law.

Silence and stonewalling only go so far, though.

Calls for root and branch reform of the DPC specifically, and enforcement of the GDPR more generally, can now be heard from Ireland’s own parliament all the way up to the European Commission. And big tech’s game of tying EU regulators in knots looks as if it’s — gradually, gradually — getting toward the end of its rope.

What comes next is an interesting question. Last month the European Data Protection Supervisor (EDPS) announced a conference on the future of “effective” digital enforcement — which will take place in June 2022 — and which he said would discuss best practice and also “explore alternative models of enforcement for the digital future”.

“We are ambitious,” said Wojciech Wiewiorowski as he announced the conference. “There is much scope for discussion and much potential improvement on the way current governance models are implemented in practice. We envisage a dialogue across different fields of regulation — from data protection to competition, digital markets and services, and artificial intelligence as well — both in the EU, and Europe as a continent, but also on the global level.”

Discussion of “different” and “alternative” models of enforcement will be a focus of the event, per Wiewiorowski — who further specified that this will include discussion of “a more centralized approach”. So the EDPS and the Commission appear to be singing a similar tune on reforming GDPR enforcement.

As well as the Commission itself (potentially) taking on an enforcement role in the future — perhaps specifically on major, cross-border cases related to big tech, in order to beef up the GDPR’s application against the most powerful offenders (as is already proposed in the case of the DSA and enforcing those rules against ‘very large online platforms’, aka VLOPs) — the GDPR steering and advisory body, the EDPB, also looks set to play an increasingly strategic and important role.

Indeed, it already has a ‘last resort’ decision making power to resolve disputes over cross border GDPR enforcement — and Ireland’s intransigence has led to it exercising this power for the first time.

In the future, the Board’s role could expand further if EU lawmakers decide that more centralization is the only way to deliver effective enforcement against tech giants that have become experts in exhausting regulators with bad faith arguments and whack-a-mole procedures, in order to delay, defer and deny compliance with European law.

The EDPB’s chair, Andrea Jelinek, was also speaking at the Forum Europe conference today. Asked for her thoughts on how GDPR enforcement could improve, including problematic elements like the OSS, she cautioned that change will be a “long term project”, while simultaneously agreeing there are notable “challenges” at the point where national oversight intersects with the needs of cross border enforcement.

“Enforcing at a national level and at the same time resolving cross border cases is time and resource intensive,” she said. “Supervisory authorities need to carry out investigations, observe procedural rules, coordinate and share information with other supervisory authorities. For the current system to work properly it is of vital importance that supervisory authorities have enough resources and staff.

“The differences in national administrative procedures and the fact that in some Member States no deadlines are foreseen for handling a case also creates an obstacle to the efficient functioning of the OSS.”

Jelinek made a point of emphasizing that the EDPB has been taking action to try to remedy some of the issues identified — implementing what she described as “a series of practical solutions” to tackle problems around enforcement.

She said this has included developing (last year) a co-ordinated enforcement framework to facilitate joint actions (“in a flexible and coordinated manner”) — such as launching enforcement sweeps and joint investigations.

The EDPB is also establishing a pilot project to provide a pool of experts to support investigations and enforcement activities “of significant common interest”, she noted, predicting: “This will enhance the cooperation and solidarity between all the supervisory authorities by addressing their operational needs.”

“Finally we should not forget that the GDPR is a long term project and so is strengthening cooperation between supervisory authorities,” she added. “Any transformation of the GDPR will take years. I think the best solution is therefore to deploy the GDPR fully — it is likely that most of the issues identified by Member States and stakeholders will benefit from more experience in the application of the regulation in the coming years.”

However it is already well over three years since GDPR came into application. So many EU citizens may query the logic of waiting years more for regulators to figure out how to jointly work together to get the job of upholding people’s rights done. Not least because this enforcement impasse leaves data-mining tech giants free to direct their vast data-enabled wealth and engineering resource at developing new ‘innovations’ — to better evade legal restrictions on what they can do with people’s data.

One thing is clear: The next wave of big tech regulatory evasion will come dressed up in claims of privacy “innovation” from the get-go.

Indeed, that is already how adtech giants like Google are trying to re-channel regulators’ attention from enforcing against their core attention-manipulation, surveillance-based business model.

Google SVP Kent Walker also took to the (virtual) conference stage this morning for a keynote slot in which he argued that the novel ad targeting technologies Google is developing under its “Privacy Sandbox” badge (such as FLoC, aka Federated Learning of Cohorts) will provide the answer to what big (ad)tech likes to claim is an inherent tension between European fundamental rights like privacy and economic growth.

The truth, as ever, is a lot more nuanced than that. For one thing, there are plenty of ways to target ads that don’t require processing people’s data. But as most of Europe’s regulators remain bogged down in a mire of corporate capture, under-resourcing, cultural cowardice/risk aversion, internecine squabbles and, at times, a sheer lack of national political will to enforce the law against the world’s wealthiest companies, the adtech duopoly is sounding cockily confident that it will be allowed to carry on and reset the terms of the game in its own interests once again.

(The added irony here is that Google is currently working under the oversight of the UK’s Competition and Markets Authority and ICO on shaping behavioral remedies attached to its Sandbox proposals — and has said that these commitments will be applied globally if the UK is minded to accept them; which does risk tarnishing the GDPR’s geopolitical shine, given the UK is no longer a member of the EU… )

For EU citizens, it could well mean that — once again — it’s up to the CJEU to come to the rescue of their fundamental rights — assuming the court ends up concurring with advocate general Richard de la Tour’s opinion today that the GDPR:

” … does not preclude national legislation which allows consumer protection associations to bring legal proceedings against the person alleged to be responsible for an infringement of the protection of personal data, on the basis of the prohibition of unfair commercial practices, the infringement of a law relating to consumer protection or the prohibition of the use of invalid general terms and conditions, provided that the objective of the representative action in question is to ensure observance of the rights which the persons affected by the contested processing derive directly from that regulation.”

Consumer protection agencies being able to pursue representative legal actions to defend fundamental rights against tech giants’ self interest — at the Member State level, and therefore, all across the EU — could actually unblock GDPR enforcement via a genuinely decentralized wave of enforcement that’s able to route around the damage of captured gatekeepers and call out big adtech’s manipulative tricks in court.

Facebook’s internal assessment of EU-US data transfers shows it has no legal leg to stand on, says noyb


In its latest (and last) pre-Christmas document reveal, European privacy advocacy group noyb has published details of an 86-page internal assessment by Facebook of its (continued) transfers of Europeans’ personal data to the U.S. — and the resulting conclusion can best be summed up as “The Emperor, Mark Zuckerberg, Has No Clothes”.

The convoluted backstory here is that Facebook’s transfers of EU users’ data to the U.S. remain ongoing — in spite of two rulings by the bloc’s top court finding the U.S. is a risky jurisdiction for such data (aka Schrems I and Schrems II); and a preliminary order by Facebook’s lead EU DPA, over a year ago, saying it must suspend EU-U.S. transfers in the wake of the aforementioned Schrems II ruling.

And if that wasn’t enough, it’s also almost a year since Facebook’s lead EU DPA, the Irish Data Protection Commission (DPC), settled a legal challenge from noyb — agreeing last January to “swiftly” finalize the complaint in question.

Yet there’s still no final decision from Ireland on the legality of Facebook’s EU-U.S. data transfers — some 8.5 years after the complaint was first filed by noyb founder and chair, Max Schrems (noyb didn’t even exist when he filed this complaint!).

Asked whether a decision on Facebook’s data transfers will — at long, long last — be issued this year, the DPC’s deputy commissioner, Graham Doyle, told us the inquiry is “fairly well progressed at this stage” but he admitted it will not be finalized in the next few weeks.

Asked if a decision will be issued in January, Doyle ducked specifying a time frame — saying that the DPC is unsure “exactly when” the decision will be made.

So perhaps 2022 will — finally — be the year of reckoning for Facebook.

But, if not, 2022 may well be a year of substantial reckoning for the Irish DPC, which is now facing intense scrutiny over the sedate pace and convoluted form of its enforcements in major cases against tech giants like Facebook.

The European Commission warned earlier this month that unless “effective” enforcement arrives soon it will step in and move the bloc toward a system of centralized oversight.

So the message from EU lawmakers to DPAs such as Ireland (and, really, especially to Ireland) is simple: Use your enforcement powers soon — or you’ll lose them.

Returning to Facebook, if an EU data transfer suspension order does ever actually get enforced, the tech giant faces having to make drastic changes to its infrastructure and/or its business model.

Or it could even shut down service in Europe — a possibility Facebook has floated in an earlier legal submission — although its chief spin doctor, Nick Clegg, quickly denied it would ever actually do that.

Facebook and Clegg have preferred to resort to economic scare tactics to lobby the bloc’s lawmakers against enforcing the rule of law against the nation-state-sized data-mining empire — suggesting that any suspension order against Facebook’s data flows would wreak economic damage on European SMEs that use its ad tools to target consumers.

It’s a classic Big Tech tactic to lobby against tighter regulation of its own market power by claiming that limits on its operations will be far more damaging for the smaller businesses that rely on powerful platforms to reach potential buyers.

The adtech industry also likes to imply that you can either have privacy or competition, not both.

However, on that front, regional competition authorities are becoming increasingly sophisticated in their assessment of adtech platform power — including understanding how data abuse by tech giants can itself be a lever to lock in market power. (See, for example Germany’s Federal Cartel Office’s antitrust case against Facebook’s consentless superprofiling of users.)

So how much runway such self-serving framing has left, as the bloc hastens to pass ex ante rules to boss tech giants, is up for debate.

Facebook has managed to use the courts to defer a final countdown on its data transfers issues for years. But its business model is now under attack on multiple fronts — with the European Parliament, for example, pushing for tighter restrictions on behavioral ads and an outright ban on dark patterns in the Digital Markets Act.

In recent weeks, noyb has also been shining more disinfecting sunlight onto the EU’s enforcement failures — where Facebook is concerned — by protesting at being removed from an ongoing procedure against it by the Irish DPC, after the regulator tried to get it to sign a gag order in exchange for remaining a party to the proceeding.

The DPC has been accused of acting in Facebook’s interests in trying to keep procedural documents confidential without a valid legal basis for ordering third parties not to publish information related to ongoing procedures.

(And other pre-Christmas document-reveals by noyb have made especially awkward reading for the DPC — which can be seen apparently trying to insert a notorious Facebook GDPR consent bypass tactic into European Data Protection Board (EDPB) guidance — by arguing for allowing T&Cs to be laundered via contract clause — and getting roundly slapped back by other EU DPAs.)

Last month, the not-for-profit also took the further step of filing a complaint of criminal corruption against the DPC — in another sign of how frustrated European privacy campaigners have gotten at inaction against rights-trampling tech giants.

As noted above, despite a complaint that dates back to the Snowden disclosures, two landmark CJEU rulings and countless court challenges, Facebook continues to pass Europeans’ data to the U.S. — as if the rule of law can’t touch it.

Yet, back in May, the company lost in the Irish High Court after trying (and failing) to challenge the DPC’s procedure; including by arguing the DPC was being too hasty and did not properly investigate before it sent the preliminary suspension order. (NB: The original complaint dates back to June 2013 so it’s fast approaching a decade old at this point.)

Details of Facebook’s Transfer Impact Assessment (TIA) revealed by noyb yesterday are long on claimed justifications for Facebook to ignore the CJEU — and short on substantive arguments to stand up Facebook’s claim that it’s totally not a problem for it to continue to take Europeans’ data to the U.S. for processing, despite the CJEU ruling that doing so carries huge legal implications.

The CJEU has — not once, but twice — struck down flagship transfer agreements between the EU and the U.S. on the grounds that U.S. surveillance law is in fatal conflict with European privacy rights.

And while, back in July 2020, the court did allow the possibility that data can be legally moved out of the EU to third countries, it made it clear that DPAs must step in and suspend data flows where they suspect people’s information is going somewhere where it’s at risk.

Given the court simultaneously struck down the EU-U.S. Privacy Shield, the U.S. was clearly identified as a problem third country.

Add to that, Facebook has the additional problem of its data processing being subject to U.S. surveillance law (via NSA programs like PRISM). So there’s no easy fix for Facebook’s EU data transfers, as we’ve said before.

However, having a friendly regulator that doesn’t rush to do anything about really obvious problems is sure to help, though…

In a statement accompanying its publication of details of Facebook’s TIA, Schrems said: “Facebook has been ignoring EU law for 8.5 years now. The newly released documents show that they simply take the view that the Court of Justice is wrong — and Facebook is right. It is an unbelievable ignorance of the rule of law, supported by the lack of enforcement action by the Irish DPC. No wonder that Facebook wants to keep this document confidential. However, it also shows that Facebook has no serious legal defence when continuing to ship European’s data to the US.”

Noyb details the contents of the TIA via a number of videos — including several where Schrems summarizes the contents of the document in detail. (In some locations in Europe it also provides data from the TIA itself but notes that it is withholding this content from the U.K. and Ireland on account of the legal risk of Facebook and/or the DPC bringing baseless SLAPP suits against it to try to exhaust its limited resources.)

Per its analysis, one of Facebook’s tactics to try to deny/evade legal reality is to seize on newer developments, such as the Commission’s updated Standard Contractual Clauses (SCCs) or the adequacy decision recently granted to the U.K. (despite that country’s own surveillance practices) — to claim as new evidence that the earlier CJEU ruling no longer applies.

That means Facebook has variously sought to argue that the DPC was too quick to come to a conclusion vis-à-vis the legality of its data flows; and that circumstances on the ground have changed in a way that means its flows are now totally fine anyway.

All of which serves to underline how delaying enforcement is itself a key strategy for Facebook to evade the application of EU law.

That, in turn, directly implicates its lead EU regulator — because, by taking such a painstakingly long time over investigations the regulator generates ample time and space for Facebook to come up with fresh lines to cynically reboot its arguments against any enforcement taking place.

In short, it allows for a perpetual game of regulatory whack-a-mole that gives Facebook a thumbs up to carry on with data-mining business as usual in the meanwhile. While EU people’s fundamental rights exist only on paper.

The DPC declined to comment on noyb’s fourth Advent Reading when we reached out.

But here’s Schrems’ assessment again: “The Irish DPC is extremely slow and is not in control of this procedure. Facebook constantly moves to another argument, while the DPC has not even decided on the decision from 2013. Facebook is dominating this procedure — instead of the DPC.”

Per noyb, Facebook’s TIA also details what it claims as “supplementary measures” to boost protection for the data — something the EDPB has said may be possible for data controllers to apply to transfers to risky third countries to make such flows achieve compliance with EU standards.

For example, robust, end-to-end encryption may, in theory, be applied to prevent access to data in a readable form when it’s in the U.S.
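Alongside strong encryption, the EDPB’s recommendations also point to pseudonymization as a technical measure that can work where the importer must not be able to identify individuals. A minimal sketch of the idea — with an invented key and field names, not Facebook’s actual scheme — replaces identifiers with a keyed hash before export, the secret key never leaving the EU:

```python
import hmac
import hashlib

# Hypothetical sketch of keyed pseudonymization as a supplementary measure.
# The secret key stays with the EU data exporter; without it, the hashed
# identifiers shipped abroad cannot be reversed, nor brute-forced the way a
# plain (unkeyed) hash of a low-entropy value like an email address can be.
EU_HELD_KEY = b"example-secret-held-only-by-the-eu-exporter"

def pseudonymize(identifier: str) -> str:
    return hmac.new(EU_HELD_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# An exported record carries only the pseudonym, never the raw identifier.
record = {"user": pseudonymize("user@example.com"), "event": "page_view"}
```

The obvious catch for an ad-funded platform: a measure like this only helps if the business genuinely never needs to re-identify the person on the importing side.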

However, Facebook’s business model is based on profiling users via its big data analysis of their information so it’s certainly not in a position to lock its own business out of people’s data. Not without a radical change of business model.

Unsurprisingly, then, noyb found the TIA’s section on claimed “supplementary measures” contained nothing more than a (long) list of industry standard policies and procedures. So no extra steps at all, then.

“According to the documents we received, absolutely no new or relevant measures were taken by Facebook on foot of the CJEU judgment of 16.6.2020,” noyb notes.

We reached out to the EDPB for a view on the sorts of policies and procedures Facebook’s TIA lists as “supplementary measures” — and will update this post with any response. Update: The EDPB secretariat said:

“[T]he GDPR introduces the new cross-functional principle of accountability. This means that each organisation must analyze its own situation and implement the organizational and technical measures necessary in its specific case. This is a case by case analysis, depending on the risk presented by the processing of personal data by the organisation.

The same principle applies to the Recommendations on measures that supplement transfer tools, which can be found here.”

Asked for its response to noyb’s assessment of its TIA, Facebook sent this statement — attributed to a Meta spokesperson:

Like other companies, we have followed the rules and relied on international transfer mechanisms to transfer data in a safe and secure way. Businesses need clear, global rules, underpinned by the strong rule of law, to protect transatlantic data flows over the long term.


European parliament found to have broken EU rules on data transfers and cookie consents


The European Union’s chief data protection supervisor has sanctioned the European Parliament for a series of breaches of the bloc’s data protection rules.

The decision sounds a loud warning to sites and services in the region about the need for due diligence of personal data flows and transfers — including proper scrutiny of any third-party providers, plug-ins or other bits of embedded code — to avoid the risk of costly legal sanction. Although the parliament has avoided a financial penalty this time.

The European Data Protection Supervisor’s (EDPS) intervention relates to a COVID-19 test booking website which the European Parliament launched in September 2020 — using a third-party provider, called Ecolog.

The website attracted a number of complaints, filed by six MEPs, last year — with the support of the European privacy campaign group noyb — over the presence of third-party trackers and confusing cookie consent banners, among a raft of other compliance problems, which also included transparency and data access issues.

Following an investigation, the EDPS found the parliament was at fault in several respects and it has issued a reprimand — ordering rectification of any outstanding issues within one month.

The test booking website was found to be dropping cookies associated with Google Analytics and Stripe — but the parliament failed to demonstrate it had applied any special measures to ensure that any associated personal data transfers to the U.S. would be adequately protected in light of the landmark Schrems II decision by the EU’s top court.

In July 2020, the CJEU struck down the bloc’s flagship data transfer agreement with the U.S. (aka, the EU-U.S. Privacy Shield) and issued further guidance that transfers of EU people’s personal data to all third countries must be risk assessed on a case by case basis.

The ruling also made it clear that EU regulators must step in and suspend data flows if they believe people’s information is at risk. So in order for some transfers to be legal (such as EU-U.S. data flows) additional measures may be needed to raise the level of protection to the required standard of essential equivalence with EU law — something the European Data Protection Board (EDPB) has since issued detailed guidance on.

However — in the case of the parliament’s COVID-19 test booking site — the EDPS found no evidence that it or its provider had applied any such extra measures to safeguard EU-U.S. transfers resulting from the inclusion of Google Analytics and Stripe cookies.

Turns out the provider had copypasted code from another website it had built, for a test centre at Brussels International Airport — hence the presence of cookies for the payment company Stripe on the parliament site (despite no payments actually being required for testing booked via the website).

Google Analytics cookies, meanwhile, had apparently been included by the provider to “minimise the risk of spoofing and for website optimisation purposes”, according to the EDPS’ findings.

Post-Schrems II, the presence of cookies designed to send data to U.S.-based providers for processing creates immediate legal risk for EU-based websites — and/or their clients (in this case the parliament was found by the EDPS to be the sole data controller, while Ecolog was the data processor). So incorporating Google Analytics may do the opposite of “optimizing” your site’s compliance with EU data protection law.
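The due-diligence check this implies is straightforward to automate. A rough sketch — the domain list here is an illustrative assumption, not an official registry — scans a page’s HTML for resources loaded from well-known U.S.-based providers:

```python
import re

# Illustrative, not exhaustive: domains whose embedded scripts and cookies
# imply personal data flowing to U.S.-based processors post-Schrems II.
US_TRACKER_DOMAINS = (
    "google-analytics.com",
    "googletagmanager.com",
    "js.stripe.com",
)

def find_us_trackers(html: str) -> list:
    # Collect src/href attribute values and keep those pointing at known
    # U.S. providers; a real audit would also inspect the cookies being set.
    srcs = re.findall(r'(?:src|href)="([^"]+)"', html)
    return [s for s in srcs if any(d in s for d in US_TRACKER_DOMAINS)]

page = '<script src="https://www.google-analytics.com/analytics.js"></script>'
print(find_us_trackers(page))  # flags the Google Analytics script URL
```

A scan like this would have caught both the Google Analytics and Stripe inclusions on the parliament’s site before launch.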

That said, enforcement of this particular compliance issue has been a slow burn, even since the 2020 CJEU ruling — with only a smattering of regulator-led investigations, and the clearest leadership coming from the EDPS itself.

A (very) long-running complaint against Facebook’s EU-U.S. data transfers, meanwhile, brought by noyb founder Max Schrems in the wake of the 2013 Snowden disclosures about NSA mass surveillance of social network and internet data, still hasn’t resulted in a final decision by its lead data protection supervisor, the Irish Data Protection Commission (DPC) — despite the latter agreeing a full year ago that it would “swiftly” finalize the complaint.

Again, though, that makes the EDPS’ intervention on the parliament complaint all the more significant. tl;dr: EU banhammers are, gradually, falling.

In another finding against the parliament, the EDPS took issue with confusing cookie consent notices shown to visitors to the test booking website — which it found provided inaccurate information; did not always offer clear choices to reject third-party tracking; and included deceptive design, which could manipulate consent.

By contrast, EU law is clear that consent, as a legal basis to process people’s data, requires a choice that is informed, specific (i.e. purpose limited, rather than bundled) and freely given.

The parliament was also found to have failed to respond adequately to complainants’ requests for information — breaching additional legal requirements which provide Europeans with a suite of access rights related to their personal data.

While the parliament has landed in the embarrassing situation of being reprimanded by the EDPS, it has avoided a fine — as the regulator only has narrow powers to issue financial penalties which it said these infringements did not trigger.

But the findings of fault by the bloc’s chief data protection supervisor draw fresh red lines around routine regional use of U.S.-based tools like Google Analytics (or, indeed, Facebook Pages) in the wake of the Schrems II decision by the Court of Justice of the European Union.

Copypasting code with standard analytics calls might seem like a quick win to a website builder — but not if the entity responsible for safeguarding visitors’ information fails to properly assess EU-based legal risk.

The EDPS’ reprimand for the parliament thus has wider significance as it looks likely to prefigure a wave of aligned decisions by EU regulators, given the scores of similar complaints filed by noyb in August 2020 targeting websites across the bloc.

“We expect more rulings on this matter in the next month,” noyb’s honorary chairman, Max Schrems, told TechCrunch. “The fact that the EDPS has taken a clear position is a good sign for other DPAs.”

The EDPS’ sanction of the parliament over confusing cookie banners also sends a strong signal over what’s acceptable and what’s not when it comes to obtaining users’ consent to tracking — despite confusing dark patterns still being shamefully widespread in the EU.

(For a particularly ironic example of that, see this blog post by analyst Forrester — which warns that regulators are coming for “dark patterns” even as the analyst’s own webpage serves what looks very much like a non-compliant cookie notice, given the only obvious button reads “Accept cookies” and it takes multiple clicks through sub-menus to find an option to reject tracking cookies, so, er… )

Noyb also kicked off a major effort targeting this type of cookie non-compliance last year — which it suggested could lead to it filing up to 10,000 complaints about dubious cookie banners with EU regulators.

Regional regulators are clearly going to have their work cut out to clean up so much infringement — which in turn may encourage DPAs to coordinate on standardizing enforcements to drive the necessary scale of change.

The EDPS decision adds high-level accelerant by sending a clear signal that confusing cookie banners are the same as non-compliant cookie banners from the body responsible for providing EU lawmakers with expert guidance on how to interpret and apply data protection law.

Here’s an illustrative snippet from its decision — which describes a portion of the confusion that hit visitors to the parliament website as they tried to parse the cookie notices at the time of the complaints (tracking cookies have since been removed from the site):

The English version only referred to essential cookies and prompted the user to either click on the ‘accept all’ or the ‘save’ button. The difference between the two buttons was unclear. The French version of the second layer of cookie banner referred both to essential cookies and ‘external media’. These external media cookies included cookies from Facebook, Google Maps, Instagram, OpenStreetMap, Twitter, Vimeo and Youtube. The visitor could also choose between ‘accept all’ or ‘save’. The German version of the second layer of the cookie banner referred to only one ‘external media’ cookie — Google Maps — in addition to the essential cookie.

The EDPS’ conclusion was that the cookie banners in all three languages failed to meet the EU standard for consent.

In another sign of the cookie (non)compliance reckoning that’s now unfolding in the region, some EU regulators have been taking actual action — such as France’s CNIL which issued a major slap-down to Google and Facebook last week, announcing fines of $170 million and $68 million, respectively, for choosing dark pattern design over clear choices in their cookie consent flows.

The EDPB, which supports DPAs’ enforcement of pan-EU rules like the General Data Protection Regulation, established a task force on the cookie issue last fall — saying it would “coordinate the response to complaints concerning cookie banners” noyb had filed with a number of regional agencies.

Schrems describes that step as a “good” development — but said it is also slowing things down.

Although he suggested the direction of travel is toward a standard that will require a simple yes/no for tracking. (Which will of course mean a firm “no” in the vast majority of cases, given how few people like being stalked by ads — hence the U.K. DPA’s recent warning to adtech that the end of tracking is nigh.)

“The CNIL and the EDPS decisions support the view by us that we need to move to fair ‘yes or no’ options,” Schrems told us. “We expect other authorities to follow this lead.”

What about his vintage data flows complaint vis-à-vis Facebook’s EU-U.S. transfers? Is there any sign of Ireland’s promised “swift” resolution to that particular complaint — which should have led to a DPA order for Facebook to suspend data flows years ago, but has so far only produced a preliminary order, in September 2020, that Facebook suspend transfers?

“They always say that each decision is coming any day — I stopped following these rumors but there is a rumor on this again right now… ” Schrems said on the DPC, concluding his text with an eyeroll emoji.

In bad news for US cloud services, Austrian website’s use of Google Analytics found to breach GDPR

A decision by Austria’s data protection watchdog upholding a complaint against a website related to its use of Google Analytics does not bode well for use of US cloud services in Europe.

The decision raises a big red flag over routine use of tools that require transferring Europeans’ personal data to the US for processing — with the watchdog finding that IP address and identifiers in cookie data are the personal data of site visitors, meaning these transfers fall under the purview of EU data protection law.

In this specific case, an IP address “anonymization” function had not been properly implemented on the website. But, regardless of that technical wrinkle, the regulator found IP address data to be personal data given the potential for it to be combined — like a “puzzle piece” — with other digital data to identify a visitor.
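Google’s documentation describes this “anonymization” as zeroing the last octet of IPv4 addresses (the last 80 bits for IPv6) before storage. A minimal Python sketch of that truncation — purely illustrative, not Google’s actual implementation — shows why the regulator still treated the result as a “puzzle piece”:

```python
import ipaddress

def anonymize_ip(ip: str) -> str:
    """Truncate an IP address the way GA's anonymize_ip option is
    documented to: zero the last octet of IPv4 addresses, the last
    80 bits of IPv6 addresses."""
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    # strict=False lets us build the network from a host address
    net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(net.network_address)

print(anonymize_ip("203.0.113.77"))  # 203.0.113.0
print(anonymize_ip("2001:db8::1"))   # 2001:db8::
```

Even a truncated address still narrows a visitor down to a small network — and combined with cookie identifiers and timestamps it can single an individual out, which is precisely the regulator’s point.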

Consequently the Austrian DPA found that the website in question — a health focused site called netdoktor.at, which had been exporting visitors’ data to the US as a result of implementing Google Analytics — had violated Chapter V of the EU’s General Data Protection Regulation (GDPR), which deals with data transfers out of the bloc.

“US intelligence services use certain online identifiers (such as the IP address or unique identification numbers) as a starting point for the surveillance of individuals,” the regulator notes in the decision [via a machine translation of the German language text], adding: “In particular, it cannot be excluded that these intelligence services have already collected information with the help of which the data transmitted here can be traced back to the person of the complainant.”

In reaching its conclusion, the regulator assessed various measures Google said it had implemented to protect the data in the US — such as encryption at rest in its data centers; or its claim that the data “must be considered as pseudonymous” — but did not find sufficient safeguards had been put in place to effectively block US intelligence services from accessing the data, as required to meet the GDPR’s standard.

“As long as the second respondent himself [i.e. Google] has the possibility to access data in plain text, the technical measures invoked cannot be considered effective in the sense of the above considerations,” it notes at one point, dismissing the type of encryption used as inadequate protection.

Austria’s regulator also quotes earlier guidance from German DPAs to back up its dismissal of Google’s “pseudonymous” claim — noting that this states:

” …the use of IP addresses, cookie IDs, advertising IDs, unique user IDs or other identifiers to (re)identify users do not constitute appropriate safeguards to comply with data protection principles or to safeguard the rights of data subjects. This is because, unlike in cases where data is pseudonymised in order to disguise or delete the identifying data so that the data subjects can no longer be addressed, IDs or identifiers are used to make the individuals distinguishable and addressable. Consequently, there is no protective effect. They are therefore not pseudonymisations within the meaning of Recital 28, which reduce the risks for the data subjects and assist data controllers and processors in complying with their data protection obligations.”
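The distinction the guidance draws can be seen in a few lines of code: hashing or randomizing an identifier does not stop it linking one person’s activity over time, because the token remains a stable key for profiling. (A hedged sketch with hypothetical names, not any vendor’s actual pipeline.)

```python
import hashlib

# Hypothetical pipeline, not any vendor's actual code: hashing an ID
# produces a token that is still a stable, unique key per user.
def pseudonymize(user_id: str, salt: str = "site-secret") -> str:
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:16]

profile_store = {}  # token -> accumulated behavioural events

for visitor in ["user42", "user7", "user42"]:
    token = pseudonymize(visitor)
    profile_store.setdefault(token, []).append("page_view")

# Two tokens, but user42's visits are still linked together: the ID
# keeps the individual distinguishable and addressable, which is why
# the guidance says this is not Recital 28 pseudonymisation.
assert len(profile_store) == 2
assert sorted(len(events) for events in profile_store.values()) == [1, 2]
```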

The DPA’s wholesale dismissal of any legally relevant impact of the bundle of aforementioned “Technical and Organizational Measures” (such as standard encryption) — which were cited by Google to try to fend off the complaint — is significant because such claims are the prevailing tactic used by US-based cloud giants to try to massage compliance and ensure EU-to-US data transfers continue so they can continue business as usual.

So if this tactic is getting called out here, as a result of a single website’s use of Google Analytics, it can and will be sanctioned by EU regulators elsewhere. After all, Google Analytics is everywhere online.

(See also the extensive list of extremely standard measures cited by Facebook in an internal assessment of its EU-to-US data transfers — in which it too tries to claim ‘compliance’ with EU law, per an earlier document reveal.)

The complaint back story here is that back in August 2020 European privacy campaign group noyb filed a full 101 complaints with DPAs across the bloc targeting websites with regional operators that it had identified as sending data to the US via Google Analytics and/or Facebook Connect integrations.

Use of such analytics tools may seem intensely normal but — legally speaking, in the EU — it’s anything but because EU-to-US transfers of personal data have been clouded in legal uncertainty for years.

The underlying conflict boils down to a clash between European privacy rights and US surveillance law — as the latter affords foreigners zero rights over how their data is scooped up and snooped on, nor any route to legal redress for whatever happens to their information when it’s in the US, making it extremely difficult for exported EU data to get the necessary standard of “essentially equivalent” protection that it gets at home when it’s abroad.

To radically simplify: EU law says European levels of protection must travel with data. While US law says ‘we’re taking your data; we’re not telling you what we’re doing; and you can’t do anything about it anyway, sucker!’.

US cloud providers that are subject to Section 702 of the Foreign Intelligence Surveillance Act (FISA) are all in the frame — which takes in a broad sweep of tech giants, including Google and Facebook, since this law applies broadly to “electronic communications services”.

Executive Order 12,333 — a Reagan-era mandate that’s also relevant because it expanded intelligence agency powers to acquire data — is meanwhile thought to target vulnerabilities in telecoms infrastructure.

The EU-US legal clash between privacy and surveillance dates back almost a decade at this point.

It was catalyzed by the 2013 Snowden disclosures, which revealed the extent of US government mass surveillance programs — and led the EU’s Court of Justice, back in 2015, to invalidate the Safe Harbor arrangement between the bloc and the US on the grounds that EU data could no longer be considered safe when it went over the pond.

And whereas Safe Harbor had stood for around 15 years, its hastily agreed replacement — the EU-US Privacy Shield — lasted just four. So the lifespan of commercially minded European Commission decisions seeking to grease transatlantic data flows in spite of the massive privacy risks has been shrinking radically.

Some complaints about risky EU-to-US data transfers also date back almost a decade at this point. But there’s fresh enforcement energy in the air since a landmark ruling by the CJEU in July 2020 — which struck down the Commission’s reupped data transfer arrangement (Privacy Shield), which — since 2016 — had been relied upon by thousands of companies to rubberstamp their US transfers.

The court did not outlaw personal data transfers to so-called third countries entirely. Which is why these data flows didn’t cease overnight smack bang in the middle of 2020.

However it clarified that such data flows must be assessed on a case by case basis for risks. And it made it clear that DPAs could not just turn a blind eye to compliance — hi Ireland! — rather they must proactively step in and suspend transfers in cases where they believe data is flowing to a risky location like the US.

In a closely watched follow-on interpretation of the court ruling, the European Data Protection Board’s (EDPB) guidance confirmed that personal data transfers out of the EU may still be possible — if a set of narrow circumstances and/or conditions apply. Such as where the data can be genuinely anonymized so that it is truly no longer personal data.

Or if you can apply a suite of supplementary measures (such as technical stuff like applying robust end-to-end encryption — meaning there’s zero access to decrypted data possible by a US entity) — in order to raise the level of legal protection.
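The shape of such a measure can be sketched in a few lines. What matters is who holds the key: if it never leaves the EU exporter, the US importer stores only ciphertext it cannot read. (Illustration only — a one-time pad stands in for a vetted cipher, and the record shown is hypothetical.)

```python
import secrets

# Illustration only: a one-time pad stands in for a real cipher, because
# the point here is who holds the key, not which algorithm is used.
def encrypt(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

record = b"user=42;page=/health/anxiety"
key = secrets.token_bytes(len(record))  # key stays with the EU exporter

exported = encrypt(record, key)          # only ciphertext crosses the Atlantic
assert exported != record                # the importer cannot read it...
assert encrypt(exported, key) == record  # ...but the exporter still can
```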

The problem for adtech firms like Google and Facebook is that their business models are all about accessing people’s data. So it’s not clear how such data-mining giants could apply supplementary measures that radically limit their own access to this core business data without a radical change of model. Or, well, federating their services — and localizing European data and processing in the EU.

The Austrian DPA decision makes it clear that Google’s current package of measures, related to how it operates Google Analytics, is not adequate because it does not remove the risk of surveillance agencies accessing people’s data.

The decision puts heavy underscoring on the need for any such supplementary measures to actually enhance standard provisions if they’re to do anything at all for your chances of compliance.

Supplementary of course means extra. tl;dr you can’t pass off totally standard security processes, procedures, policies, protocols and measures as some kind of special Schrems II-busting legal magic, no matter how much you might want to.

(A quick comparable scenario that might hammer home the point: One can’t — legally speaking — hold a party during a pandemic if lockdown rules ban social gatherings simply by branding a ‘bring your own bottle’ garden soirée as a work event. Not even if you’re the prime minister of the UK. At least not if you want to remain in post for long, anyway… )

It’s fair to say that the tech industry response to the Schrems II ruling has been a massive, collective putting of heads into sand. Or, as the eponymous Max Schrems himself, honorary chair of noyb, puts it in a statement: “Instead of adapting services to be GDPR compliant, US companies have tried to simply add some text to their privacy policies and ignore the Court of Justice. Many EU companies have followed the lead instead of switching to legal options.”

This charade has been possible because — to date — there hasn’t been much regulatory enforcement following the July 2020 ruling.

Despite the European Data Protection Board warning immediately that there would be no grace period for coming into compliance.

To the untrained eye that might suggest the industry’s collective strategy — of ignoring the legal nightmare wrapping EU-to-US transfers in the hopes the problem would just go away — has been working.

But, as the Austria decision indicates, regulatory gears are grinding towards a bunch of rude awakenings.

The European Commission — which remains eager for a replacement to the EU-US Privacy Shield — has also warned there will be no quick fix this time around, suggesting major reforms of US surveillance law are required to bridge the legal divide. (Although negotiations between the Commission and the US on a replacement data transfer agreement are continuing.)

In the meanwhile Schrems II enforcements are starting to flow — and orders to cease US data flows may soon follow.

In another sign of enforcement ramping up, the European Data Protection Supervisor (EDPS) — just this week — upheld a complaint against the European Parliament over US data transfers involving use of Google Analytics and Stripe.

The EDPS’ decision reprimands the parliament and also orders it to fix outstanding issues within one month.

The other 101 complaints noyb filed back in 2020 are also still awaiting decisions. And as Schrems notes EU DPAs have been coordinating their response to the data transfer issue. So there’s likely to be a pipeline of enforcements striking at usage of US cloud services in the coming months. And, well, a lot of sand falling out of eyes.

Here’s Schrems on the Austria DPA’s reasoning again: “This is a very detailed and sound decision. The bottom line is: Companies can’t use US cloud services in Europe anymore. It has now been 1.5 years since the Court of Justice confirmed this a second time, so it is more than time that the law is also enforced.”

“We expect similar decisions to now drop gradually in most EU member states,” he adds, further noting that Member State authorities have been coordinating their response to the flotilla of complaints (the EDPB announced a taskforce on the issue last fall).

“In the long run we either need proper protections in the US, or we will end up with separate products for the US and the EU,” Schrems also said, adding: “I would personally prefer better protections in the US, but this is up to the US legislator — not to anyone in Europe.”

While netdoktor has been found to have violated the GDPR, it’s not clear whether it will face a penalty as yet.

It may also seek to appeal the Austrian DPA’s decision.

The company has since moved its HQ to Germany, which complicates the regulatory jurisdiction component of this process — and means it may face additional enforcement, such as an order banning transfers, in a follow on action by a German regulator.

There is another notable element of the decision that has gone Google’s way — for now.

While the regulator upheld the complaint against netdoktor it did not find against Google’s US business for receiving/processing the data — deciding that the rules on data transfers only apply to EU entities and not to the US recipients.

That bit of the decision is a disappointment to noyb which is considering whether to appeal — with Schrems arguing: “It is crucial that the US providers cannot just shift the problem to EU customers.”

noyb further flags that Google may still face some pending sanction, however, as the Austria DPA has said it will investigate further in relation to potential violations of Articles 5, 28 and 29 GDPR (related to whether Google is allowed to provide personal data to the US government without an explicit order by the EU data exporter).

The DPA has said it will issue a separate decision on that. So Google may yet be on the hook for a GDPR breach in Austria.

Penalties under the regulation can scale as high as 4% of a company’s annual global turnover. Although orders to ban data transfers may ultimately prove a lot more costly to certain types of data-mining business models.

To wit: Long time EU privacy watchers will be aware that Facebook’s European business is on penalty time in Ireland over this same EU-US transfers issue. A preliminary order that Facebook suspend transfers was issued by Ireland in fall 2020 — triggering legal action from the social media giant to try to block the order.

Facebook’s court challenge failed but a final decision remains pending from the Irish regulator — which promised noyb a swift resolution of the vintage complaint a full year ago. So the clock really is ticking on that data transfer complaint. And someone should phone Meta’s chief spin doctor, Nick Clegg, to ask if he’s ready to pull the plug on Facebook’s European service yet?

 

On Meta’s ‘regulatory headwinds’ and adtech’s privacy reckoning

What does Meta/Facebook’s favorite new phrase to bandy around in awkward earnings calls — as it warns of “regulatory headwinds” cutting into its future growth — actually mean when you unpack it?

It’s starting to look like this breezy wording means the law is finally catching up with murky adtech practices which have been operating under the radar for years — tracking and profiling web users without their knowledge or consent, and using that surveillance-gleaned intel to manipulate and exploit at scale regardless of individual objections or the privacy people have a legal right to expect.

This week a major decision in Europe found that a flagship ad industry tool which — since April 2018 — has claimed to be gathering people’s “consent” for tracking to run behavioral advertising has not in fact been doing so lawfully.

The IAB Europe was given two months to come up with a reform plan for its erroneously named Transparency and Consent Framework (TCF) — and a hard deadline of six months to clean up the associated parade of bogus pop-ups and consent mismanagement which force, manipulate or simply steal (“legitimate interest”) web users’ permission to microtarget them with ads.

The implications of the decision against the IAB and its TCF are that major ad industry reforms must come — and fast.

This is not just a little sail realignment as Facebook’s investor-soothing phrase suggests. And investors are perhaps cottoning on to the scale of the challenges facing the adtech giant’s business — given the 20% drop in its share price as it reported Q4 earnings this week.

Facebook’s ad business is certainly heavily exposed to any regulatory hurricane of enforcement against permission-less Internet tracking since it doesn’t offer its own users any opt out from behavioral targeting.

When asked about this the tech giant typically points to its “data policies” — where it instructs users it will track them and use their data for personalized ads but doesn’t actually ask for their permission. (It also claims any user data it sucks into its platform from third parties for ad targeting has been lawfully gathered by those partners in one long chain of immaculate adtech compliance!)

Fb also typically points to some very limited “controls” it provides users over the type of personalized ads they will be exposed to via its ad tools — instead of actually giving people genuine control over what’s done with their information which would, y’know, actually enable them to protect their privacy.

The problem is Meta can’t offer people a choice over what it does with their data because people’s data is the fuel that its ad targeting empire runs on.

Indeed, in Europe — where people do have a legal right to privacy — the adtech giant claims users of its social media services are actually in a contract with it to receive advertising! An argument that the majority of the EU’s data protection agencies look minded to laugh right out of the room, per documents revealed last year by local privacy advocacy group noyb which has been filing complaints about Facebook’s practices for years. So watch that space for thunderous regulatory “headwinds”.

(noyb’s founder, Max Schrems, is also the driving force behind another Meta earnings call caveat, vis-a-vis the little matter of “the viability of transatlantic data transfers and their potential impact on our European operations“, as its CFO Dave Wehner put it. That knotty issue may actually require Meta to federate its entire service if, as expected, an order comes to stop transferring EU users’ data over the pond, with all the operational cost and complexity that would entail… So that’s quite another stormy breeze on the horizon.)

While regulatory enforcement in Europe against adtech has been a very slow burn there is now movement that could create momentum for a cleansing reboot.

For one thing, given the interconnectedness of the tracking industry, a decision against a strategic component like the TCF (or indeed adtech kingpin Facebook) has implications for scores of data players and publishers who are plugged into this ecosystem. So knock-on effects will rattle down (and up) the entire adtech ‘value chain’. Which could create the sort of tipping point of mass disruption and flux that enables a whole system to flip to a new alignment. 

European legislators frustrated at the lack of enforcement are also piling further pressure on by backing limits on behavioral advertising being explicitly written into new digital rules that are fast coming down the pipe — making the case for contextual ad targeting to replace tracking. So the demands for privacy are getting louder, not going away.

Of course Meta/Facebook is not alone in being especially prone to regulatory headwinds; the other half of the adtech duopoly — Alphabet/Google — is also heavily exposed here.

As Bloomberg reported this week, digital advertising accounts for 98% of Meta’s revenue, and a still very chunky 81% of Alphabet’s — meaning the pair are especially sensitive to any regulatory reset to how ad data flows.

Bloomberg suggested the two giants may yet have a few more years’ grace before regulatory enforcement and increased competition could bite into their non-diversified ad businesses in a way that flips the fortunes of these data-fuelled growth engines.

But one factor that has the potential to accelerate that timeline is increased transparency.

Follow the data…

Even the most complex data trail leaves a trace. Adtech’s approach to staying under the radar has also, historically, been more one of hiding its people-tracking ops in plain sight all over the mainstream web vs robustly encrypting everything it does. (Likely as a result of how tracking grew on top of and sprawled all over web infrastructure at a time when regulators were even less interested in figuring out what was going on.)

Turns out, pulling on these threads can draw out a very revealing picture — as a comprehensive piece of research into digital profiling in the gambling industry, carried out by research institute Cracked Labs and just published last week, shows.

The report was commissioned by UK based gambling reform advocacy group, Clean Up Gambling, and quickly got picked up by the Daily Mail — in a report headlined: “Suicidal gambling addict groomed by Sky Bet to keep him hooked, investigation reveals”.

What Cracked Labs’ research report details — in unprecedented detail — is the scale and speed of the tracking which underlies an obviously non-compliant cookie banner presented to users of a number of gambling sites whose data flows it analyzed, offering the usual adtech fig-leaf mockery of (‘Accept-only’) compliance.

The report also explodes the notion that individuals being subject to this kind of pervasive, background surveillance could practically exercise their data rights.

Firstly, the effort asymmetry that would be required to go SARing such a long string of third parties is just ridiculous. But, more basically, the lack of transparency inherent to this kind of tracking means it’s inherently unclear who has been passed (or otherwise obtained) your information — so how can you ask what’s being done if you don’t even know who’s doing it?

If that is a system ‘functioning’ then it’s clear evidence of systemic dysfunction. Aka, the systemic lawlessness that the UK’s own data protection regulator already warned the adtech industry about in a report of its own all the way back in 2019.

The individual impact of adtech’s “data-driven” marketing, meanwhile, is writ large in a quote in the Daily Mail’s report — from one of the “high value” gamblers the study worked with, who accuses the gambling service in question of turning him into an addict — and tells the newspaper: “It got to a point where if I didn’t stop, it was going to kill me. I had suicidal ideation. I feel violated. I should have been protected.”

“It was going to kill me” is an exceptionally understandable articulation of data-driven harms.

Here’s a brief overview of the scale of tracking Cracked Labs’ analysis unearthed, clipped from the executive summary:

“The investigation shows that gambling platforms do not operate in a silo. Rather, gambling platforms operate in conjunction with a wider network of third parties. The investigation shows that even limited browsing of 37 visits to gambling websites led to 2,154 data transmissions to 83 domains controlled by 44 different companies that range from well-known platforms like Facebook and Google to lesser known surveillance technology companies like Signal and Iovation, enabling these actors to embed imperceptible monitoring software during a user’s browsing experience. The investigation further shows that a number of these third-party companies receive behavioural data from gambling platforms in realtime, including information on how often individuals gambled, how much they were spending, and their value to the company if they returned to gambling after lapsing.”
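Tallies like these can be reproduced from an ordinary browser network capture. A hedged sketch over a hypothetical four-request log (real analyses run over HAR exports at far larger scale; the URLs below are illustrative, not taken from the report):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical request log of the kind a browser network capture (or a
# HAR export) yields; the URLs are illustrative, not from the report.
requests = [
    "https://www.facebook.com/tr?ev=PageView",
    "https://www.google-analytics.com/collect?t=pageview",
    "https://www.google-analytics.com/collect?t=event",
    "https://tracker.example/tag.js",
]

domains = Counter(urlparse(url).netloc for url in requests)
print(f"{len(requests)} transmissions to {len(domains)} domains")  # 4 transmissions to 3 domains
for domain, count in domains.most_common():
    print(f"  {domain}: {count}")
```

Group the domains by owning company (via a WHOIS or tracker database) and you get the report’s third step: 2,154 transmissions, 83 domains, 44 companies.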

A detailed picture of consentless ad tracking in a context with very clear and well understood links to harm (gambling) should be exceedingly hard for regulators to ignore.

But any enforcement of consent and privacy must and will be universal, as the law around personal data is clear.

Which in turn means that nothing short of a systemic adtech reboot will do. Root and branch reform.

Asked for its response to the Cracked Labs research, a spokeswoman for the UK’s Information Commissioner’s Office (ICO) told TechCrunch: “In relation to the report from the Clean Up Gambling campaign, I can confirm we are aware of it and we will consider its findings in light of our ongoing work in this area.”

We also asked the ICO why it has failed to take any enforcement action against the adtech industry’s systemic abuse of personal data in real-time bidding ad auctions — following the complaint it received in September 2018, and the issues raised in its own report in 2019.

The watchdog said that after it resumed its “work” in this area — following a pause during the coronavirus pandemic — it has issued “assessment notices” to six organisations. (It did not name these entities.)

“We are currently assessing the outcomes of our audit work. We have also been reviewing the use of cookies and similar technologies of a number of organisations,” the spokeswoman also said, adding: “Our work in this area is vast and complex. We are committed to publishing our final findings once our enquiries are concluded.”

But the ICO’s spokeswoman also pointed to a recent opinion issued by the former information commissioner before she left office last year, in which she urged the industry to reform — warning adtech of the need to purge current practices by moving away from tracking and profiling, cleaning up bogus consent claims and focusing on engineering privacy and data protection into whatever form of targeting it flips to next.

So the reform message at least is strong and clear, even if the UK regulator hasn’t found enough puff to crack out any enforcement yet.

Asked for its response to Cracked Labs’ findings, Flutter — the UK-based company that owns Sky Betting & Gaming, the operator of the gambling sites whose data flows the research study tracked and analyzed — sought to deflect blame onto the numerous third parties whose tracking technologies are embedded in its websites (and only referenced generically, not by name, in its ‘Accept & close’ cookie notice).

So that potentially means onto companies like Facebook and Google.

“Protecting our customers’ personal data is of paramount importance to Sky Betting & Gaming, and we expect the same levels of care and vigilance from all of our partners and suppliers,” said the Sky Bet spokesperson.

“The Cracked Labs report references data from both Sky Betting & Gaming and the third parties that we work with. In most cases, we are not — and would never be — privy to the data collected by these parties in order to provide their services,” they added. “Sky Betting & Gaming takes its safer gambling responsibilities very seriously and, while we run marketing campaigns based on our customers’ expressed preferences and behaviours, we would never seek to intentionally advertise to anyone who may potentially be at risk of gambling harm.”

Regulatory inaction in the face of cynical industry buck passing — whereby a first party platform may seek to deny responsibility for tracking carried out by its partners, while third parties which also got data may claim it’s the publishers’ responsibility to obtain permission — can mire complaints and legal challenges to adtech’s current methods in frustrating circularity.

But this tedious dance should also be running out of floor. A number of rulings by Europe’s top court in recent years have sharpened guidance on exactly these sorts of legal liability issues, for example.

Moreover, as we get a better picture of how the adtech ecosystem ‘functions’ — thanks to forensic research work like this to track and map the tracking industry’s consentless data flows — pressure on regulators to tackle such obvious abuse will only amplify as it becomes increasingly easy to link abusive targeting to tangible harms, whether to vulnerable individuals with ‘sensitive’ interests like gambling; or more broadly — say in relation to tracking that’s being used as a lever for illegal discrimination (racial, sexual, age-based etc), or the democratic threats posed by population scale targeted disinformation which we’ve seen being deployed to try to skew and game elections for years now.

Google and Facebook respond

TechCrunch contacted a number of the third parties listed in the report as receiving behavioral data on the activities of one of the users of the Sky Betting sites a large number of times — to ask them about the legal basis and purposes for the processing — which included seeking comment from Facebook, Google and Microsoft.

Facebook and Google are of course huge players in the online advertising market but Microsoft appears to have ambitions to expand its advertising business. And it recently acquired another of the adtech entities listed as receiving user data in the report — namely Xandr (formerly AppNexus) — which increases its exposure to these particular gambling-related data flows.

(NB: the full list of companies receiving data on Sky Betting users also includes TechCrunch’s parent entity Verizon Media/Yahoo, along with tens of other companies, but we directed questions to the entities the report named as receiving “detailed behavioral data” and which were found receiving data the highest number of times*, which Cracked Labs suggests points to “extensive behavioural profiling”; although it also caveats its observation with the important point that: “A single request to a host operated by a third-party company that transmits wide-ranging information can also enable problematic data practices”; so just because data was sent fewer times doesn’t necessarily mean it is less significant.)

Of the third parties we contacted, at the time of writing only Google had provided an on-the-record comment.

Microsoft declined to comment.

Facebook provided some background information — pointing to its data and ad policies and referring to the partial user controls it offers around ads. It also confirmed that its ad policies do permit gambling as a targetable interest with what it described as “appropriate” permissions.

Meta/Facebook announced some changes to its ad platform last November — when it expanded what it refers to as its “Ad topic controls” to cover some “sensitive” topics — and it confirmed that gambling is included as a topic on which people can choose to see fewer related ads.

But note that’s fewer gambling ads, not no gambling ads.

So, in short, Facebook admitted it uses behavioral data inferred from gambling sites for ad targeting — and confirmed that it doesn’t give users any way to completely stop that kind of targeting — nor, indeed, the ability to opt out from tracking-based advertising altogether.

While its legal basis for this tracking is — we must infer — its claim that users are in a contract with it to receive advertising.

Which will probably be news to a lot of users of Meta’s “family of apps”. But it’s certainly an interesting detail to ponder alongside the flat growth it just reported in Q4.

Google’s response did not address any of our questions in any detail, either.

Instead it sent a statement, attributed to a spokesperson, in which it claims it does not use gambling data for profiling — and further asserts it has “strict policies” in place that prevent advertisers from using this data.

Here’s what Google told us:

“Google does not build advertising profiles from sensitive data like gambling, and has strict policies preventing advertisers from using such data to serve personalised ads. Additionally, tags for our ad services are never allowed to transmit personally identifiable information to Google.”

Google’s statement does not specify the legal basis it is relying upon for processing sensitive gambling data in the first place. Nor — if it really isn’t using this data for profiling or ad targeting — why it’s receiving it at all.

We pressed Google on these points but the company did not respond to follow up questions.

Its statement also contains misdirection that’s typical of the adtech industry — when it writes that its tracking technologies “are never allowed to transmit personally identifiable information”.

Setting aside the obvious legalistic caveat — Google doesn’t actually state that it never gets PII; it just says its tags are “never allowed to transmit” PII; ergo it’s not ruling out the possibility of a buggy implementation leaking PII to it — the tech giant’s use of the American legal term “personally identifiable information” is entirely irrelevant in a European legal context.

The law that actually applies here concerns the processing of personal data — and personal data under EU/UK law is very broadly defined, covering not just obvious identifiers (like name or email address) but all sorts of data that can be connected to and used to identify a natural person, from IP address and advertising IDs to a person’s location or their device data and plenty more besides.

In order to process any such personal data Google needs a valid legal basis. And since Google did not respond to our questions about this it’s not clear what legal basis it relies upon for processing the Sky Betting user’s behavioral data.

“When data subject 2 asked Sky Betting & Gaming what personal data they process about them, they did not disclose information about personal data processing activities by Google. And yet, this is what we found in the technical tests,” says research report author Wolfie Christl, when asked for his response to Google’s statement.

“We observed Google receiving extensive personal data associated with gambling activities during visits to skycasino.com, including the time and exact amount of cash deposits.

“We did not find or claim that Google received ‘personally identifiable’ data, this is a distraction,” he adds. “But Google received personal data as defined in the GDPR, because it processed unique pseudonymous identifiers referring to data subject 2. In addition, Google even received the customer ID that Sky Betting & Gaming assigned to data subject 2 during user registration.

“Because Sky Betting & Gaming did not disclose information about personal data processing by Google, we cannot know how Google, SBG or others may have used personal data Google received during visits to skycasino.com.”

“Without technical tests in the browser, we wouldn’t even know that Google received personal data,” he added.
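Christl’s point about browser-level testing is worth unpacking: the report’s method boils down to capturing a site visit’s network traffic (for example as a HAR export from browser dev tools) and checking which third-party hosts receive a site-assigned identifier. Here is a minimal, hedged sketch of that idea — the hostnames, customer ID and HAR snippet below are all hypothetical, not taken from the report:

```python
from urllib.parse import urlparse

def third_parties_receiving(har, first_party, needle):
    """Return third-party hosts whose requests contain `needle`
    (e.g. a site-assigned customer ID) in the URL or POST body."""
    hosts = set()
    for entry in har["log"]["entries"]:
        req = entry["request"]
        host = urlparse(req["url"]).hostname or ""
        if host.endswith(first_party):
            continue  # skip requests to the site itself
        body = (req.get("postData") or {}).get("text", "")
        if needle in req["url"] or needle in body:
            hosts.add(host)
    return sorted(hosts)

# Hypothetical capture: one first-party request, plus one beacon
# that leaks the customer ID to a third-party collector.
har = {"log": {"entries": [
    {"request": {"url": "https://example-casino.test/deposit?cid=12345"}},
    {"request": {"url": "https://collector.example-adtech.test/b?uid=12345",
                 "postData": {"text": ""}}},
]}}
print(third_parties_receiving(har, "example-casino.test", "12345"))
# → ['collector.example-adtech.test']
```

This is the core of what makes such research possible without any cooperation from the companies involved: the evidence is sitting in the browser’s own network log.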

Christl is critical of Sky Betting for failing to disclose Google’s personal data processing or the purposes it processed data for.

But he also queries why Google received this data at all and what it did with it — zeroing in on another potential obfuscation in its statement.

“Google claims that it does not ‘build advertising profiles from sensitive data like gambling’. Did it build advertising profiles from personal data received during visits to skycasino.com or not? If not, did Google use personal data received from Sky Betting & Gaming for other kinds of profiling?”

Christl’s report includes a screengrab of the cookie banner Sky Betting uses to force consent on its sites: a short statement at the bottom of the page, in barely legible small print, which bundles information on multiple uses of cookies (including for partner advertising) next to a single, brightly illuminated button to “accept and close” — meaning users have no way to deny tracking, short of not gambling/using the website at all.

Under EU/UK law, if consent is being relied upon as a legal basis to process personal data it must be informed, specific and freely given to be lawfully obtained. Or, put another way, you must actually offer users a genuine choice to accept or deny — and do so for each use of non-essential (e.g. tracking) cookies.

Moreover if the personal data in question is sensitive personal data — and behavioral data linked to gambling could certainly be that, given gambling addiction is a recognized health condition, and health data is classed as “special category personal data” under the law — there is a higher standard of explicit consent required, meaning a user would need to affirm every use of this type of highly sensitive information.

Yet, as the report shows, what actually happened in the case of the users whose visits to these gambling sites were analyzed was that their personal data was tracked and transmitted to at least 44 third party companies hundreds of times over the course of just 37 visits to the websites.

They did not report being asked explicitly for their consent as this tracking was going on. Yet their data kept flowing.

It’s clear that the adtech industry’s response to the tightening of European data protection law since 2018 has been the opposite of reform. It opted for compliance theatre — designing and deploying cynical cookie pop-ups that offer no genuine choice or at best create confusion and friction around opt-outs to drum up consent fatigue and push consumers to give in and ‘agree’ to give over their data so it can keep tracking and profiling.

Legally that should not have been possible of course. If the law was being properly enforced this cynical consent pantomime would have been kicked into touch long ago — so the starkest failure here is regulatory inaction against systemic law breaking.

That failure has left vulnerable web users to be preyed upon by dark pattern design, rampant tracking and profiling, automation and big data analytics and “data-driven” marketers who are plugging into an ecosystem that’s been designed and engineered to quantify individuals’ “value” to all sorts of advertisers — regardless of individuals’ rights and freedoms not to be subject to this kind of manipulation and laws that were intended to protect their privacy by default.

By making Subject Access Requests (SARs), the two data subjects in the report were able to uncover some examples of attributes being attached to profiles of Sky Betting site users — apparently based on inferences made by third parties off of the behavioral data gathered on them — which included things like an overall customer “value” score and product specific “value bands”, and a “winback margin” (aka a “predictive model for how much a customer would be worth if they returned over next 12 months”).

This level of granular, behavioral background surveillance enables advertising and gaming platforms to show gamblers personalized marketing messages and other custom incentives tightly designed to encourage them to return to play — to maximize engagement and boost profits.

But at what cost to the individuals involved? Both literally, financially, and to their health and wellbeing — and to their fundamental rights and freedoms?

As the report notes, gambling can be addictive — and can lead to a gambling disorder. But the real-time monitoring of addictive behaviours and gaming “predilections” — which the report’s technical analysis lays out in high dimension detail — looks very much like a system that’s been designed to automate the identification and exploitation of people’s vulnerabilities.

How this can happen in a region with laws intended to prevent this kind of systematic abuse through data misuse is an epic scandal.

While the risks around gambling are clear, the same system of tracking and profiling is of course being systematically applied to websites of all sorts and stripes — whether they contain health information, political news, advice for new parents and so on — where all sorts of other manipulation and exploitation risks can come into play. So what’s going on on a couple of gambling sites is just the tip of the data-mining iceberg.

While regulatory enforcement should have put a stop to abusive targeting in the EU years ago, there is finally movement on this front — with the Belgian DPA’s decision against the IAB Europe’s TCF this week.

However where the UK might go on this front is rather more murky — as the government has been consulting on wide-ranging post-Brexit changes to domestic DP law, and specifically on the issue of consent to data processing, which could end up lowering the level of protection for people’s data and legitimizing the whole rotten system.

Asked about the ICO’s continued inaction on adtech, Ravi Naik — a legal director of the data rights agency AWO, which supported the Cracked Labs research, and who has also been personally involved in long running litigation against adtech in the UK — said: “The report and our case work does raise questions about the ICO’s inaction to date. The gambling industry shows the propensity for real world harms from data.”

“The ICO should act proactively to protect individual rights,” he added.

A key part of the reason for Europe’s slow enforcement against adtech is undoubtedly the lack of transparency and obfuscating complexity the industry has used to cloak how it operates so people cannot understand what is being done with their data.

If you can’t see it, how can you object to it? And if there are relatively few voices calling out a problem, regulators (and indeed lawmakers) are less likely to direct their very limited resource at stuff that may seem to be humming along like business as usual — perhaps especially if these practices scale across a whole sector, from small players to tech giants.

But the obfuscating darkness of adtech’s earlier years is long gone — and the disinfecting sunlight is starting to flood in.

Last December the European Commission explicitly warned adtech giants over the use of cynical legal tricks to evade GDPR compliance — at the same time as putting the bloc’s regulators on notice to crack on with enforcement or face having their decentralized powers to order reform taken away.

So, by hook or by crook, those purifying privacy headwinds gonna blow.

*Per the report: “Among the third-party companies who received the greatest number of network requests while visiting skycasino.com, skybet.com, and skyvegas.com, are Adobe (499), Signal (401), Facebook (358), Google (240), Qubit (129), MediaMath (77), Microsoft (71), Ve Interactive (48), Iovation (28) and Xandr (22).”
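Per-company tallies like those above can be reproduced simply by counting the hostnames of captured requests; a minimal sketch (all hostnames hypothetical):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical URLs captured from the browser's network log during site visits.
captured = [
    "https://metrics.example-analytics.test/b/ss?vid=abc",
    "https://collector.example-adtech.test/px?uid=1",
    "https://metrics.example-analytics.test/b/ss?vid=abc",
]

# Tally requests per third-party host, most-contacted first.
tally = Counter(urlparse(u).hostname for u in captured)
print(tally.most_common())
# → [('metrics.example-analytics.test', 2), ('collector.example-adtech.test', 1)]
```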

This report was updated to correct a typo: Flutter is a UK-based company, not “US-based” as we wrote originally

France’s privacy watchdog latest to find Google Analytics breaches GDPR

Use of Google Analytics has now been found to breach European Union privacy laws in France — after a similar decision was reached in Austria last month.

The French data protection watchdog, the CNIL, said today that an unnamed local website’s use of Google Analytics is non-compliant with the bloc’s General Data Protection Regulation (GDPR) — breaching Article 44 which covers personal data transfers outside the bloc to so-called third countries which are not considered to have essentially equivalent privacy protections.

The U.S. fails this critical equivalence test on account of having sweeping surveillance laws which do not provide non-U.S. citizens with any way to know whether their data is being acquired, how it’s being used or to seek redress for any misuse.

Whereas the EU’s GDPR demands that data protection travels with citizens’ information as a stipulation of legal export.

France’s CNIL has been investigating one of 101 complaints filed by European privacy advocacy group, noyb, back in August 2020 — after the bloc’s top court invalidated the EU-U.S. Privacy Shield agreement on data transfers.

Since then (indeed, long before) the legality of transatlantic transfers of personal data has been clouded in uncertainty.

While it has taken EU regulators some time to act on illegal data transfers — despite an immediate warning from the European Data Protection Board of no grace period in the wake of the July 2020 CJEU ruling (aka ‘Schrems II’) — decisions are now finally starting to flow. Including another by the European Data Protection Supervisor last month, also involving Google Analytics.

In France, the CNIL has ordered the website which was the target of one of noyb’s complaints to comply with the GDPR — and “if necessary, to stop using this service under the current conditions” — giving it a deadline of one month to comply.

As in Austria, the CNIL’s assessment of Google’s claimed supplementary measures (which it had argued ensured EU citizens’ data which was taken, via Google Analytics, to the U.S. was adequately protected) found them to be inadequate.

“[A]lthough Google has adopted additional measures to regulate data transfers in the context of the Google Analytics functionality, these are not sufficient to exclude the accessibility of this data for U.S. intelligence services,” the CNIL writes in a press release announcing the decision.

“There is therefore a risk for French website users who use this service and whose data is exported.”

The CNIL does leave open the door to continued use of Google Analytics — but only with substantial changes that would ensure only “anonymous statistical data” gets transferred. (And the Austrian decision against Google Analytics last month took a broad interpretation of what constitutes personal data in this context, finding that an IP address could be enough given how it may be combined with other bits of data held by Google to identify a site user.)
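For context on what counts as “anonymous” here: Google Analytics offered an IP-anonymization option that truncates addresses before storage — zeroing the last octet of an IPv4 address, or the last 80 bits of an IPv6 address. The sketch below approximates that truncation; note the Austrian DPA still treated the resulting data as personal, because truncated IPs arrived alongside unique cookie identifiers:

```python
import ipaddress

def truncate_ip(ip_str):
    """Coarsen an IP address in the style of analytics 'IP anonymization':
    zero the last octet of IPv4, or the last 80 bits of IPv6 (a /48)."""
    ip = ipaddress.ip_address(ip_str)
    prefix = 24 if ip.version == 4 else 48
    net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(net.network_address)

print(truncate_ip("203.0.113.77"))      # → 203.0.113.0
print(truncate_ip("2001:db8::abcd:1"))  # → 2001:db8::
```

Truncation alone, in other words, does not make a record anonymous if it can still be tied back to an individual via a persistent identifier.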

The French regulator is also very emphatic that under “current conditions” use of Google Analytics is non-compliant — and may therefore need to cease in order for the site in question to comply with the GDPR.

The CNIL also suggests use of an alternative analytics tool which does not involve a transfer outside the EU to end the breach.

Additionally, it says it’s launched an evaluation program to determine which website audience measurement and analysis services may be exempt from the need to obtain user consent (i.e. because they only produce anonymous statistical data which can be exported legally under GDPR). Which suggests the CNIL could issue guidance in future that recommends GDPR compliant alternatives to Google Analytics.

The decision on this complaint has clear implications for any website based in France that’s currently using Google Analytics — or, indeed, any other tools that transfer personal data to the U.S. without adequate supplementary measures — at least in the near term.

For one thing, the CNIL’s decision notes it has made “other” compliance orders to website operators using Google Analytics (again without naming any sites).

While, given joint working by EU regulators on these 101 strategic complaints, the ramifications likely scale EU-wide.

The CNIL also warns that its investigation — along with the parallel probes being undertaken by fellow EU regulators — extends to “other tools used by sites that result in the transfer of data of European Internet users to the United States”, adding: “Corrective measures in this respect may be adopted in the near future.”

So all U.S.-based tools that are transferring personal data are facing regulatory risk.

We’ve asked the CNIL which other tools it’s looking at and will update this report with any response.

Update: The regulator told us the use of Facebook Connect by French site managers “has also been the subject of complaints to the CNIL, which are currently being investigated”.

Google was also contacted for comment on the CNIL’s decision and how it plans to respond but at the time of writing it had not responded.

Commenting on the French watchdog’s slapdown in a statement, noyb’s founder and honorary chair, Max Schrems, said: “It’s interesting to see that the different European Data Protection Authorities all come to the same conclusion: the use of Google Analytics is illegal. There is a European task force and we assume that this action is coordinated and other authorities will decide similarly.”

Privacy Shield v3 to the rescue?

One factor that could change the situation is a new agreement between the EU and U.S. on data transfers.

Negotiations between the European Commission and U.S. counterparts are ongoing in an attempt to plug the data transfer gap as happened after the CJEU struck down Safe Harbor in 2015 (aka Schrems I), meaning it was fairly quickly replaced by Privacy Shield, until that too was soon invalidated.

This pattern of complaints leading to (quicker) strike downs makes a ‘quick fix’ impossible — even if the enforcements now landing against mainstream tools like Google Analytics will certainly concentrate minds in Brussels and Washington, increasing political and economic urgency to find a way to solve this issue.

The Commission has said it’s keen for a replacement data transfer agreement with the U.S. However it has also warned repeatedly that any such deal must be robust to future legal challenge — meaning it must substantially address the CJEU’s concerns. And without broad reform of U.S. surveillance practices that looks difficult.

Still, in recent weeks, reports have suggested the EU and U.S. are nearing agreement on a new data transfer arrangement — potentially as soon as this month, according to reporting by Politico, which also suggested the two sides could unveil a new accord in May at an upcoming meeting of the Trade and Tech Council.

Details of how exactly the U.S. and EU will be able to square the data transfer (il)legality circle are scant, though.

Per Politico, one redress mechanism that is being discussed would allow EU citizens to directly (or via their national governments) submit complaints to an independent judicial body if they believe U.S. national security agencies have unlawfully handled their personal information.

But that still leaves plenty of questions. Not least how an EU citizen could know to complain in the first place, given the lack of notification of U.S. surveillance intercepts.

The U.S. also still does not have a federal privacy law similar to the EU’s GDPR, meaning its own citizens lack comprehensive protections for their information — illustrating quite how far apart the two jurisdictions remain on this issue.

And while some U.S. states — such as California — have taken matters into their own hands in recent years, passing laws to provide residents with some legal rights wrapping their information, privacy protections for U.S. citizens remain, at best, a patchwork. Given that, it may be tricky for the Biden administration to provide greater rights for non-U.S. citizens to complain about U.S. surveillance vs what the country provides to its own citizens.

Commercial pressure is building on this issue though.

Just this week Facebook/Meta felt moved to publish a blog post — rejecting reporting of its financial filing that claimed its disclosures amounted to a threat to pull its service out of Europe as a result of the data transfer uncertainty.

“We want to see the fundamental rights of EU users protected, and we want the internet to continue to operate as it was intended: without friction, in compliance with applicable laws — but not confined by national borders,” the tech giant wrote, urging progress on a new deal.

Meta does have its own very pressing cause to press for a fresh ‘fix’, though, given that its business is subject to a very long-standing data transfers complaint — and it’s now over a year since its lead EU data regulator, the Irish Data Protection Commission, promised to swiftly resolve that complaint.

By contrast, EU-based platforms that can localize and legally firewall user data in the bloc, where it’s shielded by the GDPR, have reasons to be cheerful.

To wit: Last month — in the wake of the Austrian ruling — one Poland-based Google Analytics competitor, Piwik Pro, told us that Schrems II was one of the main concerns raised by organisations contacting it to seek a Google Analytics alternative.

“Just two weeks after the Noyb’s 101 complaints list was published we’ve acquired as a customer one the major banks listed there,” said CEO Maciej Zawadziński. “Interest in our product and services is directly affected by all the developments in the privacy & compliance space. The Schrems II ruling was big for us last year, just like the Austrian DPA’s ruling is fairly big now.

“We predict that in 2022 local EU data storage that eliminates offshore data transfers completely will be an important selling point.”

Zawadziński added that the company had opened an EU located data center to host and process client data for this reason, noting: “The data center is managed by an EU company and neither we nor any of our suppliers is subject to the [U.S.] Cloud Act.”

Schrems also predicts a splintering of digital services and dedicated EU product provision — unless or until the U.S. reforms its approach to privacy. “In the long run we either need proper protections in the U.S., or we will end up with separate products for the U.S. and the EU. I would personally prefer better protections in the U.S., but this is up to the U.S. legislator — not to anyone in Europe,” he added in a statement.

This report was updated with additional comment

Meta sent a new draft decision on its EU-US data transfers

Facebook has received a “revised” preliminary decision from its lead EU privacy regulator with implications for its ability to continue to export user data to the US, TechCrunch has learned.

“Meta has 28 days to make submissions on this preliminary decision at which point we will prepare a draft Article 60 decision for other Concerned Supervisory Authorities (CSAs). I’d anticipate that this will happen in April,” a deputy commissioner at the Irish Data Protection Commission (DPC), Graham Doyle, told us.

Doyle declined to detail the contents of the preliminary decision.

However, back in September 2020, the DPC sent a preliminary order telling Facebook to suspend data transfers, per a Wall Street Journal report at the time, citing people familiar with the matter.

Meta, as the tech giant has recently rebranded its data-mining empire, has been flagging the ongoing risk to its EU-US data transfers in calls with investors.

It also immediately sought to challenge the DPC’s earlier draft order in the courts — but that legal avenue ran out of road in May last year when the Irish High Court issued a ruling dismissing the challenge to the DPC procedures.

It’s not clear there has been any material change to the facts of the case — which hinges on the clash between European data protection law and US surveillance powers — since the earlier draft order telling the company to suspend transfers, such that the regulator would arrive at a different conclusion now, regardless of what Meta submits at this next stage.

Moreover, in recent months, other European data protection agencies have been issuing decisions against other US services that involve the transfers of personal data to the US — such as Google Analytics — which is, from an optics perspective at least, amping up the pressure on the DPC to finalize a decision against Meta.

The regulator also faced a procedural challenge by the original complainant, Max Schrems, who extracted an agreement from it, in January 2021, that it would swiftly finalize the long-standing complaint — so that’s another quasi deadline in play.

Under the terms of that settlement, the DPC agreed Schrems would also be heard in its (parallel) “own volition” procedure — which it opened in addition to its complaint-based enquiry related to his original (2013) complaint, and which is now moving forward via this new preliminary decision issued to Meta.

Schrems confirmed he has been sent the decision by the DPC — but made no further comment.

(For yet more twists, back in November, the privacy advocacy group founded by Schrems filed a complaint of criminal corruption against the DPC — accusing the regulator of “procedural blackmail” in relation to attempts to prevent publication of other draft complaints… )

It’s still not clear how long exactly this multi-year data transfer saga could drag on before a final decision hits Meta — potentially ordering it to suspend transfers.

But it should be closer to months than years, now.

The Article 60 process loops in other interested data protection agencies — which have the ability to make reasoned objections to a draft decision by a lead authority within an initial one-month timeframe. Although there can be extensions. And if there is major disagreement between DPAs over a preliminary decision it can add months to the final decision-making process — and could ultimately require the European Data Protection Board to step in and push a final decision.

All that’s still to come; for now the ball is back in Meta’s court to see what fresh blather its lawyers can come up with.

The tech giant was contacted for comment on the latest development and in a statement a Meta spokesperson told us:

“This is not a final decision and the IDPC have asked for further legal submissions. Suspending data transfers would be damaging not only to the millions of people, charities, and businesses in the EU who use our services, but also to thousands of other companies who rely on EU-US data transfers to provide a global service. A long-term solution on EU-US data transfers is needed to keep people, businesses and economies connected.”

There is another moving piece to this apparently neverending story — as negotiations between the European Commission and the US on a replacement to the defunct Privacy Shield data transfer arrangement remain ongoing.

In recent months, Facebook and Google have been making public calls for a new transatlantic data transfer deal to be agreed — urging a high level fix for the legal uncertainty now facing scores of US cloud services (or at least those that refuse to give up their own access to people’s data-in-the-clear).

However the Commission has previously warned there will be no ‘quick fix’ this time — saying back in 2020 that a replacement would only be possible if all the issues identified by the European Court of Justice in its July ruling which invalidated Privacy Shield can be resolved (which means both a legal and accessible means of redress for Europeans and tackling disproportionate US surveillance powers which rely on bulk intercepts of Internet communications).

So, in short, Privacy Shield 3.0 looks like a tall order — certainly in the kind of short order that Meta’s business-as-usual demands… So chief lobbyist, Nick Clegg, certainly has his work cut out!

 

Privacy Shield 2.0 is ‘high priority’ but ‘not easy’, warns EU’s Vestager

Agreeing a new data transfer agreement with the US is a “high priority” for the EU, Margrethe Vestager, the bloc’s executive VP for digital strategy, said yesterday — but she also warned that a replacement for the defunct EU-US Privacy Shield (and Safe Harbor before that) is by no means a done deal, given the fundamental legal clash between European privacy rights and US surveillance overreach.

In recent weeks some press reports have suggested a new deal on transatlantic data transfers is imminent — potentially as soon as this month, per a Politico report from February 3.

However the mood music from commissioner Vestager suggests otherwise.

“This is a high priority endeavour to make such an agreement with the Americans,” she said during a Q&A session at a press conference on the Commission’s latest proposal around data sharing (aka the Data Act). “This is not easy, to say it really understated. Because we take the guidance of course from the court [CJEU] who ruled on the basis of the Charter of Fundamental Rights which is not something that we can or will change.

“So we need to find a way of working with the Americans that is in accordance with this — in order of course not to get a negative Schrems III judgment, if so be. But it is a priority for us in order to enable the business community to make the most of data but again to do that under safe and clear transparent conditions — and this is why we’re pushing this.”

The reason the data transfers issue came up in the context of the Data Act — which Vestager herself suggested is mostly concerned with non-personal data (whereas the Schrems rulings that nixed Privacy Shield and Safe Harbor concern exports of personal data out of the bloc) — is that the draft legislation proposes a sort of ‘Schrems II for non-personal data’, as data protection experts quickly dubbed it.

An explanatory memorandum prefixed to the draft Data Act proposal lists “safeguards against unlawful data transfer without notification by cloud service providers” as one of its specific objectives — explaining: “This is because concerns have been raised about non-EU/European Economic Area (EEA) governments’ unlawful access to data. Such safeguards should further enhance trust in the data processing services that increasingly underpin the European data economy.”

Article 27 of the Data Act, which deals with international access and transfer, also states:

“Providers of data processing services shall take all reasonable technical, legal and organisational measures, including contractual arrangements, in order to prevent international transfer or governmental access to non-personal data held in the Union where such transfer or access would create a conflict with Union law or the national law of the relevant Member State”

Summing up the intent, an EU source familiar with the matter told us: “We are saying that non personal data shouldn’t leave EU if it’s likely to fall into hands of foreign spooks we don’t trust” — also likening it to a “Schrems II for non-personal data”.

So for anyone fondly imagining that the regional legal uncertainty that’s been hanging over (especially) US-based cloud services, since the middle of 2020, is but a little fog that’s bound to clear, this plain-text stipulation on data transfers looks ominous.

Here in the draft text of the Data Act the Commission can be seen essentially doubling down on Schrems II — rather than seeking ways to circumvent the CJEU judgement, as it did after Schrems I by rushing to agree a Privacy Shield with such obvious legal flaws.

The European Court of Justice’s two strikes in quick succession on this issue appear to have put paid to any equally cynical attempt to paper over fundamental legal cracks.

Which in turn means that talk of service segregation/federation, and increasing data localization in the EU, feels very real — at least failing major US surveillance law reforms.

During the Data Act press conference, Vestager rejected a journalist’s suggestion that the Data Act is protectionist, asserting: “It is beneficial for companies no matter where they are from that data can flow.”

But she also made it clear that the EU’s rulebook is binding — so without a replacement data transfer agreement between the EU and the US, data will not flow freely.

Even, it seems, ‘non-personal’ data. Which raises the stakes even further — and risks casting the Data Act itself as a bit of a Privacy Shield negotiating tool. Without a robust new data transfer deal between the EU and the US — one which can survive fresh legal challenges — cloud service switching may only get easier for data moving from a US to an EU provider, not vice versa.

“The thing is that we of course have obligations to make sure that the way things are flowing is in accordance with data protection provisions — this is why we can do these adequacy decisions,” Vestager emphasized yesterday. “That goes beyond the Data Act. Right now our colleague Didier Reynders [justice commissioner] is chef de file [leader] of the negotiations with the US to the follow up of the judgement Schrems II.

“So the Data Act will not stand alone. We will continue this work in making adequacy decisions with third country jurisdictions where we can see that things they are as they should be.”

Also reiterating the point at the presser was internal market commissioner, Thierry Breton. “The aim with the Data Act is opening up and unblocking industrial data,” he said. “It’s important we give rules and explanations so that all companies, European or otherwise, know exactly what the rules of the game are on the single market of the EU. We give that readability.

“For the cloud services we need to make sure there are safeguards in place to protect personal data against illicit access by a third party — a foreign government say — where there is no procedural protection or international agreement. That’s why we’re discussing this with our partners to set the rules.”

“It certainly does not prevent voluntary transfer of data if the company or the citizen so wishes,” he added. “It’s obvious but we need to recall it. International cooperation between judicial authorities and police authorities are obviously included in this.”

With the US, the data protection situation is definitely not where it “should be” vis-a-vis equivalence with EU law as it stands. Au contraire.

This is why, in recent months, data protection regulators around the bloc have been issuing enforcement decisions that implicate the use of mainstream US based services like Google Analytics, Google Fonts and Stripe — not out-and-out ordering a halt to the usage of such services but saying usage must be compliant with EU law (and currently isn’t), and therefore that it may be necessary to seek alternatives, given… y’know, the obvious gap there.

France’s watchdog, for example, kicked off a piece of work to evaluate alternatives to Google Analytics for website audience measurement and analytics that may be exempt from the need to obtain user consent.

European public sector bodies’ use of cloud services is also facing coordinated scrutiny via a joint enforcement action which began earlier this month — similarly zeroing in on concern over international data transfers.

Plus of course there’s a major decision still looming over Facebook’s EU-US data flows — which were Schrems’ original target, all the way back in 2013.

An order to suspend those could be coming as soon as May, according to the Irish Data Protection Commission’s (DPC) chief, Helen Dixon, in an interview with Reuters. Although she also made it clear the Irish regulator won’t be issuing widespread orders on foot of whatever it decides on Facebook.

“The decision that the DPC will ultimately make in relation to Facebook will be specific to Facebook and addressed only to Facebook,” she said. “The consequence of the CJEU decision is that we can’t make a broader and more sweeping finding. We have to go company by company by company” — further noting there are “hundreds of thousands of entities” that would potentially have to be looked at, per the Reuters report, starting with other large internet platforms.

The DPC already issued a preliminary suspension order to Facebook soon after the CJEU Schrems II ruling, in September 2020, but the tech giant quickly obtained a stay — before going on to lose its challenge to regulatory procedure in the Irish High Court last May.

And as we reported earlier this week the DPC has now submitted a revised preliminary decision to Facebook’s parent, Meta — giving the company a month to respond.

After which the other EU data supervisors will have a chance to review and potentially object to the Irish draft decision, which could add months more to the decision-making process. But if there’s broad agreement over whatever Ireland has concluded Dixon’s line is that “the earliest time we could have a final decision could be the end of May”.

Ireland’s slow pace of enforcement on investigations into tech giants means there’s absolutely no prospect of any other near term decisions landing on the data transfers issue against companies like Google.

However, EU wide, we are seeing other regulators taking action where they have local competence — so it may be a case of ‘death by a thousand complaints’ against tools like Google Analytics, for which viable alternatives do absolutely exist (Facebook isn’t the only social network but it’s a stickier beast, owing to network effects and data portability challenges).

One burning question is whether there will be a fresh ‘Privacy Shield 2.0’ agreed by the EU and US before Ireland decides on Facebook’s data flows — assuming there’s a final decision from Ireland at the end of May.

Even if there’s basic agreement between the two sides on the substance of a new deal by then, that timeline looks tight — with any new draft adequacy arrangement still needing to be adopted by the Commission, which would need to wait for an opinion from the European Data Protection Board (EDPB).

Last time, after Safe Harbor was invalidated in October 2015, it took around seven months between the draft Privacy Shield deal being published (February 2016) and the mechanism being adopted by the Commission — and finally going live for businesses to self certify (August 2016).

Although, notably, the Working Party 29 — aka the body made up of Member State data protection agencies, which has since morphed into the EDPB — agreed not to cut off any transfers during the Privacy Shield hashing out period.

Meta may well be banking on a similarly generous implementation grace period for any new Privacy Shield — to allow it to keep dodging an order to suspend its EU-US data flows.

That said, it’s not clear whether the EDPB would feel it’s in its gift to do so this time around, given enforcements on the data transfers issue are already happening without the need to wait on Ireland.

Schrems’ August 2020 101 complaints, deliberately filed with agencies around the EU to counteract forum shopping, have made sure of that.

The CJEU is also of course likely to take a very dim view of any replacement adequacy agreement that repeats the mistakes of the past. And the court has shown an ability to accelerate deliberations where it perceives major risks to fundamental rights. So while Privacy Shield limped along for four years, any flawed replacement — let’s call it a ‘Privacy Umbrella’ — may have an even shorter run before being blown hopelessly inside out.

Perhaps most saliently: A third strike from the CJEU would be a massive embarrassment for the Commission — which explains Vestager’s loud, cautionary signals, to the point of explicitly stating that it does not want “a negative Schrems III judgment”.

Whether the Commission will once again willingly carry the illegal data flows of Meta et al is a particularly interesting question.

It is not the same college that went through all this last time round. Moreover, it has embarked on an ambitious tech policy agenda — of which the Data Act is just the latest puzzle piece, next to sweeping new plans to rein in tech giants’ market power, update ecommerce rules and define a framework for ‘trusted AI’, among numerous other legislative moves with which it wants to reshape the digital economy and European society and fire up the EU economy.

Hence it talks a big game of ‘digital sovereignty’.

Yet the EU’s appetite for finding out what digital sovereignty means in practice, at the business end of scores of disrupted data flows, could be sorely tested very soon.

 

EU, US agree on data transfer deal to replace defunct Privacy Shield


The European Union has just announced reaching an agreement in principle with the U.S. on a revived trans-Atlantic data flows deal — potentially signaling an end to the many months of legal uncertainty that has dogged cloud services after a landmark court ruling in July 2020 that struck down the EU-U.S. Privacy Shield.

“We have found an agreement in principle on a new framework for trans-Atlantic data flows,” European Commission President Ursula von der Leyen said at a joint press conference with U.S. President Joe Biden today.

“This will enable predictable, trustworthy data flows between the EU and the U.S., safeguarding privacy and civil liberties.”

The legal uncertainty hanging over EU-U.S. data flows has led, in recent months, to European data protection agencies issuing orders against flows of personal data passing via products such as Google Analytics, Google Fonts and Stripe, among others.

Facebook’s lead EU regulator also finally sent a revised draft decision to Meta last month, in a multiyear complaint related to its EU-U.S. data flows, after the company exhausted legal challenges against an earlier preliminary suspension order in fall 2020.

The social networking giant still hasn’t actually been ordered to suspend its EU-U.S. data flows, though. And it may now dodge that bullet entirely if EU regulators agree to suspend data transfer enforcements now that there’s a political agreement in place with the U.S., as they did when Privacy Shield was agreed in principle: a grace period of suspended enforcements during however many months are needed to secure final agreement and adopt the new EU-U.S. data flows deal.

That will surely be what Meta has been hoping would happen as it sought to delay earlier enforcement.

The detail of what has been agreed by the EU and U.S. in principle — and how exactly the two sides have managed to close the gap between what remain two very differently oriented legal systems — is not clear. And since the sustainability of the deal will hinge on exactly that fine detail, there is little that can be taken away from today’s announcement beyond the political gesture.

The uncertainty over EU-U.S. data transfers actually extends back further than 2020. A much longer-standing predecessor agreement, called Safe Harbor, was invalidated by Europe’s top court in 2015 over the same core clash between EU privacy rights and U.S. surveillance laws.

This dynamic means that any replacement deal faces the daunting prospect of fresh legal challenges to test how robust it is when it comes to ensuring that EU citizens’ rights are adequately protected when their data flows to the U.S.

“We managed to balance security and the right to privacy and data protection,” von der Leyen suggested in further brief remarks during a wide-ranging press conference. She also couched the agreement reached as “balanced and effective” but provided no specifics on what has actually been decided.

Update: The White House has now released this “fact sheet” on the transatlantic data framework agreement which sheds a little light on where the two sides have focused — noting for example that EU individuals will be able to seek redress from “a new multi-layer redress mechanism that includes an independent Data Protection Review Court” that the U.S. administration says would consist of individuals “chosen from outside the U.S. Government who would have full authority to adjudicate claims and direct remedial measures as needed”.

The Commission had very similar things to say about Privacy Shield (and Safe Harbor) — until the court took a very different view, of course. So it’s important to understand that a full and final assessment does not and cannot rest with EU commissioners or their U.S. counterparts.

Only the European Court of Justice can weigh in.

Max Schrems, the privacy lawyer and campaigner whose name has become synonymous with striking down trans-Atlantic data transfer deals (aka Schrems I and Schrems II), was quick to sound a note of skepticism.

Responding to von der Leyen’s announcement in a tweet, he wrote: “Seems we do another Privacy Shield especially in one respect: Politics over law and fundamental rights.

“This failed twice before. What we heard is another ‘patchwork’ approach but no substantial reform on the U.S. side. Let’s wait for a text but my [first] bet is it will fail again.”

Schrems famously — and correctly — called Privacy Shield lipstick on a pig. So his assessment of the text, when it emerges, will arguably have more weight than the Commission’s.

Via his privacy advocacy not-for-profit, noyb, Schrems also said he expects to be able to get any new agreement that does not meet the requirements of EU law back to the CJEU “within a matter of months” via civil litigation and preliminary injunction.

“[O]nce [the final text] arrives we will analyze it in depth, together with our U.S. legal experts. If it is not in line with EU law, we or another group will likely challenge it. In the end, the Court of Justice will decide a third time. We expect this to be back at the Court within months from a final decision,” he noted in a statement. “It is regrettable that the EU and U.S. have not used this situation to come to a ‘no spy’ agreement, with baseline guarantees among like-minded democracies. Customers and businesses face more years of legal uncertainty.”

The response from the tech industry to the news of another revived data transfer deal was predictably positive.

Google, which along with Meta has been pressing hard in recent months for the two sides to come up with a viable compromise, was quick to welcome the announcement.

In a statement, a company spokesperson told us:

“People want to be able to use digital services from anywhere in the world and know that their information is safe and protected when they communicate across borders. We commend the work done by the European Commission and U.S. government to agree on a new EU-U.S. framework and safeguard transatlantic data transfers.”

The CCIA tech industry association, which has also lobbied hard for a replacement to Privacy Shield, welcomed today’s announcement as “good news.” Although its director, Alexandre Roure, found a little space in his response statement to express needling displeasure with incoming EU rules on industrial and connected device data reuse — which he suggested will introduce fresh “data restrictions.”

Editor’s note: This report was updated with additional comment and a pointer to the White House fact sheet.


EU-US trans-Atlantic data transfers ‘deal in principle’ faces tough legal review


The political agreement reached late last month between the European Union and the United States on a new trans-Atlantic data transfers pact, which aims to end years of legal uncertainty for businesses exporting data from the bloc, is not yet a done deal.

The deal in principle faces scrutiny in the coming months once the full text is published — and will most likely face fresh (and fast) legal challenges if it does get adopted, so everything hinges on the detail. 

Yesterday, the European Data Protection Board (EDPB), which advises on compliance with EU data protection law, put out a statement signaling where it will be directing its attention when it reviews this detail — saying it will be paying “special attention to how this political agreement is translated into concrete legal proposals.”

“The EDPB looks forward to assessing carefully the improvements that the new framework may bring in light of EU law, CJEU case law and previous recommendations of the Board, once the EDPB receives all supporting documents from the European Commission,” the board wrote.

“In particular, the EDPB will analyse whether the collection of personal data for national security purposes is limited to what is strictly necessary and proportionate. In addition, the EDPB will examine how the announced independent redress mechanism respects EEA individuals’ right to an effective remedy and to a fair trial. More specifically, the EDPB will look into whether any new authority part of this mechanism has access to relevant information, including personal data, when exercising its mission and whether it can adopt decisions binding on the intelligence services. The EDPB will also consider whether there is a judicial remedy against this authority’s decisions or inaction.”

The EDPB also warned that the political deal is not yet a legal agreement — emphasizing that data exporters must continue to comply with the case law of the bloc’s top court in the meanwhile, especially with the July 2020 ruling by the CJEU, aka Schrems II, which struck down the last EU-U.S. data transfers deal, the EU-US Privacy Shield.

Talking up the political deal reached last month to replace the defunct Privacy Shield, the Biden administration said the U.S. has committed to putting in place “new safeguards” that it said would ensure that state surveillance agencies’ data-gathering activities will be “necessary and proportionate” and linked to “defined national security objectives.”

The clash between the primacy of U.S. surveillance laws and robust EU privacy rights remains the fundamental schism — so it’s difficult to see how any new deal will be able to stand against fresh legal challenges unless it commits to putting hard limits on U.S. mass surveillance programs.

The replacement deal will also need to create a proper avenue for EU individuals to seek and obtain redress if they believe U.S. intelligence agencies have unlawfully targeted them. And that also looks difficult.

Last month, ahead of the announcement of the political agreement, The Hill reported on a U.S. Supreme Court ruling in a case related to FBI surveillance that it suggested made the chance of a deal harder — as the court reinforced state secrets privilege for spying cases by finding that Congress did not eliminate this privilege when it enacted surveillance reforms in the Foreign Intelligence Surveillance Act (FISA).

“Though the opinion left open the possibility that people … nonetheless could pursue claims based on public information about the government’s surveillance, most people need sensitive information from the government to help prove that its surveillance was illegal. The decision could make it easier for the government to shield such information from judges, and therefore harder for most people challenging surveillance to prove their claims and obtain justice in court,” the publication reported.

The need for deeper reforms of FISA has been a key call from critics of previous EU-U.S. data transfer deals (before Privacy Shield, there was Safe Harbor — which was struck down by the CJEU in 2015).

Last month, the White House said the deal agreed in principle would enable EU individuals to “seek redress from a new multi-layer redress mechanism that includes an independent Data Protection Review Court that would consist of individuals chosen from outside the U.S. Government who would have full authority to adjudicate claims and direct remedial measures as needed.”

However, the legal status of this “Review Court” will be key — as the EDPB’s statement underlines.

Moreover, if the U.S. Supreme Court takes a view that essentially overrides whatever the Biden administration is promising, by making it impossible for EU individuals to obtain the information they need to bring a claim against the U.S. government, that would undermine their ability to actually obtain redress. And the CJEU has made it clear that EU individuals subject to illegal surveillance in a third country must have a genuine and meaningful way to pursue accountability.

The EDPB’s statement elucidates exactly these concerns — with the board flagging that any “new authority” set up under a claim of delivering redress will need “access to relevant information, including personal data” in order to be able to live up to that mission and will also need to be able to adopt decisions that are binding on the intelligence services.

It’s worth remembering that the “ombudsperson” regime tested in Privacy Shield didn’t pass muster with the CJEU — both on grounds of independence and because of the ombudsperson’s inability to adopt decisions that are binding on the intelligence services.

How different a “Data Protection Review Court” would be in those regards remains to be seen.

Max Schrems, the EU privacy campaigner who successfully brought down the last two EU-U.S. data transfers deals, remains skeptical that the latest “fix” offers anything substantially different — recently tweeting another eye-catching visual metaphor to illustrate his early assessment.

Failing genuine surveillance reform in the U.S., it may well be that squaring the data-transfer circle is as steep a challenge as it has proved the last two times around the block. But even if the political imperative inside the EU to do a deal overrides obvious legal gaps — as it did when the last Commission ignored concerns and adopted the Privacy Shield — that will just mean the two sides are buying time until the next CJEU strike down.

Likely not very much time, either.

While Safe Harbor stood for 15 years, Privacy Shield only lasted four — and Schrems has suggested a fresh challenge to another flawed replacement would be fast-tracked into the CJEU “within months” of a final decision to adopt it. So EU lawmakers have been warned.

Italy’s data watchdog latest to warn over use of Google Analytics


Another strike against use of Google Analytics in Europe: The Italian data protection authority has found a local web publisher’s use of the popular analytics tool to be non-compliant with EU data protection rules owing to user data being transferred to the U.S. — a country that lacks an equivalent legal framework to protect the info from being accessed by U.S. spooks.

The Garante found the web publisher’s use of Google Analytics resulted in the collection of many types of user data, including device IP address, browser information, OS, screen resolution, language selection, plus the date and time of the site visit, which were transferred to the U.S. without adequate supplementary measures being applied to raise the level of protection to the necessary EU legal standard.

Protections applied by Google were not sufficient to address the risk, it added, echoing the conclusion of several other EU DPAs who have also found use of Google Analytics violates the bloc’s data protection rules over the data export issue.

Italy’s DPA has given the publisher in question (a company called Caffeina Media Srl) 90 days to fix the compliance violation. But the decision has wider significance as it has also warned other local websites that are using Google Analytics to take note and check their own compliance, writing in a press release [translated from Italian with machine translation]:

[T]he Authority draws the attention of all Italian operators of websites, public and private, to the illegality of transfers made to the United States through GA [Google Analytics], also in consideration of the numerous reports and questions being received by the Office, and invites all data controllers to verify that the way cookies and other tracking tools are used on their websites, with particular attention to Google Analytics and other similar services, complies with the legislation on the protection of personal data.

Earlier this month, France’s data protection regulator issued updated guidance warning over illegal use of Google Analytics — following a similar finding of fault with a local website’s use of the software in February.

The CNIL’s guidance suggests only very narrow possibilities for EU-based site owners to use Google’s analytics tool legally — either by applying additional encryption where keys are held under the exclusive control of the data exporter itself or other entities established in a territory offering an adequate level of protection; or by using a proxy server to avoid direct contact between the user’s terminal and Google’s servers.
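The proxy route essentially means the site’s own server receives the measurement hit and strips anything identifying before forwarding it to Google. As a rough illustration only — the parameter names below follow Google’s public (Universal Analytics) Measurement Protocol, but the scrubbing policy is a simplified assumption, not the CNIL’s full checklist — such a proxy might rewrite a collect request like this:

```python
from urllib.parse import parse_qs, urlencode

# Hypothetical scrub rules: drop the client IP override, user agent and
# referrer, and replace the persistent client ID. A real CNIL-compliant
# proxy has further requirements (e.g. around cross-site identifiers).
STRIPPED_PARAMS = {"uip", "ua", "dr"}

def scrub_hit(query_string: str) -> str:
    """Rewrite a GA 'collect' query string before forwarding it."""
    params = {k: v[0] for k, v in parse_qs(query_string).items()}
    for key in STRIPPED_PARAMS:
        params.pop(key, None)
    if "cid" in params:
        # Placeholder: a real proxy would substitute a rotating,
        # proxy-generated identifier rather than a constant.
        params["cid"] = "0"
    return urlencode(params)
```

The point of the exercise is that Google’s servers only ever see the proxy’s IP address and a sanitized payload, rather than the visitor’s terminal connecting directly.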

Austria’s DPA also upheld a similar complaint over a site’s use of Google Analytics in January.

The European Parliament, meanwhile, found itself in hot water over the same core issue at the start of the year.

All these strikes against Google Analytics link back to a series of strategic complaints filed in August 2020 by European privacy campaign group noyb — which targeted 101 websites with regional operators it had identified as sending data to the U.S. via Google Analytics and/or Facebook Connect integrations.

The complaints followed a landmark ruling by the bloc’s top court in July 2020 — which invalidated a data transfer agreement between the EU and the U.S., called Privacy Shield, and made it clear that DPAs have a duty to step in and suspend data flows to third countries where they suspect EU citizens’ information of being at risk. 

The so-called ‘Schrems II’ ruling is named after noyb founder and longtime European privacy campaigner, Max Schrems, who filed a complaint against Facebook’s EU-U.S. data transfers, citing surveillance practices revealed by NSA whistleblower Edward Snowden, which ended up — via legal referral — in front of the CJEU. (A prior challenge by Schrems also resulted in the previous EU-U.S. data transfer arrangement being struck down by the court in 2015.)

In a more recent development, a replacement for Privacy Shield is on the way: In March, the EU and the U.S. announced they had reached political agreement on this.

However the legal details of the planned data transfer framework still have to be finalized — and the proposed mechanism reviewed and adopted by EU institutions — before it can be put to any use. Which means that use of U.S.-based cloud services remains shrouded in legal risk for EU customers. 

The bloc’s lawmakers have suggested the replacement deal may be finalized by the end of this year — but there’s no simple legal patch EU users of Google Analytics can reach for in the meanwhile. 

Additionally, the gap between U.S. surveillance law and EU privacy law continues to grow in certain regards — and it’s by no means certain the negotiated replacement will be robust enough to survive the inevitable legal challenges.

A simple legal patch for such a fundamental clash of rights and priorities looks like a high bar — failing substantial reform of existing laws (which neither side looks moved to offer).

Hence we’ve started to see software-level responses by certain U.S. cloud giants — to provide European customers with more controls over data flows — in a bid to find a way to route around the data transfers legal risk.

Update: A Google spokesman sent us this statement following the Garante decision:

People want the websites they visit to be well designed, easy to use, and respectful of their privacy. Google Analytics helps publishers understand how well their sites and apps are working for their visitors — but not by identifying individuals or tracking them across the web. These organizations, not Google, control what data is collected with these tools, and how it is used. Google helps by providing a range of safeguards, controls and resources for compliance.

He also told us Google is reviewing the Italian DPA’s decision.

In a blog post back in January the company sought to reframe the narrative around Google Analytics — claiming the tool isn’t used to track people around the web or profile them; and arguing it’s not a privacy risk by suggesting customers remain in control of the data they collect via the analytics tool; as well as pointing out it offers an IP anonymization feature.

Google’s blog post also emphasizes “numerous measures” it claims it applies to “protect data, and safeguard it from any government access.”

However a number of European DPAs have now come to a very different conclusion vis-a-vis the Schrems II-related risk of using Google Analytics — including (in the case of Austria‘s DPA) finding that even if IP anonymization had been enabled by the site it would not have fixed the risk.
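For context, Google documents that Universal Analytics feature as truncation: the last octet of an IPv4 address is zeroed (and the last 80 bits of an IPv6 address are dropped) before the full address is discarded. A rough sketch of the masking logic (this is not Google’s code, and note the truncation is applied only after the hit reaches Google’s servers, which is part of the regulators’ concern):

```python
import ipaddress

def anonymize_ip(ip: str) -> str:
    """Approximate GA's documented anonymizeIp masking: zero the last
    octet of an IPv4 address, or the last 80 bits of an IPv6 address."""
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48  # bits retained
    network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(network.network_address)
```

Even with the masking applied, the Austrian DPA’s view was that a truncated IP combined with the other identifiers in a hit can still single out a user — which is why enabling this toggle alone was not considered a fix for the transfer problem.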

Although — responding to this point — Google’s spokesman pointed to a recent update (Google Analytics 4) that he said has introduced more controls and product configurations since the versions of the software which originated the complaints to DPAs; and which he suggested could help address concerns about data export risks — such as via the ability to stop the transfer of IP addresses (including anonymized IP addresses) outside the EU; disable Google Signals data collection at country level; and disable granular location and device data collection at the country level. 

He added that while Google remains convinced that the only sustainable solution to the recurring uncertainty around EU to U.S. data exports is a durable legal framework, the tech giant is exploring developing additional controls to provide its customers with further assurances around the safeguarding of user data.

This report was updated with additional responses from Google.

Summer decision looms for Facebook’s EU-US data transfers


The wheels of privacy enforcement are slowly turning against Facebook in Europe — where its lead data protection regulator, Ireland’s Data Protection Commission (DPC), has taken a key procedural step on a data transfers complaint whose substance dates back almost a decade.

The DPC confirmed today that a draft decision on the legality of Meta’s EU-U.S. data transfers has been sent to other data protection agencies to review. Deputy commissioner, Graham Doyle, declined to provide any details about the decision itself — confirming only that it has been sent.

“We have sent it to our colleague data protection authorities for their views and they have one month to come back to us,” he told TechCrunch.

Politico, which reported this development earlier today, is also reporting that the DPC’s draft decision orders Meta to cease EU-U.S. data exports — and the publication goes on to claim that the order could result in Europeans being cut off from services such as Facebook and Instagram as soon as this summer, if the order is confirmed by other EU data protection agencies who are reviewing it.

A DPC order to Facebook blocking it from exporting EU citizens’ data to the US for processing, which is essentially how its service works currently, would not be a surprise: Back in September 2020, The Wall Street Journal also reported that the DPC had sent Meta a preliminary order to suspend EU-U.S. data flows.

The regulator did not confirm the substance of the order then either but the development followed a landmark decision by the bloc’s top court, in July 2020, which blew a fresh hole in the legal framework around data exports to the U.S. owing to the clash between U.S. surveillance law and EU privacy rights — so the specific substance of the order did not need spelling out.

What would be a surprise, in this painfully long and twisted data protection (lack of) enforcement saga, would be if the wheels of Europe’s regulators turned so fast that Facebook’s data flows were actually ordered to cease this summer.

Plus — given parallel reports that EU-U.S. negotiations to finalize the replacement for the defunct Privacy Shield data transfer mechanism have stalled since a political deal was reached on it back in March, and may now no longer be completed by the end of the year (as the bloc has previously suggested) — cynics might suggest that a leak now about Facebook’s data flows being on the cusp of being blocked could be a strategic ploy to grease the wheels of those high level talks.

Commission lawmakers certainly won’t relish reading summer headlines about Europeans’ Facebook access being cut off — even if the company itself continues to have a poor reputation across the wider sweep of EU institutions, following years of privacy scandals.

Max Schrems, the lawyer and European privacy campaigner who filed the original Facebook data transfer complaint back in 2013, is also doubtful that today’s development will lead to a swift resolution. In a statement responding to press reports of the draft decision, he said he anticipates that procedural objections will keep spinning out the enforcement process — potentially for many more months, or even as long as a year.

“We expect other DPAs to issue objections, as some major issues are not dealt with in the DPC’s draft,” he wrote in a response posted to the website of noyb, his privacy rights not-for-profit. “This will lead to another draft and then a vote. In other cases this took another year overall, as the DPC did not implement comments from other DPAs voluntarily and took more than half a year to forward the case for a vote.”

So — tl;dr — don’t bet the farm on Facebook shutting down in Europe before the new school year.

Schrems also points out the draft decision passed by the DPC to other EU DPAs is still not a decision on his original complaint. That’s because the regulator opened an ‘own volition’ enquiry alongside his complaint, which is what this draft decision relates to. So his complaint is still very much unresolved — underlining the challenge for citizens to exercise the EU rights they have on paper against powerful tech giants.

This is also why Schrems is calculating his wait for enforcement as nine years (it’s also two years since the landmark CJEU decision that struck down the EU-U.S. Privacy Shield mechanism and yet Facebook’s data still flows).

Schrems expects yet more delays to enforcement too — predicting the tech giant will throw the kitchen sink at litigating against any order; and querying why the DPC (seemingly) isn’t reaching for a financial penalty in this case, which he argues could actually be a useful enforcement lever here, especially if backdated to his original complaint… (We asked Schrems about the substance of the DPC’s draft decision but he said he’s unable to provide public comment.)

“Facebook will use the Irish legal system to delay any actual ban of data transfers,” he predicts in the prepared remarks. “Ireland will have to send the police to physically cut the cords before these transfers actually stop. What would be however easy to do, is a fine for the past years, where the CJEU has clearly said the transfers were illegal. It is strange, that the DPC seems to ‘forget’ about the only efficient penalty in this case. You could get the impression, that the DPC just wants to have this case go in circles again and again.”

Delays do seem a given.

Back in February, when the DPC sent a revised decision on the complaint to Meta, the regulator told us it expected this procedural step to be done in April — so even that piece has taken months longer than anticipated without an obvious reason why. (We asked the DPC — but Doyle just said it took “a few weeks longer” than expected.)

Reached for comment on the DPC draft decision being sent to other DPAs for review, a Meta spokesperson sought to play down the whole complaint by suggesting that a fresh data transfer agreement between the EU and the US will soon fix its legal headache.

Here’s Meta’s statement:

This draft decision, which is subject to review by European Data Protection Authorities, relates to a conflict of EU and US law which is in the process of being resolved. We welcome the EU-US agreement for a new legal framework that will allow the continued transfer of data across borders, and we expect this framework will allow us to keep families, communities and economies connected.

What Meta doesn’t mention is that, once adopted, any fresh EU-U.S. data transfer deal is likely to face a fresh legal challenge.

Privacy experts also expect it will take less time for such a challenge to arrive in front of the CJEU this (third) time around, as well as pointing out that the court has also shown itself willing to expedite rulings when there are risks to EU citizens’ fundamental rights. So if Meta is banking on a strategy of perpetually kicking its regional privacy problems into the legal long grass it may, finally — finally! — find itself running out of road and forced to a hard stop.

But the chances of Facebook’s service lights being turned off in Europe this summer look vanishingly small.

On the replacement EU-US data transfer framework that’s still being negotiated, a Commission official declined to offer a revised timeline for likely adoption. “The work is very much ongoing but I do not have a specific timeline to share with you,” she told us. “It is now first of all for the U.S. to translate the political agreement into legal texts and we are working with them on this.”

The EU’s executive is not the only entity which has to be involved in the adoption process, either, with input also required from the European Data Protection Board; a committee of representatives of EU Member States; and the European Parliament.

This report was updated with comment from the Commission

Facebook avoids a service shutdown in Europe for now


Facebook has avoided the risk of being forced to shut down its service in Europe this summer as a result of the latest twist in a long-running data protection complaint saga that relates to a clash between EU privacy and U.S. surveillance law.

The delay — in what’s still widely expected to be a suspension order requiring Meta, Facebook’s parent company, to stop illegal data exports — follows objections from other regional DPAs, which have been reviewing a draft decision by Meta’s lead data protection authority. The Irish Business Post picked up on the development in an earlier report.

Under the bloc’s General Data Protection Regulation (GDPR), cross-border complaints typically require cooperation and at least consensus among DPAs in affected regions, so the regulation provides a right for interested authorities to weigh in on draft decisions by a lead data supervisor.

“We have received some objections from a small number of Data Protection Authorities in this case,” confirmed the Irish Data Protection Commission (DPC)’s deputy commissioner, Graham Doyle. “We are currently assessing the objections and will engage with the relevant authorities to try and resolve the issues raised.”

Doyle declined to provide details of specific objections received.

The development means that a final decision on the (seemingly) neverending saga over the legality of Facebook’s data transfers — and the fate of its service in Europe — will be kicked down the road for several more months at least.

In a previous cross-border GDPR complaint, related to WhatsApp, where objections were similarly raised to Ireland’s proposed enforcement, it took a total of around nine months before a final decision (and hefty fine) was issued.

Meta will also very likely challenge a suspension order in the Irish courts — and could also seek a stay, as it did previously, to try to keep operating as is in the meantime.

Back in September 2020, the DPC sent a preliminary suspension order to Facebook over the data transfers issue — triggering a legal challenge. Facebook won a stay but its bid to roll back the regulator’s decision via judicial review, challenging its procedure, was, eventually, dismissed in May 2021, reviving the enforcement process — which has been grinding on ever since.

The DPC would not comment on an expected timeframe for a final decision to be issued in light of the objections to its draft.

That will, in any case, depend on whether differing views on enforcement between DPAs can be settled without requiring a formal dispute resolution mechanism in the GDPR — which can require the European Data Protection Board to step in (as happened in the WhatsApp case).

If DPAs can’t come to agreement among themselves and the EDPB has to get involved, it’s not beyond the bounds of possibility that a final decision gets pushed into 2023.

Max Schrems, the privacy campaigner and lawyer who originally raised the Facebook data transfers complaint (all the way back in 2013!), has said he expects considerable further delays in enforcement of any suspension order — including by Meta lodging appeals — as we reported previously.

The tech giant has a specific incentive to delay enforcement as long as possible as it may be banking on (or, well, hoping for) a fresh data transfer deal between the EU and the U.S. landing to save Facebook’s service bacon in Europe.

A preliminary agreement on a new high level EU-U.S. accord on data transfers — replacing the defunct Privacy Shield (which is one very tangible casualty of this Facebook data transfers complaint saga thus far; its predecessor Safe Harbor is another) — was reached back in March. And, earlier this year, the European Commission was suggesting it could be finalized by the end of this year.

Since then some reports have suggested progress towards agreeing a final text may not be going as smoothly as hoped, so a replacement deal may not arrive so quickly — which would complicate Meta’s ‘strategy’ (if we can call it that) of banking on further delays to enforcement buying it enough time to switch its European data transfers onto a fresh, unchallenged legal basis.

The latter outcome would of course reset the whole game of legal and regulatory whack-a-mole yet again. So, well, it’s possible this saga could still have years, plural, to run…

Irish government criticized over proposed law-change that would ‘muzzle’ Big Tech critics


An 11th-hour amendment to the government of Ireland’s Courts and Civil Law (Miscellaneous Provisions) Bill 2022 will serve to “muzzle” people looking to speak out about how Big Tech and public bodies are misusing their data.

That’s according to civil liberties and human rights nonprofit the Irish Council for Civil Liberties (ICCL), which is calling on politicians to veto the amendments when they’re presented for debate in Parliament on Wednesday this week.

Since the introduction of GDPR back in 2018, Ireland has emerged as a primary enforcer of Europe’s data privacy regulations, due in large part to the fact that most of the major U.S. tech platforms have their European subsidiaries on the Emerald Isle.

GDPR, in a nutshell, is designed to give citizens control of their data and the ability to hold companies to account through greater transparency and appropriate legal remedies should they mistreat their users’ personal information. Prominent data privacy advocates and activists have used GDPR to do just that, including the ICCL and Austrian lawyer Max Schrems, who has filed numerous complaints against the likes of Amazon, Apple, Netflix and Facebook’s parent Meta over their handling and transferring of user data.

But if the Irish government’s proposed amendment passes, it could silence any meaningful critique both of billion-dollar companies and of the Irish Data Protection Commission (DPC) itself.

“Ireland’s enforcement of the GDPR against Big Tech, and how it upholds the data rights of everyone in Europe, should not be the subject of eleventh-hour amendments inserted during the end-of-term legislative rush,” ICCL senior fellow Dr. Johnny Ryan said in a statement.

‘Confidential information’

The amendment proposes a new section 26A for the Data Protection Act 2018, which would “prohibit the disclosure of confidential information” revealed at any point during a complainant’s interaction with the DPC. So, for example, an activist, advocate or citizen who has filed a complaint with the DPC would not be able to reveal any findings or information garnered as a result of the complaint (for example, to the media) if that information has been deemed “confidential” by the DPC itself*. It’s not entirely clear what kind of information could be classed as “confidential,” but the category seems fairly broad, encompassing “commercially sensitive” information, any information that has been “given in confidence” or information that would be “reasonably expected to prejudice the effectiveness and performance of a relevant function.”

According to the ICCL, if this amendment is greenlighted, it would “make it impossible for journalists to properly report on Ireland’s GDPR supervision of Big Tech firms,” or any organization that counts Ireland as its European base — this includes Meta, Apple, Microsoft, Google and TikTok.

“Justice should be done in public,” Ryan said. “The DPC should be holding public GDPR hearings. Instead, the Government is attempting to make DPC decision making even more opaque.”

None of Your Business (NOYB), an Austria-based nonprofit co-founded by Max Schrems in 2017, also commented on the proposed amendments, saying that Big Tech and the DPC “want privacy for themselves” by preventing people from simply talking about the specifics of a complaint.

“You cannot criticize an authority or big tech companies if you are not allowed to say what’s going on in a procedure,” Schrems said. “By declaring every tiny information ‘confidential’ they try to hinder public discourse and reporting. Instead of reacting to legitimate criticism, they now try to criminalize it. The proposed law in Ireland makes it criminal to share any information on a procedure. This shows that they fear the public and reporters more than anything. The law would however allow the DPC to selectively share information when it sees fit. It is mind blowing that this would happen in a European country.”

TechCrunch has reached out to the DPC for comment, and will update here when we hear back.

*This article was updated to clarify that the proposed amendment would still enable a complainant to reveal details of the complaint itself and that a complaint had been made, it would just mean that they couldn’t divulge any details that emerged in the aftermath of the complaint. 




