Two years in, GDPR defined by mixed signals, unbalanced enforcement (complianceweek.com)
136 points by joering2 on May 29, 2020 | 210 comments


I would pay a subscription to a news site if they spent all their time evaluating 2-5 year old events and determining which side was right.

2 years ago comments of "this will only benefit the lawyers" would be -50 points. Turns out... actually yeah.


Reminds me of when the EU "fixed" cookies and now we have these stupid click-through warnings everywhere that have pretty much ruined the user experience. Root cause: people passing laws about things they have no idea about.


Nothing about the EU law requires sites to put up cookie warnings and degrade the ux. They choose to do that.


But they all choose to do that, so that's the actual outcome of the legislation.

I don't understand why I keep seeing this argument. We all have to deal with cookie dickbars regardless of whether or not your armchair lawyer argument is technically correct. If this is what the law does in practice, and the behavior is generally seen as compliant, then it's a dumb law.


Plenty don't. Hyperbole isn't helpful.

Lots of websites seemingly actually break the law, with full page "can't see the page unless you click accept" etc. The problem seems to be under-enforcement, and then we're right back at the point of TFA.


I agree that (under) enforcement is part of the equation, but I don't think it's the primary issue.

The problem starts when legislators write vague or ill-posed laws because they don't understand the underlying technical issues. If your understanding of the problem is that "cookies are some sort of tracking token and tracking is bad," you will not be able to write effective legislation. You need to have a basic understanding of HTTP, you need to know how cookies fit into HTTP, and you need to be aware of some basic cookie usage patterns. You need to be able to identify that some things that certain companies build using cookies are problematic, and other things are totally benign and are required for basic functionality. You need to be capable of understanding that a user's "allow/deny cookies" preference usually can't even be saved without a cookie.

When the law actually comes out, it's so vague and seemingly self-contradictory that lawyers at these companies are going to say "We have no clue WTF they meant here, or how they intend to enforce this law, or if they even intend to enforce it at all, but just to be safe, let's just do it this way that's obviously stupid, but appears to be what everyone else thinks will pass the sniff test."

Then the law isn't actually enforced, because the enforcers don't understand the law either, so the lawyers are like, "Well, no guidance based on patterns of enforcement, in fact, they don't seem to be enforcing this thing at all, so let's just do whatever we want," which is how you get your laundry list of obviously non-compliant websites.

Legislation needs to be clear, enforcement needs to actually happen, and it needs to happen consistently in order to reinforce the clarity of the original law. If you don't have these things, your legislation is going to fail. The cookie law is the example here, but the same thing applies to GDPR. So far there has been very little enforcement, and what enforcement there has been is extremely inconsistent. It's a really bad start.


> You need to be able to identify that some things that certain companies build using cookies are problematic, and other things are totally benign and are required for basic functionality. You need to be capable of understanding that a user's "allow/deny cookies" preference usually can't even be saved without a cookie.

But they did all that. Functional cookies (shopping carts, preferences, etc.) need no consent. This is not some kind of complicated thing. It only gets complicated if you want to try to trick users into allowing other cookies and/or hope that whenever these things get enforced, they'll start with bigger fish than you.


The worst part is that all the alternatives to cookies are worse privacy-wise... Or at least they would be, if every single browser didn't tacitly accept and keep all cookies. It's getting better, but making cookies permanent should really count as an additional privilege (it does for browser extensions, so why on earth not for arbitrary webpages?), session cookies should really just go away when the tab closes, and first-party isolation should probably be the default.


Hate to say it, but "check your privilege".

You know what cookies are and made your informed decision to accept them in your browsers. I do not, for example, and block most of them.

99% of internet users did not have that knowledge before those "stupid click-through warnings everywhere".

So if you want to write off the outcome of the EU cookie law, it is not "entitled Californian software engineers got a little annoyed", but instead "the whole world woke up to the fact advertising companies are tracking everything they do online via cookies".


That's actually a good idea. It's really frustrating how (in other types of news) a lot of buzz can be generated and then just silence and we forget it all and move on. But it's not really something that would sell well. Not many people care about yesterday's news, people want to know what's coming next and not what came out of some magazine's prediction several years ago.


I agree it wouldn't sell very well.

But postmortems in the tech world do trend sometimes

a news cycle postmortem - someone pitch this in an elevator


It’s a great idea, but I doubt it would succeed. Human nature tends to include not admitting fault. Also, many readers seem to choose their news (at least political news) for confirmation bias (whether intentional or not), so a news site/paper saying they were wrong would defeat that.

Not saying that retractions don’t happen, but they seem to be always buried under the headlines.


They could call out the news sites/papers of the opposite side though.


That is something I considered. Many partisan sites love to point out the errors of the other side. I know Fox News loves to call out CNN all the time. However, if every site did it, I fear that would just lead to more confirmation bias. And why report that the other side was right? That hurts your viewpoint.

What we need is a non-partisan non-profit to do it. But then there’s the problem of funding (which results in conspiracy theories).


Seems like the best way now is for the reader to make notes to themselves and regularly check back to them and see how things changed, compared to where the hive mind was earlier.


Since we're talking about ideas for news services: I would love to be able to get a list of the most important news in a month or a year. Not a top 10 list but simply a way to try to catch up if you miss a few months.


There are various newsletters for this (mostly weekly), as well as those "what happened this year" summaries everywhere in December.

Also, there are physical magazines that get issued monthly (though it's rarer for political and "news news" topics).


There is a bit of déjà vu here, since at the time we were pointing out similar flaws in the DPD (lack of enforcement, lack of clarity, govt inefficiencies, the inability of proponents to separate intent from reality, etc).

Sadly, there is an absolute "for or against" mentality out there. You can't argue that the implementation of such a law would be poor enough that it doesn't justify enacting it in the first place without being told "well, should we do nothing?". We could easily start with easy-to-understand/implement transparency requirements (maybe even just as guidelines, or as requirements for a form of certification at first, while encouraging technical solutions in the meantime). Scary fines that are never realized might as well never have been brought forth.


I think the app should be called "Captain Hindsight"


There was a popular pushback against American tech in Europe at the time. Criticism of GDPR was conflated with criticism of that pushback.


Do nothing is an untenable position. Software companies have become so brazen and scummy that even a law which is unevenly enforced is absolutely necessary.

The GDPR brought privacy to the front and into the attention of software companies. It gives us individuals at least a chance to control our data.


I think the reason why these situations boil down to "for-or-against" is because people craft narratives about these measures/changes to law. If the narratives are pushed hard enough then they end up overpowering nuanced discussion.

"If you don't agree with GDPR then you must want to steal my data". It's difficult to make nuanced arguments against it when you get shouted down by statements like that. These narratives are used to label someone and it seems to be common in modern politics.


I thought that’s what everyone thought back then. At least all my friends were like, the lawyers will have a good time and be the only ones benefiting from this


That's a pretty default thing that most educated people know though. Regulation and bureaucracy usually benefit the established behemoths with enough lawyers, while gray zones, sluggish laws or easy processes benefit new players or small ones without all the legal armor.

No wonder that Facebook is lobbying for getting regulated and Microsoft proposed regulating some computer vision uses (faces) etc. Some people of course eat it up and think it's because they are just mature now and understand their responsibility and want to benefit the public etc. In reality it's because they have armies of lawyers who can follow all the legal minutiae, have the internal processes for compliance and documentation, audits etc. Which allow them to do whatever they did before (obviously they lobby for laws that allow their use cases) but make it difficult for others to enter. It's the "kicking the ladder" idea.


Most of my Euro friends didn't think this. There's a huge difference in approach to regulation between the EU and the US.

I would guess that this article is written from a US viewpoint - the "isn't it strange how everyone is approaching enforcement of this differently?" attitude isn't even remotely strange to a European.

As lots of people pointed out at the time, GDPR in Europe isn't that groundbreaking - almost all EU countries had/have data privacy laws that approach the GDPR (not least because the GDPR itself is a continuation of EU regulation in this area). It came as a shock to US companies because of the sudden "well, none of you paid any attention when we didn't give this regulation teeth, so here's the fangs" enforcement change.

And yeah, I'd love to take part in retrospective reviews of old news to work out who was right :)


That's why such a news app would be so interesting.

I remember comments from back then slightly differently.


HN was (is) super into GDPR and any dissonance was (is, but fortunately less nowadays) quickly downvoted.


A person commented and asked me about suggestions, but deleted his comment before I could answer so here it is anyways:

Super quickly (I'm sure you have heard of, or can quickly use a search engine to find the commonly listed issues):

Damages: damages need to be scaled according to company size, severity and amount. GDPR was created to punish the Big Players, but the wording that would have fit them is equally applied (and should be; laws should be equal) to small companies, resulting in an impedance mismatch. Frankly, the damages are too small for the Big Players but insane for the small ones. GDPR also does not apply to the state, but holy shit it fucking should!

Enforcement: it needs to be enforced equally, and you need to be able to sue over it yourself instead of enforcement being limited to a state organisation.

Data: it should be data that is directly tied to you, i.e. leave normal web logs etc. out of it. PII as it's defined today is just a sham. A factor of usage also needs to play into it, i.e. normal web server IP logs that are kept separate and don't feed into a user-specific record in a database should not be a consideration.

Access: access _needs_ to be possible online if the data is collected or transferred online. I.e. none of this "you need to physically mail us a certified letter with your ID" shit. GDPR is a fucking failure in this respect. Also, no required strong authentication: access should work directly through the account you can already access normally, without strong authentication.

Usage: GDPR does not allow you to trade tracking for access (i.e. monetisation of content is almost impossible if you care about user privacy): this is insane. GDPR also supposedly does not allow those complicated "accept all or modify your preferences" windows, but it should have no say in that: if a site wants to make the experience painful, that's up to them. It is up to the user to decide whether they want to use that site or not.


Not quite the time scale you're looking for, but "Delayed Gratification" provides retrospective news and analysis from the previous quarter.

I'm in no way affiliated with the magazine other than I accidentally bought a copy once and enjoyed it.

https://www.slow-journalism.com


I have been habitually sending "I have finished using your service, could you please delete my account" emails since around 2008 or so.

Prior to GDPR, 9 replies in 10 would be polite but dismissive responses, basically telling me that I'm making an unreasonably burdensome request.

Post GDPR, everyone responds with a message stating they have followed my request in a timely fashion.

Am I disappointed that GDPR has not fined Facebook into oblivion? Yeah. I was hoping for global-scale schadenfreude as much as the next person.

However, GDPR has fundamentally normalized the notion that people's relationships with companies need not be permanent, and that submitting to eternal spam is not the accepted price of buying a flight online. GDPR has established in law that it's totally reasonable for people not to want to give their local gym an iris scan in order to enter the gym and work out, and that it is indeed the gym owner who's the arsehole in that situation. This grants leverage against the arsehole.

In that respect, it's been a smashing success. There is much we could improve on, but on the statement "it only benefited the lawyers"...hard disagree.


Where in the article did you read that that was the only outcome of GDPR? Did you miss how all European companies now need to take privacy seriously?


> Turns out... actually yeah.

That might be how you feel. For me, GDPR and the “Cookie Law” have been amazing, as they make it incredibly easy to detect which websites and businesses you should avoid.

I do however wish they’d be a lot more aggressive with the fines.


Nothing says GDPR is something that can't be improved upon. Better enforcement, refinement of laws, everything is possible. It has to begin somewhere and that beginning is rarely perfect. Every failure is also an opportunity to learn what to do better. As some other people have commented, the intent is right, the execution has to be improved. Edit: Fixed grammar and some words


The issue is the collateral damage. The EU doesn't have a thriving web/tech sector to begin with when compared to the US or China. These kinds of things likely make it worse.


I see this argument every so often but I'm wondering, what did we actually lose?

Nasty social media that makes their money on outrage and exposing people to scam ads? That's about the only thing I can think of, and I don't think it's a big loss. The legal environment of the EU might actually pave the way for better social media, if the market wasn't already monopolized by the current incumbents.

As a counter-argument, Europe and especially the UK has a thriving fintech scene that produces solutions light-years ahead of what's currently in the US, despite the stronger consumer protection laws that we have.


I see this argument every so often but I'm wondering, what did we actually lose?

As many of us pointed out two years ago: time and money.

The collateral damage aspect is all the businesses that weren't doing dodgy things in the first place but still had to spend that time and money, because documentation had to be rewritten according to new formats, and policies had to be expressed in terms of the new sets of acceptable X, Y and Z, and so on.

I was not happy back then to find that despite having run businesses that were scrupulously respectful of privacy and security, we still ended up wasting weeks just on figuring out what we had to change (spoiler: nothing of substance, it was all red tape) and for a small business that is a nasty blow.

If you assume, probably rather naively, that all small businesses here in the UK had a similar minimum cost to ours just to review everything and dot the i's and cross the t's to ensure compliance with the new letter of the law, that alone would represent a cost of billions of pounds for little if any benefit to anyone in many of those cases.

The fact that the typical response from many posters on HN was to dismiss that cost as being somehow necessary or justified, with no regard at all for the very direct effects it would have on many small, bootstrapped businesses, showed an astonishing lack of perspective. The number of people in various forums around that time who just straight-up accused me of lying about my businesses being privacy-conscious already, for no other reason than that I run tech businesses and they treated all tech businesses as the enemies of privacy, was also pretty disappointing. There was very little objectivity in the discussions then, the much-lauded benefits to individuals faced with privacy intrusions by certain big players have almost entirely failed to materialise, and the costs and legal ambiguities for everyone are still there two years later.


> I see this argument every so often but I'm wondering, what did we actually lose?

All the old comments on Raymond Chen's Old New Thing blog for example.


We didn't lose that much because I suspect big business in Europe is largely ignoring the more difficult parts of the GDPR. I work for a large bank that is totally non-compliant with GDPR and does not really even have a strategy for getting there. My impression is that we (the bank) looked at the draconian requirements of the bill, realized that, with the total mess that the IT of the bank is in, implementing GDPR would cost billions, and just sort of gave up. It looks like we wait for the regulators to fine us and hope that it won't be a nine figure fine.


Which parts are so difficult? Trying to find all the data about a user in the system?

I have some sympathy for a giant mash of databases like that.

I have no sympathy if someone claims that adding a tracking toggle to a single web site is too hard.


Normally it's hard enough to ensure that you have retained an authoritative copy of data, but now it's even harder to ensure that you have destroyed every incidental copy throughout the org on short notice. Then there's the bureaucratic "prior consultation" that will delay launches by months.


Two major issues that I can remember offhand:

1. Deletion/rectification of all copies (that includes backups!) of personal data on demand. We currently are not sure where (in which systems) we store all that data, not to mention adding features to delete/update all data on request in each of those systems.

2. The requirement for a complete description of all processes within the bank that touch personal data. That involves creating a fuckton of documentation, a lot of it for systems where the required knowledge is missing (i.e. no one is quite sure how they actually work).


>Nasty social media that makes their money on outrage and exposing people to scam ads?

Last I checked Facebook and friends still exist.

>what did we actually lose?

* Many europeans lost access to various publishing sites (another win for the big guys)

* Collectively who knows how many millions went to lawyers to reverse engineer the vague GDPR standards


> Last I checked Facebook and friends still exist.

Last I checked there are studies that suggest the current social-media solutions have a negative effect on mental health, and those effects are likely because of the platforms' efforts to drive up "engagement" levels. Regarding the ads, I have first-hand experience of my non-technical friends falling for outright scams (requiring a chargeback), dubious snake-oil being advertised or malware on major online ad networks (not an issue anymore thanks to an ad blocker).

> Many europeans lost access to various publishing sites (another win for the big guys)

This doesn't seem to significantly impact me or anyone in my network. If this were a big problem we'd notice it, and/or an EU-based, compliant competitor would step in to fill the void.

> Collectively who knows how many millions went to lawyers to reverse engineer the vague GDPR standards

Somewhat agreed but this seems to be a side-effect of companies trying to lawyer their way out of the law, and the reason this works is because of the lack of enforcement. If it was enforced it would be a clear message that these efforts don't work and should be stopped.


Somewhat agreed but this seems to be a side-effect of companies trying to lawyer their way out of the law

Not necessarily. One of the main criticisms of the GDPR was that it was vague and ambiguous on several very important points, and in theory deferred to more concrete guidance from the national regulators, which in turn was then either inconsistent or absent in some of the most important areas anyway.

The GDPR penalty regime was also heavily stacked against smaller businesses: for a large business, the costs are capped at the 4% level, but for any business earning less than half a billion each year, the absolute cap takes precedence and means that a regulator can literally threaten the very existence of any business earning less than probably 100M.

In that environment, you need proper legal advice on interpretation and possibly, as absurd as it seems, just to show that you have made a serious, good faith attempt at compliance, as a preemptive defence if a regulator does subsequently take a different view to yours.


The entire point of my Facebook comment is that GDPR gave us nothing, and people paid for it by losing news sites and by funding lawyer salaries.

I don't care if you aren't personally affected by this. That isn't the argument you should be trying to make. How did GDPR improve your life? AFAICT Facebook may still have your shadow profile


> How did GDPR improve your life?

People are more aware of privacy violations and even though companies don't fully comply with the regulation, many are at least trying.

I've personally had success in getting multiple EU-based businesses to delete my data and/or fix issues with their marketing infrastructure sending me spam despite not opting into it.

Facebook still has a shadow profile for me but between Facebook having it or Facebook plus a hundred more bad actors having it too I'd still prefer if it was only Facebook.


>This doesn't seem to significantly impact me or anyone in my network. If this was a big problem we'd notice it and/or a EU-based, compliant competitor will step in to fill the void.

Access to fewer news sites means access to less news. A new site isn't going to replace the old ones. Also, we're not getting replacements for them in the EU because the business model for these sites doesn't work with GDPR. Making their lives financially more difficult just pushes them further into clickbait and yellow journalism.


> Making their life financially more difficult just pushes them more into clickbait and yellow journalism.

Clickbait is explicitly caused by advertising - it's right there in the name, it's there to drive clicks, the content itself is secondary.

If advertising becomes unsustainable then other business models will take over. At the moment subscribing to news websites is too expensive because 1) we don't have an easy to use micropayment system and 2) they are greedy and charge way more than what they would get in ad revenue.


I've also seen a ton of people complain about it on this website alone [0, 1].

[0]: https://hn.algolia.com/?dateEnd=1590782149&dateRange=custom&...

[1]: https://hn.algolia.com/?dateEnd=1590782149&dateRange=custom&...


"We care about your privacy" notices have become the bane of my life.


The majority of these aren't actually compliant.

Tracking should be opt-in and consent should be freely given. If your notice is annoying enough that most people click accept (or if clicking decline is harder) then you are already in breach.

A lot of websites also consider analytics cookies as essential and don't provide a way to decline those which isn't compliant either.

These websites could be detected very easily by running a web scraper that looks for one of these non-compliant "consent management" solutions (looking at you, TrustArc), and then fining every single company that uses one.
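
For illustration, a minimal sketch of that kind of scan in TypeScript (assuming Node 18+ for the built-in fetch). The hostnames are examples of third-party consent-manager scripts one might look for, not an authoritative list, and merely finding one doesn't by itself prove non-compliance:

    // Rough sketch: fetch a page and flag known third-party "consent management"
    // script hosts. Hostnames are illustrative examples, not an exhaustive or
    // authoritative list; a human still has to judge whether the banner complies.
    const CMP_HOSTS = [
      "consent.trustarc.com", // TrustArc
      "cdn.cookielaw.org",    // OneTrust
    ];

    async function flagConsentManagers(url: string): Promise<string[]> {
      const html = await (await fetch(url)).text();
      return CMP_HOSTS.filter((host) => html.includes(host));
    }

    flagConsentManagers("https://example.com").then((hits) => {
      console.log(hits.length ? `found: ${hits.join(", ")}` : "no known CMP script found");
    });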


Here's an example of a broken site:

https://www.europarl.europa.eu/privacy-policy/en

The only two cookie options are "Accept" or "More". But the More option is broken and just brings up the same cookie notice again and again on my browser. It drops cookies on the browser regardless of whether you choose to accept or not (search your cookies in the browser for europarl.europa.eu, you'll find the unique "atuserid" and "atidvisitor" analytics identifiers it has set to identify you).
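
If you want to reproduce the check yourself, here is a quick console sketch (run in the browser devtools on that page; the cookie names are just the ones reported above, and HttpOnly cookies won't show up this way):

    // List any cookies visible to JavaScript whose names match the analytics
    // identifiers mentioned above.
    const suspects = ["atuserid", "atidvisitor"];
    document.cookie
      .split("; ")
      .filter((c) => suspects.some((name) => c.startsWith(name + "=")))
      .forEach((c) => console.log(c));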

If that's the result on the EU Parliament's own website, on their privacy policy page, it's safe to say the EU doesn't actually care about privacy.


That's incredible and just shows how stupid this directive was. In my opinion it destroyed user privacy and the user experience: the average user now clicks "accept" as soon as they visit a site. This made it easy to fool non-technical users into subscribing to push notifications, giving access to location, and enabling other privacy-invading features. I used my sister's phone for a few minutes and her notification center was bombarded with random push notifications from sites she had visited and unknowingly subscribed to.


I agree that the current lack of enforcement is bad for the intent of the law and its long-term impact.

Currently the lack of enforcement allows non-compliant solutions (where accepting is easier than declining) to thrive so people get used to accepting everything.

Down the line, even when enforcement catches up and compliant solutions start appearing, users will still be clicking accept because they've been trained to do so.

This is unfortunately good for adtech/martech not just now but in the future, so all of you currently making your money by stalking users, don't cry, it's all gonna be okay.


Maybe all these websites are simply taking the EU Parliament's site as an example on how you should do it? They made the regulation, surely you should follow their example.


> The majority of these aren't actually compliant

There is insufficient evidence attempting to comply with GDPR is worth the cost.


Absolutely, given the current lack of enforcement. However, if you're going to be in breach, you might as well improve UX and not bother with the whole "consent management" thing, not to mention that the TrustArc garbage solution doesn't seem cheap.


But then it's much more apparent that you are in breach. If you pretend to care then the chance of being caught is much lower.


Do you mean the direct cost of implementing the compliance, or the indirect cost of no longer getting extra ad revenue in an illegal way?

In most cases I bet the former isn't all that much. The latter is a harder nut to crack, with everyone trying to toe the line and referencing what other companies are able to get away with. Lack of good faith is a big obstacle.


> I bet the former isn't all that much

One of the fundamental problems with GDPR is it contains a federated complain-investigate enforcement model. So your bet would have to apply to each of the EU’s twenty-eight members, now and in the future.

In that context, throwing up notices and calling it a day makes sense. One can argue one tried. But don't go so far as to potentially create new liabilities by interpreting these enforcers' current and future preferences too strongly.


> There is insufficient evidence attempting to comply with GDPR is worth the cost.

The number of people working at the respective national privacy regulators is appallingly low. All of them have very few privacy auditors qualified to investigate privacy breaches in depth. Even Ireland, which hosts a huge tech hub, especially for the social media companies, has very few of them.

The Financial Times, which tends to have good reporting on tech and privacy issues, wrote an article about this a while back.


On mobile, some of the opt-out toggle switches don't even function. You literally cannot disable the toggle in iOS Safari.


Yeah, but that says more about the clumsiness of iOS. Since upgrading to 13, a ton of stuff is broken, like tap to zoom and much of the ability to edit text fields. Web pages randomly freeze, forcing a kill and reopen, which I rarely saw before.


> A lot of websites also consider analytics cookies as essential

For a lot of websites, they are.


Under PECR, a cookie being essential means necessary to provide the requested service, not necessary to stay in business. If the user wants to view a news article, and you can serve the article without using analytics cookies, then PECR doesn't allow the cookie. (The situation for paywalls is complicated.)


Providing the service assumes staying in business, no?


No, there's no protection for failed business models, as there should not be.


A business model that fails because you explicitly make them illegal isn't exactly a failed business model. The lawmakers made them fail and they either knew it was going to happen or were incompetent.


> A business model that fails because you explicitly make them illegal isn't exactly a failed business model.

It literally is, by definition. Any business success has to happen within the legal context it exists in.

> The lawmakers made them fail and they either knew it was going to happen or were incompetent.

I could reword this as "the elected representatives of the people decided that certain business models were undesirable and anti-consumer, so legislated against them".


>It literally is, by definition. Any business success has to happen within the legal context it exists in.

Yes, and they had business success until the rules were changed from under them.

>I could reword this as "the elected representatives of the people decided that certain business models were undesirable and anti-consumer, so legislated against them".

And I could reword this as "lobbying groups have bought our politicians and use them to enact laws to put our competitors out of business".

I'd say my rewording is closer to reality, because of course the two biggest ad networks increased in size while the smaller ones decreased as a result of this regulation.


A goal of the GDPR was to make these business models illegal, because they are considered bad. It has not been successful at this, mostly due to lack of enforcement.


Or maybe that's what the voters are told. Why do so many people suddenly believe politicians? Do you think European politicians don't constantly lie through their teeth?


The business model of Google isn't a failed business model.

What the GDPR does do, quite successfully, is build a moat around Google so wide and deep as to minimize competition with them, because they're one of the few firms that can both (a) afford the engineers with the technical expertise to comply with the law while accomplishing their goals and (b) afford the lawyers to address the issue when they fail at the former.


They are a giant, and giants can be hard to topple in one go, but the fact that a giant can keep a failed business model running for longer does not make it any less of a failed model.

Hell, at least around 2017, there were voices from Google saying they considered ads to be unsustainable long term and sought to diversify income streams.

The fun thing is, GDPR didn't actually introduce much change in law. It just gave, for the first time in history, existing laws a real set of teeth, even if they are still baby teeth.

So yeah, all that data companies had been vacuuming for no sensible purpose? It was always illegal


> afford the engineers with the technical expertise to comply with the law while accomplishing their goals

Google is in breach of the GDPR as it stands, so no.

> afford the lawyers to address the issue when they fail at the former

Potentially, though again a clear-cut breach like theirs should result in a fine regardless of how much money they throw at the problem.

As far as building a moat, I'm not sure. Whether it's Google or a one-man shop, neither can accurately track users without being in breach. There is no moat that I can see, you either break the law or you don't.


> Google is in breach of the GDPR as it stands, so no.

I don't believe that is true. What is your source?

They were fined in Jan 2019, but are they still out of compliance? If yes, why are they not being continuously fined?

> you either break the law or you don't.

That's the result on the other side of a trial, sure.

Which is why good lawyers are so important.


> They were fined in Jan 2019, but are they still out of compliance?

They were fined over one specific thing and they maybe fixed it (or silently replaced it with an equivalent, non-compliant thing once they were out of the spotlight); however, there are plenty of other things they do that are in breach, and those are not being investigated or fined, which is why we're discussing the lack of enforcement.

> Which is why good lawyers are so important.

True, but a good law should be one that you can't lawyer your way out of and so far the GDPR outcome of that is inconclusive given there is barely any enforcement at all.


Not for this regulation. Business considerations do not matter, only technical ones.


The true hallmark of an ill-conceived law.


In this case you could say anti-drug-trafficking laws are ill-conceived because they go against the cartels' business models.


You may want to choose another example. US drug law is a case-study in how law divorced from consideration of ramifications to existing businesses and societal norms is a disastrous way to craft law.

https://www.aclu.org/other/american-drug-laws-new-jim-crow


Drugs by themselves are indeed a bad example (I am personally in favor of legalizing drugs), but let's substitute them with violence:

Would you say that anti-violence laws are ill-conceived because they go against cartels/mobs' business models of extorting money from people under threat of violence?


Letting companies opt-out of a regulation purely because it hurts their business model sounds much more ill-conceived to me.


Good law doesn't "let companies opt-out;" it is crafted with consideration for what is already happening and the consequences of the law.

It doesn't appear, two years in, that GDPR passes that test. If the goal was to minimize "privacy violation" by FB and Google, it's failing. FB and Google are stronger than ever, while their competitors are starved out of the market trying to comply with an onerous suite of policies.

It's such a ladder-pulling set of laws it's surprising Google and Facebook didn't craft it.


Did we read the same article? Most fines are tiny. It hasn't fixed everything (yet) but that doesn't mean it's hurting competition.

> Good law doesn't "let companies opt-out;" it is crafted with consideration for what is already happening and the consequences of the law.

And sometimes the outcome of that consideration is "stop doing that".


You'd have a hard time arguing that breaking GDPR is the only way to stay in business. There are enough compliant news websites to undermine that argument.


Internet Explorer was absolutely essential for Microsoft Windows to provide the requested service.


> The majority of these aren't actually compliant.

The title of the article is "Two years in, GDPR defined by mixed signals, unbalanced enforcement".

So sure, maybe they're not compliant, but nobody is enforcing it anyway.

EDIT: removed unnecessary pejorative statement from last paragraph


I've seen a lot of people say this, but I'm just not convinced it's actually the law. It's not obvious to me that analytics cookies categorically can't be essential or that "freely given" implies strict UI neutrality between accepting and declining.


The majority of websites use third-party analytics that collect way more information than necessary and may use it for their own purposes (surprisingly a lot of these companies' main business is ad-tech). The problem here is that not only do you get analytics but the third-party (with whom the user has no relationship and their interests might be against the user's) is now able to track that user across other websites.


Well, essential means that a web service cannot technically work without it. For example, a session cookie.


The law explicitly has exemptions around purely functional cookies so you are free to set session cookies without requiring consent/disclosure.
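
As a concrete illustration (a sketch only, not legal advice): a purely functional session cookie carries no tracking payload and can be set with no banner at all, e.g. with Node's built-in http module.

    // Minimal sketch of a purely functional session cookie: an opaque id,
    // HttpOnly, scoped to this site. Whether a given cookie is "strictly
    // necessary" still depends on what you actually do with it.
    import { createServer } from "node:http";
    import { randomUUID } from "node:crypto";

    createServer((req, res) => {
      if (!req.headers.cookie?.includes("sessionid=")) {
        res.setHeader(
          "Set-Cookie",
          `sessionid=${randomUUID()}; HttpOnly; SameSite=Lax; Path=/`
        );
      }
      res.end("ok");
    }).listen(8080);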


Yesterday the highest German court (BGH) ruled that "default accept" in cookie dialog boxes is illegal.

It was a pre-GDPR case, but the court said it interpreted the then-in-force law in a GDPR-friendly way.


Found this Chrome Extension: I don't care about cookies - Remove cookie warnings from almost all websites!

https://chrome.google.com/webstore/detail/i-dont-care-about-...


Just pair them with cookie notices for extra effectiveness!


Right up there with emails from software companies with “our response to Covid-19.”

You are a software company. Unless the server has the virus, I really don’t care.


Indeed. Frankly I miss the days when my popups had breasts in them.


Disrespectful web developers have become the bane of my life.

Be thankful that GDPR exposes them, and look for alternatives.


Setting aside GDPR for a moment, the cookie thing just means that if I want to use these websites, I have to enable cookies so that I can dismiss the cookie dialog.


You are allowed a cookie that records a user's opt-in to your other cookies, so long as it is anonymous (or so our lawyer tells us).

On our site, we ping whether that cookie is set before we load the rest of the cookies.
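
Roughly, that pattern looks like this (a sketch; the cookie name and the helper functions are placeholders I've made up, not a real consent library):

    // Sketch: an anonymous consent cookie is the only thing set up front;
    // non-essential cookies/scripts load only after the user opts in.
    function hasConsent(): boolean {
      return document.cookie.split("; ").includes("cookie_consent=yes");
    }

    function loadAnalytics(): void {
      // Placeholder for injecting the non-essential scripts after opt-in.
      console.log("loading analytics");
    }

    function recordConsent(): void {
      document.cookie = "cookie_consent=yes; Max-Age=31536000; Path=/; SameSite=Lax";
      loadAnalytics();
    }

    if (hasConsent()) {
      loadAnalytics();
    } else {
      // Placeholder wiring: hook recordConsent up to the real "Accept" button.
      document.getElementById("accept-cookies")?.addEventListener("click", recordConsent);
    }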


Is it acceptable to store a cookie that defaults to "false" provided it is generic? That would solve the problem of not being able to detect if cookies are enabled in the browser until you try to store them.

added: person down thread indicated that there's an API for determining if cookies are enabled for the host on your page's origin called navigator.cookieEnabled which I am shocked I've not seen nor heard of even once before today. Hallelujah. I now agree that everyone who doesn't check that before pestering people about cookies, when JavaScript is available, is literally satan.


We didn't trust using the host cookie settings because most users don't know they exist.

But we were told anonymous cookies were totally fine and within the spirit of the law. If you hit the "Accept" button, you got a cookie that allowed more cookies.


It is acceptable to use cookies in general as long as they are required for the functionality of the site. This includes logging in, shopping carts, gdpr cookiewalls, ...


Even if there was no new API, storing a cookie "just to check if it works" is not against the GDPR. It's a functional cookie.


Again, only because of incompetent and/or immoral developers.


How, exactly, is a developer supposed to change state in a stateless protocol to denote that you've dismissed the cookie dialog if the user has disabled the feature that allows the developer to add state to the stateless protocol?


The law does not forbid you from storing any cookie, even if the user declined. The user can decline cookies that are not needed to provide what the user asked for. Even after the user declines, you are allowed to use cookies to keep track of what's in the user cart, whether the user has logged in or whether the user rejected cookies.


I believe his point is that if the user disabled cookies, there's no point in showing the banner. If you don't show the banner you can't track the user, of course.

It's not hard. It's just a matter of checking for navigator.cookieEnabled.
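
Something like this, as a sketch (the banner element id is a placeholder):

    // Only surface the consent banner when the browser will actually accept
    // first-party cookies; with cookies disabled there is nothing to consent to.
    if (navigator.cookieEnabled) {
      document.getElementById("cookie-banner")?.removeAttribute("hidden");
    }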


cookieEnabled is not reliable for handling third-party cookies, unless you also load a third-party frame running JS. (And even then it doesn't work like you'd want in many browsers.)


Sorry if I missed something, but AFAIK grandparent wasn't talking about third party cookies, only about having cookies disabled and being unable to store the consent flag, so I don't see how this applies to this specific discussion.


It does. Most people concerned about the GDPR are concerned about third-party cookie tracking.

It also highlights how, in general, this is a hard problem. Compliance with this law without creating a dead-static page has subtle complications.


I never dismiss those dialogs anyway. Just make sure your site works with the browsers reader mode and I’m fine.

But off the top of my head, only display the dialog if the browser cache is cold. You could embed a timestamp in some cacheable resource. (Edit: Perhaps this counts as a "cookie" in the legal sense.)


How else can you even store a consent for cookies/localStorage? You can go around calling people immoral and incompetent, but what is the actual way to ask for permission to store data?

If your contention is that it is immoral or incompetent to store any data except through some specific user interaction related to those data, sure that's an opinion.

But if your job is literally "tell me which other pages users go to after this one", it's not really that crazy of an ask.

The law seems to call upon you to make it conspicuous, but when you make it conspicuous it is annoying; the law then calls upon you to make it not annoying.

The better solution, in my mind, is just making cookie control features more visible in browsers. They work great, and it's the right place for this form of consent.

Malicious actors abuse the current circumstance, because it relies on there being a responsible party with collateral to complain against. This is one of those times where the engineered solution is better than the social one.


> How else can you even store a consent for cookies/localStorage?

You said "the cookie thing just means that if I want to use these websites, I have to enable cookies so that I can dismiss the cookie dialog".

But if cookies are disabled, then there's no point in asking for consent. There should be no cookie banner in this case.


> But if cookies are disabled, then there's no point in asking for consent.

But the only way to determine if cookies are actually disabled in the browser is to attempt to store cookies, which is the thing you're asking consent for.


A cookie storing simply whether the person has accepted, declined or not yet responded to your consent dialog does not require consent.


Not really, there's also navigator.cookieEnabled now.

And even if there weren't, attempting to store a cookie with a dummy value just to check if cookies are enabled does not break the GDPR, for three reasons: first, if it's a dummy value it's not really personally identifiable information. Second, it's a functional cookie required for the site to work. And third, the site can just delete it after the check. No consent is required in this case.
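
For older browsers where that property isn't available, the write-and-read-back check described above looks roughly like this (a sketch; the test cookie holds a dummy value and is expired immediately):

    // Fallback feature test: write a throwaway cookie, see if it stuck,
    // then expire it. The value is a dummy, so nothing personal is stored.
    function cookiesWork(): boolean {
      document.cookie = "cookietest=1; SameSite=Lax; Path=/";
      const ok = document.cookie.includes("cookietest=1");
      document.cookie = "cookietest=1; Max-Age=0; Path=/"; // delete it right away
      return ok;
    }

    console.log(navigator.cookieEnabled && cookiesWork() ? "cookies available" : "cookies blocked");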


what are you referring to? I fear I am one of these incompetent devs since I don’t understand the implied transgression.


Has anyone beyond big tech actually figured out what the rules are yet?


Informally, EU citizens own their personal data, and only ever grant revocable licenses to it.

More precisely, to collect any personally identifiable information (PII) of an EU citizen, you need their consent. PII includes things like name and email, but also anything like an IP address that can be used to "unmask" a person. Consent must be freely given and can be withdrawn at any time. If requested by a citizen, you must turn over or delete any PII of the inquirer. You must do your best to keep the PII safe, and follow security best practices - and in case of a data breach you must inform the data authorities and affected citizens. You may only ever hand over data to other companies (sub-processors) if you have a contract guaranteeing that they will also abide by the above constraints (nice little GPL-esque twist there).

I hate the barrage of popups from websites trying to weasel out of it in order to continue business as usual with ad-tracking. But at its core, the GDPR is actually a pretty good piece of legislation - we now have a right to be forgotten anywhere.


Except on any servers entirely out of EU control. China and Israel, for instance, have flatly refused to comply with the GDPR and furthermore reject any claimed authority the EU has over anyone or any servers residing in their respective nations. I know I refuse to comply at my small, US-customer-only physical data storage company, even though I regularly get visitors from the EU on my company site (and my site says I specifically only serve US customers, mainly because as an owner I refuse to deal with the bullshit surrounding the storage and shipping of encrypted tapes and drives full of data belonging to people outside of the USA).


Part of the problem with the GDPR is that although many people believe that what you wrote is roughly what the GDPR says and although much of what you wrote is probably more true than not, almost nothing there is strictly correct and in a few places it is wildly misleading. It's not just about collection of data but also about how you process it. Consent is only one of several possible lawful bases for processing, and in practice it's one that many experts have advised against relying on any more than necessary. The right to erasure is actually quite limited in scope. There are multiple mechanisms by which data may be legally shared with other parties, and there's a whole controller/processor system you haven't touched on in relation to that. There are grey areas and ambiguities all over the place, and there are widespread misunderstandings even on points where the GDPR itself is reasonably clear, not helped by media reports often getting the facts wrong themselves around two years ago.


Big tech hasn't figured it out either.


What makes you think big tech follows the rules?


They continue to operate and are not bogged down in regulation. Even if there are unofficial rules of what matters and what doesn't, they have figured out those rules.


The rules are very clear once you look past the fear-mongering. Don't stalk people, and if you want to stalk them you need to ask them nicely and allow them to decline. Don't be careless with user data so you minimize the likelihood of a breach, and if you do get breached then report it to the regulator and cooperate with them.

In fact, "big tech" has figured out how to get around the rules by exploiting the lack of enforcement. The majority of big tech is knowingly not GDPR-compliant.


> Don't stalk people, and if you want to stalk them you need to ask them nicely and allow them to decline

Ok, that's nice in a fantasy world, but in the real world a lot of people/sites rely on ad revenue, and ad revenue, for the most part, requires tracking built in. So now if you legally force me to allow users to decline "stalking", you are basically allowing users to decline my monetization model and use my website/product for free. And why should I allow that?

Why can't I say: "accept that my site is ad-supported or don't use my site?"


The argument goes, if the monetization model is unethical, then it shouldn't exist. I'll demonstrate this by taking your post and rewriting it about a different industry. I am NOT saying these are the same situation, because most people have different views on tracking vs child labor. I am demonstrating that the argument makes sense IF you think tracking is similarly immoral.

> Ok, that's nice in a fantasy world, but in the real world a lot of people/clothing companies rely on cheap manufacturing, and cheap manufacturing, for the most part, requires child labor. So now if you legally force clothing companies to allow consumers to decline "child labor", you are basically allowing consumers to reject clothing companies' monetization model and get their clothes at a loss to the company. And why should clothing companies allow that?

edit: maybe I've couched my argument a little too much. I think it's a pretty good comparison, actually: I think most (but not all!) people agree that tracking and child labor are bad, but turn a blind eye because they enable cheap/free stuff.


The child worker is a third party who has nothing to do with the customer and the merchant. If the merchant is offering me content in exchange for information, why should the government be able to stop this transaction between parties that mutually agree?

It's already legal to "force" your customers to exchange money for content. If anything, that's even worse, because a lot of children end up malnourished when their parents spend too much money on entertainment.


> The child worker is a third party that has nothing to do with the customer and the merchant

This is a fair counterpoint; it is not a perfect comparison. I think it did its job though, to clarify (now with your help) the actual point of contention:

> why should the government be able to stop this transaction between parties that mutually agree?

Most nations have a concept of human/inalienable/natural rights, which cannot be signed away. For example, it's usually illegal to sell yourself into slavery. In the GDPR's view, privacy is such a right.

Do you think the government should not be able to stop these transactions at all, or do you think privacy should not fall into that category? Or a different objection I haven't thought of.

---

I have my own views, but I'm not really interested in arguing them. It largely comes down to what moral system you subscribe to, which I think is a waste of time to discuss on HN (or any public forum) -- there's too many people on the internet to argue with everyone you disagree with; better to learn to get along, and focus advocacy on communities closer to home. As for why I comment at all: figuring out how to frame issues is not a waste of time — it's much easier (less emotional) and provides a useful reference for future conversation.


I appreciate your response, and I think my view is broadened as well :)

> do you think privacy should not fall into that category?

This one. Supermarket rewards are a rather uncontroversial example of exchanging privacy for money. Obviously tech companies collect a lot more information, but I believe the same principle applies.


The major difference is that the child labor affects someone else negatively, while tracking affects the consumer themselves.


Fair. See my cousin reply to Aunche for slightly more detail.


I'll bite that bullet and argue that child labor laws, like GDPR, are well-intentioned but cause more harm than they prevent. Let me explain.

First: If child labor laws were abolished tomorrow, almost no parent in a developed country would encourage their children to work, and almost no employer would accept child labor. Parents want their children to succeed in life, and in developed countries that means sending them to school until they have enough knowledge to work a lucrative job. So a parent must be desperate for money or have a very low opinion of formal schooling if they're willing to let their child leave school at age 14 and find employment. Such instances are exceedingly rare in developed countries with social safety nets, but they're not as uncommon in poorer countries. In those places, a parent must sometimes choose between their child going hungry or their child going to work. It's a terrible choice, but we should let families make that decision. The state doesn't have as much information as the parents do, and parents care about their children far more than the state does. A blanket policy prevents families from choosing what they think is best.

Second: As we've seen with laws banning drugs and prostitution, the net effect is to drive the practice underground. This means that anyone involved can't rely on the police and courts. It encourages violence and discourages victims from reporting worse crimes for fear of being prosecuted themselves. It means that banned products aren't subject to controls on quality or safety (as they would be if they were regulated like everything else). So too with child labor. In poor countries, child labor laws mean that children work illegally. They have no protection from hazardous working conditions or abusive practices. The fact that their labor is illegal also means they can't join a union.

Third: In the US, there are exemptions to child labor laws. The Amish are allowed to leave school and work full-time at 14. They work on farms. They use power tools[1]. Some even work in woodshops and sawmills (though usually less hazardous jobs such as cashier or stacking wood). As young as 10 years old, they drive horse-drawn buggies on public roads. Compared to other children in the US, Amish kids don't seem to be particularly unhappy or have worse life outcomes overall.

I think people should have more rights sooner in life. That includes the right to vote, the right to leave school, and yes the right to exchange labor for money. There are plenty of successful people who left school early. Heck, the former Prime Minister of Australia quit school at 14.[2] Scroll through Wikipedia's list of autodidacts[3] and you'll find quite a few high school dropouts.

One can criticize a practice while still arguing against blunt laws that outright ban it. I'm certainly not a fan of child labor, but I think that many of the laws around it are counterproductive, especially in developing countries. I also think many drugs are dangerous, but imprisoning people who engage in such practices causes more harm than allowing it. And I'm worried about how governments and large companies are collecting information, but I'm also convinced that GDPR doesn't help. If anything, it makes the problem worse because larger companies can more easily afford to comply. And there's the issue with GDPR's exceptions for government agencies such as law enforcement and intelligence. Funny how that works.

1. The Amish use compressed air or hydraulic tools. Some sects allow electricity if it's from self-sufficient sources. For more info, see https://www.tested.com/tech/453794-how-amish-are-adopting-po...

2. https://en.wikipedia.org/wiki/Paul_Keating

3. https://en.wikipedia.org/wiki/List_of_autodidacts


It wasn't my intent to have this argument, so I'm not going to (I did acknowledge that not everyone agrees for just this reason, though). My cousin comment, reply to Aunche, describes what my intent was (tl;dr: explain the framework in which GDPR is reasonable, so discussion can proceed without outrage). I think I achieved that, and I'm happy you found the analogy strong enough to carry it forward :)


Because the EU says you have it backwards? Ignoring European users is perfectly fine, but if you want to monetize them, you better play by their rules, unless you are more powerful than the EU. In theory. In practice do whatever you want.


> Ignoring European users is perfectly fine,

See this: https://news.ycombinator.com/item?id=23353051


That comment speaks to the option of making access to the site conditional on agreeing to the cookies.

That’s not relevant to the parent’s alternative, as I understood it, of unconditionally blocking European users entirely.


Not following a law I'm not subject to is perfectly fine.

I'm free to insult the King of Thailand and drink scotch whisky all night long (as long as I don't die behind the wheel) contrary to the laws of Thailand and Saudi Arabia, respectively.


Your site can be ad-supported, without GDPR popup, as long as you don't track your users to display those ads. You could even have relevant ads by using keywords instead of user data.

That's what the NYT has done, and they've seen their revenue grow: https://news.ycombinator.com/item?id=18920079
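
As a rough illustration of what keyword-based (contextual) targeting can look like with no user data at all, here's a minimal sketch; the ad inventory and keyword lists below are made up for illustration:

    # Minimal sketch of contextual (keyword-based) ad selection: the only
    # input is the page's own text, never user data. Inventory is hypothetical.
    import re
    from collections import Counter

    AD_INVENTORY = [
        {"id": "bike-shop", "keywords": {"bike", "cycling", "helmet", "commute"}},
        {"id": "coffee-gear", "keywords": {"coffee", "espresso", "grinder", "brew"}},
    ]

    def pick_ad(page_text):
        words = Counter(re.findall(r"[a-z]+", page_text.lower()))
        score = lambda ad: sum(words[k] for k in ad["keywords"])
        best = max(AD_INVENTORY, key=score)
        return best["id"] if score(best) > 0 else None

    print(pick_ad("Our guide to the best commuter bike helmets"))  # -> bike-shop

Nothing about the page's visitor is stored or processed here, which is the whole point.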


I just visited NYT's site. It immediately set 20 cookies before asking for consent. It also loaded stuff from google.com sites.

The options that appeared are (a) Accept (b) Manage Trackers

Manage trackers leads to this page

https://www.nytimes.com/subscription/privacy-policy#/cookie

Which seems to list lots of 3rd parties that will track me if I view the site. I'm told I have to go to their sites and opt out.
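
For what it's worth, this kind of check is easy to reproduce. A rough sketch with a headless browser (Playwright here, but any similar tool works; the install steps and the URL are just my assumptions, not something NYT documents):

    # Rough sketch: open the page in a fresh browser profile and list every
    # cookie set before any consent interaction. Needs `pip install playwright`
    # and `playwright install chromium`.
    from playwright.sync_api import sync_playwright

    def cookies_before_consent(url):
        with sync_playwright() as p:
            browser = p.chromium.launch()
            context = browser.new_context()  # clean profile, no stored consent
            page = context.new_page()
            page.goto(url, wait_until="networkidle")
            cookies = context.cookies()      # everything set without a single click
            browser.close()
            return cookies

    for c in cookies_before_consent("https://www.nytimes.com/"):
        print(c["domain"], c["name"])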


That is absolutely in breach of the regulation.


DuckDuckGo does ads without tracking

In fact, every advertisement outside of the web works without tracking/stalking the consumer. At most, you get a discount code for "seeing this ad on X place"

> Why can't I say: "accept that my site is ad-supported or don't use my site?"

Because that's the equivalent of me giving you my address, dob, SSN, etc just for entering your store


> Because that's the equivalent of me giving you my address, dob, SSN, etc just for entering your store

So what? If you don't want to give me those things (ad tracking isn't nearly that bad, btw), then don't enter my store. And likewise, if a consenting adult doesn't mind giving out that info in exchange for entering my store, why prevent them from doing so?

Don't agree to my terms, don't enter my shop. It's as simple as that.


Well, since you replied to me, I'm going to require your SSN and date of birth as part of my privacy policy. Didn't like it? Then you shouldn't have replied to me.

Sounds like you're the type of business the law was created for


You can? If the user doesn’t opt-in to advertising then you could simply not allow them to use the site...

Note, I’m not in any way an expert on the legality of it, but this seems like a silly argument to make. If anything, being more up-front with the model may open alternative methods of monetization (subscription-based, etc).


That's actually not allowed under GDPR. You can't deny access to a website based on whether a user accepted non-essential tracking.


You cannot [1] (though this is definitely the clause that, at least in the US, I'm confident would be beaten down in court).

And this is why I have such a problem with the rule. Without some dramatic changes, any video streaming site almost surely wouldn't be able to handle the substantially lower ad revenue and still make a profit if everyone actually opted out. Sites like Facebook would probably turn a profit, but there's a large portion of the internet that would not survive when faced with 90% less revenue (untargeted ads pay out drastically less).

>“Freely given” consent essentially means you have not cornered the data subject into agreeing to you using their data. For one thing, that means you cannot require consent to data processing as a condition of using the service. They need to be able to say no. According to Recital 42, “Consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.” [1]

[1] https://gdpr.eu/gdpr-consent-requirements/

EDIT: Come on, can we not downvote just because you disagree here? It's just going to become a circlejerk like most of Reddit if you guys keep this up... I even linked a source proving the OP wrong... from the official GDPR resource...


Is YouTube blocked in the EU? Let’s assume it’s turning a profit there, then start wondering how. The obvious answer is that monetised videos generally have a subject, and advertisements can be tied to that subject.


I don't think Google has ever released the numbers on how many people have opted out, but one can assume it's not 100%.

And you are clearly tracked across Google services at least, as anyone who has ever googled something shopping-related and then viewed a YouTube video can tell you.

>The obvious answer is that monetized videos generally have a subject, and advertisements can be tied to that subject.

The obvious answer to me is that not everyone opted out. Businesses generally want to make money, collecting data clearly made them more money, or they wouldn't have done it. Now they make less on those who opted out.


> in the real world a lot of people/sites rely on ad revenue

I want to start a car rental business; however, that requires upfront capital to buy the cars and I don't have that, so I'm gonna break the law and steal the cars to offer them up for rental. Is that what you're advocating for? Being in business is not a right; if your business model isn't sustainable without breaking the law then find a different one.

> ad revenue for the most part, requires tracking built in

Magazines are essentially full of ads and people actually pay for them and yet there's no tracking other than very generalized targeting (a car magazine will have car-related ads in it). Just because web advertising became the wild-west doesn't mean we should now legitimize it.

> and use my website/product for free

The biggest offenders (Google, Facebook, etc.) don't even offer the option of paying to decline tracking (and a lot of them track non-users as well; Google Analytics and the social media networks track people regardless of whether you're a user who agreed to their ToS/privacy policy).

> Why can't I say: "accept that my site is ad-supported or don't use my site?"

Because it's the law and the same reason why I can't walk into a store, start shoplifting and when caught say "sorry, should've read my ToS and not let me in if you disagreed with it".

You are welcome to vote and nag your local politicians to amend the law if you believe it's wrong, but until then you need to respect it regardless of whether you agree with it or not. The advertising and marketing industry has proven it is unable to self-regulate, so the law has now stepped in. The industry had many chances to clean up their act (the do-not-track header was one of them) and they clearly told us all to fuck off, so now the GDPR is a stronger solution.


I thought GDPR allowed you to put ads on your site? If you mean you’d prefer that the ads be based on personal user data, and you want collection of that data to be a condition of using your site, that’s a great debate to have, but not when you equate it with the possibility of having ads at all.


>Don't stalk people

The problem is there is no consistent definition of stalking in this context, which could be the difference between a store manager watching how people move around a store and someone following you home and going through your trash.


Stalking is collecting any information that, either by itself or combined with other information, can be used to identify someone with reasonable probability. IP addresses, browser/device details (fingerprinting, etc.), and usage patterns can all fall into this category.


By that definition, literally everybody in real life is stalking me just by seeing what I look like. That's not a terribly useful or reasonable distinction.


If they are collecting and storing that data in neatly labeled folders, any prosecutor worth their salt could make a compelling case for stalking.


Human memory does not constitute a storage device for the purposes of GDPR.


Your analogy would be more accurate if they were logging/taking pictures of what you look like, and that can actually be considered stalking in certain countries.

Seeing the information is one thing. Collecting and storing it is another.


> The rules are very clear once you look past the fear-mongering.

Hence the massive debate to this day over the rules.


Sarcastic comments aside, I'd blame adtech for that.


The massive debate is either because people misunderstand the rules despite them seeming very clear (functional cookies are allowed and don't require consent/disclosure, and the GDPR covers more than just cookies, so local storage and browser fingerprinting are also in scope; you can't stalk users without consent just because you don't use cookies), or because these people's livelihood depends on being in breach of the GDPR, so they try to justify their behavior or spread misinformation.


> The rules are very clear once you look past the fear-mongering

> Don't stalk people

Are you just not aware of your own hypocrisy?


I work as a developer in the European public sector; we already took privacy and security rather seriously because the laws governing them had always been, and still are, tougher than the GDPR.

I actually like that the EU is doing something, and I guess this is the best you get from a bureaucracy, but what it's changed is that we document everything. Whenever I build anything that moves privacy data, even if it's just hooking up a new system to our ADFS which accesses employee names, I need to fill out 4 forms and write a risk assessment. It all goes somewhere, I suppose; I'm not sure where, because once I file them I never hear anything about them unless my wording wasn't good enough.

As far as security goes, it hasn't actually changed anything. I guess it does if you weren't taking security very seriously before, but the idea that we as developers will think about security first or design better systems if a bunch of lawyers force us to fill out forms and write essays on what can go wrong... I just can't wrap my head around why anyone would actually believe that stuff.

Like I said, it’s a great idea, on paper, but the bureaucracy that is enforcing it is just so clueless. Passing inspections is more about having the right answers and documentation than having actual security, so it’s no wonder that the outcome is full of mixed signals and weird enforcement.

Still better than nothing, in my opinion, and it’ll probably get better with time.


Not sure why this is getting downvoted, seems to be a perfectly reasonable point?


Because there's a rabid anti-regulation and anti-EU bias on this site. Well-thought-out answers frequently get downvoted while free-market platitudes get upvoted.


There is a privacy benefit to adding friction to spreading personal data around everywhere. At the margin, some services will decide not to bother processing non-essential personal data just to avoid the paperwork. And really, that's one of the excesses that GDPR was a reaction to: that the "default" was "track everything in case the data magically becomes valuable," and now it's become "perhaps not."


A lot of GDPR can be summarized as "GDPR makes PII into high-grade radioactive waste. You want as little of it as possible, and you take care of whatever bits you do end up with."


Relevant: https://idlewords.com/talks/haunted_by_data.htm

(Maybe you were familiar). The closing is particularly worth taking in:

> Finally, don't be surprised. The current model of total surveillance and permanent storage is not tenable.

> If we keep it up, we'll have our own version of Three Mile Island, some widely-publicized failure that galvanizes popular opinion against the technology.

> At that point people who are angry, mistrustful, and may not understand a thing about computers will regulate your industry into the ground. You'll be left like those poor saps who work in the nuclear plants, who have to fill out a form in triplicate anytime they want to sharpen a pencil.

> You don't want that. Even I don't want that.

> We can have that radiant future but it will require self-control, circumspection, and much more concern for safety than we've been willing to show.

> It's time for us all to take a deep breath and pull off those radium underpants.

I'm not sure it came in the form of a single event (though I can think of some candidates -- the Cambridge Analytica scandal made a big impression, for one), but it's clear to me at this point that the warning was ignored and our industry has missed the window for self-regulation.


The fears about GDPR when it passed, if I remember correctly, were mainly around arbitrary, draconian enforcement. This article seems to only be talking about under-enforcement. The causes of this under-enforcement seem fixable. Ireland, putatively afraid of the big tech companies choosing to put their Europe HQs elsewhere, has been dragging its feet on privacy investigations. But the investigations are happening. Then there are some countries not putting enough money into it. The rest seems to be the various countries not being in alignment. For a sweeping, two-year-old regulation that has spent about an eighth of its life in the time of a major global crisis, this doesn't strike me as all that shocking.

Does anyone have any actual examples of draconian fines being handed out for good-faith misunderstandings of the regulation? Big Tech has professed confusion over how they're supposed to comply, but it seems to me like they would simply prefer not to.


I think GDPR has its heart in the right place.

I don't think it really helps, and I suspect that is because users themselves really don't know what is actually happening behind the scenes, and no amount of banners or other things changes their level of knowledge.

And I fear even if they know, users don't care and are happy to click past a banner / trade their privacy for free things.

GDPR seems to play out as a strangely legally mechanical beast that people are largely disconnected from.


> users don't care and are happy to click past a banner / trade their privacy for free things.

Are we discounting the possibility that users make a rational choice that we happen not to like?


Tough question. For some things, I'd say that informed consent is hard to give - if you consent, you're not informed.

I don't believe that the average user is making informed choices. The choices may be rational as long as the users don't understand the consequences. It's perfectly rational to trade in your life savings for a fancy meal if you don't understand what "life savings" means.


I wonder how much information can actually be provided; much of the "giving your data" issue is really about the side effects and possible consequences...

But you can only tell people so much. Just saying "hey, you're giving Google your location" (just a generic example here)... honestly, if that's all I know... so what?

But really the larger issues are other implications.

That's a hard thing to explain.


Exactly, and I think we're usually way off intuitively.

For example, consider how many facts, and of what nature, I'd need to individually identify you. SSN? Ok, done, everybody knows that. But how far do I get with birth date, height and city? What if I add one chronic health issue, no matter how small? Chronic sinusitis, born on August 8th, lives in $city and is 186cm? In most cases, I probably don't even need all four.

But most people intuitively don't think about it in combination, they figure "oh so you know I live in $city, big deal, so do a million other people", "oh so you know my birthday, well a million other people in the country have that birthday".
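
A quick back-of-the-envelope shows how fast the pool shrinks (the prevalence figures are assumptions for illustration, and the attributes are treated as independent, which real data only roughly is):

    # Back-of-the-envelope only: prevalence figures are assumed, not measured,
    # and independence between attributes is assumed for simplicity.
    pool = 1_000_000          # people in $city
    pool *= 1 / 365           # share the same birthday (day + month)
    pool *= 0.02              # assume ~2% are exactly 186 cm
    pool *= 0.10              # assume ~10% have chronic sinusitis
    print(f"expected matches, no birth year:   {pool:.1f}")       # ~5.5 people
    print(f"expected matches, full birth date: {pool / 80:.2f}")  # ~0.07 people

Even with rough numbers, a city of a million collapses to a handful of candidates, and adding the birth year usually pins down a single person.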


this seems incredibly patronizing. perhaps we shouldn't even let people shop for themselves, instead do their shopping for them


Yes, they could be making a rational choice that we don't agree with.


Then maybe we shouldn't be making laws to force things "we happen to like" on everyone


It's one of the few ways to negotiate back against contracts of adhesion. Giving that choice would theoretically be nice, but the status quo makes it not really a free choice, and the downside is bigger than the upside.


I don't think GDPR forces you to choose not to share information. You simply get a mechanism to make that choice.

IMO it is a flawed and wonky mechanism.

Still, I'd prefer to have the option; later, if nobody cares, then maybe remove the choice.


It forbids the exchange of information in return for providing a service. There is no choice.


> users don't care and are happy to click past a banner / trade their privacy for free things.

The GDPR explicitly mandates that consent should be freely given (it should not be more difficult to decline than to accept) and that consent should be informed, so you can't bury the information in 30 pages of ToS or privacy policies.

The problem is that there is currently zero enforcement around those things. I'd argue that this is very bad for the intent of the law, because even when enforcement starts happening and declining consent becomes possible, users will already have been trained to just click accept on everything.


It gets even better: consent needs to be as easy to withdraw as it is to grant. Where are all the banners that nag me to withdraw my consent?


GDPR is poorly designed. It deliberately uses super vague and imprecise wording. That is bad enough when operating in a common-law legal system where that is the norm. It is inexcusable in the civil-law system that much of Europe operates in.

Consider offering users the ability to voluntarily submit reviews of restaurants. One reviewer complained that the person seated at the table next to them was excessively noisy, and that upon asking the waitstaff to do something about it, they did nothing.

It is actually entirely plausible for a company like Facebook to have enough information to determine exactly who that other person is, given that review. For example, if, say, both posted images of their receipts to Instagram. Why would they do that? Beats me, but plenty of people do things like that. Under a wide but not at all implausible reading of the personal-information definition in the GDPR, that review qualifies as personal data of the person at the other table. The definition of personal data is "any information relating to an identified or identifiable natural person". And that review does include information about a natural person, and we have shown that the person is theoretically identifiable by Facebook.

This means if that other person asks for all their personal info from the site, technically that review should be included. But most likely even Facebook does not yet have the ability to automatically identify this other individual. However, the regulation does not specify any applicable exception, so if you fail to turn over that data (despite having no way of knowing that review pertains to this specific individual), the supervisory authority could still legally fine you.

Would you ever be fined for that? No, of course not. But strictly speaking, nothing in the wording of the regulation would prevent them from doing so; they simply have much bigger concerns and are unlikely to bother interpreting things so widely.

After all, there is not a single large company that operates in Europe that is fully compliant with the GDPR if interpreted widely. There quite simply cannot be, since the costs of actually identifying everything that could count as personal data under a wide interpretation and ensuring the company can always look up 100% of it without ever missing any would bankrupt every such large company.

And that is only touching on one little aspect of the GDPR, and one that is unlikely to actually be a major deal. Much worse is how vague the "legitimate interests" reason for processing is. That is the reason many companies are relying on for much of their processing, and nobody can say with any certainty which cases are included in it and which are not.

So obviously the best the companies can do is attempt to follow the spirit of the regulation rather than the letter. But of course, if you do that, you cannot be entirely sure the regulators will agree with you.


> GDPR has its heart in the right place

I'm not sure we can jump to that conclusion. If the outcome of the GDPR had been bewilderingly out of left field and totally unpredictable, perhaps we could pass this off as a big-hearted failure.

The thing that I struggle to get past, though, is just how many of us warned of these outcomes _before_ the regulations were implemented.

I fear very few had their heart in the right place. After all, it's pretty evident now that those of us who saw this as a corporate effort to erect unnatural barriers to entry were correct. The people who designed the GDPR aren't idiots - they knew what they were doing too.


I'm happy to see this as the top post on Hacker News, though I would wonder if anyone would be able to provide me with a summary of the article, since $399 is a bit steep for me (as in, I can afford it, but it's obviously WAY too much for what's promised by the title).

I'd also be interested in case anyone has any thoughts on what the short or long-term outcome of the situation would be. Come to think of it, I'd like it if someone could give me a rough outline of GDPR at all.

I'm a software developer working in Britain, and I reckon the local consequences are "the lawyers make lots of money", but am always keen to hear other viewpoints.


You can open it in Incognito mode.


At the moment, lawyers and all the scummy industry around the GDPR (whether it's advice/consulting or "consent management") are indeed the only ones making the money.

There is very little enforcement, and flawed solutions from the aforementioned industry are allowed to proliferate despite not actually being compliant (the majority of "consent management" solutions are in breach, so they are making money while not even helping their clients become compliant).


Another good page with stats [1] including this:

$63 million in fines issued

$57 million of that issued to Google

[1] https://www.varonis.com/blog/gdpr-effect-review/


> Google was hit with a fee of $57 million for not making it clear to users how they were harvesting data from the Google search engine, YouTube and Google Maps for personalized ads. This fine only amounts to .04% of Google’s yearly revenue.

Not only does this not cover Google's main offense (tracking of users and non-users with Google Analytics even on non-Google properties) but the fine is so minimal that it's basically the cost of doing business.

I wish people would stop bringing this one up.


We value your privacy. Like, it's valuable. We sell it for money. We're going to nag you until you click this button so we can't get in trouble for profiting off the data you give us.

Good legislation is important to let us penalize bad actors. Does anyone know of any accounts of bad actors getting stopped by the GDPR?

What do you guys think: are there laws that should be in place to incentivize privacy-preserving tools?


> We're going to nag you until you click this button so we can't get in trouble for profiting off the data you give us.

That is explicitly against the regulation. Consent should be freely given otherwise it's invalid.

The problem is that there is no enforcement around this (despite it being very easy to detect this behavior at scale by running a web scraper) so they keep doing it and profiting off it.
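
For the curious, a crude sketch of what such a scraper check might look like (heuristic only; many banners are injected by JavaScript, so a headless browser would be needed for real coverage, and the URLs below are placeholders):

    # Crude heuristic: flag pages whose consent UI offers an "accept" control
    # but no equally direct "reject" one. Static HTML only -- real crawls
    # would need a headless browser and smarter banner detection.
    import re
    import requests

    ACCEPT = re.compile(r"accept all|i agree|i accept", re.I)
    REJECT = re.compile(r"reject all|decline|refuse", re.I)

    def looks_one_sided(url):
        html = requests.get(url, timeout=10).text
        return bool(ACCEPT.search(html)) and not REJECT.search(html)

    for site in ("https://example.com/", "https://example.org/"):
        print(site, looks_one_sided(site))

The point is just that the asymmetry is machine-detectable at scale, which makes the lack of enforcement harder to excuse.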


You're right; I'm exaggerating a bit. I do find it annoying when the banners take up a significant section of the screen, though. On mobile it's especially bad; I've had banners take up a good 1/3 of the screen.


> Consent should be freely given otherwise it's invalid.

I tried to figure out what this actually means but it's very hazy. A naggy news website isn't performing a contract. Are they provisioning a service (assuming you did not buy or order or subscribe to anything)?

"When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract."

https://gdpr-info.eu/art-7-gdpr/


For starters, it simply means that if declining consent is harder/more annoying than accepting then it's already in breach, regardless of anything else.

If your website takes 1 click to accept tracking but several clicks to deny it then you're already in breach (assuming the law was actually enforced, which it isn't at the moment).


Can you link a reliable source?


This is from the ICO, the UK privacy regulator: https://ico.org.uk/for-organisations/guide-to-data-protectio...

> Consent requires a positive opt-in. Don’t use pre-ticked boxes or any other method of default consent.

If providing consent requires a single click (accept the pre-ticked options) but declining consent requires multiple clicks (to untick the pre-ticked options) then that is already in breach.

> Be specific and ‘granular’ so that you get separate consent for separate things. Vague or blanket consent is not enough.

A big "accept" button for everything is not good enough either.


Guys, we have a lot of European customers and we completely ignored the GDPR rules. After it was introduced, only 2 potential customers asked us about it and we just moved their emails to trash. Not worth the hassle! There is nothing they can do anyway to enforce it if you are not living within the EU (unless there is a special agreement between your country and the EU). I even know some startups who are located within the EU but still don't care about GDPR :D


This has been a constant headache; not the rules or how to apply them, but our customers. They still act like there is no such thing as the GDPR and actively demand, DEMAND, that we put in place functionality that is in clear violation of it, and when you try to inform them and explain how things work they get mad at you and threaten that they will get a more professional shop to do things for them. *shrugs*


Given that enforcement would have to be stepped up considerably to even be called selective, compliance means your competition has a large advantage.


I'm not going to lie: if one of them switches shops and gets what they asked us for built, I'm going to report it. In Finland the law isn't selective, even if it isn't enforced.


The worst effect the GDPR had was on offline bureaucracy. "Data protection" has become the go-to excuse for blocking every single goddamn thing.


Beyond the impact of crushing knowledge transfer, it’s going to kill a lot of people.

In companies with world-wide products/solutions/support, older employees tend to have a broad base of knowledge and learning from being exposed to all customer experiences and issues. Now everything is in a protected silo, and new employees will only learn through a soda straw. Older employees are learning to say "not my problem" when getting cleared to look at something overseas, because there are 3 layers of data protection officers and it can't be proven there isn't 1 bit of PPD in that 10GB dump.

Some day soon, an industrial or other large-scale accident will kill people and someone in the back office will say “That team didn’t know about X? Doh!”


Personally I am just annoyed by the cookie warnings on every site. GDPR does not apply to vast portions of the internet.


If you use an ad blocker, you can add the list called "EasyList Cookie" and it will remove some of the annoying cookie notifications, but even with that list a lot of cookie notifications will still be shown.


>GDPR does not apply to vast portions of the internet

Europe wants it to apply to anything a European might touch.


Of course they do, and just like my ancestors did in 1776 I am free to tell them to mind their own business on their own side of the pond.


GDPR was known to be, is known to be, and will be known to be a shit law that's not tied to reality. It did have some good (allowing you to know what they have on you in general, and asking them to delete some of that), but the rest is just bad, bad, bad.

I wish people would be rational when supporting privacy-increasing things. GDPR could have been much better, and it saddens me that it was ruined by, and is defended by, zealots.


A person commented and asked me about suggestions, but deleted his comment before I could answer so here it is anyways:

Super quickly (I'm sure you have heard of, or can quickly use a search engine to find, the commonly listed issues):

Damages: damages need to be scaled according to company size, severity and amount. GDPR was created to punish Big Players, but the wording that would have fit them is equally applied (and it should be, laws should be equal) to small companies, resulting in an impedance mismatch. Frankly, the damages are too small for the Big Players but insane for the small ones. GDPR also does not apply to the state, but holy shit it fucking should!

Enforcement: it needs to be equally enforced, and you need to be able to sue over it yourself instead of enforcement being limited to a state organisation.

Data: it should be data that is directly tied to you, ie leave the normal web logs etc out of it. PII as it's defined today is just a sham. A factor of usage also needs to play into it, ie normal web server IP logs that are kept separate and don't feed into a user-specific database record should not be a consideration.

Access: access _needs_ to be able to be done online if the data is collected or transferred online. Ie none of this "you need to physically mail us a certified letter with your ID" shit. GDPR is a fucking failure in this aspect. Also, no required strong authentication: access should be directly through the account you can already access normally, without strong authentication.

Usage: GDPR does not allow you to trade tracking for access (ie monetisation of content is almost impossible if you care about user privacy): this is insane. GDPR also supposedly does not allow those complicated "accept all or modify your preferences" windows, but it should have no say in that: if a site wants to make the experience painful, that's up to them. It is up to the user to decide whether they want to use that site or not.


Who could have predicted that an overarching government program to control and regulate content on the internet is failing?


It is worth pointing out that GDPR is really two sets of laws - one around data security and breaches, and another around privacy. It's the second set that gets the most criticism for its opaqueness, but I don't know if it is better or worse than the first.

They probably should have held off completely on the second set - the upcoming ePrivacy regulations are promised to actually do something, rather than just provide a really frustrating and opaque set of consent guidelines.

As it is now, the law doesn't require anyone to actually stop what they are doing. The only difference is that now you have to retain a lawyer to keep doing it.



