I make some apps that use precise location. We don't sell the data; we share it only with the interested party the app user is working for. The user is fully aware when the data is collected and when it isn't, and they can even turn it off at will. Nobody even cares who has the device, just that they're delivering a package or something to that effect. It's all good.
Then I hit the app stores and it's like pulling teeth with paperwork and rejections (depends on the reviewer; sometimes you fly through, sometimes very much not).
I have to attest that we don't do shit with an email that may or may not appear in a field, that we don't do porn (I have no idea where that reviewer got that idea). Some reviewer misread something and now I have to explain that no we don't use contacts, we never do... wtf. More delays.
I'm privacy minded, so I don't so much mind the IDEA that I have to attest to all these things, but all the hoops I have to jump through exist because of the bad guys who steal all this info and sell it, and they're still doing their bad thing just fine ...
Meanwhile my app is stuck in review.
It feels like being pulled over randomly by the cops because "well, there's a lot of speeders out there" when I'm not one of them. It's all hassle for everyone doing it right, while for the bad guys, or the apps that clearly don't have to follow the rules everyone else does, it has no impact at all.
The whole system is borked.
Furthermore, you cannot contract away criminal liability if any exists.
The fact that 100% of its users, except the litigant, skimmed through the EULA and did not notice anything does not relieve the company from the responsibility.
that is defined as extortion, but labeled as onboarding.
GDPR tried. And the narrative around GDPR was deliberately completely derailed by adtech.
Lack of enforcement didn't help either
I have read GDPR and don't work in adtech. It is vague and it is pretty easy to find pathological scenarios that don't make much sense or impose an unusually high burden for no benefit. Every European law firm seems to agree with this assessment despite what proponents assert. Consequently, it forces a lot of expensive defensive activity in practice.
To some extent, it was just a failure of imagination on the part of GDPR's authors. Many things are not nearly as simple as it seems to assume, and it bleeds into data models that have nothing to do with people.
It is what it is but no one should pretend it is not a burden for companies that have nothing to do with adtech or even data about people.
The problem is not the GDPR, the problem is the surveillance industry that wants to grab as much data as possible and try to do as much malicious compliance as possible.
The costs are often worse on the industrial side because the data is so much larger and faster than web or mobile data.
The trouble started when lawyers correctly noticed that these are incidentally capable surveillance systems even though that isn't how we use them or what they were designed for.
GDPR frames everything in the context of a person's data. There is no "person_id" or similar field in these data models. That isn't the purpose of the data, it would be expensive to extract it, and then it would create obvious liability under GDPR. This makes the idea of finding a person's data expensive -- brute-force search on huge data volumes.
Compounding this, these data systems are often operational and some of the data may be in situ at the edge because it is too large to move all of it. The power and compute budget may not exist to find a person using brute force.
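To make that cost concrete, here's a toy Python sketch of what answering "find this person's records" looks like when there is no person_id column to index on: a full scan with a fuzzy linkage predicate. All field names (`site`, `ts`, `rpm`, shift records) are hypothetical, purely for illustration.

```python
# Rough sketch: with no person_id in the data model, the only way to find
# "records about this person" is a full scan of every event, testing a
# fuzzy linkage predicate (here: did the event happen at a site during
# one of the person's shifts). O(events x shifts), no index to help.

def records_about(person_shifts, events):
    """Full scan: keep sensor events that fall inside any of the
    person's shifts at the matching site."""
    return [
        e for e in events
        if any(s["site"] == e["site"] and s["start"] <= e["ts"] < s["end"]
               for s in person_shifts)
    ]

events = [
    {"site": "plant-7", "ts": 1_000, "rpm": 1450},
    {"site": "plant-7", "ts": 9_000, "rpm": 1490},
    {"site": "plant-2", "ts": 1_500, "rpm": 1300},
]
shifts = [{"site": "plant-7", "start": 0, "end": 5_000}]
print(records_about(shifts, events))  # only the first event falls in a shift
```

On billions of events spread across edge storage, that scan is exactly the brute-force search described above.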
AFAICT, current best practice is to maintain a polite fiction that people aren't being tracked because that is not the intent. No one thinks that would stand up to serious legal scrutiny though. If the regulators come after you then plead best effort based on the technical infeasibility of doing anything else.
Forget tracking workers' movements and stuff like that because that's even more complicated (the data is tied to a person, but only in their capacity as an employee and not as a private individual).
Focus on a case like a cluster of sensors attached to various equipment powered by electric motors, or using RFID to detect when a pallet enters the warehouse. Let's say that all goes to a cloud platform and I store it, I build a bunch of derived analytics stuff from it, and I send it up to Anthropic (with no-train-on-me-pls contract clause) for my cool new AI insights engine.
Does GDPR apply at all to that? I would have assumed it doesn't have any relevance whatsoever, but you're implying that it does. Or are you specifically talking about the case when individual employees are the data collection subjects, like a fleet management platform with a telematics component?
--- start quote ---
Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.
The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual's intervention to an indefinite number of natural persons.
An approved certification mechanism pursuant to Article 42 may be used as an element to demonstrate compliance with the requirements set out in paragraphs 1 and 2 of this Article.
--- end quote ---
IANAL, but this basically covers all your bases, together with https://gdpr-info.eu/art-32-gdpr/
Unless, of course, your industrial-scale data collection actually collects significantly more data than you let on, and extraction of personal data is not as hard as you make it sound
The GDPR is there to protect your personal/sensitive data, or data that can personally identify you. It has nothing whatsoever to do with data capture from industrial machinery.
I remain astounded how ignorant some people are of the basic GDPR principle: protecting your _personal_ data.
How is this not your personal data?
Exploitation of these types of data sources has been demonstrated for 15+ years at this point. Abuse is often impractical for technical reasons but GDPR doesn't give you free pass on collecting personal data just because you aren't using it like personal data.
Many systems were not explicitly designed for surveillance, yet function as surveillance systems anyway, because many systems collect too much data to begin with.
Hence the problem: people who collect too much data claim that GDPR is complicated, complex, convoluted, impossible to comply with... instead of changing what data they collect, and how.
Additionally, people confuse the complexity of human endeavours with the complexity of the law. GDPR itself is neither complex nor complicated. It doesn't try to carve out exceptions, rules, and regulations for every possible activity humans may attempt. Then it would become impossible to understand or comply with.
As is, it has enough carveouts for industries which require more data than strictly necessary, called "legitimate interest" (which still doesn't allow you to just use this data willy-nilly). E.g. banks collect significantly more data about customers than strictly necessary (because of KYC, fraud, security etc.), and store that data for a significantly longer time than allowed by privacy-related laws (because they are governed by the banking laws of their respective countries). It doesn't mean they can sell that data or spy on users.
Same here. It's not on the law to tell you exactly how to operate your "industrial-scale operation". It's on you to fix your shit, stop collecting more data than necessary, have data protection in place, delete data after a reasonable time, anonymize data etc.
Congrats on gullibly believing the ad tech narrative.
YOUR collection of users' data is an overreach and a breach of privacy. MY collection of data is absolutely necessary to grow my scrappy small business and provide value. I am a good person with good intentions, so it's OK. You are a bad person doing bad things, so it's not OK.
What data processing is essential for the service being provided? Many publishers assumed that getting paid was an essential part of providing a service, and it was not until 3 months before the implementation deadline that the committee clarified that getting paid is not included when you are being paid by a third party.
How are you to know whether or not the user is an EU citizen (and thus subject to the GDPR)? Is making that determination a service essential for providing your service? The answers apparently were "You don't" and "No", which would effectively make companies assume that the GDPR applies to everyone on the planet.
The GDPR also is fundamentally opposed to how things currently work on the internet, making almost all advertising on the web illegal overnight. It was too big of a change to happen at once, so it is effectively only loosely enforced in practice.
I like the idea of the GDPR, but the implementation sucks.
What utter utter FUD
You are free to collect as much personal data as you want, PROVIDING you have my explicit opt-in informed consent to do so.
What about this is difficult to understand?
> How are you to know whether or not the user is an EU citizen (and thus subject to the GDPR)?
The GDPR provides _basic_ data safety and consumer protection. If you aren't protecting users private data regardless of where they live in line with GDPR principles (such as collecting it fairly, and not selling it to randoms) then you are playing fast and loose with your users private, sensitive data. In which case you need to _seriously_ consider if what you are doing is ethical.
> The GDPR also is fundamentally opposed to how things currently work in the internet, making almost all advertising on the web illegal overnight.
Utter Bullshit!
You are free to advertise as much as you like! But if you want to track me with your advertising (hello scummy adtech industry) then you need my explicit informed consent to do so. And so you should!
Again, what about this is difficult to understand?
It's interesting and revealing when someone responds to a law that says "You're not allowed to abuse users in countries X, Y, and Z" with "How can I figure out who's in the other countries, so I can abuse them?" instead of "I'll just stop abusing everyone, and then I don't even need to worry about where anyone is."
Whenever you find yourself asking "how do I toe as close to the 'illegal' line as I can without technically going over it?" I think it's time to ask yourself some pretty hard questions.
DPA won't punish you for not following EDPB's recommendations, they will punish you for breaking GDPR. You are free to ignore EDPB if you think your legal position is strong, but you carry the risk if you are wrong.
The rest of the "It'S So LaRgE AnD UndErSpEciFieD" is just FUD. The regulators don't just slap fines, they work with you to get you to comply, and they just want to see that you're putting in the effort instead of messing them about.
I have literally never been surprised by the GDPR. Whenever I thought "surely this is allowed" it was, whenever I thought "this can't be allowed", it wasn't. For everything in the middle, nobody will punish you for an honest mistake.
This is not too hard if you do proper engineering work ahead of time and are purposeful about how you move and manage data (step 1 is just not collecting it unless its vital). But the industry encourages us to be very bad about that because we gotta "move fast and break things or you're not gonna make it."
How do you know that? Again the law establishes a rules making body that can at any time change or add rules, and as far as I can tell there's no public review process.
Please quote the exact text of the law that you claim does that. And since the law has been in force for 10 years, perhaps you can point at the website of said body.
If you say "DPAs", then...erm... perhaps learn something about the world around you? Who do you think monitors compliance, say, for food, or for construction? It just appears out of nowhere? Same here
Just don't spy on people.
Until that changes you're going to be stuck.
Something as simple as the Data Protection Act 1998 (https://en.wikipedia.org/wiki/Data_Protection_Act_1998) would kneecap a lot of the shady shit that goes on in the USA.
So the current feedback process involves: construction → exploitation → reporting → public awareness → legislation. This is too slow. Moreover, operating in this environment is exhausting.
We need a different feedback loop altogether. I'm not sure which one would work best, but something different needs to be considered.
And critically, it is not someone becoming aware of private information that is the abuse of privacy, it is exploiting that private information which is the abuse. There may be countless legitimate technical reasons you need to collect data, but there can not possibly be a technical justification for selling it.
[1] https://pluralistic.net/2023/10/21/the-internets-original-si...
There was one person with a feminine name who showed up with a “home address” that would correspond to being my “neighbor” at home, at my clinic, at church, when I went to college, etc. All the years corresponded correctly, and the addresses were some residential place about a block or less away from the places where I went.
For all I know, this person was either fictional or an innocent bystander. She did appear to have a Facebook account or two. I was never able to directly contact her. But I found it very strange, and I wondered what would be gained by doxxing me in this manner?
Of course this has nothing directly to do with GPS coordinates, but imagine if the GPS began to be part of your public record as well, or on your credit report. Imagine if it was entered into the public record what coffee house you visited every morning, or if there were errors in this record.
* coordinates
There are many ways of establishing one's latitude and longitude without recourse to one particular GNSS system.
https://citizenlab.ca/research/analysis-of-penlinks-ad-based...
The previous views on privacy didn't take into account the fact that everyone now has video cameras and people are incentivized to violate privacy to make money as influencers. I think people's privacies need to be protected and I think that means making laws around it much, much stricter. This includes things like location data, it shouldn't be sold or exposed at all.
Imagine an option on your iPhone that says "Enable this to allow geo-location tracking for organisations registered under the NOADSJUSTPUBLICGOOD Act". Then any wifi endpoint could locate you based on signal strength etc., and that data could only be made available to people registered under the act.
Would we see new understanding of how people move around in cities? Would we see better traffic information? I think so, as long as people believe that there are real teeth to the laws and that they are enforced loudly and publicly.
We should embrace the benefits of a society-wide epidemiology experiment; the benefits for public health are incredible. (Add to that supply chain logistics on open ledgers and many of the new things that just were not possible before, and the future of open, transparent, but well regulated democracies is bright.)
Let me know if you spot one.
Alas, I was stymied by not having any cash to work on it, and the unit economics were not very VC friendly (at least I assume that’s one of the reasons why I didn’t get any traction from VCs).
What about: "If something bad happens because of the data your company shared or lost, it is criminally and financially liable?"
Fitness apps can be local. We have pocket supercomputers; certainly, we don't need help from the clown to keep track of how far (or how energetically) we biked or walked today, or where that took place.
And the FLOSS/Linux phone hardware attempts have frankly sucked.
I was hoping that my PinePhone Pro would actually be usable. But no, it's a PineDoorstop.
Proper Linux would be a great 3rd choice. But yeah. We've got a duopoly and not much we can do about it.
Missed opportunity by the EU when they wrote GDPR.
Not really.
There are legitimate reasons why I might wish to be tracked or give my personal data to a company. As long as I'm asked to give clear, opt-in informed consent, this is perfectly fine. This is the very essence of the GDPR!
Instead, direct your ire to the scummy adtech industry who are constantly asking to invade my privacy and smell my knickers trying to work out what I ate for lunch. Another law to ban the adtech industry would be welcome from me, though would meet fierce resistance from the likes of Google.
The GDPR is well written.
In these cases they don't even need to ask for your permission.
> Instead, direct your ire to the scummy adtech industry who are constantly asking to invade my privacy and smell my knickers trying to work out what I ate for lunch. Another law to ban the adtech industry would be welcome from me, though would meet fierce resistance from the likes of Google.
No, the EU should have done more to prevent this. They didn't want to kill a billions-of-euros industry. But they should have.
GDPR has literally nothing to do with cookie popups. That was, and is, adtech
that's what causes the popups.
it should prohibit it outright, consent or not.
The adtech industry has, time and again, proven they cannot self-regulate to any decent capacity. At this point, the only reasonable course of action is to shackle them down with such heavy legislative burdens they're rendered de facto extinct.
I will not mourn their loss.
The EU is first and foremost a capitalist economy which nevertheless tries to protect people from abuse. Who are they to forbid someone from collecting data, or someone else from providing it? Even things like quality surveys collect personal data.
However, adtech and tracking (also capitalists, (un)ironically) ruined it for everyone.
For example, giving consent should be the same difficulty as denying it, so one-click consent means there must also be one-click non-consent. But this is policed very poorly.
I think they should just ban adtech altogether, at least any form of targeted advertising, individual pricing (which is already illegal in many EU countries) and ideally also deep market research.
It's a rhetorical fiction the ad industry tells itself.
Then it's not anonymous.
Simple as that.
Edit: It's a rhetorical fiction the ad industry tells us.
https://arxiv.org/abs/cs/0610105
If movie ratings are vulnerable to pattern-matching from noisy external sources, then it should be obvious that location data is enormously more vulnerable.
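A toy simulation makes the point. This is entirely synthetic data, not a real dataset: give each fake person a handful of coarse (cell, hour-of-week) visits, then check how many people are uniquely pinned down by just a few leaked points. This mirrors the well-known "a few spatio-temporal points suffice" result, though real mobility data is skewed toward popular places, which changes the exact numbers but not the trend.

```python
import random

random.seed(0)

# Synthetic population: each "person" visits a handful of
# (cell, hour-of-week) slots. 500 cells x 168 hours, ~20 visits each.
N_PEOPLE, N_CELLS, N_HOURS, VISITS = 500, 500, 168, 20

population = [
    frozenset((random.randrange(N_CELLS), random.randrange(N_HOURS))
              for _ in range(VISITS))
    for _ in range(N_PEOPLE)
]

def unique_fraction(k):
    """Fraction of people pinned down by k random points from their own trace."""
    hits = 0
    for trace in population:
        sample = set(random.sample(sorted(trace), k))
        matches = sum(1 for other in population if sample <= other)
        hits += (matches == 1)
    return hits / N_PEOPLE

for k in (1, 2, 3, 4):
    print(k, round(unique_fraction(k), 3))
```

Even with only one or two leaked points, most of the synthetic population is already unique; movie ratings are far noisier than this, and real location traces are far richer.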
waiting for legislation or EULAs to fix this is a lost cause, since adtech always finds a loophole. the fix has to be architectural: moving toward stateless proxies that strip device identifiers at the edge before they even hit upstream servers. if the payload never touches a persistent db, there is literally nothing to de-anonymize. stateless infra is the only sane way forward
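As a minimal sketch of that idea (all field names here are made up, not any real SDK's schema): a pure scrubbing function a stateless proxy could apply to each payload before forwarding, with nothing logged or stored.

```python
# Sketch of "strip identifiers at the edge": a pure function a stateless
# proxy applies to each payload before forwarding upstream. The key names
# are hypothetical examples of common device identifiers.

DROP_KEYS = {"device_id", "idfa", "gaid", "mac", "serial", "android_id"}

def scrub(payload: dict) -> dict:
    """Return a copy of payload with device identifiers removed,
    recursing into nested dicts. Nothing is logged or persisted."""
    clean = {}
    for key, value in payload.items():
        if key.lower() in DROP_KEYS:
            continue
        clean[key] = scrub(value) if isinstance(value, dict) else value
    return clean

event = {
    "device_id": "a1b2c3",
    "app": {"name": "weather", "idfa": "0000-1111"},
    "metric": {"temp_request": 1},
}
print(scrub(event))  # {'app': {'name': 'weather'}, 'metric': {'temp_request': 1}}
```

The point is that the scrubber holds no state: there is no lookup table mapping identifiers to anything, so a breach of the proxy yields nothing to join against.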
why would someone include tech that makes people think twice about using the app, unless it's required to "sell" in a particular venue?
if you're developing geolocation-based apps, location tracking is a core function.
a calendar absolutely does not require location tracking beyond what side of the prime meridian you are on.
But the subsequent sale of that data is not, and that is the discussion here.
you can't sell what you don't have, unless you lie lower than a rug.
fix the data collection problem, and a second-order effect emerges: no data for sale.
Because the overwhelming majority of people don't think twice about this tech.
I do, and that's why I use a lot of web tools or old-fashioned phone calls, but most people think metadata=unimportant and assume that the purpose of the app is what it does for them rather than to gather their personal information for sale.
Even if Google and Apple both want to commit to fighting this, it becomes a game of whack-a-mole, because there are all sorts of different ways to track users that the platforms can't control.
As an easy example: every time you share an Instagram post/video/reel, they generate a unique link that is tracked back to you so they can track your social graph by seeing which users end up viewing that link. (TikTok does the same thing, although they at least make it more obvious by showing that in the UI with "____ shared this video with you").
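A hypothetical sketch of that mechanism (the function names, URL, and token scheme are invented for illustration, not Instagram's actual implementation): mint a unique token per share, map it server-side to the sharer, and every later view of that link yields a social-graph edge.

```python
# Toy model of per-share tracking links: each share mints a unique token
# mapped to (sharer, post), so any later view of that URL reveals who
# forwarded it, building a social graph one edge at a time.
import secrets

share_index = {}  # token -> (sharer, post_id); server-side state

def make_share_link(sharer, post_id):
    token = secrets.token_urlsafe(8)
    share_index[token] = (sharer, post_id)
    return f"https://example.com/p/{post_id}?share={token}"

def record_view(token, viewer):
    """On page load, resolve the token back to the sharer."""
    sharer, post_id = share_index[token]
    return {"edge": (sharer, viewer), "post": post_id}

link = make_share_link("alice", "reel42")
token = link.split("share=")[1]
print(record_view(token, "bob"))  # {'edge': ('alice', 'bob'), 'post': 'reel42'}
```

Note that neither party opted into anything: the edge is inferred purely from who opened which link.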
Is there not also a requirement for clean consent? Ie a weather app can’t track your precise location?
I think a lot of people don't realize the power of a big enough sample size. With enough samples, even something pretty innocent-looking like your daily step counter could make you identifiable.
As far as I know we don't have large enough databases to make this happen in practice, but I don't think this is impossible in the future.
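A toy sketch of how that would work, using purely synthetic data: given a database of daily step counts, a short leaked window of one user's counts usually matches exactly one user.

```python
# Toy illustration: even a "harmless" daily step count is a time series,
# and a short window of it can single out one user in a larger database.
# All data here is randomly generated, not from any real fitness service.
import random

random.seed(1)

N_USERS, DAYS = 5_000, 90
db = {u: [random.randint(0, 20_000) for _ in range(DAYS)]
      for u in range(N_USERS)}

def who_matches(window, tolerance=100):
    """Users whose history contains a run matching `window` within tolerance."""
    k = len(window)
    hits = []
    for user, series in db.items():
        for i in range(DAYS - k + 1):
            if all(abs(series[i + j] - window[j]) <= tolerance for j in range(k)):
                hits.append(user)
                break
    return hits

leak = db[1234][30:35]          # five days of one user's step counts
print(len(who_matches(leak)))   # typically collapses to a single user
```

With a ±100-step tolerance, a single day matches a random user roughly 1% of the time, so five consecutive days almost never collide by chance, even across thousands of users.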
Alone, these points are not deanonymizing; it's when there's other data associated that they become so.
The analytic reconstruction of identity from location is far more sophisticated than the scenarios people imagine. You don't need to know where they live to figure out who they are. Every human leaves a fingerprint in space-time.
It's not though.
Critical for myriad elective purposes? Sure.
Could you be more specific with maybe a single example of where my physical geographic location is electronically critical for a purpose that isn't elective/optional/avoidable?
(And I'm not just trying to be obtuse. I think you're touching on at least part of the 'heart' of both this conversation and that of digital ID verification.)
Edit: I assume I am missing a crucial part of logistics that you’re familiar with, genuinely curious.
A lot isn't good enough.