Will the GDPR save you from killer robots?

Automated decision-making & profiling in a near future dystopia

Image: Siyan Ren
* I mean, if you’re an E.U. citizen, that is. Because the rest of us seem more or less f*&%ked.

EU playing hardball

The EU’s General Data Protection Regulation, when it comes into force next year on the 25th of May, 2018, looks set to be the most far-reaching and comprehensive privacy and data protection system out there (that I’ve seen, anyway). Companies both inside and outside the Union that want to target EU citizens will have to play ball or be excluded from the European Economic Area. In exchange, the regulation sets up one arm of a digital “one-stop shop” for companies doing business in the EU, so they aren’t subject to a patchwork of varying rules in each member state.

Automated processing and you

So anyway, in Chapter 3: Rights of the Data Subject, Article 22, paragraph 1, the GDPR reads:

“The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

I’ve been trying to grasp what “legal effects” probably means here in a more mundane context. Given that the relationship between users (data subjects) and service providers (data controllers) is generally a legal agreement or contract, like a Terms of Service or an End-User License Agreement, how should we interpret this clause? Is it just anything that touches on the ToS or EULA?

The regulation also promises safeguards for the data subject, including:

“…the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision.”
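In software terms, you can picture this as a compliance gate sitting in front of an automated decision pipeline: any decision that is solely automated and carries legal or similarly significant effects gets routed to a human reviewer before it stands. The GDPR doesn’t prescribe any implementation, so what follows is only a minimal sketch; the Decision structure and the escalate_to_human hook are hypothetical names I’ve made up for illustration, not anything from the regulation or a real library.

```python
from dataclasses import dataclass


@dataclass
class Decision:
    subject_id: str          # the data subject the decision concerns
    outcome: str             # e.g. "loan_denied", "account_suspended"
    solely_automated: bool   # no meaningful human involvement so far
    has_legal_effect: bool   # produces legal or similarly significant effects


def escalate_to_human(decision: Decision) -> Decision:
    """Hypothetical hook: queue the decision for a human reviewer,
    record the subject's point of view, and return the reviewed result."""
    # In a real system this would block on (or schedule) manual review.
    print(f"Escalating decision on {decision.subject_id} for human review")
    decision.solely_automated = False
    return decision


def finalize(decision: Decision) -> Decision:
    """Article 22-style gate: a solely automated decision with legal or
    similarly significant effects is not allowed to stand on its own."""
    if decision.solely_automated and decision.has_legal_effect:
        return escalate_to_human(decision)
    return decision


if __name__ == "__main__":
    d = Decision("subject-42", "loan_denied",
                 solely_automated=True, has_legal_effect=True)
    finalize(d)
```

Whether a real controller would build anything like this, or simply argue that its decisions fall under one of Article 22’s exceptions, is exactly the open question.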

Meanwhile, IRL

In actual practice, we know from ample experience that rules and laws are often followed only where and when it’s expedient. I suspect killer robots won’t be much different.

[Man runs down a dark alley between warehouses at night.]
[Private security robot appears.]
ROBOT: (to Man) Stop, or I will shoot!
[Man stops, slowly turns around, raising his hands in the air.]
MAN: I--I’m an EU citizen. I have rights.
[Robot scans the biometrics of the man, processing through a proprietary database…]
MAN: I do not consent to be scanned. (looks around at ambient cameras filming the alley)
ROBOT: Scanning is obligatory in this area for purposes of public safety.
[Robot continues scanning.]
MAN: Come on, man. Just let me get my ident card. I’ll show you — (seems to reach for wallet).
[Robot shoots the man.]

Material Scope

Without a doubt, some balancing of individual rights against public safety will be necessary, but that balance is explicitly outside the “Material Scope” of the GDPR as laid out in Article 2:

“This Regulation does not apply to the processing of personal data: […]

by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.”

So, based on this exclusion, I’m guessing that any corporation that could bring killer robots to market would either end up being a “competent authority” itself, or would have its products used by one in the name of public safety.

