Communication: Precire Technologies

The company Precire of Aachen, Germany, receives the BigBrotherAward 2019 in the “Communications” Category for its speech analysis software. Precire is not only used to preselect job candidates, but also by call centres to analyse callers’ emotions.
Laudator:
Rena Tangens, Digitalcourage

The BigBrotherAward 2019 in the “Communications” Category goes to Precire Technologies GmbH of Aachen for their scientifically dubious, probably illegal, and dangerous speech analysis.

Precire — never heard of them? That may be. But it is quite possible that Precire has already heard you.

Imagine you just applied for a new job. And — yay! — you have taken the first hurdle: you are invited for an interview. However, it’s going to be over the phone.

“Please describe your typical Sunday.”

Weird. That’s not exactly the question you expected at the beginning of a job interview.

“How was your last vacation?”

Asking “why are you interested in that?” is pointless. Firstly, this telephone interview is conducted by a computer; secondly, it is not interested in you at all; and thirdly, even the company behind this phone-based job test is not interested in the content of your answers. The computer on the phone simply wants a sample of your speech. And that means it needs to keep you talking about anything for 15 minutes.

“What do you enjoy most about your current occupation or your learned trade?”

Erm, interacting with people? Just kidding. But as we said, it’s not about what you say, but about how you say it. Precire, the company to whose telephone voice you are talking, claims that every person’s voice is as unique as their DNA. Precire also claims that they can derive a person’s character from a 15-minute speech sample, and figure out how suited they are for a certain job.

To that end, Precire allegedly decomposes recorded speech samples into over 500,000 components and analyses everything imaginable: pitch, volume, modulation, speed and rhythm. In addition it analyses the frequency of certain words, the lengths of sentences and pauses, how convoluted the sentences are, how many filler words are used, how often words like “I” and the neutral pronoun “one” appear, how much variation there is in the voice, and the acoustics. The computer also immediately classifies the speech as emotional or rational, cautious or forthright. To do this, Precire compares the patterns it has found with those of test persons in its database. It then simply applies to you the psycho-test results of persons who sound similar — and deduces your character from that.
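Stripped of marketing, the procedure described here amounts to nearest-neighbour matching: reduce a recording to a handful of numbers, find the most similar-sounding reference speaker, and copy over that person’s psycho-test scores. The following Python sketch is a deliberate caricature of that idea; the three features, all numbers and the tiny “database” are invented, and this is of course not Precire’s actual algorithm.

```python
from math import dist

# Hypothetical reference database: a feature vector (average pitch in Hz,
# words per minute, filler-word ratio) paired with the psycho-test scores
# of the test person who produced it. All values are invented.
REFERENCE = [
    ((120.0, 140.0, 0.05), {"curiosity": 8, "sociability": 7}),
    ((180.0, 110.0, 0.12), {"curiosity": 4, "sociability": 6}),
    ((150.0, 160.0, 0.02), {"curiosity": 7, "sociability": 3}),
]

def assess(sample):
    """Hand the caller the psycho-test scores of the closest-sounding person."""
    _, scores = min(REFERENCE, key=lambda entry: dist(entry[0], sample))
    return scores

# A 15-minute call is reduced to three numbers, then judged by similarity:
print(assess((145.0, 155.0, 0.03)))
```

The caller simply inherits the personality profile of whoever in the database sounds most alike, which is exactly why the validity of the reference data is the crucial, unanswered question.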

The results are ratings such as those for freelance journalist Eva Wolfangel, who tested Precire on herself: she was found to be extremely curious (8 out of 9), but luckily was also very agreeable (also 8), sociable (7) and ready to take risks (6). At the same time the system judged that she had a “reserved style of communicating” and that she was “not very supportive”.1

First one thing, then another; on the one hand this, on the other hand that. It sounds like the weekly horoscope in a crossword-puzzle magazine. Here is an example of such a horoscope:

“You wish for your fellow humans to like you, but still have a tendency to criticise yourself.” Or: “Sometimes you seriously doubt that you have taken the right decision.” And a real gem: “You are intelligent and not easily fooled.”

Texts like these are trivial, their wording somewhat flattering yet slightly critical, and there is always something that applies to everybody. Many people are taken in by that and think, “there’s really something to it!” The scientific term for this phenomenon is the “Barnum effect”, named after the great circus showman P. T. Barnum.2

Precire, though, claims that their method is scientifically valid.3 They point to a book published by Springer Gabler, an academic publisher focusing on economics.

Scientific?

But it would be wrong to portray this as reputable science, as Dr. Uwe Kanning, professor of business psychology at the University of Applied Sciences Osnabrück, critically explains in his book review.4 There is no independent research on the topic, he argues, and the algorithm is a black box, the vendor’s trade secret. The available studies did not gather their own data but are master’s theses based on data provided by Precire. Access to further studies has resolutely been denied, even upon request. This was also criticised by freelance journalist Bärbel Schwertfeger.5

To recap: There is no evidence for a direct relationship between speech parameters and job performance. And there is no evidence that predictions created on that basis have any validity concerning the future performance of job applicants.6

Precire’s speech analysis will usually not find causality, but mere correlations, and these can be completely random. One might as well base the decision on someone’s astrological sign. Or on shoe size. It would then go like this: if the computer finds that the three previously most successful employees all had shoe size 10, it will henceforth recommend only persons with size 10 for management positions.
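The shoe-size logic is easy to make concrete. Here is a minimal sketch, with wholly invented data, of a “recommender” that has learned nothing but a chance correlation:

```python
# Invented training data: (shoe size, counted as a "successful manager"?)
past_employees = [(10, True), (10, True), (10, True), (8, False), (9, False)]

def recommend(shoe_size):
    """Recommend a candidate if their shoe size matches a past 'success'.

    This is pure correlation with zero causal content, yet it produces
    confident, number-based decisions just like any black-box model.
    """
    successful_sizes = {size for size, successful in past_employees if successful}
    return shoe_size in successful_sizes

print(recommend(10))  # True: "management material"
print(recommend(7))   # False: filtered out before any human sees the CV
```

Replace “shoe size” with “sentence length” or “filler-word ratio” and the structure of the decision is the same; only the feature sounds more scientific.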

Media reports on Precire have appeared since 2015, and with few exceptions7 their gist has largely been: “Look at this fascinating technology we have today!” Sometimes there is a slight shiver of unease, but hardly any journalist asks really tough questions or demands access to the studies that allegedly support the scientific validity of Precire’s methods but have been kept under wraps.

And nobody pierces through their numerous contradictions: On the one hand Precire claims that psychological tests are inconclusive, because the test persons could sense the desired answers. But on the other hand, Precire uses a data pool of speech samples of thousands of test subjects, whose characters were evaluated using psychological tests. To put it another way: their algorithms try to mimic the results of psychological tests. And nobody inquires why on the one hand there is talk about “immutable speech DNA”, while on the other hand Precire offers a speech training app8. Personnel development through speech training is the second part of Precire’s business model. The supposed “DNA” does not seem to be so immutable after all.

Participation in such a speech analysis is voluntary. Applicants cannot be forced to submit to a computer-based telephone interview. This is also true after the entry into force in 2018 of the European General Data Protection Regulation (GDPR), particularly Article 7, Paragraph 4 (“freely given consent”). But hey, how freely will the consent be given when it is about getting a very desirable job?

So who is using Precire? Allegedly more than 100 companies in Germany. Some of them are known: the recruitment agency Randstad, the insurance companies HDI and Gothaer, the transport business giant Fraport, the Handelsblatt newspaper, consulting company KPMG, electricity supplier RWE, health insurance company DAK, the Accor Hotels group, IBM and the mobile communications provider Vodafone.

Why on earth are companies doing that?

Because it is modern, and because they believe that it will lower their burden. The burden of preselection, and also the burden of decision-making. Computers are fast, and there are numbers. Numbers are facts, and computers are objective. Or are they?

Objective?

Thomas Belker, previously a board member for human resources at the Talanx insurance company — Talanx is a Precire customer — is convinced of the software’s efficiency. So convinced, in fact, that he changed jobs and became the new CEO of Precire on 1 May 2019. He thinks that the programme not only saves time and money, but is also more objective and therefore fairer. Quote: “The machine knows nothing about subconscious prejudice, to which every human is subject. It does not care if someone is a man or a woman, or what colour his skin is.”9

As Dirk Gratzel, Precire co-founder, puts it: “A machine is much less error-prone than a recruiter. ‘Objective’ is the only thing a machine can do.”10

This is a very naïve point of view. Computers cannot “understand” us, but we must not confuse that with neutrality. Computer programmes are also subject to prejudice, because they are programmed by humans: humans who take their own sets of values for granted and incorporate them into the software, humans with particular questions, goals and motives. Above all, the software is trained on a selected reference group of persons, and that group then defines what is seen as “normal”, good or bad.

Precire also offers services for the analysis of qualified management personnel. The theory: whoever sounds like a manager of a DAX (German stock market index) company might qualify as a manager. Speech samples from managers of the 30 DAX-listed companies were used as a basis. Of course these managers did not partake in the telephone interview shenanigans; instead, Precire simply used speeches by board members published on YouTube as base material. This is not only methodologically dubious. In a witty commentary, business psychology professor Uwe Kanning (University of Applied Sciences Osnabrück) writes: “This study’s strong side is its entertainment value.”11

But scientific or not: if the method is actually used to select management personnel, the consequences are severe. As can be seen in the Precire advertising clip: “You are looking for someone who fits in with your team.” The picture shows a man in a grey suit and a red tie, welcoming into the team — you guessed it — another man in a grey suit and a red tie.12

According to a 2019 study, there are more managers called Michael than there are female managers of any name. The same is true for Thomas, Andreas, Peter and Christian.13

More diversity in the boards of directors? This is certainly not how it is going to happen. Selection methods like this, based on similarity, will only lead to hiring “more of the same”. Only now it is dressed up in the guise of Artificial Intelligence.

When humans make the decisions, it is clear that the “human factor” plays a role. Prejudice by algorithm is more dangerous: it cannot be recognised as long as we believe the fairy tale of the “objective computer”.

Precire is riding a trend of using speech analysis for exploring and classifying people, and of ascribing magical powers of judgement to “Artificial Intelligence”. There is an unwitting side-effect to this: The cleaning-up of prejudices in the “algorithmic washing-machine” discreetly makes them respectable again. “The computer made that selection. We have no idea why”.

Not all company CEOs can warm to the idea of selecting personnel by computer. Sebastian Saxe, IT board member of the Hamburg Port Authority, says: “I am fond of digitalisation. But if we were to preselect applicants by software, we would never get to see people with rough edges who in the end turn out to be a good fit for us.”14

Thank you for this clear statement.

Call centre

Maybe your thoughts so far were: “What a disgusting technology. But I am not looking for a job, so I am not affected.” Well, you would be wrong, because we now come to the third part of Precire’s business model: the use of speech analysis in call centres.

So how does that work? Call centres apparently record calls, forward them to one of Precire’s servers and have them analysed there in real-time. Precire then offers suggestions to the call centre agent on how to proceed. It is as if there were a psychologist sitting in the call centre agent’s ear at all times, listening in and whispering advice on what to say or what to offer next. For example, the software supposedly notices whether a customer is so enraged that she wants to terminate the contract, or whether she only wants to get a discount. Precire touts:

“Complaints are raw, unconscious, emotional speech, a unique data source and an excellent opportunity to intensify lasting customer relationships.”

“Using a direct IT connection Precire analyses incoming complaints, recognises psychological patterns (…) and points out indications and anomalies to the deciders.”

“All the while the system itself is learning — in the case of the next-best-action advice — with every new user (‘Artificial Intelligence’) and helps you turn your customers’ speech data into real added value.”15

Here, too, Precire’s marketing-speak sounds exaggerated. But in contrast to predicting a person’s suitability for a job, it is actually possible that speech analysis will find out something about the caller: their origin via their dialect, or emotions, insecurity and stress.
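Mechanically, the call-centre scenario described above is a simple loop: stream audio to an analysis server, receive an emotion estimate, and surface a scripted suggestion to the agent. The following sketch only illustrates that loop; the emotion labels, the suggestion table and the stub classifier are all invented, and a real system would upload the audio to a remote service instead.

```python
# Scripted "next best action" for each emotion label (labels are invented).
SUGGESTIONS = {
    "enraged": "Escalate to a supervisor before the customer cancels.",
    "bargaining": "Offer a small discount; the customer likely wants to stay.",
    "neutral": "Proceed with the standard script.",
}

def analyse_chunk(audio_chunk: bytes) -> str:
    """Stub standing in for the remote speech analysis; returns an emotion label."""
    # Fake classification for illustration: lots of exclamation marks = anger.
    return "enraged" if audio_chunk.count(b"!") > 2 else "neutral"

def advise_agent(audio_chunk: bytes) -> str:
    """The 'psychologist in the ear': map the emotion label to whispered advice."""
    return SUGGESTIONS.get(analyse_chunk(audio_chunk), SUGGESTIONS["neutral"])

print(advise_agent(b"I have had it with you!!!!"))  # treated as an enraged caller
```

The point of the sketch is how little the agent, let alone the caller, sees of this pipeline: the classification happens silently on a server, and only the suggestion surfaces.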

Since 2014 Precire has been advertising its services to call centres. Fittingly, the company became a member of the call centre association and of the German dialogue marketing association in 2016.16 It is however curiously reserved about providing tangible information about German call centres using speech analysis on callers. A quote by Precire founder Dirk Gratzel gives some insight into the possibilities: “We have an answer for everything desired by our customers.” For example: “What can the customer’s speech tell us about him?”, “Is this health-insured person depressed?”, “Is this client lying when filing his claim report?”17

This use of speech analysis in a call centre is not only unethical, it is illegal: it violates the confidentiality of telecommunications and the confidentiality of the spoken word (German criminal code: Strafgesetzbuch, “StGB”, Para. 201). This is a crime with substantial penalties.

It is high time for someone to step up and file a criminal complaint, for example the data protection commissioner of the State of North Rhine-Westphalia. And the appropriate regulatory authorities should also take a very close look at Precire’s competitors, namely the start-ups “100 Worte” (“100 Words”) from Heilbronn and “Audeering” from Munich.

Detecting emotions and motivations via speech analysis is dangerous, because it can take place without our knowledge, hidden in the background, whenever we speak. This kind of speech analysis is perfectly suited for taking advantage of us. Individual people will lose more and more power, and unassailable power is getting concentrated at large corporations, insurance companies, banks and government authorities, who have access to our data and to these kinds of technologies.

We therefore call on the legislator: The use of speech analysis and “Artificial Intelligence” for the purposes of assessing personalities, emotions and motivations must be prohibited.

Dear listeners, the next time you are on the phone with a call centre and you hear the announcement that “parts of this call will be recorded for quality improvement”, state decisively at the outset: “No!” and “I do not want you to record this call!”

Learn to say “No”. With a friendly, but clear voice.

We say “No” to Precire.

Congratulations for receiving the BigBrotherAward 2019!

Sources

1 Source: Riffreporter, Eva Wolfangel: "Google, wird meine Ehe halten?" (Google, will my marriage survive?) (Web-Archive-Link)

2 Barnum effect / Forer effect (Web-Archive-Link)

3 Precire website (translated): “Precire’s scientific basis is at the heart of the technology: numerous validation studies (internal and external) safeguard the process.” Original: “Die wissenschaftliche Fundierung von Precire ist das Kernstück der Technologie: Zahlreiche Validierungsstudien (intern, extern) sichern das Verfahren ab.” Quoted in Wirtschaftspsychologie aktuell, 6 Nov 2017: Ärger des Monats – Algorithmische Verirrungen (annoyance of the month: algorithmic aberrations) [Content no longer available]

4 Uwe P. Kanning: Sprachanalyse: eine neue Methode der Personalauswahl? (Speech analysis: a new method of selecting personnel?) (Web-Archive-Link)

5 Personalmagazin, 12/2015. By Bärbel Schwertfeger: Personalauswahl per Sprachtest. (Selecting personnel via speech test.) (PDF)
Comment by Bärbel Schwertfeger (Web-Archive-Link) below (translated): “Hi, as the author of the Personalmagazin article I have looked at Precire in detail and was left speechless several times about how they tried to ‘convince me’ and would never divulge any relevant data (a scientist who took an interest was even asked to sign a gagging contract) – including a cease and desist letter from a lawyer clearly intended to intimidate BEFORE the article appeared (nothing happened afterwards). A strange business strategy at least! But at least my test results earned me some great laughs from people who know me well. Bärbel Schwertfeger” Original: “Hallo, als Autorin des Artikels im Personalmagazin habe ich mich ausführlich mit Precire beschäftigt und war doch mehrmals sprachlos, wie man versucht hat, mich “zu überzeugen” und partout keine relevanten Daten herausrückte (Ein Wissenschaftler, der sich dafür interessierte, sollte sogar einen Knebelvertrag unterschreiben) – einschließlich einer wohl als Einschüchterung gedachten anwaltlichen Abmahnung VOR Erscheinen des Artikels (danach kam nichts). Zumindest ein seltsames Geschäftsgebaren! Aber immerhin ernte ich mit meinem Testergebnis schöne Lacherfolge bei Menschen, die mich gut kennen. Bärbel Schwertfeger”

6 wirtschaftspsychologie-aktuell.de: Psychologische Diagnostik durch Sprachanalyse [Inhalt nicht mehr verfügbar]

7 Exceptions e.g. Bärbel Schwertfeger in Personalmagazin, 12/2015: Personalauswahl per Sprachtest. (Selecting personnel via speech test.) (PDF)
Eva Wolfangel at Riffreporter: "Google, wird meine Ehe halten?" (Google, will my marriage survive?) (Web-Archive-Link)

8 Precire-Coach described on the website (translated): “How we speak matters: As a coaching App, Precire helps to analyse your language and train your expression skills.” Original: “Wie wir sprechen, zählt: Als Coaching-App hilft PRECIRE, die eigene Sprache zu analysieren und das Ausdrucksvermögen zu trainieren.” [Content no longer available]

9 Source: Frankfurter Rundschau, 26 May 2018 Künstliche Intelligenz – Vorstand von Computers Gnaden. (AI – Leadership Position by the Grace of the Computer.) By Annika Leister. (Web-Archive-Link)

10 Source: Spiegel Nr. 3, 12 Jan 2019. By Martin U. Müller: Plaudernd zum Job (Chatting towards your job)

11 Source: Wirtschaftspsychologie-aktuell.de, 25 Apr 2018: Fachbuch im Fokus. By Prof. Dr. Uwe Peter Kanning. Book review of Klaus P. Stulle (Ed.): Psychologische Diagnostik durch Sprachanalyse. Validierung der Precire-Technologie für die Personalarbeit. Wiesbaden: SpringerGabler 2018 [Content no longer available]

12 Precire commercial video [Video no longer available]

13 Source: Manager Magazin, 6 Mar 2015, Analyse zur Frauenquote – Weniger Frauen in Vorständen als Männer, die Thomas heißen. By Christoph Rottwilm. (Web-Archive-Link) and Gründerszene: Michaels, Thomasse und Andreasse dominieren die Gründerszene. (Web-Archive-Link)

14 Source: Computerwoche, 16 Jun 2016: Personalauswahl 4.0 – Wenn Software in die Seele des Bewerbers schaut. (Personnel selection 4.0 – When Software looks into the Candidate’s Soul) By Michael Schweizer. (Web-Archive-Link)

15 These quotes were taken from Precire’s old website; today they can only be found with the Wayback Machine at archive.org. Back then Precire was called “Psyware” and used much more direct language. In 2016 the company changed its name – maybe the similarity to the word “spyware” had become too sensitive for them after all … (Web-Archive-Link)

16 News on the Precire website, 22 Jul 2016: membership of the call centre association and of the German dialogue marketing association. [Content no longer available] (Web-Archive-Link)

17 Source: FAZ, 20 May 2015: Persönlichkeitsanalyse – Deine Sprache verrät dich. (Personality analysis – your language betrays you.) By Katrin Hummel. (Web-Archive-Link)

Biotechnology (2019)

Ancestry.com

The company Ancestry.com receives the BigBrotherAward 2019 in the “Biotechnology” Category for enticing people with an interest in genealogy into submitting saliva samples. Ancestry.com sells genetic data to commercial pharmaceutical researchers, creates a data pool that is a prerequisite for genetic screening by the police, and enables covert paternity tests, potentially causing family drama.
Laudator:
Dr. Thilo Weichert, DVD, Netzwerk Datenschutzexpertise

The BigBrotherAward 2019 in the “Biotechnology” Category goes to the company Ancestry.com and its subsidiary in Munich for exploiting an interest in genealogy to entice people into submitting saliva samples.

Research into one’s family ancestry, also known as genealogy, is a relatively harmless pastime: Who am I? Where do I come from? To whom am I related? These questions used to be answered by looking at birth, wedding and death certificates, family trees and church registers. But genetics offers entirely new insights, because an analysis of our genes, our DNA, tells us to whom we are biologically related – down to the third or fourth degree. With a certain probability, even the so-called biogeographical roots of our ancestors can be determined: the question of the geographical regions in which family members of past generations lived.

DNA is the abbreviation of deoxyribonucleic acid, the scientific name for our genome, the totality of our hereditary disposition. Knowledge gained from that can be baffling. It is therefore no wonder that many family-tree researchers submit their saliva sample for analysis, to find out more about themselves.

Genealogy is a widespread hobby in the United States, and many companies offer services for it. Ancestry.com is the market leader, allegedly with more than 10 million customers globally and 20 billion data records and documents, followed by the Google-backed company “23andMe” with 5 million DNA analyses.1

Ancestry has created a subsidiary in Munich and started a massive push onto the German market just before Christmas 2018. They promise a “journey of discovery”, “astonishing facts about yourself”, a “key to the past”. All of that for an introductory offer of only €79, since raised to €89, including taxes but excluding shipping. A real bargain: after all, the first decoding of all 3 billion base pairs of the human genome, completed in 2003 within the Human Genome Project, had cost 3 billion Euros. By 2008 the cost per genome had plummeted to one million Euros, and by 2011 a next-generation sequencing run could be had for a mere 10,000 Euros. A year later, a 1,000-Euro genome analysis could be performed within a few hours, thanks to newly available computing power and analysis technologies.

The offer is not only “cheap”, but also easy to obtain: I can open an Ancestry account online and use that to order a test kit. My saliva sample is sent to a laboratory, and after six to eight weeks I can download the results from my online account. Great!

In 2018 the ancestry.com website listed several German-language “partners” who allegedly vouched for the company’s proper conduct, including several state archives, the German National Library, the German Emigration Center (“Auswandererhaus”), the Navy Academy at Mürwik, the Swiss Federal Archive and the Lower Saxony Society of Family History (“Niedersächsischer Landesverein für Familienkunde”). However, when we asked these institutions about their partnership, they knew nothing about it. This unlawful advertising subsequently disappeared quickly.

The website claims that the offer was “compliant with data protection”. (Quote:) “Security and data protection are of the highest priority at Ancestry”. The customers would maintain (quote) “ownership of their data”. Data and tissue samples would be deleted or destroyed, respectively, on request. No forwarding to third parties would take place, except (quote) “as required by law” or “when you have given us explicit permission”. So everything’s fine, then?

The catch, as often, can be found in the small print, and in the case of Ancestry it is hidden in a thicket of legal provisions: a 16-page data protection statement2, eleven pages of terms of service3, and seven and a half pages of agreement to a research project, the “Ancestry Human Diversity Project”4.

By sending in the saliva, agreement is given to the data protection statement, which grants Ancestry unrestricted research rights to my data concerning (quote) “attributes, personal health and personal well-being”. By agreeing to the “Ancestry Human Diversity Project”, “collaborating partners” become part of the game. Those partners are located (quote) “in the United States and other countries”. They can be “academic institutions, non-profit organisations, commercial enterprises and government agencies”.

Whoever has given permission to this “Ancestry Human Diversity Project” gives up control over their genetic data and no longer has any influence over who performs what kind of research on it. According to press reports, about 80% of participants released the DNA they sent to 23andMe for “research purposes” and also provided additional information about themselves and their families.5 Figures for Ancestry are probably similar.

But there’s more. Customers, being the “owners of their data” are being denied any information by Ancestry about the so-called research, about methods, partners or conclusions drawn from it. The reasons behind this become obvious when taking a close look at the up-and-coming industry of gene-data leeches. For example, 23andMe, a competitor to Ancestry with only half as big a data repository, recently finalised a 300-million-dollar cooperation agreement with pharma company GlaxoSmithKline for use of the data. These companies’ business models are not genealogy, instead they are all about making big money from gene data, especially with the pharmaceutical industry as the customer.

This is not a win-win arrangement in which customers simply pay for a service they ordered; it is rather a rip-off. The racket brings to mind Google, Facebook and their handling of Internet data. The affected persons receive no information about the use of their data beyond meagre evaluation results, let alone a share of the profits as the supposed “owners” of the data. On the contrary: Ancestry even bars them from sharing the results of the analysis with third parties.6

What other kinds of greed Ancestry’s data may evoke came to light in the US in 2018: people who had had their DNA analysed found themselves and their families in the focus of the police, for example for being remotely related to the so-called “Golden State Killer”. To find this criminal, all relatives were subjected to investigations. At Ancestry, not a word could be found about potential criminal investigations of biological relatives.

Ancestry does not provide human-genetic counselling to German customers prior to the DNA analysis, as is required by the German Genetic Diagnostics Act (Gendiagnostikgesetz). The company also does not check whether a person is legally entitled to have the submitted saliva sample analysed. Consequently, a father might submit DNA samples from himself and his children, to effectively obtain a paternity test. Ancestry neither advises its customers that such actions are punishable according to German law, nor does it inform that biological relatives have a “right not to know”, or warn about the grave family disruptions and psychological consequences this may cause if, for example, a DNA test shows that a child was born out of wedlock, or if a supposedly anonymous sperm donor is suddenly exposed.

There is nothing wrong with genetic analysis itself: this can prove an important source of information for genealogy, and especially for medicine. But donors of samples should be aware of what they are doing. Providers such as Ancestry abuse the interest in genealogy to pile up a treasure trove of genome data for commercial research, because that is their actual business model. Data protection rights of donors and their relatives have to be respected. German obligations for disclosure and data protection are being wilfully ignored by Ancestry for increased profits. We observe a trend here: after the exploitation of Internet data, the exploitation of gene data will be the Next Very Big Thing. Ancestry is the top dog, and it has no scruples concerning data protection or basic human rights.

As a consequence, Ancestry receives the BigBrotherAward 2019. Congratulations.

Authorities & Administration (2019)

Peter Beuth

The Hessian Minister of the Interior, Peter Beuth, receives the BigBrotherAward 2019 in the “Authorities & Administration” Category for the acquisition and use of analysis software from Palantir, a company close to the CIA that has consequently gained access to the highly sensitive data networks of the Hessian Police. With this software, mass data from police and external sources can automatically be interrelated, analysed and evaluated.
Laudator:
Dr. Rolf Gössner, Internationale Liga für Menschenrechte (ILFM)

The BigBrotherAward 2019 in the “Authorities & Administration” Category goes to the Interior Minister of the Federal State of Hesse, Peter Beuth (CDU).

He receives the negative award for

1. acquiring analysis software from Palantir, a company close to the CIA, the first purchase of this kind in Germany,

2. the fact that via the deployment and operation of this software, this controversial US company is given access to Hessian Police data networks,

3. the use of a software with which mass data from police and external sources can automatically be interrelated, analysed and evaluated within seconds – with disastrous effects on basic rights, data protection, and on the rule of law.

Yes, it was just last year when we decorated the conservative–green government coalition in the State of Hesse with a BigBrotherAward, for its then plans to tighten up the laws on the state’s police and its domestic intelligence agency (Verfassungsschutz).1 Despite all protests, these laws were adopted in July 2018 and have been in force since then. This now enables Hessian Police to take up new surveillance measures far ahead of actual suspicion or of a possible threat – for example, state trojans can be installed or people can be shackled with electronic tags only because the Police assumes that they might commit crimes in the future.

And there is more: to carry out these new preventive tasks and deal with the flood of data that is generated in the process, the Police has even got Palantir involved, a controversial company that is close to the CIA. Therefore, for the first time in German BigBrotherAwards history, we have no choice but to give the second successive punitive award to a data sinner in the same governing coalition of the same Federal State.

The Hessian interior minister Peter Beuth is responsible for commissioning the US company Palantir to install and activate its “Gotham” analysis software in the Hessian Police IT system. This software is named after the fictitious city, riddled by criminality and corruption, where Batman hunts perpetrators and ensures that law and order are upheld. After the “Gotham” software was adapted to the needs of Hessian Police, it was named “Hesse Data” (Hessen-Data). Paragraph 25a of the tightened Hessian Police Law (Hessisches Sicherheits- und Ordnungsgesetz, HSOG, literally “Hessian Security and Order Act”) gives the Police permission to use this software, which is why it has mockingly been called the “Palantir Enabling Paragraph”.2 Via this paragraph, comprehensive analyses may be conducted to take preventive action against more than forty crimes listed in § 100a section 2 of the Criminal Procedure Code (on telecommunications surveillance) and to avert certain threats.

What, then, makes this software for relating and analysing data by the US company Palantir so problematic and damaging to fundamental rights?

“Palantir” is named after the “Seeing-stones” in “Lord of the Rings”, and according to German newspaper Süddeutsche Zeitung it is “one of the most controversial companies in Silicon Valley”. The US civil rights organisation ACLU calls it a “key company in the surveillance industry”.3 The US “star investor” and billionaire Peter Thiel, who also co-founded the online payment service Paypal, founded the company in 2004 with financial support from the US Central Intelligence Agency (CIA). The company’s customer list reads like a “who’s who” of the US military and security bureaucracy: CIA, FBI, NSA, Pentagon, Marines, and Air Force.4 To put it another way: as the in-house supplier of these institutions, the company is deeply entangled with the US military–digital complex, and its business model is: Big Data for Big Brother.5 Peter Thiel is also on the Facebook board, and he supported Donald Trump’s election campaign with over a million US dollars.6

So the Hessian Police commissioned this highly controversial surveillance company to analyse its police databases and interrelate them with social media data and other external documents. It cannot be ruled out that confidential police data from Hesse has flowed to the United States – particularly as up to six software developers employed by the company installed the software using their own laptops, operated it for the Hessian Police, and retain maintenance access. As a US company, Palantir is subject to the notorious “Foreign Intelligence Surveillance Act” (FISA). That means that all information on non-US citizens to which Palantir can gain access, in whatever way, must be shared with US intelligence services if warranted.7 And in the view of the opposition in the Hessian parliament, the FDP and Left parties, there are no reliable control mechanisms to prevent this.8

The hope with “Hesse Data”, the data interrelation and analysis software, is that threats can be recognised more easily and so-called terrorist “endangerers” can be identified and tracked down – in other words, people who have not committed a crime but whom the police believe to be capable of doing so on the basis of some evidence or behaviour. Modern police work has long moved beyond averting concrete threats towards police “reconnaissance” far ahead of presumed threats, as has been legalised in the latest tightening of the Hessian Police Law. Police are thus entering the domain of intelligence agencies, which is none of their business as a matter of principle. And if the new conservative–green coalition treaty of December 2018 is to be believed, the analysis software could even be used below the threshold of fighting Islamist terrorism and organised crime – and therefore to a much larger extent than initially intended.9 Meanwhile a mobile version of “Hesse Data” also exists, for example to track target persons or coordinate police observers.10

“Hesse Data” opens the floodgates for police IT operations: police data for criminal investigation used to be kept separate from data for threat defence, because data protection principles mandate that personal data must only be used for the purpose for which they were acquired – either for criminal investigation or for threat defence. This principle of purpose limitation is abolished with the use of “Hesse Data”.11 What is more, not only are various police databases linked and trawled through, but also metadata and content from telecommunications surveillance, as well as data from diverse information systems held by other institutions such as registration authorities and the central database on foreigners. But it doesn’t end there: another floodgate opened by “Hesse Data” is the use of information from social media such as Facebook, Twitter, WhatsApp, Instagram or YouTube, which for the first time can be automatically retrieved, linked and matched with police data in no time at all.

Through this fast-paced data linkage and analysis, Palantir’s software delivers complex movement and contact profiles, relationship networks, personal dossiers as well as anomalies or patterns of behaviour to the police, in exciting graphical forms.12 Who communicates or meets whom? Who is behaving in an unusual or suspicious way? You only have to be a contact or companion, witness, informant, or injured party to become the target of police attention, even if your connection to a supposed suspect is distant or coincidental.

The objective is no longer hard evidence but more or less arbitrary results from the analysis of these automatically amalgamated data pools. Imagine that your everyday activities, which leave innumerable digital traces, can suddenly turn you into a suspect because they are removed from their original context and placed in a completely different one. Maybe the Hessian Police will “track you down” just because you happened to be in the wrong place at the wrong time, live near a crime scene, or were simply confused with another person. This analysis tool might seem very powerful – but it is also quite vulnerable to manipulation and caprice.

Through the new surveillance powers given to Hessian Police, the fallout from such an analysis can have particularly grave consequences for the affected persons. If the data linkage and analysis should filter someone out as conspicuous, an alleged risk or a so-called endangerer, these people may have to expect secretive and preventive surveillance through state trojans, and they could be subject to restrictions regarding movement or contacts, compelled to report to police, be electronically tagged, or put under preventive or punitive arrest.

Exactly how the “Hesse Data” software conducts its analysis remains a trade secret of the Palantir company. Thus the algorithms behind any potential police “findings” are beyond public and democratic scrutiny.13

What is also remarkable is how this cooperation between the Hessian Police and Palantir was contrived.14 An inquiry in the Hessian parliament deliberated over several months last year on whether the commission to Palantir had been awarded in violation of the law and what the role of the Interior Minister had been. These questions have not been clearly answered to this day. In any case, the contract was awarded in a non-transparent way, and the requirements were specifically tailored to Palantir and their software, so that competitors had no equal chance to bid even though alternatives were available.

On top of that, when the public is left in the dark about the price paid for the Palantir software, there is clearly reason for suspicion. According to an official statement, the product’s value is “€ 0.01 excluding VAT”. The Hessian Interior Ministry admitted to German news site Spiegel Online15 that this was “not the actual price”, but does not want to disclose that price “due to public safety interests of the State of Hesse”. How could such information put public safety at risk – do they actually fear street riots or even attacks? Hessian Interior Minister Peter Beuth clearly prefers to let speculation run wild rather than to work transparently, which should be a matter of course in a democracy.

Conclusion: The use of the Palantir software “Hesse Data”, presumably at a cost of millions, takes data processing to a new level – in fact, the Hessian Police enthuses about “a quantum leap in police work”. To put it another, clearer way: with “Hesse Data”, the conservative–green government of Hesse is taking another large step towards a control and surveillance state.

The “Hesse Data” analysis platform is in continual conflict with the right to informational self-determination, which is an expression of the general right of personality (Article 2, Paragraph 1 of the German Constitution). The use of the Palantir software also demolishes an important pillar of data protection, namely the limitation of the use of personal data to the purpose for which it was acquired. And all of that largely without effective scrutiny and in an unholy alliance with one of the main actors of the US military–intelligence complex. The only possible response is for us to say:

Congratulations, Interior Minister Peter Beuth, for the BigBrotherAward 2019.

Laudator.in

Portraitaufnahme von Rolf Gössner.
Dr. Rolf Gössner, Internationale Liga für Menschenrechte (ILFM)
Sources

1 BigBrotherAward 2018 für die Fraktionen von CDU und Bündnis90/Die Grünen im hessischen Landtag (English translation)

2 Palantir-Ermächtigung, an allusion to the Enabling Act (Ermächtigungsgesetz) of 1933 that was a cornerstone of Hitler’s seizure of control over the German state. (Web-Archive-Link)

3 Oliver Voss, Glaskugel der Geheimdienste, in: Tagesspiegel 5 Jun 2018 (Web-Archive-Link)

4 Oliver Voss, Glaskugel der Geheimdienste, in: Tagesspiegel 5 Jun 2018 (Web-Archive-Link)

5 Palantir employees are also suspected to have maintained contacts with Cambridge Analytica, who are said to have used illegally obtained Facebook data to try to influence the 2016 US presidential election. German media reports about this include netzpolitik.org (Web-Archive-Link) and Zeit-online (Web-Archive-Link) on 6 Apr 2018

6 manager-magazin.de: Tech-Milliardär Thiel spendet Trump 1,25 Millionen Dollar (Web-Archive-Link)

7 Foreign Intelligence Surveillance Act (Web-Archive-Link)

8 Frankfurter Rundschau on 11 Jan 2019, D4 (Web-Archive-Link)

9 Koalitionsvertrag zwischen CDU Hessen und Bündnis90/Die Grünen Hessen für die 20. Legislaturperiode (PDF)

10 Oliver Teutsch, Hessische Polizisten ermitteln wie im Agenten-Thriller, in: Frankfurter Rundschau 5.4.2019, D2 f. (Web-Archive-Link)

11 Tobias Singelnstein, Big Data bei der Polizei: Hessen sucht mit US-Software nach Gefährdern, in: Grundrechte-Report 2019, Frankfurt/M. 2019

12 golem.de: Wo die Polizei alles sieht (Web-Archive-Link)

13 The CTO of the Hessian Centre for Data Processing (Hessische Zentrale für Datenverarbeitung, HZD), where the Palantir servers were set up under police watch and are now operated, declared in the parliamentary inquiry that he has no insight whatsoever into which data are processed, what the scope of information processing is, and who has access. That would be the sole responsibility of the Police and the Interior Ministry – Oliver Teutsch, Hessische Polizisten ermitteln wie im Agenten-Thriller, in: Frankfurter Rundschau 5.4.2019, D2 f. (Web-Archive-Link)

14 In May 2016 a Hessian delegation visited the US company in Silicon Valley, among the visiting party was the (CDU) Interior Minister and BigBrotherAwards winner, Peter Beuth. The original intention was to find a software to combat cyber crime. Returning from Silicon Valley, the group changed its objective to the fight against terror and the protection of the state. Palantir was exclusively considered as a potential software supplier – police-it.net: Palantir in Hessen – vereint Daten von Facebook & Co mit polizeilichen Datenbanken?? (Web-Archive-Link)

15 spiegel.de: Hessens Polizei kauft Software von umstrittener US-Firma (Web-Archive-Link)


Protect us, don’t Spy on us

In our new venue we continued the tradition of asking our audience which award they had found particularly “impressive, surprising, shocking, or outrageous”.
Das Publikum während der BBAs 2018.

With just over a third of the vote (a good deal ahead of the second place, which received about a fifth of the vote, and the rest of the field), the audience award went to the winner in the Politics category, the parliamentary groups of the Christian Democrats and the Greens in the state parliament of Hesse, for their plans for a new domestic intelligence law.

Here is a selection of comments that our electorate left on their ballot papers.

Workplace

The information of “the weak” is being handed to “the powerful” with a blank check.

Work is indispensable for many people in the current system; therefore, incursions into personal freedom in this area are absolutely wrong.

PR & Marketing

Personally, this award affects me the most.

This had the biggest “wow factor” for me – my jaw dropped, in a negative way!

My city intends to become a “smart city” just now; this is being “sold” as something fantastic by the Smart Factory – I wasn’t aware how small comforts can lead to such horrific consequences.

I want to feel free as I move through public spaces.

Smart City is so ubiquitous and invisible, and tempting for smart politicians, that creating awareness for this is a really smart move.

Technology

Utter powerlessness and subjugation of users, without remedy.

This award is affecting me already.

My company is about to introduce it, and now I am dreading that even more!

As this is the de facto standard in business, and because people do not know anything else and many programs run exclusively on it, people often become Windows users by coercion. This company is one of the prime Big Data monopolies in disregard of privacy. It must be broken up!

Administration

Restrictions of the rights of refugees are too often accepted or ignored. This award is important as it draws attention to this blind spot, which is caused by the racism that is entrenched in society.

We must not give up our humanity!

The weak position that refugees have is further weakened by cynical technology and exploited for political purposes.

To run “first trials” of surveillance technology on people who have had such a cruel deal in life should make us extremely suspicious of those who try to use such instruments!

People who seek refuge with us are not inconvenient and potentially dangerous creatures that have to be controlled, categorised and labelled. Human dignity is inviolable!1

Consumer Protection

Alexa would only be the beginning – look towards China.

There is still time to escape self-chosen surveillance and control. Action against Alexa!

Violations of the right to self-determination happen – mostly unnoticed – in people’s own homes.

It seems so harmless, like a toy.

Politics

Fundamental rights, which the state particularly should protect, are the basis for everything else!

To be able to justify total surveillance practically by saying anything at all, as the vague wording of this law allows, opens floodgates to the enslavement of the people by the state and surely also by business.

Politics should protect us citizens and not spy on us.

State Trojans – a full-scale intrusion on all fundamental rights.

I think it is dire how politics is not protecting us citizens, and criminalising us instead!

The ignorance with which politicians approach this draft law is absolutely shattering. Against all expert advice, the parliamentary groups for the CDU and Green parties are trying to whip through yet another surveillance law.

The diagnosis that the Greens are saying farewell to their status as a party of civil rights in the Federal States cannot be disputed. (Hesse, Bremen, Baden-Württemberg)

How sad that the Greens are so readily sacrificing the ideals of the freedom and civil rights movements for a little political power.

Others

A great evening, thank you very much! ♡

Sources

1 Translator’s note: quoted from Article 1 of the German constitution

PR & Marketing (2018)

Idea of a “Smart City”

The BigBrotherAward 2018 in the “PR and Marketing” Category goes to the idea of a “Smart City”. The advertising concept of a “Smart City” is used by tech companies to try to sell the idea of a “safe city” to municipal administrations: A city, covered in sensors, under total surveillance, remote-controlled and commercialised. “Smart Cities” reduce their citizens to mere consumers, change consumers into data sources and our democracy into a privatised service.
Laudator:
Rena Tangens am Redner.innenpult der BigBrotherAwards 2021.
Rena Tangens, Digitalcourage

The BigBrotherAward 2018 in the “PR and Marketing” Category goes to the idea of a “Smart City”!

The “Smart City” concept espouses the “Safe City”: a city covered in sensors, under total surveillance, remote-controlled and commercialised. “Smart cities” reduce their citizens to mere consumers, change consumers into data sources and our democracy into a privatised service.

A “Smart City” is the perfect combination of the totalitarian police state in George Orwell’s “1984” and the standardised, only seemingly free consumers in Aldous Huxley’s “Brave New World”.

The term “Smart City” is a shining, colourful bag of tricks – it promises to everyone whatever they want to hear: innovation and modern city marketing, efficient government and citizen participation, sustainability and climate protection, security and comfort, perfectly phased traffic lights for cars and always a free parking spot. It all started with IBM in 2008 with their slogan of a “smart planet”, implying that they would be able to make our planet more “intelligent”. Many more companies are now in the business of trying to sell their services to cities, for example Siemens, Microsoft, Cisco, Huawei, Hitachi and Osram.

But what does a “Smart City” really look like?

One example of the great achievements of a “Smart City” is a new type of street light. It doesn’t only light up the dark, it also includes a CCTV camera, pedestrian recognition, number plate scanners, environmental sensors, a microphone with gunshot-detector, and a locator beacon for recording the position. When we imagine all this, combined with WiFi for determining the positions of smartphones, face recognition and motion analysis, it becomes clear that in a city with this kind of technology it will become impossible to walk a single step without being watched.

With today’s technology (…) we can create absolutely safe cities. The new face recognition technology enables governments and private enterprises to recognise and archive all faces, where previously this was limited to registered criminals”, the Turkish surveillance technology provider Ekin gushed in a press release about the “Safe City”. The face recognition system assigns an ID to every facial feature that can be used to identify that person later, even if the name is unknown, and also analyses age, gender and ethnicity.

While the advertising for the “Smart City” in Germany still focuses on sustainability, protection of the environment, efficiency and comfort, technology companies in China, Dubai and Turkey openly discuss what it’s really about: unbroken surveillance and control of the populace.

The combination of CCTV surveillance and artificial intelligence is burgeoning in China. The leading Chinese company for face recognition software, SenseTime, is excited about “the high demand fueled by Smart Cities and surveillance”1.

In the special economic zone of Shenzhen in Southern China, close to Hong Kong, jaywalkers will be identified immediately and pilloried on big monitors also displaying their personal data. A fine is calculated and the employer will be notified. Points will also be deducted from their “social score”, which decides about getting an apartment, a new job or a place at university.

The whole province of Xinjiang in Northwest China has become a real-time laboratory for mass surveillance. The DNA and blood type of everyone between the ages of 12 and 65 are being recorded, and iris scans, fingerprints and 3D images are created, all as part of a so-called “free health check”2. In addition, the Chinese government installed a surveillance system in 2017 that automatically informs the police when a suspect moves more than 300 m away from their apartment or workplace3. Not just criminals are suspects, but also members of the Muslim minority and people campaigning for human rights.

So you think China is far away?

Well, at the railway station “Südkreuz” in Berlin, the Federal Police have been testing intelligent CCTV surveillance with face recognition since August 2017. This is just the beginning. And no matter what the results of this “test” will be: ex-Minister of the Interior Thomas de Maizière announced at the start of this field trial that face recognition is to be introduced nationwide in as many public places as possible. The new Minister of the Interior has confirmed that he shares this view. And what is more: the new Federal Government has already noted in its coalition treaty that “intelligent” CCTV surveillance is to be developed further. Tasteful streetlights with surveillance cameras and sensors can already be admired in “Arcadia”, a gated community in the city of Potsdam, neighbouring Berlin.

Or let us look at our close neighbours, the Netherlands, where “Smart Cities” are springing up like tulips: the city of Enschede wants to know who moves around where, and how often, and to this end tracks everyone carrying a smartphone with active WiFi via its unique MAC address. The Enschede traffic app rewards people for good behaviour – walking, cycling, using public transport – and, ironically, the reward is free parking in the city for a day. Only in the fine print will you find that the collected personal movement data are sent to a company called Mobidot.

In Eindhoven the party district Stratumseind has been turned into a surveillance laboratory. There are street lights with WiFi trackers, cameras and microphones intended to detect aggressive behaviour. Starting in spring 2018, orange scent will be sprayed as necessary to soothe people.

Utrecht, finally, spies on the city youth as they move along the streets: how many are there? How old are they? Do they know each other? How do they treat each other? Are they going to cause trouble or not? Since 2014 Utrecht has launched 80 “smart” projects in the city – and has lost track of what is happening where, because most of these projects are in the hands of private enterprises4.

“Smart City” companies collect data and refuse to say what they do with it. Often they will not even grant the cities access to the data – because they are trade secrets! One cannot help but get the impression that the companies are trying to pull a fast one on the cities. But neither the citizens nor the press can really check, because the contracts between the cities and the “Smart City” service providers are not publicly available – for competitive reasons.

Yes, “smart” technology is expensive. Where can one get the money from? Cities are attracted by cheap entry offers and national and EU development funds. Once again cities are enticed into surrendering their infrastructure into private hands, just as they did with cross-border leasing in the 1990s5. That is neither clever nor smart, but rather short-sighted and dangerous.

And the threat is larger than just selling off municipal infrastructure. Cities are carelessly selling things that they don't own: the citizens' data – and their privacy, their autonomy and their liberty along with it.

Citizens are not being asked. After all, the tech companies just want to play – who could hold that against them? When innovative tech projects are at stake, everything else has to take a back seat: “digital first, concerns second”. The American equivalent is called “permissionless innovation”6. That means the precautionary principle has been suspended – whoever claims to be innovative is not bound by petty rules.

One thing is clear to the companies: the real cash cow is not the service but the citizens’ data. Who would understand that better than Alphabet, Google’s parent company? They bought into Toronto, Canada, to develop the “Waterfront” quarter into a “Smart City”. The name of the project is “Sidewalk Labs”. But Google may not have expected quite how much criticism and how many specific data protection inquiries would come forth from the Canadian public7. Meanwhile, Sidewalk Labs has employed Ann Cavoukian, the former Information and Privacy Commissioner of Ontario. That’s a smart move. In 2009 Ann Cavoukian developed the concept of “Privacy by Design”. “Smart Cities”, however, rather incorporate “Surveillance by Design”. We are genuinely curious how she is going to reconcile the one with the other without completely turning Google’s business model inside out.

But let’s not be too negative. We really like technology. Let us just assume that hack-proofing the networked systems would not be a problem. Assume that the state would intend to use the total surveillance to our sole benefit. That the tech companies would only ever do good with our data. And let’s picture this friendly “Smart City”, whose sensors follow us at all times, which tell us what to do next, and whose algorithms use our profiles to calculate our desires in real time, before we ourselves are even aware of them. Perfectly phased green traffic lights, always a parking spot, and always the current local nitrogen oxide measurements on my smartphone – doesn’t that sound like a dream come true?

In the land of milk and honey, roasted geese are flying into people’s mouths, ready to be eaten. But the land of milk and honey is not paradise. It satiates, but it does not make one happy. Comfort will make us lazy and stupid. We need those moments of almost tripping over, to train our sense of balance. We need the effort, to be able to appreciate the things we achieved. We need chance, something different, the unknown, surprise, a challenge, to learn and to develop. We as humans need to be able to decide freely, and we need the opportunity to make mistakes. What other way is there to train our “moral muscle”?

This is another reason why we have to stand up against the dictate of technology and techno-paternalism.

A city cannot be “smart”. The people living in it are smart. We have a choice: Do we want to live in a post-democratic world of consumption, where others take the decisions for us, and in which the only possible answer is “Ok”8? Or do we choose Freedom?

Albus Dumbledore says in Harry Potter, Vol. 4:
“[T]here will be a time when we must choose between what is easy and what is right.”

That time is now.

Congratulations for the BigBrotherAward, “Smart City”!

Laudator.in

Rena Tangens am Redner.innenpult der BigBrotherAwards 2021.
Rena Tangens, Digitalcourage
Politics (2018)

Parliamentary groups of the Christian Democrats (CDU) and the Greens in Hesse

The parliamentary groups for the Christian-Democratic and the Green parties in the state parliament of Hesse receive the BigBrotherAward 2018 in the Politics category for their plans for a new domestic intelligence law. The draft by the “black-green coalition” contains an accumulation of grave surveillance powers that facilitate severe interferences with fundamental rights: so-called state trojans are to be used to secretly infect and investigate “suspect” computers, employees of projects to further democratic society are to be screened by intelligence agencies and, in a legal novelty, criminal undercover agents are to be exempted from prosecution. All in all, this is a severe attack on democracy, on the rule of law, and on fundamental rights.
Laudator:
Portraitaufnahme von Rolf Gössner.
Dr. Rolf Gössner, Internationale Liga für Menschenrechte (ILFM)

The BigBrotherAward 2018 in the “Politics” Category goes to the parliamentary groups of the Christian Democrats (CDU) and the Greens in the parliament of the Federal State of Hesse.

The two governing parliamentary parties receive this negative award for their plans for a new law on the state’s domestic intelligence agency and for their planned reform of the Hessian police law. Their legislative proposals contain a dangerous collection of grave surveillance powers. These will profoundly affect fundamental rights and threaten the rule of law. The worst measures in brief:

1. The domestic intelligence agency (“Verfassungsschutz” – literally translated, “Protection of the Constitution”) is to be allowed to recruit undercover agents even when these have previously been convicted, and to continue using agents after they are found to be criminals. Experience shows that the agency has been doing exactly that even today – what is new is that this practice is to be given a legal basis, and that it will be possible and perfectly legal for agents who committed crimes to be exempted from criminal investigation. A free pass for criminal activities on a state mission, in violation of the rule of law.

2. The agency will also be able to recruit holders of professional secrets such as doctors, lawyers or journalists as undercover agents, or to place such agents among these persons’ professional contacts. This violates the obligation to confidentiality and the vulnerable relationships of trust to clients, patients, or sources.

3. Even data on minors under the age of 14 – in other words, children – is to be legally collected and stored by the domestic intelligence agency. This kind of stigmatisation by secret services at such an early age can have fatal consequences in further life for the people affected – for career choices, in the search for further education or jobs.

4. The domestic intelligence agency is to be empowered to share personal surveillance findings with public bodies – in order to “vet persons who apply for civil service positions regarding their loyalty to the constitution”. That is a dark reminder of the earlier practice of vetting political convictions and imposing occupational bans, which contravened human rights. Also threatened by regular and suspicionless secret service inspection are organisations and potential employees in state-sponsored democracy and prevention projects, for example against right-wing extremism or Salafism. This portrays them as security risks across the board and places them under general suspicion.

5. Spy software, or State Trojans, are to be injected into suspects’ computers and smartphones using discovered or purchased security vulnerabilities. This will be used to investigate these suspects in “preventative” remote evidence collection or telecommunication surveillance.

6. And police forces are to be empowered to use electronic tags on so-called “endangerers” as a preventative measure, in order to monitor their movements and contacts over several weeks or months. These are people who may not have committed any crimes; it suffices that the police have certain clues that they might commit crimes in the future.

Towards the preventative-authoritarian security state

With this legislative proposal, the black-green coalition in Hesse is making great strides towards a preventative-authoritarian security state. With its highly precarious provisions, this is one of many new laws across Germany which legalise, among other measures, the use of state trojans for on-device communication interception and remote evidence gathering, as well as electronic tagging for “endangerers”.

The original motivation for reforming the domestic intelligence laws at state and federal level was to draw the overdue consequences from the numerous defects, failings and scandals relating to the series of murders committed by the neo-Nazi terrorist group NSU (“National Socialist Underground”) and from the revelations of mass surveillance by the NSA. As prime objectives, the new law would therefore have had to place effective legal restraints on the domestic intelligence agency and its powers, and significantly strengthen its supervisory bodies. But instead, these secret authorities, which are almost beyond democratic control, are given an undeserved boost despite their history; they are being upgraded and made ready for mass surveillance, and no effective protections are implemented against these services’ scandals and schemes. As a result, the domestic intelligence agency is emerging with new strength from its disasters and record of scandals. And the police’s arsenal is being expanded as well.

What does this mean for those immediately affected, and for us all? Two examples:

1 – Clandestine attacks on computers and smartphones with State Trojans

Under certain conditions and to avert threats, the Hessian domestic intelligence agency is to be allowed to carry out technological attacks on “IT systems”. In plain English: this domestic secret service may hack computer systems with spy software for the sake of clandestine information gathering. The software used is one of the notorious “State Trojans”, or as it is called in Hesse, the “Hessian Trojan”1. The video that you just saw gives a compelling insight into how this works and how critical infrastructures such as hospitals, transport systems or water supplies can be affected. A daunting example is the ransomware Trojan “WannaCry”, which in May 2017 paralysed not only private PCs but also car manufacturers, railway operators and hospitals, causing billions in damages. The weakness it exploited had been known to the US foreign intelligence agency NSA for years.

You have to remember that the secretive use of State Trojans can never be effectively supervised, neither by parliamentary committees nor by courts, although it is among the gravest state interferences with basic rights and threatens the “core area” of a person’s private life. It also undermines the confidentiality and integrity of IT systems, opening the door wide to abuse and serious cyber attacks.

2 – Electronic tags for tracking “endangerers’” locations

The Hessian police are to be allowed to use electronic tags on so-called “terrorist endangerers” as a preventative measure, as the Federal Criminal Police Office has been able to do since 2017, and to impose registration requirements, movement restrictions, house arrest, and bans on certain social contacts. Subject to a judicial order, compliance with these restrictions is to be continuously monitored via GPS, even inside homes. This is to be allowed, in the words of the black-green draft law, “if certain facts justify the expectation” that the individual, “within an assessable period of time and in a way that can be substantiated at least in terms of its kind”, will commit a crime – “or if their individual behaviour constitutes a concrete potential that within an assessable period of time” they will commit a crime. You could hardly be more vague, could you?

This electronic surveillance measure, which is meant to prevent terrorist offences among others, is to be limited to three months at most. But this limit can be extended by three months at a time – in effect, indefinitely. If the people affected refuse the surveillance, they can be placed in police custody for up to ten days, subject to a judicial order.

Such deeply impactful and behaviour-controlling police actions, which also yield movement profiles and allow inferences about a person’s private life, are to be used against so-called endangerers. In other words, they may target people who so far have not committed any criminal offence, where the police merely suspect that they may do so in the future, based on nothing more than circumstantial evidence, assumptions or supposed intentions or convictions. Such projections of future behaviour can result from personality or contact profiles compiled by police or secret services, or from algorithmic risk assessments (such as a pre-crime program named “radar ITE”). But how can it be ensured that in such grave decisions the balance is not tipped simply by institutional racism and Islamophobia?

Such grave infringements on fundamental rights based on vague conjecture are disproportionate, they violate the privacy and personality rights of the people affected and consequently their human dignity. Furthermore, electronic tags can often be quite easily removed or manipulated, so ultimately they are probably unsuitable to prevent terrorist crimes – in particular if the potential perpetrators are determined and ready for anything. One of the two culprits that cut a Catholic priest’s throat in Normandy in 2016 was wearing an electronic tag, and the attack on a Berlin Christmas market in late December 2016 would hardly have been prevented in this way either – while it turned out that other police measures that would have been effective were not used.

Civil society protests and Green inner-party conflict on the “Hessian Trojan”

Strong protest and resistance has stirred against the Hessian draft law. A broad alliance of democracy projects, human rights groups and privacy organisations is supporting a joint declaration that rejects the planned tightening of the law, saying that it damages democracy and fundamental rights2. In a hearing in the state parliament, a large majority of the expert opinions strongly criticised the plans and called for significant changes3.

The Green party base has also voted against the black-green plans, especially against the legalisation of the “Hessian Trojan”, refusing to support the position of the Green parliamentary group4. And rightly so: the Green Party generally rejects State Trojans, and the Greens in Hesse had promised in their election campaign that no remote evidence gathering would be allowed as a preventative measure5. But the parliamentary group is stubbornly holding on, justifying the breach of its election promise by pointing to “terrorist threats” that supposedly require wider monitoring of digital communications. This lousy game with the public’s fear of terrorism, played in order to restrict liberties for supposed gains in security, had up to now been avoided by the Greens and left to other players such as the Christian Democrats or the “grand coalitions” of Christian and Social Democrats. But now the Green parliamentary group in Hesse is itself taking part in the surveillance poker, joining the race to outdo the competition, and even brazenly claiming that a “Green handwriting” was discernible in the draft law.

In the light of such laws on police forces and secret services in the federal states – as planned in Hesse under a black-green government, or in part already enacted in Baden-Württemberg with its green-black government – the Greens can begin to say farewell to their self-conception as a party of civil rights.

Commentator Heribert Prantl of the Süddeutsche Zeitung has written about a “digital inquisition” with State Trojans and voiced his surprise that most people seem to go along with this. This laudation will therefore end with an appeal for strong mass support for constitutional complaints like the one lodged by Digitalcourage against the State Trojan. This is an act of civil self-defence. And there is still time before the Hessian draft law is passed by parliament. The widespread voices of disapproval should finally be heeded.

Congratulations to the parliamentary groups of the CDU and the Green Party in the parliament of Hesse on the BigBrotherAward 2018.

Laudator.in

Portraitaufnahme von Rolf Gössner.
Dr. Rolf Gössner, Internationale Liga für Menschenrechte (ILFM)
Sources

1 These state trojans are developed for German security authorities by the “Central Facility for Information Technology in the field of Security” (Zentralstelle für Informationstechnik im Sicherheitsbereich, ZITiS), which was set up in Munich in 2017.

2 Joint declaration (Humanistische Union)

3 Expert opinion on the draft law on the Hessian domestic intelligence agency (Web-Archive-Link) and a report (Web-Archive-Link)

4 netzpolitik.org: „Streit um geplantes Hessentrojaner-Gesetz bei den Grünen“ (Web-Archive-Link) and gruene-hessen.de: „Digitale Gefahrenabwehr statt digitaler Gefahrenquellen“ (Web-Archive-Link)

5 Digitales Hessen – Netzpolitik ist Zukunftspolitik (PDF)

Administration (2018)

Cevisio Software und Systeme GmbH

The BigBrotherAward 2018 in the “Administration” Category goes to Cevisio Software und Systeme GmbH & Co. KG in Torgau, Germany, for their software Cevisio Quartiermanagement (QMM), which is used in refugee shelters. This software can be used to register and store movements to and within the shelter, the handing out of meals, medical checks such as X-rays, blood and stool tests, family relationships, religious affiliation, ethnicity and many other details. The collected data enable total control of the refugees.
Laudator:
Dr. Thilo Weichert am Redner.innenpult der BigBrotherAwards 2021.
Dr. Thilo Weichert, DVD, Netzwerk Datenschutzexpertise
Das Cevisio-Logo. Darunter der Text: „Totalkontrolle optimal organisiert“.

The BigBrotherAward 2018 in the “Public Administration” Category goes to Cevisio Software und Systeme GmbH in Torgau, Germany for their software Cevisio Quartiermanagement (accommodation management, or QMM for short), which was developed in co-operation with the German Red Cross especially for refugee shelters. This software registers and stores movements to and within the shelter, the handing out of meals, medical checks such as X-rays, blood and stool tests, family relationships, religious affiliation, ethnicity and many other details. The collected data enable total control of the refugees, and they show in how many ways privacy can be violated.

This software deserves this prize not just for the privacy violations it enables, but also for its underlying conception of the people affected. Refugees are human beings, not things. They are not put on a shelf for later retrieval and use. They are not prisoners and do not need to be closely monitored. They have come here for protection, and they have rights – human rights and basic rights that Cevisio does not even mention.

In 2015 when many refugees came to Germany, administrations were in chaos. Collecting data and organising accommodation and supply posed great challenges for all involved. The medium-sized company Cevisio worked out the perfect solution in collaboration with the Saxonian branch of the German Red Cross. The company boasts on its website that this software is used in more than 280 refugee accommodation centres, “managing more than 380,000 refugees.”

It follows that the Cevisio QMM software holds data on all these people. Registration is based on an ID card with an RFID chip or barcode. Inhabitants carry this card with them in the shelter and – if the software makers get their way – hold it briefly in front of card readers positioned near the entry and exit, at the meal serving counter, at the laundry service, when receiving pocket money, when borrowing books or when doing voluntary work.

The so-called “actions” that are recorded in the shelters are then merged with data from the Federal Office for Migration and Refugees (BAMF) and immigration authorities. Recorded data include pregnancies, family relationships, and medical data (examinations and test results). The software also makes sure that all official documents are captured. In addition to “managing” the refugees, the software also caters for the (quote) “billing of refugees”. It can capture “all data concerning the asylum procedure, including EASY optimisation and BAMF data”.

This is total control. Daily routines, habits, contacts, kinship, general health, asylum status – everything in one place. Linked and actionable.

Some of it may make sense, e. g. dietary requirements due to allergies or religious beliefs. However, the Cevisio software goes far beyond that: A technical brochure mentions “recording of all meals served to a person” and “notification on handing out one meal multiple times to the same person”. What is this needed for?

Is it necessary to minutely record and store every movement into or out of the building? The brochure mentioned before says yes (quote): “The integrated presence overview shows accurately to the second which refugees, helpers or staff members are on site. Besides the control functionality, this overview is indispensable in case of catastrophic events (fire, etc.).”

“Indispensable!” Is it not strange that thousands of schools, shopping centres, or youth hostels can still cope without such an overview that is accurate to the second? Are they all irresponsible?

No. This is life. Which comes with a certain life risk. Cevisio’s data collection on the other hand is the wet dream of surveillance fanatics. We can see no sign of empathy for people who presumably fled to Germany hoping to lead a life in freedom.

So perhaps it is pragmatism of the “who cares?” variety that the expression “data protection” does not occur a single time in the 15-page system description. Technical data security measures are hidden behind the expression “system administration”. I was unable to find any functionality regarding the rights of the individuals affected – things like information disclosure or transparency towards refugees.

Shortcomings have been noted in practice, too: The data protection officer for Bremen states “substantial concerns with respect to privacy rights” in her annual report1. Storage periods were far too long. She saw no justification for checking each handing out of food. Storage of medical data had to be reduced massively on her demand. Regarding kinship details, those affected were not offered any choice. Many questions remain open to this day.

The Bremen data protection investigation covered just a few of the facilities. There is no guarantee, nor a way to check, that illegal surveillance features will be removed in the remaining 270-plus facilities as well. The legal situation is the same in all cases, and compliance could be built into the software as defaults, for example as retention periods. Cevisio could also support facility operators with advice on how to maintain privacy.

We ask: Is this software designed as it is because those affected are refugees? Granted, refugee shelters are logistically complex systems, and their operators (the German Red Cross and others) could benefit from digital support. But how are refugees supposed to integrate in our society if they are never given a chance to enjoy the values of what some like to call Leitkultur (or guiding culture), i. e., the values of our constitution? These values include the right to self-determination, in particular, the right to informational self-determination.

The Cevisio software QMM is just one example of the patronising, opaque, and control-obsessed way refugees are treated in general. Job centres release absolutely everyone from professional confidentiality obligations, including the social welfare offices and migration advisory centres. In 2016, the so-called “Datenaustauschverbesserungsgesetz” (data exchange improvement law) stipulated that practically any agency can inform any other agency about refugees if it seems necessary. To find out the origin of refugees, the BAMF has gained permission to access refugees’ smartphones, which store their entire communication history and many private details.

At the same time, independent counselling organisations for refugees report that some officials deny them information that would be helpful for advice and support, citing data protection reasons. This is data protection being abused as a smoke screen in order to impede social work.

We have to be extra careful about how we treat refugees. The Nazis as well as the East German regime have controlled and maltreated their population based on information and data collection. The governments of the countries that refugees reach us from often torment their population via control, lawlessness, and by using anything they know about them. There is a great risk that Cevisio-style data management will re-traumatise people. Likewise, there is a high probability that our data collections will get into the wrong hands – imagine a secret service in someone’s country of origin. Software companies, too, carry a responsibility to ward off such threats. We should realise that what is done to refugees today could be applied to ourselves tomorrow.

Congratulations on winning the BigBrotherAward 2018 in the “Administration” category, Cevisio.

Laudator.in

Dr. Thilo Weichert am Redner.innenpult der BigBrotherAwards 2021.
Dr. Thilo Weichert, DVD, Netzwerk Datenschutzexpertise
Technology (2018)

Microsoft Germany

Microsoft Germany, represented by Sabine Bendiek, Chairwoman of the Management Board, receives the BigBrotherAward 2018 in the Technology category for implanting telemetry (i.e. the transmission of diagnostic data) in Windows 10 that is almost impossible to deactivate. Even skilled users will hardly be able to stop this data from being transmitted.
Laudator:
Frank Rosengart am Redner.innenpult der BigBrotherAwards 2021.
Frank Rosengart, Chaos Computer Club (CCC)

The BigBrotherAward 2018 in the “Technology” Category goes to Microsoft Germany, represented by the Chairwoman of the Management Board, Sabine Bendiek, for implanting telemetry (i.e. the transmission of diagnostic data) in Windows 10 that is almost impossible to deactivate. Even skilled users will hardly be able to stop this data from being transmitted.

Microsoft is following a current trend with the introduction of Office 365 and Windows 10: A lot of data is now stored in the Cloud, software is made available as a subscription rather than a one-time purchase, and the Microsoft company is very curious to find out what exactly its users are doing. It starts with the licence activation, which requires an online connection. If I do not want to use the Internet, for which there are good reasons, then Windows 10 makes that practically impossible.

The fact that my Windows 10 system wants to transmit information about the size of available RAM to Microsoft once every day may seem harmless at first sight. Regrettably, software and devices that “phone home” to send usage statistics have almost become the norm. But most people will find it less trivial to learn that a list of all the software installed on their computer is being shared. Why should it concern Microsoft whether I use my computer as a typewriter, a toy, a television set or for image editing? And what does the company do with this information? We do not know.

According to Microsoft, Windows 10 also transmits some information that seems downright banal. How often is the key combination Alt+Tab used to switch between running programs? “I don’t care if Microsoft knows about these things”, some users might say. “That is none of their business!”, say others.

Those in the second group will clearly want to stop these data transmissions. Surely there must be a switch for that somewhere?! If you check Settings → Privacy, you will be overwhelmed with switches and option lists. Dozens of things are there to activate and deactivate, and most of us cannot know what consequences one decision or another may have.

At least when the EU’s General Data Protection Regulation comes into force in late May 2018, the default setting for all these switches should be “no transmission”. User action should be required to activate them – privacy and data protection are the founding principles of the GDPR. We all – you all – should keep an eye on whether Microsoft will stick to that.

Windows 10 does actually have settings on data protection. These are “only” five clicks away from the normal working screen, so this is not a place that users will just stumble across by coincidence. And even if we do get there, the only choice we are given is between “basic” and “full” transmission. “Send nothing, please, nothing at all” is not available as an option.

In any case, all these complicated settings are only about the data that the Windows 10 operating system will collect about my computer. The data transmissions by the browser, from the app tiles, or by the anti-virus “Defender” cannot be switched off anywhere at all. And then there is voice recognition, the search in the start menu, and on and on …

The Bavarian Privacy Commissioner has documented in a “Windows 10 Investigation Report” how hard, bordering on impossible, it is to silence Windows 10. Even if all telemetry settings have been changed by making more than 50 changes in the so-called registry (a database of settings that is expressly meant for expert users because imprudent changes could render the computer unusable), a Windows 10 computer will continue to send all kinds of requests to Internet services for tiles, updates or recommendations. These services will log the users’ IP address at least, even though the user has never wittingly visited a web page. Incidentally, registry changes can only be made in the “Enterprise” variant of Windows 10, which is directed at business customers.

We are not going to bore you with an assessment of the (il)legality of every single data transmission. From the user’s perspective it is simply a filthy trick that transmissions are practically unstoppable – in particular as many people have no viable alternative to using Windows due to reasons of compatibility.

Microsoft did already receive a “Lifetime” BigBrotherAward in 20021. Back then, Microsoft’s privacy commissioner Sascha Henke even visited the gala to pick up the award in person, saying that the company would take our critical position seriously. With the introduction of Office 365, Microsoft has handed over many applications, and with them your own data, ladies and gentlemen, to the Cloud. That alone would have been worth a new award. Even as early as 2011 – two years before Snowden – Caspar Bowden, then chief privacy advisor at Microsoft, warned about intelligence agency access to Cloud data. He poignantly pointed out that Microsoft was sharing their customers’ contents with the NSA, CIA & co., because US services are given access to all Cloud data by the 2008 FISA act (Foreign Intelligence Surveillance Act) – and non-US citizens have no legal recourse. Microsoft fired Caspar Bowden for his urgent warnings in 2011. They had better have listened to him!

With Windows 10 incessantly “phoning home”, Microsoft’s products have now turned into an intolerable problem!

Congratulations, Microsoft, for now having earned your second BigBrotherAward.

Laudator.in

Frank Rosengart am Redner.innenpult der BigBrotherAwards 2021.
Frank Rosengart, Chaos Computer Club (CCC)
Workplace (2018)

Soma Analytics

The company “Soma Analytics” receives the BigBrotherAward 2018 in the “Workplace” category for their efforts to propagate their health app “Kelaa” and the “Kelaa Dashboard” among the human resources departments of companies. The app monitors several parameters (such as excitement in the voice during phone calls) to spy on the users’ vital functions, in order to give employers indications of their employees’ mental state.
Laudator:
Prof. Dr. Peter Wedde am Redner.innenpult der BigBrotherAwards 2021.
Prof. Dr. Peter Wedde, Frankfurt University of Applied Sciences
Oben links der Text: „Introducing Kelaa. Für eine bessere Ausbeutung durch den Arbeitgeber“. Rechts daneben eine Grafik eines Tablets. Unten Links das Logo von Kelaa (K).

The BigBrotherAward 2018 in the “Workplace” category goes to Soma Analytics UG from Bruckmühl near Munich, represented by their CEO Johann Huber, for their efforts to propagate their health app “Kelaa” among employees, and place the associated “Kelaa Dashboard” at the human resources departments of companies. Putting employees’ health data in the hands of the employers breaks a taboo. But let us start at the beginning.

Just like many other health apps, the Kelaa app monitors data on its user’s vital functions.

We have been warning about health apps for years. But Soma Analytics takes the problem to a new dimension. Anyone can install the Kelaa app on their smartphone. However, it will only work if the user’s employer uses the “Kelaa Dashboard” software. With this software, employers can display their employees’ health data in (quote) “aggregated and anonymised form”. Soma Analytics provides the required data analyses.

But the only purpose of this ménage à trois is to improve health, “of course”: for instance, employees with the Kelaa app on their smartphone will receive information about relaxation techniques when signs of stress are detected by the app. And employers get to know how stressed their employees are. Whether they will use that information to improve working conditions, or simply to lay off employees with “weak nerves”, we can only speculate.

How it works

The smartphone, pivotal work tool and constant companion, collects these sensitive personal data. Many people use this device from waking up in the morning until they fall asleep in the evening. Whoever can read out and evaluate this health data and “stress information” very likely knows more about the owner’s mental and physical state than the owner themselves.

But Soma Analytics is not content with that: the company even tries to collect sensitive data at other times by inviting the users to take the device to bed with them in order to capture movements during sleep. This enables the company to gain insights not only into sleeping habits but incidentally also into cohabitation habits. Whether or not the employees’ partners would appreciate this or consent to it remains to be seen.

Additionally, emotions detectable in the voice during telephone calls are evaluated. The whole thing is completed by the answers to a “self-assessment questionnaire” given to the employee.

Published material about the Kelaa app indicates that the Soma software also evaluates writing and typing behaviour and general smartphone use. Data about how often the user picks up the phone is collected, as well as the time spent looking at the screen.

It is not known whether other data sources are utilised as well, such as a step counter in a health app, or if conversations besides phone calls are also intercepted and evaluated. On the basis of the legal notices on the website, such data collection activities cannot be entirely ruled out. It would rather be a surprise to us if Soma were not making use of this information source.

The effects on employees …

Soma Analytics promises a lot to employees in return for their data. Kelaa is touted as a “magic health bullet”: “When you use the Kelaa app, you will work as efficiently as never before and you will also make your employer happy with an increase in productivity!” The only thing apparently missing is a promise of increased hair growth.

There is no mention by Soma Analytics of possible risks and side effects for the employees, due to them feeling – and rightly so – that their employer is constantly watching them, even at home and while asleep. As the saying goes, being off work is such an outdated concept.

… and on employers

Soma also makes a lot of promises to employers: Using the Kelaa Dashboard supposedly strengthens their decision-making process. It is suggested that employers use Soma’s data sets to identify issues that need improvement, in order to increase the employees’ productivity along with their health.

We say: Soma Analytics is trying to give employers all-encompassing control over the physical and mental condition of their employees. Employers are enthusiastic about this, and it gives them new ideas, as shown by initial reports from the United Kingdom, where Kelaa has been deployed at a large law firm with more than a thousand employees. The person responsible for the utilisation of the Kelaa app found that “the app can show us potentials that we can realise if we and our employees work together on how they sleep.” To make such a thing possible is unequivocally worthy of an award.

According to their own description, Soma Analytics uses Big Data and sophisticated algorithms.

Soma Analytics also mentions that they are supported by leading researchers from the fields of psychology, clinical sleep research and computer science. The names of these renowned researchers are, however, discreetly withheld. And, of course, the design of the algorithms used is not described, either.

Trade Secrets

Maybe these are trade secrets, just like the methods used for the aggregation and anonymisation of the data collected from employees’ smartphones for display in the Dashboard. Nothing can be found on the website about how the anonymisation works or how secure that process is. What makes us suspicious, though, is that at the same time, Soma Analytics offers to identify the most stressful departments in your company. We can infer from this that the collected information can be linked to smaller units, departments or groups of persons within a company. From there it may only be a small step to identifying an individual employee.

A clear legal basis in data protection law for these kinds of evaluations cannot be identified. Soma Analytics uses legal notices on its website to grant itself far-reaching processing permissions. But in Germany, the employees would have to consent to this kind of processing. This is stipulated by the (still, just barely) applicable Federal Data Protection Act (Bundesdatenschutzgesetz). Nor does the Kelaa app disclaimer constitute legally binding consent by the employees to the processing of sensitive personal data. The assessment of the legal situation does not change under the European General Data Protection Regulation, which comes into force on 25 May 2018 with its requirements for effective consent. On the contrary: both the old and the new data protection legislation give special protection to the processing of health data. And the protection is even stronger where an employment relationship is involved.

Distribution

We do not know how successful the business model of Soma Analytics is. Known reports on practical use relate to examples in the United Kingdom and Italy, although the company’s headquarters are in Bruckmühl near Munich. Nothing is known to us about the app’s adoption by German employers – but that is not our point.

Regardless of location, regardless of economic power, regardless of the size of the company, it is the idea behind the Kelaa app that breaks the taboo. Why do people develop this kind of software? Because they have no sense of moral boundaries. Because “digital first, concerns second” was used last autumn as an advertising slogan by the German Liberal Party (FDP) and has since become socially acceptable in certain circles. Because data protection is not seen as a German virtue and a German export success, but instead as an impediment to business concepts. And because many small start-ups dream of being bought out by a global corporation if they exploit Big Data to the fullest extent possible. Red lines are being crossed all the time – this has to end!

The Kelaa debate coincides with the discussion about the use of software in the field of “predictive analysis”. This aims at using voice patterns or other use of technology by employees to derive signs of imminent problems or violations of their contractual obligations to their employer. If similar concepts of permanent and covert automated spying on employees become common practice, this will open up ways for employers to gather knowledge and act on it that Big Brother could only have dreamed of.

If you, as an employee, think it is beneficial for your health to install an app on your smartphone to detect and reduce stress, you are of course free to do that. But make sure that this software does not belong to your employer. By observing this simple rule, you will not have to fear a warning from the app’s “stress-o-meter” when your employer goes on about needing teams fit for the Olympics – and about less qualified employees having to jump ship.

With that in mind: Congratulations, Soma Analytics UG.

Laudator.in

Prof. Dr. Peter Wedde at the speakers’ podium of the BigBrotherAwards 2021.
Prof. Dr. Peter Wedde, Frankfurt University of Applied Sciences
Category: Consumer Protection (2018)

Amazon Alexa

Amazon receives the BigBrotherAward 2018 in the “Consumer Protection” category for its nosy, impertinent, all-too-clever and gossipy bugging operation in a can by the name of Alexa. It is well known that Alexa’s speech recordings are processed in the Cloud. What is award-worthy is the fact that these recordings are also stored in the Cloud and can be played back even months later. That makes it possible to monitor everyone present in the home, and it is unclear who else is given access to the recordings.
Laudator:
padeluun at the speakers’ podium of the BigBrotherAwards 2021.
padeluun, Digitalcourage

“Alexa, who does the BigBrotherAward 2018 in the ‘Consumer Protection’ category go to?”

Alexa voice: “The BigBrotherAward 2018 in the ‘Consumer Protection’ Category goes to the Amazon company, for its speaking virtual assistant, Alexa.”

I presume that there would be a lot of applause for this kind of winner, although I haven’t even mentioned Apple Siri, Google Assistant, Microsoft Cortana, Samsung Bixby and Nuance, most of which could have received the award at the same time. But Amazon Alexa is the most award-worthy of them all. The device listens in my home 24 hours a day, constantly waiting for me to say the word “Alexa”. As soon as it ‘hears’ that word, it records the sentences I say next and sends them to computers in the Amazon Cloud for analysis. There my words are transcribed and evaluated, and activities are triggered remotely. For example, a timer or alarm is set, music matching my current mood – or whatever the device thinks my current mood could be – is played, a drum roll is kicked off or a new golden hamster is ordered online. “Alexa” is Amazon’s strategy for even more dominance in the online retail market. This makes Amazon even more similar to ‘The Shop’ in German author Marc-Uwe Kling’s book “Qualityland”. Are we therefore giving a negative award for economic astuteness and success? No. What is reprehensible is the ambition to become too big, to come perilously close to hubris – that is dangerous.

Do I need to say more? Do I really have to substantiate why an eavesdropping interface, disguised as an alarm clock but actually an all-knowing butler in someone else’s service, which I personally carry into my bedroom and connect to the world-wide spying network, should receive a BigBrotherAward? No, I don’t have to do that. Do I?

Congratulations to Amazon “Alexa”. That’s all. Mr presenter, your turn to take over. [… pause …] Hang on, please remain seated. Of course I have more to say!

Rena Tangens spoke in her award speech earlier about the “perfect link between the totalitarian surveillance state from George Orwell’s ‘1984’ and the normalised, only seemingly free consumers in Aldous Huxley’s ‘Brave New World’.” The so-called virtual assistants are the pesky extra in that total surveillance system that calls itself ‘smart’ but is actually satanic. Scylla and Charybdis in one. Both the private space (Alexa) and the public space (‘smart’ streetlights) are turning against me – against my freedom, my agency, my personal development, my dignity. Soon it will be a reality that whenever I speak on the street, the streetlight will identify my voice. Who the person behind that voice is has been betrayed by Alexa, which gobbled up my voice profile and threw it at the Big Data leech to digest. This imaginary data leech will then know not only whom I am visiting but also which route I took on the way.

Today, while this is only beginning to be introduced, I have taken up a new habit: when I enter someone else’s home, I call out “Alexa, please order one hundred cans of ravioli”. If the owners react nervously, I know that the home I have just entered is bugged.

And if there really are people out there who install this device, feeling that they now have their own butler: please remember that, via a mobile app, the person who installed the “Alexa” device can access anything that reaches the device’s ears. They can see a list of all fragments of speech that Alexa intercepted, in text form with date and time, and replay them with a single click. That is something for counsellors of stalking victims to keep in mind, for example.

I played around with this device for a few days. It’s nice to be able to just say, while cooking pasta, “Alexa, timer, 8 minutes”. Or, when the alarm sounds in the morning, to snarl “Alexa, 10 more minutes” without having to turn around. But if that means the recording is stored on Amazon’s servers and the company knows when I get out of bed, then I would rather forego this extra comfort. Because with Alexa, setting an alarm is no longer a local action on my smartphone; it becomes a piece of Big Data owned by Amazon.

People keep asking me whether Amazon does anything evil with “Alexa” – whether it might secretly record everything said in the room and send it all to Amazon, even when the keyword “Alexa” has not been uttered. The weekly paper “Die Zeit” asked a technician from Tactical Tech to investigate. His answer: “Well, maybe. The device encrypts all the data it sends, so there is no way of knowing whether it only sends ‘desired’ utterances or more.”

Our guess is: Amazon will probably stay on the safe side. What the device is designed to do is bad enough. And even worse are the plans in the making.

We looked at patents that Amazon, Google & co. have secured; they show very nicely which way this future under the motto of “digitalisation first, concerns second”1 is supposed to go. The companies hold patents not only for identifying speakers, but also for recognising from their voice what mood they are currently in. Peter Wedde’s “Workplace” award speech just now told us about an app that uses such technology. When mum is crying in despair in the morning, a soothing liquor will be delivered. Calling out “Alexa, music!” will use the psychological profile known to the company to play either punk rock or Gregorian chants. Or Amazon will issue a preventive call to the police to ‘swat’ the premises if the algorithms glean from the voices that somebody might be about to attempt an assassination. There are patents for differentiating between a number of voices and associating them with individuals. So little Jack can keep trying to say in a low voice, “Alexa, show some porn” – the child protection feature will prevent that. A side effect is that even more information on the private sphere and the family, helpful for consumer manipulation, would reach the company. And as for “Alexa” only responding to its wake word, that will soon be a thing of the past: there are patents for searching the full audio stream for certain keywords and responding by playing commercials. So when you say, “darling, shall we go out for dinner tonight?”, Alexa will cut in to offer the special deal at ‘Little Italy’ and proceed to reserve a table on the terrace.

Like a little puppy, “Alexa” tries to learn all of this about us by continually listening to our voices and intonation, seeking out certain words such as “like” or “bought”, in order to bury what it picked up like old rotten bones in Amazon’s large data garden. And Amazon’s reward comes in a Gollum voice: “My precious!”

Children in their room might be called to order automatically when their fighting gets too loud. Or parents could be warned when the children whisper to hatch some plan.

As usual, the companies will say that these are all just features they patented ‘in passing’ and would never want to implement. But first, I have been given this kind of response for over thirty years, and everything that “would never be implemented” is a normality today. And second, we can never know whether such features will be implemented and activated, more or less secretly or even openly – or whether that has already happened.

The issue is not ‘abuse’. It is the potential that this device has even today, and the potential for Amazon to exploit it without mercy. Not to forget that Alexa (just like a smartphone) is nothing but a computer, for which all kinds of companies are now developing so-called skills – in other words, apps – which we are supposed to install on the “Alexa” system and which will again shuffle some kind of data from the home into the net for the makers’ own purposes. There is the “fart” skill (yes, it does exactly what you are thinking now – only at the moment it does not include a scent), the Fox News TV skill (which tops the list of popular skills), the waste collection calendar skill (which does not work here in Bielefeld) and hundreds or thousands more. And in the end, nobody will want to be held responsible – just as in the Facebook case, where the company feigns bewilderment about what Cambridge Analytica did with their data. While in reality, investigating people, their habits, most secret desires, friendships, political convictions and even their health issues is these companies’ business model. And that applies to Amazon as well.

That being said, I also want to pass the buck to those people who invite these playthings into their lives and thereby motivate ruthless merchants to make and sell instruments that put our civilisation at risk. We can see this right now (spring 2018) in the current Facebook debate2.

Dear people, be reasonable. We have seen a Federal Minister of Justice, Sabine Leutheusser-Schnarrenberger, resign her post in 1996 because her (liberal) party decided to back the “Major Eavesdropping Attack” (Großer Lauschangriff, the German term for the acoustic surveillance of private homes in state investigations). Today we are inviting a huge eavesdropping attack right into our most intimate spaces. Do not surrender; keep up your eager opposition – a trait that is indispensable for a viable civilisation and democracy. Yes, that means having to set the alarm by hand for a few more years. But if we all do this, our descendants will be able to use comfortable technology without fear of falling victim to manipulation and power interests. Because it is up to us, now, to make sure that new technologies embody privacy by design and respect our freedom. That is possible! But for that we need to remain steadfast and resistant – even against our friends, and against ourselves – and we must not give in, not to a naïve faith in technology, not to playfulness or control freakery, and not even to laziness and surveillance mania.

Congratulations, Amazon, for already earning your third BigBrotherAward3.

Laudator:

padeluun at the speakers’ podium of the BigBrotherAwards 2021.
padeluun, Digitalcourage
Sources:

1 Translator’s remark: this slogan was (in)famously posted across German cities by the liberal party F.D.P. during the federal election campaign in 2017.

2 BigBrotherAward 2011 to Facebook

3 BigBrotherAward 2015, Workplace category and BigBrotherAward 2015, Economy category


About BigBrotherAwards

In a compelling, entertaining and accessible format, we present these negative awards to companies, organisations, and politicians. The BigBrotherAwards highlight privacy and data protection offenders in business and politics, or as the French paper Le Monde once put it, they are the “Oscars for data leeches”.

Organised by (among others):


BigBrotherAwards International

The BigBrotherAwards are an international project: questionable practices have been decorated with these awards in 19 countries so far.