it’s boring to be predictable

While I understand and partly agree with the critical voices regarding ever-increasing data collection, I want to add a perspective that I haven't heard in that discussion so far. It might sound naive or even awkward in this context; nevertheless, I think it deserves a place.

Let me start like this: what are some possible counter-measures if someone knows something about you and is about to benefit from that information or cause you damage? For instance, someone knows about your affair and you are afraid your partner might hear about it from them.

1) try to silence the information-holder by begging, offering something in return, or threatening them
2) anticipate the disclosure of the information and
2a) spread a narrative in advance so that, once the information comes out, it becomes part of a story that reduces the harm or even flips things around
2b) discredit the messenger, either for
2ba) the very act of harming you in this way, or for
2bb) any other reason
3) “come out” and share the sensitive information yourself in order to get ahead of the information-holder; that way the disclosure and its context remain under your control

Afterwards there are different ways to deal with what happened.

1) be angry and plan ways to harm the messenger in return
2) deny the truth of the revelation and live with that lie
3) find and reveal proof that the information-holder acted with bad intentions when exposing you, so that their action may deserve even more condemnation than your secret
4) accept what happened and integrate it into your personality

OK, these lists are definitely incomplete, and online data-aggregation mechanisms are admittedly a different story. There is hardly an individual on the other end with bad intentions against you in particular; it's more a matter of rigorous business plans and harvesting from the masses. Nor do you usually have the slightest idea what “they” are looking for or what “they” could use it for.

What I find interesting, though, is point 3 of the counter-measures in combination with point 4 of the ways to deal with it.
Am I totally off, or overly romantic, in assuming that an increase in observation/tracking CAN also cause humans to develop more “transparent personalities”? Meaning less secretive, more likely to admit mistakes, to reveal intentions and to face their own baggage in terms of trauma, loss or disease.
The scenario here would be: “Well, if AI algorithms in some dozen data centers around the world concluded that my shopping behavior is likely influenced by a loss I suffered in early childhood, then I could just as well stop hiding that painful fact from my friends!?”
Or to put it in an image: the stuff we don't want anyone to see can be pictured as a dark sea behind us. Data collection mechanisms use razor-sharp math to find patterns in your traces – they shoot myriad laser beams into your dark sea. You could rightfully see this as an intrusion and be furious about it – but you could also use it as a trigger to “light up the darkness” (oh, so cheesy) of your sea and maybe dry some of it out?

But again, it might be naive to even talk about character development as an effect of algorithmic intrusion into your life. After all, we really can't tell what the mathematical masterminds (meaning both humans and self-evolving AI) have built, are building and will build to harness the data of billions of people and systems.

A different angle, but also targeting character development: I personally think it is boring to be predictable. I remember scenes in my past where I was either told that this or that thought or action of mine was easily predictable, or I noticed it myself – and I just don't like it. I don't like to be swallowed entirely by someone else's intellect; it provokes me and eventually stimulates my sense of competition and of getting better at something. I would imagine many people “have this symptom” to different degrees? In that way, being confronted with one's own predictability can be an engine for development, no? The scenario here would be: “Oh well, if I am just this consumer-sheep that is all too easy to target with ads, then I'd better upgrade my preferences or complicate some of my consumer patterns – let's see if those algorithms still ‘get me right'”. But even if that kind of competition weren't stimulated – I would hope being confronted with one's own predictability is at least an engine for reflection.
So, one government strategy for ‘maturing its citizens' could be portals along the lines of “show me what you can do with my data” or “who am I under the lens of your data”?

Published by Benjamin Aaron Degenhart, Engineering fellow 2020 at Tech4Germany

3 thoughts on “it’s boring to be predictable”

  1. Interesting thoughts, Benjamin!

    Argument #1: “The future is already here, it’s just not evenly distributed”. If we all got more open at the same time, your argument might be valid. My fear is that those in power will be less open than everyone else, and create “dynasties” of those who are less transparent and thus less vulnerable.

    Argument #2: Being unpredictable is really hard – what humans “like” is a very strong motivation. Also, even things that seem unpredictable might not be unpredictable at all. In other words, what if algorithms see patterns in ourselves that we don't see at all?


    1. Hey Sebastian. Yes, funnily enough, I could imagine it both ways – that “the less open ones” gain more power because of their lack of transparency, OR that transparency is an advantage that makes one less vulnerable and therefore potentially more powerful.

      Exactly, as I wrote above, we don't have the slightest clue yet what is and will be possible to synthesize from all those oceans of data. And it is very likely that more and more hard numbers will be put on patterns in ourselves that we aren't even aware of. More reason for some sort of “look, this is what we can do with your data”-type of governmental public-enlightenment agency thingy…

