|
The movie poster of Fritz Lang's 'Metropolis', the first movie to show a humanized robot, the 'Maschinenmensch'.
|
|
The machine acting like a human
|
|
In 1950, the computer scientist Alan Turing proposed a way to decide whether a machine is intelligent. In a nutshell, Turing considered the question 'Does this machine think?' too hard to answer and substituted it with a different, easier question: Does this machine exhibit behavior that is indistinguishable from human behavior?
This aptly named 'Turing Test' was a radical departure from the way we used to think about intelligent machines. Previously, all depictions of thinking machines showed robots that looked like humans, because a machine that was intelligent but didn't look and behave like a human was completely unfathomable.
With Turing, behaving like a human was transformed from a crutch for understanding into the very definition of how a thinking machine must behave.
Anthropomorphizing machines became a requirement. And as machines and algorithms take on more and more responsibilities, more and more things are made to at least feel human.
|
|
Corporations are people, too.
|
|
|
The anonymous glass skyscraper. Once the epitome of corporate success, it is now a symbol of liability in customer experience.
|
|
Or at least, they act like them. Just as early sci-fi writers couldn't imagine machines acting like humans that didn't look like humans, and Turing found no other way to define intelligence than 'acting like a human', companies are working hard to shed the corporate anonymity of the yuppie '90s and start behaving like humans.
At least, they are giving themselves human names: take a look at this far-from-complete list from Bloomberg. How many did you recognize?
Of course, anthropomorphizing your company cannot stop at the name. The name is just the start of a long series of expectations. If you interact with a company that is named like a person, you expect it to behave like a person: with a real personal feel, incredible attention to detail in conversation, and even an attitude that makes the brand's personality predictable.
|
|
|
This is not an existing human. It is AVA, Autodesk's avatar for its chatbot.
|
|
In 1970, the robotics professor Masahiro Mori coined the term 'Bukimi no Tani Gensho', which Jasia Reichardt later translated as 'the uncanny valley'.
The hypothesis states that as something becomes more and more human-like, our response to it becomes more and more positive - up to a point.
When it comes too close to being human but doesn't quite hit the mark, the emotional response plummets and becomes very negative. The thing might even become repulsive.
This negative response vanishes once the thing becomes so human-like that it is more or less indistinguishable from a human.
Between 'barely human' and 'very human' lies the uncanny valley, in which everything repulses us.
|
|
Right now, the uncanny valley is a very hot topic for chatbots, but it's not a far-fetched thought that whole companies trying to be 'more human' could end up in the uncanny valley.
|
|
If robots act like humans, should they get legal personhood?
|
|
Should courts accept algorithms as individual entities, separate from their owner or creator? That's a tough question. And even tougher: what would that actually mean? Which rights and duties would such an 'electronic person' actually have? Politico doesn't have the answers, but it asks some very interesting questions.
|
|
Can algorithms have mental-health issues?
|
|
|
Marvin, the Paranoid Android from The Hitchhiker's Guide to the Galaxy. Image source: BBC
|
|
Most people assume that algorithms don't have biases. They run on computers, and computers deal with numbers, and numbers are neutral, right?
Yeah. Sorry, no. Machines learning from input can incorporate any bias present in the data into their model, literally overnight.
It took Twitter only 16 hours to turn a chatbot into a racist bigot.
These biases and flaws can creep into a machine's data set undetected - and often undetectably - changing the behavior of the machine, up to the point where it raises the question whether an algorithm might actually be hallucinating.
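To make that concrete, here is a minimal, hypothetical sketch (the training sentences, the placeholder word `group_x`, and the scoring function are all invented for illustration) of how a purely frequency-based model absorbs a bias that is baked into its training data:

```python
from collections import Counter

# Toy training data: the otherwise neutral word 'group_x' happens to
# appear only in negative examples, so the bias is in the data itself.
training = [
    ("the service was great", "pos"),
    ("lovely experience overall", "pos"),
    ("group_x ruined everything", "neg"),
    ("group_x was terrible again", "neg"),
    ("group_x caused problems", "neg"),
]

# Count how often each word co-occurs with each label.
counts = {"pos": Counter(), "neg": Counter()}
for text, label in training:
    counts[label].update(text.split())

def score(word):
    """Naive association score: positive minus negative co-occurrences."""
    return counts["pos"][word] - counts["neg"][word]

# The model has 'learned' that group_x is negative - purely from the data.
print(score("group_x"))  # -3: strongly negative association
print(score("great"))    # +1: mildly positive association
```

Nothing in this code is prejudiced; the skew comes entirely from the examples it was fed, which is exactly how biases slip in undetected.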
|
|
You're going to need an umbrella
|
|
Sometimes, people think Mother Nature needs a little help. For example, when natural processes don't produce enough water. So, China is building a rain farm that will produce 10 billion cubic meters of rain per year.
(I have no clue how much rain that actually is, but it seems like a lot. Maybe someone should break that down into raindrops per second.)
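For what it's worth, a back-of-the-envelope version of that breakdown is easy to sketch - assuming a raindrop volume of about 0.05 mL, which is a rough guess (real drop sizes vary widely):

```python
# 10 billion cubic meters of rain per year, as claimed for the rain farm.
CUBIC_METERS_PER_YEAR = 10e9
SECONDS_PER_YEAR = 365 * 24 * 3600   # ~31.5 million seconds
DROP_VOLUME_M3 = 0.05e-6             # assumed raindrop volume: 0.05 mL

drops_per_year = CUBIC_METERS_PER_YEAR / DROP_VOLUME_M3
drops_per_second = drops_per_year / SECONDS_PER_YEAR

print(f"{drops_per_second:.2e} raindrops per second")
```

Under that assumption it works out to roughly six billion raindrops every second, year-round. It does indeed seem like a lot.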
|
|
Thank you for reading this edition of Let’s Be Fwends.
If you've recently purchased something from a company named after a girl, a guy, or a cat: please high-five yourself, you trend-surfing, in-the-know cat! And if you didn't, give yourself a vintage feel-good high-five for not needing to follow every trend that comes around. 😻
|
|
|
|
|