I had the honor and pleasure of participating in World Usability Day Silesia in Katowice, Poland, on November 30 and December 1, 2018. This was my second time participating, this year as a workshop leader, presenter, and attendee. WUD Silesia is one of my favorite conferences. I always feel engaged and welcome, plus I love Poland and Polish hospitality.
This year, the conference topic was the unintended consequences of technology. Upon hearing the topic, I immediately knew what I wanted to talk about. I spent several years designing security and privacy features, so to my way of thinking, unintended consequences are severely negative abuses: think of hackers stealing our credit cards, personal information, business secrets, and so on. Think of spam. It’s a sure thing the inventors of email never considered the possibility of spam.
My workshop: The DarkSide of technology
The concept for the workshops was to consider how WUDzisław (a fictional, forward-thinking Polish city of the future) could take advantage of future technology to enhance the lives of its residents. As you might guess, my interpretation of this concept was very dark. With control over vital monopoly services and intimate personal information (and little accountability), imagine what a corrupt politician could do. To design a utopia, we have to start by preventing a dystopia! Big Brother is not looking out for your best interest.
However, I didn’t want the workshop to start with a negative tone. Consequently, it was called WUDzisław FutureTech: From the BrightSide…to the DarkSide and Back. The concept was to start the first half positively, on the BrightSide, by taking full advantage of the capabilities of future technology to look for the most meaningful ways to improve our residents’ lives. I presented a value- and scenario-driven design process to identify solutions that could have the most positive impact. An important part of this process was to deliberately avoid thinking about technology. Start with value and scenarios, and later explore ways for technology to enable them. Technology is the means, not the end, and our design process recognized that.
Our teams proposed ideas for enhancing people’s self-esteem (especially teenagers’) and for addressing the horrible traffic in WUDzisław.
For the DarkSide second half, the concept was that despite our best intentions, our services can and will be exploited. The challenge was to identify the potential ways in which a service could be exploited and to design safeguards to prevent such exploitation. We used standard security methods, such as threat modeling. The final exercise was for teams to play BlackHats and try to exploit another team’s service. We ended by composing a Digital Bill of Rights to provide protections for our residents.
I thought this part went especially well. My favorite exploit: A service designed to reduce the impact of traffic could be used maliciously to direct traffic either toward or away from a property to benefit a crooked politician. If a politician (or a crony) develops a mall just outside of town, guess where the traffic is going to be redirected.
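To give a flavor of what the threat-modeling step looked like, here is a minimal STRIDE-style enumeration for a hypothetical traffic-routing service. The service name and the example threats are my own illustrations, not the actual workshop output:

```python
# Minimal STRIDE-style threat model for a hypothetical city traffic-routing
# service. STRIDE is a standard threat-classification scheme; the example
# threats below are illustrative, not the workshop teams' actual results.

STRIDE = {
    "Spoofing":               "Fake sensor feeds inject phantom congestion.",
    "Tampering":              "Routing weights edited to favor one district.",
    "Repudiation":            "No audit log of who changed routing policy.",
    "Information disclosure": "Per-vehicle location history leaks.",
    "Denial of service":      "Flooding the API gridlocks a neighborhood.",
    "Elevation of privilege": "A maintenance account can rewrite all routes.",
}

def review(service: str, threats: dict) -> None:
    """Print one mitigation prompt per threat category."""
    print(f"Threat model for: {service}")
    for category, threat in threats.items():
        print(f"- {category}: {threat} -> What safeguard prevents this?")

review("WUDzisław traffic-routing service", STRIDE)
```

The value of the exercise is in the last column of each line: for every category, the team has to name a concrete safeguard before a BlackHat team gets to attack the service.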
My talk: Where did all the time go?
My talk was called Unintended Consequences: Where did all the time go? It is based on my observation that our technology today is extraordinarily good (especially compared to when I first started), and we have never been more productive, yet nobody has any spare time anymore. Nobody I work with does. Nobody in my family does. How did that happen?
I then explored where the time has gone. The lifestyle habits of millennials are well documented, and reviewing that data reveals that much of our time is now spent on our smartphones with social media, Netflix, and the like. While that likely isn’t shocking, what is disturbing is how thoroughly “free” service providers like Facebook monetize your time. Do the math on Instagram, and you are effectively working for Facebook for $0.06 per hour. That’s not much money, but Facebook doesn’t seem to mind.
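The back-of-envelope arithmetic behind that figure can be sketched as follows. The revenue and usage numbers here are illustrative assumptions for the sketch, not audited company data:

```python
# Back-of-envelope: what is a user's time "worth" to an ad-funded platform?
# Both input figures are illustrative assumptions, not actual company data.

annual_revenue_per_user = 25.00  # assumed ad revenue per user per year, USD
minutes_per_day = 60             # assumed average daily time in the app

hours_per_year = minutes_per_day / 60 * 365
effective_hourly_wage = annual_revenue_per_user / hours_per_year

print(f"Effective 'wage': ${effective_hourly_wage:.2f} per hour")
```

With these assumed inputs the result lands around seven cents per hour, in the same ballpark as the $0.06 figure above; the point survives any reasonable choice of inputs, because per-user ad revenue is tiny relative to the hours users put in.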
My next topic was: where did our social life go? It’s no secret that our overuse of smartphones is greatly harming social interaction. People don’t talk to each other anymore. There are many things I could say about this, but a commercial from President’s Choice (a Canadian food company) explains it perfectly.
My last topic: Imagine a machine that could instantly create a new design and instantly A/B test it. Suppose this machine had two settings: maximize user value and maximize profit. (Guess which one your CEO would choose?) What do you suppose the output of such a machine would be? My answer: a UI full of dark patterns, designed to manipulate users against their best interests. Dark patterns are very profitable.
The ultimate point: We need to understand what we are doing, and take the user’s side more. We need a UX Code of Ethics. Here is my first draft proposal:
- Respect users’ time: don’t waste it.
- Respect users’ focus: don’t interrupt or distract (especially when they’re driving); no badge spam.
- Respect users’ privacy: don’t gather or disclose information without permission; don’t spy!
- Respect users’ choice: help them make informed decisions, give them the right to opt in (instead of opt out), and don’t trick them with dark patterns.
- Respect users’ space: don’t abuse terms of service or permissions.
- Respect users’ humanity: provide a human alternative, the right to appeal (especially algorithmic judgments), and the right to not use the service.
Is our amazing technology being exploited to capture and monetize our users’ time and attention? I think it is.
I had many good discussions after my talk. Some of the feedback suggested that I overstated my case: that the technology companies aren’t doing anything wrong, and it’s what we as users choose to do with their products. However, even a cursory monitoring of the news suggests otherwise. Here’s the latest example:
If you want to see a tech giant with a corporate motto to not be evil, you literally have to go to a museum.
What are the takeaways?
As designers, we are technologists, so we are believers: we have faith that our solutions will be used for their intended good. Historically, unintended consequences have been mostly beneficial, enabling people to do things never thought possible. We never had to worry before.
But the new connected 24/7 reality is that any technology can and will be misused by “bad actors.” Those bad actors aren’t just hackers after our credit cards—they are powerful global corporations, government authorities, corrupt politicians. More powerful technology means more potential for abuse. And that abuse is not just to maximize conversion, which is fairly benign and expected. It’s to sell your privacy, your security, your free time, your location, your habits, your opinions and beliefs, your friends and your family—anything in your life worth selling—to the highest bidder, almost certainly without you knowing it. As a user, you have no protection against this—you agreed to the Terms of Service, remember?
To me, the bottom line is that we must expand the concept of user experience beyond usability and empathy to include respect. We are designing for people, not conversion metrics and deep data points. We need to design safeguards to prevent abuse and to make any breach transparent. Just because technology (and our ToS) enables us to exploit our users invisibly doesn’t mean we should. And as users, we should expect and demand more.