A day full of outrage

I’ve blogged a lot about distractions and thinking lately. One thing I’ve noticed is just how distracted we are in our world today. You know what I’m talking about: the constant buzzing from our smartphones. I’ve also talked about how to deal with that, and that part of it is on us. But today I want to turn our attention to the root cause.

Sure, in some regard, your phone is what you make of it. You can ignore notifications to some extent, and you can do things to reduce them. But that is definitely not the default mode. I was turned on to this idea when I first heard Tristan Harris, who appeared on a podcast with Sam Harris (no relation).

Here’s his core idea: while we say technology is neutral, the reality is that the applications we install on our phones manipulate us. They trick us into opening them. They trick us into spending more time in them. I was vaguely aware of this, but I hadn’t heard it phrased in a way that made the ideas concrete.

The Concept

Behind every app is a team of people working to keep your attention. Experiments are run to find new ways to hold it. Decisions are made to grab your eyeballs. These decisions typically optimize for one thing: time on screen.

Working in software engineering gives a unique view into this. We can see these decisions being made. But here’s the thing: I’ve never considered them malicious. And I’m fairly certain the developers behind obvious offenders like Facebook and Snapchat aren’t making malicious decisions either. But when you’re measured on a metric, you tend to optimize for it.

Let’s take a concrete, but unrelated, example. Code coverage can be a great metric for assessing unit tests on a project. It gives us insight into how well our unit tests are written. But it can easily be gamed, and that’s exactly the outcome you’ll get once the metric is used to measure an individual’s performance. You’ll end up with unit tests that cover every line of code but don’t actually test for correctness. You’ve optimized for the metric.
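Here’s a minimal sketch of what that looks like in practice, using a hypothetical function and two pytest-style tests of my own invention. Both tests execute every line of the function, so both earn full coverage, but only one would actually catch a bug.

```python
# A hypothetical function and two tests showing how coverage can be gamed.

def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    if percent < 0 or percent > 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)


def test_gamed_for_coverage():
    # Executes every line (100% coverage) but never checks a single result.
    apply_discount(100.0, 25.0)
    try:
        apply_discount(100.0, 150.0)
    except ValueError:
        pass


def test_actual_correctness():
    # Covers the same lines, but fails if the math or the validation is wrong.
    assert apply_discount(100.0, 25.0) == 75.0
    try:
        apply_discount(100.0, 150.0)
        assert False, "expected a ValueError for an invalid percent"
    except ValueError:
        pass
```

A coverage report scores these two tests identically, which is exactly why the metric breaks down the moment it becomes the target.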

None of that is malicious; it’s just the natural outcome of weighting a metric too heavily. And that is exactly the situation we find ourselves in now. Attention and time on screen are weighted far too heavily in our apps, so we optimize for them.

So how is that done? Notifications are one example. With a notification, an app grabs your attention and pulls you in. That’s step one. Now the goal is to keep you there. Tristan references something he calls a “variable scheduled reward”: the unpredictable number of tweets that might be waiting for you in Twitter. That uncertainty rewards you for entering the app, training you to check back in. The never-ending feeds then keep you engaged far longer than you meant to be.
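To make the idea concrete, here’s a toy model of a variable reward. It isn’t how any real app is built; it’s just an illustration of why an unpredictable payoff keeps us checking, the same reinforcement pattern as a slot machine.

```python
import random

def open_app(rng: random.Random) -> int:
    """Toy model: return a random count of new items waiting on this check."""
    if rng.random() < 0.4:         # many checks pay off with nothing...
        return 0
    return rng.randint(1, 20)      # ...but occasionally there's a big reward


if __name__ == "__main__":
    rng = random.Random(42)
    for check in range(1, 6):
        print(f"check #{check}: {open_app(rng)} new items")
```

Because you can’t predict which check will pay off, the habit that gets reinforced is checking itself.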

Is this how you want to spend your time? Probably not, and that’s the unfortunate part: you likely didn’t even intend to open the app in the first place.

How about those feeds? Have you ever considered what gets put into them? Let’s look at the algorithm behind them. We know that Facebook has removed its human curators, and it is conceivable that other companies have as well.

Since the algorithm is going to maximize clicks and shares, it will end up favoring articles that provoke a strong emotional response. We can already see this in the fake news phenomenon.

Imagine the kinds of reactions a feed might optimize for. Things that cause outrage are likely to be shared. Will they benefit you? Likely not, but they’ll get clicks and shares. How about jealousy? See where this can go? Next thing you know, your feed is all outrage.
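A toy ranker makes the point. This isn’t any real platform’s algorithm, and the numbers are invented, but notice that the code never mentions outrage at all. It only optimizes for predicted clicks and shares, and the outrage-bait still floats to the top because that’s what people click and share.

```python
# Hypothetical posts with made-up engagement predictions.
posts = [
    {"title": "Local park cleanup this weekend", "p_click": 0.05, "p_share": 0.01},
    {"title": "You won't BELIEVE what they did", "p_click": 0.40, "p_share": 0.25},
    {"title": "Friend's vacation photos",        "p_click": 0.15, "p_share": 0.05},
]

def engagement_score(post: dict) -> float:
    """Rank purely by expected clicks and shares -- the metric being optimized."""
    return post["p_click"] + 2.0 * post["p_share"]   # shares weighted higher

for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post['title']}")
```

Nobody had to decide “show people outrage.” They only had to decide “maximize engagement,” and the rest follows.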

How to spend our time the way we want

So let’s start with our own lives. How can we improve there? I don’t want to dig too deeply into this, since I already covered it in my distractions post. But I do want to introduce a couple of new ideas.

We work in sprints at our jobs; what if we worked in sprints for our individual tasks? That’s exactly what the Pomodoro Technique is. The concept is to spend 25 minutes at a time on a single task. Don’t let interruptions stop you; just focus on that one task. In fact, I recommend tracking your interruptions. After those 25 minutes, take a five-minute break. That break is the key: you have a stopping point, which allows you to focus singularly for those 25 minutes.
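If you want to try it without installing anything, here’s a minimal command-line sketch. The session count and durations are just the standard Pomodoro defaults, not anything prescribed; jot interruptions down on paper as they come up rather than acting on them.

```python
import time

WORK_MINUTES = 25
BREAK_MINUTES = 5

def pomodoro(sessions: int = 4) -> None:
    """Alternate focused work blocks with short breaks."""
    for n in range(1, sessions + 1):
        print(f"Session {n}: focus on one task for {WORK_MINUTES} minutes.")
        time.sleep(WORK_MINUTES * 60)   # sleeps for the full work block
        print(f"Session {n} done. Take a {BREAK_MINUTES}-minute break.")
        time.sleep(BREAK_MINUTES * 60)

if __name__ == "__main__":
    pomodoro()
```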

How do you know you’re spending your own time well? If you can’t measure it, you can’t improve it. If you’re on iOS, I recommend checking out Moment. This app uses battery screenshots on the iPhone to track your usage of apps throughout the week. Then, at the end of the week, it gives you a summary of which apps you spent time in and for how long. You can then rate those apps as time well spent or not.

This really digs into the core of Tristan’s ideas. The manipulation techniques our apps use could be used for good, if their values aligned with ours. It’s important to know which apps you actually want to be spending time in.

Improving the industry

Unfortunately, I’m not very hopeful for change in this area. The incentives are all currently misaligned, and that will prevent things from really changing. Even if we make efforts on an individual level, the metrics for screen time and attention will push products toward maximizing time on screen.

So what has to change? The advertising model certainly doesn’t help. The correlation between time in an app and ad clicks is a big driver of this metric. But it goes deeper. Companies on subscription models are essentially forced to maximize screen time as well. Netflix has found that losing screen time can result in lost subscriptions.

The big players here are the gatekeepers. Apple and Google, the owners of the app stores, control which apps are surfaced. I don’t think top grossing or most popular are metrics that benefit users. I’m not sure what the answer is here without asking the user too many questions. I don’t like the idea of rating your time in an app, so I’m somewhat at a loss.

Even though I’m not hopeful for making a large change via personal efforts without a system change, I still think it is worth the effort. As devs, we should speak up when we recognize less-than-ethical choices in our apps. We should question whether we’re serving our users in ways they want. Are they really communicating positively with their clicks? What other metrics can we use to determine time well spent?

Next steps

The biggest first step is awareness. You’re less likely to fall prey to the manipulative tricks of these apps if you’re aware of them. Maybe you’d even be okay with some of these tricks if they aligned with your own goals. We also need to admit to ourselves that we’re all susceptible. Everyone likes to think they’re not, but we are.

I wonder if we can start sending a message by prioritizing our time for ourselves. Perhaps if we stop falling prey to the manipulation, it will stop working. Maybe then we can realign the incentive system for the industry.

Once we can prioritize our own time, we can start to think about how to change the industry. Only then can we change the metrics.

If you liked this post, please like or share it on your social media platform of choice. I’d also love to hear your feedback. If you have time, drop a comment below.