Francisco Tolmasky
@tolmasky
Nov 30
An unfortunate UX trend I’ve seen is “fake focus” (or phocus, haha!) — patterns that ostensibly increase our focus but IMO actually make us more distracted. One example is the move to fullscreen apps, originally necessitated by mobile but increasingly adopted by desktop. (Thread)

Francisco Tolmasky
@tolmasky
Nov 30

Think about everything you pay attention to while driving. Your main task is keeping your eyes on the road, but they also periodically jump to your rear-view & side-view mirrors, and to your dashboard to check your speed & fuel level. It’s actually quite amazing and subconscious.

Francisco Tolmasky
@tolmasky
Nov 30

Now consider if instead there were a button that, when pressed, replaced your entire windshield with a back-facing camera feed or your current speed. That way you wouldn’t have all these “distractions” while driving. Do you think you’d be a better driver, or a worse one?

Francisco Tolmasky
@tolmasky
Nov 30

Although a bit contrived, this is similar to the trade-off we make going from a multi-window paradigm to a fullscreen one. Your eyes have evolved an amazing ability to pay a different kind of attention to the periphery than to what’s right in front of you, and we’re wasting that.

Francisco Tolmasky
@tolmasky
Nov 30

More importantly, whenever you switch apps, like to answer a text message, your brain gets a huge signal that “this is the main task now” and in the process deprioritizes what you were doing beforehand. This is rarely what you want.

Francisco Tolmasky
@tolmasky
Nov 30

This is “intended behavior”: it draws the entirety of your focus to each minute task you engage in. But this is exhausting and, ironically, distracting. I want to devote “periphery” brain power to periphery tasks. But there can be no periphery tasks in a fullscreen world.

Francisco Tolmasky
@tolmasky
Nov 30

In a multi-window world, even when I bring an app like Messages to the front, my important apps are still visible in the background, usually taking up most of the actual space on the screen, serving as a visual indicator that *that* is still what I’m actually working on.

Francisco Tolmasky
@tolmasky
Nov 30

This makes me want to finish the interruption quickly, as opposed to the interruption receiving its own bespoke zen garden to monopolize my attention. I find myself forgetting what I was doing far more frequently on mobile, I think partially because of this.

Francisco Tolmasky
@tolmasky
Nov 30

So interestingly, on desktop I can have my “distractions” always visible, yet less distracting, since checking them shifts to the subconscious “polling” part of my visual process, vs. the conscious one that must decide to react to a notification.

Francisco Tolmasky
@tolmasky
Nov 30

Arguably, notifications became crucial *because* the entire screen was monopolized. Whereas before you could pick and choose multiple windows to show you information, we now needed a specialized system dedicated to this. And of course they had to overlap your primary content.

Francisco Tolmasky
@tolmasky
Nov 30

Then we had to build additional configuration around this, like “do not disturb”, which in my experience is too coarse. I don’t want some global notification cancellation; I just care about different things at different times, and usually in the periphery.

Francisco Tolmasky
@tolmasky
Nov 30

There has been little thought put into this in iOS, IMO (I can’t even have picture-in-picture on my gigantic iPhone Max; YouTube has to be a “primary” task), but I think it might be a big hidden enabler of the uptick in distraction that we usually attribute to social media, etc.