I've worked in tech and lived in SF for ~20 years and there's always been something I couldn't quite put my finger on.
Tech has always had a culture of aiming for "frictionless" experiences, but friction is necessary if we want to maneuver and get feedback from the environment. A car can't drive if there's no friction between the tires and the road, even though it benefits from minimal friction between the chassis and the air.
Friction isn't fungible.
John Dewey described this rationale in Human Nature and Conduct as thinking that "Because a thirsty man gets satisfaction in drinking water, bliss consists in being drowned." He concludes:
> It is forgotten that success is success of a specific effort, and satisfaction the fulfillment of a specific demand, so that success and satisfaction become meaningless when severed from the wants and struggles whose consummations they are, or when taken universally.
In "Mind and World", McDowell criticizes this sort of thinking, too, saying:
> We need to conceive this expansive spontaneity as subject to control from outside our thinking, on pain of representing the operations of spontaneity as a frictionless spinning in a void.
And that's really what this is about, I think. Friction-free is the goal but friction-free "thought" isn't thought at all. It's frictionless spinning in a void.
I teach and see this all the time in EdTech. Imagine if students could just ask the robot XYZ and how much time it'd free up! That time could be spent on things like relationship-building with the teacher, new ways of motivating students, etc.
Except...those activities supply the "wants and struggles whose consummations" build the relationships! Maybe the robot could help the student, say, ask better questions of the teacher, or direct the student to peers who were similarly confused but figured it out.
But I think that strikes many tech-minded folks as "inefficient" and "friction-ful". If the robot knows the answer to my question, why slow me down by redirecting me to another person?
This is the same logic that says making dinner is a waste of time and we should all live off nutrient mush. The purpose of preparing dinner is to make something you can eat, and the purpose of eating is nutrient acquisition, right? Just beam those nutrients into my bloodstream and skip the rest.
Not sure how to put this all together into something pithy, but I see it all as symptoms of the same cultural impulse. One that's been around for decades and decades, I think.
People want the cookie, but they also want to be healthy. They want to never be bored, but they also want to have developed deep focus. They want instant answers, but they also want to feel competent and capable. Tech optimizes for revealed preference in the moment. Click-through rates, engagement metrics, conversion funnels: these measure immediate choices. But they don't measure regret, or what people wish they had become, or whether they feel their life is meaningful.
Nobody woke up in 2005 thinking "I wish I could outsource my spatial navigation to a device." They just wanted to not be lost. But now a generation has grown up without developing spatial awareness.
> Tech optimizes for revealed preference in the moment.
I appreciate the way you distinguish this from actual revealed preference, which I think is key to understanding why what tech is doing is so wrong (and, bluntly, evil) despite it being what "people want". I like the term "revealed impulse" for this distinction.
It's the difference between choosing not to buy a bag of chips or a box of cookies at the store, because you know it'll be a problem and your actual preference is not to eat those things, and having someone leave chips and cookies at your house without your asking, then giving in to the impulse to eat too many of them when you never wanted them in the first place.
Example from social media: My "revealed preference" is that I sometimes look at and read comments from shit on my Instagram algo feed. My actual preference is that I have no algo feed, just posts on my "following" tab, or at least that I could default my view to that. But IG's gone out of their way (going so far as disabling deep link shortcuts to the following tab, which used to work) to make sure I don't get any version of my preference.
So I "revealed" that my preference is to look at those algo posts sometimes, but if you gave me the option to use the app to follow the few accounts I care about (local businesses, largely) but never see algo posts at all, ever, I'd hit that toggle and never turn it off. That's my actual preference, despite whatever was "revealed". That other preference isn't "revealed" because it's not even an option.
Just like the chips and cookies, the costs of social media are delayed and diffuse. Eating/scrolling feels good now. The cost (diminished attention span, shallow relationships, health problems) shows up gradually over years.
Yes, I agree with this. I think more people than not would benefit from actively cultivating space in their lives to be bored. Even something as basic as putting your phone in the internal zip pocket of your bag, so when you're standing in line at the store/post office/whatever you can't be arsed to just reach for your phone, and instead you're in your head or aware of your surroundings. Both can be such wonderful and interesting places, but we seem to forget that now.
Plants "want" nitrogen, but dump fertilizer onto soil and you get algal blooms, dead zones, plants growing leggy and weak.
A responsible farmer is a steward of the local ecology, and there's an "ecology of friction" here. The fertilizer company doesn't say "well, the plants absorbed it."
But tech companies do.
There's something puritanical about pointing to "revealed preference" as absolution, I think. When clicking is consent then any downstream damage is a failure of self-control on the user's part. The ecological cost/responsibility is externalized to the organisms being disrupted.
Like Schopenhauer said: "Man kann tun, was er will, aber er kann nicht wollen, was er will." One can do what one wants, but one cannot will what one wants.
I wouldn't go as far as old Arthur, but I do think we should demand a level of "ecological stewardship". Our will is conditioned by our environment and tech companies overtly try to shape that environment.
I think that's partially true. The point is to have the freedom to pursue higher-level goals. And one thing tech doesn't do - and education in general doesn't do either - is give experience of that kind of goal setting.
I'm completely happy to hand over menial side-quest programming goals to an AI. Things like stupid little automation scripts that require a lot of learning from poor docs.
But there's a much bigger issue with tech products - like Facebook, Spotify, and AirBnB - that promise lower friction and more freedom but actually destroy collective and cultural value.
AI is a massive danger to that. It's not just about forgetting how to think, but how to desire - to make original plans and have original ideas that aren't pre-scripted and unconsciously enforced by algorithmic control over motivation, belief systems, and general conformity.
Tech has been immensely destructive to that impulse. Which is why we're in a kind of creative rut where too much of the culture is nostalgic and backward-looking, and there isn't that sense of a fresh and unimagined but inspiring future to work towards.
I don't think I could agree with you more. I think more people in tech and business should think and read about philosophy, the mind, social interactions, and society.
EdTech, for example, really seems to neglect the kinds of bonds that people form when they go through difficult things together, and pushing through difficulties is how we improve. Asking a robot XYZ does not improve us. AI and LLMs do not know how to teach; they are not Socratic, pushing and prodding at our weaknesses and assessing us so that we improve. They just say how smart we are.
This is perhaps one of the most articulate takes on this I have ever read. Thank you!
And, for myself, it was friction that kickstarted my interest in "tech". I bought a janky modem, and it had IRQ conflicts with my Windows 3 mouse at the time. So, without internet (or BBSs at that time), I had to troubleshoot and test different settings with the 2-page technical manual that came with it.
It was friction that made me learn how to program and read manuals/syntax/language/framework/API references to accomplish things for hobby projects - which then led to paying work. It was friction not having my "own" TV and access to all the visual media I could consume "on-demand" as a child, therefore I had to entertain myself by reading books.
Friction is an element of the environment like any other. There's an "ecology of friction" we should respect. Deciding friction is bad and should be eradicated is like deciding mosquitoes or spiders or wolves are bad and should be eradicated.
Sometimes friction is noise. Sometimes friction is signal. Sometimes the two can't be separated.
I learned much the same way you did. I also started a coding bootcamp, so I've thought a lot about what counts as "wasted" time.
I think of it like building a road through wilderness. The road gets you there faster, but careless construction disturbs the ecosystem. If you're building the road, you should at least understand its ecological impact.
Much of tech treats friction as an undifferentiated problem to be minimized or eliminated—rather than as part of a living system that plays an ecological role in how we learn and work.
Take Codecademy, which uses a virtual file system with HTML, CSS, and JavaScript files. Even after mastering the lessons, many learners try the same tasks on their own computers and ask, "Why do I need to put this CSS file in that directory? What does that have to do with my hard drive?"
If they'd learned directly on their own machines, they would have picked up the hard-drive concepts along the way. Instead, they learned a simplified version that, while seemingly more efficient for "learning to code," creates its own kind of waste.
But is that to say the student "should" spend a week struggling? Could they spend a day, say, and still learn what the friction was there to teach? Yes, usually.
I tell everyone to introduce friction into their lives...especially if they have kids. Friction is good! Friction is part of the je ne sais quoi that makes humans create.
In my experience, part of the "frictionless" experience is also providing minimal information about any issues and no way to troubleshoot. Everything works until it doesn't, and when it doesn't, you are at the mercy of the customer support queue and your luck in getting an agent with the ability to fix your problem.
> but friction is necessary if we want to maneuver and get feedback from the environment
You are positing that we are active learners whose goal is clarity of cognition, and that friction and cognitive struggle are part of that. Clarity is attempting to understand the "know-how" of things.
Tech, and dare I say the natural laziness inherent in us, instead wants us to be zombies being fed the "know-that", as that is deemed sufficient: i.e., the dystopia portrayed in The Matrix, or the rote student regurgitating memes. But know-that is not the same as know-how, and know-how keeps evolving, requiring a continuously learning agent.
Looking at it from a slightly different angle, one I find most illuminating: removing "friction" is like removing "difficulty" from a game, and "friction free" as an ideal is like "cheat codes from the start" as an ideal. It's making a game with a single button that says "press here to win." The goal isn't to remove "friction"; it's to remove a specific type of valueless friction and replace it with valuable friction.
Thank you for expressing this. It might not be pithy, but it's something I've been thinking about for a long time, and this is a well-articulated way of expressing it.
I don't know. You can be banging your head against the wall to demolish it or you can use manual/mechanical equipment to do so. If the wall is down, it is down. Either way you did it.