Most of us spend our lives moving between screens.
We wake up to a phone. We work through a laptop. We relax through a TV. We talk, learn, scroll, shop, and think - inside interfaces. The average person now spends more time interacting with digital systems than with physical people, places, or even their own thoughts.
And while we’re told that all of this technology is here to help - to streamline, to simplify, to save time - we don’t often stop to ask: help with what, exactly?
Because what it actually does, most of the time, is reduce the need for attention, decision-making, reflection. We’re shown what to do, where to go, what to buy, what to think, what to feel. Not by force - but by design.
We scroll because the feed never ends.
We tap because the notification appears.
We consume what’s suggested.
We follow what’s trending.
We believe what’s repeated.
The design of the system itself becomes the structure of our behavior. And slowly, without realizing it, we begin to mistake responsiveness for intelligence, convenience for freedom, and automation for insight. We still believe we’re in control. But the systems are guiding almost every move.
What if the technology you use every day isn’t making you more capable - but more conditioned?
This isn’t about quitting tech or escaping modern life. It’s about looking at it directly, without distortion, and asking the most obvious question that somehow never gets asked: Is this making me more alive, more aware, more human - or just more reactive, more efficient, more controlled?
The Fragmented Mind
To understand what our technology is doing to us, perhaps we have to ask a deeper question: What kind of mind is building it?
We like to think of technology as neutral - just tools, just code. But that’s not how it works. Every app, every platform, every system is built by people. And people bring assumptions, fears, habits, incentives, and the drive for profit and power. These things get embedded in the design.
Most modern technology is designed by people under pressure - pressure to grow fast, ship early, raise valuation, grab attention. It’s a mindset shaped by urgency, comparison, competition, and reward. And so the products it creates don’t just solve problems - they reflect the emotional and psychological state of the people building them.
What kind of state is that?
It’s one that’s constantly optimizing. Constantly comparing. Constantly reacting.
The mind behind most systems is fragmented - split between goals, driven by metrics, pushed by the past, afraid of falling behind. And when that’s the state you’re in, you don’t design for depth. You design for scale. You don’t prioritize clarity. You prioritize clicks. You don’t ask what’s true. You ask what’s effective.
And so what we get is a flood of systems that do exactly what they were built to do: Maximize growth. Increase engagement. Accelerate decisions. Capture time.
But what they also do - quietly, consistently - is train us into those same patterns. They make us faster, but not clearer. More stimulated, but less steady. More connected, but less whole.
And because we rarely question the mindset behind the machine, we keep building tools that reflect confusion - and then use those tools to guide our lives.
So the question becomes simple: If the mind behind the system is fragmented, what kind of system can it build?
And what does it mean that we now live inside those systems?
Systems Built from Conditioning
Technology doesn’t appear out of nowhere. It comes from us - and reflects us.
That means our systems aren’t neutral - they carry the shape of the minds that designed them. And if we’re honest, most of our design decisions today aren’t grounded in clarity or care. They’re shaped by incentive structures, timelines, and the pressure to ship faster than the next team.
This is how conditioning turns into architecture. We build platforms to maximize engagement, because we’re conditioned to equate usage with success. We create shortcuts for everything - communication, memory, even thought itself - because we’ve been conditioned to value speed over depth. And then we hand these systems to the rest of society and call it innovation.
But what happens when a system designed from fragmentation becomes the environment we all live inside?
- Social media, originally built for connection, now optimizes for outrage and performative identity.
- Recommendation algorithms, meant to personalize, end up narrowing perspective.
- AI tools, designed to assist, begin replacing our need to reflect or even think carefully.
The system doesn’t just do what it was built to do - it teaches you to live the way it was built. And slowly, without realizing it, we start to adopt the assumptions of the machine.
That everything should be efficient.
That friction is bad.
That noise is normal.
That speed is safety.
But none of these things are true by default. They are just patterns we’ve inherited from the systems around us - systems shaped by minds under pressure, building for goals they never had time to question.
So now the responsibility falls back to us - not just to use these tools, but to ask: Do these systems reflect how we want to live? Or have we just become accustomed to living how the systems expect us to?
Observation vs Optimization
Most of modern life is built around optimization.
Everything we use - from navigation apps to smart fridges to AI agents - is designed to remove friction, speed up output, reduce effort. You don’t browse, you’re recommended. You don’t explore, you’re routed. You don’t reflect, you’re nudged.
And on the surface, this looks like progress. More ease. More time. Fewer decisions. But underneath, something else is happening.
The more optimized the system becomes, the less room there is for direct observation - real, conscious awareness of what’s actually happening. Instead of looking, we react. Instead of seeing, we respond. Our role shifts from participant to user, from chooser to consumer of suggestions.
This trade-off matters more than it seems.
Because when you stop observing and start optimizing, you stop asking why. You lose the space where independent thought happens. And over time, you become a mirror of the systems around you - efficient, responsive, and entirely unaware of how shaped you’ve become.
Evaluation is built-in. Everything is scored, ranked, liked, disliked, processed. You’re never just seeing - you’re endlessly measuring, comparing, reacting. And so intelligence becomes behavior. Observation becomes reaction. Life becomes throughput.
But this isn’t an argument against convenience. It’s a reminder that convenience has a cost.
If our entire world is built for optimization, and if we never stop to observe what that’s actually doing to our minds, we risk becoming so adapted to the system that we forget there’s anything outside of it.
Closing Inquiry
There is no conclusion here. No theory to adopt, no position to defend, no solution to promote. Because the point isn’t to find an answer, but to break the mechanization of the mind - to look, “touch grass”, and see what we actually need to build. To use technology as an aid to wellbeing rather than depending on it to keep us endlessly distracted.
We are not separate from our tools. They reflect us. They extend us. But they can also distort us, if we are not watchful.
The question, then, is not about what our systems do or how they are built - but about what they require us to become. Not about how intelligent AI is - but about whether it supports real intelligence in us.
We don’t need new ideologies or another performance of moral clarity - there is plenty of that in the space already.
We need to observe - to reconnect with life as it is. To look without distortion at the systems around us and the self within us - clearly, simply, directly. We need to “touch grass” and perhaps not let go.
That’s why The Verifier exists. Not to speak for you or shape your thinking, but to inquire into the mechanization of the mind, break the insatiable pattern of incentive-seeking, and come upon that space where you can just look, reconnect with what’s real - be human.
Perhaps then we can adapt our technology to us, rather than the reverse.
An editorial by @0x1164