Skill
A reflection on the concept of "skill" in the context of AI agents.
There's a word that keeps showing up in corporate announcements about AI adoption: skill. A modular, callable unit of automated capability that a platform can deploy on demand, without the human it was extracted from. People were not replaced by a machine that came from outside. They were replaced by a machine they had built piece by piece, from their own expertise.
Marx described a foundational move of capitalism: separating workers from their means of production. The loom, the factory, the land. These were taken from the people who operated them and concentrated in the hands of capital. Workers were left with only their labor power to sell, severed from the tools that gave that labor value.
What's happening now is structurally identical to that separation, but it operates in a new register.
Before, what capital extracted was physical output made with hands and time. What's being extracted now is cognition itself: the judgment, the pattern recognition, the trained intuition that took years to develop. It gets packaged into a model, a workflow, an API endpoint. And then the person is no longer necessary. If the old dispossession separated workers from their tools, this new dispossession separates workers from their expertise. The skill remains. The person holding it does not. Then what do workers have left to sell?
What's philosophically unsettling about this moment is not just that jobs are being lost; technological unemployment is not new. It's that the process of replacement has become so intimate.
In the industrial era, a machine could replace a worker's body: it could lift, stamp, weld, assemble. But the worker still knew how to do these things. Knowledge remained human. There was at least a residual claim: "I understand this work, even if the machine does it now."
That claim is dissolving. When a language model is trained on a copywriter's output, when a level-generation tool is built from a designer's heuristics, when an HR chatbot is scaffolded on policies written by human staff, the knowledge no longer belongs to the people who generated it. It belongs to the platform. The last refuge of the worker, the claim "I know how to do this," has been colonized.
This is what makes the framing of "skills" so ideologically loaded. A skill, in its new corporate meaning, is not something you have. It is something that can be taken from you, formalized, and redeployed without you.
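To make that definition concrete, here is a minimal sketch, in TypeScript, of what a "skill" looks like once it has been formalized. Everything in it is hypothetical and illustrative: the interface, the manifest shape, and the refund-triage example are assumptions for the sake of the argument, not any real platform's API.

```typescript
// Hypothetical sketch of a "skill": expertise reduced to a
// callable unit. All names and shapes here are illustrative.

interface SkillManifest {
  name: string;        // identifier the platform registers
  description: string; // what the capability does, for the planner
}

interface Skill<In, Out> {
  manifest: SkillManifest;
  invoke(input: In): Promise<Out>;
}

// A support agent's trained judgment, flattened into a rule
// the platform can deploy on demand, at scale, without them.
const refundTriage: Skill<
  { orderAgeDays: number; reason: string },
  { approve: boolean }
> = {
  manifest: {
    name: "refund-policy-triage",
    description: "Decide whether a refund request can be auto-approved.",
  },
  async invoke({ orderAgeDays, reason }) {
    return { approve: orderAgeDays <= 30 && reason !== "fraud-suspected" };
  },
};

// The platform calls the unit with no reference to the person
// whose judgment it encodes.
refundTriage
  .invoke({ orderAgeDays: 12, reason: "damaged" })
  .then((result) => console.log(refundTriage.manifest.name, result));
```

The point of the sketch is the shape, not the logic: once judgment is expressed this way, the platform needs the function, not the person.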
But I'm not arguing that AI development should stop, or that automation is inherently wrong. The level designers who built those tools presumably believed they were building something useful. What I'm arguing is that we need to be honest about the political economy of this process: who benefits from the extraction of human expertise into machine form, and who bears the cost.
When Salesforce saves $50 million by redeploying 500 customer service workers with AI, the question worth asking is: saved for whom? The shareholders who saw revenue grow 8% in Q1 2025? Or the 4,000 support workers whose institutional knowledge now lives in Agentforce?
When a game studio asks its designers to build tools that make their own jobs faster, the question worth asking is: faster toward what end? A more creative studio? Or a leaner one?
The euphemism of "skill" lets companies sidestep these questions. It makes dispossession sound like empowerment. It frames the extraction of human expertise as the creation of capability without asking whose capability, owned by whom, and at whose expense.
As AI systems grow more capable, the process of extracting human cognition, packaging it, scaling it, and severing it from the humans who generated it will only accelerate.
We need new frameworks for thinking about what it means to own one's expertise in the age of machine learning. We need labor agreements that treat the knowledge workers contribute to AI systems as something with value, something that deserves compensation, attribution, and protection, not just extraction. We need to stop calling this "upskilling."