In 1987, economist and Nobel laureate Robert Solow made a stark observation about the stalled promise of the Information Age: following the advent of transistors, integrated circuits, and memory chips in the 1960s, and microprocessors in the early 1970s, economists and companies expected these new technologies to disrupt workplaces and produce a surge in productivity. Instead, productivity growth slowed, dropping from 2.9% between 1948 and 1973 to 1.1% thereafter.



Marx did not - could not - conceive of thinking machines in Capital. AI is in a different class than a weaving mill.
I agree that AI would be unimaginable for Marx, but on the other hand I think the labor theory of value is still holding up. Has any AI actually generated value for anyone?
If a new machine gives an efficiency advantage over other makers of equivalent commodities, the owner of that machine reaps a financial advantage that is experienced as “generated value”. But as the new machine and any efficiency in production become generalized, we would expect that ultimately the value of that commodity would actually decrease.
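A toy numerical sketch of that dynamic, reading “value” as socially necessary labor time in the labor-theory sense (all numbers and producers here are made up for illustration):

```python
# Toy model: a commodity's value tracks the socially necessary labor time,
# i.e. the average hours per unit across all producers in the market.
def socially_necessary_labor_time(producers):
    # producers: list of (hours_per_unit, units_produced) pairs
    total_hours = sum(h * u for h, u in producers)
    total_units = sum(u for _, u in producers)
    return total_hours / total_units

# Before the new machine: every producer needs 10 hours per unit.
before = socially_necessary_labor_time([(10, 100), (10, 100)])

# One producer adopts a machine that halves their labor time: they still sell
# near the old social value (10h) while producing at 5h -- the gap is the
# "generated value" the machine's owner pockets.
early = socially_necessary_labor_time([(5, 100), (10, 100)])

# Once the machine generalizes, the social value itself falls.
after = socially_necessary_labor_time([(5, 100), (5, 100)])

print(before, early, after)
```

The middle state is where the advantage lives; the end state is the decrease in the commodity’s value once the efficiency is no longer anyone’s edge.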
But AI, at this point and for the foreseeable future, is not a thinking machine. It’s a probability machine. It can do some neat tricks and some helpful things, but it is not thinking.
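A toy sketch of what “probability machine” means here (the word table is entirely made up for illustration; a real language model learns distributions like this over tokens from its training set, at vastly larger scale):

```python
import random

# Toy illustration only -- not any real model's API. A "probability machine"
# in miniature: given a context, it picks the next word not by thinking but
# by sampling from a table of learned probabilities.
next_word_probs = {
    ("the", "loom"): {"weaves": 0.7, "breaks": 0.2, "thinks": 0.1},
}

def sample_next(context):
    probs = next_word_probs[context]
    words = list(probs)
    weights = [probs[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

print(sample_next(("the", "loom")))  # usually "weaves", by chance alone
```

Everything the machine “says” is a draw from distributions like this; there is no step where anything is understood.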
I would also posit that AI is in many ways less useful than tech that came before it. Computers largely augment what people had been doing on paper for centuries before, just faster, more consistently, and more easily. AI promises to outsource thinking, which isn’t augmenting something people already do (or at least should do). And at this point, it fails to do even that.
The particular qualities of the machine itself are practically irrelevant. What matters is not what the machine does or what inherent qualities it has, but the relation it has to workers and capitalists.
“Machines were the weapons used by the capitalists to quell the revolt of specialized labor”
– some blurry guy
“Marx could not conceive of ~~thinking~~ programmable machines”

Are you sure about that?
Consider all the ways that loom stitching - like a player piano - is not Turing complete, much less comparable to AI trained on a large data set.
And those looms inspired Chuck Babbage to build the Difference Engine, which was at the 1862 World’s Fair. Years before Marx started writing Capital.
So again, when you say these machines are inconceivable to Marx…
Blah blah blah
The difference engine was never built by Babbage.
This is a false comparison built upon a false presumption.
What was at the 1862 World’s Fair, then? That pesky word you used…
Your beliefs rest on two claims: that Babbage built the Difference Engine, and that it was the machine shown at the 1862 World’s Fair.
If you could prove either, I guess you’d be in the clear, but for some reason you’re insisting upon both.
You’re right, it’s an entirely new class of machine. A fucking stupid class that should be taken out back and shot.
These machines don’t think, though. They are the same glorified calculators we’ve been using, just with slightly updated formula sets. What we should be focusing on is the advent of quantum computing, which will supercharge everything we’re currently dealing with. That will be our generation’s Y2K. I read recently that quantum will become cheap enough to start disrupting things as early as 2029, which is not very far away.
But human society is still somehow connected to human effort.