In 1987, economist and Nobel laureate Robert Solow made a stark observation about the stalling evolution of the Information Age: following the advent of transistors, integrated circuits, memory chips, and microprocessors in the 1960s and early 1970s, economists and companies expected these new technologies to disrupt workplaces and produce a surge in productivity. Instead, productivity growth slowed, from an annual average of 2.9% between 1948 and 1973 to 1.1% thereafter.

  • Dryad@lemmy.world · 4 days ago

    But AI, at this point and for the foreseeable future, is not a thinking machine. It’s a probability machine. It can do some neat tricks and some helpful things, but it is not thinking.
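
    A minimal sketch of what "probability machine" means here, assuming a toy vocabulary and made-up probabilities rather than any real model: a language model scores every candidate next token and samples one from that distribution.

        import random

        # Toy next-token distribution; in a real model these probabilities
        # come from a softmax over the network's output logits. The
        # vocabulary and numbers here are invented for illustration.
        next_token_probs = {
            "Paris": 0.62,
            "London": 0.21,
            "Berlin": 0.12,
            "banana": 0.05,
        }

        def sample_next_token(probs):
            # Pick one token at random, weighted by its probability.
            tokens = list(probs)
            weights = list(probs.values())
            return random.choices(tokens, weights=weights, k=1)[0]

        prompt = "The capital of France is"
        print(prompt, sample_next_token(next_token_probs))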

    I would also posit that AI is in many ways less useful than the tech that came before it. Computers largely augment what people had been doing on paper for centuries, just faster, more consistently, and more easily. AI promises to outsource thinking, which isn't augmenting something people already do (or at least should do). But at this point, it fails to do even that.