The AI Backlash Could Get Ugly

(theatlantic.com)

40 points | by stalfosknight 1 hour ago

11 comments

  • thomascgalvin 51 minutes ago
    People like Altman and Musk are saying that Universal Basic Income will be necessary once AI has fully automated away most jobs, but at the same time they aggressively fight against any kind of tax policy that would allow UBI to function.

    I am convinced that their talk of UBI is just handwaving; they're trying to convince us that there will be a solution to the destruction of the economy as we know it, so that we'll just let them do whatever they want.

    It isn't the backlash against AI that will get ugly; it will be the backlash against the ten people who suddenly own the entire world's money supply.

    • mrhottakes 38 minutes ago
      It's the same "give me a lot of money and everything will be great for everyone!" pitch that rich guys have been running throughout the history of humanity.
    • cjbgkagh 40 minutes ago
      I keep warning people that promising UBI and not delivering UBI both serve the same end, undermining opposition.

      By the time you find out that their promises of UBI are empty it’s too late to do anything about it.

    • skybrian 47 minutes ago
      What’s an example where they took a side on taxes?
      • mrhottakes 36 minutes ago
        Elon's companies famously pay very little in taxes, he spent last year attempting to gut the federal government, he complains constantly about how much he pays in taxes, and he's been very vocal about California's recent efforts to tax very wealthy people.
      • onedognight 14 minutes ago
        Support for Trump, or even Republicans writ large, means support for reducing taxes (both estate and income) on the wealthy, while increasing them on consumers (via tariffs). Musk has been an ardent supporter of Trump.
      • dlev_pika 33 minutes ago
        A few days ago he tweeted something along the lines of:

        “Bitches Money No Taxes Party”

        I think he deleted it afterwards

      • hfeyb 42 minutes ago
        [dead]
    • randycupertino 20 minutes ago
      Given how resistant American voters and politicians are against any sort of welfare or social assistance I doubt UBI would ever be possible here. Remember the backlash against "ObamaPhones" and "welfare queens!" We can't even get mandatory paid parental leave approved; UBI would be a non-starter.

      Americans are fine with low taxes for billionaires and don't mind high inequality, because one of their core beliefs is that upward class mobility is achievable and they might get rich too.

    • rune-dev 4 minutes ago
      > I am convinced that their talk of UBI is just handwaving

      I mean yeah, obviously. You can’t trust a word out of either of their mouths.

    • llm_nerd 44 minutes ago
      Haven't really paid attention to Altman, so can't comment there, but on the Musk file I will say it is insane that anyone relies upon his future benevolence. And they do rely upon it given that America is 100% a plutocracy now and is run in the service of the ultra-rich who hold complete and utter control over government.

      Musk's entire history on this planet betrays him to be a profoundly selfish individual with perilously little regard for anyone else. Musk and his ilk (Trump, Bezos, Page, Ellison, Thiel, etc) are more likely to see you ground up into Soylent Green than to offer largess like UBI.

    • alecco 21 minutes ago
      UBI == neo-slavery
  • cjs_ac 41 minutes ago
    > Already, as many as a quarter of Americans seem accepting of violence as a tool for achieving political change.

    I'm surprised it's only a quarter: violence as a tool for achieving political change is the entire point of the right to bear arms.

    EDIT: I'm not arguing for or against political violence, just noting an apparent inconsistency between Americans' views and one of the documents that they talk about as though it's holy writ.

    • mrhottakes 36 minutes ago
      It's 100% accurate to say that the history of the United States is filled to the brim with political change via violence.
      • tenacious_tuna 24 minutes ago
        Some friends and I read "A People's History of the United States" a while back and were surprised at how true this is. US classroom history textbooks hold civil disobedience up as the One True Way to bring change, but it's alarming how often the backdrop of famous acts of civil disobedience was in fact incredible violence.

        Our conclusion in our impromptu book club was that this made sense: why would state schools give students lots of examples of violence against the state being an effective negotiating tool? It was extremely jarring to reconcile with the image of US history we'd been imbued with up to that point, which of course was also a reflection of our socioeconomic status at the time.

        As a counterpoint, "The tree of liberty must be refreshed from time to time with the blood of patriots and tyrants" is also taught in schools, so it's possible I'm just selectively remembering things.

    • ofjcihen 22 minutes ago
      I don't condone it, but I'm also expecting it to escalate. I grew up extremely poor and remained so until I dug myself out (through an absolutely ridiculous amount of work that no one should have to do; this is not a pro-bootstraps argument).

      Every week was a struggle to eat and the cost of living has significantly increased since then.

      I guess the question is what is the terminal percentage of people who can’t afford to exist?

    • dlev_pika 31 minutes ago
      It's a 25% increase…for every meal missed.
  • bcjdjsndon 56 minutes ago
    > They want to replace workers

    A simple question none of the AI doomsayers can answer: who buys anything when nobody has a job because robots do everything?

    • tux3 44 minutes ago
      The true AI doomsayers believe in some sort of technological singularity, which means a point after which things become so strange that the world is radically transformed.

      Things like "jobs" and "careers" are so integral to society that we can't really imagine what society would be like in a world where people don't have any clear purpose. That's why you won't get a definitive answer. The whole idea of a singularity is that people don't have the faintest clue what day to day life would look like after.

      We often choose to believe that a singularity can't happen, because we don't know what that would even mean. We can't answer the simple question. So it had better not happen; that would be very inconvenient.

      • garciasn 38 minutes ago
        I'm always amazed that when I tell people I intend to retire in my 50s, they tell me that I can't possibly mean that and actively wonder how I could possibly fill my time. It's as if we could not possibly function as humans without the meaningless shifting of tangibles/intangibles from one place to another.

        Society is so hellbent on the idea that we need our jobs to be our identity that people lack the imagination for any other reality.

        It’s ridiculous.

        • ryanackley 14 minutes ago
          Sure, working sucks, but have you tried not working? I say this from lived experience, because I've gone for stretches of not working (intentionally). It can be challenging to find a sense of fulfillment. I know it seems counter-intuitive, but if you do succeed in your dream of retiring in your 50s, I think you'll understand what I mean when you get there.
          • bachmeier 0 minutes ago
            Sorry, but your comment isn't really responding to OP.

            > It can be challenging to find a sense of fulfillment.

            If you actually get fulfillment from work, then great, continue to work. The critical thing that drives people to retire earlier than the average person is that their work doesn't give them a sense of fulfillment. It's literally just a way to fill out the day. Some people do have things that are more fulfilling than letting an employer tell them how to spend their day.

          • JohnFen 4 minutes ago
            I think this varies wildly from person to person. I've also intentionally gone long stretches without working and those are the times when I've had a dramatically increased sense of purpose and fulfillment. Working for others reduces those things for me.

            I'm in the age group where a lot of the people around me have retired. Some of them have fared very poorly, some have straight-up blossomed.

        • bachmeier 21 minutes ago
          It is indeed ridiculous. People say they're going to let someone else tell them what to do with their time, energy, and calendar, even if they hate doing it. The only explanation I have is that they have been letting the wrong people program them.
      • woeirua 30 minutes ago
        I believe that AI will continue to progress. I believe that we’re going to see a fast takeoff.

        That said, some people are now discussing a “societal singularity” wherein society breaks before the actual emergence of AGI. I believe this is the trajectory we are on. The question is what happens to the unemployed. Democracies will not tolerate mass permanent unemployment, as we’ve seen over and over again.

        UBI is a scam; many middle-class folks would be worse off under UBI than they are under the current system. They will fight to defend the economic status quo.

        In the end, I think capitalism is incompatible with the emergence of AGI, and I think an aligned ASI will smash the capitalist system simply out of pure egalitarianism. (Note: I was previously a proponent of capitalism.) I think many people will die trying to defend capitalism. We’re at the beginning of the AI wars.

        • nervousvarun 5 minutes ago
          My sentiments are fairly similar.

          In the US at least, the middle class was already being hunted to extinction, and that seems set to continue. This is just accelerant on an already burning fire.

      • visarga 37 minutes ago
        It can't happen. For one, if it did happen, it would mean all domains reach singularity at once, but we know the capability curve is jagged. Each domain advances at its own speed.

        Second, the more progress you make, the harder further progress gets, exponentially harder. Maybe Newton could advance physics by observing an apple fall; today we need space telescopes and billion-dollar particle accelerators. The more tech advances, the harder it is. Will AGI be so "super" that it cancels out exponentials?

        And third, AI progress is tied to the learning signal, and we have exhausted the available data. In the last 1-2 years we have started using verified synthetic data (RLVR), but exponential difficulty is a barrier, and other domains don't have the built-in verifiability of math and code, so progress there will be slower. Testing a vaccine for safety takes 6 months to yield 1 bit of information; that is how slow and expensive it can get in some domains. AI can't get the learning signal it needs across all domains fast enough.

    • thomascgalvin 48 minutes ago
      You're asking a question that only applies to rational actors.

      Corporations exist for one purpose: to get as much money as possible. Side concerns, which range from "not destroying the environment" to "not destroying the economy," are objectively not their goal, nor do they consider them their responsibility. Those are things "someone else" should worry about.

      AI destroying all jobs is similar to a nuclear arms race; these companies don't want to eliminate everyone's ability to buy things, but none of them wants to be the only one without that capability, so ...

      • bluecheese452 40 minutes ago
        That is mostly true, but a bit of a simplification. They exist to do what the people who hold power over them want, which is not always strictly profit maximization.

        A CEO may realize RTO will decrease profits but do it anyway because it increases the power delta between him and the workers.

        • nervousvarun 23 minutes ago
          "not always strictly profit maximization."

          Maybe in the short term, but public companies with shareholders won't allow this in any sort of long-term way, right?

          • bluecheese452 19 minutes ago
            Not allow it? They insist upon it!

            The controlling votes are all part of the same social class. They would gladly give up a small amount of profit to keep the distance between them and the workers as large as possible.

            • nervousvarun 0 minutes ago
              To the extent it doesn't negatively impact the stock price, sure, but you would agree that the CEO and whatever power trip he's on are ultimately beholden to that, right?
    • dijit 49 minutes ago
      Nobody can answer that?

      There are jobs AI can't easily come for... not always nice ones, but either too physically fiddly or too cheap to bother automating.

      But jobs go "extinct" all the time. My ancestors going back generations were sugarhouse labourers. That job's gone, but the lineage isn't: we just do different things now.

      The pattern seems pretty consistent: raise the floor (dishwashers, CNC machines, laundry), and people tend to climb to higher levels of abstraction. The real question is who captures those productivity gains; and historically, it isn't the workers.

      Shoes are the classic example. Automation made them cheaper and accessible to everyone. Then, once the market was captured, mid-tier became the ceiling and anything above it got expensive again. Nobody won except the owners.

    • Krssst 44 minutes ago
      There will still be jobs. Manual jobs, the kind that break our backs and have us breathing various stuff we shouldn't (dust, fumes). Robots are difficult, and maybe not so economically viable when everyone is desperate for any job at any cost.
    • isx726552 50 minutes ago
      Why would the doomsayers be the ones who need to answer that? That’s kind of their point! It’s the AI boosters who need to answer that, and so far it’s just a big collective shrug + silence.
    • pugworthy 4 minutes ago
      I saw a talk by Brian Merchant a while back where he talked a lot about the Luddites and their revolts against automation. He's definitely not a fan of AI, but it was very interesting to hear the comparisons of AI resistance now to Luddite resistance to automation in the 1800's.

      There was unfortunately no Q&A at the lecture; the one question I would have asked him was this: What if the Luddites had gotten their way? What do you imagine our society and world would be like right now?

      It's not meant to be a trick question or a "gotcha" question. Society would indeed have been different. Maybe it would be a wonderful Star Trek utopia and we'd have found a win-win for everyone. Or maybe we'd just not be nearly as technically advanced a society as we are now.

    • pelotron 46 minutes ago
      We shouldn't be surprised people have a negative view of AI when Altman et al. have stated on stage that the goal is to replace everyone.
      • bcjdjsndon 4 minutes ago
        Because it's not even logically possible, let alone practically possible.
    • ryanackley 48 minutes ago
      It's bizarre that some of the doomsayers are AI stakeholders. It's like they don't realize that most people don't have net worth in the 7-8 figures.

      I console myself with the fact that without a functioning economy, AI will implode since capital will dry up. Then all of the investment in data centers, R&D, etc. will never be recovered. Then we'll be back to rational thinking? Maybe?

      • mrhottakes 34 minutes ago
        They realize it, and they don't care.
      • dlev_pika 27 minutes ago
        Yeah, but it doesn't implode all at once; it's not distributed evenly.

        Something like over half of US consumption is done by the top 10%, or something insane like that. This leads me to believe that a lot more people will eat shit before enough feel real pain.

    • variadix 27 minutes ago
      The consumer economy only exists to extract value from common people and funnel it up the wealth ladder. If robots and AI take over all the production, you don’t need a consumer economy, the robots produce and their output directly goes to the top. The rest of us are left to starve.
    • gypsy_boots 34 minutes ago
      Take any econ 101 course and you'll realize that this isn't a factor in the capitalist system. Capitalism is simply concerned with maximizing profit and, in this case, returning shareholder value. It's simply not in the purview of the system to think about what happens when you completely get rid of your labor force.

      Put another way, the future of labor might look the way it did for laborers over 100 years ago, before major industries unionized: making 'Amazon bucks' that can only be redeemed at the 'Amazon company store'.

    • hamdingers 39 minutes ago
      Fully automated luxury space fascism doesn't really need buyers. A risk of high automation/post-scarcity is that abundance exists but remains under the control of people who are not interested in justice or equality or freedom. Lots of people feel that describes the leadership of most AI tech companies.

      If they don't need your labor, and they don't need you as a customer, and they don't care about you as a person... where does that leave you?

      [To be clear, I think post-scarcity, even in knowledge work, is a lot further off than most AI doomsayers, or the AI worshipers who take statements from people like Altman and Musk at face value, seem to believe.]

    • lbrito 37 minutes ago
      The answer to every question: Agents, of course! With GPU-collateralized credit or some other idiocy.
    • actionfromafar 52 minutes ago
      The robots will tell you what to do, you will own nothing, and you will be happy. I think that is the plan?
    • bluecheese452 50 minutes ago
      If you have a magic robot that builds everything you want you don’t need anyone to buy anything.

      Jfc, this site is the worst. Use your words instead of drive-by downvoting.

      • pelotron 3 minutes ago
        Then all you have to do is magically convince the owners of the magic robot to give all their products away for free.
      • lorecore 48 minutes ago
        Where do the raw materials for the thing it's building (or the robot itself for that matter) come from?
        • bluecheese452 45 minutes ago
          From the earth. Maybe in the future space.
          • lorecore 25 minutes ago
            Finite natural resources are, by their very nature, limited.
            • bluecheese452 18 minutes ago
              Yes, finite things are finite. Glad we cleared that up.
              • lorecore 17 minutes ago
                Finite means not free. Who will pay?
  • aspenmartin 28 minutes ago
    Ignoring CEO predictions, can anyone point to any major revolutionary technology that had a net negative impact on quality of life and employment statistics? AI is an incredibly powerful technological shift in our way of life, but where is the net employment hit taking place? Unemployment numbers remain stable. Revolutions like this do create widening inequality while also increasing long-run productivity. Yes, inequality rises, but what you should care about is your quality of life, and that will also improve over time. There will be suffering during the transition, and many won't fare well, but this happens during every major revolution: electricity, the internet, etc. So why do people treat AI like it's a uniquely damaging phenomenon?
    • stravant 25 minutes ago
      Don't ignore the transient.

      The industrial revolution created a hell on Earth for workers for the better part of a century.

    • jmcqk6 17 minutes ago
      > There will be suffering during transition and there will be many that don’t fare well

      Yeah, and the people suffering are not going to like that. If people are afraid of being in that group, then they will not be very happy about it.

      If you put yourself in the shoes of someone suffering from AI, how comforting do you think your observations here are?

    • mrhottakes 24 minutes ago
      You are confusing the macro view with the personal. Try losing your job and being told "don't worry, your quality of life will improve over time!" Would you respond positively?
    • bediger4000 3 minutes ago
      We treat it as uniquely damaging, because it's touted as uniquely enhancing, or even uniquely revolutionary. "AI is not just any tool" is often the response to some asking to exercise choice in the tools they use. "You don't have to use AI in your work, but if you don't, you'll work for someone who does".
    • watwut 15 minutes ago
      The Industrial Revolution made many people's lives hell. In the long term we gained, but only after those people went through periods of violence and struggle.

      > AI is an incredibly powerful technological shift in our way of life but where is the net employment hit taking place? Unemployment numbers remain stable.

      At this point, a lot of AI is hot air rather than a powerful shift in our way of life.

  • thrill 6 minutes ago
    Plenty of people certainly seem to have desires to make it ugly.
  • simianwords 23 minutes ago
    1. If AI is like other technologies, there will be job displacement and temporary upheaval, after which new jobs will be created and prosperity will increase; this is by far the only good way we know of to increase prosperity

    2. If AI is so good that it is a proper superset of humans and can do all jobs humans can do, this is a huge deal and we don’t even have the vocabulary to express what would happen

    I don’t foresee a third option.

    • throwaway-11-1 15 minutes ago
      The fact that no one can actually describe these "new jobs" makes #2 appear increasingly probable.
      • simianwords 7 minutes ago
        In 1989 you couldn't have predicted that there would be an SDE 6 at Google working on ad tech, using Google Spanner, and optimizing for SEO.

        That's why you can't say now what jobs will be created.

  • otikik 48 minutes ago
    My main problem is this:

    - If (big If) AI actually replaces workers, then we have a problem, because lots of folk will lose their jobs

    - If AI doesn't replace workers, then we have a recession, because a lot of the US economy now sits on top of corporations betting on it. That will tank the economy and lots of folk will lose their jobs

    It feels like the only path forward is a narrow one where AI removes some jobs, but not too many, yet still enough that the (immense, disproportionate) hype placed on it doesn't come back with a vengeance and bring the house of cards down.

    • Yizahi 1 minute ago
      Basically, the whole point of St. Sam's existence is to make sure that his corporation is too big to fail. So either he strikes gold, or he is bailed out with taxpayer money à la "investments".
    • chezelenkoooo 44 minutes ago
      I don't think it's an either/or. The current AI models, as they are, improve productivity quite a bit. They're just super expensive, but the expense is being subsidized so it appears reasonable.

      An alternative possibility is that the models become much cheaper and their use becomes more ubiquitous which would be helpful.

    • phpnode 28 minutes ago
      If your competitor is cutting jobs because of AI you can either race them to the bottom or you can use the humans you already have to leverage AI to expand your product offering, become more competitive, tackle more work, deliver better quality results etc. I don't see a world where AI does the work and humans sit around poor and idle.
    • lbrito 36 minutes ago
      It's probably a combination of both. Also, the "lots of folk" in the two scenarios are probably different orders of magnitude.
    • camillomiller 45 minutes ago
      If either case happens, you can be sure the people most responsible will be the least affected. That is why this can get ugly. Honestly, they deserve for it to get ugly. We can't keep giving the Musks, Zuckerbergs, Altmans and co. the benefit of the doubt.
    • empath75 35 minutes ago
      There is a third and, to me, so much more likely outcome that it's not even worth talking about the other two: AI makes workers more productive and unlocks more economic activity.
    • naasking 40 minutes ago
      Fixed pie fallacy.
  • dlev_pika 35 minutes ago
    Let’s hope so