• Bloomcole@lemmy.world · 1 month ago

        Could be, but it depends: inbound helpdesk work is not the same as outbound sales, with targets to meet and clients to convince.

    • TheRealKuni@lemmy.world · 1 month ago

      Having worked in a call center (doing survey research) during college, there are a lot of people employed by such places who really wouldn’t have many employment options anywhere else.

      I remember saying, while there, that the entire industry would be replaced by AI in 10-15 years. They all scoffed, saying they had ways to get people to answer surveys that an AI wouldn’t be able to do. I told them they were being naive.

      Here we are.

      That said, I do worry about some of those people. Just because they were borderline unemployable doesn’t mean they were worthless.

      • AA5B@lemmy.world · 1 month ago

        There was a lot of talk about that when the call centers were sprouting up: generally poor jobs, minimum wage, and liable to be outsourced or AI'd. They were generally put in places where there were no real alternatives, so those towns are going to suffer when it all goes away.

      • Bloomcole@lemmy.world · 1 month ago (edited)

        doesn’t mean they were worthless

        Not what I said, on the contrary.
        It's a horrible, mind-numbing job, and everyone deserves better.
        The average length of employment is 6 months.
        Some don't make their targets and get fired; most find a less shitty job.

        • TheRealKuni@lemmy.world · 1 month ago

          Oh don’t worry, I wasn’t accusing you of saying they were worthless. I was just voicing my own concern for some of my former coworkers.

    • Deflated0ne@lemmy.world · 1 month ago

      Isn’t that what we call “Innovation” in our capitalist society?

      You build a thing. Pour your blood, sweat, and tears into it. Some VC goon buys it during a downturn. They fire most of the staff. Strip the copper out of the walls. Make the service shittier and shittier until all that's left is its faltering brand recognition, then sell it all for a bundle to the very next sucker they can find.

      • RememberTheApollo_@lemmy.world · 1 month ago

        Innovation is enshittification these days. It used to be invention, where entirely new products and materials came about. Then there was innovation: incremental improvement coupled with price hikes. Now “innovation” seems to be strictly rearranging deck chairs, with worse service and fewer employees for increased profits.

        • MangoCats@feddit.it · 1 month ago

          In the 90s it was “selling it for parts” where the market value of the whole company was lower than the component parts, so buy it on the open market for a bargain, then slice and dice and profit.

          These days, they’re squeezing the lemons for all they can get.

          • RememberTheApollo_@lemmy.world · 1 month ago

            The “corporate raider” existed before that, infamously thanks to people like Frank Lorenzo dismantling Eastern Airlines in the ‘80s, or Carl Icahn doing the same to TWA. The late ‘70s and early ‘80s were rife with corporate raiders.

    • Initiateofthevoid@lemmy.dbzer0.com · 1 month ago

      The idea of AI accounting is so fucking funny to me. The problem is right in the name. They account for stuff. Accountants account for where stuff came from and where stuff went.

      Machine learning algorithms are black boxes that can’t show their work. They can absolutely do things like detect fraud and waste by detecting abnormalities in the data, but they absolutely can’t do things like prove an absence of fraud and waste.
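      The anomaly-detection half is real enough. A minimal sketch of the idea, with invented payment data and an arbitrary z-score threshold (an illustration of the general technique, not any specific fraud tool):

```python
# Flag transactions whose amounts deviate strongly from the mean (z-score).
# The payment data and the 2.5-sigma threshold are invented for illustration.

def flag_anomalies(amounts, threshold=2.5):
    n = len(amounts)
    mean = sum(amounts) / n
    std = (sum((x - mean) ** 2 for x in amounts) / n) ** 0.5
    if std == 0:
        return []  # all values identical: nothing to flag
    return [x for x in amounts if abs(x - mean) / std > threshold]

payments = [120.0, 95.5, 110.0, 102.3, 98.7, 105.0, 9800.0, 101.2, 99.9, 103.4]
print(flag_anomalies(payments))  # the 9800.0 payment stands out
```

      Flagging the outlier is the easy direction; nothing in that output proves the other nine payments are clean, which is the point above.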

      • vivendi@programming.dev · 1 month ago

        For usage like that you’d wire an LLM into a tool-use workflow with whatever accounting software you have. The LLM would make queries to the rigid, non-hallucinating accounting system.

        I still don’t think it would be anywhere close to a good idea: you’d need a lot of safeguards, and if it fucks up your accounting you’ll have some unpleasant meetings with the local equivalent of the IRS.
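        A toy sketch of that wiring; `fake_llm_plan`, `run_ledger_query`, and the ledger data are hypothetical stand-ins, not a real model or accounting API:

```python
# Tool-use sketch: the "LLM" only emits a structured query; the deterministic
# ledger code computes the answer, so the arithmetic itself can't be
# hallucinated. Everything here is a made-up stand-in for illustration.

LEDGER = {"2024-Q1": 10500.0, "2024-Q2": 9800.0}

def fake_llm_plan(question: str) -> dict:
    # Stand-in for the model call. This translation step is where
    # hallucination can happen, hence schema checks and allow-lists.
    return {"tool": "ledger_query", "period": "2024-Q1"}

def run_ledger_query(query: dict) -> float:
    # The rigid, non-hallucinating part: a plain lookup.
    return LEDGER[query["period"]]

plan = fake_llm_plan("How much revenue did we book in Q1 2024?")
assert plan["tool"] == "ledger_query"  # safeguard: only whitelisted tools run
print(run_ledger_query(plan))  # 10500.0
```

        The safeguards live at the plan-validation step, not in the ledger lookup itself.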

        • pinball_wizard@lemmy.zip · 1 month ago

          The LLM would make queries to the rigid, non-hallucinating accounting system.

          And then sometimes adds a hallucination before returning an answer, particularly when it encounters anything it wasn’t trained on, like the important moments when business leaders should be taking a closer look.

          There’s not enough popcorn in the world for the shitshow that is coming.

          • vivendi@programming.dev · 1 month ago

            You’re misunderstanding tool use. The LLM only issues the request; the actual system executes it and returns the result. You can also have the LLM summarize the result, and hallucinations in that workload are remarkably low (though without tuning it can drop important information from the response).

            The place where it can hallucinate is in generating the steps for your natural-language query, the entry stage. That’s why you need to safeguard like your ass depends on it. (Which it does, if your boss is stupid enough.)

    • Korhaka@sopuli.xyz · 1 month ago

      How easy will it be to fool the AI into getting the company in legal trouble? Oh well.

    • vivendi@programming.dev · 1 month ago

      This is because autoregressive LLMs work on high-level “tokens,” not characters. There are experimental LLMs that can access byte-level information and correctly answer such questions.

      Also, they don’t want to support you, omegalul. Do you really think call centers are hired to give a fuck about you? This is intentional.

      • Repple (she/her)@lemmy.world · 1 month ago

        I don’t think that’s the full explanation, though, because there are examples of models that will correctly spell out the word first (i.e., they know the component letters) and still miscount the letters after doing so.

        • vivendi@programming.dev · 1 month ago

          No, this literally is the explanation. The model understands the concept of “strawberry”. It can output it (and that itself is very complicated) in English as “strawberry”, in Persian as توت فرنگی, and so on.

          But the model does not understand how many Rs exist in “strawberry” or how many ت exist in توت فرنگی.

          • Repple (she/her)@lemmy.world · 1 month ago (edited)

            I’m talking about models printing out the component letters first, not just printing out the full word. As in “S - T - R - A - W - B - E - R - R - Y”, then getting the answer wrong. You’re absolutely right that it reads in words at a time, encoded as vectors, but if it’s holding a relationship from that encoding to the component spelling, which it seems it must be, given that it outputs the letters individually, then something else is wrong. I’m not saying all models fail this way, and I’m sure many fail in exactly the way you describe, but I have seen this failure mode (which is what I was trying to describe), and in that case an alternate explanation would be necessary.

            • vivendi@programming.dev · 1 month ago (edited)

              The model ISN’T outputting the letters individually; byte-level models (as I mentioned) do that, not standard transformers.

              The model output is more like a sequence of opaque tokens accumulating over time:

              <S-T-R>
              <S-T-R><A-W-B>
              <S-T-R><A-W-B><E-R-R-Y>

              Tokens can be a letter, part of a word, any single lexeme, a word, or even multiple words (“let be”). The model doesn’t know the underlying letters of the previous tokens, and the process only moves forward in time.
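              A toy illustration of that gap; the token split and the IDs below are invented, not from any real tokenizer vocabulary:

```python
# Toy illustration (not a real tokenizer): a token-level model receives
# opaque token IDs, not characters, so "how many r's?" isn't answerable
# at the level it operates on.

vocab = {"straw": 3504, "berry": 19772}  # made-up IDs, purely illustrative
tokens = ["straw", "berry"]              # a plausible, invented split
token_ids = [vocab[t] for t in tokens]   # all a token-level model "sees"

# With access to the characters, the count is trivial:
word = "".join(tokens)
print(word.count("r"))  # 3
```

              The count is easy once you have characters; the point is that the IDs on their own don’t expose them.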

  • arrakark@lemmy.ca · 1 month ago

    LOL. If you have to buy your customers to get them to use your product, maybe you aren’t offering a good product to begin with.

    • Jesus@lemmy.world · 1 month ago

      There is another major reason to do it. Businesses are often in multi year contracts with call center solutions, and a lot of call center solutions have technical integrations with a business’ internal tooling.

      Swapping out a solution requires time and effort for a lot of businesses. If you’re selling a business on an entirely new vendor, you have to have a sales team hunting for businesses that are at a contract renewal period, you have to lure them with professional services to help with implementation, etc.

    • venusaur@lemmy.world · 1 month ago

      Plenty of good, non-AI technologies out there that businesses are just slow to adopt, or simply don’t have the budget for.

    • dantheclamman@lemmy.world (OP) · 1 month ago

      That stood out to me too. This is effectively the investor class coercing the use of AI, rather than the ground-up adoption that has driven tech in the past.

      • Jimmycakes@lemmy.world · 1 month ago (edited)

        That’s not what this is. They find profitable businesses, replace employees with AI, and pocket the spread. They aren’t selling the AI.

        • MintyFresh@lemmy.world · 1 month ago

          They’re rent-seeking douchebags who don’t add value to shit. If there was ever an advertisement for full-on vodka-and-cigarettes-for-breakfast bolshevism, it’s these assholes.

        • Aceticon@lemmy.dbzer0.com · 1 month ago

          It only works until the accumulated costs of AI use exceed the savings from cutting manpower. AI error rates are excessively high and uniformly distributed: the most damaging errors are no less likely than little mistakes, unlike with humans, who can learn to pay extra attention to the critical things. That leads to customer losses and rising costs of correcting the errors.

          (Just imagine customers severely damaging their equipment because they followed the AI customer support line’s advice, then taking the company whose support line gave that advice to court for damages and winning, and in turn the companies that outsourced their customer support to that “call center supplier” taking it to court. It gets even worse for accounting: the fines for submitting incorrect documentation to the IRS can get pretty nasty.)

          I expect we’ll see something similar to what happened to many long-established store chains: at some point their managers started cutting costs by replacing long-time store employees with a revolving door of short-term, cheap-as-possible sellers, making the store experience inferior to just buying from the Internet, and a few years later those chains were going bankrupt.

          The venture capitalists’ grift works only as long as they sell the businesses before the side effects of replacing people with language generators have fully filtered through into falling revenue, court judgements for damages, and tax-authority fines. It’s going to be the buyers of those businesses (I bet the venture capitalists will try to sell them to institutional investors) who end up with something that’s leaking customers, paying massive compensation, and hiring people back to fix the consequences of AI errors, essentially reverting what the venture capitalists did while spending even more money to clean up the trail of problems caused by excessive AI use.

          • CommanderCloon@lemmy.ml · 1 month ago

            They’re VCs, they’re not here for the long run: they’ll replace the employees with AI, make record profits for a quarter, and sell their shares and leave before problems make themselves too noticeable to ignore. They don’t care about these companies, and especially not about the people working there

            • SocialMediaRefugee@lemmy.world · 1 month ago

              Better yet, they buy a company, take a loan out against the company, pocket the cash and then leave the struggling company with the extra debt. When it dies they leave the scraps to be sold and employees and others owed money are left out to dry.

            • tibi@lemmy.world · 1 month ago

              And when the economy goes bust, they will ask their friends in the White House for a bailout.

  • otacon239@lemmy.world · 1 month ago

    I am so glad I got out of IT before AI hit. I don’t know how I would have handled customer calls asking why our chat is telling them their shit works when it doesn’t or to cover their computer in cooking oils or whatever.

    And only after they banged their head against the AI for two hours and are already pissed will they reach someone. No thanks.

    Thank god I can troubleshoot on my own.

    • tauisgod@lemmy.world · 1 month ago

      When VC and PE call a company or industry “mature” it means they don’t see increasing revenue, only something to be sucked dry and sold for parts. To them, consistent revenue is worthless, it must be skyrocketing or nothing. If you want to see this in action right now, look what Broadcom is doing to VMWare. They also saw VMWare as a “mature company”.

  • Archangel@lemm.ee · 1 month ago

    Doesn’t this seem a little “forced”? This just seems like implementing AI wherever possible, regardless of demand.

  • Optional@lemmy.world · 1 month ago

    “What if we threw a ton of money after the absolute shit ton of money we threw away?”

    • Bakkoda@sh.itjust.works · 1 month ago

      Shareholder value? Capital gains? Golden parachute? These are all great things if you belong to the owner class.

    • MangoCats@feddit.it · 1 month ago

      The movie Outsourced (2006) didn’t foretell AI, but it did a pretty good job foretelling how the offshoring trend was going to unfold.

        • Markovchain@lemmy.world · 1 month ago

          I liked the first half of the film, but it abruptly turns into a different movie. The second half isn’t bad, but it’s not what I wanted and it’s not what was advertised in the trailers and marketing.