Hard to believe it’s been 24 years since Y2K (2000). It feels like we’ve come such a long way, but this decade started off very poorly with one of the worst pandemics the modern world has ever seen, and technology in general is looking bleak in several ways.

I’m a PC gamer, and it looks like things are stagnating massively in our space. So many gaming companies are incapable of putting out a successful AAA title, because people are either too poor or don’t want another live-service AAA disaster like nearly every one released lately; Call of Duty, Battlefield, and almost anything Electronic Arts or Ubisoft puts out either fails outright or undersells. So many gaming studios have been shuttered or are being shuttered, and Microsoft is basically one member of an oligopoly with Sony and a couple of other companies.

Hardware is stagnating. Nvidia is putting on the brakes for its next line of GPUs; we’re not going to see huge gains in performance anymore because AMD hasn’t caught up yet, so Nvidia has no reason to innovate. They’ll just sell the top cards of their next line for $1,500 a pop with a 10% increase in performance rather than the 50-60% we really need. We still don’t have the capability to play games in full native 4K at 144 Hz. That’s at least a decade away.

Virtual reality is on the verge of collapse. Meta is basically the only real player in that space; between them and the Valve Index it’s nearly a monopoly. Pico from China is on the verge of developing something incredible as well, and Apple just revealed a mixed reality headset, but the price is so extraordinary that barely anyone has it, so use isn’t very widespread. We’re again a decade away from seeing anything really substantial in terms of performance.

Artificial intelligence is really, really fucking things up in general, and the discussions about AI look almost as bad as the news about the latest election in the USA. It’s so clowny and over-the-top hearing any news about AI. The latest news is that OpenAI is going to go from a non-profit to a for-profit company, after they promised they were operating for the good of humanity and broke countless laws stealing copyrighted material, supposedly for the public good. Now they’re just going to snap their fingers and morph into a for-profit company. So they can basically steal anything copyrighted while claiming it’s for the public good, then swap to a for-profit model whenever they like. It doesn’t make any sense, and it looks like they’re going to be a vessel for widespread economic poverty…

It just seems like a lot of bubbles are about to burst all at the same time. I don’t see how things could possibly get better for a while now.

  • Blackmist@feddit.uk · 8 months ago

    COVID also inflated a lot of tech stock massively, as everybody suddenly had to rely a lot more on it to get anything done, and the only thing you could do for entertainment was gaming, streaming movies, or industrial quantities of drugs.

    Then that ended, and they all wanted to hold onto that “value”.

    It is a bubble, but whether it pops massively like in 2000, or just evens off to the point where everything else catches up, remains to be seen.

    “The markets can remain irrational longer than you can remain solvent” are wise words for anyone thinking of shorting this kind of thing.

    • Buttflapper@lemmy.worldOP · 8 months ago

Shows that you are in the UK. Just want to clarify I’m talking specifically about the USA, but I agree with everything you said. Tech stocks became so inflated! I don’t know if people are seeing it in Europe, but here in the USA there is this really toxic, cringeworthy push from tech companies to get people back to the office. They can force people to return to office across the country; basically you have to relocate and upend your entire life, which could cost you $50,000 that they’re not paying for, and if you don’t, you get fired. It’s an easy way to start laying people off without having to pay them anything, because you can call it insubordination when they refuse to return to office. Now they supposedly have cause to get rid of people or deny them promotions for more money. IBM, for example, is doing this right now, and Cisco was doing it as well, one of the biggest networking software companies in the market. Scumbag behavior.

  • madjo@feddit.nl · 8 months ago

    We still don’t have the capability to play games in full native 4K 144 Hertz.

    And we really don’t need that. Gameplay is still more important than game resolution. Most gamers don’t even have hardware that would allow that type of resolution.

    • XIIIesq@lemmy.world · 8 months ago (edited)

      I remember when running counter strike at 30fps on a 480p monitor meant you had a good computer.

      Modern graphics are amazing, but they’re simply not required to have a good gaming experience.

    • Buttflapper@lemmy.worldOP · 8 months ago

      Gameplay is still more important than game resolution

In your opinion*. You forgot that part. For lots of people, graphics are way more important because they want a beautiful and immersive experience, and they are not wrong to want that. I respect that you feel the way you do, but I also respect others who care more about graphics. I’ll even go so far as to say I’m of the same mind as you: I don’t care much about graphics at all, but there are some games whose graphics or visual effects have truly wowed me. Two that come to mind are Ori and the Will of the Wisps and No Man’s Sky: two very different games, but absolutely crazy visual effects and graphics on high-end computers. Another game I play a lot is World of Warcraft. The gameplay is so damn fun, but it’s hard to get any of my friends to play it because it’s so ugly; it looks like a poorly rendered PS3 game. That horrible graphics quality prevents people from even trying it.

      Most gamers don’t even have hardware that would allow that type of resolution.

This is because they refuse to innovate. Think of the DVD player. You think a DVD player costs a lot today? Of course not; there are a million of them and no one wants them anymore. If GPU makers actually innovated and created drastic leaps in technology, then older technology would be cheaper. It’s not expensive to go out and get an RTX 2080, which is the graphics card I currently have; it’s about $250-300 now, a pretty damn solid card. If they actually innovated and kept pushing the limits, technology would accelerate faster. Instead they want the inverse: the slowest growth in technology feasibly possible, the maximum amount of time per innovation, the maximum amount of revenue, and a maximized impact on the environment from all the carbon emissions and the waste of graphics cards being thrown out.

      • madjo@feddit.nl · 8 months ago

You can have the most realistic graphics in the world, pushing your AMViditel RTX 5095Ti Plus Platinum Ultra with 64TB of VRAM to its absolute maximum, but if the gameplay sucks, you won’t have as much fun as you would with a pixel art indie game with lots of fun gameplay.

      • HobbitFoot @thelemmy.club · 8 months ago

If graphics were worth it, people would pay for it.

The fact of the matter is that exponential gains in graphics capability require an exponential input of developer and asset-creator budget. Given that there is a ceiling on game prices, it isn’t worth going for higher-fidelity games when the market isn’t going to pay for it.
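        To make that concrete, here’s a toy margin model; every number in it is invented purely for illustration, not real budget or sales data:

```python
# Toy model: if fidelity costs grow exponentially while the price per copy
# is capped, margins eventually go negative. Every number here is invented
# purely for illustration.
PRICE_CEILING = 70            # assumed price the market will bear per copy (USD)
COPIES_SOLD = 5_000_000       # assumed flat sales regardless of fidelity

for tier in range(1, 7):
    dev_cost = 20_000_000 * 2 ** (tier - 1)   # budget doubles per fidelity tier
    margin = PRICE_CEILING * COPIES_SOLD - dev_cost
    print(f"fidelity tier {tier}: margin ${margin / 1e6:+,.0f}M")
```

        With these made-up numbers, revenue is fixed at $350M while cost doubles per tier, so the margin flips negative by tier 6; the exact crossover depends entirely on the assumed figures.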

  • scarabic@lemmy.world · 8 months ago

    OP, when you say AI is really really fucking things up, what do you have in mind? Setting aside the ludicrous things people say about AI, do you see it directly fucking something up? I’m just curious what is on your mind when you say that.

    • Diva (she/her)@lemmy.ml · 8 months ago

      To me it’s seeing the Nvidia stock price in the same sort of range as Cisco stock prices were in the dot com bubble- I don’t have any confidence they’re going to reach the promised land of profitability this time either.

        • Diva (she/her)@lemmy.ml · 8 months ago (edited)

Pretty sure Cisco was too; they’re supplying hardware to the bubble. I think you misunderstood my point: I don’t think AI is going to be profitable in the long term, but it’s probably very profitable to sell the hardware for it to people with investor money, and their stock price reflects that.

          • AA5B@lemmy.world · 8 months ago (edited)

Just like always, it depends on how you define or redefine AI. For example, what used to be called AI has been very successful in photo processing. The same thing is going to happen: some portion or incarnation of the current generative AI will be successful, but it will be dismissed with something like “it’s just machine learning, not AI.”

I have a lot of hope for Apple’s approach, where they are incorporating it as tools for specific capabilities and prioritizing privacy. While there’s no direct profit, it should help sell a lot more devices with ever higher tech specs. I also like their “private cloud” model, which has a lot of potential beyond private AI.

            • Diva (she/her)@lemmy.ml · 8 months ago

That’s pretty much where I see a lot of this ending: there’s a wide variety of useful applications, but they’re hard to capitalize on, especially the ones that are self-contained and not phoning home to some server you need to maintain access to for billing purposes.

  • tibi@lemmy.world · 8 months ago

Also, the movie industry is struggling for many reasons. Movies are getting too expensive, the safe formulas big studios relied on aren’t working anymore, and customer habits are changing, with people going to movie theaters less.

    At the same time, just like with video games, the indie world is in a golden age. You can get amazing cameras and equipment for quite a small budget. What free software like Blender can achieve is amazing. And learning is easier than ever, there are so many free educational resources online.

    • linearchaos@lemmy.world · 8 months ago

The entire entertainment industry is floundering. Wages are lagging inflation in many sectors, and people are paying significantly more to eat. They’re going to cut back on streaming services and on going out to the movies. We’re right at a crossroads where the only thing that makes sense is to give people a little more value for the money; instead the industry is going to pull every fast trick it can to make more from advertising and gambling.

      • HobbitFoot @thelemmy.club · 8 months ago

Or you had several companies try to start their own streaming services from scratch and think they needed a ton of new shows to fill them. Disney+ could have easily gotten away with archived Disney Channel shows, all the animated Disney cartoons, the old Star Wars and Marvel movies, and The Simpsons. It didn’t need a lot of the new shows, no matter how cool they looked.

    • Buttflapper@lemmy.worldOP · 8 months ago

I wouldn’t say the movie industry is struggling; I would say that people who work for a living are struggling. Actors are still getting paid huge sums of money, and so are directors and producers. They are getting their pound of flesh one way or another. They are just not producing anything that people want to watch. For example, all this post-Infinity War Marvel bullshit: no one wants to see that. No one cares about Marvel or Disney anything right now; it’s low-quality drivel. But Beetlejuice, Barbie, Oppenheimer… these are proof that people do still want to see movies; the studios just don’t want to produce anything meaningful.

The people I’m talking about who are struggling, however, are the supporting roles: the people doing the filming, set dressing, makeup, and special effects. Lots of these lower-level supporting roles pay almost nothing compared to the cost of living in California, while some of the main actors can get tens of millions.

      • CheeseNoodle@lemmy.world · 8 months ago

I’ve just been watching older movies. There’s this amazing sweet spot, when CGI had just become a thing, where the visual effects are passable but not so prevalent that the entire plot gets replaced with pointless explosions.

      • tibi@lemmy.world · 8 months ago

Just like AAA game studios, movie studios don’t want to take risks, so they go with productions they consider “safe”: aim for the lowest common denominator, play into nostalgia, and don’t upset anyone by touching subjects like politics or religion. And you end up with the garbage they are making right now.

  • sukotai@lemmy.world · 8 months ago

it’s time for you to play PAC-MAN, as I did when I was young 😂
no AI, no GPU, no shitcoin: you just have to eat ghosts, which is very strange in fact when you think about it 🤪

    • emax_gomax@lemmy.world · 8 months ago

Correction: the ghosts are AI, and based on how many times they killed me, clearly a step above anything mainstream today (º ロ º๑).

  • magic_lobster_party@fedia.io · 8 months ago

What’s happening is that support from VC money is drying up. Tech companies have for a long time survived on the promise that they will eventually be much more profitable in the future. It doesn’t matter if they’re not profitable today; they will be in the future.

    Now we’re in a period where there’s more pressure on tech companies to be profitable today. That’s why they’re going for such anti consumer behaviors. They want to make more with less.

    I’m not sure if there’s a bubble bursting. It could just be a plateau.

    • XIIIesq@lemmy.world · 8 months ago

      I agree. Smartphones, for example, have hardly changed at all over the last ten years, but you don’t see Apple and Samsung going out of business.

      • barsoap@lemm.ee · 8 months ago

        And it would be so easy to make a big splash in the market by having a phone where the camera doesn’t protrude out of the back.

          • Buttflapper@lemmy.worldOP · 8 months ago

Damn, that’s wild. Any business with spikes of profit and loss that drastic can’t possibly be sustainable. I can’t see how it could be. Look at the automobile giants in the USA: all it took was one major economic event to bankrupt them, and they got bailed out, which should never have happened. It’s bullshit.

      • AA5B@lemmy.world · 8 months ago

I understand that you don’t appreciate where we’ve come from and how fast, and that you can’t see the year-to-year changes, but the iPhone is just a little over ten years old. Do you really not see huge changes between an early iPhone and today’s?

        • XIIIesq@lemmy.world · 8 months ago (edited)

On the contrary, I absolutely appreciate it. I was about 15 when mobile phones first became something everyone owned, so I’ve lived through the entire progression, from when they were something only a well-to-do businessman would have all the way through to today. The first iPhone was 2007, 17 years ago, btw.

When mobile phones became popular, each new generation saw HUGE improvements and innovation. The last ten years, however, have pretty much just been slight improvements to screen/camera/memory/CPU. Form-wise and functionally, phones are very similar to those of ten years ago.

          I understand that some technophiles will always be able to justify why the new iPhone is worth £1600 and if that’s what they want to spend their money on then good for them, but I personally think that they are kidding themselves. Today you can get a brilliant phone for £300 or even less.

          • AA5B@lemmy.world · 8 months ago (edited)

            I’d never justify that urge to spend ridiculous money updating every year to the latest and greatest, but people tend to under appreciate the massive improvements from accumulated incremental improvements.

The OLED screen on my iPhone X was revolutionary (and I’m sure Android had it first), just as one example, and now most phones have one. Personally I find ultra-wideband and Find My very innovative and well implemented. Or if that’s too small a change, how about the entire revolution of Apple designing their own SoC for every new model? There’s emergency satellite texting, fall/crash detection; even Apple mostly solving phone theft is innovative (even if you don’t like their approach).

            When we see steady improvements, humans tend to under-appreciate how it adds up

  • Dead_or_Alive@lemmy.world · 8 months ago

    The pace of technological change and innovation was always going to slow down this decade. But Covid, Ukraine and a decoupling from Russia/China has further slowed it.

You need three things in abundance to create tech. First, an advanced economy, which narrows down most of the world. Second, lots of capital to burn while you make said advances. Finally, lots of 20- and 30-somethings who will invent and develop the tech.

For the last 20 years we’ve had all of those conditions in the Western world. Boomers were at the height of their earnings potential, and their kids were leaving home in droves, letting them pour money into investments. Low interest rates abounded because capital was looking for places to be utilized. China was the workshop of the world, building low- to mid-range stuff, allowing the West to focus its excess Millennial-age workforce on value-added and tech work.

Now, in the USA, boomers are retiring and there aren’t enough Gen Xers to make up the difference. Millennials are finally getting around to household formation, and their oldest cohorts (Xennials) are just now entering their mid-40s and starting to move up in their careers, but they probably still have kids to support. So it will be some time before capital becomes plentiful again. Gen Z is large, but they aren’t enough to backfill the loss of Millennials.

Oh, and I should highlight that this is a US demographic phenomenon. Europe and Japan do not have large Millennial or Gen Z populations to replace their aging boomers. We have no modern economic model to map out what will happen to them.

China is going through a demographic collapse worse than what you see in Europe or Japan, only they aren’t rich enough to compensate. Add in the fact that they decided to antagonize their largest trading partners in the West, causing the decoupling we are now seeing.

The loss of their labor means the West has to reshore or find alternative low-wage markets for production, and expend a lot of capital to build out plants in those markets.

Add on top the geopolitical instability of Ukraine, and you have a recipe for slower tech growth.

  • Telorand@reddthat.com · 8 months ago

    I’m a PC gamer, and it looks like things are stagnating massively in our space.

    I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.

    Overall, I don’t see things the way you see them. I recommend taking a break from social media, go for a walk, play games you like, and fuck the trajectory of tech companies.

    Live your life, and take a break from the doomsaying.

    • scarabic@lemmy.world · 8 months ago (edited)

Gaming now is more amazing than ever, in part because we have access to classic games too. If someone thinks gaming was amazing 10 years ago, cool. We still have those games! I’m playing a really old game right now myself and loving it.

I think OP misunderstands this whole bubble-bursting thing. When a phenomenon passes out of its early explosive growth phase and settles into more of a steady state, that’s not the bubble bursting; that’s maturity.

      Tech as a whole is now a more mature industry. Companies are expected to make money, not revolutionize the world. OP would have us believe this means that tech is over. How does the saying go? It’s not the beginning of the end, but it is perhaps the end of the beginning.

      • frezik@midwest.social · 8 months ago

        Companies are expected to make money, not revolutionize the world

        I’d like to believe that, but I don’t think investors have caught on yet. That’s where the day of reckoning will come.

AI is a field that’s gone through boom and bust cycles before. The 1960s were a boom era for the field, and the money largely came from the DoD via DARPA. This was awkward for a lot of the university pre- and post-grads in AI at the time, as they were often part of the anti-war movement. Then the anti-war movement started to win and the public turned against the Vietnam War. This, in turn, caused that DARPA money to dry up, and it wasn’t replaced with anything from elsewhere in the government. That led to an AI winter.

        Just to be clear, I like AI as a field of research. I don’t at all like what capitalism is doing with it. But what did we get from that time of huge AI investment? Some things that can be traced directly back to it are optimizing compilers, virtual memory, Unix, and virtual environments. Computing today would look entirely different without it. We may have eventually invented those things otherwise, but it would have taken much, much longer.

    • Lvxferre [he/him]@mander.xyz · 8 months ago

      I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.

      Amen.

      Indie games might not be flashy, but they’re often made with love and concern about giving you a fun experience. They also lack all those abusive DRM and intrusive anti-cheat systems that A³ games often have.

      • Telorand@reddthat.com · 8 months ago

        And I’ll add on to that, even if every GPU company stops innovating, we’ll still have older cards and hardware to choose from, and the games industry isn’t going to target hardware nobody is buying (effectively pricing themselves out of the market). Indie devs especially tend to have lower hardware requirements for their games, so it’s not like anyone will run out of games to play.

      • Rob Bos@lemmy.ca · 8 months ago

        They also tend to have linux support. Where the AAA companies want to eat the entire mammoth and scorn the scraps, small companies can thrive off of small prey and the offal. :)

          • Lvxferre [he/him]@mander.xyz · 8 months ago

            It’s a great analogy though - Linux users aren’t deemed profitable by the A³ companies, just like offal is unjustly* deemed yucky by your typical person.

*I do love offal though. And writing this comment made me crave chicken livers with garlic and rosemary over sourdough bread. Damn.

            • sugar_in_your_tea@sh.itjust.works · 8 months ago

              Idk, I’ve spent way more on games since Valve came to Linux. I was a Linux user first, and mostly played games on console because I didn’t like rebooting into Windows or fiddling w/ WINE, so if I played games, it’s because it had Linux support (got a ton through Humble Bundle when they were small and scrappy). When Steam came to Linux, I created an account (didn’t have one before) and bought a bunch of games. I bought Rocket League when the Steam Controller and Steam Deck launched (was part of a bundle), and when Proton launched, I bought a ton of Windows games.

              So at least for me, I’ve easily spent 100x what I would’ve spent on video games due to Steam supporting Linux. That said, there are easily 50 other people spending more than me on Windows for every one of me, so I get that Linux isn’t a huge target market. But I will spend more on an indie game if it has native Linux support.

    • EnderMB@lemmy.world · 8 months ago (edited)

      My only fear with the indie gaming industry is that many of them are starting to embrace the churn culture that has led AAA gaming down a dark path.

I would love an app like Blind that allows developers on a game to anonymously call out the grinding culture of game development, alongside practices like firing people before launch and removing workers from the credits. Review games solely on how the devs treated their workers, and we might see some cool correlations between good games and good culture.

      • Telorand@reddthat.com · 8 months ago

        There’s certainly room to grow with regard to workers’ rights. I think you could probably solve at least a few of them if they were covered by a union, and publishers who hire them would have to bargain for good development contract terms.

    • dinckel@lemmy.world · 8 months ago

Genuinely wish more people understood this. I’ve mostly only been playing indie games for the past few years, and it’s by far the best fun I’ve had in gaming. There are a ton of unbelievably creative, unique games out there. Not to mention that 99% of them are a single-purchase experience instead of a cash treadmill.

    • GBU_28@lemm.ee · 8 months ago (edited)

      Hello indie gamer, it’s me, you, from the future.

      I’d like to introduce you to PATIENT indie gaming.

      The only games I play are small team, longer running, well documented, developers are passionate, mods exist, can play on a potato or a steam deck, etc

      Because I’m patient, I don’t ever get preorder, Kickstarter, prealpha disappointed.

I know exactly what I’m getting, I pay once, and boom, I own a great game forever. (You can more often fully download indie games.)

        • GBU_28@lemm.ee · 8 months ago

          Bro I’m from the future you can’t ask me stuff like that, be patient, you’ll figure it out

    • RubberDuck@lemmy.world · 8 months ago

Plenty of good games out there; even in early access I have found some real gems. Just recently Coffee Stain released Satisfactory… a labor of love, and it shows. I recently tried Bellwright; it’s impressive, and so is Manor Lords.

And hardware stagnating also means that people get to learn what it’s all about and optimize for it. The last-gen games on a console are usually better optimized than the first wave of games on that platform. So yeah…

    • Shadywack@lemmy.world · 8 months ago

I love this, and I’ll even one-up it. Let the bubbles burst; this is just a transitional period, a predictable cycle in tech. The dot-com bust was like a holocaust compared to this shit. Everyone who was in the tech scene before Google has an easier time with this. We can comfortably watch FAANG recede, and even be grateful for it. Let it happen.

  • schizo@forum.uncomfortable.business · 8 months ago

    Well, that’s the doomer take.

The rumors are that the 80-series card is 10% faster than the 90-series card from last gen. That’s not a “10%” improvement; assuming the prices are the same, that’s more like a 40% improvement. I think a LOT of people don’t realize how shitty the 4080 was compared to the 4090 and are vastly mis-valuing that rumor.
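    Back-of-the-envelope: the 10% figure is from the rumor above, but the dollar amounts below are placeholder launch prices I made up, not real MSRPs:

```python
# "Value" here = relative performance per dollar. The 10% performance gap
# is the rumored figure; both prices are hypothetical placeholders.
perf_last_90 = 1.00     # last-gen 90-class card as the performance baseline
perf_new_80 = 1.10      # rumored: new 80-class card is 10% faster than that
price_last_90 = 1600    # hypothetical 90-class launch price (USD)
price_new_80 = 1200     # hypothetical 80-class launch price (USD)

value_gain = (perf_new_80 / price_new_80) / (perf_last_90 / price_last_90) - 1
print(f"perf-per-dollar gain: {value_gain:.0%}")  # ~47% with these placeholder prices
```

    Swap in whatever the real prices turn out to be; the point is that a small performance bump at a much lower price tier is a large jump in performance per dollar.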

I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DON’T BOTHER’ take is wrong. My gaming has moved almost entirely to my ROG Ally, and you know what? Shit is just as fun and way more convenient than the 7700X/3080 12GB desktop, even if it’s 1080p low and not 1440p120. If the only thing a game has going for it is ‘ooh, it’s pretty,’ then it’s unlikely to be one of those games people care about in six months.

And anyway, who gives a crap about AAAAAAAAAAAAA games? Indie games are rocking it in every genre you could care to mention, and the higher-budget stuff like BG3 is, well, probably the best RPG since FO:NV (fight me!).

And yes, VR is in a shitty place because nobody gives a crap about it. I’ve got a Rift, Rift S, Quest, and a Quest 2, and you know what? It’s not interesting. It’s a fun toy, but it has zero sticking power, and that’s frankly due to two things:

    1. It’s not a social experience at all.
    2. There’s no budget for the kind of games that would drive adoption, because there’s no adoption to justify spending money on a VR version.

If you could justify spending the kind of money that would lead to a cool VR experience, then yeah, it might be more compelling, but that’s been tried and nobody bought anything. I will say that Beat Saber is great, but one stellar experience will not sell anyone on anything.

    And AI is this year’s crypto which was last year’s whatever and it’s bubbles and VC scams all the way down and pretty much always has been. Tech hops from thing to thing that they go all in on because they can hype it and cash out. Good for them, and be skeptical of shit, but if it sticks it sticks, and if it doesn’t it doesn’t.

    • Trainguyrom@reddthat.com · 8 months ago

      I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DONT BOTHER’ take is wrong.

      Some of the best games I’ve played have graphics that’ll run on a midrange GPU from a decade ago, if not just integrated graphics

      Case in point, this is what I’m playing right now:

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 8 months ago

      The 5080 is rumored to be 10% faster than the 4090, but also to draw about 90% of its power. So while performance takes a normal generational leap, power consumption has gone up to match, leaving a much smaller actual improvement in efficiency.
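      Putting rough numbers on it (everything here is a rumor or an assumption except the 320 W/450 W figures, which are the official 4080/4090 TDPs):

```python
# Rough perf-per-watt comparison of a rumored 5080 against the 4080.
tdp_4080, tdp_4090 = 320.0, 450.0   # official TDPs, watts
perf_4080 = 0.78                    # assumed: 4080 at ~78% of a 4090
perf_5080 = 1.10                    # rumored: 10% over the 4090
power_5080 = 0.90 * tdp_4090        # rumored: 90% of the 4090's power

perf_gain = perf_5080 / perf_4080 - 1                            # ~40% raw perf
eff_gain = (perf_5080 / power_5080) / (perf_4080 / tdp_4080) - 1  # much smaller
print(f"performance: +{perf_gain:.0%}, perf/W: +{eff_gain:.0%}")
```

      Under these assumed inputs the raw performance jump is large, but the perf-per-watt jump is only around a tenth, which is the “smaller actual improvement” being described.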

      • schizo@forum.uncomfortable.business · 8 months ago

        Power consumption numbers like that are expected, though.

        One thing to keep in mind is how big the die is and how many transistors are in a GPU.

        As a direct-ish comparison, there are about 25 billion transistors in a 14900K, and 76 billion in a 4090.

        Big die + lots and lots of transistors = bigly power usage.

        I wouldn’t imagine the 5000-series GPUs are going to be smaller or have fewer transistors, so I’d expect this to be in the ‘die shrink lowers power usage, but more transistors increase power usage’ zone.
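        As a toy model of that zone (all inputs here are assumptions except the 4090’s public 76B transistor count): first-order dynamic power scales with switched capacitance, which tracks transistor count, times voltage squared, times clock.

```python
# P_dyn ~ C * V^2 * f, treating transistor count as a proxy for C.
def rel_power(transistors, voltage, clock):
    return transistors * voltage**2 * clock

base = rel_power(76e9, 1.00, 1.00)     # 4090 baseline (76B transistors, public figure)
nextgen = rel_power(92e9, 0.97, 1.05)  # hypothetical successor: more transistors,
                                       # small node-driven voltage drop, higher clock
print(f"relative power: {nextgen / base:.2f}x")
```

        Even with a node shrink’s voltage drop, piling on transistors and clock pulls total power back up, so flat-to-higher power draw gen over gen is the expected outcome.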

        • Vik@lemmy.world · 8 months ago (edited)

          Conversely, the Apple Silicon products ship huge, expensive dies fabbed on leading TSMC processes which sip power relative to contemporaries. You can have excellent power efficiency on a large die in a specific frequency range, more so than on a smaller die clocked more aggressively.

          • schizo@forum.uncomfortable.business · 8 months ago

            You’re not wrong (and those are freaking enormous dies that have to cost apple a goddamn fortune to make at scale), but like, it also isn’t an Apples-to-Apples comparison.

            nVidia/Intel/AMD have gone down the maximum-performance, fuck-any-heat/noise/power-usage path. They haven’t given a shit about low-power optimizations or invested in designs suited to low-power implementations (an M3 Max will pull ~80W if you flog the crap out of it, so let’s use that number). IMO the wrong choice, but I’m just a computer janitor that uses the things, I don’t design them.

            Apple picked a uarch that was already low power (fun fact: ARM was so low power that the first test chips would run off the board’s standby power and would boot BEFORE they were actually turned on) and then focused in on making it as fast as possible with the least power as possible: the compute cores have come from the mobile side prior to being turned into desktop chips.

            I’m rambling, but: until nVidia and the x86 vendors prioritize power usage over raw performance (which AMD did with Zen 5, and you saw how that spiraled into a fucking PR shit mess), you’re going to get next year’s die shrink with more transistors, using the same power, with slightly better performance. It’s entirely down to design decisions, and frankly, x86 (and to some degree nVidia) has painted itself into a corner by relying on process-node improvements (which are very rapidly going to stop happening) and modest IPC uplifts to stay ahead of everyone else.

            I’m hoping Qualcomm does a good job staying competitive with their ARM stuff, but it’s also Qualcomm and rooting for them feels like cheering on cancer.

            • Vik@lemmy.world · 8 months ago

              This outlines several issues; a key one is outbidding Apple for wafer allocation on leading processes. Apple primarily sells such high-margin products that I suppose they can go full send on huge dies with no sweat. Similarly, the 4090’s asking price was likely directly related to its production cost. A chunky boy with a huge L2$.

              I like the way Mike Clark frames challenges in semiconductor engineering as a balancing act between area, power, frequency, and performance (IPC): a chip that’s twice as fast but twice the size of its predecessor is not considered progress.

              I wish ultra-efficient giga dies were more feasible, but it’s kind of rough when TSMC has been unmatched for so long. I gather Intel’s diverting focus to 18A, and I hope that turns out well for them.

              I’m not sure that ARM as an ISA (or even RISC generally) is inherently more efficient than CISC today, particularly when we look at Qualcomm’s latest efforts at notebooks; it’s more that Apple has extremely proficient designers and benefits significantly from vertical integration.

    • astropenguin5@lemmy.world · 8 months ago

      A little bit of pushback on the VR front: sure, there aren’t many massive publishers driving it forward, but I would wholeheartedly argue that it can very much be a social experience, and it offers things that are damn near impossible to get anywhere else. Three games immediately come to mind:

      VRChat (obviously): literally entirely a social game, with a pretty large community of people making things for it, from character models to worlds, because that’s what drives the game. There is a massive scene of online parties, raves, hangouts, etc. that bring people together across the whole world, in a medium more real than any flat game because of the custom models, the worlds, and the relative abundance of people using full-body tracking to show off, dance, and interact with each other.

      VTOL VR: This is still fairly social in that you can play with friends or people online, but the main draw for me is the level of immersion in flying. You have fully interactable cockpits that you basically use your real hands to operate (depending on your controller/hand tracking), and it’s all pretty realistic. It’s just impossible to have the same level of experience without VR.

      Walkabout mini golf: I was pretty skeptical of this game when my friends wanted to play it, it’s literally just a mini golf sim. The thing is, the ability to play mini golf with friends who live across the country/world is amazing, and the physics of just swinging your controller/hands in the same way as real mini golf is so special.

      It is still quite expensive to get really good gear, and that is definitely the biggest hurdle right now. It may forever be a smaller community due to the space/tech/cost requirements to make the experience truly incredible, but for me, even just on a Quest 2 in my room without a lot of fancy stuff, it is still interesting and something special. A lot of people really do care about VR, and even if it is far fewer than conventional gaming, it should not be entirely discounted. And I personally think that while it probably won’t ever replace flat-screen gaming, it is an entirely different kind of experience and has at least a decent future ahead.

      • schizo@forum.uncomfortable.business · 8 months ago

        Fair points on VR games being fairly social. I was thinking more of the in-person social experience, which still involves some portion of the people present stuffing their faces into a headset and wandering off into their own world.

        IMO, this is something AR/MR could do a great job of making more social, by adding the game to the world rather than taking the person out of the world and into the game. Of course, this also restricts what kinds of games you can do, so it’s probably only a partial solution and/or improvement on the current state of affairs.

        I also agree that it’s way too expensive still, and probably always will be because the market is, as you mentioned, small.

        PCVR is pretty much dead despite its proponents running around declaring that it’s just fine like it’s a Monty Python skit. And the tech for truly untethered headsets is really only owned by a single (awful) company and only because the god-CEO thinks it’s a fun thing to dump money on which means it’s subject to sudden death if he retires/dies/is ousted/has to take time off to molt/has enough shareholder pressure put on him.

        Even then, it’s only on a second generation (the original Quest was… beta, at best) and is expensive enough that you have to really have a reason to be interested rather than it being something you could just add to your gaming options.

        I’d like VR to take off and the experiences to more resemble some of the sci-fi worlds that feature or take place in a virtual world, but honestly, I’ve thought that would be cool for like 20 years now, and we’re only very slightly closer than we were then; we just have smaller headsets and somewhat improved graphics.

  • asdfasdfasdf@lemmy.world · 8 months ago

    I agree. But also add in the movie industry that’s been complete trash for a while now. Not to mention books. I’m not sure if we’ll ever see another Harry Potter level book again, at least in our lifetimes.

    My take is we’ve already left the golden ages of movies, music, and books and probably won’t get another for an extremely long time.

    Video games are going through the same downfall that streaming services brought to movies. Physical media left the movie scene as the standard a while ago; video games took longer. Now it’s going to be all streaming and subscriptions, where you can never own anything.

    Once that happens, enshittification will peak: companies won’t be incentivized to make games good anymore, standards will tank, and people will forget how good things once were.

    • hedgehog@ttrpg.network · 8 months ago

      Not to mention books. I’m not sure if we’ll ever see another Harry Potter level book again, at least in our lifetimes.

      Are you talking quality or popularity? Because there are many, many books that are just as good or better than Harry Potter.

    • schizo@forum.uncomfortable.business · 8 months ago

      movie industry that’s been complete trash for a while now.

      This is not a callout of you in particular, so don’t get offended, but that’s really only true if you look at the trash coming out of Hollywood.

      There’s some spectacularly good shit coming out of like France and South Korea (depending on what genres you’re a fan of, anyways), as well as like, everywhere else.

      Shitty movies that are just shitty sequels to something that wasn’t very good (or yet another fucking Marvel movie) are a self-inflicted wound, and not really a sign that you can’t possibly do better.

      • Zorsith@lemmy.blahaj.zone · 8 months ago

        Not to mention an ungodly amount of animated content of all varieties: anime, cartoons, indie (Helluva Boss is hilarious and (un?)surprisingly dark). I recall seeing a screenshot of something French with an amazing art style that I want to look into watching.

        One Piece is gearing up for a re-animation from the beginning, using its new style from the Wano arc IIRC, and that is a hell of a long epic story.

        • schizo@forum.uncomfortable.business · 8 months ago (edited)

          Train to Busan, Parasite, Unlocked, Wonderland, Anatomy of a Fall and Close have been ones I’ve seen recently that I liked.

          I think some of those are available on Netflix, but as I don’t use Netflix, I can’t say which ones for certain.

          Edit: I just realized some of those titles are vague and will lead to a billion other movies lol. The first four are South Korean, the last two are French, and they’re all from 2020 or newer, so anything not from there or older isn’t the right one.

    • solomon42069@lemmy.world · 8 months ago (edited)

      Check out the Mistborn and Wheel of Time series for books that are waaay better than Harry Potter. Anything by Brandon Sanderson or Neil Gaiman is a good time.

      Also highly recommend any comics by Moebius and/or Alejandro Jodorowsky, and Neil Gaiman’s. Some incredible, mind-altering works to enjoy there, like The Incal and Sandman.

  • 𞋴𝛂𝛋𝛆@lemmy.world · 8 months ago

    Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

    Nvidia is just playing it conservative because it was massively overvalued by the market. The GPU’s use for AI is a stopgap hack until hardware can be developed from scratch; the real life cycle of hardware is 10 years from initial idea to first consumer availability.

    The issue with the CPU in AI is quite simple. It will be solved in a future iteration, and this means the GPU will get relegated back to graphics, or it might even become redundant entirely. Once upon a time the CPU needed a math coprocessor to handle floating-point precision. That experiment failed: it proved that a general monolithic solution is far more successful. No data center operator wants two types of processors for dedicated workloads when one type can accomplish nearly the same task.

    The CPU must be restructured for a wider-bandwidth memory cache. This will likely require slower thread speeds overall, but it is the most likely solution in the long term. Solving this issue will likely be accompanied by more threading parallelism, and therefore has the potential to render the GPU redundant in favor of a broader range of CPU scaling.

    Human persistence of vision is not capable of matching the ever-higher speeds that are ultimately only marketing. The hardware will likely never support this stuff, because no billionaire is putting up the funding to back the marketing with tangible hardware investments. … IMO.

    Neo-feudalism is well worth abandoning. Most of us are entirely uninterested in this business model. I have zero faith in the present market. I have AAA-capable hardware for AI. I play and mod open source games. I could easily be a customer in this space, but there are no game manufacturers. I do not make compromises in ownership. If I buy a product, my terms of purchase are full ownership with no strings attached whatsoever. I don’t care about what everyone else does. I am not for sale, and I will not sell myself for anyone’s legalese nonsense or pay ownership costs to rent from some neo-feudal overlord.

    • Chris@lemmy.world · 8 months ago

      Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

      I’m a die-hard open source fan, but that still feels like a stretch. I remember 10 years ago we were theorizing that Windows would get out of the OS business and just be a shell over a Unix kernel, and that never went anywhere.

      • rottingleaf@lemmy.world · 8 months ago

        It remained in the OS business to the extent that is required for the malware business.

        Also, NT is not a bad OS (aside from being closed, proprietary, and probably messy by now). The Windows subsystem on top of it would suck just as badly if it ran on something Unix.

        • Chris@lemmy.world · 8 months ago

          Yeah, I guess in my fantasy I was assuming that Windows would do a full rewrite and adopt the Unix ABI, but I knew that wouldn’t happen.

      • Rob Bos@lemmy.ca · 8 months ago

        I don’t think that’s necessarily out of the running yet. OS development is expensive and low-profit, so commoditization may be inevitable. Control of the shell and GUI, where they can push advertisements, shovelware, and telemetry on you: that is what’s profitable.

        So in 20 years? 50? I predict proprietary OSes will eventually die out, on the balance of probability.

        • Chris@lemmy.world · 8 months ago

          I’m with you in the long term.

          I am curious what kernel is backing the computers on the stuff SpaceX is doing. I’ve never seen their consoles, but I’m guessing we’re closer to modern reusable hardware and software than we were before. As niche applications like that keep getting more diverse, I bet we’ll get more open specifications so everything can work together.
          But again, I’m more pessimistic and think 50 years would be relatively early for something like that.

    • tias@discuss.tchncs.de · 8 months ago

      AI still needs a lot of parallelism but has loose latency requirements. That makes it ideal for a large expansion card rather than putting it directly on the CPU die.

      • 𞋴𝛂𝛋𝛆@lemmy.world · 8 months ago

        Multithreading is parallelism and is poised to scale by a similar factor; the primary issue is simply getting tensors in and out of the ALU. Good enough is the engineering game. Having massive chunks of silicon lying around unused is a much more serious problem. At present, the choke point is not the parallelism of the math but the L2-to-L1 bus width and cycle timing. The ALU can handle the load. The AVX-512 instruction set can load 512-bit-wide words in a single instruction; the problem is just getting them in and out in larger volume.
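        A back-of-envelope illustration of that choke point (the per-cycle figures are assumptions for a generic modern core, not any specific chip):

```python
# Compare what AVX-512 loads could consume vs what an L2->L1 link feeds.
clock_ghz = 5.0
bytes_per_load = 512 // 8      # one AVX-512 load = 64 bytes
loads_per_cycle = 2            # assumed: two load ports

demand_gb_s = bytes_per_load * loads_per_cycle * clock_ghz  # what the ALUs could eat
l2_fill_bytes = 64             # assumed: 64 B/cycle L2->L1 fill bandwidth
supply_gb_s = l2_fill_bytes * clock_ghz

print(f"core demand ~{demand_gb_s:.0f} GB/s vs L2 supply ~{supply_gb_s:.0f} GB/s")
```

        Under these assumed numbers the load ports can ask for data twice as fast as the cache hierarchy can deliver it, which is the “getting them in and out” problem in a nutshell.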

        I speculate that the only reason this has not been done already is the marketability of single-thread speeds. Present thread speeds are insane, well into the radio realm of black-magic bearded-nude-virgins wizardry. I don’t think it is possible to make these bus widths wider and maintain the thread speeds, because there are too many LCR consequences. I mean, at around 5 GHz the concept of wires as connections and gaps as insulators is a fallacy, when capacitive coupling can make connections across any small gap.
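        The 5 GHz point checks out on a napkin: the free-space wavelength is only centimetres, so package- and die-scale structures are no longer negligibly small relative to the signal.

```python
# Wavelength at a 5 GHz clock; at these frequencies nearby conductors
# can couple capacitively even across physical gaps.
c = 3.0e8          # speed of light, m/s
f = 5.0e9          # 5 GHz
wavelength_cm = (c / f) * 100
print(f"wavelength at 5 GHz: {wavelength_cm:.1f} cm")  # 6.0 cm
```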

        Personally, I think this is a problem that will take a whole new architectural solution. It is anyone’s game, unlike any other time since the late 1970s. It will likely be the beginning of the real RISC-V age and the death of x86. We are presently in the age of the 20+ thread CPU. If a redesign can make a 50-500 logical-core CPU that is slower in single-thread speed but capable of all workloads, I think it will dominate easily. Choosing the appropriate CPU model will become much more relevant.

    • sunzu2@thebrainbin.org · 8 months ago

      I do not make compromises in ownership.

      preach!

      At the end of the day, though, proper change will only come once a critical mass aligns on this issue, along with a few others.

      The political process is too captured for peasants to effect any change; we have more power voting with our money as customers, at least for now.

  • Jolteon@lemmy.zip · 8 months ago

    I agree with you on the GPU hardware and AI bubbles, but I’m not sure I would consider VR/AR to be a bubble right now. The hype has mostly died down by now, and I think it’s stabilized to the point where it will remain until we have new advances in hardware.

    • Buttflapper@lemmy.world OP · 8 months ago

      VR is on the verge of collapse in the USA thanks to the US government banning ByteDance. We can’t even order the new Pico 4 Ultra, which is one of the most anticipated VR headsets in the world right now. Meta basically has a monopoly, and they just announced they’re cutting funding to VR.

      • tee9000@lemmy.world · 8 months ago

        Sorry, but a new Pico headset wouldn’t do much of anything. A new Meta headset or a new Valve headset would give a bump.

        Really needs better content. The hardware is almost there (in terms of cost and accessibility of the experience).

        It’s slowly getting there. But the current population of VR users is limited to whoever will keep playing the same few experiences with hardware that is often cumbersome, and loading screens that aren’t super long but become your entire existence. It’s annoying.

        Meta sucks, but they have been a boon for VR development.

  • LordCrom@lemmy.world · 8 months ago

    I would love to have a VR headset that didn’t require a damn account with a third party just to use it. I don’t need an account for my monitor or my mouse. Plus, when I bought the thing it was just Oculus; then Meta bought it and promised nothing would change, before requiring a Meta account to use the fucking thing.

    • Buttflapper@lemmy.world OP · 8 months ago

      That, unfortunately, is the consequence of letting a company have a monopoly. The US government should’ve opposed that and should’ve forced them to sell it. They own such a huge share of the entire VR market right now it’s unbelievable, and Pico, by ByteDance, can’t legally be sold in the USA.

  • tee9000@lemmy.world · 8 months ago

    I really, truly suggest diversifying to news feeds without comment sections, like Techmeme, for a bit.

    Increasing complexity is overwhelming, and there’s plenty of bad shit going on, but a lot in your post is overblown.