• NicePool@lemmy.world · 8 months ago

    Isn’t Apple the company that charges $5k+ for 16GB? All while intentionally deprecating the hardware within 2 years. /s

    I’ve had to support their products on a professional level for over a decade. I will NEVER buy an Apple product.

    • cm0002@lemmy.world · 8 months ago

      I’ve had to support their products on a professional level for over a decade.

      Their enterprise stuff…can only be described as a quintessential example of an ill-conceived, horrendously executed fiasco, so utterly devoid of utility and coherence that it defies all logic and reasonable expectation. It stands as a paragon of dysfunction, a conflagration of conceptual failures so intense and egregious that it resembles a blazing inferno of pure, unadulterated refuse. It is, in every conceivable sense, a searing, molten heap of garbage—hot, steaming, and reeking with the unmistakable stench of profound ineptitude and sheer impracticality.

  • Nomecks@lemmy.ca · 8 months ago

    Golly, thanks Apple. It’s not like I can go buy a 256GB DIMM right now. 16GB, what a joke.

      • Echo Dot@feddit.uk · 8 months ago

        It’s not an upgrade though, it’s just a different model. They’re not modules you can install, and I don’t even think Apple can install them; you just get a different motherboard.

        Which is objectionable for so many reasons, not least of all e-waste.

        • stellargmite@lemmy.world · 8 months ago

          Yeah, I get that. It’s treated as if it’s an upgrade, a sales upsell to a different unit I guess, rather than an upgrade to the literal unit the customer is receiving. Yep, objectionable all round.

          • Echo Dot@feddit.uk · 8 months ago

            My point is you cannot effectively upgrade after the fact. You have to buy a whole new device.

            • MystikIncarnate@lemmy.ca · 8 months ago

              There are reasons behind this. LPDDR, IIRC, works most efficiently when it’s closer to the CPU than DIMMs would allow for.

              Boosts speed and lowers the power requirements.

              It also incentivizes people to buy larger SKUs than they originally wanted, which, bluntly, is probably the main driver for going that direction… I’m just saying that there are technical reasons too.

              • sugar_in_your_tea@sh.itjust.works · 8 months ago

                The technical benefits are honestly quite overblown. The M-series didn’t get its massive speed lift because it moved to soldered RAM near the CPU; it got it because it doesn’t have to copy stuff between the CPU and GPU. The proximity to the CPU is a pretty modest improvement. So they could’ve gotten 95% of the benefit while still offering socketed RAM, but they decided not to, probably to drive prices up.
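
                (A rough, illustrative sketch of that point; the buffer size and PCIe figure below are assumptions, just to show the scale of the copy cost a unified-memory design avoids:)

                ```python
                # Illustrative only: assumed round numbers, not Apple's figures.
                BUFFER_GB = 2.0        # hypothetical asset both the CPU and GPU need
                PCIE4_X16_GBPS = 32.0  # rough practical throughput of a PCIe 4.0 x16 link, GB/s

                copy_ms = BUFFER_GB / PCIE4_X16_GBPS * 1000
                print(f"one CPU->GPU copy over PCIe: ~{copy_ms:.0f} ms")  # ~62 ms

                # With unified memory the CPU and GPU address the same physical RAM, so that
                # copy disappears entirely; whether the RAM sits next to the package or in a
                # socket changes latency/bandwidth only modestly by comparison.
                ```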

                • MystikIncarnate@lemmy.ca · 8 months ago

                  There’s actually an argument that soldered RAM drives prices down.

                  The individual memory chips and constituent components are cheaper than they would be for the same capacity on a DIMM. We’re talking about a very small difference, and bluntly, OEMs are going to mark it up significantly enough that the end consumer won’t see a reduction for this (but OEMs will see additional profits).

                  So by making it into unupgradable e-waste, they make an extra buck or two per unit, with the added benefit of it being unupgradable e-waste, so you throw it out and buy a whole new system sooner.

                  This harkens back to my rant on thin and light phones, where the main point is that they’re racing to the bottom. Same thing here. For thin and light mobile systems, soldered RAM still saves precious space and weight, allowing the device to be thinner and lighter (again, by a very small margin)… That’s the only market segment where I kind of understand the practice. For everything else, DIMMs (or the upcoming LPCAMM2)… IMO, I’d rather sacrifice any speed benefit to have the ability to upgrade the RAM.

                  The one that ticks me off is the underpowered thin/lights that are basically unusable ewaste because they have the equivalent of a Celeron, and barely enough RAM to run the OS they’re designed for. Everything is soldered, and they’re cheap, so people on a tight budget are screwed into buying them. This is actually a big reason why I’m hoping that the windows-on-ARM thing takes off a bit, because those systems would be far more useful than the budget x86 chips we’ve seen, and far less expensive than anything from Intel or AMD that’s designed for mobile use. People on a tight budget can get a cheap system that’s actually not ewaste.

            • stellargmite@lemmy.world · 8 months ago

              Indeed. Making that initial decision even more of a forced decision toward the expensive upsell. It’s evil. And wasteful, as you said.

  • Omega@discuss.online · 8 months ago

    I always thought 8GB was a fine amount for daily use if you never did anything too heavy. Are apps really that RAM-intensive now?

    • T156@lemmy.world · 8 months ago

      Yes. Just as 4GB was barely enough a decade ago.

      I usually find myself either capping out the 8GB of RAM on my laptop, or getting close to it, if I have Firefox, Discord and a word processor open. Especially if I have YouTube or Spotify going.

      • Echo Dot@feddit.uk · 8 months ago

        I can get over 8 GB just running Discord, Steam and Shapes2.

        I’m pretty sure most of that is just Discord.

        • Blackmist@feddit.uk · 8 months ago

          Imagine how much more room we’d have if everything wasn’t dragging a big trailer full of Chrome behind it.

          • Echo Dot@feddit.uk · 8 months ago

            I’m pretty sure Chrome doesn’t even use the memory for anything; it just likes having it allocated.

      • Omega@discuss.online · 8 months ago

        Most of that is Discord; they can’t get a single thing right. Use more GPU than the game I’m playing? Check. Have an inefficient method of streaming a game? Check. Be laggy as fuck without GPU acceleration, when Lemmy and Guilded are fine? Check.

    • Sethayy@sh.itjust.works · 8 months ago

      Heavily depends on what you use. On a Linux server as a NAS I’m able to get away with 2GB, and on an Orange Pi Zero 3 with 1GB, but it essentially only ever runs one app at a time.

      I’m sure a hardcore RGB gamer could need 32GB pretty quick by leaving Twitch streams, Discord, a couple of games in the background and a couple of Chrome tabs open, all on Windows 11.

    • MystikIncarnate@lemmy.ca · 8 months ago

      Yep. I work in IT support, almost entirely Windows, but similar concepts apply.

      I see people pushing 6GB+ with the OS and remote desktop applications open sometimes. My current shop does almost everything by VDI/remote desktop… so that’s literally the only thing they need to load; it’s just not good.

      On the remote desktop side, we recently shifted from a balanced remote desktop server over to a “memory optimised” VM, which basically has more RAM but the same or similar CPU, because we kept running out of RAM for users even though there was plenty of CPU available… It caused problems.

      Memory is continually getting more important.

      When I do the math on the bandwidth requirements to run everything, the next limit I think we’re likely to hit is RAM access speed and bandwidth. We’re just dealing with so much RAM at this point that the available bandwidth from the CPU to the RAM per second is less than the total memory allocation for the virtual system. E.g. 256GB for the VM, while the CPU-to-RAM bandwidth is, say, 288GB/s…
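
      (Working that example through with the same rounded numbers:)

      ```python
      # Sanity check on the figures above (rounded, taken from the example).
      vm_ram_gb = 256     # RAM allocated to the VM
      mem_bw_gbps = 288   # assumed aggregate CPU-to-RAM bandwidth, GB/s

      print(f"time to read every byte once: ~{vm_ram_gb / mem_bw_gbps:.2f} s")  # ~0.89 s
      # Even at full theoretical bandwidth, sweeping the VM's memory once takes
      # close to a second, so RAM bandwidth becomes the ceiling long before the CPU.
      ```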

      Luckily DDR4/DDR5 brings improvements here, though a lot of that stuff has yet to filter into datacenters.

      • Liz@midwest.social · 8 months ago

        Recently I downloaded Chrome for some testing that I wanted to keep separate from my Firefox browser. After a while I realized my computer was always getting hot every time I opened Chrome. I took a look at the system monitor: Chrome was using 30% of my CPU power to play a single YouTube video in the background. What the fuck? I ended up switching the testing environment over to LibreWolf and CPU load went down to only 10%.

        • MystikIncarnate@lemmy.ca · 8 months ago

          I’d say to try Chromium, but you basically need to compile it yourself to get support for all the video codecs.

      • Demdaru@lemmy.world · 8 months ago

        Stop. You’re scaring today’s companies. Optimization? That’s a no-no word.

        Now please eat whole-ass libraries imported for one function, or that React + Laravel site which amounts to a mostly stock Bootstrap-looking blog.

  • padge@lemmy.zip · 8 months ago

    My sister just bought a MacBook Air for college, and I had to beg her to spend the extra money on 16GB of memory. It feels like a scam that it appears cheap with the “starting at” price, but nobody should actually go with those “starting at” specs.

    • Echo Dot@feddit.uk · 8 months ago

      Yeah, it’s about future-proofing. 8GB might be okay for basic browsing and text editing now, but in the future that might not be the case. Also, in my experience, people who only want to do basic browsing and word editing inevitably end up wanting to do more complex things and not understanding that their device is not capable of it.

      • padge@lemmy.zip · 8 months ago

        Exactly. I told her that 8GB might be fine for a year or two, but if she wants this thousand-plus-dollar laptop to last four years she needs to invest the extra money now. Especially once she told me she might want to play Minecraft or Shadow of the Tomb Raider on it.

  • masterspace@lemmy.ca · 8 months ago (edited)

    Let me know how many multiple thousands of dollars it’s going to cost for a MAX variant of the chip that can run three external monitors like it’s 2008.

      • masterspace@lemmy.ca · 8 months ago (edited)

        Nope. All base Mx Series Macs can only support a single external monitor in addition to their internal one.

        Pro Series are professional enough that Apple deems your work worthy of using two (2) external monitors.

        Max Series are the only ones that have proved their Maximum enough to Apple to let them use 3 monitors.

        It’s honestly absurd. And none of them support DisplayPort’s alt mode, so they can’t daisy-chain between monitors, and they max out at 3, whereas an equivalent Windows or Linux machine could do 6 over the same Thunderbolt 3 connection.

        Windows and Linux machines also support sub pixel text rendering, so text looks far better on 1080p and 1440p monitors.

        I have to use MacOS for work and while I’ve come to accept many parts and even like some, their external monitor support is just mind numbingly bad.

        • brbposting@sh.itjust.works · 8 months ago

          sub pixel text rendering, so text looks far better on 1080p and 1440p monitors.

          Why would you need that? Buy an Ultra Pro Retina Max Display and please get the stand if you don’t want Apple to go out of business.

        • narc0tic_bird@lemm.ee · 8 months ago

          What you’re describing as “DisplayPort alt mode” is DisplayPort Multi-Stream Transport (MST). Alt mode is the ability to pass native DisplayPort stream(s) via USB-C, which all M chip Macs are capable of. MST is indeed unsupported by M chip hardware, and it’s not supported in macOS either way - even the Intel Macs don’t support it even though the hardware is capable of it.

          MST is nice for a dual WQHD setup or something (or dual UHD@60 with DisplayPort 1.4), but attempt to drive multiple (very) high resolution and refresh rate displays and you’ll be starved for bandwidth very quickly. Daisy-chaining 6 displays might technically be possible with MST, but each of them would need to be set to a fairly low resolution for today’s standards. Macs that support more than one external display can support two independent/full DisplayPort 1.4 signals per Thunderbolt port (as per the Thunderbolt 4 spec), so with a proper Thunderbolt hub you can connect two high resolution displays via one port no problem.

          I agree that even base M chips should support at least 3 simultaneous displays (one internal and two external, or 3 external in clamshell mode), and they should add MST support for the convenience to be able to connect to USB-C hubs using MST with two (lower-resolution) monitors, and support proper sub-pixel font anti-aliasing on these low-DPI displays (which macOS was perfectly capable of in the past, but they removed it). Just for the convenience of being able to use any random hub you stumble across and it “just works”, not because it’s necessarily ideal.

          But your comparison is blown way out of proportion. “Max” Macs support the internal display at full resolution and refresh rate (120 Hz), 3 external 6K 60Hz displays and an additional display via HDMI (4K 144 Hz on recent models). Whatever bandwidth is left per display when daisy-chaining 6 displays to a single Thunderbolt port on a Windows machine, it won’t be anywhere near enough to drive all of them at these resolutions.
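
          (Rough numbers behind that bandwidth point, using published resolutions and the DP 1.4 link rate, everything uncompressed and ignoring blanking/DSC:)

          ```python
          # Back-of-the-envelope: uncompressed video bandwidth vs. one DisplayPort 1.4 link.
          def gbit_per_s(w, h, hz, bits_per_pixel=24):
              return w * h * hz * bits_per_pixel / 1e9

          DP14_USABLE = 25.92  # usable Gbit/s of a DisplayPort 1.4 (HBR3) link

          dual_uhd_60 = 2 * gbit_per_s(3840, 2160, 60)   # the "dual UHD@60" MST case
          one_6k_60 = gbit_per_s(6016, 3384, 60, 30)     # a 6K 10-bit panel, Pro Display XDR class

          print(f"2x UHD@60: {dual_uhd_60:.1f} Gbit/s (just fits in {DP14_USABLE})")   # ~23.9
          print(f"1x 6K@60 : {one_6k_60:.1f} Gbit/s (exceeds a single DP 1.4 link)")   # ~36.6
          ```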

          • masterspace@lemmy.ca · 8 months ago (edited)

            Agreed, I typed quickly before bed and meant MST not alt mode.

            But otherwise you’re just arguing that it’s not a big deal because ‘you don’t need any of these fancy features if you throw out your monitor every three years and buy new thousand dollar ones’.

            For everyone who doesn’t want to contribute to massive piles of e-waste, we still have 1080p and 1440p, 60Hz monitors kicking around, and there is no excuse for a Mac to only be able to drive one of them with crappy looking text. It could easily drive 6 within the bandwidth of a 4k, 120Hz signal. Hell it could drive 8 or more if you drop the refresh down to 30.
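
            (A quick check of that claim with the same back-of-the-envelope math: 24-bit colour, no blanking or compression:)

            ```python
            # Raw pixel bandwidth: six 1080p60 monitors vs. one 4K120 stream.
            def gbit_per_s(w, h, hz, bits_per_pixel=24):
                return w * h * hz * bits_per_pixel / 1e9

            six_1080p_60 = 6 * gbit_per_s(1920, 1080, 60)
            one_4k_120 = gbit_per_s(3840, 2160, 120)

            print(f"6x 1080p@60: {six_1080p_60:.1f} Gbit/s")  # ~17.9
            print(f"1x 4K@120  : {one_4k_120:.1f} Gbit/s")    # ~23.9
            # Six 1080p60 panels need less raw bandwidth than a single 4K120 signal.
            ```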

            • narc0tic_bird@lemm.ee · 8 months ago

              I’m not generally arguing it’s not a big deal. I’m actually saying the regular M chips should be upgraded to M “Pro” levels of display support. But beyond two external displays, yes, I’m arguing it’s not a big deal, simply because >99% of users don’t want to use more than two external displays (no matter the resolution). Even if I had 6 old displays lying around I would hardly use more than two of them for a single computer. And as long as I’m not replacing all 6 displays with 6 new displays it doesn’t make a difference in terms of e-waste. On the contrary I’d use way more energy driving 6 displays simultaneously.

              I’m 100% with you that MST should be supported, but not because driving six displays (per stream) is something I expect many people to do, but because existing docking solutions often use MST to provide multiple (2) DisplayPort outputs. My workplace has seats with a USB-C docking station connected to two WQHD displays via MST, and they’d all need replacing should we ever switch to MacBooks.

              And sure, they should bring back proper font rendering on lower resolution displays. I personally haven’t found it to be too bad, but better would be … better, obviously. And as it already was a feature many moons ago, it’s kind of a no-brainer.

          • masterspace@lemmy.ca · 8 months ago

            You need to reread my comment where I point out that it’s only the Max chips that can drive more than two external monitors.

            And bro, a cursory Google search would also bring up this page from Apple, which confirms everything I wrote. A base M3 Mac can only drive two external monitors if the internal display is closed; otherwise it can only drive one external monitor plus the internal one.

        • tal@lemmy.today · 8 months ago

          I guess you could get an eGPU. Probably not cheaper than just giving Apple their pound of flesh, though.

      • carleeno@reddthat.com · 8 months ago

        My last job issued me an M2 Air that could only power one external monitor. It was annoying as hell.

    • DJDarren@thelemmy.club · 8 months ago

      Because a huge part of their business model over the past twenty years has been the upsell.

      I bought my first MacBook in 2007. It had 2GB of RAM as standard. I asked about upgrading it, and the guy told me to pick some up online as it would be waaaay cheaper; he was right. Did the same for the MacBook Pro that replaced it a few years later, but in the meantime they moved to the soldered model, so I had to swallow the cost of the 16GB ‘upgrade’ in my M2 Air.

      To be fair, the cost over time of my Macs has been incredible. My 2011 MBP is still trucking along, these days running Linux Mint. With the cost to upgrade the RAM and replace the HDD with an SSD, all in it cost me around £1200. Less than £100 a year for a laptop that still works perfectly fine.

    • luves2spooge@lemmy.world · 8 months ago (edited)

      Because there are two types of Mac users:

      • People that are buying them with their own money because they’re trendy, and just using them as glorified Internet browsers. 16GB is plenty.
      • People using them professionally, so their company is paying and Apple can overcharge for the necessary memory upgrade.
      • aStonedSanta@lemm.ee · 8 months ago

        I have an M2 with 8GB, and it’s plenty. It’s just a browsing/Discord/streaming box, basically.

      • bamboo@lemm.ee · 8 months ago

        This, pretty much. I don’t care that much that a maxed-out MBP is $6000 or whatever; my employer pays for that.

    • Munkisquisher@lemmy.nz · 8 months ago

      Greed. It lowers the advertised price, but once you spec it decently you’ve added a grand in extras.

      • tal@lemmy.today · 8 months ago

        Price discrimination based on memory loadout is real, but it’s not specific to Apple, either.

      • cmnybo@discuss.tchncs.de · 8 months ago

        It does make some things better, but there are a number of downsides too. The biggest downside is that it’s not practical to make the memory socketed because of the speed that’s required.

    • floofloof@lemmy.ca · 8 months ago

      It’s OK - for an extra $400 they’ll sell you one with an extra $50 worth of RAM.

        • ripcord@lemmy.world · 8 months ago

          I think they meant what the end user would NORMALLY pay, which is the better comparison.

          • JohnDClay@sh.itjust.works · 8 months ago

            But Apple isn’t buying consumer RAM; they’re spending $8 to put on a different chip instead. If other laptop manufacturers are charging $50, it’s because they think they can get away with it, like Apple.

                • sugar_in_your_tea@sh.itjust.works · 8 months ago

                  It’s really not. Other companies with socketed RAM also upsell, they are just limited in how much they can ask because the customer has the option to DIY adding more RAM. So the cost these companies charge is roughly the price to the customer of upgrading their own RAM, plus a bit extra for the convenience of not having to do that.

                  For example, Framework upcharges by something like 20-50% for RAM and SSDs compared to equivalent parts. It’s not just Apple; all OEMs do it, but Apple can charge much more because the user can’t easily replace either on their own.