

“There are some bad things on the internet”
“Just… Don’t use the internet?”
Over the past 5 years, I’ve installed Ubuntu about 30 times on different computers. Not once has an install on an SSD taken me more than an hour; it typically takes 30 minutes or less, except for rare occasions where I’ve messed something up.
It’s the other way around: an Apple Silicon Mac can run an Intel binary through Rosetta (I think there are almost no exceptions at this point). It’s Intel Macs that can’t run ARM-specific binaries.
I thought a few days ago that my “new” laptop (M2 Pro MBP) is now almost 2 years old. The damn thing still feels new.
I really dislike Apple but the Apple Silicon processors are so worth it to me. The performance-battery life combination is ridiculously good.
Also because, as a person who has studied multiple languages, German is hard and English is Easy with a capital E.
No genders for nouns (German has three), no declensions, no conjugations other than “add an s for third person singular”, somewhat permissive grammar…
It has its quirks, and pronunciation is the biggest one, but nowhere near German (or Russian!) declensions, Japanese kanji, etc.
Out of the wannabe-esperanto languages, English is in my opinion the easiest one, so I’m thankful it’s become the technical Lingua Franca.
| Language | Native Speakers | Total Speakers | Source |
|---|---|---|---|
| English | ~380 million | ~1.5 billion | Wikipedia |
| German | ~76–95 million | ~155–220 million | Wikipedia |
| Mandarin | ~941 million–1.12 billion | ~1.1–1.3 billion | Wikipedia |
Well, it has 10x more speakers than German, but it still has fewer speakers than English, and most of them are concentrated in a single country.
I’m talking about running them on a GPU, which favours the GPU even when the comparison is between an AMD Epyc and a mediocre GPU.
If you want to run a large version of DeepSeek R1 locally, with many quantized models being over 50 GB, I think the cheapest Nvidia GPU that fits the bill is an A100, which you might find used for $6K.
For well under that price you can get a whole Mac Studio with those 192 GB the first poster in this thread mentioned.
I’m not saying this is for everyone, it’s certainly not for me, but I don’t think we can dismiss that there is a real niche where Apple has a genuine value proposition.
My old flatmate has a PhD in NLP and used to work in research, and he’d have gotten soooo much use out of >100 GB of RAM accessible to the GPU.
If it’s for AI, loading huge models is something you can do with Macs but not easily in any other way.
I’m not saying many people have a use case for them at all, but if you do want to run 60 GB models locally, a whole 192 GB Mac Studio is cheaper than the Nvidia GPU alone that you’d need to run them.
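As a rough sanity check on those sizes: the weight-only memory footprint of a quantized model is just parameter count times bits per weight. The sketch below uses a hypothetical ~70B-parameter model as an illustration; the numbers are back-of-the-envelope assumptions, not benchmarks, and real usage adds KV cache and runtime overhead on top.

```python
def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight-only footprint in GB; ignores KV cache and overhead."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A hypothetical ~70B-parameter model at common quantization levels:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{weight_memory_gb(70, bits):.0f} GB of weights")
```

At 4-bit that’s around 35 GB of weights, which is why these models fit comfortably in a 192 GB unified-memory machine but overflow a single consumer GPU.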
So the lack of Apple-branded AI Slop is slowing down sales for iPhones but not for Macs?
Edit for clarity: I’m aware Sequoia “has” Apple Intelligence, but in a borderline featureless state, so it’s as good (or as bad) as not having anything.
I recently discovered dupe.com, which works quite well: give it an Amazon link and it finds the item on potentially cheaper, definitely non-Bezos websites.
There are tons more applications in the workplace. For example, one of the people in my team is dyslexic and sometimes needs to write reports that are a few pages long. For him, having the super-autocorrect tidy up his grammar makes a big difference.
Sometimes I have a list of, say, 200 software changes that would be a pain to summarise, but where it’s intuitively easy for me to tell if a summary is right. For something like a changelog I can roll the dice with the hallucination machine until I get a correct summary, then tidy it up. That takes less than a tenth of the time it would take to write it myself.
Sometimes writing is necessary and there’s no way to cut down the drivel, unfortunately. I’m talking about professional settings, of course - having the Large Autocorrect write a blog post or a poem for you is a total misuse of the tool, in my opinion.
I hope AMD keep pushing to do things well, because right now the value proposition of anything with an Intel processor is more ridiculous than when Apple charges $300 for an extra 8 GB of RAM. Intel’s $600 processors currently offer performance on par with the entry-level Apple Silicon M4, which is great news for Apple, but not for anyone who wants to use Linux or “the other Mainstream OS”.
Something I find incredibly weird about US company culture is how they talk about overtime like it’s a good thing.
“Our employees worked weekends, days and nights to make this happen! We wouldn’t have succeeded without people who are willing to give up their personal lives!”
I hope they not only succeed but get shares. Doing weekends or nights for a company you don’t (partially) own feels like a con.
If they keep doubling the ask, soon they’ll be asking for a googol from Google. Hehe.
I don’t think anything with the word “Intel” in it can be taken seriously in value comparisons…
When I got my last laptop I ended up with a MBP because there were no high end options for Linux laptops with AMD. Now the options are better, but back then, the only realistic alternative to a MacBook Pro would have had a third of the real-world battery life if not less, even if I decided to spend £3k. That didn’t seem like an acceptable compromise so there were virtually no laptops in existence that could compete with an M2 MBP.
16 GB of RAM is kinda meh, but I can’t think of many $600 devices that can run three 6K monitors simultaneously at 60 Hz, plus a fourth at a lower resolution but still 60 Hz.
How do people have the time to organise vigils and get into “coalitions” and politics in the workplace?
Granted, I don’t work at Microsoft, but I feel like everyone around me and I are overworked enough that when we have the time to stop working… We head home (or close the laptop if WFH) and rest, rather than engage in additional activities in the workplace.
Yeah, this sounds like a problem for only the 5% of the world who live in a specific country.
I also think heating everything up is the smoothest solution. But to offer an alternative, I’d use dental floss to get in between the bowl and the plate. If the bowl has slightly rounded edges (I expect it does), it won’t be too hard to work the floss in. With the floss you’ll inevitably let some air in… which will equalise the pressure and break the vacuum.
As an inferior alternative to floss, fishing line could work for this approach as well.