There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.
And now all the fan boys and girls will go out and buy another MacBook. That’s planned obsolescence for ya
Someone who is buying a MacBook with the minimum specs probably isn’t the same person that’s going to run out and buy another one to get one specific feature in Xcode. Not trying to defend Apple here, but if you were a developer who would care about this, you probably would have paid for the upgrade when you bought it in the first place (or couldn’t afford it then or now).
Well no, not this specific scenario, because of course devs will generally buy machines with more RAM.
But there are definitely people who will buy an 8GB Apple laptop, run into performance issues, then think “oh I must need to buy a new MacBook”.
If Apple didn’t purposely manufacture ewaste-tier 8GB laptops, that would be minimised.
I wouldn’t be so sure. I feel like many people would not buy another MacBook if it were to feel a lot slower after just a few years.
This feels like short term gains vs. long term reputation.
And why they solder the RAM, or even worse make it part of the SoC.
There are real world performance benefits to ram being as close as possible to the CPU, so it’s not entirely without merit. But that’s what CAMM modules are for.
But do those benefits outweigh doubling or tripling the amount of RAM by simply inserting another stick that you can buy for dozens of dollars?
That’s extremely dependent on the use case, but in my opinion, generally no. However CAMM has been released as an official JEDEC interface and does a good job at being a middle ground between repairability and speed.
It’s an officially recognized spec, so Apple will ignore it as long as they can. Until they can find a way to make money from it or spin marketing as if it’s some miraculous new invention of theirs, for something that should just be how it’s done.
Parts pairing will do. That’s what Apple is known for: kneecapping consumer rights.
Yes, there are massive advantages. It’s basically what makes unified memory possible on modern Macs. Especially with all the interest in AI nowadays, you really don’t want a machine with a discrete GPU/VRAM, a discrete NPU, etc.
Take for example a modern high-end PC with an RTX 4090. Those only have 24GB of VRAM, and that VRAM is only accessible through the (relatively slow) PCIe bus. AI models can get really big, and 24GB can be too little for the larger ones. You can spec an M2 Ultra with 192GB of RAM, and almost all of it is directly accessible by the GPU. Even better, the GPU can access it without any need to copy data back and forth over the PCIe bus, so there’s essentially zero copy overhead.
The advantages of this multiply when you have more dedicated silicon. For example: if you have an NPU, that can use the same memory pool and access the same shared data as the CPU and GPU with no overhead. The M series also have dedicated video encoder/decoder hardware, which again can access the unified memory with zero overhead.
For example: you could have an application that replaces the background on a video using AI. It takes a video and decompresses it using the video decoder; the decompressed frames are immediately available to all other components. The GPU can then pre-process the frames, the NPU can use the processed frames as input to an AI model to generate new frames, and the video encoder can immediately access the result and compress it into a new video file.
The overhead of just copying data for such an operation on a system with non-unified memory would be huge. That’s why I think that the AI revolution is going to be one of the driving factors in killing systems with non-unified memory architectures, at least for end-user devices.
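A rough back-of-the-envelope sketch of that copy overhead. All the numbers here (4K RGBA frames, 30 fps, PCIe 4.0 x16 at ~32 GB/s, one copy out and one copy in per processing hop) are illustrative assumptions, not figures from this thread:

```python
# All numbers are illustrative assumptions, not measurements.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 3840, 2160, 4   # 4K RGBA frames
FPS = 30
PCIE_BANDWIDTH = 32e9   # bytes/s, theoretical PCIe 4.0 x16

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL

# decoder -> GPU -> NPU -> encoder: three hops; on a non-unified
# system each hop copies the frame out of one device and into the next.
copies_per_frame = 3 * 2
traffic_per_second = frame_bytes * FPS * copies_per_frame

print(f"{frame_bytes / 1e6:.1f} MB per frame")
print(f"{traffic_per_second / 1e9:.2f} GB/s of pure copy traffic")
print(f"{traffic_per_second / PCIE_BANDWIDTH:.0%} of PCIe 4.0 x16 bandwidth")
```

Even under these generous assumptions, a fifth of the bus is spent just shuttling frames around; on unified memory that traffic simply doesn’t exist.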
I feel like this is an argument for new specialized computers at best. At worst it shows that this AI crap is even more harmful to the end user.
That’s a fantastic explanation! Thank you!
Bus goes Vrrrroom vrrooom. Fuck AI.
It’s highly dependent on the application.
For instance, I could absolutely see having certain models with LPCAMM expandability as a great move for Apple, particularly in the pro segment, so they’re not capped by whatever they can cram into their monolithic SoCs. But for most consumer (that is, non-engineer/non-developer users) applications, I don’t see them making it expandable.
Or more succinctly: they should absolutely put LPCAMM in the next generation of MBPs, in my opinion.
And even if the out-of-the-box RAM is soldered to the machine, it should still be possible to add supplementary RAM that isn’t soldered for when the system demands it. Other computers have worked like this in the past, with onboard RAM plus a socket to add more.
Apple’s SoC long predates CAMM.
Dell first showed off CAMM in 2022, and it only became JEDEC standardised in December 2023.
That said, if Dell can create a really good memory standard and get JEDEC to make it an industry standard, so can Apple. They just chose not to.
In this particular case the RAM is part of the chip package as an attempt to squeeze out more performance. Processors have become so fast that the speed is wasted if the rest of the components can’t keep up. The traditional memory architecture has become a bottleneck the same way HDDs were before the introduction of SSDs.
You’ll see this same trend extend to Windows laptops as they shift to Snapdragon processors too.
People do like to downplay this, but SoC is the future. There’s no way to get performance over a system bus anymore.
deleted by creator
Funny that within one minute, they state the exact same problem.
deleted by creator
deleted by creator
And the apple haters will keep making this exact same comment on every post using their 3rd laptop in ten years while I’m still using my 2014 MacBook daily with no issues.
Be more original.
Nice attempt to justify planned obsolescence. To think apple hasn’t done this time and time again, you’d have to be a fool
👍
-posted from my ten year old MacBook which shows no need for replacement
And it’s what, 3 or 4 operating system versions behind due to being deemed obsolete?
At which point did Apple decide your MacBook was too old to be usable and stop giving updates or allow new software to run on it?
Still gets security updates. All the software I need to run on it runs on it.
My email, desktop, and calendar all still sync with my newer desktop. I can still play StarCraft. I can join zoom meetings while running Roll 20. I can even run Premiere and do video editing… to a point.
I guess if you need the latest and greatest then you might have a point, but I don’t.
This whole thread is bitching about software bloat and Apple does that to stop the software bloat on older machines, but noooo that’s planned obsolescence. 🙄
This is pretty much it. People really just want to find reasons to hate Apple over the past 2 - 3 years. You’re right, though, your Mac can run easily for 10+ years. You’re good basically until the web browsers no longer support your OS version, which is more in the 12-15 year range.
deleted by creator
Weren’t you just complaining about software bloat?
deleted by creator
I still have a fully functioning Windows 95 machine.
My daily driver desktop is also from around 2014.
That’s pretty sick actually
These were obsolete the minute they were made, though… So it’s not really planned obsolescence. I got one for free (MacBook Air), and it’s always been trash.
I have an M2 MBA and it’s the best laptop I’ve ever owned or used, second to the M3 Max MBP I get to use for work. Silent, battery lasts all week, interface is fast and runs all my dev tools like a charm. Zero issues with the device.
This isn’t a big deal.
If you’re developing in Xcode, you did not buy an 8GB Mac in the last 10-years.
If you are just using your Mac for Facebook and email, I don’t think you know what RAM is.
If you know what RAM is, and you bought an 8GB Mac in the last 10-years, then you are likely self-aware of your limited demands and/or made an informed compromise.
If you know what RAM is, and you bought an 8GB Mac in the last 10-years, then you are likely self-aware of your limited demands and/or made an informed compromise.
Or you simply refuse to pay $200+ to get a proper machine. Like seriously, 8GB Macs should have disappeared long ago, but nope, Apple sticks with them as part of their planned obsolescence tactics, stubbornly refusing to admit that releasing a MacBook with 8GB of soldered RAM in 2023 is wholly inadequate.
I get around this by simply not buying a Mac. Frees up so much money for RAM.
Yeah, the 8GB model’s purpose is to make a “starting at $xxxx” price tag possible.
deleted by creator
I’m not gonna stand up and declare that 8gb is absolutely fine, because in very short order it won’t be. But yeah, currently for an average use case, it is.
My work Mac mini has 8gb. It’s a 2014 so can’t be upgraded, but for the tasks I ask of it it’s ok. Sure, it gets sluggish if I’m using the Win11 VM I sometimes need, but generally I don’t really have any issues doing regular office tasks.
That said, I sometimes get a bee in my bonnet about it, so I open Activity Monitor to see what it’s doing, and am shocked by how much RAM some websites consume in open Safari tabs.
8gb is generally ok on low end gear, but devs are working very hard to ensure that it’s not.
Funny: knowing that you only get one shot, I bought 32GB of RAM for my Mac Mini like 1.5 years ago. I figured that it gave me the best shot of keeping it usable past 5 years.
imagine showing this post to someone in 1995
shit has gotten too bloated these days. i mean even in my head 8GB still sounds like ‘a lot’ of RAM and 16GB feels extravagant
I still can’t fully accept that 1GB is not normal, 2GB is not very good, and 4GB is not all you ever gonna need.
If only it got bloated for some good reasons.
I remember when I got my first computer with 1GB of RAM, where my previous computer had 64MB, later upgraded to 192MB. And there were only like 3 or 4 years in between them.
It was like: holy shit, now I can put all the things in RAM. I will never run out.
The moment you use a file that is bigger than 1GB, that computer will explode.
Some of us do more than just browse Lemmy.
Wow. Have you ever considered how people were working with files bigger than total RAM they had in the normal days of computing?
So in your opinion, if you’re editing a 2GB+ log file, it should occupy 2GB of RAM?
I just have no words, the ignorance.
I chalk it up to lazy rushed development. Good code is art.
I have a VPS that uses 1GB of RAM, it has 6-7 apps running in docker containers which isn’t the most ram efficient method of running apps.
A light OS really helps. On top of that, the most-used RAM-hungry app, the web browser, actually reduces its consumption when needed and uses more when memory is free. On one computer I have Chrome running with a few hundred MB used, instead of the usual GBs, because RAM is running out.
So while memory appears full, you can actually have a bit more available that is “hidden”.
Same here. When idle, the apps basically consume nothing. If they are just a webserver that calls to some PHP script, it basically takes no RAM at all when idle, and some RAM when actually used.
Websites and phone apps are such unoptimized pieces of garbage that they are the sole reason for high RAM requirements. Also lots of background bloatware.
This is resource reservation, and it happens at the OS level. If Chrome appears to be using a lot of RAM, that memory will be freed up once either the OS or another application requires it.
The reservation just exists so that an application knows it can use up to X amount of that resource for now.
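A toy sketch of that grow-then-release behaviour. This is purely illustrative, not how Chrome or any OS actually implements it:

```python
from collections import OrderedDict

class ElasticCache:
    """Toy model of 'free RAM is wasted RAM': grow while memory is
    plentiful, shrink as soon as the OS signals pressure."""

    def __init__(self):
        self.pages = OrderedDict()  # key -> cached data, oldest first

    def put(self, key, data):
        self.pages[key] = data
        self.pages.move_to_end(key)  # mark as most recently used

    def memory_pressure(self, pages_to_release):
        # Drop least-recently-used entries, the way a browser discards
        # decoded images or background tabs when RAM runs low.
        for _ in range(min(pages_to_release, len(self.pages))):
            self.pages.popitem(last=False)


cache = ElasticCache()
for i in range(100):
    cache.put(i, b"x" * 4096)  # fill freely while memory is plentiful
cache.memory_pressure(60)      # OS asks for memory back
print(len(cache.pages))        # 40 entries remain
```

The point is that the “used” number in a task manager includes memory the app would happily hand back the moment anything else needs it.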
Absolutely.
Bad, rushed software that wires together 200 different giant libraries just to use a fraction of them and then run it in a sandboxed container with three daemons it needs for some reason doesn’t mean “8 Gb isn’t enough”, it means write tighter, better software.
deleted by creator
You just have to watch your favorite tablet get slower year after year to understand that a lot of this is artificial. They could make applications that don’t need those resources but would never do so.
We measure success by how many GB’s we have consumed when the only keys depressed from power on to desktop is our password. This shit right here is the real issue.
Guy from '95: “I bet it’s lightning fast though…”
No dude. It peaks pretty soon. In my time, Microsoft is touting a chat program that starts in under 10 seconds. And they’re genuinely proud of it.
And latency is more shit than it ever was
I once went for lower-CAS-timing 2x 128MB RAM sticks (256MB total) instead of 2x 256s at slower timings, because I thought 512MB was insane overkill. I realized how wrong I was when playing the Star Wars Galaxies MMORPG: with a lot of people on screen, it started swapping to disk. Look up the specs of an IBM Aptiva, the first computer my parents bought, and you’ll understand how 512MB could seem like a lot.
Now my current computer has 64GB (most gaming computers went for 32GB at the time I built it). My workstation at work has 128GB, which really isn’t even enough for some workloads we have that use a lot of in-memory cache. And large servers can have multiple TB of RAM. My mind has been blown multiple times.
Opens chrome on a 8GB Mac. Sees lifespan of SSD being reduced by 50%. After 2-3 years of heavy usage SSD starts to get errors. Apple solution: buy a new one. No wonder they are 2nd/3rd wealthiest company on the planet.
buy a new one.
Buy a new SSD and swap out the old one?
…buy a new SSD, right??
The SSD is soldered to the board. With only 8GB of RAM you’ll be using swap a lot, so any workload exceeding 8GB uses the SSD as slower “RAM”, wearing down its lifespan through constant writes to the swap partition.
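To put a rough number on that wear, here’s a sketch. The TBW rating and daily write volumes below are assumptions for illustration, not measurements from any real machine:

```python
# All numbers are assumptions for illustration, not measurements:
# a mid-range SSD rated for ~150 TBW, a light baseline write load,
# and a heavy-swap workload caused by having too little RAM.
TBW_RATING_TB = 150          # terabytes written before rated wear-out
BASELINE_GB_PER_DAY = 20     # OS + app writes without swap pressure
SWAP_GB_PER_DAY = 50         # extra writes from constant swapping

def years_to_wear_out(gb_per_day):
    return TBW_RATING_TB * 1000 / gb_per_day / 365

print(f"without swap pressure: {years_to_wear_out(BASELINE_GB_PER_DAY):.1f} years")
print(f"with heavy swapping:   {years_to_wear_out(BASELINE_GB_PER_DAY + SWAP_GB_PER_DAY):.1f} years")
```

Under these assumptions the rated lifespan drops from about 20 years to about 6, and on a machine with a soldered SSD that wear-out takes the whole board with it.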
“tHATs nOT tRuE the aRCHiteCTuRe iS cOmPlETlY dIffErEnT!!!1!11!!ONEONE!!!” <— Apple fanboys when this was predicted on launch of the M1 🤖
The only people more cultish than Apple fans are Tesla/Elongated Muskrat fans.
You don’t get the most valuable company by selling a SSD. So, yeah a new Mac of course.
Well they do charge particularly hard for SSDs as well. They’ve found a way to eat the cake twice.
I think SSDs are also soldered to the mainboard on most apple products.
deleted by creator
Oh, my sweet summer child…
deleted by creator
Nah ur nat doin that with apple. Cmon just buy a new PC! Wa don car abt the env! Who cares anyway! Cmon not that expensive
HP seems to think 4 GB is an acceptable amount of RAM to put in a modern notebook (although they don’t charge even close to what Apple charges).
https://www.amazon.com/HP-Micro-edge-Microsoft-14-dq0040nr-Snowflake/dp/B0947BJ67M
Edit: Thinking about it, this is worse. Apple isn’t targeting low-income people. This is HP selling the poor a computer that doesn’t work properly.
Shipping with Windows S. That’s Microsoft’s version of a Chromebook for some light web browsing for 188 dollars. I wouldn’t buy it but this doesn’t look like a rip off at this price point.
They could just raise the price to $198 and slap another 4GB of RAM in it.
And if they raised the price to $250, they could go with a faster processor and better wifi!
And if they raised it by another $2,000, they could add an RTX 4090 graphics card.
S mode does allow you to turn it off, so it’s more like a hobbled version of home.
The computer is as bad as one I saw several years ago with 64GB of eMMC and a “Quad core processor”. That wasn’t a spec description: “Quad core processor” was literally the name shown in System. It did have 4 cores, at 400MHz, boosting to 1.1GHz. The buyer changed their mind and we couldn’t give it away.
Of course that notebook is bad, but at a shitty-hardware price point you get shitty hardware. Apple sells shitty hardware at the price of premium hardware.
At a $188 price point, an additional 4GB of memory would probably add ~$10 to the cost, which is over a 5% increase. However, that is not the only component they cheaped out on. The linked unit also only has 64GB of storage, which they should probably increase to have a usable system …
And soon you find that you just reinvented a mid-market device instead of the low-market device you were trying to sell.
4GB of RAM is still plenty to have a functioning computer. It will not be as capable as a more powerful computer, but that comes with the territory of buying the low-cost version of a product.
If they wanted it to be as cheap as possible, they could have installed Linux on it.
At that point you gotta wonder if it can keep up with an $80 Raspberry Pi, especially if HP tries to shoehorn Windows into that
In addition to the raw compute power, the HP laptop comes with a:
- monitor
- keyboard/trackpad
- charger
- windows 11
- active cooling system
- enclosure
I’ve been looking for a lapdock [0], and the absolute low-end of the market goes for over $200, which is already more expensive than the hp laptop despite spending no money on any actual compute components.
Granted, this is because lapdocks are a fairly niche product that are almost always either a luxury purchase (individual users) or a rounding error (datacenter users)
[0] Keyboard/monitor combo in a laptop form factor, but without a built in computer. It is intended to be used as an interface to an external computer (typically a smartphone or rackmounted server).
deleted by creator
Now let me present to you the laptops with 2GB of RAM still being sold here in Brazil: https://www.zoom.com.br/notebook/notebook-multilaser-legacy-cloud-pc132-intel-atom-x5-z8350-14-2gb-windows-10-bluetooth
And it’s not on Linux! Wow. Sounds so horrible.
I was looking at notebooks at Walmart the other day, and I was amazed that they almost all had less or the same amount of RAM as my phone.
Miniaturization is amazing. The limiting factor in how powerful we can make phones is not space for computational units (processors, RAM, etc.). It is the ability to deal with the heat they generate (and the related issue of rationing a limited amount of battery power).
Sounds about right for HP.
I don’t even understand how HP still exists. Can anyone name a single product they’ve made in the last ~15 years that wasn’t a complete piece of junk?
I really like their PageWide XL printers, but those are purely aimed at businesses. Just to name one thing I like :D
And those XL printers are the only thing I can think of. I won’t even consider buying a current HP computer/laptop/small printer/…
HP printer ink, that is their main source of revenue
And worst part: they installed Windows on it.
They moved to on-package RAM for a reason: to nickel and dime yo ass.
I needed to expense a Mac Mini for iOS development, and everyone (Me, the company, our purchasing department) was baffled at how much it cost to get 16 GB. And they only go up to 24GB. Imagine how much they’ll charge for 32 in a year!
It’s technically a bit faster, but yeah, I think charging more is the bigger motivation.
Companies primarily make decisions to maximise the profitability of someone and it’s never the consumer.
Sounds like one of those rare cases where engineering and marketing might agree on something.
Mac Mini is meant to be sort of the starter desktop. For higher end uses, they want you on the Mac Studio, an iMac, or a Mac Pro.
I assumed that the Mini was effectively a Mac without a monitor. Is it relatively underpowered too?
It’s not underpowered for average users, but it’s not meant for professional uses beyond basic office work.
Similar to the mini they offer the Studio which doesn’t have a monitor built in https://www.apple.com/mac-mini/compare/?modelList=Mac-studio-2023,Mac-mini-M2
Then for the higher end uses they offer a more typical tower format https://www.apple.com/mac-pro/
But would an M1 Mini be similar to an M1 iMac?
As far as I understand, the Mac desktops (mini, Studio, Pro) don’t have screens, the iMacs are stationary and do have a screen, and the MacBooks are the laptops.
8GB is definitely not enough for coding, gaming, or most creative work but it’s fine for basic office/school work or entertainment. Heck my M1 Macbook Air is even good with basic Photoshop/Illustrator work and light AV editing. I certainly prefer my PC laptop with 32GB and a dedicated GPU but its power adapter weighs more than a Macbook Air.
8GB would be fine for basic use if it was upgradable. With soldered RAM the laptop becomes e-waste when 8GB is no longer enough.
Yeah, the soldering is outrageous. I miss the time when Apple was a (more) customer friendly company. I could open my Mac mini 2009 and just add more RAM, which I did.
When I bought my first MacBook in ‘07 I asked the guy in the store about upgrading the RAM. He told me that what Apple charged was outrageous and pointed me to a website where I’d get what I needed for much less.
I feel that if Apple could have soldered the RAM back then, they would have.
I feel that if Apple could have soldered the RAM back then, they would have.
Apple used to ship repair and upgrade kits with guides on how to apply them. Not sure they were as anti-repair then as they are now.
Embrace, extend, extinguish is an attitude for more than one company I guess.
deleted by creator
I mean I develop software on an 8GB laptop. Most of the time it’s fine, when I need more I have a desktop with 128GB ram available.
Really depends what type of software you’re making. If you’re using python a few TB might be required.
8GB is definitely not enough for coding, gaming, or most creative work but it’s fine for basic office/school work or entertainment.
The thing is, basic office/school/work tasks can be done on any laptop that costs half as much as an 8GB MacBook.
This is true for part-time or casual use, but for all-day work use including travel, you get better build quality and far fewer problems with a pro-grade machine. We spend about the same on a MacBook, ThinkPad, Surface or ProBook for our basic full-time users.
While it may be a bit overkill for someone who spends their day in word, excel, chrome and zoom we save money in the long term due to reliability. There is far less downtime and IT time spent on each user over the life of the system (3-4 years). The same is true about higher quality computer accessories.
8GB of dedicated VRAM is hardly enough these days…
Especially with 4k
4k rendering or 4k textures?
This is my biggest lament about getting a 2060 without knowing how important vram is. I can make it perform better and more efficiently a bunch of different ways, but to my knowledge, I can’t get around the 6GB vram wall.
Why do they struggle so much with some “obvious things” sometimes? We wouldn’t have a USB-C iPhone if the EU hadn’t pressured them to make the switch.
deleted by creator
The eMac (looks like a toilet, sounds like a jet) came with 256MB of RAM in one of its two slots. Adding a 512MB stick was dirt cheap (everyone had at the very least 1GB in their PC)… well, dirt cheap unless you bought it from Apple…
It’s how Apple monetizes their customers. Figuring out an artificial shortcoming they can sell as an upgrade to them (check out dongles for example).
💸💸💸
They didn’t have a reason to switch to USB-C, and several reasons to avoid it for as long as possible. Their old Lightning connector (and the big 30-pin connector that came before it) was proprietary, and companies had to pay a royalty to Apple for every port and connector they manufactured. They made a lot of money off of the royalties.
They should do 4Gb. I hear M3 Macs make it seem like 8Gb.
If you allocate it right, you can add 200GB of swap space and then that 4GB of RAM will feel like 408GB!
I think you mean gigabytes, not gigabits.
8 Gb = 1 GB
Darn you broadband
That’s true. Data transmission is usually measured in bits, not bytes. Gigabit Ethernet can only transmit a maximum of ~125 MB/s.
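The conversion, for anyone double-checking: link speeds are quoted in decimal gigabits per second, so divide by 8 to get megabytes per second:

```python
def gbps_to_mb_per_s(gigabits_per_second):
    """Convert a link speed in Gb/s to MB/s (decimal SI units)."""
    return gigabits_per_second * 1000 / 8

print(gbps_to_mb_per_s(1))    # Gigabit Ethernet: 125.0 MB/s
print(gbps_to_mb_per_s(2.5))  # 2.5GbE: 312.5 MB/s
```

Real-world throughput lands a bit below these ceilings once Ethernet framing and protocol overhead are accounted for.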
You mean they can even make 0.5GB appear as 8GB?! That’s 16x! That apple silicon is just something else!
8Gb from 4Gb is 1GB from 0.5GB 😉
Oh man, I remember so many people defended 8GB since the M1 first came out (and since).
I always argued it would significantly reduce the lifetimes of these machines if you bought one, not just because you’d be swapping a lot more on the (soldered in BTW) ssd, but because after a few years of updates it would become unbearably slow, or hardware would fail, or both.
Didn’t stop people constantly “tHe aRchITecTuRE iS cOmPlETelY diFFeRenT!!!”
Sure it’s different, but it’s still just a computer. A technical person can still look at the spec sheet and calculate effective performance accounting for bus widths etc.
Disclosure: I bought a top spec 16GB M1 Mac Air on launch and have been extremely happy with it - it’s still going strong.
Didn’t stop people constantly “tHe aRchITecTuRE iS cOmPlETelY diFFeRenT!!!”
Different Turing Machine on different math and alternative physics, I guess.
I bought a top spec 16GB M1 Mac Air on launch
My condolences.
EDIT: do people genuinely believe that math doesn’t apply to Apple’s products, or do they just not understand even such concentrated sarcasm?
I can’t believe there’s no Linux reference yet!
Give your “8 gigs not enough” hardware to one of us and see it revived running faster than whatever you’re running now with your subpar OS.
Software and AI development would be hard with 8GB of RAM on Linux. Have you seen the memes about AI adding to global climate change? Not even Linux can fix the issues with ChatGPT…
I don’t think anyone anywhere is claiming 8GB RAM is enough for software and AI development. Pretty sure we’re talking about consumer-grade hardware here. And low-end at that.
My main development machine has 8GB, for what it’s worth. And most of the software in use nowadays was developed when 8GB was a lot of RAM.
This. My Mac has 16GB but I use half of it with a Linux virtual machine, since I use my Mac to write Linux (server) software.
I don’t need to do that. I could totally run that software directly on my Mac, but I like having a dev environment where I can just delete it all and start over without affecting my main OS. I could work effectively with 8GB. I could even give the Linux VM less memory, since all my production servers have way less than that, but I don’t need to, because 8GB for the host is more than enough.
Obviously it depends what software you’re running, but editing text, compiling code, and browsing the web… it doesn’t use that much. And the AI code completion system I use needs terabytes of RAM. Hard to believe Apple’s one that runs locally will be anywhere near as good.
The lede by OP here contains this:
[…] addition to Xcode 16 […] is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it
So either RecluseRamble meant that development with a feature like predictive code completion would work on 8GB of RAM if you were using Linux, or his comparison was shit.
That’s absolutely what I’m saying. Apple is just holding back that feature for upselling (as always) and because it’s hardly possible to debloat macOS.
Okay good, thanks for confirming. I remember Kate feeling very nice to use during my studies, more responsive than VS Code or Eclipse. But I also had 16 gigabytes of RAM, so I couldn’t be sure.
I don’t think anyone anywhere is claiming 8GB RAM is enough for software development
I do. GCC doesn’t need much. Vim/emacs work fine with 128 MB of RAM. With 1 GB you can run KDE and QtCreator instead of vim.
Macbook Pros aren’t really consumer grade hardware. Nor are they priced like consumer grade hardware.
Nor are they priced like consumer grade hardware.
Apple products in general aren’t.
Macbook Pros aren’t really consumer grade hardware.
They are even below that.
I’d love to see you run xcode 16 code completion on your superior OS. Send me a link once you’ve uploaded the vid.
Why limit it to proprietary software? Almost every linux distro can run Github Copilot X and Jetbrains, which both have had more time to be publicly used and tested and work better in my opinion.
Send me a video link of Mac having direct access to containers without using a VM (which ruins the point of containers). THAT is directly related to my actual work, as opposed to needing a robot to code for me specifically using Apple’s AI
Because that was what the article was about… I actually am a Linux user and fan; folks are just misreading the intentions of my post.
I would genuinely love to see it, because I’m stuck on Mac hardware to do my job, and I really hope one day they get crucified for their anticompetitive practices so I can freely choose the OS my business uses.
Pls provide source code.
There is a project being worked on called Darling, but it isn’t ready yet. The developers are making progress though.
removed by mod
I actually bought an M1 mini as a low-power Linux server. I was getting tired of the Pi 4 being so slow when I needed to compile something. Works really well, just need the Asahi team to get TB working. And for my server stuff, 8GB is plenty.
You wouldn’t happen to run a Jellyfin server on that Mac mini, would you? Currently looking for something performant with a small form factor and low power consumption.
I’ve run Plex servers on Mac Minis (M1). Docker on macOS finally runs well; the issues that were everywhere a couple of years ago have been resolved.
It ran very well on the hardware. The OP of this post is right that 8GB is not enough in 2024; however, I would also wager that the vast majority of commenters have not used macOS recently or regularly. It is actually very performant and has a memory scheduler that rivals the one found on GNU/Linux. Apple’s users aren’t wrong when they talk about how much better the OS is than Windows at using memory.
Not sure about jellyfin, but I assume it uses ffmpeg? The M1 is fast enough that ffmpeg can re-encode raw video footage from a high end camera (talking file sizes in the 10s of gigabyte range) an order of magnitude faster than realtime.
That would be about 20W. Apparently it uses 5W while idle — which is low compared to an Intel CPU but actually surprisingly high.
Power consumption on my M1 laptop averages about 2.5 watts with active use, based on the battery size and how long it lasts on a charge, and that includes the screen. Apple hasn’t optimised the Mac Mini for energy efficiency (though it is naturally pretty efficient).
TLDR: if you really want the most energy-efficient Mac, get a secondhand M1 MacBook Air. Or even better, consider an iPhone with Linux in a virtual machine - https://getutm.app/ - though I’m not sure how optimised ffmpeg will be in that environment… the processor is certainly capable of encoding video quickly; it’s a camera, so it has to be able to encode video well.
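That ~2.5 W figure can be sanity-checked from battery capacity and runtime. The numbers below are illustrative assumptions (roughly an M1 Air-class battery and a guessed light-use runtime), not measured values:

```python
# Average power draw ≈ battery capacity / runtime on one charge.
battery_wh = 50        # assumed, roughly an M1 MacBook Air battery
runtime_hours = 20     # assumed light-use runtime on a full charge

avg_watts = battery_wh / runtime_hours
print(f"{avg_watts:.1f} W average draw, screen included")
```

The same arithmetic works for any laptop: divide the watt-hour rating by the hours you actually get.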
The M1 is fast enough that ffmpeg can re-encode raw video footage from a high end camera (talking file sizes in the 10s of gigabyte range) an order of magnitude faster than realtime.
Depending on codec and settings, this might be super fast or super slow.
No I do not, but I don’t see any reason it shouldn’t work though. I have PiHole, Apache, email, cups, mythtv and samba currently.
I can run Arch Linux (BTW!) in a potato with starch RAM!
You can run Windows on 200 MB of RAM
deleted by creator
removed by mod
As I said: feel free to upgrade your MacBook, just don’t throw away the one with a “meager” 8 gigs, since it’s totally usable with a non-bloated system.
removed by mod
You replied, I replied back. That’s how public social media works. It’s unlikely we know each other.
removed by mod
removed by mod
Do you actually want to run an application that doesn’t exist on Linux?
I use a virtual machine for the two or three times a year I need to use a couple of garbage Windows-only programs. Usually for configuring some arcane piece of proprietary hardware that people were getting rid of because it’s incompatible with everything.
removed by mod
“Disrespectful” would be telling you that you in particular should continue to use Windows or Mac, and avoid Linux like the plague.
removed by mod
If I wanted your clothes, I wouldn’t have left them at goodwill.
For whom? My mother, who only uses Facebook, YouTube and Google, doesn’t need 8GB
Sounds like all she needs is a dirt cheap chromebook then
That’s what she has
Then her situation isn’t applicable to this topic
That all depends on how much work they want to put into troubleshooting it for her. I got my mom a Mac Mini when her PC needed to be replaced. It’s way less responsibility on my part. I mostly just answer the occasional how-to.
Mac is easier than Windows, sure, but not easier than a Chromebook. Nothing is simpler than a Chromebook. You can do much more with a Mac, but a Chromebook is much easier.
I don’t know what Xcode is so yeah, I haven’t been found wanting with my 8GB M2. Videos, downloading, web browsing, writing, chat applications, some photo editing, games (what I can actually play on a Mac, anyway), all good here.
16GB+ is obviously going to be necessary though, and it’s not exactly expensive to put into their base models, so it should happen soon.
I had a laptop with 8GB. Doing one of those things was fine, but when you opened up another program, it took forever to switch back to the browser
And then you have to activate Linux app support for something she needs that a Chromebook can’t do, and suddenly it’s more complicated than macOS?
deleted by creator
removed by mod
If you choose to be a weak little quiet corporate stan, then that’s up to you. Apple is well aware that third-party apps exist, and they’re well aware that machines with less RAM will need to be replaced far sooner than machines with more. RAM is cheap, and Apple’s integrated memory is no different in this regard. The only reason to use less is planned obsolescence. If you don’t believe that, then you’re either Tim Cook or you’re an idiot.
What is the obsession with shitting on people’s choices? I don’t understand the irony of demanding choice in this industry, then shitting on people when they make a choice you don’t agree with.
No one is shitting on other people’s choices. They are criticizing a major corporation’s choice to skimp on specs while charging a premium price: specs that can’t be upgraded and will absolutely lead to a shorter usable life. I find it odd that people get upset about criticism that isn’t aimed at them at all. The only thing I can think is that maybe they realize they were ripped off after putting so much money into Apple products and need to defend their financial decisions. Even then I don’t fully understand; I’ve purchased overpriced junk many times and don’t feel the need to defend the offending company. Maybe it’s because Apple has managed to make its customers feel like they’re in an exclusive club, even though everyone uses Apple products these days. A publicly traded company is around to make money and nothing more. It should never be held in reverence.
What is the obsession with shitting on people’s choices?
As much as people want to act like they are better than they were, say, 100 years ago, it’s not really true. Humans are really just advanced monkeys running around and very few can actually surpass that nature.
This we can agree on.
Sorry, boo, everyone wants to hate Apple these days. It’s the Zeitgeist. Even if you say something reasonable or perhaps factual, the people are against you and will react violently.
Accurate.
removed by mod
The unreasonable escalation of your response really makes you come across as exceedingly insecure.
See the OP’s ending sentence for reference.
I was quite literally illustrating the absurdity by being similarly absurd. Telling people to shut the fuck up about an issue is funny as hell to respond with a similar statement.
I wonder what the general use case is for the Mac Mini, MacBook Air, iMac, and MacBook Pro? People generally seem to do all the lightweight stuff, like social media consumption, on their phones, and desktops/laptops are used for the heavier-weight stuff. The only reason I’ve ever used a Mac was for iOS development.
I have a friend who is self-employed. He uses an iPhone and a MacBook Air. He only uses iMessage, Numbers, Safari and Apple Music for entertainment. He gets away with 8GB just fine and rarely has to reboot.
He probably could use a Chromebook or something even lighter, but the support and ecosystem were enough for him to pay the premium. His time is valuable to him so it was worth it to him.
Any computer with only 1MB of RAM is also usable … as a paperweight.
Well, 2000€ for a “Pro” model of the MacBook 14" with only 8GB of RAM sounds a bit strange, tbf. And +230€ for +8GB is straight-up greedy.
They said “Actually, 8GB on an M3 MacBook Pro is probably analogous to 16GB on other systems”, and well, that’s definitely not the case for their upcoming AI use cases, because - and many people seem to overlook this - their RAM is shared RAM (or, as they call it, “unified memory”), which means the GPU is limited by these 8GB of (V)RAM: it can only use whatever is left over after system usage.
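To make that concrete, a toy calculation (the system-usage number is illustrative, not a measurement; real macOS usage varies a lot with what’s open):

```python
# Unified memory: the GPU and CPU draw from the same 8 GB pool, so whatever
# the OS and apps occupy is unavailable to the GPU for AI/graphics workloads.
total_gb = 8.0
system_and_apps_gb = 4.5   # assumed typical usage with a browser open

gpu_available_gb = total_gb - system_and_apps_gb
print(f"Left for GPU workloads: {gpu_available_gb:.1f} GB")
```

A discrete GPU with its own 8GB of VRAM wouldn’t have that problem, which is why the “analogous to 16GB” line falls apart for on-device AI.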