512GB of unified memory is insane. The price will be outrageous, but for AI enthusiasts it will probably be worth it.
taking Apple prices to a new extreme
Well this news means there will be cheaper second hand m1 and m2 machines on the market.
my college buddy and startup cofounder had a pathetically slow old laptop. he asks me the other day, “should i buy an ipad pro?” i was dumbfounded. bro you don’t even have a proper computer. we went around a bunch and he kept trying to get really bad ones like a base model mac mini. finally i persuaded him to get a 16" M1 Pro for a grand (about 700 after his trade in) and he couldn’t be happier.
I’m still using my M1 MBP like 4 years later. Don’t even care to upgrade! These things are great value.
I thought a few days ago that my “new” laptop (M2 Pro MBP) is now almost 2 years old. The damn thing still feels new.
I really dislike Apple but the Apple Silicon processors are so worth it to me. The performance-battery life combination is ridiculously good.
M2 user here. It is wonderful. You cannot get it to even heat up.
Honestly, the base-level M1 mini is still one hell of a computer. I’m typing this on one right now, complete with only 8GB of RAM, and it hasn’t yet felt in any way underpowered.
Encoded some FLAC files to M4A with XLD this morning. 16 files totalling 450MB; it took 10 seconds to complete. With my workflows I can’t imagine needing much more power than that.
Unfortunately that market is already flooded with functionally-useless 8GB machines.
can be configured up to 512GB, or over half a terabyte.
Are you ok mate?
They’re not wrong. 1000 GB is a terabyte, so 512 GB is over half a terabyte.
It’s exactly half a tebibyte though.
512 GiB is half a tebibyte. 512 GB is just under 477 GiB.
Yup.
- 512 GB > 1 TB/2 - what the article claims
- 512 GiB = 1 TiB/2 - what many assume
- don’t mix GiB and GB
Correct. But that means 512 GB is not half a tebibyte.
Ah, correct. RAM used GiB, so I guess I implicitly made the switch.
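For anyone still confused, the arithmetic both sides of this thread are doing fits in a few lines of Python:

```python
# Decimal (SI) units vs binary (IEC) units
GB = 10**9    # gigabyte
TB = 10**12   # terabyte
GiB = 2**30   # gibibyte
TiB = 2**40   # tebibyte

# The article's claim: 512 GB is over half a terabyte (decimal units)
assert 512 * GB > TB / 2          # 512 > 500, so yes

# The common assumption: RAM sizes are binary, and 512 GiB is
# exactly half a tebibyte
assert 512 * GiB == TiB / 2       # exactly half

# But 512 decimal GB expressed in binary GiB falls short of 512
print(512 * GB / GiB)             # ~476.84 GiB
```

Both statements in the thread are right; they just use different units.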
Isn’t unified memory terrible for AI though? I doubt it even has the bandwidth of five-year-old VRAM.
While GDDR7 VRAM obviously has more bandwidth, the sheer amount of memory can be a massive advantage for some models.
This type of thing is mostly used for inference with extremely large models, where a single GPU will have far too little VRAM to even load a model into memory. I doubt people are expecting this to perform particularly fast, they just want to get a model to run at all.
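A back-of-envelope sketch of that trade-off. The parameter count, quantization levels, and bandwidth figure below are illustrative assumptions, not specs or benchmarks:

```python
# Rough sizing: does a large model fit in memory, and what is the
# decode-speed ceiling? (All numbers here are assumptions.)

def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate size of the weights alone, in decimal GB."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A hypothetical 405B-parameter model:
print(weights_gb(405, 16))  # ~810 GB at fp16 -- doesn't fit in 512 GB
print(weights_gb(405, 4))   # ~202.5 GB at 4-bit -- fits with room to spare

# Generating each token streams the full weights through memory once,
# so memory bandwidth caps single-stream decode speed:
#   tokens/s <= bandwidth / model size
bandwidth_gb_s = 800  # assumed unified-memory bandwidth
print(bandwidth_gb_s / weights_gb(405, 4))  # ~4 tokens/s ceiling
```

So the pitch isn’t speed: it’s that the model loads at all, where a single consumer GPU’s VRAM wouldn’t come close.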
That extreme’s name? Albert Einstein.
Weird that my mind just read that as MKUltra.
Maybe appropriate for AI.
Is that much memory small enough to connect externally, or does the SoC just end up being a huge package with all that RAM on it?
It’s just external and soldered to the motherboard on Macs, no?
The storage prices are insane. It’s over $9,000 to get the model with 512GB of RAM, and it still only has 1TB of (probably non-removable) internal storage.
- 2TB is +$400
- 4TB is +$1000
- 8TB is +$2200
- 16TB is +$4600
That means the 8TB upgrade alone costs more than the entire base model Mac Studio at $2k.
For those prices I expect a RAID 5 or 6 system built in, god knows they have the processor for it.