It’s not running out. It’s being hoarded for the entropy machine.
Edit: anyone know if entropy-machine RAM can be salvaged for human use? Do they use the same sticks?
Yes, but you’ll need special hardware. Enterprise systems use registered (RDIMM) modules that won’t work in consumer systems. Even if your system supports ECC, that’s just UDIMM, aka consumer grade with error correction.
All that said, I’d bet you could find some cheap EPYC or Xeon chips plus an appropriate board if/when the crash comes.
Okay so I’d just need an enterprise board.
Yeah. And a CPU to match. Either EPYC or Xeon.
Engineering and qualification samples float around sometimes, which makes them more reasonably priced too. They sometimes have minor defects, but I’ve never had an issue that mattered.
I could use some extra memory. Just jam it into my head I’m sure it’ll work
Yeah, gonna be interesting. Software companies working on consumer software often don’t need to care, because:
- They don’t need to buy the RAM that they’re filling up.
- They’re not the only culprit on your PC.
- Consumers don’t understand how RAM works nearly as well as they understand fuel.
- And even when consumers understand that an application is using too much, they may not be able to switch to an alternative either way, see for example the many chat applications written in Electron, none of which are interoperable.
I can see somewhat of a shift happening for software that companies develop for themselves, though. At $DAYJOB, we have an application written in Rust and you can practically see the dollar signs lighting up in the eyes of management when you tell them “just get the cheapest device to run it on” and “it’s hardly going to incur cloud hosting costs”.
Obviously this alone rarely leads to management deciding to rewrite an application/service in a more efficient language, but it certainly makes them more open to devs wanting to use these languages. Well, and who knows what happens if the prices for Raspberry Pis and cloud hosting and such end up skyrocketing similarly.

Add to the list: doing native development most often means doing it twice. Native apps are better in pretty much every metric, but rarely are they so much better that management decides it’s worth doing the same work multiple times.
If you do native, you usually need a web version, Android, iOS, and if you’re lucky you can develop the Windows/Linux/Mac version only once, just accounting for the variation between them.
Do the same in Electron and a single reactive web version works for everything. It’s hard to justify multiple app development teams if a single one suffices.
At this rate I suspect the best solution is to cram everything but the UI into a cross-platform library (written in, say, Rust), keep the UI code platform-specific, and have it call into the cross-platform library over FFI. If you’re big enough to do that, at least.
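For the curious, the FFI boundary itself can be tiny. A minimal sketch, with all names made up for illustration: the business logic stays plain Rust, and a thin `extern "C"` wrapper is what the platform-specific UI (Swift, Kotlin, etc.) links against. In a real project the wrapper would live in a `cdylib` crate; here a `main` calls it directly just to show it works.

```rust
/// Pure business logic lives in plain Rust (hypothetical example function).
fn total_price_cents(unit_cents: u64, quantity: u64) -> u64 {
    unit_cents * quantity
}

/// Thin C-ABI wrapper that platform-specific UI code can call over FFI.
/// `#[no_mangle]` keeps the symbol name stable for the linker.
#[no_mangle]
pub extern "C" fn core_total_price_cents(unit_cents: u64, quantity: u64) -> u64 {
    total_price_cents(unit_cents, quantity)
}

fn main() {
    // Call the exported function directly to demonstrate it.
    println!("{}", core_total_price_cents(250, 4)); // prints 1000
}
```

The upside of this split is that the ugly part (the C ABI) stays a thin, mechanical layer, while all real logic remains testable Rust.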
Or just use rust for everything with Dioxus. At least, that’s what Dioxus is going for.
Are we gui yet?
I haven’t really kept up with Rust UI frameworks (or Rust at all lately, nearly nobody wants to pay me to write Rust, they keep paying me to write everything else). Iced was the most well-known framework last I tried any UI work, with Tauri being a promising alternative (that still requires web tech unfortunately). This was just me playing around on the desktop.
Is Dioxus easy to get started with? I have like no mobile UI experience, and pretty much no UI experience in general. I prefer doing backend development. Buuuut there’s an app I want to build and I really want it to be available for both iOS and Android… And while iOS development doesn’t seem too horrible to me, Android has always been weird to me.
> Is Dioxus easy to get started with?
I haven’t tried it myself but I’ve read the tutorials and it looks very React-inspired. It looks quite easy to pick up. It is still based on HTML and CSS but you can use one code base for all platforms.
> iOS
At my last job we had a stretch where we were maintaining four different iOS versions of our software: separate versions for iPhones and iPads, and for each of those, one in Objective-C and one in Swift. If anyone thinks “wow, that was totally unnecessary”, that should have been the name of my company.
This equation might change a bit as more software users learn how bloated apps affect their hardware upgrade frequency & costs over time. The RAM drought brings new incentive to teach and act on that knowledge.
Management might be a bit easier to convince when made to realize that efficiency translates to more customers, while bloat translates to fewer. In some cases, developing a native app might even mean gaining traction in a new market.
I’m not too optimistic on that one. Bloated software has been an issue for the last 20 or so years at least.
At the same time upgrade cycles have become much slower. In the 90s you’d upgrade your PC every two years and each upgrade would enable entire use cases that just weren’t possible before. Similar story with smartphones until the mid-2010s.
Nowadays people use their PCs for upwards of 10 years and their smartphones until they drop them and crack the screen.
Devices have so much performance nowadays that you can really just run some electron apps and not worry about it. It might lag a little at times, but nobody buys a new device just because the loyalty app of your local supermarket is laggy.
I don’t like Electron either, but tbh, most apps running on Electron are so lightweight that it doesn’t matter much that they waste 10x the resources. If your device can handle a browser with 100 tabs, there’s no issue running an Electron app either.
Lastly, most Electron/webview apps aren’t really a matter of choice. If your company uses Teams, you will use Teams, no matter how shit it runs on your device. If you need to use your public transport, you will use their app, Electron or not. Same with your bank, your mobile carrier, or any other service.
We have an enormous problem with software optimization both in cycles and memory costs. I would love for that to change but the vast majority of customers don’t care. It’s painful to think about but most don’t care as long as it works “good enough” which is a nebulous measure that management can use to lie to shareholders.
Even mentioning that we’ve wiped out roughly a decade in hardware gains with how bloated and slow our software is doesn’t move the needle. All of the younger devs in our teams truly see no issue. They consider nextjs apps to be instant. Their term, not me putting words in their mouths. VSCode is blazingly fast in their eyes.
We’ve let the problem slide so long that we have a whole generation of upcoming devs that don’t even see a problem let alone care about it. Anyone who mentors devs should really hammer this home and maybe together we can all start shifting that apathy.
> many chat applications written in Electron, none of which are interoperable.
This is one of my pet peeves, and a noteworthy example because chat applications tend to be left running all day long in order to notify of new messages, reducing a system’s available RAM at all times. Bloated ones end up pushing users into upgrading their hardware sooner than should be needed, which is expensive, wasteful, and harmful to the environment.
Open chat services that support third party clients have an advantage here, since someone can develop a lightweight one, or even a featherweight message notifier (so that no full-featured client has to run all day long).
Open chat is good in theory, but corporate overlords need control. We can ignore them, but that is a lot of laptops with a lot of memory.
As a programmer myself I don’t care about RAM usage, just startup time. If it takes 10 s to load 150 MB into memory, that’s a good case for putting in the work to reduce the RAM bloat.
> As a programmer myself I don’t care about RAM usage,
But you repeat yourself.
Not sure why you said that. In programming I lean DRY unless it’s a separate use case. The repetition comes from the hundreds of left-pad implementations in node_modules.
I mean, don’t get me wrong, I also find startup time important, particularly with CLIs. But high memory usage slows down your application in other ways too (not just other applications on the system): you’ll get more L1, L2, etc. cache misses, and the OS is more likely to page/swap more of your memory out to disk.
Of course, I can’t sit in front of an application and tell that a non-local NUMA memory access caused a particular slowness either, so I can understand not really being able to care about iterative improvements. But yeah, that is also why I quite like using an efficient stack outright. It just makes computers feel as fast as they should, without me having to worry about it.
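To make the cache-miss point concrete, here’s a small self-contained Rust sketch: summing the same matrix in memory order versus striding across it gives the same answer but very different locality, and on large inputs the strided loop is measurably slower purely because of cache behavior. (Sizes and timings are illustrative, not from any real app.)

```rust
use std::time::Instant;

fn main() {
    const N: usize = 1024;
    // 1024 x 1024 u64 matrix, 8 MiB, stored row-major in one flat Vec.
    let m: Vec<u64> = (0..(N * N) as u64).collect();

    // Cache-friendly: walk consecutive addresses.
    let t = Instant::now();
    let mut row_sum: u64 = 0;
    for i in 0..N {
        for j in 0..N {
            row_sum += m[i * N + j]; // next element is 8 bytes away
        }
    }
    let row_time = t.elapsed();

    // Cache-hostile: same elements, but each step jumps a whole row (8 KiB).
    let t = Instant::now();
    let mut col_sum: u64 = 0;
    for j in 0..N {
        for i in 0..N {
            col_sum += m[i * N + j];
        }
    }
    let col_time = t.elapsed();

    assert_eq!(row_sum, col_sum); // identical work, identical result
    println!("row-major: {:?}, column-major: {:?}", row_time, col_time);
}
```

Same instructions, same data, same answer; the only difference is the access pattern, which is exactly the kind of slowness that never shows up in a feature list.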
Side-note
I heavily considered ending this comment with this dumbass meme:

Then I realized, I’m responding to someone called “Caveman”. Might’ve been subconscious influence there. 😅
I should learn C because of Unga Bunga reasons. I fully agree that lower RAM usage is better and cache misses are absolute performance killers but at the company I’m at there’s just no time or people or scale to do anything remotely close to that. We just lazy load and allow things to slowly cost more RAM while keeping the experience nice.
I mean, for me, it’s also mostly a matter of us doing embedded(-adjacent) software dev. So far, my company would hardly ever choose one stack over another for performance/efficiency reasons. But yeah, maybe that is going to change in the future.
I feel like it’s as much the number of libraries as the language. There are many bloated C/C++ applications.
One of the main culprits in all of this is Windows users themselves. How can people still degrade themselves like that? Stop installing Windows on your clients or servers…for the sake of humanity. You will do just fine without it; you just gotta re-learn some controls at most. Linux has been client-friendly for YEARS now.
640k ought to be enough for anybody
But how can I get anything done with these meager 128 GB computers?
TUI enthusiasts: “I’ve trained for this day.”
P.S. Yes, I know a TUI program can still be bloated.
Just glad I invested in 64 GB when it only cost $200. The same RAM today is nearly $700.
Removed by mod
The only software that matters.
Am I allowed to complain about software bloat if I don’t have that?
Software engineers and game designers should be allowed 4 GB of RAM.
*4 kilobytes is all they will ever need.
Hah, wishful thinking
Relevant community: !sustainabletech@lemmy.sdf.org
Everyone better start learning Rust.
Rust programs can definitely still consume a lot of memory. Not using a garbage collector certainly helps with memory usage, but it’s not going to change it from gigabytes to kilobytes. That requires completely rethinking how things are done.
That said I’m very much in favour of everyone learning Rust, as it’s a great language - but for other reasons than memory usage :)
True, but memory will be freed in a more timely manner and memory leaks probably won’t happen.
Memory leaks are more than possible in Rust. Rust’s type system prevents things like calling free on an already-freed resource; it very much still allows never freeing something even when nothing references it. It also makes things like arena allocation a fun endeavor compared to other systems languages. Not impossible, just trickier. Rust isn’t a panacea; you’d need something more like Idris, with its dependent type system, to enforce at compile time that resources actually get freed at runtime. But a fully dependent type system is very much a bleeding-edge thing.
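A minimal sketch of that point in 100% safe Rust, nothing here is `unsafe` and nothing is an error as far as the compiler is concerned: an `Rc` cycle that will never be dropped, plus `Box::leak`, which is a deliberate, stable std API for leaking.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// A node that can point at another node. An Rc cycle of these keeps
// every refcount >= 1 forever, so the destructors never run.
struct Node {
    next: RefCell<Option<Rc<Node>>>,
}

fn main() {
    let a = Rc::new(Node { next: RefCell::new(None) });
    let b = Rc::new(Node { next: RefCell::new(Some(a.clone())) });
    // Close the cycle: a -> b -> a.
    *a.next.borrow_mut() = Some(b.clone());
    println!("strong count of a: {}", Rc::strong_count(&a)); // prints 2

    // Or leak entirely on purpose: safe, stable, and sometimes useful.
    let forever: &'static str =
        Box::leak(String::from("never freed").into_boxed_str());
    println!("{}", forever);
}
```

The usual fix for the cycle is to make the back-edge a `Weak<Node>`, which doesn’t keep its target alive; the point is only that the borrow checker prevents use-after-free, not leaks.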
it might be time for me to learn GPUI, i wonder if it’s any good.
I was impressed with GPUI’s description of how they render text, and hope that it either grows into a general purpose GUI toolkit, or inspires one with a similar approach. It has a long way to go, though.
You might find this interesting:
https://github.com/longbridge/gpui-component

In the meantime, Qt is still the only cross-platform desktop toolkit that does a convincing job of native look and feel. If you’re not married to Rust, you might have a look at these new Qt bindings for Go and Zig:
https://github.com/mappu/miqt
https://github.com/rcalixte/libqt6zig
I’m not sure what there’s less excuse for, the software bloat or the memory running out.