

Love the pronunciation section, it settles nothing.


This one won’t peak for a few years, so expect a new all-time high every month until then.


Do you think he would use GeForce drivers?


You are telling me that you can’t proof-of-concept something without a matching tech stack? Or learn exactly how a new tech works? It also sounds like you should never give your work any of your personal time; you won’t gain anything except more work.


I think it means they set up new tech on their homelab to learn how everything works and how to break it. Then, when a problem arises at work where one of these solutions is needed, you can implement it without any large issues. It makes sense if your hobby is close to or adjacent to your day job, you are on salary, and your boss treats you right.


I figured it out. It’s drivers. Intel provides open-source Linux drivers.


That is the one game I won’t save-scum. It kills the spirit of D&D if everything can’t go to hell.


The channel has no technical depth and has had one too many controversies to be taken seriously. It’s a way for large companies to “give” him the newest, most expensive shit to have it featured, so viewers see the newest, most expensive shit and want to buy it. He is a salesperson.


The Intel Battlemage is an interesting choice. I’d put one in a non-gaming PC if the price was right, but what is his reasoning? I have Alchemist cards that have been running 8 hours a day on QA PCs perfectly fine for 2+ years. They sucked for the first couple of months until Intel significantly improved the drivers.


While UE5 is a one-size-fits-all engine for game dev, that does not mean every game has to chase the highest-quality graphics possible. CO took that route. You can do whatever you want and optimize as much as you can. You are correct: studios are just going ham on the engine’s capabilities and not caring about the hardware requirements. Having minimum requirements of a 12th-gen Intel CPU, 32GB of RAM, and a 3060 is fucked up. And you need 64GB of RAM for good performance. My biggest gripe with the UE4 vs. UE5 debate is that you can still make a lo-fi game that runs at 200fps. It’s a design choice, not an engine choice.


I hate to be the one to say it, but an ambitious game made by a small team is a shit example to use. They did not have the budget to optimize, and the game is full of small technical flaws. Whoever told you that UE4 is better than UE5 is wrong, even without Lumen or Nanite. Also, most games in UE4 only used 2K textures, while in UE5 8K textures are common; the assets are heavier in general.


You are the first person I have ever heard say UE5 sucks. Why?

Exactly. “Switches of comparable quality” is a fucking joke. Gateron switches are better. My Kailh and Akko switches are next level and function exactly like I want. In my tester boards the Cherry switches stand out a bit because they feel awful.


Your name is amazing


Ask Steve why he was working on my feature branch. Steve is not a smart person. He also built, over a weekend, a feature that another team was already working on, and implemented it on Monday morning. That feature was already finished on Friday; the PR was just waiting for approval. While 10x devs work fast, they create 10x the work for everyone else. He no longer works here, and it turns out he burned every single team with shit like this. It is so hard to get rid of someone who can work fast when upper management is convinced that someone productive and smart can do no wrong. They ignore the fucking carnage these people create.


I wasn’t going to get into the whole lossiness of the formats, so I simplified it to full image versus compressed format. It is interesting that it only saves 20%-40%. I was under the impression that the page only rendered the image size necessary to fit the layout, not the full-resolution image. Forcing it to less lossy or lossless would mean the larger image is always available to be rendered without another web request.
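
For context on the “renders only the size needed” point: that behavior comes from the page offering multiple pre-scaled variants via srcset, not from the format itself. Here is a minimal TypeScript sketch, assuming hypothetical pre-scaled files like /img/hero-640w.webp already exist on the server (the base path, widths, and naming scheme are made up for illustration):

```typescript
// Sketch: emit an <img> tag whose srcset lists several pre-scaled variants.
// The browser picks the smallest variant that fits the layout slot described
// by "sizes" and never downloads the full-resolution file.
function responsiveImg(base: string, widths: number[]): string {
  const srcset = widths.map((w) => `${base}-${w}w.webp ${w}w`).join(", ");
  return `<img src="${base}-${widths[0]}w.webp" srcset="${srcset}" ` +
    `sizes="(max-width: 600px) 100vw, 600px" alt="">`;
}

console.log(responsiveImg("/img/hero", [320, 640, 1280]));
```

If no srcset is offered, the browser fetches whatever single file src points at and merely scales it with CSS, which matches the “full image always fetched” behavior described above.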


Maybe this should come with a warning. The purpose of WebP is to quickly serve images to the user without grabbing the entire image’s data. Without WebP, all images will be fully loaded; under the right conditions a page could load real slow.
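
To illustrate the mechanism: browsers that support WebP advertise it in the Accept header, and servers commonly use that to pick the lighter file. A minimal Node/TypeScript sketch, with made-up file names and port, assuming hero.webp and hero.png sit next to the script:

```typescript
// Sketch: content negotiation for WebP. If the client says it accepts
// image/webp and the variant exists, serve it; otherwise fall back to the
// heavier PNG. Disabling WebP on the client forces the fallback path,
// which is exactly why pages can get slower.
import * as http from "node:http";
import * as fs from "node:fs";

http.createServer((req, res) => {
  const useWebp =
    (req.headers.accept ?? "").includes("image/webp") &&
    fs.existsSync("./hero.webp");
  const file = useWebp ? "./hero.webp" : "./hero.png";
  res.writeHead(200, { "Content-Type": useWebp ? "image/webp" : "image/png" });
  fs.createReadStream(file).pipe(res);
}).listen(8080);
```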


I can’t swear or reference other team members anymore; it was considered hostile. Fuck Steve, trying to get his git numbers up by running a linter on my feature branch while I am still developing it. Now I can’t fucking quickly read the code. It is a mess for a reason; it is temporary. I hate Python for this. I come from C++ land and need my whitespace.


After being in an ICU for a busy night and being in the same room while another person didn’t make it, I am DNR on like everything. You shouldn’t need to do that to my body for me to be alive.


Every single person I have met with a CAT phone says they are fucking indestructible. I used to do IT for manufacturing, agriculture, and construction companies, so that is quite a few people.