No.5465
FUCK AI.
No.5466
i feel lucky i got a new computer before this whole fuckery happened.
No.5467
>>5466
Same here, it was just weeks before all this that I got the essentials. Just missing the GPU, but I have to find some sort of financing contract where I won't spend more than 50 bucks a month on my pick.
No.5468
I was planning on upgrading my RAM and this bullshit happened. God damnit.
No.5470
He's a bit late. *cough*
>>5386
Unless the AI crash happens in the next few weeks (which is always possible), I assume the prices will only go higher and higher, so it's probably still a good idea to get it if you can afford it.
I did my upgrade early this year since I figured prices would spike due to uhh.. off-topic reasons.
Kind of wish I'd gone for that second set of RAM, but it's a "what if" scenario since I'm nowhere near maxing out what I currently have. I don't know what I could do with 192GB of RAM that I couldn't do with 96. You need like 700GB of it, and thus a special motherboard, to load the big AI text models.
https://www.amazon.com/dp/B0CCXQF414 is what I have and I paid $300. This is my computer for the next 5 or so years, though.
No.5474
>>5470
I got a 96GB CL34 6800 kit for around 190 bucks a few months ago, though I'm not sure how good or bad CL34 is, and I bet prices here are generally cheaper than in the US. I was thinking about getting a second kit at some point in the future, mostly because I really enjoy the pre-rendering capabilities for editing, but the last time I populated all 4 slots at once was 20 years ago, and nowadays I often read that this approach can come with stability issues? I'm not sure.
I'm still on a 900 series GPU with 2GB VRAM though, because I spent literally all my money on the other parts and only recently managed to get out of the red when the month turns over.
>>5473
Really nice offer if genuine, I hope a poornon gets to pick it up.
No.5475
Also have some parts lying around if anyone is interested: a 1660 Super, a 64GB DDR4 RAM kit and a separate 32GB DDR4 RAM kit.
No.5476
I never trust fear-mongering PC-related news. Someone is probably profiting from people panic-buying RAM right now.
No.5477
>>5476
Generally I agree with your skepticism, but PC part prices have become pretty fucked since covid and the AI boom. Waiting for a good deal used to be a no-brainer, but now prices rarely come down. The AI bubble bursting would help, but that could easily be years away. I hate that my instinct is to recommend people buy now, because it'll likely be worse in a year's time.
No.5479
If AI companies come to be seen as permanent customers, is there a chance the market adapts and naturally produces more to meet the new demand and restore balance? I don't know anything about economics and whatnot.
No.5482
i'm still on 16gb system ram... it's enough i swear
No.5493
>>5482
It's enough
until you start playing with comfyui
No.5494
Upgraded my entire setup for the first time in a decade just a few months ago, glad I went through with it before this started.
No.5495
Good thing that used DDR4 is still cheap.
No.5500
>>5482
Wait, you're telling me that 32gbs of ram is the new standard now?? Oh god...
No.5505
Win11 won't even run efficiently on most computers.
Now consider the 'developing' market and their computers.
Windows is toast.
No.5509
>>5482
I have 16G DDR3 and I'm usually under 8G, but gaming brings it up quite a bit.
No.5539
>>5464
Sigh. I'm gonna need at least 2TB of RAM for that locally hosted Kimi K2. Kimi actually needs 1.5TB of RAM and I want to leave some headroom for future models.
Why? Because even $100 AI subscriptions are subsidized by AI companies, and at $100 a month they add up to $4800 over a 4-year period, which was about the cost of the planned setup anyway. Surely they are going to jack up the prices in the future.
No.5540
Is this related somehow to Nvidia being dumped by SoftBank and Berkshire Hathaway?
No.5548
>>5540
An Nvidia dump implies the AI bubble is about to pop, so the DRAM price jump should be temporary while AI companies build out datacenters. Hardware manufacturers haven't increased production either, only shifted current capacity.
No.5549
Buy MU
No.5556
>>5539
RAM, as in DDR4 or even DDR5, is a very poor choice for running LLMs, especially very large models.
tk/s = (memory bandwidth in GB/s) / (model size in GB).
Even the very best memory-optimized DDR5 CPUs (meaning 6-channel, 8-channel, or greater server CPUs) only top out at around 500GB/s. For a model that requires 1.5TB, you're going to top out at less than 1tk/s if you go for CPU inference with RAM.
This is why Nvidia HGX and DGX use NVLink GPU-to-GPU interconnects: each interconnect provides direct memory access at memory speed, scaling memory bandwidth approximately linearly with each GPU. An H100 SXM5 96GB, for example, has a VRAM memory bandwidth of 3.36TB/s, so if you have an HGX chassis with 8x GPUs, you have an effective memory bandwidth of ~26.9TB/s; for a 1.5TB model, that would give you 17.9tk/s, which is very usable. Naturally, though, you can't fit 1.5TB into 96GB x8, which is why Nvidia partnered with Supermicro to sell entire racks as a single unit with server-to-server NVLink interconnects.
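Back-of-the-envelope version in Python, plugging in the same figures as above (the 500GB/s and 3.36TB/s numbers are the ones from this post, not measurements):

# Rough throughput estimate for memory-bound LLM inference:
# every generated token streams the full weights through the memory bus once,
# so tk/s ~= memory bandwidth (GB/s) / model size (GB).
def tokens_per_second(model_size_gb, bandwidth_gbps):
    return bandwidth_gbps / model_size_gb

MODEL_GB = 1500       # the ~1.5TB model discussed above
cpu_bw = 500          # GB/s, high-end 6-8 channel DDR5 server CPU
gpu_bw = 3360 * 8     # GB/s, 8x H100 SXM5 over NVLink, assuming near-linear scaling

print(f"CPU inference: {tokens_per_second(MODEL_GB, cpu_bw):.2f} tk/s")  # ~0.33
print(f"8x H100:       {tokens_per_second(MODEL_GB, gpu_bw):.1f} tk/s")  # ~17.9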
No.5562
>>5548
> the AI bubble is about to pop
>Asked during an earnings call about the timing of the Nvidia sale, SoftBank’s CFO Yoshimitsu Goto suggested the company needed liquidity to fund its OpenAI commitments.
>“This year our investment in OpenAI is large, more than $30 billion needs to be made,” he said. “For that, we do need to divest our existing assets.”
Yeah, certainly popping in 2 more weeks!
>>5556
Your result is totally wrong. Read how to calculate MoE inference speed.
For Kimi K2 you need to multiply your calculation by 30 to get the correct result.
No.5563
>>5556
You don't need to run the inference over the entire parameter space with MoE models.
>memory bandwidth / ((active experts / total experts) * model size) gives a better approximation for tps.
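Quick sanity check in Python for the MoE correction; the ~1T total / ~32B active parameter counts for Kimi K2 are assumptions based on its published specs, and the 1.5TB figure is the one from the posts above:

# MoE inference only reads the active experts' weights per token,
# so the memory-bound estimate improves by roughly total/active params.
def moe_tokens_per_second(total_size_gb, active_params, total_params, bandwidth_gbps):
    active_size_gb = total_size_gb * (active_params / total_params)
    return bandwidth_gbps / active_size_gb

TOTAL_GB = 1500                    # ~1.5TB in RAM, per the posts above
TOTAL_P, ACTIVE_P = 1000e9, 32e9   # assumed Kimi K2: ~1T total, ~32B active
cpu_bw = 500                       # GB/s, same server-CPU figure as above

dense = cpu_bw / TOTAL_GB
moe = moe_tokens_per_second(TOTAL_GB, ACTIVE_P, TOTAL_P, cpu_bw)
print(f"dense estimate: {dense:.2f} tk/s")   # ~0.33
print(f"MoE estimate:   {moe:.1f} tk/s")     # ~10.4
print(f"factor:         {moe/dense:.0f}x")   # ~31, i.e. the 'x30' above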
No.5569
Black Friday week, and as expected, no RAM on sale this time. There was a sudden 24-hour price drop from 210 to 160€ for a 2x16GB CL36 kit, but it's back to the inflated price again.
No.5590
Why now? AI is not new. Why did they decide in Q4 2025 to buy up all the RAM?
No.5592
i hate AI
No.5594
i love AI
No.5595
i'm indifferent towards AI
No.5596
i'm AI
No.5598
AI flew over my house
No.5624
>>5623
No one needs more than 512 kB anyway
No.5625
>>5624
My t61 has 8gb of ddr2 ram, checkmate atheists