[ home / bans / all ] [ amv / jp ] [ maho ] [ f / ec ] [ qa / b / poll ] [ tv / bann ] [ toggle-new ]

/maho/ - Magical Circuitboards

Advanced technology is indistinguishable from magic


 No.5464

Chip Shortage Part 2: The AI Datacenter Build Out

DRAM prices have risen 100% across a number of products, due in part to DDR4 going EOL, DDR5 being in high demand from datacenters, and OpenAI putting in an order for 40% of global DRAM manufacturing capacity. Presumably GPU silicon price increases to follow at some point...

 No.5465

FUCK AI.

 No.5466

i feel lucky i got a new computer before this whole fuckery happened.

 No.5467

>>5466
Same here, it was just weeks before this that I got the essentials. Just missing the GPU, but I have to find some sort of contract where I won't spend more than 50 bucks monthly for my pick.

 No.5468

I was planning on upgrading my RAM and this bullshit happened. God damnit.

 No.5469

File:C-1763369829804.png (26.79 KB,540x387)

Peachy. Just peachy. I was going to build a new PC in August and I postponed it until Christmas because I was going to be too busy with classes.

I'm gonna hope my 9 year old machine holds up until prices go down to normal, because if that one and the spare Dell Optiplex (7 years old) both crap the bed, I'm gonna be stuck with a N100 MiniPC.

 No.5470

File:[Erai-raws] Alma-chan wa K….jpg (188.6 KB,1920x1080)

He's a bit late. *cough* >>5386

Unless the AI crash happens in the next few weeks (which is always possible), I assume the prices will only go higher and higher, so it's probably still a good idea to get it if you can afford it.
I did my upgrade early this year since I figured prices would spike due to uhh.. off-topic reasons.
Kind of wish I went for that second set of RAM, but it's a "what if" scenario as I'm nowhere near maxing out what I currently have. I don't know what I could do with 192GB of RAM that I couldn't do with 96. You need like 700GB of it, and thus a special motherboard, to load the big AI text models.

https://www.amazon.com/dp/B0CCXQF414 is what I have and I paid $300. This is my computer for the next 5 or so years, though.

 No.5473

File:R-1763388235677.jpeg (3.1 MB,4032x3024)

I got these two sticks just laying around if anyone wants them.

 No.5474

File:1756863092613.jpg (81.79 KB,640x480)

>>5470
I got a 96GB CL34 6800 kit for around 190 bucks a few months ago, though I'm not sure how good or bad CL34 is, and I bet prices here are generally cheaper than in the US. I was thinking about getting a second kit at some point in the future, mostly because I really enjoy the pre-rendering capabilities for editing, but the last time I used up 4 slots at once was 20 years ago, and nowadays I often read that this approach can come with stability issues. I'm not sure.

I'm still on a 900 series GPU with 2GB VRAM though because I spent literally all my money on the other parts and just recently managed to get out of the red when the months turn.

>>5473
Really nice offer if genuine, I hope a poornon gets to pick it up.

 No.5475

Also have some parts laying around if anyone is interested. A 1660 Super, 64GB DDR4 RAM set and a different 32GB DDR4 RAM set.

 No.5476

I never trust fear mongering PC related news. Someone is probably profiting from people panic buying RAM right now.

 No.5477

>>5476
Generally I agree with your skepticism, but PC part prices have become pretty fucked since covid and the AI boom. Waiting for a good deal used to be a no-brainer, but now prices rarely come down. The AI bubble bursting would help, but that could easily be years away. I hate that my instinct is to recommend people buy now, because it'll likely be worse in a year's time.

 No.5479

If AI companies come to be seen as permanent customers, is there a chance the market adapts and naturally produces more to meet the new demand and restore balance? I don't know anything about economics and whatnot.

 No.5482

i'm still on 16gb system ram... it's enough i swear

 No.5493

>>5482
It's enough until you start playing with comfyui

 No.5494

Upgraded my entire setup for the first time in a decade just a few months ago. Glad I went through with it before this started.

 No.5495

Good thing that used DDR4 is still cheap.

 No.5496

File:[SubsPlease] Spy x Family ….jpg (318.48 KB,1920x1080)

>>5493
WHERE ARE ALL THESE SURPRISE BOXES COMING FROM?!

 No.5498

>>5496
from ur mom lol

 No.5500

>>5482
Wait, you're telling me that 32gbs of ram is the new standard now?? Oh god...

 No.5504

>>5482
I have 4 GB DDR3

 No.5505

Win11 won't even run efficiently on most computers.
Now consider the 'developing' market and their computers.

Windows is toast.

 No.5509

>>5482
I have 16G DDR3 and I'm usually under 8G, but gaming brings it up quite a bit.

 No.5539

File:719f12a425e56bf18188ec1620….png (651.09 KB,880x880)

>>5464
Sigh. I'm gonna need at least 2TB of RAM for that locally hosted Kimi K2. Kimi actually needs 1.5TB of RAM, and I want to leave some headroom for future models.
Why? Because even $100 AI subscriptions are subsidized by AI companies, and they add up to $4800 over a 4 year period (which was the cost of the planned setup anyway). Surely they are going to jack up the prices in the future.

 No.5540

Is this related somehow to Nvidia being dumped by SoftBank and Berkshire Hathaway?

 No.5548

>>5540
Nvidia dump implies the AI bubble is about to pop, so the DRAM price jump should be temporary while AI companies build out datacenters. Hardware manufacturers haven't increased production either, only shifted existing capacity.

 No.5549

Buy MU

 No.5556

>>5539
RAM, as in DDR4 or even DDR5, is a very poor choice for running LLMs, especially very large models.

tk/s ≈ (memory bandwidth in GB/s) / (model size in GB).

Even the very best memory-optimized DDR5 CPUs (meaning like 6-channel, 8-channel, or greater server CPUs) only top out at around 500GB/s. For a model that requires 1.5TB, you're going to top out at less than 1tk/s if you go for CPU inference with RAM.

This is why Nvidia HGX and DGX use NVLink GPU-to-GPU interconnects: each interconnect provides direct memory access at memory speed, scaling memory bandwidth approximately linearly with each GPU. An H100 SXM5 96GB, for example, has a VRAM memory bandwidth of 3.36TB/s, so if you have an HGX chassis with 8x GPUs, you have an effective memory bandwidth of ~26.9TB/s; for a 1.5TB model, that would give you 17.9tk/s, which is very usable. Naturally, though, you can't fit 1.5TB into 96GB x8, which is why Nvidia partnered with Supermicro to sell entire racks as a single unit with server-to-server NVLink interconnects.
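The back-of-the-envelope math above can be sketched in a few lines (this is a rough upper bound that ignores batching, KV cache reads, and interconnect overhead; the 500GB/s server-CPU and 3.36TB/s H100 figures are the ones from this post):

```python
# Memory-bandwidth-bound estimate of LLM decode speed:
# every weight is read once per generated token, so
# tokens/sec ~= effective memory bandwidth / model size.
def tokens_per_second(model_size_gb: float, bandwidth_gbps: float) -> float:
    """Upper bound on decode tk/s for a dense model."""
    return bandwidth_gbps / model_size_gb

MODEL_GB = 1500  # the ~1.5TB model discussed above

# 8-channel DDR5 server CPU: ~500 GB/s
cpu_tps = tokens_per_second(MODEL_GB, 500)

# 8x H100 SXM5 at 3.36 TB/s each, NVLink scaling ~linearly
hgx_tps = tokens_per_second(MODEL_GB, 8 * 3360)

print(f"CPU inference: {cpu_tps:.2f} tk/s")  # ~0.33
print(f"8x H100 HGX:   {hgx_tps:.1f} tk/s")  # ~17.9
```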

 No.5562

>>5548
> the AI bubble is about to pop
>Asked during an earnings call about the timing of the Nvidia sale, SoftBank’s CFO Yoshimitsu Goto suggested the company needed liquidity to fund its OpenAI commitments.
>“This year our investment in OpenAI is large, more than $30 billion needs to be made,” he said. “For that, we do need to divest our existing assets.”​
Yeah, certainly popping 2 more weeks!
>>5556
Your result is totally wrong. Read how to calculate MoE inference speed.
For Kimi K2 you need to multiply your calculation by 30 to get the correct result.

 No.5563

File:0d273b23dc59e9f259b99ab64a….png (663 KB,880x880)

>>5556
You don't need to run the inference on the entire parameter space with MoE models.
>memory bandwidth / ((active experts / total experts) * model size)
gives a better approximation of tokens per second.
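Extending the same rough estimate to MoE (a sketch, not a benchmark: the ~32B-active / ~1T-total split is my assumption based on Kimi K2's published specs, and the active-parameter fraction stands in for the expert-count ratio above):

```python
# MoE decode estimate: only the active experts' weights are read per
# token, so scale the model size down by the active-parameter fraction.
def moe_tokens_per_second(model_size_gb: float, bandwidth_gbps: float,
                          active_params: float, total_params: float) -> float:
    """Bandwidth-bound tk/s when only active parameters are read per token."""
    active_fraction = active_params / total_params
    return bandwidth_gbps / (active_fraction * model_size_gb)

# Assumed Kimi K2 figures: ~1T total params, ~32B active per token,
# ~1.5TB in memory, 500 GB/s server-CPU memory bandwidth.
tps = moe_tokens_per_second(1500, 500, active_params=32e9, total_params=1e12)
print(f"CPU MoE inference: {tps:.1f} tk/s")  # ~10.4, roughly 30x the dense figure
```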

 No.5569

Black Friday week, and as expected, no RAM on sale this time. There was a sudden 24-hour price drop from €210 to €160 for a 2x16GB CL36 kit, but it's back to the inflated price again.

 No.5590

Why now? AI is not new. Why did they decide in Q4 2025 to buy up all the RAM?

 No.5592

i hate AI

 No.5594

i love AI

 No.5595

i'm indifferent towards AI

 No.5596

i'm AI

 No.5597

File:kago_ai.jpg (18.13 KB,210x240)

What about AI Kago?

 No.5598

AI flew over my house

 No.5623

The RAM

 No.5624

>>5623
No one needs more than 512 kB anyway

 No.5625

>>5624
My t61 has 8gb of ddr2 ram, checkmate atheists



