WD and Seagate confirm: Hard drives sold out for 2026
144 points by layer8 15 hours ago | 167 comments

lccerina 14 hours ago
Everyone: things suck, better move my stuff to a small home server. The hyper-scaler mafia: NOT ON MY WATCH!

The only silver lining is that newer devices will have to scale down memory, so developers will have to ditch memory-sucking frameworks and start to optimize things again.

reply
stingraycharles 14 hours ago
I’m just afraid that prices of $everything will go up soon and will not come down anymore, like they did after Covid.

If it’s temporary I can live with it.

I guess this was inevitable with the absolute insane money being poured into AI.

reply
ethbr1 8 hours ago
> If it’s temporary I can live with it. I guess this was inevitable with the absolute insane money being poured into AI.

Hyperscalers refresh hardware and fire-sale old stock.

~2028 is going to see a lot of high power refurb supply hit the market.

reply
Analemma_ 7 hours ago
Do they? When I was at AWS in 2019-2021 there were some servers in the fleet (not many, but they definitely existed) which dated to before 2010. Maybe it depends on the instance type/class, but I think hyperscalers run hardware until it dies, then junk it. Where have you seen hyperscaler hardware for sale?
reply
ethbr1 4 hours ago
Non-standard racks (Meta?) started showing up on the aftermarket around the time that Meta and a few other folks were the only ones using them.

I forget the details, as it's been a decade-plus, but everything gets junked at some point.

They're almost always stripped down, then sold as board+chassis only.

reply
FrankBooth 13 hours ago
We useless eaters are to be priced out of life soon enough.
reply
dgxyz 13 hours ago
Traditionally that hasn't gone well for the rich folk.
reply
Eddy_Viscosity2 10 hours ago
It hasn't gone well for the non-rich folk either.
reply
brador 13 hours ago
Truer than even you dare to admit.

How many useless living humans do you know? They go somewhere. Something happens to them. Whatever it is, it’s about to happen to 30% of the population.

What’s the opposite of survivor bias?

reply
malfist 8 hours ago
It takes a profound lack of empathy to refer to your neighbors as "useless living humans"
reply
scarecrowbob 7 hours ago
Furthermore, it's usually just plain dangerous.
reply
brador 3 hours ago
It is meant in the capitalist sense. Your horror at the statement is the point.
reply
BobaFloutist 7 hours ago
> How many useless living humans do you know?

Oh, I can think of about 77 million right off the top of my head.

reply
roysting 13 hours ago
Traps tend to only go one way.
reply
stogot 11 hours ago
I am afraid my NAS will have a hard drive failure, and I won’t be able to order replacements. I should’ve bought a backup.
reply
cube00 14 hours ago
>If it’s temporary I can live with it.

Given that this has been going on for years at this point (high graphics card prices through crypto and now AI), it feels like this is the new normal, forever propped up by the next grift.

reply
dgxyz 14 hours ago
I don't think this ideology and investment strategy will survive this grift. There's too much geopolitical instability and investment restructuring for it to work again. Everyone is looking at isolationist policies. I mean, Mastercard/Visa are even seen as a risk outside the US now.
reply
lazide 13 hours ago
Yup, when you can’t trust partners (or even nominal allies), what else is there but isolationism?
reply
dgxyz 13 hours ago
It's not really isolation but exclusion. Push all risks as far away from you as possible.
reply
lazide 13 hours ago
When everything ‘outside’ is a risk, what would you call a summary of that policy?
reply
dgxyz 13 hours ago
Well, a risk has a level, however abstract, and it's either increasing or decreasing. You can look at your risk profile over time and work out how to define policy going forwards. It takes a long time to make changes at country level.

The US is medium risk and increasing rapidly. Run away quickly.

reply
iso1631 13 hours ago
cooperation.

Sure you have to isolate certain rogue states - North Korea, Russia, USA. Always the way.

reply
Fervicus 13 hours ago
> I don't think this ideology and investment strategy will survive this grift

Big tech will be deemed "too big to fail" and will get a bailout. The taxpayers will suffer.

reply
dgxyz 13 hours ago
Big tech has already failed. Which is why it got into politics.
reply
iso1631 13 hours ago
> I’m just afraid that prices of $everything will go up soon and will not come down anymore, like they did after Covid.

Just like the price of labour. Your salary went up and doesn't come down

In the UK weekly earnings increased 34% from December 2019 to December 2025.

CPI went up 30% in the same period.

Obviously that CPI covers things which went up more and things which went up less, and your personal inflation will be different to everyone else's. Petrol prices at the end of Jan 2020 were 128p a litre; at the end of Jan 2025 they are 132p a litre [0]. Indeed, petrol prices were 132p in January 2013. If you drive 40,000 miles a year you will thus see far lower inflation than someone who doesn't drive.

[0] https://www.rac.co.uk/drive/advice/fuel-watch/
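
Rough back-of-envelope sketch of that last point, treating personal inflation as a weighted mix of categories. Only the 30% figure and the 128p -> 132p petrol prices come from above; the basket weights are invented purely for illustration.

    # Personal inflation as a weighted average of category price rises.
    def personal_inflation(weights, rises):
        """Weighted average price rise across a personal basket."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9
        return sum(weights[k] * rises[k] for k in weights)

    petrol_rise = 132 / 128 - 1   # ~3.1% over the period
    other_rise = 0.30             # roughly CPI for everything else

    heavy_driver = personal_inflation({"petrol": 0.25, "other": 0.75},
                                      {"petrol": petrol_rise, "other": other_rise})
    non_driver = personal_inflation({"petrol": 0.0, "other": 1.0},
                                    {"petrol": petrol_rise, "other": other_rise})
    print(f"heavy driver: {heavy_driver:.1%}, non-driver: {non_driver:.1%}")
    # heavy driver: 23.3%, non-driver: 30.0% -- both below the 34% earnings rise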

reply
buran77 13 hours ago
> I’m just afraid that prices of $everything will go up soon and will not come down anymore, like they did after Covid.

That's how inflation works. In this case it seems narrower though, so there's hope the prices will go down, especially if the AI hype finds a reason to flounder.

reply
lowdude 13 hours ago
Unless people notice that they just built lots of useless datacenters and push back towards a mainframe + terminal setup, because, ah sorry, modern software just runs much better that way, and you can save money with our inexpensive laptop-plus-subscription model.
reply
dgxyz 14 hours ago
Saw this one coming and got my personal stuff out. It's running on an old Lenovo crate chucked in my hallway.

Work is fucked. 23TB of RAM online. Microservices FTW. Not. Each node has OS overhead. Each pod has language VM overhead. And the architecture can only cost more over time. On top of that "storage is cheap so we won't bother to delete anything". Stupid mentality across the board.

reply
roysting 13 hours ago
One tiny sliver of silver lining: the “storage/memory/compute is cheap” nonsense that has produced all kinds of outsourced human slop code is clearly going to have to die.

It could even become a kind of renaissance of efficient code… if there is any need for code at all.

The five guys left online might even get efficient and fast loading websites.

Honorable mention to the NO-TECH and LOW-TECH MAGAZINE sites, because I liked the effort at exploring efficient use of technology, e.g., their ~500KB solar-powered site.

https://solar.lowtechmagazine.com/about/

reply
dgxyz 13 hours ago
I think your ideological perspective is spot on.

We went from using technology to solve problems to the diametric opposite of creating new problems to solve with technology. The latter will have to contract considerably. As you say, many problems can be solved without code. If they even need to be solved in the first place.

On the efficiency front, most of what we built is for developer efficiency rather than runtime efficiency. Also needs to stop.

I'm a big fan of low tech. I still write notes on paper and use a film camera. Thanks for the link - right up my street!

reply
b3lvedere 14 hours ago
The main thing the powers that be have always underestimated is the insane creativity common people show when they want something but are forced to find alternative ways to get it. Not going to say it won't suck, but interesting ways will indeed be found.
reply
roysting 13 hours ago
You’re going to find what, ways to make hand-crafted survival RAM and drives in your backyard chip foundry?

Call me cynical if you like, but I don’t share this optimism built on the banal idea that somehow good always wins. That’s simply not how it goes; in fact the bad guys have won many times before. It’s just that “dead men tell no tales” and the winners control what you think is reality.

reply
louiskottmann 13 hours ago
People will find a way to not need as much RAM, and thus the devices that require it.

Same way the price of groceries going up means people buy only what they need and ditch the superfluous.

reply
zozbot234 13 hours ago
The Chinese have end-to-end production capacity for lower-capacity, lower-performance/reliability consumer HDDs, so these are quite safe. Maybe we'll even see enterprise architectures where that cheap bottom-of-the-barrel stuff is used as opportunistic nearline storage, and then you have a far lower volume of traditional enterprise drives providing a "single source of truth" where needed.
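
A minimal sketch of that split, purely for illustration (the class name and placement rule are invented, not any vendor's actual design): every object lands on the trusted enterprise tier, and the cheap consumer tier is only a best-effort mirror.

    # Cheap consumer drives as an opportunistic nearline cache, a smaller
    # enterprise pool as the authoritative copy. Names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class TieredStore:
        truth: dict = field(default_factory=dict)      # enterprise tier (authoritative)
        nearline: dict = field(default_factory=dict)   # cheap consumer tier (best effort)

        def put(self, key: str, blob: bytes) -> None:
            self.truth[key] = blob      # always written to the trusted tier
            self.nearline[key] = blob   # opportunistically mirrored to cheap disks

        def get(self, key: str) -> bytes:
            # Serve from the cheap tier when it has the object, otherwise
            # fall back to the single source of truth.
            return self.nearline.get(key, self.truth[key])

    store = TieredStore()
    store.put("dataset/shard-0001", b"raw training shard bytes")
    print(len(store.get("dataset/shard-0001")))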
reply
iamnothere 9 hours ago
In the same way that China is stepping into RAM production, I suspect they will step into the gap for high capacity drives as well. The market abhors a vacuum and China is eager to fill it, even at minimal levels of profit. Chinese manufacturers have become very good at providing an acceptable level of quality at a good price, maybe not the highest quality, but acceptable for consumer use.

I can understand that incumbents may not want to overinvest in capacity, which could be financially precarious, but they are also putting themselves in danger by opening up avenues for competition. One more thing ruined by AI mania, I suppose.

reply
b3lvedere 10 hours ago
I come from a time when people had to fit everything into 1 to at most 48 kilobytes for their entire computer. Later on I once went to Helsinki to watch with my own eyes what people can program within a limit of 4 to 64 kilobytes. Computers have amazed me, but the people who used them have amazed me many more times.

I wasn't saying it'll be good and that the good guys win, but a lot of insane creativity to circumvent restrictions will pop up.

reply
lazide 13 hours ago
One way of putting it is that the winners are ‘the good guys’.
reply
ckbkr10 14 hours ago
> The only silver lining is that newer devices will have to scale down memory, so developers will have to ditch memory-sucking frameworks and start to optimize things again.

No. Prices will just go up, less innovation in general.

reply
lazide 13 hours ago
A few places will have no choice - low price elasticity, combined with things that need to actually work.
reply
theandrewbailey 13 hours ago
At least we can add "use the least amount of RAM and drive space" to our AI prompts.

/s

reply
zozbot234 13 hours ago
Well you can do that, but then the AI won't be nearly as smart as it was before...
reply
sireat 10 hours ago
After RAM, SSDs and GPUs, now HDDs; what else is left to sell out? Power supplies, fans?

In a way it feels a bit absurd for these AI centers to hog HDDs.

As pointed out by others, neither training nor inference requires HDDs, and storing raw data should not require that much.

So my hypothesis is that it's a double whammy: overall declining consumer-side HDD demand, leaving data centers as the main source of demand, plus additional demand from the new AI centers.

I feel like the AI centers are just buying HDDs because why not throw an HDD in each server blade even if there is no need? The money is there to be spent and it must be spent.

As someone who has been building computers since 1989, it feels like the end of casual personal hobby building.

I will end with an imperfect analogy from multiplayer gaming. It is quite common in multiplayer games for higher-level players to want some tradeskill they neglected to acquire earlier. Maybe a new quest appears, or a new "must have" item that requires such a skill.

They (past me included) have too much game money and no wish to acquire tradeskill items slowly. So the "rich" will overpay by 2x or 10x or even 100x the usual price.

That is the free market at work, right?

In the process the whole low-level economy is destroyed due to second-order effects, meaning a new player starting out can only be a farmer.

So if a student comes to me wishing to start building computers, what advice do I give them? Farm something?

reply
zozbot234 9 hours ago
> As someone who has been building computers since 1989 it feels like end of personal hobby casual building.

We have a long way to go before the average PC costs even half as much as it did in 1989 (adjusted for inflation). And of course the performance for typical consumer use is orders of magnitude better than it was back then.

reply
fnands 14 hours ago
Damn. First GPUs, then RAM, now hard drives?

What's next, the great CPU shortage of 2026?

reply
geolqued 14 hours ago
reply
fnands 14 hours ago
Oh no, looks like my 8700k will have to hold out a little longer.
reply
Fervicus 13 hours ago
What's next is no custom-built PCs. They want us running dumb thin clients and subscribing to compute. Or it will be like phones: we'll get pre-built PCs that we aren't allowed to repair and that are forced into obsolescence every few years.
reply
squeefers 12 hours ago
"they"? i see companies jacking their prices up, plain and simple. and us idiots still pay. ask yourself does intel no longer wish to sell CPUs to consumers? doesnt sound reasonable that intel would want to decimate their main market so AI companies can rule the world for some reason
reply
63stack 9 hours ago
Why doesn't that sound reasonable, when Nvidia did exactly that?
reply
loeg 14 hours ago
I think hard drives came before RAM, but it all kind of happened contemporaneously.
reply
bilekas 14 hours ago
Better start hoarding Silica.
reply
post-it 14 hours ago
It'll be fine. The supply chain for these components is inelastic, but that means once manufacturing capacity increases, it'll stay there. We'll see lower prices, especially if there is an AI crash and a mass hardware selloff like some people are predicting.
reply
wongarsu 13 hours ago
The number of HDDs sold has been in decline for over a decade. I doubt there is massive appetite for expanding production capacity.

On the other hand, the total storage capacity shipped each year has risen, as a combination of HDDs getting larger and larger and demand shifting from smaller consumer HDDs to larger data center, enterprise and NAS HDDs. I'm not sure how flexible those production lines are, but maybe the reaction will be to shift even more capacity to higher-capacity drives with cutting-edge technology.

reply
zvqcMMV6Zcr 13 hours ago
Server-grade hardware (rack blades) is already a poor fit for consumer needs, and AI-dedicated hardware straight up requires external liquid cooling systems. It will be expensive to adopt them.
reply
m4rtink 10 hours ago
Strip it for usable parts (DRAM/VRAM/NAND chips, maybe also some of the controllers), then recover any valuable metals (copper heat sinks, gold on contact pins), simple as that. :)
reply
cubefox 14 hours ago
> According to Mosley, Seagate is not expanding its production capacities for now. Growth is to come only from higher-capacity hard drives, not from additional unit numbers.
reply
mrtksn 14 hours ago
If it takes 2 years to increase, after 2 years everything will be thin clients already. Completely locked in, fully under control and everybody used to it. Very dystopian TBH.
reply
post-it 11 hours ago
You're going to throw out all your computers in two years?
reply
mrtksn 9 hours ago
No, but you can switch and get comfortable with fully cloud-based solutions when your computer ages and the prices for a new one are through the roof.
reply
StopDisinfo910 14 hours ago
True if production capacity increases, but it's an oligopoly and manufacturers are being very cautious because they don't want to cut into their margins. That's the problem with concentration: the market becomes ineffective for customers.
reply
Maxion 14 hours ago
It's not about cutting into their margins: if they end up scaling up production it will take several years and cost untold billions. When the AI bubble pops, if there's no replacement demand there's a very real chance of them going bankrupt.
reply
adornKey 13 hours ago
Do the guys that buy out the market have real use for all the hardware, or is it just hype? A solution against investors trying to corner the market would be to sell virtual hardware. Let them buy as many options on virtual "to be delivered" hardware as they want. We also need an options market for virtual LLM tokens, where the investors can put all their money without affecting real people.
reply
alecco 2 hours ago
Sam Altman (just yesterday): Codex weekly users have more than tripled since the beginning of the year!

https://twitter.com/sama/status/2023233085509410833

Funny how they all meet at the "You'll own nothing" Swiss club.

reply
ksec 13 hours ago
There hasn't been a better time in the past 15 years to push for a new video or image codec. Saving storage space is important again.

This is assuming most of what we store is either images or video.

reply
croes 10 hours ago
New video codecs need hardware support for good performance, which means you need to buy new hardware. Not a good time for that.
reply
throwrg25 7 hours ago
Next on the data centers' menu: jet engines https://www.wsj.com/business/energy-oil/how-jet-engines-are-...
reply
embedding-shape 14 hours ago
I'll go against the grain and claim this might be a good thing long term. Yes, it also sucks; I was planning to expand my NAS, but I guess I'll figure out how to compress stuff instead.

Which goes into why I think this might be good. Developers have kind of treated disks as "oh well", with binaries ballooning in size even when it can easily be solved, and there is little care to make things lightweight. Just like I now have to figure out a different solution to recover space, I'm hoping that with a shortage this kind of thing will become more widespread, and we'll end up with smaller things until the shortage is over. "Necessity is the mother of all invention" or however it goes.
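
For what it's worth, a minimal sketch of the "compress instead of buy" idea using only the standard library; the paths and the preset are made-up examples, not a recommendation for any particular setup.

    # Recompress files from a NAS directory into .xz archives with the
    # stdlib LZMA codec. Paths and preset are hypothetical.
    import lzma
    from pathlib import Path

    def recompress(src_dir: str, dst_dir: str, preset: int = 9) -> None:
        dst = Path(dst_dir)
        dst.mkdir(parents=True, exist_ok=True)
        for path in Path(src_dir).rglob("*"):
            if not path.is_file():
                continue
            data = path.read_bytes()
            out = dst / (path.name + ".xz")   # note: flattens subdirectories
            out.write_bytes(lzma.compress(data, preset=preset))
            print(f"{path.name}: {len(data)} -> {out.stat().st_size} bytes")

    recompress("/mnt/nas/raw", "/mnt/nas/compressed")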

reply
63stack 14 hours ago
There is an increasing chance the "invention" will be that nobody owns personal computers and everyone has to rent from the cloud.
reply
embedding-shape 9 hours ago
If anything the ecosystem is moving away from that, not towards it. People seem more and more averse to relying on SaaS when local applications will do, but this might also be somewhat of a bubble I'm perceiving rather than a global trend, I suppose.
reply
altmanaltman 14 hours ago
Think that's more of a "silver lining" instead of the overall trend being a "good thing long term." It's still pretty terrible.
reply
embedding-shape 8 hours ago
Yeah, true, that's more aligned with what I was thinking and more accurate. Thanks :)
reply
greatgib 13 hours ago
No one can be surprised to see that all of these artificial "shortages" are impacting components produced by a monopoly or only a few actors...
reply
zozbot234 13 hours ago
That's the electronics industry in general, though. The shortages are real and a normal part of growing pains for any industry that's so capital-intensive and capacity-constrained.
reply
oxag3n 8 hours ago
It's not clear if they are going to redistribute production lines in favor of hyper-scalers - are we going to see fewer SATA drives on the market?
reply
m4rtink 14 hours ago
Looks like we need computer hardware reserves, the same way there are regional reserves for food, fuel and other critical commodities?

And for the same reason: to avoid the dominant players going "oh shiny" on short-term lucrative adventures or outright trying to manipulate the market, causing people to starve and making society grind to a halt.

reply
zozbot234 13 hours ago
The real "computer hardware reserves" is the used market. Average folks and smaller businesses will realize that their old gear now has resale value and a lot more of it will be entering the resale/refurbishment market instead of being thrown away as e-waste.
reply
m4rtink 10 hours ago
That is a very good point!

Still, at least in the short to medium term, there are many companies, institutions and individuals that are used to getting new hardware that is under warranty: developer hardware refreshes, store hardware updates and deployments, schools and academic institutions that provide hardware to students, or even just people getting a laptop for school use.

I think people are used to getting new hardware for all of these and might be left out in the cold without a machine, or have to opt for a reused or resold machine without the necessary technical knowledge to support it themselves.

reply
ta9000 14 hours ago
Save us, China.
reply
phatfish 14 hours ago
It just goes to show how totally corporations have captured western-aligned governments. Our governments are powerless to do anything (aside from some baby steps from the EU).

China is now the only solution to fix broken western controlled markets.

reply
prewett 7 hours ago
Cynicism is unhelpful, and it's not correct. This has nothing to do with governments and everything to do with market economics. This sort of thing happens every few years in computing.
reply
KRAKRISMOTT 5 hours ago
Then what's your solution? It's a capital-intensive business to get into; without some regulatory changes on either the supply side or the demand side, there's no way for this to be prevented naturally without consumers bearing the brunt of the downside. Yes, the market will eventually correct itself, but until then consumers suffer.
reply
arjie 14 hours ago
I picked up a few hundred TB from a chia farm sale. Glad for it. I think I'm set for a while. Honestly, the second they started buying this stuff I started buying hardware. The only problem for me is that they're even ruining the market for RTX 6000 Pro Blackwells.
reply
fastily 14 hours ago
If component prices keep going up and the respective monopoly/duopoly/triopoly for each component colludes to keep prices high/supply constrained, then eventually devices will become too expensive for the average consumer. So what’s the game plan here? Are companies planning to let users lease a device from them? Worth noting that Sony already lets you do this with a ps5. Sounds like we’re headed towards a “you will own nothing and be happy” type situation
reply
Maxion 14 hours ago
> Sounds like we’re headed towards a “you will own nothing and be happy” type situation

That's when I sell off my current hardware and house, buy a cow and some land somewhere in the boondocks, and become a hermit.

reply
b3lvedere 14 hours ago
It could be a level up from that.

"You will use AI, because that will be the only way you will have a relaxed life. You will pay for it, own nothing and be content. Nobody cares if you are happy or not."

reply
StopDisinfo910 14 hours ago
We could also vote the politicians protecting these uncompetitive markets out of power and let regulators do their job. There have been too many mergers in the component market.

You also have to look at the current state of the market. The level of investment in data centers spurred by AI is unlikely to last unless massive gains materialize. It's pretty clear some manufacturers are betting things will cool down and don't want to overcommit.

reply
pickleglitch 10 hours ago
> We could also vote the politicians protecting these uncompetitive markets out of power and let regulators do their job.

Could we though? Even if gerrymandering and voter suppression weren't already out of control, and getting worse, there are very few politicians who could or would do anything about all this.

Elections aren't going to save us.

reply
bravetraveler 14 hours ago
/dev/null as a service, mooning
reply
steve1977 14 hours ago
Time for some heavy regulation
reply
tjpnz 14 hours ago
That's not going to happen when AI is already propping up a significant chunk of the economy.

There is appetite in some circles for a consumer boycott but not much coordination on targets.

reply
squeefers 12 hours ago
> There is appetite in some circles for a consumer boycott

It's not being used anywhere lol, where are they meant to boycott?

reply
moomoo11 14 hours ago
Future Show HN: how I managed to parallelize 100 tape drives to load Windows and play video games
reply
nubinetwork 13 hours ago
You could do worse... https://youtu.be/1hc52_PWeU8
reply
blackhaz 14 hours ago
VHS cassettes: maybe not so obsolete, after all?

Also, the Return of PKZIP.

reply
Keyframe 14 hours ago
Video Backup System for Amiga!
reply
baal80spam 14 hours ago
Stacker and Doublespace!
reply
boobsbr 14 hours ago
MemMaker and QEMM
reply
fnands 14 hours ago
First they came for the GPUs, but I did not speak out, for I was not a gamer.

Then they came for the RAM, but I did not speak out, for I had already closed Firefox.

Then they came for the hard drives, but I did not speak out, for I had the cloud.

Then my NAS died, and there was no drive left to restore from backup.

reply
Havoc 14 hours ago
Supply of second-hand enterprise stuff is also showing a slowdown. Seeing less of it show up on eBay.
reply
lpcvoid 14 hours ago
But hey, we get slop videos of the pope doing something funny, that's just as cool as being able to purchase computer hardware, right?
reply
littlecranky67 14 hours ago
To be fair, heise is a German news site, and this very article was auto-translated by AI from its German counterpart.
reply
kasabali 13 hours ago
Machine translation had become pretty good long before the AI hype started.
reply
IshKebab 11 hours ago
Not really.
reply
kasabali 9 hours ago
Really.

Source: I've been using Chrome's built-in translation heavily since around 2018-19, and not before that only because that's when I realized it had become goddamn good.

reply
IshKebab 7 hours ago
GPT-2 was in 2019. That was really the start of the LLM bandwagon. You could argue it was ChatGPT that brought it to the masses, but that was still only in 2022. 3 years is not a long time.
reply
kasabali 6 hours ago
Yes, I argue exactly that. Notice how I avoided saying LLM or GPT and said "AI hype". Everyone here on HN had heard of GPT back then, yet we've been through what, 3, 4, 5 different hypes since 2018, the AI hype being only the latest.

Maybe to clarify my argument a little more: "AI" was already good and was being used for useful/productive purposes before it became the hype. It'd be better had it stayed that way instead of becoming the latest money-generation machine and sucking the air out of the room.

reply
lpcvoid 14 hours ago
AI use for translation is a good fit for the tech. The problem is generative AI.
reply
archagon 6 hours ago
Uhm, kind of, if you don't mind random stuff getting inserted into your translation sometimes.

(Source: this has actually happened to me.)

reply
coffeebeqn 12 hours ago
We can also lose our jobs in a few years so there’s that
reply
ChrisArchitect 9 hours ago
Previously:

Thanks a lot, AI: Hard drives are sold out for the year, says WD

https://news.ycombinator.com/item?id=47034192

reply
archagon 7 hours ago
Somebody once commented on social media that cryptocurrency feels like a dropshipped alien technology designed to deplete a civilization's resources.

Now that crypto has failed to paperclip maximize the world, maybe LLMs are attempt number two.

reply
iamshs 13 hours ago
Repairability, upgradability and standards compliance need to be the minimum in consumer products. No to proprietary connectors. No soldered SSDs or RAM. For home use, allow relaxed licensing options for discarded enterprise products like switches, WiFi access points etc. (Juniper Mist APs are fantastic, but are a brick without their cloud). Currently, I cannot put a market-bought SSD in my MacBook. I cannot put an SSD in my Unifi router without buying their $20 SSD tray. I cannot put third-party ECC RAM and SSDs in my Synology NAS because the policy has only been lifted for HDDs but nothing else. I fear the opposite will happen: only leveraged companies have access to DRAM and NAND, and they will use it to lock us into their ecosystems, as consumers won't even get access to storage on the open market otherwise.
reply
cubefox 13 hours ago
I'm confused, that doesn't make sense to me:

> They largely come from hyperscalers who want hard drives for their AI data centers, for example to store training data on them.

What type of training data? LLMs need relatively little of that. For example, DeepSeek-V3 [1], still a relatively large model:

> We pre-train DeepSeek-V3 on 14.8 trillion diverse and high-quality tokens

At 2 bytes per token, that's 29.6 terabytes. That's basically nothing compared to the amount of 4K content that is uploaded to YouTube every day.

1: https://arxiv.org/html/2412.19437v1
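
The arithmetic behind that estimate, spelled out; the ~2 bytes per token is a rough assumption for tokenized English text, not a figure from the paper.

    tokens = 14.8e12        # DeepSeek-V3 pre-training corpus, per the paper
    bytes_per_token = 2     # rough assumption
    print(f"~{tokens * bytes_per_token / 1e12:.1f} TB")   # ~29.6 TB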

reply
citrin_ru 11 hours ago
A few random thoughts:

There are many new data centers being filled with servers. Most servers have at least 2 HDDs (mirrored) for the OS. I would not be surprised if, at a huge scale, even 2 HDDs per server could cause an HDD shortage.

There are likely models being trained on 4K video, and that has to be stored somewhere too.

Even things like logs and metrics can consume petabytes for a large (and complex) cluster. And the less mature the software, the more logs you need to debug it in production.

In the AI race, investments are, if not unlimited, at least abundant. In such conditions, optimizing hardware usage is a waste of time and velocity is the only thing that matters.
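
A back-of-envelope sketch of that first point; the server count is a made-up round number, and annual worldwide HDD shipments have been on the order of a hundred-plus million units.

    # Hypothetical build-out: how many drives do OS mirrors alone consume?
    servers_deployed = 2_000_000        # made-up number of new servers
    boot_drives_per_server = 2          # mirrored OS drives
    print(f"{servers_deployed * boot_drives_per_server:,} HDDs just for OS mirrors")
    # 4,000,000 HDDs -- a few percent of worldwide annual HDD shipments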

reply
m4rtink 10 hours ago
So at least for HDDs it is unlikely they will have super-custom "AI HDDs", like they have with custom High Bandwidth Memory displacing regular RAM in fab output and being harder to reuse later.

So hopefully once they all go bust, there should be a lot of cheap enterprise HDDs on the market as creditors pick through the wreckage. :)

reply
greatgib 13 hours ago
Honestly, this looks highly suspicious to me. OK, they might need some big storage, like petabytes. But how can this be in proportion to the capacity that is usually needed for everything that is hard-drive hungry: any cloud service, any storage service, all the private photo/video/media storage for everything produced every day, all consumer hardware like computers...

GPUs I understand, but hard drives look excessive. It's like if tomorrow there were a shortage of computer cabling because AI datacenters need some.

reply
zozbot234 13 hours ago
If you're building for future training needs and not just present, it makes more sense. Scaling laws say the more data you have, the smarter and more knowledgeable your AI model gets in the end. So that extra storage can be quite valuable.
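
For reference, the usual parametric form of those scaling laws (the Chinchilla fit from Hoffmann et al., 2022) makes the data term explicit: loss falls as a power law in both model size N and training tokens D, with fitted constants E, A, B, alpha and beta.

    L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}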
reply
coffeebeqn 12 hours ago
If you’re building a text-only model then the storage is limited but once you get to things like video then it’ll explode exponentially
reply
Jach 13 hours ago
You may have answered your own question if they're wanting to train models on video and other media.
reply
cubefox 10 hours ago
So Veo (Google), Sora (OpenAI) or Seedance (ByteDance)?
reply
newsclues 14 hours ago
I hope the data centres burn
reply
icf80 14 hours ago
They're pushing for AI, but nobody will have a device to use it?
reply
Havoc 14 hours ago
A Chromebook with a shitty 64GB eMMC is what Google and friends would love you to use. Pay that cloud drive subscription!
reply
112233 14 hours ago
The TV is best device for unleashing your creativity by upvoting your favourite Sora creators! Become an expert at any field by activating premium prompts from our partners! By connecting camera you can have meaningful motivating discussions with your deceased loved ones (camera required for fraud prevention. Remember, not looking at the screen during ads is punishable by law)

You have 31/999 credits remaining. What activity would you like to productively participate in today?

reply
FMecha 14 hours ago
I feel traditional "rust" hard disks would be inefficient for AI use. Unless they include SSDs (which I feel these data centers are more likely to be using) in the definition as well...
reply
stockresearcher 10 hours ago
In a large rack-mount SAN with hundreds or thousands of disks connected to your computing equipment via Fibre Channel, the performance difference is pretty negligible.

The real advantage of SSDs in this use case is storage density and power efficiency. On the other hand, your compute resources might be packed in so tight with power-intensive stuff that you appreciate the spinning rust “wasting” space.

reply
foxrider 14 hours ago
They need it to hoard datasets.
reply
olavgg 14 hours ago
[flagged]
reply
nubg 14 hours ago
Sorry, do people not immediately see that this is an AI bot comment?

Why is this allowed on HN?

reply
dang 6 hours ago
Generated comments are not allowed on HN, and this has been the case for a long time: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

Identifying generated comments is not always easy, as others have pointed out, plus we don't come close to seeing everything that gets posted. If you see a post that ought to have been moderated but hasn't been, the likeliest explanation is that we didn't see it. You can help by flagging it or emailing us at hn@ycombinator.com.

With LLM comments, there's an important distinction between legit users (who may have no idea that they're breaking a rule, especially because it isn't explicit in the guidelines yet) and accounts that appear to be posting nothing but gen-AI text. If you (anyone and everyone!) see a case of the latter, definitely please email us because we've been banning those accounts.

Even in more borderline cases, though, it's still helpful to email us because sometimes we contact the user, if we can, to let them know that we've been getting such reports. Or we might tell them we've suspended their account until we hear from them that they won't post LLM-generated or processed comments.

reply
Maxion 14 hours ago
> Why is this allowed on HN?

1) The comment you replied to is 1 minute old; that's fast for any system to detect weird comments.

2) There's no easy and sure-fire way to detect LLM content. Here's Wikipedia's list of tells: https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing

reply
bilekas 14 hours ago
> Sorry, do people not immediately see that this is an AI bot comment?

How do you know that? Genuine question.

reply
Maxion 14 hours ago
To be fair, it is blindingly obvious from the tells. OP also confirms it here: https://news.ycombinator.com/item?id=47045459#47045699
reply
f311a 14 hours ago
> isn't just "high demand", but "contractual lock-out."

The "isn't just .., but .." construction is so overused by LLMs.

reply
hakanderyal 14 hours ago
It has Claude all over it. When you spend enough time with them it becomes obvious.

In this case the “it’s not x, it’s y” pattern and its placement are a dead giveaway.

reply
bayindirh 14 hours ago
Isn't it ironic to use AI to formulate a comment against AI vendors and hyperscalers?

It's not ironic, but bitterly funny, if you ask me.

Note: I'm not an AI, I'm an actual human without a Claude account.

reply
phatfish 12 hours ago
I wonder what the ratio of "constructive" use of AI is, versus people writing pointless internet comments.

It seems personal computing is being screwed so people can create memes, ask questions that take 30 seconds to find the answer to with Google or Wikipedia, and sound clever on social media?

reply
bayindirh 12 hours ago
If you think of AI as the whole discipline, there are very useful applications indeed, generally in the pattern recognition and regulation space. I'm aware of a lot of small projects which rely on AI to monitor ecosystems or systems, or use it as a nice regulatory mechanism. Also, the same systems can be used for genuine security applications (civilian, non-lethal, legal and ethical).

If we are talking generative AI, again from my experience, things get a bit blurry. You can use smaller models to dig data you own.

I personally used LLMs, twice up to this day. In each case it was after very long research sessions without any answers. In one, it gave me exactly one reference, and I followed that reference and learnt what I was looking for. In the second case, it gave me a couple of pointers, which I'm going to follow myself again.

So, generative AI is not that useful for me, uses way too many resources, and the industry-leading models are, well, unethical to begin with.

reply
nubg 13 hours ago
Yes, I found this ironic as well, lmao.

I do agree with the sentiment of the AI comment, and was even weighing just letting it slide, because I do fear the future that comment was warning against.

reply
A_D_E_P_T 14 hours ago
> “it’s not x, it’s y”

ChatGPT does this just as much, maybe even more, across every model they've ever released to the public.

How did both Claude and GPT end up with such a similar stylistic quirk?

I'd add that Kimi does it sometimes, but much less frequently. (Kimi, in general, is a better writer with a more neutral voice.) I don't have enough experience with Gemini or Deepseek to say.

reply
lsp 14 hours ago
The phrasing. "It's not just X, it's Y," overuse of "quotes"
reply
dspillett 13 hours ago
The problem with any of these tells is that an individual instance is often taken as proof on its own rather than an indicator. People do often use “it isn't X, it is Y” like constructs¹ and many, myself included sometimes, overuse “quotes”², or use m-dashes³, or are overly concerned about avoiding repeating words⁶, and so forth.

LLMs do these things because they are in the training data, which means that people do these things too.

It is sometimes difficult to not sound like an LLM-written or LLM-reworded comment… I've been called a bot a few times despite never using LLMs for writing English⁴.

--------

[1] particularly vapid space-filler articles/comments or those using whataboutism style redirection, which might be a significant chunk of model training data because of how many of them are out there.

[2] I overuse footnotes as well, which is apparently a smell in the output of some generative tools.

[3] A lot of pre-LLM style-checking tools would recommend this in place of hyphens, and some automated reformatters would make the change without asking, so there are going to be many examples in training data.

[4] I think there is one at work in VS which I use in DayJob, when it is suggesting code completion options to save typing (literally Glorified Predictive Text) and I sometimes accept its suggestion, and some of the tools I use to check my Spanish⁵ may be LLM based, so I can't claim that I don't use them at all.

[5] I'm just learning, so automatic translators are useful to check that what I've written isn't gibberish. For anyone else doing the same: make sure you research any suggested changes, preferably using pre-2023 sources, because the output of these tools can be quite wrong, as you can see when translating into a language you are fluent in.

[6] Another common “LLM tell” because they often have weighting functions especially designed to avoid token repetition, largely to avoid getting stuck in loops, but many pre-LLM grammar checking tools will pick people up on repeated word use too, and people tend to fix the direct symptom with a thesaurus rather than improving the sentence structure overall.

reply
notanastronaut 9 hours ago
Much like the "everything is fake" epidemic we're going through, every comment that even has a whiff of these tells will automatically be dismissed as "AI".

I've found myself doing it a time or two.

reply
oytis 14 hours ago
LLMs learned rhetorical negation from humans. Some humans continue to use it, because it genuinely makes sense at times.
reply
olavgg 14 hours ago
It is my text, enhanced by AI. Without AI, I would never have used the word "monopsony". So I learned something new writing this comment.
reply
dang 6 hours ago
Please don't post LLM-generated or LLM-edited or LLM-filtered comments here. These things are more or less the same in how they impact discussion.

We'd much rather hear you in your own voice: https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...

Using LLMs to learn, of course, is great. HN posts benefit from things people learn however they learn them. Just please write them yourself.

(Also, I'm sorry for the harsh reactions people have been showering on you for saying this. The community feels really strongly about it, and of course the norms around all this are still in flux, not just here but in society.)

reply
lpcvoid 14 hours ago
This behavior is part of the problem that got us here, using LLMs for everything.
reply
bilekas 14 hours ago
The irony is lost on you ...
reply
dang 6 hours ago
Please don't cross into personal attack.
reply
bilekas 3 hours ago
Sorry dang, didn't intend it to seem personal!
reply
badpenny 14 hours ago
[flagged]
reply
dang 6 hours ago
Yikes, that's not good. Please don't cross into personal attack.
reply
smcl 14 hours ago
Come on man, you're a "founder" and you can't even write your own comments on a forum?
reply
f311a 14 hours ago
You are losing your personality by modifying your text with LLMs. It saves you how much, 1 minute of writing?
reply
bombela 14 hours ago
It reads almost too AI to the point of being satire maybe?
reply
bilekas 14 hours ago
> "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.

This is the game plan of course: why have customers pay once for hardware when they can have you constantly feed them money over the long term? Shareholders want this model.

It started with planned obsolescence; this new model is the natural progression. There is no obsolescence even in the discussion when your only option is to rent a service that the provider has no incentive to even make competitive.

I really feel this will be China's moment to flood the market with hardware and improve their quality over time.

reply
actionfromafar 13 hours ago
"I think there is a world market for maybe five c̶o̶m̶p̶u̶t̶e̶r̶s̶" compute centers.
reply
ahsillyme 14 hours ago
> "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.

Yep. My take is that, ironically, it's going to be because of government funding the circular tech economy, pushing consumers out of the tech space.

reply
squeefers 12 hours ago
> pushing consumers out of the tech space.

post consumer capitalism

reply
shit_game 14 hours ago
This is the result of the long-planned desire for consumer computing to be subscription computing. Ultimately, there is only so much that can be done in software to "encourage" (read: coerce) vendor-locked, always-online, account-based computer usage; there are viable options for people to escape these ecosystems via the ever-growing plethora of web-based productivity software and Linux distributions which are genuinely good, user-friendly enough, and 100% daily-drivable, but these software options require hardware.

It's no coincidence that Microsoft decided to take such a massive stake in OpenAI - leveraging the opportunity to get in on a new front for vendor locking by force-multiplying their own market share by inserting it into everything they provide is an obvious choice, but also leveraging the insane amount of capital being thrown into the cesspit that is AI to make consumer hardware unaffordable (and eventually unusable due to remote attestation schemes) further enforces their position. OEM computers that meet the hardware requirements of their locked OS and software suite being the only computers that are a) affordable and b) "trusted" is the end goal.

I don't want to throw around buzzwords or be doomeristic, but this is digital corporatism in its endgame. Playing markets to price out every consumer globally for essential hardware is evil and something that a just world would punish relentlessly and swiftly, yet there aren't even crickets. This is happening unopposed.

reply
kuerbel 13 hours ago
What can we do? Serious question.

It's so hard to grasp as a problem for the lay person until it's too late.

reply
ASalazarMX 8 hours ago
I guess we can support open hardware projects like RISC-V, and homegrown chips. DIY chips will be expensive and very limited at first, so hopefully hobbyists will prioritize efficiency while they get better.

Fortunately we won't ever see a shortage of monitors and input devices, because then how would we consume the rent-a-remote-desktop services?

reply
shit_game 12 hours ago
Honestly, I don't know. I don't think there really is a viable solution that preserves consumer computation. Most of the young people I know don't really know or care about computers. Actually, most people at large that I know don't know or care about computers. They're devices that play videos, access web storefronts, run browsers, do email, save pictures, and play games for them. Mobile phones are an even worse wasteland of "I don't know and I don't care". The average person doesn't give a shit about this being a problem. Coupled with the capital interests of making computing a subscription-only activity (leading to market activity that prices out consumers and lobbying actions that illegalize it), this spells out a very dire, terrible future for the world where computers require government and corporate permission to operate on the internet, and potentially in one's home.

Things are bad and I don't know what can be done about it because the balance of power and influence is so lopsided in favor of parties who want to do bad.

reply
squeefers 12 hours ago
it goes mainframe (remote) > PC (local) > cloud (remote) > ??? (local)
reply
GCUMstlyHarmls 12 hours ago
mainframe (remote) > PC (local) > cloud (remote) > learning at a geometric rate ∴ > (local)
reply
AmazingTurtle 14 hours ago
"You'll own nothing. And you'll be happy"
reply
iso1631 12 hours ago
> "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.

These things are cyclical.

reply