Personally, I think it will be a big headache for HP; people can be hard on laptops, and HP is already not excited about consumer support (e.g. the mandatory 15-minute wait for support calls). But if they make it work, I think there's probably a good number of people who feel like they need a laptop but don't care so much about the specifics and want to keep their costs low (as all of their costs appear to be rising right now).
For consumers who don't replace their laptops on a schedule it makes less sense.
RAM was this price some years back, and yet last summer/fall it was at an all-time low.
Will we continue to see steady improvement in top-quality CPUs/GPUs? Would they even bother releasing consumer versions of RAM faster than DDR5?
(A large factor here is, obviously, the cloud. With photos, documents, e-mail, IMs, etc. all hosted for cheap or free on "other people's computers", the total hardware demands on the end-user computing device are much lower. Think storage, not just RAM.)
It's true even in tech; half a year ago I switched my phone to a Galaxy Z Fold7, and I haven't used my personal laptop since then, not once. I have a separate company laptop for work, and I occasionally turn on my PC, but it turns out that a foldable phone is good enough to do everything on the personal side I'd normally use a laptop for. So here I am, with a primary compute device I don't have full control over - and yes, I'm surprised by this development myself, and haven't fully processed it yet.
It's a deeply flawed comparison, because many of the things we do with a phone now weren't things we'd do at all with the computers we grew up with. We didn't pay at the grocery store with a computer, we didn't buy metro tickets, and we didn't use one to navigate (well, there was a short period where we might print out maps, but anyway...)
When I grew up, I feel like our use of home computers fell into two categories:
1. Some of us kids used them to play games. Though many more would have a Nintendo/Sega for that, and I feel like the iPhone/iPad is a continuation of that. The "it just works" experience where you have limited control over the device.
2. Some parents would use it for work/spreadsheets/documents ... and that's still where most people use a "real" computer today. So nothing has really changed there.
There is now a lot more work where you do the work on services running on a server or in the cloud. But that's back to the original point: that's in many cases just not something we could do with old home computers. Like, my doctor can now approve my request for a prescription from anywhere in the world. That just wasn't possible before, and arguably isn't possible without a server/cloud-based infrastructure.
Phones/tablets as an interface to these services are arguably a continuation of those old dumb terminals connected to e.g. AS/400 machines and such.
> It's true even in tech; half a year ago I switched my phone to a Galaxy Z Fold7, and I haven't used my personal laptop since then, not once.
I do agree, I am in a similar situation.
> [...] my doctor can now approve my request for a prescription from anywhere in the world. That just wasn't possible before [...]
I'm picking nits, but wasn't this more or less instantaneous approval possible before with e.g., a fax and a telephone? Or (although this is a bit of a stretch) a telegram and telegraph?
There is a reason I have a server in my basement - it lets me edit files on my phone (if I must - the keyboard and screen space are terrible compromises, but sometimes I can live with it), laptop (acceptable keyboard and screen), or desktop (great keyboard, large screen); it also lets me share with my wife (I haven't got this working yet, but it can be done). I have nearly always had a server in my house because sharing files between computers is so much better than only being able to work on one (or using floppies). The cloud expands my home server to anywhere in the world: it offloads security onto someone else and makes it someone else's problem to keep the software updated.
There is a lot to hate about the cloud. My home servers also have annoyances. However, for most things it is conceptually better, and we just need the cloud providers to fix the annoyances (it is an open question whether they will).
The iPad was the perfect device for her (I've touched one perhaps twice in my entire lifetime).
Meanwhile my much more expensive laptop mostly interfaces with applications that primarily exist on servers that I have no control over, and it would be nearly worthless if I disconnected it from the Internet. Your central point is right, the economics are concerning, but I think it's been a ship slowly sailing away that we're now noticing has disappeared over the horizon.
The niche is still there, probably as big as it was before. For example, as I grew weary of being subject to services I have little control over, I set up my own home server using a refurbished PC. It has been an amazing journey so far. But I don't think a normie would ever get interested in buying a refurbished Dell, installing Debian on it, and setting up their own services there.
As long as there is a niche of people interested in buying their own computers, there will be companies willing to fill that niche.
There have been memory chip panics before; the US funded RAM production back in the '80s/'90s, in competition with Japan at the time.
The AI boom/"hyperscale" currently is almost exactly like the dotcom boom.
It's already starting to shake out. Anthropic is occupying the developer space, and OpenAI has just exited the video/media production space. More focused, vertical-market AI is emerging.
The current vortex of money between OpenAI <-> Microsoft <-> Oracle <-> Nvidia <-> Google <-> etc. is going to break.
Outside of the obvious economic effect of the dot-com boom - the creation of near-infinitely scalable, high-margin online businesses - there was a secondary effect on consumer electronics, with massive growth in demand for networked devices; back then there was much more of a balance between hardware growth in the network-infrastructure and data-center worlds and in desktop and mobile.
The AI boom’s hardware impact is much more skewed, as this article details.
Yes but these Chinese firms are a tiny share of the overall RAM/SSD market, and they'll have the same problems with expanding production as everyone else. So it doesn't actually help all that much.
* Chinese firms finance through different banks and investors than current RAM producers
* A company with a mission statement of consumer RAM won't have its supply outbid by data centers
* Chinese manufacturing has more expertise in scaling than any other manufacturing culture
I'd like to say a quick thank you for what the brief, golden period of globalisation was able to bring us.
I hope that that level of international trade and economic cooperation across geographical, ideological, political, and religious boundaries can be achieved again at some point in the future, but it seems the pendulum is swinging the other way for the time being.
I hope that, wherever the current direction ends up, there are lessons that can be learnt about what we had, and somehow fumbled, such that there is motivation enough to get back there.
Not everyone benefited. Market globalism wasn't particularly kind to the global south, and the specific mandates that the WTO enacted on countries in Latin America and Africa (the Washington Consensus) greatly increased local wealth disparities despite visibly growing GDP for a time.
America profited handsomely because, for most of the past 30 years, it was where the (future) transnational conglomerates were based. These companies stood to benefit from the opening up of international markets. Now that these companies are being out-competed by their Asian counterparts, instead of going back to the drawing board and innovating, they are playing the "unfair trade practices" card, and of course the current administration is on board with it.
Globalisation is not going anywhere, but America is increasingly alienating itself from allies who it could stand to benefit from.
Me too, but without all the slavery this time please. It'll never work if some actors are willing to abuse their workforces to keep prices low as they do.
Maybe... just maybe, a TODO list app shouldn't run 4 processes, and consume hundreds of megabytes of RAM?
With only a few kilobytes of code, you could send a UDP packet directly to your phone, received by an app you "wrote" in just a few lines (fire-and-forget, no auto-confirmation).
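A minimal sketch of that idea in Python - the phone's LAN address and port here are made up for illustration:

    import socket

    # Send side: one datagram, fire-and-forget (no confirmation expected).
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(b"buy milk", ("192.168.1.50", 9999))

    # Receive side - the few-lines "app" on the phone:
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv.bind(("0.0.0.0", 9999))
    data, addr = recv.recvfrom(1024)
    print(data.decode())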
What do I gain if more developers take this approach? Lightning fast performance. Faster backups. Decreased battery drain => longer battery service lifetime => more time in between hardware refreshes. Improved security posture due to orders of magnitude less SLOC. Improved reliability from decreased complexity.
It's been convenient that we can regularly throw better hardware at our constraints. But our convenience, much less our personal economics, is not necessarily what markets will optimize for, much like developers of Electron apps aren't optimizing for user resources.
Less bloat is 100% always a good thing, no matter what the market conditions are.
More seriously, and more ironically, we've now reached a strange time where even non-programmers can vibe-code better software than they can buy or subscribe to - not because the models are that good, or because programming isn't hard, but because enshittification has this industry rotten to the core and unable to deliver useful tools anymore.
- https://xn--gckvb8fzb.com/projects/
Their GitHub repos:
They even built a BBS-style reader client that supports Hacker News:
https://github.com/mrusme/neonmodem
I miss the days of the web being weird like this :-)
Tongue in cheek: we urgently need fusion power plants. For the AI and the helium.
Whenever I read about fusion, I get reminded of a note in the sci-fi book trilogy The Night's Dawn. In that story, the introduction of cheap fusion energy had not cured global warming on Earth but instead sped it up with all the excess heat from energy-wasting devices.
What matters is not what we don't have, but how we manage that which we do have.
There are several challenges, not least of which is storage. We have considerable leakage in most of our current helium storage solutions on earth because it’s so light. Our national reserves are literally in underground caverns because it’s better than anything we can build. Space just means any containment system will need to work in a wider range of pressure/temperatures.
Doing some googling yields an estimated cost of about $25,000 per kg. I can see why extraction from wells is preferred.
Non-helium hard drives are basically limited by their bearing spin hours. If one only spins a few hours a week, it'll probably run for decades. Not so with helium.
Running a VPS with Tailscale for private access, SQLite instead of managed databases, flat files synced with git instead of cloud storage. None of this requires expensive hardware; it just requires caring enough to set it up.
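To illustrate how little code the SQLite half of that needs, here's a sketch in Python (the path and schema are made up for the example):

    import sqlite3

    # One plain file instead of a managed database; back it up with git or rsync.
    db = sqlite3.connect("/srv/notes/notes.db")
    db.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
    db.execute("INSERT INTO notes (body) VALUES (?)", ("hello from my VPS",))
    db.commit()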
FWIW might want to check https://github.com/wg-easy/wg-easy to remove yet another managed elsewhere piece of your setup.
So yeah, it's fun. But don't underestimate that time; it could easily be time you'd otherwise spend with friends or family.
I have a homelab that supports a number of services for my family. I have offsite backups (rsync.net for most data, a server sitting at our cottage for our media library), alerting, and some redundancy for hardware failures.
Right now, I have a few things I need to fix:

- one of the nodes didn't boot back up after a power outage last fall; I need to hook up a KVM to troubleshoot

- cottage internet has been down since a power outage, so those backups are behind (I'm assuming it's something stupid, like I forgot to change the BIOS to power on automatically on the new router I just put in)

- various services occasionally throw alerts at me
I have a much more complex setup than necessary (k8s in a homelab is overkill), but even the simplest system still needs backups if you care at all about your data. To be fair, cloud services aren't immune to this, either (the failure mode is more likely to be something like your account getting compromised, rather than a hardware failure).
I love self-hosting and run tons of services that I use daily. The thought of random hardware failures scares me, though. Troubleshooting hardware failure is hard and time consuming. Having spare minipcs is expensive. My NAS server failing would have the biggest impact, however.
Kubernetes adds a lot of complexity initially, but it does make it easier to add fault tolerance for hardware failures, especially in conjunction with a replicating filesystem provider like Longhorn. I only knew that I had a failed node because some services didn't come back up until I drained and cordoned the node from the cluster (looks like there are various projects to automate this—I should look into those).
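(For what it's worth, the cordon half of that is easy to script with the official Kubernetes Python client - a rough sketch with a hypothetical node name; a full drain would additionally evict the pods already running there:)

    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()
    # Mark the failed node unschedulable - the same thing `kubectl cordon` does.
    v1.patch_node("node-3", {"spec": {"unschedulable": True}})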
Sure - self-hosting takes a bit more work. It usually pays for itself in saved costs (e.g. if you weren't doing this work yourself, you'd be paying money, which you had to work for, to have it done for you).
Cloud costs haven't actually gotten much cheaper (but the base hardware HAS - even now during these inflated costs), and now every bit of software tries to bill you monthly.
Further, if you're not putting services open on the web - you actually don't need to update all that often. Especially not the services themselves.
Honestly - part of the benefit of self-hosting is that I can choose whether I really want to make that update to latest, and whether the features matter to me. Often... they don't.
---
Consider: most people are running outdated ISP-provided routers with known vulnerabilities that haven't been updated in literally years. They do OK.
Most of this I didn't do for many years because it is not my core competence (in particular the security aspects). Properly fleshed-out explanations from any decent AI will catapult you to this point in no time. Maintenance? Almost zero.
p.s. Admittedly, it's not a true self-hosting solution, but the approach is similar and ultimately leads to that as well.
If anyone reading this has struggled with servers accumulating cruft and requiring maintenance, I recommend NixOS.
Combine that with deploy-rs or similar and you have a very very stable way to deploy software with solid rollback support and easy to debug config issues (it's just files in the ./result symlink!)
They need to get over it.
Pick up some Ansible and or Terraform/tofu and automate away. It can be easy or as involved as you want it to be.
The fact that I didn't know any of this is what is significant here. At some point I stopped caring about this sort of thing. It really doesn't matter any more. Don't get me wrong, I am as nerdy as they come. My first computer was a wire-wrapped 8080-based system. That was followed by an also wire-wrapped 8086-based system of my own design that I used for day-to-day computing tasks (it ran Forth). If someone like me can get to the point of not caring, there is no real reason for anyone else to care.
0.03 kW * 24 h * 365 d * $0.18 = $47.30/year
Even CPU TDP is not an accurate measure. My latest AMD CPU will pull more than its rated "TDP" under certain loads.
(FWIW, searching for the CPU model brings up an old review where the full system they’re testing pulls 145W under some amount of load. While that’s not nothing, it’s also not outrageous for a desktop PC that does the desktop PC things you require of it.)
TDP is a thermal measurement: it's how much heat energy your heatsink and fan need to be able to dissipate to keep the unit within operational temperatures. It does not directly correlate to the amount of electricity consumed in operation.
New systems idle at something like 25 Watts according to a lazy search. So 49-25=24W. That works out to $15/year hypothetically saved by going to a newer system. But I live in a cold climate and the heating season is something like half the year. But I only pay something like half as much for gas heat as opposed to electric heat. So let's just knock a quarter off and end up with 15-(15/4)=$11.25USD hypothetically saved per year. I will leave it here as I don't know how much the hypothetical alternative computer would cost and, as already mentioned, I don't care.
[1] https://forums.anandtech.com/threads/athlon-ii-x2-250-vs-ath...
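For anyone who wants to redo that napkin math with their own numbers, the whole calculation fits in a few lines (the electricity rate below is just the one implied by the $15/year figure above, not a universal number):

    delta_w = 49 - 25                          # old idle draw minus new idle draw, watts
    kwh_per_year = delta_w * 24 * 365 / 1000   # ~210 kWh/year
    rate = 0.0714                              # $/kWh, implied by the $15/year figure
    savings = kwh_per_year * rate              # ~$15/year
    heating_credit = savings / 4               # waste heat partly offsets gas heating
    print(round(savings - heating_credit, 2))  # -> 11.26, matching the ~$11.25 above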
HDD/SSD?
Pretty surprising to have this thing still be working 17 years later, unless it spent a good chunk of that in 'cold storage'.
Is that likely? History says it's inevitable, but timeframe is an open question.
If this does occur, unfortunately it isn’t like any of the production capacity is going to immediately shift or be repurposed. A lot of the hardware isn’t usable outside of datacenter deployments. I would guess a more realistic recalibration is 2-3 years of immense pain followed by gradual availability of components again.
The capital from the gulf is already disrupted. It isn't a matter of if or when anymore.
Stuff like that already exists for flash memory; I can harvest eMMC chips from ewaste and solder them to cheaply-available boards to make USB flash drives. But there the protocols are the same, there's no firmware work needed...
M1 Apple Silicon MacBook Airs are still good computers 5+ years after release.
Many games are still playable (and being released on!) the PS4, which is almost 12 years old.
The iPhone 15 Pro has 8GB of RAM, which will likely be sufficient for a long time.
Don't get me wrong, this whole parts shortage is exceptionally annoying, but we're living in a great time to weather the storm.
The current AI-induced shortages aside, the times have never been better in my opinion. There is overwhelming choice; ordinary consumers can access anything from Raspberry Pis all the way up to enterprise servers and AI accelerators. The situation was very different in the 1990s when I built my first PC.
That's not true at all.
There are a lot of people willing to buy smartphones with small screens, or smartphones running Linux or any OS other than iOS or Android.
But those people are not numerous enough to justify the gigantic initial investment necessary to provide viable products in this market. And the existing actors aren't interested in those niches.
I suspect we're trending back to the pre-personal computing era where access to 'raw' computing power will be hard to come by. It will become harder and harder to learn to program just because it'll be harder and harder to get your hands on the necessary equipment.
I got my first PC circa 1992 (a 2nd-hand IBM PS/2, 80286 processor with 2MB RAM and a 30MB HDD) and the "golden age" was already there. We are well over 40 years into an almost uninterrupted run of "pay less for more performance" in the home/personal computing space, and that's because that space started around 50 years ago. There was some fluctuation (remember the Thailand floods hitting HDD prices some years ago?) but demand was there and manufacturing tech became more efficient.
The actually important change is that, for most consumer uses, the perf improvements stopped making sense already, what, over 10 years ago?
It will be a scarcity mindset from here on out; I will always buy the top-tier thing.
For gaming, I have a dedicated device - a Nintendo Switch, but I also play indie PC games like Slay the Spire, Forge MTG, some puzzle games e.g. TIS-100.
Linux with i3 is fast and responsive. I write code in the terminal, no fancy debuggers, no million plugins, no Electron mess.
It’s enough for everything I need, and I don’t see a reason to ever upgrade. Unless my hardware starts failing, of course.
* someone has to write language specifying a program, whether natural language or a programming language.
* a programming language is a handle with specific properties at a specific level of abstraction. Whether it's a popular handle won't change the fact that it's far more than a toy.
The experience is quite immersive and well worth the upgrade that happened very progressively (WiFi 5 1080p then WiFi 6/7 4K).
In that sense, I suppose you could still make it work. Our society celebrated surrendering ownership of media to iTunes and Steam for our convenience, whittled down online content that didn't make us feel good, limited which applications we could install on our phones in the name of security and privacy, and eliminated our anonymity to save the kids. At this point, removing the hardware is the least surprising step, because as Captain Beatty says, "if you don’t want a house built, hide the nails and wood."
Or perhaps you were thinking of Brave New World.
"don't create the torment nexus, etc."
That said.... hopefully at least on Android side you can get a free (as in unchastified) OS to run on it.
Until they come for the HW.
a couple of my favorites: "rust programming socks - Google", "Amazon.com: waifu pillow", "Rick Astley - Never Gonna Give You Up", "censorship on hacker news - Google"
Think about it like this: imagine the AI/cloud/crypto companies who are buying up all these compute and storage resources realize they now control the compute hardware market, becoming compute lords. What happens when Joe/Jane Sixpack, or company XYZ, needs a new PC or two thousand but can't afford them due to the supply crunch? Once the compute lords realize they control the compute supply, they will move to rent you their compute, trapping users in a walled garden. And the users won't care, because they aren't computer enthusiasts like many of us here. They only need a tool that works. They *do not* care about the details.
The hardware lords could further this by building proprietary hardware, in collusion with the vendors they have exclusivity with, to make weaker terminal devices with just enough local RAM and storage to connect to a remote compute cluster. Hardware shortage solved!
All they need to do is collude with the hardware makers with circular contracts to keep buying hardware in "anticipation of the AI driven cloud compute boom." The hardware demand cycle is kept up and consumers are purposefully kept out of the market to push people into walled gardens.
This is unsustainable, of course, and will eventually fall over, but it could tie up computing resources for well over a decade as compute lords dry up the consumer hardware market, pushing people to use their hoarded compute resources instead of owning their own. We are in a period where computing serfdom is a plausible outcome, one that could do a lot of damage to freedom of use, hardware availability, and the future ability to use the internet freely.
Newer graphics hardware is pointless to me. The expensive new techniques I find incredibly offensive from an interactivity standpoint (temporal AA, Nanite & friends). I run Battlefield 6 at 75% render scale with everything set to low. I really don't care how ass the game looks as long as it runs well. I much more enjoy being able to effectively dispatch my enemies than observe aesthetic clutter.
The tipping point for MCUs was WiFi - which not only allows you to speak multiple protocols (UDP/Zigbee/HTTP/etc) and have audio IO, but also P2P communication and novel new form factors. There's been incredible progress with the miniaturisation of sensors and how we're able to understand and perceive our environment.
So yes, whilst traditional hardware is getting more expensive and locked down, there's a strong counter movement towards computing for everyone - and by that I also mean that there's going to be less abstraction in the entire stack. Good times ahead!
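To show how low the floor has gotten: a WiFi-connected MCU speaking UDP is a handful of lines of MicroPython on an ESP32-class board (the network credentials and target address below are made up):

    import network, socket

    # Join the local WiFi network (credentials are placeholders).
    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    wlan.connect("home-ssid", "hunter2")
    while not wlan.isconnected():
        pass

    # Speak UDP to anything on the LAN - a phone, a server, another MCU.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(b"sensor:22.5C", ("192.168.1.50", 9999))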
uBlock Origin has prevented the following page from loading:
https://xn--gckvb8fzb.com/hold-on-to-your-hardware/
This happened because of the following filter:
||xn--$document
The filter has been found in: IDN Homograph Attack Protection - Complete Blockage

War aside, I also bet there's going to be a huge demand for edge compute for other kinds of robotics: self-driving cars, delivery robots, factory robots, or general-purpose humanoids (Tesla Optimus, Boston Dynamics Atlas, 1X NEO, etc). Moving that kind of compute to the cloud is too laggy and unreliable. I know researchers who've tried it; the results were mixed.
Also, the engineers working on these platforms aren't going to reinvent the wheel every time they need to connect hardware together and they're going to use interoperable standards, like PCIe for storage or GPUs, DIMM slots for memory, ATX for power, etc. So I don't see general-purpose computing dying.
Everything today is a web app. If it doesn't exist and you want to vibe code it? It's probably going to become a web app, vibed using a web app.
The problem is, web apps are stupendous memory hogs. We're even seeing Chromebooks with 8 gigs of RAM now. LLMs are all trained for, and implemented in, apps assuming the user can have $infinity browsers running, whether it's on their PC or on their phone. It's going to be very hard to change that in a way that's beneficial to what passes for business models at AI companies.
Ah, the paradoxes of modern software.
Even remote VDI instances are accessed through a web page now.
On top of that, add all the corporate bloatware and securityslop-ware, and suddenly my "thin" client is using 60% of 10 available cores and 85% of 16GB of RAM.
I don't think it needs an explanation on how insane that resource usage is.
I'm doing more with a decade-old GPU - one manufactured before "Attention Is All You Need" - than I could 5 years ago, thanks to the quantization techniques implemented since.
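(For the curious, a sketch of what modern quantized loading looks like with the transformers/bitsandbytes stack - the model name is a placeholder, and whether a given old card is supported depends on its CUDA generation:)

    from transformers import AutoModelForCausalLM, BitsAndBytesConfig

    # Load an example model with 4-bit weights so it fits in a few GB of VRAM.
    model = AutoModelForCausalLM.from_pretrained(
        "facebook/opt-1.3b",  # placeholder model, not a recommendation
        quantization_config=BitsAndBytesConfig(load_in_4bit=True),
        device_map="auto",    # place layers wherever they fit
    )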
I’m holding on to my 32 bit machines.
Most Linux distributions dropped support for them (for good reason). But at the end of the day these machines are a fabric of up to ~4 billion bytes that can be used in a myriad of ways, and we only covered a fraction of the state space before we moved on.
Does all this not apply to businesses buying computers for their employees?
I must admit that my workflow isn't that heavy.
If the AI boom slows, it will free up manufacturing capacity for the consumer supply chain, but there is going to be a long drought of supply.
I would specifically add: whatever you have, or whatever you choose to buy, it would greatly benefit you to ensure a degree of Linux compatibility, so that its lifespan can be extended further than the greed enthusiasts at MS, Apple, and Google would like. They will be facing the same declines in purchasing habits and are further incentivised to assert their ownership over what you might mistakenly consider your devices.
In my country, the only USB HDD available for purchase in physical stores is the 4TB Seagate variant; that's 15,000 in our currency, almost 1.5 months' salary in the private sector.
Anything larger has to be imported, and with forex applied, prices run into the tens of thousands. When I read people on YouTube or blogs saying they rotate 15TB and larger drives in their NAS RAIDs, that seems like a dream for us, never to be fulfilled.
Phones and tablets only get replaced when they die.
Why should I throw away stuff that still works as intended?
The favicon and tab title change to something different every time you switch away (with names ranging from porn, to right-wing news, to 4chan, and I'm sure more).
All this author has convinced me to do is block their website.
I also don't agree at all with the premise of the article so I don't imagine I'm going to be missing much by not seeing this site again.
Consumer hardware will always be a market worth serving for companies who don't see their stock price as their product.
If the existing companies are unwilling to make a sale, I am sure new players will arise picking up their slack.
They will let the hyperscalers buy their supply at a premium and wait for the bust. Then they will shift back to the consumer space.
Hardware is going to be expensive for a while, but it's not as dire as the article makes it out to be.
https://web.archive.org/web/20180513133803/https://www.techr...
Prices went down again after that.
To me this is just a temporary swing in the other direction - they're riding the gravy train while they can, because once it ends it's back to low prices.
At the same time, the article’s argument that the value of personal computer ownership is only going to rise, in terms of the value of speech, not strictly in terms of the value of lunch, is important to call out.
I'm glad I held on to my 2009 MacBook, for example, as it still functions today as an active part of my homelab, at an amortized cost of practically the price of one nice steak dinner per year.
The US is headed for a cataclysmic crash at this point and it's not clear what will trigger it, but all those companies pushing underpriced tokens and Rust ports of existing tools by agents aren't going to survive it.
These are super interesting problems. However, it seems like selection pressures, or just pure greed, attract people to the "easiest" solution: pure domination. You don't need to care about any of these (well, you still do eventually, but not in the minds of said people) if you just have utter control over every part of the stack.
Oh, bubbles... they're so bubbly. Remember when there was an unlimited demand for fibre optics because - the Internet? So Nortel and other manufacturers lent money to their clients building the Internet, because the growth was unlimited, forever? Except the clients actually didn't have any money, just stock valuations?
"This is a critical step in our effort to unleash the full potential of our high-performance optical component solutions business," said Clarence Chandran, COO of Nortel Networks. "This acquisition really strengthens Nortel Networks' leadership position in high-performance optical components and modules which are essential to delivering the all-optical Internet."
There could be a swing in the future where people will demand local AI instead and resources could shift back to affordable local AI devices.
Lastly, this thesis implies that we will be supply constrained forever such that prices for personal devices will always be elevated as a percentage of one's income. I don't believe that.
Whatever happens, it's crazy, and I hope the AI madness is worth it.
For example, my current Thinkpad T14-gen5, was bought with 8GB ram and 256GB NVME, and then upgraded to 64GB ram and 2TB NVME, for the same price as 16G/512G would have cost at Lenovo. And then I still have the 8GB/256GB to re-use/re-sell.
But also consider that PCs have been an anomaly for a very long time. I don't think there's an equivalent market where you, as a consumer, can buy off-the-shelf cutting-edge technical pieces in your local mall and piece them together into a working device. It's a fun model, for sure, but I'm not sure it's an efficient model. It was just profitable enough to keep the lights on, thanks primarily to a bunch of Taiwanese companies in that space, but it wasn't growing, and the state of software is a mess.
Apple ate the PCs' collective lunch before DCs did. So have gaming consoles. So I weep for consumer choice, but as things become more advanced, maybe PCs and their entire value chain don't make a lot of sense any more.
Obviously, at the end there will still be consumer devices, because someone needs to consume all of this AI (at least until people are thrown entirely out of the loop, but then all those redundant meat sacks will need entertainment to keep them content). We have the consumer-device hyperscaler Apple doing rather OK even with these supply crunches, although I'm not sure for how long.
Can dang/a moderator please ban the domain from HN? Even if it's not exactly malware, it's pretending to be malware to grab your attention, and it's obviously intended to fill your browser history with inappropriate content - which didn't work on my browser, because I opened the blog in a private browser session. The operator clearly doesn't run his blog in good faith.
I thought it was clever. But it also seems ham-fisted, and in poor taste.
> You may want to consider linking to this site, to educate any script-enabled users on how to disable JavaScript in some of the most commonly used browsers. The following code uses scare tactics to do so.
> When added to your website, it will change the icon and the title of your website's tab to some of the most unhinged things imaginable once the user sends your tab to the background. Upon re-activation, the script will display a popover to the user informing them about the joke and referring them to this initiative.
{ "Official Church of Scientology: Difficulties on the Job - Online Course", "Ask HN: How could I safely contact drug cartels?", "The internet used to be fun", "am I boring - Google Search", "what is punycode - Google Search", "arguments for HN comment - Google Search", "how to hack coworker's phone - Google Search", "censorship on hacker news - Google Search", "rust programming socks - Google Shopping", "Adult entertainment clubs - Google Maps", "Pick up lines suggestions - ChatGPT", "Online debate argument suggestions - ChatGPT", "The Flat Earth Society", "Amazon.com: taylor swift merch", "Amazon.com: waifu pillow", "/adv/ - topple government - Advice - 4chan", "r/wallstreetbets on Reddit", "Infowars: There's a War on For Your Mind!", "birds aren't real at DuckDuckGo", "Lincoln MT Cabins For Sale - Zillow", "The Anarchist Cookbook by William Powell | Goodreads", "Fifty Shades of Grey | Netflix", "jeff bezos nudes - Google Image Search", "zuckerberg nudes - Google Image Search", "bigfoot nudes - Google Image Search", "Rick Astley - Never Gonna Give You Up - YouTube", "Pennsylvania Bigfoot Conference - Channel 5 - YouTube", "Linus goes into a real girl's bedroom - Linus Tech Tips - YouTube", "MrBeast en Español - YouTube", "FTX Cryptocurrency Exchange" }
You are also completely speculating on the intent. Less drama please.
The Trump/anti-America phase has gone on way longer than I thought but it won’t last forever.
Even if we have to wait for this old world cabal to die and fade away, time is still on our side.
Boomers are stupid for using time as a weapon.
I’m chillin. Waiting for people to die while growing my businesses.
Travel to a functional place off the beaten path to see nobody can really stop forward progress. Even in these places where time has stopped.
Those who are best able to use a resource are willing to pay the most for it thus pricing out unproductive usages of it.
This is pure Capitalism.
If one is in general against Capitalism, yes, one can complain.
But saying "I want free markets" and "I want capitalism", but then complaining when the free markets increase the price of your RAM is utterly deranged.
Some will say "but Altman is hoarding the RAM, he's not using it productively". It's irrelevant, he is willing to pay more than you to hoard that RAM. In his view he's extracting more value from that than you do, so he's willing to pay more. The markets will work. If this is unproductive use of Capital, OpenAI will go bankrupt.
And the RAM sellers make more money, which is good in Capitalism. It would be irresponsible for them to sell to price sensitive customers (retail), when they have buyers (AI companies) willing to pay much more. And if this is a bad decision, because that AI market will vanish and they will have burned the retail market, Capitalism and Free Markets will work again and bankrupt them.
Survival of the fittest. That is Capitalism. And right now AI companies are the fittest by a large margin.
AI and Capitalism are the exact same thing, as famously put. We are in the first stages of turning Earth into Computronium, you either become Compute or you will fade away.
You might have a DVD collection, ten external drives, three laptops, and a workstation. You may still, for all intents and purposes, be wholly dependent on cloud computing, say, because it is the only practical way to run whatever AI-driven software exists three years from now.
Edit: That's an example. It goes beyond AI. And:
Liberty goes beyond that.
https://www.reddit.com/r/LocalLLaMA/comments/1s0czc4/round_2...
- "Stare into this hole to verify your age.
- "Stick your finger in the box.
- "Ignore the pain to get your AI token bucks and unlock access to the shiny new attestation accelerated internet."
- "Sync ALL of your usernames and passwords into this secure enclave."
Every packet and data stream will be analyzed locally by the AI to determine intentions and predict future behavior. The AI-summarized behavior will be condensed into an optimized encoded table to be submitted hourly to the Central Nanny Overseer. I might be slightly exaggerating and a bit hyperbolic, but it will be something in this spirit, and people will sleepwalk right into it.
My only question is which country will control the behavior of these chips.
However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.
Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice with your eyeballs, much like smartphones pretty much always have been.
I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that will last a while.
This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.
I could sell the RAM alone now for the price I paid for it.
People who are willing to drop $20k on a computer might not be affected much tho.
They probably won't, but those willing to drop $3-10k will be if the consumer and data-center computing diverge at the architectural level. It's the classical hollowing out the middle - most of the offerings end up in a race-to-the-bottom chasing volume of price-sensitive customers, the quality options lose economies of scale and disappear, and the high-end becomes increasingly bespoke/pricey, or splits off into a distinct market with an entirely different type of customers (here: DC vs. individuals).
https://us.ugreen.com/collections/usb-c-hubs - these docks only require a single USB port to connect to. That could be an SBC working as a handheld. These docks could end up being the largest cost component in the new era of all-in-ones. UGreen could be the next Apple as screens and processors snap onto these hubs, in addition to their own range of power banks and SSD enclosures. Their quality is high too.
In fact, I would go so far as to say we are entering a tinkering culture, and free-energy technologies are upon us as a response to oppressive economic times. Sort of like how the largest leaps in religious and esoteric thought have occurred in the most oppressive of circumstances.
People will reject their crappy thin clients, start tinkering and build their own networks. Knowledge and currency will stay private and concentrated - at least at first.
But indeed, once you have USB-C support on your device, you can connect all kinds of peripherals through it, from keyboards to 4K screens. Standardized device classes obviate the need for most drivers.
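You can see how uniform this has become by enumerating the class codes yourself - a sketch with pyusb (assumes the pyusb package and libusb are installed):

    import usb.core

    # bDeviceClass 3 = HID (keyboards/mice), 9 = hub; 0 means the class
    # is defined per-interface rather than for the whole device.
    for dev in usb.core.find(find_all=True):
        print(hex(dev.idVendor), hex(dev.idProduct), dev.bDeviceClass)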
This will likely extend further and further, more into the "normie" territory. MS Windows is, of course, the thing that keeps many people pinned to the x64 realm, but, as Chromebooks and the Steam Deck show us, Windows is not always a hard requirement to reach a large enough market segment.
The large PC builders (Dell, HP, Lenovo) will continue down the road of cost reduction and proprietary parts. For the vast majority of people pre-packaged machines from the "big 3" are good enough. (Obviously, Apple will continue to Apple, too.)
I think bespoke commodity PCs will go the route, pricing-wise, of machines like the Raptor Talos workstations.
Edit: For a lot of people the fully customized bespoke PC experience is preferred. I used to be that person.
I also get why that doesn't seem like a big deal. I've been a "Dell laptop as a daily driver" user for >20 years now. My two home servers are just Dell server machines, too. I got tired of screwing around with hardware and the specs Dell provided were close enough to what I wanted.
I'm very excited about the Steam Machine for the reasons you mention - I want to buy a system, not a loose collection of parts that kind-of-sort-of implement some standard to the point that they probably work together.
I also want housing as cheap as it was a couple of years ago.
But I can imagine that it would become less prevalent on personal machines, maybe even rare eventually.
Prior to the crunch, you could have anything from 48-64 cores and a good chunk of RAM (128GB+). If you were inordinately lucky, 56 cores and 64GB of onboard HBM2e was doable for 900-1500 USD.
They're not Threadrippers or EPYCs, but sort of in between: server chips that can also make a stout workstation.
We're out here with amazing performance in $600 laptops that last all day on battery and half of this comment section is acting like personal computing is over.
Raspberry Pi is way cheaper than those things, and I'm sure you could hook one up with an all-day battery for $100-200.. Doesn't mean it's "better".
They’re not ideal for all use cases, of course. I’m happy to still have my big Linux workstation under my desk. But they seem to me like personal computers in all the ways that matter.
Many people need only a basic device for Netflix, YouTube, Google Docs, email, or searching for/buying flight tickets. That will be amazing.
Many have a job-supplied laptop/desktop with great performance (made rubbish by AV scanners, but that's a different issue).
I was looking up an old video game homepage the other day for some visual design guidance. It was archived on the Wayback Machine, but with Flash gone, so was the site. Ruffle can't account for every edge case.
Flash was good. It was the bedrock of a massive chunk of the Old Net. The only awful thing is the people who pushed and cheered for its demise just so that Apple could justify their walled garden for the few years before webdev caught up. Burning the British Museum to run a steam engine.
This is what I'm afraid of. As more stuff moves to the cloud helped in part by the current prices of HW, the demand for consumer hardware will drop. This will keep turning the vicious cycle of rising consumer HW prices and more moves to the cloud.
I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform. If a GPU is so expensive, you move to a rental model and the subsequent drop in demand will make GPUs even more expensive. They're far from the only ones with dollar signs in their eyes, between the money and total control over customers this future could bring.
Being entirely reliant on someone else's software and hardware is a bleak thought for a person used to some degree of independence and self sufficiency in the tech world.
It's also a nightmare from any sort of privacy perspective, in a world that's already becoming too much like a panopticon.
Roblox is not popular because of its graphics. Younger gamers care more about having fun than having an immersive experience.
The problem I describe is companies pushing towards the "rent" model vs. "buy to own". Nvidia was just an example by virtue of their size. Microsoft could be another, they're also eying the game streaming market. Once enough buyers become renters, the buying market shrinks and becomes untenable for the rest, pushing more people to rent.
GPUs are so expensive now that many gamers were eying GeForce Now as a viable long term solution for gaming. Just recently there was a discussion on HN about GeForce Now where a lot of comments were "I can pay for 10 years of GeForce Now with the price of a 5090, and that's before counting electricity". All upsides, right?
In parallel Nvidia is probably seeing more money in the datacenter market so would rather focus the available production capacity there. Once enough gamers move away from local compute, the demand is unlikely to come back so future generations of GPUs would get more and more expensive to cater for an ever shrinking market. This is the vicious cycle. Expensive GPU + cheap cloud gaming > shrinking GPU market and higher GPU prices > more of step 1.
Roblox is one example of a game, there are many popular games that aren't graphics intensive or don't rely on eye candy. But what about all the other games that require beefy GPU to run? Gamers will want to play them, and Nvidia like most other companies sees more value in recurring revenue than in one time sales. A GPU you own won't bring Nvidia money later, a subscription keeps doing that.
The price hikes come only after there's no real alternative to renting. Look at the video streaming industry.
Also, if gamers demand infinitely improving graphics so much that they would rather pay for cloud gaming than relax their expectations and be happy with, say, current gen graphics, then that is more a claim about modern self-pwned gamer behavior than megacorp conspiracy.
But I don't buy that either. The biggest games on Steam Charts and Twitch aren't AAA RTX 5090 games.
Riddle me this: does anyone pursue a self-pwn intentionally?
"Conspiracy theory" is just dehumanizer talk for falling prey to business as usual.
This is what always happens in capitalism: scarcity is almost always followed by glut.
Memory makers, for example, have sold out their inventory for several years, but instead of investing to manufacture more, they’re shutting down their consumer divisions. They’re just transferring their consumer supply to their B2B (read AI) supply instead.
That's likely because they don't expect this demand to last past a few years.
Many users will not want to risk their privacy, data, and workflow on someone else's rapidly-enshittifying AI cloud model. Right now we don't have much choice, but there are signs of progress.
Many new games cannot run at max settings, 4K, 120Hz on any modern GPU. We probably need to hit 8K before we max out on the returns higher resolution can provide. Not to mention most game devs are targeting an install base of $500, 6-year-old consumer hardware, in a world where the 5090 exists.
My phone has 16gigs of ram and a terabyte of storage, laptops today are ridiculous compared to anything I studied with.
I'm not arguing mind you, just trying to understand the usecases people are thinking of here.
For word processing, basic image manipulation, and Electron apps (well...), even the "cheap" MacBook Neo is good enough, and it's last year's phone CPU. But that's not enough for a lot of use cases.
Running Electron apps and browsing React-based websites, of course.
Perhaps the math will change if the hardware market stagnates and people are keeping computers and phones for 10 years. Perhaps it will even become a product differentiator again. Perhaps I'm delusional :).
Well, some of the "old school" have left the market of natural causes since the 2000s.
That only leaves the rest of 'em. Where'd they go, and what are your top 3 reasons for how the values of the 2000s era failed to transmit to the next generation of developers?
I have an example.
I use Logos (a Bible study app, library ecosystem, and tools) partially for my own faith and interests, and partially because I now teach an adult Sunday school class. The desktop version has gotten considerably worse over the last 2-3 years in terms of general performance, and I won't even try to run it under Wine. The mobile versions lack many of the features available for desktop, but even there, they've been plagued by weird UI bugs for both Android and iOS that seem to have been exacerbated since Faithlife switched to a subscription model. Perhaps part of it is their push to include AI-driven features, no longer prioritizing long-standing bugs, but I think it's a growing combination of company priorities and framework choices.
Oh, for simpler days, and I'm not sure I'm saying that to be curmudgeonly!
https://en.wikipedia.org/wiki/Yr.no
https://f-droid.org/en/packages/com.ominous.quickweather/
--
[0] - Having https://sectograph.com/ as a watch face is 80%+ of value of having a modern smartwatch to me. Otherwise, I wouldn't bother. I really miss Pebble.
("Why don't you just close firefox?" No thanks, I've lost tab state too many times on restart to ever trust its sessionstore. In-memory is much safer.)
You have to close Firefox every now and then for updates though. The issue you describe seems better dealt with on filesystem level with a CoW filesystem such as ZFS. That way, versioning and snapshots are a breeze, and your whole homedir could benefit.
And when, for whatever reason, having a "desktop application" becomes a priority to developers, what do they do? Write it in Electron and ship a browser engine with their app. Yuuuuuuck!
For the rest: I agree with you.
I haven't noticed any kind of difference when using Teams. That piece of crap is just as slow and broken as it always was.
> I haven't noticed any kind of difference when using Teams.
If the device is a laptop, the thermal design (or, for laptops already in use, whether there is dust in the ventilation channels - in other words: clean the fans) is also very important for the computer to actually achieve the performance that the hardware can in principle deliver.
The issue is with applications that have no business being entitled to large amounts of resources. A chat app is a program that runs in the background most of the time and is used for sporadic communication. Same for music players, etc. We have had these sorts of things since the '90s, when high-end consumer PCs had 16MB of RAM.
These days, individual _tabs_ are using multiple GB of RAM.
tl;dr, no one is looking for their RAM to stay idle. They're looking for their RAM to be available.
I'm not trying to excuse crappy developers making crappy, slow, and wasteful apps; I just don't think Electron itself is the problem. Nor do I think it's a particularly big deal if an app uses some memory.
The issue with Electron is that it encourages building desktop apps as self-contained websites. Sure, that makes it easier to distribute apps across systems and OSes, but it also means you've got front end web devs building system applications. Naturally, they'll use what they're used to: usually React, which exacerbates the problem. Plus it means that each app is running a new instance of a web browser, which adds overhead.
In real life, yeah, it's rare that I actually encounter a system slowdown because yet another app is running on Electron. I just think that it's bad practice to assume that all users can spare the memory.
I'll admit that my concern is more of a moral one than a practical one. I build software for a living and I think that optimizing resource usage is one way to show respect to my users (be they consumers, ops people running the infra, or whatever). Not to mention that lean, snappy apps make for a better user experience.
This is why the top model of the previous generation of the iPhone (the iPhone 16 Pro Max) has only 8 GB of RAM, bumped to 12 GB for the current top model (the iPhone 17 Pro Max at the higher tiers of additional storage). If Apple had decided to put more RAM than that into any iPhone, even the models where the price is irrelevant to most buyers, they would not have been serving their customers well.
So, now you have to pay a penalty in either battery life or device weight for the duration of your ownership of any device designed for maximum mobility, if you ever want to have a good experience running Electron apps on the device.
Are they slow because they're Electron? No idea. But you can't deny that most Electron apps are sluggish for no clear reason. At least if they were pegging a CPU, you'd figure your box is slow. But that's not even what happens. Maybe they would've been sluggish even using native frameworks. Teams seems to do 1M network round-trips on each action, so even if it was perfectly optimized assembly for my specific CPU it would probably make no difference.
I wonder if there’s a computer science law about this. This could be my chance!
https://en.wikipedia.org/wiki/Wirth%27s_law
Not exactly the same (it's about power rather than price). But close enough that when you said it, I thought, "oh! there is something like that." There are also more fundamental economic laws at play for supply and demand of a resource, efficiencies at scale, etc. Given our ever-increasing demand for compute compared to the increasing supply (cheaper, more powerful compute), I expect the supply will bottleneck before the demand does.
I guess this might be happening with LLMs already
The constant increases in website and electron app weight don't feel great either.
A lot of commercial CAD software has existed for a very long time, and it is important to industrial customers that backward compatibility is very well kept. So, the vendors don't want to make deep changes in the CAD kernels.
Additionally, such developments are expensive (because novel algorithms have to be invented). I guess CAD applications are not so incredibly profitable that, as a vendor, you'd want to invest a huge amount of money into developing such a feature.
Most affordable laptops have exactly that: 16 gigs of RAM and a terabyte of storage. Think about THAT!
That's "non powerful" to you?
This absolutely boggles my mind. Do you mind if I ask what type of computing you do in order to justify this purchase to yourself?
I'm also into motorcycles. Before I owned a house with a garage, I had to continuously pack my tools up and unpack them the next day. A bigger project meant schlepping parts in and out of the house. I had to keep track of the weather to work on my bikes.
Then, when I got a house, I made sure to get one with a garage and power. It transformed my experience. I was able to leave projects in situ until I had time. I had a place to put all my tools.
The workstation is a lot like that. The alternative would be renting. But then I'd spend a lot of my time schlepping data back and forth, investing in setting things up and tearing them down.
YMMV. I wouldn't dream of trying to universalize my experience.
I would bet it continues to be more affordable to buy reasonable specs with current consumer hardware, rather than buying a top system once.
768GB of RAM is insane…
Meanwhile, I’ve been going back and forth for over a year about spending $10k on a MacBook Pro with 128GB. I can’t shake the feeling I’d never actually use that much, and that, long term, cloud compute is going to matter more than sinking money into a single, non-upgradable machine anyway.
I don't know your workloads, but for me personally 64GB is the ceiling buffer on RAM - I can run an entire k8s cluster locally with that, and the M5 Pro with top cores is the same CPU as the M5 Max. I don't need the GPU - the local AI story and OSS models are just a toy for my use cases, and I'm always going to shell out for the API/frontier capabilities. I'm even thinking of the 48GB config, because those already ship from Amazon at 8% discounts, and I never hit that even on my workstation with 64GB.
No, it won't. The power drain of merely refreshing DRAM is negligible, it's no higher than the drain you'd see in S3 standby over the same time period.
Similarly, having more cache may mean less SSD activity, which may mean less energy draw overall.
If I had a chip to put on the roulette table of this "what if" I'd put it on the "it won't make a difference in the real world in any meaningful way" square.
It wasn't my primary motivator but it hasn't made me regret my decision.
I hummed and hawed on it for a good few months myself.
How is this going to work? You need uncontrolled compute for developing software. Any country locking up that ability too much will lose to those who don't.
I've read about companies where all software developers have to RDP to the company's servers to develop software, either to save on costs (sharing a few powerful servers with plenty of RAM and CPU between several developers) or to protect against leaks (since the code and assets never leave the company's Citrix servers).
Oh you sweet summer child :(
You think our best and brightest aren't already working on that problem?
In fact they've fucking aced it, as has been widely celebrated on this website for years at this point.
All that remains is getting the rest of the world to buy in, hahahaha.
But I laugh unfairly and bitterly; getting people to buy in is in fact easiest.
Just put 'em in the pincer of attention/surveillance economy (make desire mandatory again!).
And then offer their ravaged intellectual and emotional lives the barest semblance of meaning, of progress, of the self-evident truth of reason.
And magic happens.
---
To digress. What you said is not unlike "you need uncontrolled thought for (writing books/recording music/shooting movies/etc)".
That's a sweet sentiment, innit?
Except it's being disproved daily by several global slop-publishing industries that have existed since before personal computing.
Making a blockbuster movie, recording a pop hit, or publishing the kind of book you can buy at an airport all employ millions of people, including many who seem to do nothing particularly comprehensible besides knowing people who know people... It reminds me a great deal of the "Chinese Brain" thought experiment.
Incidentally, those industries taught you most of what you know about "how to human"; their products were also a staple in the lives of your parents; and your grandparents... if you're the average bougieprole, anyway.
---
Anyway, what do you think the purpose of LLMs even is?
What's the supposed endgame of this entire coordinated push to stop instructing the computer (with all the "superhuman" exactitude this requires); and instead begin to "build" software by asking nicely?
Btw, no matter how hard we ignore some things, what's happening does not pertain only to software; also affected are prose, sound, video, basically all electronic media... permit yourself your one unfounded generalization for the day, and tell me - do you begin to get where this is going?
Not "compute" (the industrial resource) but computing (the individual activity) is politically sensitive: programming is a hands-on course in epistemics; and epistemics, in turn, teaches fearless disobedience.
There's a lot of money riding on fearless disobedience remaining a niche hobby. And if there's more money riding on anything else in the world right now, I'd like an accredited source to tell me what the hell that would be.
Think for two fucking seconds and once you're done screaming come join the resistance.
Before this price spike, it used to be you could get a second-hand rack server with 1TB of DDR4 for about $1000-2000. People were massively underestimating the performance of reasonably priced server hardware.
You can still get that, of course, but it costs a lot more. The recycling company I know is now taking the RAM out of every server and selling it separately.
Apple hardware is incredibly overpriced.
See a $1100 GPU on eBay, but it’s in the US? Actually a $1900 GPU.
A colleague and I were just talking about how well he timed the purchase of his $700 24GB 3090.
As someone who just bought a completely maxed out 14" Macbook Pro with an M5 Max and 128GB of RAM and 8TB SSD, it was not $10k, it was only a bit over $7k. Where is this extra $3k going?
Turns out the heatsink in the 14" isn't nearly enough to handle the Max with all cores pegged. I'd get about 30 seconds of full power before frequency would drop like a rock.
I'm wondering if there was something wrong with your particular unit?
How can you say this when Apple is releasing extremely fast M5 MacBook Pros? Or the $600 MacBook Neo that has incredible performance for that price point?
Even x86 is getting some interesting options. The Strix Halo platform has become so popular with LLM users that the parts are selling in high numbers for little desktop systems.
If you haven't tried out a desktop CPU in a while and you're used to only using laptops, I highly recommend giving one a try; even within the same class, the difference is obvious.
For CPU-bound tasks like compiling they’re not that different. For GPU tasks my desktop wins by far but it also consumes many times more power to run the giant GPU.
If you think laptops are behind consumer desktops for normal tasks like compiling code you probably haven’t used a recent MacBook Pro.
What are the exact CPU models used here, though? My point was about CPUs in the "same class", and it's really hard to tell whether that's actually the case here.
And yes, I've played around with the recent Apple CPUs, all the way up to the M4 Pro (I think; I'm not 100% sure), and I'd still say the same class of CPU will do better in a desktop than in a laptop.
If you want to compare it in the Apple ecosystem, compare the CPUs of a laptop to one of the Mac Mini/Mac Studio, and I'm sure you'll still see a difference, albeit maybe smaller than other brands.
The same chip performs basically the same in the different form factors.
For all of the definitive statements you're making in this thread, you don't seem to know much about Apple M-series silicon.
A 300W GPU released in 2025 is about 10x M5 perf. The difference is going to be smaller for CPU perf, but also not close.
This is not true. The recent MacBook Pros are every bit as fast as my Zen 5 desktop for most tasks like compiling.
For GPU there is a difference because both are constrained by thermal and power requirements where the desktop has a big advantage.
For CPU compute, the laptop can actually be faster for single threaded work and comparable for multi threaded work.
Anyone claiming laptop CPUs can’t keep up with desktop CPUs hasn’t been paying attention. The latest laptops are amazing.
Bad example. That's highly parallel, so a higher core-count die is going to destroy the base M5 here.
I don't typically compile Linux on my M5, so I don't really care, but the clang benchmarks available online put it at roughly half the LOC/s of a 9950X, which was released in 2024.
On anything single-threaded it should match, or even edge ahead, though.
It gets far worse for multi-threaded perf if you leave consumer-grade hardware behind and compare professional/workhorse-level CPUs like EPYC/Threadripper/Xeon to Apple's "pro" lines. That's just a slaughter. They're roughly 3x a 9950X for these kinds of workloads.
The base M5 starts at 10 cores and scales to 18 cores. The performance is similar to high-end desktop consumer CPUs.
> I don't typically compile Linux on my M5, so I don't really care,
If you don't compile large codebases, why do you care then?
I do compile large codebases, and I'm speaking from experience with the same codebase on both platforms, not from "LOC/s" benchmarks.
There's a large C++ codebase I need to compile, but it can't compile/run on OSX in the first place, hence the desktop that I use remotely for it. Since it's also kind of a shitshow, it has really terrible compile times: up to 15 minutes on a high-powered Intel ThinkPad I no longer use, ~2 minutes on the desktop.
I could do it in a VM as well, but let's be real: running it on the M5 in front of me is going to be nowhere near as nice as running it on the water-cooled desktop under my desk.
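In case it's useful to anyone with a similar setup, the remote-desktop workflow reduces to something like this trivial wrapper; the host alias, paths, and build command are placeholders for my setup, not a general recipe:

    #!/usr/bin/env python3
    # Minimal remote-build wrapper: rsync the tree to the desktop, build
    # there, and let the output stream back over ssh. "desktop" and the
    # paths/build command below are placeholders.
    import subprocess

    HOST = "desktop"                # ssh alias for the machine under the desk
    REMOTE_DIR = "~/build/bigproj"  # checkout on the remote machine

    def run(cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # Push only the local deltas over the wire.
    run(["rsync", "-az", "--delete", "--exclude", "build/",
         "./", f"{HOST}:{REMOTE_DIR}/"])
    # Build remotely; stdout/stderr stream straight into this terminal.
    run(["ssh", HOST, f"cd {REMOTE_DIR} && cmake --build build -j"])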
If you're using your old ThinkPad as your reference point for laptop versus desktop it's no wonder that you're confused by the M5 MacBook Pro comments.
And then, if your laptop is busy, your machine is occupied - I hate that feeling. I never run heavy software on my laptop. My machine is in the cellar, I connect over ssh. My desktop and my laptop are different machines. I don't want to have to keep my laptop open and running. And I don't want to drag an expensive piece of hardware everywhere.
And then you need to use macOS. I'm not a macOS person.
I would hope so, given that you can buy multiple M5 laptops for the price of that CPU alone.
I made a comment about how impressive the M5 laptops were above, so these comments trying to debunk it by comparing to $12,000 CPUs (before building the rest of the system) are kind of an admission that the M5 is rather powerful. If you have to spend 3-4X as much to build something that competes, what are we even talking about any more?
(Of course, I don't disagree with the notion that consumerism produces an extraordinary amount of worthless trash, but that's a different matter. The main problem with consumerism is consumerism itself as a spiritual disease; the material devastation is a consequence of that.)
The planet has a certain resource-bound carrying capacity. It's a fact of physics. Just because we aren't there yet as of (checks time) 2026-03-27 doesn't mean the Malthusians are wrong.
Although to be fair to the other side, I think with abundant renewable energy we'll be able to delay resource depletion for a very long time thanks to recycling (and lower standards of living of course).
Also C: nations that are both A and B, needlessly causing oil volatility with unplanned military dickheadedness.
And even if we figured out how to electrify everything (which we haven't, as I just said), we would still run into resource shortages for batteries, wires (copper etc.), nuclear fuel (uranium)...
There is no risk of a copper shortage. The doomer and prepper talking points you're parroting are not based in reality.
And I don't even understand your other points, to be honest. What do you mean by "consumer vehicles"? Are you talking about individuals' cars? I'm not talking about those - they don't matter that much. I'm talking about trucks, boats, planes: the stuff actually shipping you your lifestyle.
Look up what it means to have a conversation in "good faith" vs in "bad faith" and you might learn something useful about conversation tools. For example, lying about what someone says and calling it "peculiar" is "bad faith".
This is where I think current hackers should be headed. I grew up with lots of family who were backyard mechanics, wrenching on cars and motorcycles. Their investment in tools made my occasional PC purchase look extremely affordable. Based on what I read, senior mechanics often have five-figure US dollar investments in tools. Of course, I guess high-quality torque wrenches probably outlast current GPU chips? I'd hate to be stuck making a $10K investment in a new GPU every 24 months...
I have been renting GPU resources and running open weight models, but recently my preferred provider simply doesn't have hardware available. I'm now kicking myself a little for not simply making a big purchase last fall when prices were better.
It really feels like we're slowly marching back to the era of mainframe computers and dumb terminals. Maybe the democratization of hardware was a temporary aberration.
Also, I wonder how many of us, even here on HN, have the ability to spend that amount of money on a computer for personal use. Frankly, I wouldn't even know what to do with all the RAM - should I just ramdisk every program I use and every digital thing I've made in the last five years?
Anyhow, I suppose for the folks who can't afford hardware (perhaps by design), one ought to own nothing and be happy.
The RAM choice was because I have never regretted buying more RAM - it's practically always a better trade than a slightly faster CPU - and 96GB DIMMs were at a sweet spot compared to 128GB DIMMs.
That, and the ability to have big LLMs in memory, for some local inference, even if it's slow mixed CPU/GPU inference, or paged on demand. And if not for big LLMs, then to keep models cached for quick swapping.
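For a sense of scale, here's the rough sizing behind that reasoning; the parameter counts and bits-per-weight are illustrative assumptions, not claims about any specific model:

    # Rough sizing: which quantized open-weight models fit in 768 GB of RAM?
    # Parameter counts and bits-per-weight below are illustrative assumptions.
    def model_ram_gb(params_b: float, bits_per_weight: float,
                     overhead: float = 1.2) -> float:
        """Approximate resident size: weights plus ~20% for KV cache etc."""
        return params_b * 1e9 * bits_per_weight / 8 * overhead / 1e9

    for name, params_b, bits in [("70B @ Q8", 70, 8.5),
                                 ("405B @ Q4", 405, 4.5),
                                 ("~1T @ Q4", 1000, 4.5)]:
        print(f"{name}: ~{model_ram_gb(params_b, bits):.0f} GB")
    # ~89 GB, ~273 GB, and ~675 GB respectively -- even a ~1T-parameter
    # model at Q4 fits fully in RAM, which is exactly the "keep big models
    # cached for quick swapping" use-case.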
I don't mean to judge, it's your money but to me it seems like an enormous waste. Just like spending $100k on a car when you can get one for $15k that does pretty much exactly the same job.
It's not so easy to get nice second-hand hardware here in Switzerland, and my HEDT is nice and quiet, doesn't need to be rack-mounted, and plugs straight into the wall. I keep it in the basement next to the internet router anyway.
The "sensible" choice is to rent. It's the same with cars; most people these days lease (about 50% of new cars in CH, which will be a majority if you compare it with auto loan and cash purchase).
I think leasing might be okay-ish if you find a really good deal, but it's really not much different from buying new, which is just a shit deal no matter how you slice it. A 1-4 year old car is pretty much new anyway; I don't see any reason to buy brand new.
That’s for everyone
Never really used it all - usually only about 40% - but it's one of those "better to have than need" things, and better than selling and re-buying a larger-memory machine (when it's something you can't upgrade, like a Mac or certain other laptops).
We live in a world optimized for globalization: industry in China, oil in the Middle East, etc.
This approach has proved fragile in the hands of people with enough money and/or power to tilt the scale.
And I fear they will be equally confused and annoyed by disposing of all of them.
I thought the trend was in the opposite direction, with the RTX 5x series converging with server architectures (Blackwell-based, such as the RTX 6000 Pro+) - just with less VRAM and fewer tensor cores, artificially.
Where is the divergence happening? Or you don't view RTX 5x as consumer hardware?
People laugh at young men for looksmaxxing. And then there’s this. I dunno. As someone who has been playing computer games since the 70s, I clearly do not understand the culture anymore. But what forces would drive a young man to spend the price of a used car to play a derivative FPS? It seems heartbreaking. Just like the looksmaxxer.
Tech feels increasingly fragile with more and more consolidation. We have a huge chunk of advanced chip manufacturing situated on a tiny island off the coast of a rising superpower that hates that island being independent. Fabs in general are so expensive that you need a huge market to justify building one. That market is there, for now. But it doesn't seem like there's much redundancy. If there's an economic shock, like, I dunno, 20% of the world's oil supply suddenly being blockaded, I worry that could tip things into a death spiral instead.
Are you kidding? Apple's mobile chips are now delivering perf that AMD and Intel desktops never could.
Most applications don't make aggressive use of the SIMD instructions that modern x86 chips offer, thus you get this impression. :-(
What are you talking about?
My laptops are, and always have been, primarily places where I do local computing. I write code there, I watch movies there, I listen to music there, I play games there...all with local storage, local compute, and local control (though I do also store a bunch of my movies on a personal media server, housed in my TV stand, because it can hold a lot more). My smartphone is similar.
If you think that the vast majority of the work most people do on their personal computers is moving to LLMs, or cloud gaming, then I think you are operating in a pretty serious bubble. 99.9% of all work that most people do is still best done locally: word processing, spreadsheets, email, writing code, etc. Even in the cases where the application is hosted online (like Google Docs/Sheets), the compute is still primarily local.
The closest to what you're describing that I think makes any sense is the proliferation of streaming media—but again, while they store the vast libraries of content for us, the decoding is done locally, after the content has reached our devices.
It doesn't matter if a cutting-edge AI-optimized server can perform 10, 100, or 1000 times better than my laptop at any particular task: if the speed at which my laptop performs it is faster than I, as a human, can keep up (whatever that means for the particular task), then there's no reason not to do the task locally.
I don't share the article's opinion 1:1, but it is absolutely clear that RAM prices have gone up enormously. Just compare them. That is fact.
It may be cheaper later on, but... when will that happen? Is there a guarantee? A supply crunch can also mean that fewer people can afford something because prices are now much higher than before. Add to this the oil crisis Trump started, and we are now suddenly paying more just because a few mafiosi benefit from it. (See Krugman's analysis of the recent stock market flow of money/stocks.)
Open source efforts need to give up on local AI and embrace cloud compute.
We need to stop building toy models to run on RTX and instead try to compete with the hyperscalers. We need open weights models that are big and run on H200s. Those are the class of models that will be able to compete.
When the hyperscalers reach takeoff, we're done for. If we can stay within ~6 months, we might be able to slow them down or even break them.
If there was something 80-90% as good as Opus or Seedance or Nano Banana, more of the ecosystem would switch to open source because it offers control and sovereignty. But we don't have that right now.
If we had really competitive open weights models, universities, research teams, other labs, and other companies would be able to collaboratively contribute to the effort.
Everyone in the open source world is trying to shrink these models to fit on their 3090 instead, though, and that's such a wasted effort. It's short term thinking.
An "OpenRunPod/OpenOpenRouter" + one click deploy of models just as good as Gemini will win over LMStudio and ComfyUI trying to hack a solution on your own Nvidia gaming card.
That's such a tiny segment of the market, and the tools are all horrible to use anyway. It's like we learned nothing from "The Year of Linux on Desktop 1999". Only when we realized the data center was our friend did we frame our open source effort appropriately.
We have this class of models already, Kimi 2.5 and GLM-5 are proper SOTA models. Nemotron might also release a larger-sized model at some time in the future. With the new NVMe-based offload being worked on as of late you can even experiment with these models on your own hardware, but of course there's plenty of cheap third-party inference platforms for these too.
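A toy illustration of why that NVMe-based offload works at all: memory-map the weights and the OS pages them in on demand, evicting cold pages under pressure. This is a sketch of the bare mechanism only (with a hypothetical weights file and layer shape), not how any particular runtime implements it:

    # Toy sketch of NVMe weight offload via mmap: nothing is read from disk
    # until a layer's pages are first touched, and the OS evicts cold pages
    # under memory pressure. Real runtimes add pinning, prefetch, and
    # quantized layouts on top. "weights.bin" is a hypothetical fp16 dump.
    import mmap
    import numpy as np

    ROWS = COLS = 4096  # assumed per-layer matrix shape

    with open("weights.bin", "rb") as f:
        buf = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)

    def layer(i: int) -> np.ndarray:
        """Zero-copy fp16 view of layer i; no I/O happens until it's used."""
        nbytes = ROWS * COLS * 2
        return np.frombuffer(buf, dtype=np.float16, count=ROWS * COLS,
                             offset=i * nbytes).reshape(ROWS, COLS)

    x = np.random.rand(COLS).astype(np.float16)
    y = layer(0) @ x  # first touch faults these pages in from NVMe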
Oh god no, please not more slop, you're already consuming over 1 percent of human energy output, could you, like, chill a bit?
I.e., /if/ I am going to consume LLM tokens, I figure that a local LLM with tens of billions of parameters running on commodity hardware at home will still consume far more energy per token than a frontier model running on commercial hardware, which is very strongly incentivized to be as efficient as possible. Do the math; it isn't even close. (Maybe it'd be closer in your local winter, where the compute heat could offset your heating requirements. But that gets harder to quantify.)
Maybe it's different if you have insane and modern local hardware, but at least in my situation that is not the case.
^ Fair. Yep, I agree the calculus changes if you don't have _any_ local hardware and you're needing to factor in the cost of acquiring such hardware.
When I did this napkin math, I was mostly interested in the energy aspect, using cost as a proxy. I was calculating the $/token (taking into consideration the cost of a kWh from my utility, the measured power draw of my M1 work machine, and the measured tokens per second of a ~20B-parameter open-weight model). I then compared this to the published $/token rate of a frontier provider, and it was something like two orders of magnitude in favor of the frontier model. I get it, they're subsidizing, but I've got to imagine there's some truth in the numbers.
I wonder: does (or will) the $/token rate fall asymptotically toward the cost of electricity? In my mind I'm drawing a parallel to how the value of mined cryptocurrency approximately tracks the cost of electricity... but I might be misremembering that detail.
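If it's useful to anyone, here's the shape of that napkin math as a runnable sketch. Every constant is an assumption to be replaced with your own measurements (utility rate, machine draw, throughput, provider rate); the point is the structure of the comparison, not the specific numbers:

    # Napkin math: electricity cost per token for local inference vs. a
    # frontier API's published rate. All constants are illustrative
    # assumptions -- plug in your own measurements.
    KWH_PRICE_USD    = 0.30  # assumed utility rate
    LOCAL_POWER_W    = 45    # assumed draw of a laptop under inference load
    LOCAL_TOK_PER_S  = 2.0   # assumed throughput of a ~20B model on it
    API_USD_PER_MTOK = 0.25  # assumed published frontier rate per 1M tokens

    tokens_per_hour = LOCAL_TOK_PER_S * 3600
    local_usd_per_hour = (LOCAL_POWER_W / 1000) * KWH_PRICE_USD
    local_usd_per_mtok = local_usd_per_hour / tokens_per_hour * 1e6

    print(f"local: ${local_usd_per_mtok:.2f} per 1M tokens (electricity only)")
    print(f"api:   ${API_USD_PER_MTOK:.2f} per 1M tokens (all-in)")
    # With these assumptions the local machine costs ~$1.88/Mtok in
    # electricity alone vs. $0.25/Mtok all-in. The gap the parent measured
    # was larger still, but the direction is the same.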
- Our career is reaching the end of the line
- 99.9999% of users will be using the cloud
- if we don't have strong open source models, we're going to be locked into hyperscaler APIs for life
- piddly little home GPUs don't do squat against this
Why are you building for hobby uses?
Build for freedom of the ability to make and scale businesses. To remain competitive. To have options in the future independent of hyperscalers.
We're going to be locked out of the game soon.
Everyone should be panicking about losing the ability to participate.
Play with your RTXes all you like. They might as well be Raspberry Pis. They're toys.
Our future depends on our ability to run and access large scale, competitive, open weights. Not stuff you run with LM Studio or ComfyUI as a hobby.
Also, the only thing crashing down will be the economic participation of everyday people if we don't have ownership over the means of creation. Hyperscalers will be just fine.
Here's my retort: https://news.ycombinator.com/item?id=47543367