I don't buy the central thesis of the article. We won't be in a supply crunch forever.
However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute.
Most consumers are using laptops, and laptops are not keeping pace with where the frontier is for a single compute node. Laptops are increasingly just clients for someone else's compute that you rent, or pay for a time slice of with your eyeballs, much like smartphones pretty much always have been.
I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that will last a while.
This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it.
I could sell the RAM alone now for the price I paid for it.
TheRoque 7 minutes ago [-]
We are on borrowed time. Most of the world runs on oil and this resource is not unlimited at all. A lot of countries have gone past their production peak, meaning it's only downhill from here. Everything is gonna be more costly, more expensive; our lavish "democratic" lifestyles are only possible because we have (had) this amazing, freely available resource, and without it it's gonna change. Even at a geopolitical scale you can see this pretty obviously: countries that talked about free markets and free exchange are now starting to close their doors and play individually. Anyways, my point is, we are in for decades, if not a century, of slow decline.
margalabargala 27 seconds ago [-]
Doubt it. Renewables are expanding much faster than oil output is decreasing. Wind and solar will enable energy to remain cheap everywhere that builds it.
raincole 2 hours ago [-]
We won't be in a supply crunch forever. We'll have a demand crunch. The demand for powerful consumer hardware will shrink so much that producing it will lose its economies of scale. It was always bound to happen, just delayed by the trend of pursuing realistic graphics for games.
People who are willing to drop $20k on a computer might not be affected much tho.
TeMPOraL 2 hours ago [-]
> People who are willing to drop $20k on a computer might not be affected much tho.
They probably won't, but those willing to drop $3-10k will be if consumer and data-center computing diverge at the architectural level. It's the classic hollowing-out of the middle - most of the offerings end up in a race to the bottom chasing the volume of price-sensitive customers, the quality options lose economies of scale and disappear, and the high-end becomes increasingly bespoke/pricey, or splits off into a distinct market with an entirely different type of customer (here: DC vs. individuals).
m3nu 2 hours ago [-]
My bet is that phone hardware will be used more and more in mini PCs and laptops, keeping the cost down and volume up. We already see it with Apple and many of the Chinese mini PC makers I've looked at.
bparsons 32 minutes ago [-]
The problem is that there is a very large incentive for three large companies to corner the market on computing components, forcing consumers to rent access instead of owning.
molszanski 2 hours ago [-]
> We won't be in a supply crunch forever.
This is what always happens in capitalism. Scarcity is almost always followed by glut.
drecked 2 hours ago [-]
I don’t believe we are seeing the investments necessary that would indicate this will happen.
Memory makers, for example, have sold out their inventory for several years, but instead of investing to manufacture more, they’re shutting down their consumer divisions. They’re just transferring their consumer supply to their B2B (read AI) supply instead.
That's likely because they don't expect this demand to last past a few years.
fmajid 1 hours ago [-]
They have seen boom and bust cycles previously and are understandably wary of expanding capacity for expected demand that may fizzle. If they stay too conservative, China’s CXMT is chomping at the bit to eat their lunch, backed by the Chinese government, but that’s not going to help until late 2027 at best.
bitmasher9 2 hours ago [-]
If the demand lasts for a few years, I’m doubtful that all of the consumer capacity will come back.
close04 1 hours ago [-]
> We'll have a demand crunch
This is what I'm afraid of. As more stuff moves to the cloud helped in part by the current prices of HW, the demand for consumer hardware will drop. This will keep turning the vicious cycle of rising consumer HW prices and more moves to the cloud.
I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform. If a GPU is so expensive, you move to a rental model and the subsequent drop in demand will make GPUs even more expensive. They're far from the only ones with dollar signs in their eyes, between the money and total control over customers this future could bring.
Being entirely reliant on someone else's software and hardware is a bleak thought for a person used to some degree of independence and self sufficiency in the tech world.
HKH2 43 minutes ago [-]
> I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform.
Roblox is not popular because of its graphics. Younger gamers care more about having fun than having an immersive experience.
hombre_fatal 36 minutes ago [-]
Yeah, this gamer conspiracy theory never made sense to me.
Also, if gamers demand infinitely improving graphics so much that they would rather pay for cloud gaming than relax their expectations and be happy with, say, current gen graphics, then that is more a claim about modern self-pwned gamer behavior than megacorp conspiracy.
But I don't buy that either. The biggest games on Steam Charts and Twitch aren't AAA RTX 5090 games.
foobarian 39 minutes ago [-]
I love it when I get my Robloxhead daughter to test drive some of the games I play on my 5090 box. "Ooooh these graphics are unreal" "Can we stop for just a moment and admire this grass" :-D
kace91 3 hours ago [-]
The thing is, other than AI stuff, where does a non powerful computer limit you?
My phone has 16gigs of ram and a terabyte of storage, laptops today are ridiculous compared to anything I studied with.
I'm not arguing mind you, just trying to understand the usecases people are thinking of here.
zozbot234 2 hours ago [-]
> other than AI stuff, where does a non powerful computer limit you?
Running Electron apps and browsing React-based websites, of course.
tormeh 2 hours ago [-]
For real. Once I've opened Spotify, Slack, Teams, and a browser about 10GB of RAM is in use. I barely have any RAM left over for actual work.
foobarian 28 minutes ago [-]
I keep wondering why we can't have 2000s software on today's hardware. Maybe because browsers are de facto required to build apps?
skydhash 2 hours ago [-]
That’s why I only run those on work computers (where they are mandated by the company). My personal computers are free of this kind of software.
lpcvoid 2 hours ago [-]
I rarely dodge a chance to shit on Microslop and its horrible products, but you don't use a browser? In fact, running all that junk in a single Chromium instance is quite a memory saver compared to individual Electron applications.
bluGill 56 minutes ago [-]
I use a browser at home, but I don't use the heaviest web sites. There are several options for my hourly weather update, some worse than others (sadly I haven't found any that are lightweight - I just need to know if there will be a thunderstorm when I ride my bike home from work, which would mean I shouldn't ride in now).
TeMPOraL 28 minutes ago [-]
I'm giving up on weather app bullshit at this point, and am currently (literally this moment) making myself a Tasker script to feed hourly weather predictions into a calendar, so I can see them displayed inline with events on my calendar and, most importantly, my watch[0] - i.e. in the context where it actually matters.
--
[0] - Having https://sectograph.com/ as a watch face is 80%+ of the value of having a modern smartwatch to me. Otherwise, I wouldn't bother. I really miss Pebble.
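For anyone who'd rather not use Tasker, here's roughly the shape of the idea in Python - a minimal sketch assuming the free Open-Meteo forecast endpoint and a hand-rolled .ics file; the coordinates and the hourly variables are placeholders, not anything from my actual script:

    # Pull hourly forecasts from the free Open-Meteo API (no key needed) and
    # emit a minimal .ics file you can import into / subscribe to from a calendar.
    import urllib.request, json
    from datetime import datetime, timedelta

    LAT, LON = 47.37, 8.54  # placeholder coordinates

    url = (
        "https://api.open-meteo.com/v1/forecast"
        f"?latitude={LAT}&longitude={LON}"
        "&hourly=temperature_2m,precipitation_probability&forecast_days=1"
    )
    hourly = json.load(urllib.request.urlopen(url))["hourly"]

    events = []
    for t, temp, rain in zip(hourly["time"], hourly["temperature_2m"],
                             hourly["precipitation_probability"]):
        start = datetime.fromisoformat(t)
        end = start + timedelta(hours=1)
        events.append(
            "BEGIN:VEVENT\n"
            f"DTSTART:{start.strftime('%Y%m%dT%H%M%S')}\n"
            f"DTEND:{end.strftime('%Y%m%dT%H%M%S')}\n"
            f"SUMMARY:{temp}°C, {rain}% rain\n"
            "END:VEVENT"
        )

    ics = "BEGIN:VCALENDAR\nVERSION:2.0\n" + "\n".join(events) + "\nEND:VCALENDAR\n"
    with open("weather.ics", "w") as f:
        f.write(ics)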
lpcvoid 50 minutes ago [-]
Try Quickweather (with OpenMeteo) if you're on Android. I love it.
skydhash 2 hours ago [-]
Why would I need a browser to play music? Or to send an email? Or to type code? My browser usage is mostly for accessing stuff on someone else’s computer.
lpcvoid 1 hours ago [-]
The only subscription I have is Spotify, since there's no easy way that I know of to get the discoverability of music the way Spotify allows it.
For the rest: I agree with you.
teeray 2 hours ago [-]
Companies love externalizing the costs of making efficient software onto consumers, who need to purchase more powerful computing hardware.
vladvasiliu 1 hours ago [-]
If only. At work I've got a new computer, replacing a lower-end 5-year-old model. The new one has four times the cores, twice the RAM, a non-circus-grade SSD, and a high-powered CPU as opposed to the "U" series chip the old one had.
I haven't noticed any kind of difference when using Teams. That piece of crap is just as slow and broken as it always was.
sfn42 1 hours ago [-]
Yeah people love to shit on electron and such but they're full of crap. It doesn't matter one bit for anything more powerful than a raspberry pi. Probably not even there. "Oh boo hoo chrome uses 2 gigs of ram" so what you have 16+ it doesn't matter. I swear people have some weird idea that the ideal world is one where 98% of their ram just sits unused, like the whole point of ram is to use it but whenever an application does use it people whine about it. And it's not even like "this makes my pc slow" it's literally just "hurr durr ram usage is x" okay but is there an actual problem? Crickets.
duskdozer 22 minutes ago [-]
The web browser on my phone instantly gets killed the moment I switch to another app because it eats up so much ram.
interf4ce 37 minutes ago [-]
The issue isn't usage, it's waste. Every byte of RAM that's used unnecessarily because of bloated software frameworks used by lazy devs (devs who make the same arguments you're making) is a byte that can't be used by the software that actually needs it, like video editing, data processing, 3D work, CAD, etc. It's incredibly short sighted to think that any consumer application runs in a vacuum with all system resources available to it. This mindset of "but consumers have so much RAM these days" just leads to worse and worse software design instead of programmers actually learning how to do things well. That's not a good direction and it saddens me that making software that minimizes its system footprint has become a niche instead of the mainstream.
tl;dr, no one is looking for their RAM to stay idle. They're looking for their RAM to be available.
gryfft 1 hours ago [-]
"chrome uses 2gb of ram"
these days individual _tabs_ are using multiple gb of ram.
vladvasiliu 50 minutes ago [-]
I think it's a correlation vs causation type thing. Many Electron apps are extremely, painfully, slow. Teams is pretty much the poster child for this, but even spotify sometimes finds a way to lag, when it's just a freaking list of text.
Are they slow because they're Electron? No idea. But you can't deny that most Electron apps are sluggish for no clear reason. At least if they were pegging a CPU, you'd figure your box is slow. But that's not even what happens. Maybe they would've been sluggish even using native frameworks. Teams seems to do 1M network round-trips on each action, so even if it was perfectly optimized assembly for my specific CPU it would probably make no difference.
Esophagus4 2 hours ago [-]
It seems like as hardware gets cheaper, software gets more bloated to compensate. Or maybe it’s vice versa.
I wonder if there’s a computer science law about this. This could be my chance!
Not exactly the same (it's about power rather than price). But close enough that when you said it, I thought, "oh! there is something like that." There are also more fundamental economic laws at play for supply and demand of a resource / efficiencies at scale / etc. Given our ever-increasing demand for compute compared to the increasing supply (cheaper, more powerful compute), I expect supply will bottleneck before demand does.
mettamage 2 hours ago [-]
This is the way
WillAdams 1 hours ago [-]
3D CAD/CAM is still CPU (and to a lesser extent memory) bound --- I do joinery, and my last attempt at a test joint for a project I'm still working up to was a 1" x 2" x 1" area (two 1" x 1" x 1" halves which mated), which took an entry-level CAM program some 18-20 minutes to calculate and produced a ~140MB file including G-code toolpaths... (I really should have tracked memory usage.)
swiftcoder 2 hours ago [-]
The big one for me is ballooning dependency trees in popular npm/cargo frameworks. I had to trade a perfectly good i9-based MacBook Pro up to an M2, just to get compile times under control at work.
The constant increases in website and electron app weight don't feel great either.
mbfg 39 minutes ago [-]
I've never had a personal computer that came even close to powerful enough to do what I want. Compiles that take 15 minutes are really annoying, for instance.
duskdozer 1 hours ago [-]
>My phone has 16gigs of ram and a terabyte of storage
That's "non powerful" to you?
kace91 41 minutes ago [-]
The opposite. I meant that if this is what consumer grade looks like nowadays, even with a fraction of current flagships we seem well covered - this was less than 800 bucks.
guessmyname 3 hours ago [-]
> I personally dropped $20k on a high end desktop - 768G of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked […]
768GB of RAM is insane…
Meanwhile, I’ve been going back and forth for over a year about spending $10k on a MacBook Pro with 128GB. I can’t shake the feeling I’d never actually use that much, and that, long term, cloud compute is going to matter more than sinking money into a single, non-upgradable machine anyway.
rafaelmn 2 hours ago [-]
Your battery is going to suffer because of the extra ram as well.
I don't know your workloads, but for me personally 64 GB is the ceiling on RAM - I can run an entire k8s cluster locally with that, and the M5 Pro with the top cores is the same CPU as the M5 Max. I don't need the GPU - the local AI story and OSS models are just a toy for my use-cases and I'm always going to shell out for the API/frontier capabilities. I'm even thinking of the 48 GB config because they already have those at 8% discounts/shipped by Amazon, and I never hit that even on my workstation with 64 GB.
zozbot234 2 hours ago [-]
> Your battery is going to suffer because of the extra ram as well.
No, it won't. The power drain of merely refreshing DRAM is negligible, it's no higher than the drain you'd see in S3 standby over the same time period.
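Back-of-envelope, with loudly assumed numbers (the ~0.5 W figure for an extra 64 GB of self-refreshing LPDDR is a pessimistic guess, not a datasheet value, and so are the battery and active-draw figures):

    extra_refresh_w = 0.5      # assumed: extra self-refresh draw for +64 GB of LPDDR
    battery_wh = 100           # assumed: ~100 Wh battery in a 16" laptop
    active_system_w = 20       # assumed: rough average draw under light active use

    print(f"standby drain from extra RAM: {extra_refresh_w / battery_wh * 100:.1f} %/hour")
    print(f"share of active draw: {extra_refresh_w / active_system_w * 100:.0f} %")
    # -> ~0.5 %/hour on standby, and a few percent of draw while actually using the machine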
cduzz 35 minutes ago [-]
I suspect this is one of those "it depends" situations; does the 128gb vs 64gb sku have more chips or denser chips? If "more chips" probably it'll draw a tiny bit more power than the smaller version. If the "denser" chips, it may be "more power draw" but such a tiny difference that it's immaterial.
Similarly, having more cache may mean less SSD activity, which may mean less energy draw overall.
If I had a chip to put on the roulette table of this "what if" I'd put it on the "it won't make a difference in the real world in any meaningful way" square.
3form 2 hours ago [-]
Given the DRAM refresh is part of S3 standby, I'm afraid this is circular reasoning.
barrkel 3 hours ago [-]
Look at the way age gating is going in a global coordinated push. Can control of compute be far behind?
It wasn't my primary motivator but it hasn't made me regret my decision.
I hummed and hawed on it for a good few months myself.
WillAdams 1 hours ago [-]
Just look at ITAR and the various attempts at legislating 3D printing and CNC machining of firearms parts to see one justification point of that.
jusssi 2 hours ago [-]
> Can control of compute be far behind?
How is this going to work? You need uncontrolled compute for developing software. Any country locking up that ability too much will lose to those who don't.
cesarb 50 minutes ago [-]
> How is this going to work? You need uncontrolled compute for developing software.
I've read about companies where all software developers have to RDP to the company's servers to develop software, either to save on costs (sharing a few powerful servers with plenty of RAM and CPU between several developers) or to protect against leaks (since the code and assets never leave the company's Citrix servers).
gzread 3 hours ago [-]
> 768GB of RAM is insane.
Before this price spike, it used to be you could get a second-hand rack server with 1TB of DDR4 for about $1000-2000. People were massively underestimating the performance of reasonably priced server hardware.
You can still get that, of course, but it costs a lot more. The recycling company I know is now taking the RAM out of every server and selling it separately.
Apple hardware is incredibly overpriced.
FpUser 2 hours ago [-]
My home server has 512GB RAM, 48 cores, my 4 desktops are 16 cores 128GB, 4060GPU each. Server is second hand and I paid around $2500 for it. Just below $3000 price for desktops when I built them. All prices are in Canadian Pesos
xyzsparetimexyz 2 hours ago [-]
Canadian Pesos?
doubled112 2 hours ago [-]
Jokes because the Canadian dollar’s value isn’t very high right now.
See a $1100 GPU on eBay, but it’s in the US? Actually a $1900 GPU.
A colleague and I were just talking about how well he timed the purchase of his $700 24GB 3090.
heffer 2 hours ago [-]
Please, it's actually Cambodian Dollhairs or Canuckistan Pesos.
FpUser 2 hours ago [-]
It is sarcasm. Our dollar, which used to be on par with the US dollar, is no more.
motbus3 1 hours ago [-]
Superficially speaking, I believe you could be right. But I think it has been realised that causing scarcity of products and commodities is a power move.
We live in a world where we optimised for globalization: industry in China, oil in the Middle East, etc...
This approach proved to be fragile in the hands of people with enough money and/or power to tilt the scale.
picture 3 hours ago [-]
It seems like you largely agree with the article - people shall own nothing and be happy. Perhaps the artificially induced supply crunch could go on indefinitely.
Also, I wonder how many of us, even here on HN, have the ability to spend that amount of money on a computer for personal use. Frankly I wouldn't even know what to do with all the RAM - should I just ramdisk every program I use and every digital thing I made in the last five years?
Anyhow, I suppose for the folks who can't afford hardware (perhaps by design), one ought to own nothing and be happy.
People spend a lot more than that on a car they use less, especially if they're in tech.
The RAM choice was because I have never regretted buying more RAM - it's practically always a better trade than a slightly faster CPU - and 96GB DIMMs were at a sweet spot compared to 128GB DIMMs.
That, and the ability to have big LLMs in memory, for some local inference, even if it's slow mixed CPU/GPU inference, or paged on demand. And if not for big LLMs, then to keep models cached for quick swapping.
sfn42 45 minutes ago [-]
I bought a 4 year old car for significantly less than that. And I can get a computer that can do 99% of what your monster can do for like 10% of the price. And if I want LLM inference I can get that for like $20 a month or whatever.
I don't mean to judge, it's your money but to me it seems like an enormous waste. Just like spending $100k on a car when you can get one for $15k that does pretty much exactly the same job.
barrkel 20 minutes ago [-]
Sure. You're right, it is my money. And I pay even more for inference on top; I have OpenRouter credits, OpenAI subscription, Claude Max subscription.
It's not so easy to get nice second-hand hardware here in Switzerland, and my HEDT is nice and quiet, doesn't need to be rack-mounted, plugs straight into the wall. I keep it in the basement next to the internet router anyway.
The "sensible" choice is to rent. It's the same with cars; most people these days lease (about 50% of new cars in CH, which will be a majority if you compare it with auto loan and cash purchase).
Aurornis 30 minutes ago [-]
> Most consumers are using laptops and laptops are not keeping pace with where the frontier is in a singular compute node.
How can you say this when Apple is releasing extremely fast M5 MacBook Pros? Or the $600 MacBook Neo that has incredible performance for that price point?
Even x86 is getting some interesting options. The Strix Halo platform has become so popular with LLM users that the parts are being sold in high numbers for little desktop systems.
embedding-shape 11 minutes ago [-]
They're ultimately laptops, you won't be able to squeeze out the same amount of performance from a laptop compared to a desktop, regardless of the hardware.
If you haven't tried out a desktop CPU in a while, I highly recommend giving it a try if you're used to only using laptops; even within the same class the difference is obvious.
chmod775 25 minutes ago [-]
They're fast, but they'll never even remotely reach what a mid-range desktop PC with dedicated graphics burning 500W is able to do.
A 300W GPU released in 2025 is about 10x M5 perf. The difference is going to be smaller for CPU perf, but also not close.
porkeynon 32 minutes ago [-]
$20k?
People laugh at young men for looksmaxxing. And then there’s this. I dunno. As someone who has been playing computer games since the 70s, I clearly do not understand the culture anymore. But what forces would drive a young man to spend the price of a used car to play a derivative FPS? It seems heartbreaking. Just like the looksmaxxer.
barrkel 30 minutes ago [-]
Alas, I'm not a young man any more. And my HEDT is headless, it has no monitor with which to play FPSes.
embedding-shape 10 minutes ago [-]
[dead]
kgeist 2 hours ago [-]
>we're at an inflection point where DC hardware is diverging rapidly from consumer compute.
I thought the trend is in the opposite direction, with the RTX 5x series converging with server architectures (Blackwell-based, such as the RTX 6000 Pro+). Just less VRAM and fewer tensor cores, artificially.
Where is the divergence happening? Or you don't view RTX 5x as consumer hardware?
girvo 1 hours ago [-]
Blackwell diverges within Blackwell itself… SM121 on the GB10 vs the RTX 5000 consumer vs the actual full fat B100 hardware all have surprisingly different abilities. The GB10 has been hamstrung by this a bit, too.
shevy-java 2 hours ago [-]
> We won't be in a supply crunch forever.
I don't share the same 1:1 opinion with regards to the article, but it is absolutely clear that RAM prices have gone up enormously. Just compare them. That is fact.
It may be cheaper later on, but ... when will that happen? Is there a guarantee? Supply crunch can also mean that fewer people can afford something because the prices are now much higher than before.
Add to this the oil crisis Trump started and we are now suddenly having to pay more just because a few mafiosi benefit from this. (See Krugman's analysis of the recent stock market flow of money/stocks.)
bluGill 53 minutes ago [-]
General predictions are that in 3-5 years things will return to normal. 3 years if the current AI crunch is a short-term thing, 5 years if it isn't and we have to build new RAM factories.
echelon 2 hours ago [-]
Local is a dead end.
Open source efforts need to give up on local AI and embrace cloud compute.
We need to stop building toy models to run on RTX and instead try to compete with the hyperscalers. We need open weights models that are big and run on H200s. Those are the class of models that will be able to compete.
When the hyperscalers reach takeoff, we're done for. If we can stay within ~6 months, we might be able to slow them down or even break them.
If there was something 80-90% as good as Opus or Seedance or Nano Banana, more of the ecosystem would switch to open source because it offers control and sovereignty. But we don't have that right now.
If we had really competitive open weights models, universities, research teams, other labs, and other companies would be able to collaboratively contribute to the effort.
Everyone in the open source world is trying to shrink these models to fit on their 3090 instead, though, and that's such a wasted effort. It's short term thinking.
An "OpenRunPod/OpenOpenRouter" + one click deploy of models just as good as Gemini will win over LMStudio and ComfyUI trying to hack a solution on your own Nvidia gaming card.
That's such a tiny segment of the market, and the tools are all horrible to use anyway. It's like we learned nothing from "The Year of Linux on Desktop 1999". Only when we realized the data center was our friend did we frame our open source effort appropriately.
zozbot234 2 hours ago [-]
> We need open weights models that are big and run on H200s.
We have this class of models already, Kimi 2.5 and GLM-5 are proper SOTA models. Nemotron might also release a larger-sized model at some time in the future. With the new NVMe-based offload being worked on as of late you can even experiment with these models on your own hardware, but of course there's plenty of cheap third-party inference platforms for these too.
lpcvoid 2 hours ago [-]
> Open source efforts need to give up on local AI and embrace cloud compute.
Oh god no, please not more slop, you're already consuming over 1 percent of human energy output, could you, like, chill a bit?
nhecker 1 hours ago [-]
In a similar vein: seek efficiency.
I.e., /if/ I am going to consume LLM tokens, I figure that a local LLM with 10s of billions of parameters running on commodity hardware at home will still consume far more energy per token than that of a frontier model running on commercial hardware which is very strongly incentivized to be as efficient as possible. Do the math; it isn't even close. (Maybe it'd be closer in your local winter, where your compute heat could offset your heating requirements. But that gets harder to quantify.)
Maybe it's different if you have insane and modern local hardware, but at least in my situation that is not the case.
zozbot234 32 minutes ago [-]
But commodity hardware that's right-sized for your own private needs is many orders of magnitude cheaper than datacenter hardware that's intended to serve millions of users simultaneously while consuming gigawatts in power. You're mostly paying for that hardware when you buy LLM tokens, not just for power efficiency. And your own hardware stays available for non-AI related needs, while paying for these tokens would require you to address these needs separately in some way.
nhecker 14 minutes ago [-]
>And your own hardware stays available for non-AI related needs, while paying for these tokens would require you to address these needs separately in some way.
^ Fair. Yep, I agree the calculus changes if you don't have _any_ local hardware and you're needing to factor in the cost of acquiring such hardware.
When I did this napkin math, I was mostly interested in the energy aspect, using cost as a proxy. I was calculating the $/token (taking into consideration the cost of a kWh from my utility, the measured power draw of my M1 work machine, and the measured tokens per second processed by a ~20B-parameter open-weight model). I then compared this to the published $/token rate of a frontier provider, and it was something like two orders of magnitude in favor of the frontier model. I get it, they're subsidizing, but I've got to imagine there's some truth in the numbers.
I wonder, does (or will) the $/token ratio fall asymptotically toward the cost of electricity? In my mind I'm drawing a parallel to how the value of mined cryptocurrency approximately tracks the cost of electricity... but I might be misremembering that detail.
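If anyone wants to redo that napkin math, the structure is trivial; every input below is a placeholder rather than one of my measurements, and the comparison swings a lot depending on your real token rate and which API tier you put on the other side:

    # Napkin math for the local electricity cost per token. All inputs are
    # placeholders to be replaced with your own measurements.
    system_draw_w = 30          # placeholder: wall draw during local inference
    tokens_per_s = 10           # placeholder: throughput of your local model
    price_per_kwh = 0.30        # placeholder: your utility rate, $/kWh

    joules_per_token = system_draw_w / tokens_per_s
    kwh_per_million_tokens = joules_per_token * 1e6 / 3.6e6
    local_cost = kwh_per_million_tokens * price_per_kwh

    api_cost = 10.0             # placeholder: provider's published $ per 1M output tokens

    print(f"local electricity: ${local_cost:.2f} per 1M tokens")
    print(f"API list price:    ${api_cost:.2f} per 1M tokens")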
gessha 2 hours ago [-]
Man, going to personal computing was a mistake, we should’ve stayed jacked to the mainframes /s
lenova 12 minutes ago [-]
Oh man, I've come across this person's blog before and I love it, not just because of the personalization/personality they've put into the site's design, but because of all of the random CLI/TUI-based tools they've developed. Examples:
I miss the days of the web being weird like this :-)
saadn92 38 minutes ago [-]
The article's dystopia section is dramatic but the practical point is real. I've been self-hosting more and more over the past year specifically because I got uncomfortable with how much of my stack depended on someone else's servers.
Running a VPS with Tailscale for private access, SQLite instead of managed databases, flat files synced with git instead of cloud storage. None of this requires expensive hardware, it just requires caring enough to set it up.
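As a sketch of how small the "SQLite + flat files in git" end of that stack can be (the paths and remote name are placeholders, and it assumes the data directory is already a git repo):

    # Snapshot the SQLite database with its own backup API, then commit and
    # push the flat files (and the snapshot) - suitable as a nightly cron job.
    import sqlite3, subprocess
    from datetime import date

    DATA_DIR = "/srv/notes"             # placeholder: directory that is already a git repo
    DB_PATH = f"{DATA_DIR}/app.db"      # placeholder

    # Consistent snapshot even if the app is writing to the DB.
    src = sqlite3.connect(DB_PATH)
    dst = sqlite3.connect(f"{DATA_DIR}/app-backup-{date.today()}.db")
    src.backup(dst)
    src.close(); dst.close()

    subprocess.run(["git", "-C", DATA_DIR, "add", "-A"], check=True)
    # commit exits non-zero when there's nothing new, which is fine here
    subprocess.run(["git", "-C", DATA_DIR, "commit", "-m", f"snapshot {date.today()}"])
    subprocess.run(["git", "-C", DATA_DIR, "push", "origin", "main"], check=True)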
mwcz 17 minutes ago [-]
Depending on someone else's servers isn't that different from depending on someone else's software, which unfortunately we all must do. Unfathomable reams of it, with a growth curve that recently went vertical. I guess the crucial difference is that someone else's servers can be taken away in a flash, while someone else's (FOSS) software can't.
gregoriol 31 minutes ago [-]
You are missing one important part: maintenance. While on a managed service, dozens of hours of maintenance are done by someone else, when you are self-hosting you'll be doing 3 times that, because you can't know all the details of making so many tools work, because each tool will have to be upgraded at some point and the upgrade will fail, because you have to test your backups, and many many more things to do in the long run.
So yeah, it's fun. But don't under-estimate that time; it could easily be your time spent with friends or family.
Carrok 26 minutes ago [-]
I have been self hosting for years. The maintenance is minimal to nonexistent. You are conflating modern SaaS with a stable OSS docker image.
organsnyder 9 minutes ago [-]
Keeping services running is fairly trivial. Getting to parity with the operationalization you get from a cloud platform takes more ongoing work.
I have a homelab that supports a number of services for my family. I have offsite backups (rsync.net for most data, a server sitting at our cottage for our media library), alerting, and some redundancy for hardware failures.
Right now, I have a few things I need to fix:
- one of the nodes didn't boot back up after a power outage last fall; need to hook up a KVM to troubleshoot
- cottage internet has been down since a power outage, so those backups are behind (I'm assuming it's something stupid, like I forgot to change the BIOS to power on automatically on the new router I just put in)
- various services occasionally throw alerts at me
I have a much more complex setup than necessary (k8s in a homelab is overkill), but even the simplest system still needs backups if you care at all about your data. To be fair, cloud services aren't immune to this, either (the failure mode is more likely to be something like your account getting compromised, rather than a hardware failure).
mbeex 12 minutes ago [-]
Much easier with AI. Went from a web hosting all-in package + NAS to Hetzner Storage Share and a separate email provider (Runbox). After a short time I dumped the Nextcloud instance and moved on to a Hetzner VPS with five Docker containers, Caddy, proper authentication and all. Plus a Storage Box. Blogging/homepage as Cloudflare Pages, fed by GitHub, domains from CF and porkbun, Tailscale, etc., etc. ad nauseam, NAS still there.
Most of this I didn't do for many years because it is not my core competence (in particular the security aspects). Properly fleshed-out explanations from any decent AI will catapult you to this point in no time. Maintenance? Almost zero.
horsawlarway 18 minutes ago [-]
This point is oversold.
Sure - self hosting takes a bit more work. It usually pays for itself in saved costs (i.e. if you weren't doing this work yourself, you'd be paying money - which you had to work for - to have it done for you).
Cloud costs haven't actually gotten much cheaper (but the base hardware HAS - even now during these inflated costs), and now every bit of software tries to bill you monthly.
Further, if you're not exposing services to the open web, you actually don't need to update all that often. Especially not the services themselves.
Honestly - part of the benefit of self-hosting is that I can choose whether I really want to make that update to latest, and whether the features matter to me. Often... they don't.
---
Consider: Most people are running outdated ISP-provided routers with known vulnerabilities that haven't been updated in literally years. They do ok.
josmar 25 minutes ago [-]
Since using NixOS for my home server, I've found it to Just Work™ flawlessly every time.
If anyone reading this has struggled with servers accumulating cruft and requiring maintenance, I recommend NixOS.
saadn92 19 minutes ago [-]
yes, I do agree with that sentiment, there are times when I'm spending way too much time restarting a service that went down, but it doesn't take as long as it used to, especially with AI assistance nowadays. If I'm spending too much time on it, then I'm also probably learning something along the way, so I don't mind spending that time.
esseph 24 minutes ago [-]
There are a lot of people that have made a lot of money and careers because developers in particular don't want to know or don't care to know how to manage this stuff.
They need to get over it.
Pick up some Ansible and/or Terraform/tofu and automate away. It can be as easy or as involved as you want it to be.
bluejay2387 2 hours ago [-]
The general take here seems to be "everything eventually passes". That isn't always true. I wonder how many people have a primary computing device that they don't even have full control over now (Apple phones, tablets...). Years ago the concept of spending over $1k on a computer that I didn't even have the right to install my own software on was considered ridiculous by many people (myself included). Now many people primarily consume content on a device controlled almost entirely by the company they bought it from. If the economics lead to a situation where it's more profitable to sell you compute time than to sell you computers, then businesses will choose to not sell you computers. I have no idea if that is what ends up happening.
threetonesun 36 minutes ago [-]
The framing here is wrong, I think. My iPad has a lot of software on it that I use for music production, and it all runs locally. Yes, I had to install it through Apple's App Store, but I could disconnect it from the Internet and expect it, at this point, to keep working as long as the software on almost any piece of hardware it replaces.
Meanwhile my much more expensive laptop mostly interfaces with applications that primarily exist on servers that I have no control over, and it would be nearly worthless if I disconnected it from the Internet. Your central point is right, the economics are concerning, but I think it's been a ship slowly sailing away that we're now noticing has disappeared over the horizon.
cmiles74 2 hours ago [-]
It's worth keeping an eye on this HP-rental-laptop thing.
Personally I think it will be a big headache for HP, people can be hard on laptops and HP is already not excited about consumer support (i.e. mandatory 15 minute wait time for support calls). But if they make it work, I think there's probably a good number of people who feel like they need a laptop but don't care so much about the specifics and want to keep their costs low (as all of their costs appear to be rising right now).
bluGill 48 minutes ago [-]
Rental seems to be about corporate laptops. Companies just want things to work at a predictable cost. They are already replacing laptops after 5 years even if they work. They are already replacing a few laptops that break in less than that 5 years. In short they are already renting the laptops, they are just paying the price upfront and then using accounting to balance it out. Rental just moves the accounting, but otherwise nothing changes.
For consumers who don't replace their laptops on a schedule it makes less sense.
ozgrakkurt 2 hours ago [-]
To be fair, the people that have an iPad as their only computing device now didn't have a computer back then.
TeMPOraL 2 hours ago [-]
Not necessarily. Many people grew up with PCs and laptops but now mostly use their phones, because outside of specific jobs or hobbies, everyday computing needs are heavily optimized for mobile-first.
(A large factor here is, obviously, the cloud. With photos, documents, e-mail, IMs, etc. all hosted for cheap or free on "other people's computers", the total hardware demands on the end-user computing device is much less. Think storage, not just RAM.)
It's true even in tech; half a year ago I switched my phone to a Galaxy Z Fold7, and I haven't used my personal laptop since then, not once. I have a separate company laptop for work, and I occasionally turned on my PC, but it turns out that a foldable phone is good enough to do everything on personal side I'd normally use a laptop for. So here I am, with my primary compute device I don't have full control over - and yes, I'm surprised by this development myself, and haven't fully processed it yet.
audunw 48 minutes ago [-]
> Not necessarily. Many people grew up with PCs and laptops but now mostly use their phones, because outside of specific jobs or hobbies, everyday computing needs are heavily optimized for mobile-first.
It's a deeply flawed comparison, because many of the things we do with a phone now wasn't something we'd do at all with the computers we grew up with. We didn't pay at the grocery store with a computer, we didn't buy metro tickets, we didn't use it to navigate (well, there was a short period of time where we might print out maps, but anyway..)
When I grew up, I feel like our use of home computers fell into two categories:
1. Some of us kids used them to play games. Though many more would have a Nintendo/Sega for that, and I feel like the iPhone/iPad is a continuation of that. The "it just works" experience where you have limited control over the device.
2. Some parents would use it for work/spreadsheets/documents ... and that's still where most people use a "real" computer today. So nothing has really changed there.
There is now a lot more work where you do the work on services running on a server or in the cloud. But that's back to the original point: that's in many cases just not something we could do with old home computers. Like, my doctor can now approve my request for a prescription from anywhere in the world. That just wasn't possible before, and arguably isn't possible without a server/cloud-based infrastructure.
Phones/tablets as an interface to these services is arguably a continuation of like those old dumb terminals to e.g. AS/400 machines and such.
> It's true even in tech; half a year ago I switched my phone to a Galaxy Z Fold7, and I haven't used my personal laptop since then, not once.
I do agree, I am in a similar situation.
nhecker 10 minutes ago [-]
(edit: I'm broadly in agreement with your comment & observations, so I don't at all mean to come off as argumentative for the sake of being argumentative. You just got me thinking about how that situation might have been handled thirty or a hundred years ago.)
> [...] my doctor can now approve my request for a prescription from anywhere in the world. That just wasn't possible before [...]
I'm picking nits, but wasn't this more or less instantaneous approval possible before with e.g., a fax and a telephone? Or (although this is a bit of a stretch) a telegram and telegraph?
bluGill 37 minutes ago [-]
In a lot of ways the cloud is better than my personal computer, even if I'm on it.
There is a reason I have a server in my basement - it lets me edit files on my phone (if I must - the keyboard and screen space are terrible compromises, but sometimes I can live with it), laptop (acceptable keyboard and screen), or desktop (great keyboard, large screen); it also lets me share with my wife (I haven't got this working but it can be done). I have nearly always had a server in my house because sharing files between computers is so much better than only being able to work on one (or using floppies). The cloud expands my home server to anywhere in the world: it offloads security onto someone else, and makes it someone else's problem to keep the software updated.
There is a lot to hate about the cloud. My home servers also have annoyances. However for most things it is conceptually better and we just need the cloud providers to fix the annoyances (it is an open question if they will)
nhecker 1 hours ago [-]
Ditto. My personal equipment includes a home server (128GB DDR3 ECC) and a tablet with a keyboard. It's honestly astonishing what you can do without a full-fledged laptop, if you're willing to go through some gymnastics to get there. And it travels light compared to a laptop! (The tablet, that is. Not the headless box. :-))
daveidol 24 minutes ago [-]
Not my 70yo mom. She used to have a big gray PC but switched to a Chromebook (one I gave her) about 15 years ago, and now only uses her phone and tablet.
doom2 2 hours ago [-]
I'm also very skeptical of "everything eventually passes" as it pertains to hardware prices. Right now, prices are high because supply can't keep up with demand. But if/when supply increases to meet demand or demand decreases, there's no reason for companies to drop prices now that consumers have become accustomed to them.
hombre_fatal 1 hours ago [-]
> there's no reason for companies to drop prices
Competition.
bitmasher9 2 hours ago [-]
My primary concern is for next generation hardware.
Will we continue to see steady improvement in top quality CPU/GPUs? Would they even bother releasing consumer versions of ram faster than DDR5?
surgical_fire 41 minutes ago [-]
Being in control of your own computing device was always a niche. The vast majority of people are not interested in computing itself, only in the output. For that majority, this is fine.
The niche is still there, probably as big as it was before. For example, as I grew weary of being subject to services I have little control over, I set up my own home server using a refurbished PC. It has been an amazing journey so far. But I don't think a normie would ever get interested in buying a refurbished Dell, install Debian on it, and set up their own services there.
As long as there is a niche of people interested in buying their own computers, there will be companies willing to fill that niche.
BLKNSLVR 35 minutes ago [-]
This may not be entirely appropriate to the reasons behind the article, but it feels tangentially related:
I'd like to say a brief thank you to what the brief, golden period of globalisation was able to bring us.
I hope that that level of international trade and economic cooperation across geographical, ideological, political, and religious boundaries can be achieved again at some point in the future, but it seems the pendulum is swinging the other way for the time being.
I hope that, wherever the current direction ends up, there are lessons that can be learnt about what we had, and somehow fumbled, such that there is motivation enough to get back there.
stronglikedan 30 minutes ago [-]
> I hope that that level of international trade and economic cooperation across geographical, ideological, political, and religious boundaries can be achieved again at some point in the future
Me too, but without all the slavery this time please. It'll never work if some actors are willing to abuse their workforces to keep prices low as they do.
commandlinefan 4 minutes ago [-]
When I started programming in the early 80's, personal computing had just recently become a thing. Before that, if you wanted to learn to program, you first needed access to a very rare piece of hardware that only a select few were granted access to. But when personal computing became a reality, programming exploded - anybody could learn it with a modest investment.
I suspect we're trending back to the pre-personal computing era where access to 'raw' computing power will be hard to come by. It will become harder and harder to learn to program just because it'll be harder and harder to get your hands on the necessary equipment.
rswail 2 hours ago [-]
A long article that begs the question, when the last paragraph or two counter the panic of the beginning. Two Chinese firms are ramping up production of consumer RAM/SSDs because they see a market opening as the existing producers move to selling to enterprise/hyperscalers.
There have been memory chip panics before; the US funded RAM production back in the 80s/90s in competition with Japan at the time.
The AI boom/"hyperscale" currently is almost exactly like the dotcom boom.
It's already starting to shake down. Anthropic is occupying the developer space, OpenAI has just exited the video/media production space. More focused and vertical market AI is emerging.
The current vortex of money between OpenAI <-> Microsoft <-> Oracle <-> Nvidia <-> Google <-> etc etc is going to break.
jameshart 4 minutes ago [-]
The effects of the AI hyper scaling boom on the commodity hardware and energy markets are very much not like the dot com boom.
Outside of the obvious economic effect of the dot com boom - the creation of near infinitely scalable high margin online businesses - there was a secondary effect on consumer electronics, with a massive growth in demand for networked devices; there was then much more of a balance between the hardware growth in the network infrastructure and data center worlds as well as in desktop and mobile.
The AI boom’s hardware impact is much more skewed, as this article details.
zozbot234 2 hours ago [-]
> Two Chinese firms are ramping up production of consumer RAM/SSDs because they see a market opening
Yes but these Chinese firms are a tiny share of the overall RAM/SSD market, and they'll have the same problems with expanding production as everyone else. So it doesn't actually help all that much.
jameshart 18 minutes ago [-]
The fact that there’s been a massive expansion in the nonconsumer market means the consumer market makes up a smaller proportion of the overall market, but it doesn’t mean the consumer market is any smaller than it used to be.
bluGill 34 minutes ago [-]
The biggest problem in expanding for everyone else is they don't trust the market to exist for long enough to be worth paying for a new factory so they are not investing in it. The Chinese might be small, but they think the market will exist and are investing. Will they be right or wrong - I don't know.
bitmasher9 1 hours ago [-]
Chinese firms won’t have the exact same problems as anyone else. Some problems will be the same but not all.
* Chinese firms finance through different banks and investors than current ram producers
* A company with a mission statement of consumer ram won’t have their supply outbid by data centers
* Chinese manufacturing has more expertise in scaling than any other manufacturing culture
upofadown 2 hours ago [-]
This article inspired me to look and see what this computer is. Apparently it is a "AMD Athlon(tm) II X2 250 Processor" from 2009. So 17 years old. It has 8 GB of DDR3 memory and runs at 3 GHz. It currently has OpenBSD on it, but at least one source thinks it could run Windows 10.
The fact that I didn't know any of this is what is significant here. At some point I stopped caring about this sort of thing. It really doesn't matter any more. Don't get me wrong, I am as nerdy as they come. My first computer was a wire-wrapped 8080-based system. That was followed by an also wire-wrapped 8086-based system of my own design that I used for day-to-day computing tasks (it ran Forth). If someone like me can get to the point of not caring, there is no real reason for anyone else to care.
1970-01-01 2 hours ago [-]
Your electricity bill alone could justify the cost of a new computer purchase if you're not shutting that down after every session.
einr 2 hours ago [-]
65W TDP? Let's say we want to run a PC so we're switching to a newer low-end Ryzen with a 35W TDP and that that's a 30W difference for the whole system. Let's say we're running the system 24/7 and the CPU is pulling its full TDP constantly. Average US residential electricity price is $0.18/kWh.
0.03 kW * 24 h * 365 d * $0.18 = $47.30/year
cjs_ac 1 hours ago [-]
In the UK, residential electricity tariffs are currently capped by the regulator at 27.69p per kWh, resulting in a total yearly cost of £72.77. Much higher than in the US, but still much cheaper than a new PC.
So $50/yr for 4 years gives you ~$200: enough for a decent Lenovo M700 Tiny (~$150, plus ~$50 for shipping or whatever) with much better performance and much lower power consumption.
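In break-even terms, with the (worst-case, 24/7) numbers above and a rough replacement cost:

    # Payback period for replacing the old box, using the thread's figures:
    yearly_savings = 47.30       # US figure from above (24/7 at full TDP, worst case)
    replacement_cost = 200.0     # rough: used Lenovo M700 Tiny plus shipping

    print(f"payback: {replacement_cost / yearly_savings:.1f} years")   # ~4.2 years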
einr 1 hours ago [-]
I guess. It's hardly an open-and-shut case of "throw your old computer away!" though, especially when this is a worst-case scenario of running a desktop computer at full blast 24/7 without it ever going into sleep mode or being turned off, and when you don't know what the user's needs are. Maybe a mini-PC with basically no expansion just won't really work for them?
SeanAnderson 1 hours ago [-]
Someone's never tried to locally compile a Rust program. :)
meindnoch 2 hours ago [-]
I know this may sound ridiculous, but m-maybe... maybe it's time for us to make software... less bloated?
Maybe... just maybe, a TODO list app shouldn't run 4 processes, and consume hundreds of megabytes of RAM?
guardian5x 2 hours ago [-]
Let me be the devil's advocate here.
Ok, let's say you optimize that TODO list app to only use 16 MB of RAM. What did you gain by that? Would you buy a smartphone that has less RAM now?
cheschire 2 hours ago [-]
It’s the upgrade treadmill you would stop using, and stick to the initial entry device.
TeMPOraL 2 hours ago [-]
If only there wasn't a security update treadmill forcing everyone to do regular hardware upgrades.
3form 1 hours ago [-]
Of course, as long as we're in the dreamland, most of these security upgrades do not actually require a hardware upgrade.
TeMPOraL 14 minutes ago [-]
Technically no (except for the gradual performance drop they introduce, + occasional TPM bullshit), but of course in practice, companies see this as a choice of spending money on back-porting security fixes to a growing range of hardware, vs. making money by not doing that and forcing everyone to buy new hardware instead.
layer8 1 hours ago [-]
I’m running Windows 10 ESU on a 13 year old PC without issues. While it’s admittedly near the end of its life (mostly just due to Windows 11, though I might repurpose it for Linux), I’m expecting the next one to also last a decade or longer.
TeMPOraL 3 minutes ago [-]
So is my wife; her laptop is still decent today, but doesn't support Win 11. I'm not worried about Microsoft as much as certain other competitors killing it - similarly to how she was forced to update to Windows 10 in the first place because, one day, out of the blue, her web browser decided to refuse to run on Windows 7.
layer8 1 hours ago [-]
It would be nice for browser tabs and apps to reload less often.
TeMPOraL 2 hours ago [-]
We can't ever escape market forces? You're right, of course: if software gets less bloated, vendors will "value-optimize" hardware, so in the end computers keep being barely usable, as they are today.
robinsonb5 1 hours ago [-]
This year's average phone is already going to have less RAM than last year's average phone - so anything that reduces the footprint of the apps (and even more importantly, websites) we're using can only be a good thing. Plus it extends the usable life of current hardware.
TeMPOraL 2 hours ago [-]
That's crazy talk. What will you ask for next? Add functionality to make apps at least as good/capable as they were in the 1990s and early 2000s? And then? Apps that interoperate? Insane.
More seriously and more ironically, we've now reached a strange time where even non-programmers can vibe-code better software than they can buy/subscribe to - not because models are that good, or because programming isn't hard, but because of the enshittification that has this industry rotten to the core and unable to deliver useful tools anymore.
rvz 2 hours ago [-]
Tell that to those who are still using Electron and TypeScript to create bloated desktop apps.
dust42 3 hours ago [-]
Just to mention one thing: helium - which is a necessity for chip production - is a byproduct of LNG production. And 20% of that is just gone (Qatar), and the question is how long it will take to get that back. So not only a chip shortage because of AI buying chips in huge volumes, but also because production will be hampered.
Tongue in cheek: we urgently need fusion power plants. For the AI and the helium.
adrianN 3 hours ago [-]
Fusion fuel is so energy dense that fusion plants will never produce industrially meaningful amounts of helium.
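Rough numbers bear this out - D-T fusion makes one He-4 atom per 17.6 MeV released, so even an implausibly large fusion fleet yields a trickle (the global demand figure below is a ballpark, not an exact statistic):

    # How much helium would fusion plants actually make?
    MEV = 1.602e-13                      # joules per MeV
    he4_mass = 4.0 * 1.66e-27            # kg per helium atom
    energy_per_kg_he = 17.6 * MEV / he4_mass          # ~4.2e14 J per kg of He produced

    gw_year = 1e9 * 3.156e7              # joules in one GW-year (thermal)
    he_per_gw_year = gw_year / energy_per_kg_he       # ~75 kg of He per GW-year

    fusion_fleet_gw_thermal = 1000       # assumed: a wildly optimistic future fleet
    annual_he_tonnes = fusion_fleet_gw_thermal * he_per_gw_year / 1000
    print(f"{annual_he_tonnes:.0f} t/yr vs roughly 30,000 t/yr of current demand")
    # -> on the order of 75 t/yr, i.e. a fraction of a percent of demand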
TeMPOraL 2 hours ago [-]
Well, as long as they can make electricity too cheap to meter, we can get helium from somewhere. Mine it from LNG sources currently untapped due to EROI < 1, or ship it from the goddamn Moon - ultimately, every problem in life (except that of human heart) can be solved with cheap energy.
kibwen 1 hours ago [-]
The mere existence of proof-of-work cryptocurrencies means that it is impossible to ever have electricity that is "too cheap to meter". Any time electricity prices would fall below the price of mining, that creates a market opportunity that will be filled by more mining. Wasted electricity is the product.
Arch485 21 minutes ago [-]
I think that's only because electricity is the bottleneck, though. If it was no longer the bottleneck, crypto miners would expand rapidly with more hardware, mining difficulty would increase, and eventually the bottleneck is storage space for all your GPUs, if not the GPUs themselves.
bitmasher9 1 hours ago [-]
With the trend of orbital launches becoming cheaper, it might be that mining helium off-Terra will be our long-term supply. Especially if the alternative is adjusting the number of protons in an atom.
There are several challenges, not least of which is storage. We have considerable leakage in most of our current helium storage solutions on earth because it’s so light. Our national reserves are literally in underground caverns because it’s better than anything we can build. Space just means any containment system will need to work in a wider range of pressure/temperatures.
adrianN 2 hours ago [-]
There is to my knowledge no reason to assume that complicated physics experiments that heat water to run a steam engine will be much cheaper than fission power plants, unfortunately.
TeMPOraL 11 minutes ago [-]
I can't say I agree with the conclusion, but I commend you for the concise and poetic description of what most power plants fundamentally are.
gobdovan 2 hours ago [-]
I think this is why he labelled the comment 'Tongue in cheek'. Thanks for pointing it out explicitly tho, was not aware of this.
numpad0 2 hours ago [-]
Can't they irradiate tanks of H2 or something with so much neutrons and electrons until morale improves and they become He? Or would that make radioactive He?
Arn_Thor 3 hours ago [-]
Considering my helium-filled hard drives a strategic reserve now
TeMPOraL 2 hours ago [-]
Gonna sit on my half-empty tank for party balloons from my daughter's birthday, maybe we'll be able to sell it to pay off mortgage quicker than the helium itself escapes the tank.
halapro 2 hours ago [-]
Same energy as "buy bitcoin" in 2011
TeMPOraL 2 hours ago [-]
Unfortunately bitcoins don't leak from storage tanks on their own.
myself248 2 hours ago [-]
That's another lifetime-limited thing -- the helium leaks out, and you cannot (for practical purposes) stop it or even meaningfully slow it down. When it's gone, the drives are dead. And the helium leaks by calendar-days, it doesn't matter whether the drive is powered on or off.
Non-helium hard drives are basically limited by their bearing spin hours. If one only spins a few hours a week, it'll probably run for decades. Not so with helium.
adrianN 2 hours ago [-]
You just have to put your hard drive in a pressure vessel filled with helium.
Forgeties79 2 hours ago [-]
It’s helium all the way down
tangotaylor 20 minutes ago [-]
I'm going to fight pessimism with cynicism here: the Department of Defense is not going to let everything move to the cloud, because they need compute at the edge for AI-enabled weapons and R&D. For example, Anduril's products, Eric Schmidt's secretive Bumblebee project, or startups like Scout AI. Communications and GPS are just too easy to jam, and their answer is giving weapons more last-mile autonomy to operate in radio silence.
War aside, I also bet there's going to be a huge demand for edge-compute for other kinds of robotics: self-driving cars, delivery robots, factory robots, or general-purpose humanoids (Tesla Optimus, Boston Dynamics Atlas, 1X NEO, etc). Moving that kind of compute to the cloud is too laggy and unreliable. I know researchers who've tried it, the results were mixed.
Also, the engineers working on these platforms aren't going to reinvent the wheel every time they need to connect hardware together and they're going to use interoperable standards, like PCIe for storage or GPUs, DIMM slots for memory, ATX for power, etc. So I don't see general-purpose computing dying.
CraigJPerry 3 hours ago [-]
The article's entire thesis can be completely derailed if one thing happens: AI infrastructure firms cease to be able to secure more capital.
Is that likely? History says it's inevitable, but timeframe is an open question.
Shank 3 hours ago [-]
> ai infrastructure firms cease to be able to secure more capital
If this does occur, unfortunately it isn’t like any of the production capacity is going to immediately shift or be repurposed. A lot of the hardware isn’t usable outside of datacenter deployments. I would guess a more realistic recalibration is 2-3 years of immense pain followed by gradual availability of components again.
lugu 3 hours ago [-]
> If this does occur
The capital from the Gulf is already disrupted. It's not a matter of if or when anymore.
gzread 2 hours ago [-]
My computer - and I think all Threadripper systems - has registered ECC DDR5 RAM, which I think is the same type used in AI datacenters. Well, one half of it; the other half is the HBM used on the GPUs, which is soldered on and non-upgradeable. But the main system memory from a used AI server can become your main system memory.
myself248 2 hours ago [-]
So that becomes the next question -- will we see an ecosystem of modifications and adapters, to desolder surplus and decommissioned datacenter HBM and put it on some sort of daughterboard with a translator so it can be used in a consumer machine?
Stuff like that already exists for flash memory; I can harvest eMMC chips from ewaste and solder them to cheaply-available boards to make USB flash drives. But there the protocols are the same, there's no firmware work needed...
duskdozer 48 minutes ago [-]
Aren't some people already doing this with consumer GPUs?
CraigJPerry 3 hours ago [-]
yeah 3 years sounds reasonable to me, less than one asset depreciation cycle in business. Pain for you and me, but just a bump in the road for the accounts dept.
sva_ 3 hours ago [-]
I think some players like xAI and Google can burn money for a long time. Google made $240B profit last year.
ramon156 2 hours ago [-]
They would rent out the data centers, not sell them off.
2716057 1 hours ago [-]
As long as there are consumers paying for hardware ownership, there will be businesses willing to sell it to them. The worst scenario I can imagine is that one has to pay a premium for fully-owned hardware simply because consumers' desire for it becomes an oddity and it is thus sold in low quantities.
The current AI-induced shortages aside, the times have never been better in my opinion. There is overwhelming choice; ordinary consumers can access anything from Raspberry Pis all the way up to enterprise servers and AI accelerators. The situation was very different in the 1990s when I built my first PC.
anonzzzies 3 hours ago [-]
I do not see this from an infinite-shortage point of view; I see it from a locked-down-hardware point of view. Old hardware is hackable, new hardware mostly not. That, for me, is where the real pain is, and why I just buy old computers and phones that are rootable.
Bender 50 minutes ago [-]
It is a good article but I am holding onto my hardware for other reasons. I predict it will not be long until all hardware has a set of Nanny chips that are named and marketed so that even people here on HN will argue on behalf of having them. It will be some "Secure enclave AI accelerated Super Mega Native Processing Underminer" and will start off securing and accelerating something or a set of somethings but will eventually tie into age verification, censorship and a Central Nanny Agency that all countries will obey.
- "Stare into this hole to verify your age.
- "Stick your finger in the box.
- "Ignore the pain to get your AI token bucks and unlock access to the shiny new attestation accelerated internet."
- "Sync ALL of your usernames and passwords into this secure enclave."
Every packet and data stream will be analyzed locally by the AI to determine intentions and predict future behavior. The AI-summarized behavior will be condensed into an optimized encoded table to be submitted hourly to the Central Nanny Overseer. I might be slightly exaggerating and a bit hyperbolic, but it will be something in this spirit, and people will sleepwalk right into it.
My only question is which country will control the behavior of these chips.
the__alchemist 48 minutes ago [-]
It's wild to think how, a few years ago, I didn't buy a 4090 direct from Nvidia because "$1600 (USD) is too much to pay for a graphics card; if I need a better one, I'll upgrade in a few years." (Went with the 4080 instead, which is substantially slower and was $1200.) Joke's on me!
It will be a scarcity mindset from here on out; I'll always buy the top-tier thing.
darkwater 2 hours ago [-]
> For the better part of two decades, consumers lived in a golden age of tech. Memory got cheaper, storage increased in capacity and hardware got faster and absurdly affordable.
I got my first PC circa 1992 (a 2nd-hand IBM PS/2, 80286 processor with 2MB RAM and a 30MB HDD) and the "golden age" was already there. We are well over 40 years into almost uninterrupted "pay less for more performance" in the home/personal computing space, and that's because that space started around 50 years ago. There was some fluctuation (remember the earthquake affecting HDD prices a few years ago?) but demand was there and manufacturing tech became more efficient.
The actual important change is that for most consumer uses, the perf improvements stopped making sense already what, over 10 years ago?
duskdozer 11 minutes ago [-]
Do you mean for hardware? Because a big chunk of that imo is how unnecessarily demanding software has become in the last 10 years, largely due to the web.
jleyank 3 hours ago [-]
Hold onto your hardware. Hold on to your existing software and the current version. Don’t upgrade without a specific need. None of the “progress” is actually helpful to hackers and I’m not sure it’s even helpful to typical users. There’s enough information being given to and slurped by others, don’t make it more effective.
archargelod 2 hours ago [-]
My PC has an Intel Xeon from 2007, a GPU from 2010, and 4GB of RAM.
It’s enough for web browsing and can handle 1080p/60fps video just fine.
For gaming, I have a dedicated device - a Nintendo Switch, but I also play indie PC games like Slay the Spire, Forge MTG, some puzzle games e.g. TIS-100.
Linux with i3 is fast and responsive. I write code in the terminal, no fancy debuggers, no million plugins, no Electron mess.
It’s enough for everything I need, and I don’t see a reason to ever upgrade. Unless my hardware starts failing, of course.
bitwize 4 minutes ago [-]
Wait, you type the code in directly? That's like a baby's toy!
compounding_it 3 hours ago [-]
In order to go from 360p video 15 years ago to 4K HDR today, I have upgraded from 2 Mbps over 802.11g WiFi on a 1366x768 display to a 200 Mbps connection on 802.11ax and a 55-inch 4K television.
The experience is quite immersive and well worth the upgrade that happened very progressively (WiFi 5 1080p then WiFi 6/7 4K).
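For a rough sense of that jump, here's a small sketch; the bitrates are ballpark figures for typical streaming quality, not measurements:

    # Ballpark stream bitrates (Mbps) vs. the two link speeds mentioned above.
    streams = {"360p": 1.0, "1080p": 5.0, "4K HDR": 20.0}
    links = {"802.11g (effective)": 2.0, "802.11ax (effective)": 200.0}

    for stream, need in streams.items():
        for link, have in links.items():
            fits = "fits" if need <= have else "does not fit"
            print(f"{stream} ({need} Mbps) {fits} on {link} ({have} Mbps)")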
sigio 2 hours ago [-]
At the same time, we had cheap consumer gigabit Ethernet, and we still have cheap consumer gigabit Ethernet. 2.5GbE is getting there price-wise, but switches are still somewhat rare/expensive.
adamwong246 55 minutes ago [-]
I have often imagined writing a book, roughly "Fahrenheit 451 but with computers instead of books". Imagine a world where you do not buy an iPhone - one is assigned to you at birth; a world where "installing software" on "a computer you own" is not just antiquated or taboo, but unthinkable.
red-iron-pine 51 minutes ago [-]
google and facebook are actively salivating at this possibility.
"don't create the torment nexus, etc."
mememememememo 3 hours ago [-]
In such a future, is the iPhone and Android ecosystem dead? Because a single $1k phone is a hell of a computer. So if you can still buy a phone, you can still get a computer. Local AI aside, these are very capable.
layer8 55 minutes ago [-]
You don’t really own an iPhone in terms of being a computer. It’s different for certain Android phones where you can install a custom OS. Those are also less powerful, however.
pjc50 2 hours ago [-]
iOS is apparently going to have mandatory age gating, so likely that will come to Android as well.
mememememememo 2 hours ago [-]
I was trying to avoid the software side of this argument as it is a can of worms. I was just musing from a hardware-availability point of view.
That said.... hopefully at least on Android side you can get a free (as in unchastified) OS to run on it.
Until they come for the HW.
jagged-chisel 3 hours ago [-]
Maybe we’ll finally get some good tools to make real productive work possible on phones.
Angostura 3 hours ago [-]
Like a large screen and a keyboard? Hello Mac Neo
kingleopold 3 hours ago [-]
I see they are offering macOS for the iPhone Pro and iPad Pro next year, with a subscription, or via a paid upgrade? I mean, it's more possible now than ever.
mememememememo 2 hours ago [-]
Hello HDMI adaptor and magic keyboard
9wzYQbTYsAIc 3 hours ago [-]
Virtual desktop casting, a killer HID product, and what else do you need?
tmtvl 2 hours ago [-]
I grabbed an upgrade at the end of last year because my ~10 year old workhorse is starting to show signs of aging. Despite 16 gigs of RAM having lasted me thus far I decided to bite the bullet and get 32; so I expect this new machine to last me another 10 years (although I now have a full SSD, whereas my old workhorse had an SSD for the OS and a hybrid drive for /home, so we'll see whether or not it will actually last).
Forgeties79 2 hours ago [-]
Built my PC last April and also did 32GB. Almost did 64 since RAM was so cheap at the time too, but hey, live and learn I suppose, ha.
vladde 2 hours ago [-]
When you click away to another tab, the title and favicon of the page change to something weird, but really legit-looking.
a couple of my favorites: "rust programming socks - Google", "Amazon.com: waifu pillow", "Rick Astley - Never Gonna Give You Up", "censorship on hacker news - Google"
foolserrandboy 1 hours ago [-]
It actually gives you a warning in an overlay first that the favicon will change if you open a new tab. I did, and I got "zuckerberg nudes".
abmmgb 1 hours ago [-]
I actually think the central thesis is thought-provoking. We have shifted far away from locally installed shit to remote data centre access; this was initially driven by cloud-based initiatives and is now spiralling upwards because of AI. For any researchers, hackers, and builders wanting to play with locally installed AI, hardware could become a bottleneck, especially as many machines, such as the beloved Macs, are not upgradable.
pcblues 59 minutes ago [-]
I think what many people don't realise is that there will be a glut of cheap computer parts including CPUs, GPU cards, and memory when the AI and AI-adjacent businesses go bust and a bunch of data centres get pulled down.
taikahessu 10 minutes ago [-]
I'm holding on to my memory sticks with both hands, you know.
Cha cha cha ...
pmdr 2 hours ago [-]
I've seen comments on here before that went somewhere along the line of "adults don't care about RAM prices." HN is no stranger to siding with the oppressors.
duskdozer 3 hours ago [-]
uBlock Origin has prevented the following page from loading:
https://xn--gckvb8fzb.com/hold-on-to-your-hardware/
This happened because of the following filter:
||xn--$document
The filter has been found in: IDN Homograph Attack Protection - Complete Blockage
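For anyone curious, this is roughly what that filter does, and what a per-site exception would look like in uBlock Origin's static filter syntax; the exception line below just uses the article's own Punycode domain as an example:

    ! Strict-block any page whose hostname starts with the Punycode prefix
    ||xn--$document
    ! Exception filter (note the @@) to allow one specific IDN site anyway
    @@||xn--gckvb8fzb.com^$document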
lexlambda 1 hours ago [-]
This is a filter added by you (or by an overzealous list maintainer), it does not happen by default or even with the provided additional filterlists.
duskdozer 8 minutes ago [-]
Yeah, I remember going on https://filterlists.com/ one day all mad and just adding a ton because of how many ads and manipulative patterns I was dealing with
sva_ 3 hours ago [-]
What a silly filter, blocking all xn domains
zvqcMMV6Zcr 3 hours ago [-]
That whole feature is kind of a catch-22. No legit/popular site uses it, so users don't expect national characters in domain names, so no one actually hosts sites using "xn--" domains.
Shrug. First time I'd seen this. If it displayed as the original text it would have been clearer.
numpad0 2 hours ago [-]
It would make it hard to spot impostor domains like "news.усомbiнаtor[.]сом" if it were. There's enough inertia for FQDNs to be strictly ASCII, and any UTF-8 (outside ASCII) in a domain name feels unnatural in a URL, so most systems default to the raw "Punycode" xn-- scheme for all IDNs.
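As a small illustration of that mapping, Python's built-in codec converts between the Unicode form and the raw xn-- form; "bücher.example" is just a stock example domain:

    # How an internationalized domain name maps to its raw Punycode ("xn--") form.
    name = "bücher.example"
    encoded = name.encode("idna")
    print(encoded)                 # b'xn--bcher-kva.example'
    print(encoded.decode("idna"))  # bücher.example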
httpsterio 2 hours ago [-]
In this case yes but it's meant as a punycode scam prevention where common Latin alphabet letters are swapped for similar looking alternatives.
pjmlp 1 hours ago [-]
I have been holding on to my hardware for decades; some of my private hardware traces back to 2009.
Phones and tablets only get replaced when they die.
Why should I throw away stuff that still works as intended?
xbmcuser 1 hours ago [-]
In the last month, 20-30% of oil supply, 30% of gas supply, and 30-40% of fertilizer production has been destroyed, and it could take anywhere from 8 months to 5 years to come back online. Governments are acting as if everything is okay so that there is no panic, but we have crossed the point of no return; even if the war ends today, food and energy shortages are on the horizon.
If you can get an EV, solar, heat pumps, battery storage, etc., get it now, today, as fossil-fuel-based energy prices are going to go through the roof. I see similarities to when covid hit: people kept looking at things happening in other countries and not preparing for the shit to hit their own cities and countries.
mmackh 2 hours ago [-]
We are in a renaissance of computing right at this moment. If we expand our definition of computers beyond screens and traditional input devices, microcontrollers are capable of so much more, with so much less (energy consumption | RAM | storage).
The tipping point for MCUs was WiFi - which not only allows you to speak multiple protocols (UDP/Zigbee/HTTP/etc) and have audio IO, but also P2P communication and novel new form factors. There's been incredible progress with the miniaturisation of sensors and how we're able to understand and perceive our environment.
So yes, whilst traditional hardware is getting more expensive and locked down, there's a strong counter movement towards computing for everyone - and by that I also mean that there's going to be less abstraction in the entire stack. Good times ahead!
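As a taste of how little code that takes, here's a minimal MicroPython sketch; it assumes an ESP32-class board, and the SSID, password, and target address are placeholders:

    # Join WiFi and fire a sensor reading at a LAN listener over plain UDP.
    import network
    import socket
    import time

    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    wlan.connect("my-ssid", "my-password")
    while not wlan.isconnected():
        time.sleep(0.5)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(b"temp=21.5", ("192.168.1.50", 9999))
    sock.close()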
functional_dev 41 minutes ago [-]
You are right! Power management improvements are what really enable these form factors... being able to run a WiFi sensor on a coin cell for a year makes applications possible that were unthinkable just a few years ago.
altcognito 46 minutes ago [-]
Part of this is that memory companies recognize that nobody is going to enforce antitrust law for the foreseeable future, so collusion to raise prices is the norm now.
shusaku 3 hours ago [-]
> These days, the biggest customers are not gamers, creators, PC builders or even crypto miners anymore. Today, it’s hyperscalers. …
> These buyers don’t care if RAM costs 20% more and neither do they wait for Black Friday deals. Instead, they sign contracts measured in exabytes and billions of dollars.
Does all this not apply to businesses buying computers for their employees?
rolandhvar 1 hours ago [-]
So what happens when the datacenters need to upgrade (new hardware, or stupid enterprisey reasons like "must be new when replacing broken stuff")? Surely there remains a secondary market for the enthusiasts?
vjerancrnjak 3 hours ago [-]
haha, all of a sudden I see a tab "waifu pillow" on Amazon, and think I have a split personality that runs searches in between consciousness shifts, and then I come back to a funny message.
grahammccain 1 hours ago [-]
I feel like we will get out of the hardware constraints eventually.
aurareturn 3 hours ago [-]
Capitalism at work. There is more value to be generated by moving resources to data centers for the moment. This isn't me being insensitive or anything. It's the same people who are buying iPhones and PCs who are demanding more compute for AI.
There could be a swing in the future where people will demand local AI instead and resources could shift back to affordable local AI devices.
Lastly, this thesis implies that we will be supply constrained forever such that prices for personal devices will always be elevated as a percentage of one's income. I don't believe that.
usrbinbash 3 hours ago [-]
As the old saying goes: "This too will pass."
Consumer hardware will always be a market worth serving for companies who don't see their stock price as their product.
If the existing companies are unwilling to make a sale, I am sure new players will arise picking up their slack.
The overlay and page title changing worked really well, ffs, I was like "Why is my machine displaying a page with zuckerberg nudes" haha.
camgunz 3 hours ago [-]
I feel like this is just the bubble talking. I'm pretty naive here, but at some point suppliers will adjust so they can take money from data center builders and consumers, just like pre-bubble.
not_the_fda 3 hours ago [-]
Chip manufacturers are used to boom bust cycles and are always hesitant to bring on more capacity, since it costs billions to do so.
They will let the hyperscalers buy their supply at a premium and wait for the bust. Then they will shift back to the consumer space.
Hardware is going to be expensive for a while, but it's not as dire as the article makes it out to be.
Tade0 1 hours ago [-]
I recall how in the 2010s RAM manufacturers were in crisis, as their margins were low and competition fierce - it got to the point where they started price fixing and got fined for it.
To me this is just a temporary swing in the other direction - they're riding the gravy train while they can, because once it ends it's back to low prices.
9wzYQbTYsAIc 3 hours ago [-]
> Hardware is going to be expensive for awhile but its not as dire as the article makes it out to be.
At the same time, the article’s argument that the value of personal computer ownership is only going to rise, in terms of the value of speech, not strictly in terms of the value of lunch, is important to call out.
I’m glad I held on to my 2009 MacBook, for example, as it still functions today as an active part of my homelab, at an amortized yearly cost of practically the price of taking a nice steak dinner once a year.
XorNot 3 hours ago [-]
It's probably closer to this: the suppliers don't think this will last and are ramping slowly, if at all, so they're not left holding the bag.
The US is headed for a cataclysmic crash at this point and it's not clear what will trigger it, but all those companies pushing underpriced tokens and Rust ports of existing tools by agents aren't going to survive it.
llm_nerd 2 hours ago [-]
[dead]
jagermo 3 hours ago [-]
I knew the time for my cable box would come!
selectively 49 minutes ago [-]
This is just brainrot garbage. The idiotic stuff you see YouTubers saying. Why is this at the top of HN? Bots, I assume?
lmz 3 hours ago [-]
Micron is killing its Crucial consumer brand, not its supply to consumer brands that use its chips. I don't think Hynix ever had a consumer brand for RAM?
flyinglizard 2 hours ago [-]
It's a thought-provoking article, and I felt the pain when I shopped around for a new GPU lately to replace a 4090 I thought was faulty (eventually a cleaning of the PCIe connector solved those crashes). I bought it at the end of 2022, and three and a half years later it seems like we've gone backwards, not forwards, on GPUs available to end users. They cost more and do less.
But also consider that PCs have been an anomaly for a very long time. I don't think there's an equivalent market where you, as a consumer, can buy off-the-shelf cutting-edge technical pieces in your local mall and piece them together into a working device. It's a fun model, for sure, but I'm not sure it's an efficient model. It was just profitable enough to keep the lights on, thanks primarily to a bunch of Taiwanese companies in that space, but it wasn't really growing and the state of software is a mess.
Apple ate the PC's collective lunch before DCs did. So have gaming consoles. So I weep for consumer choice, but as things become more advanced, maybe PCs and their entire value chain don't make a lot of sense any more.
Obviously at the end there will still be consumer devices, because someone needs to consume all of this AI (at least until people are thrown entirely out of the loop, but then all those redundant meat sacks will need entertainment to keep them content). We have the consumer-device hyperscaler Apple doing rather OK even with these supply crunches, although I'm not sure for how long.
the__alchemist 43 minutes ago [-]
Yea; I believe this is unprecedented. This is the first time I've observed a regression like this in GPU price/performance. That 4090 is still top-tier, and now costs more than when it was new.
Oh bubbles... they're so bubbly. Remember when there was unlimited demand for fibre optics because - The Internet?
So Nortel and other manufacturers lent the money to their clients building the Internet because the growth was unlimited forever? Except they actually didn't have any money, just stock valuations?
"This is a critical step in our effort to unleash the full potential of our high-performance optical component solutions business," said Clarence Chandran, COO of Nortel Networks. "This acquisition really strengthens Nortel Networks' leadership position in high-performance optical components and modules which are essential to delivering the all-optical Internet."
defraudbah 3 hours ago [-]
I refuse. I'll buy when I need to, and I can hold on for a few months if prices become insane. This means I'll spend less on hardware than I otherwise could; if I want to buy a Max MacBook Pro or the latest Framework, I just won't, because the prices are too mad, and I'll go for a cheaper version.
Whatever happens, it's crazy, and I hope the AI madness is worth it.
sigio 2 hours ago [-]
For laptops, I always spring for the lowest amount of RAM + HDD/SSD, and then instantly upgrade from local after-market sources. However, this wouldn't work for Apple devices (hence I don't own any Apple devices).
For example, my current ThinkPad T14 Gen 5 was bought with 8GB RAM and a 256GB NVMe drive, then upgraded to 64GB RAM and 2TB NVMe, for the same price as 16GB/512GB would have cost at Lenovo. And I still have the 8GB/256GB to re-use/re-sell.
xyst 40 minutes ago [-]
Not a single comment mentioning how programmers these days don’t give a shit about optimization.
shevy-java 2 hours ago [-]
AI companies driving RAM prices up is, in my opinion, theft from the common man (and common woman). Sure, you can say that in capitalism those who pay more benefit the most, but no system, not even the USA, is purely capitalistic. You still have transfer payments, public infrastructure and what not. So private companies driving up the prices, such as for RAM, is IMO also theft from common people. And that should not happen. It can only happen when you have lobbyists disguised as politicians who benefit personally from helping establish such a system. The same can be said about any other price-upwards scaling that is done via racketeering.
pissinwind 2 hours ago [-]
Everything about tech and economy slowing is 1000% man made.
The Trump/anti-America phase has gone on way longer than I thought but it won’t last forever.
Even if we have to wait for this old world cabal to die and fade away, time is still on our side.
Boomers are stupid for using time as a weapon.
I’m chillin. Waiting for people to die while growing my businesses.
Travel to a functional place off the beaten path to see nobody can really stop forward progress. Even in these places where time has stopped.
dist-epoch 2 hours ago [-]
I'm not sure why people are upset. This is how Capitalism is supposed to work - resource allocation towards the most productive (in terms of Capital) usage.
Those who are best able to use a resource are willing to pay the most for it thus pricing out unproductive usages of it.
This is pure Capitalism.
If one is in general against Capitalism, yes, one can complain.
But saying "I want free markets" and "I want capitalism", but then complaining when the free markets increase the price of your RAM is utterly deranged.
Some will say "but Altman is hoarding the RAM, he's not using it productively". It's irrelevant, he is willing to pay more than you to hoard that RAM. In his view he's extracting more value from that than you do, so he's willing to pay more. The markets will work. If this is unproductive use of Capital, OpenAI will go bankrupt.
And the RAM sellers make more money, which is good in Capitalism. It would be irresponsible for them to sell to price sensitive customers (retail), when they have buyers (AI companies) willing to pay much more. And if this is a bad decision, because that AI market will vanish and they will have burned the retail market, Capitalism and Free Markets will work again and bankrupt them.
Survival of the fittest. That is Capitalism. And right now AI companies are the fittest by a large margin.
AI and Capitalism are the exact same thing, as famously put. We are in the first stages of turning Earth into Computronium, you either become Compute or you will fade away.
forinti 2 hours ago [-]
The market can remain irrational longer than the capacitors on my motherboard can resist bloating.
cynicalsecurity 2 hours ago [-]
Fear mongering hysteria.
not_a9 3 hours ago [-]
[dead]
yubainu 3 hours ago [-]
[dead]
inquirerGeneral 2 hours ago [-]
[dead]
sheefers 3 hours ago [-]
[dead]
keybored 3 hours ago [-]
Owning hardware is great. But I get the impression that some people view owning petty hardware as some liberty panacea.
You might have a DVD collection, ten external drives, three laptops, and a workstation. You may still, for all intents and purposes, be wholly dependent on cloud computing, say, because that is the only practical way to run whatever AI-driven software exists three years from now.
Edit: That's just an example; it goes beyond AI. And liberty goes beyond that.
falense 2 hours ago [-]
I disagree. There is in fact a non-zero chance that we will get good-enough models, MoE-optimized for desktop-sized hardware, that can do a lot of the same things as the SOTA models. I'm certainly crossing my fingers that the open-weights models continue improving. Engram from DeepSeek, for instance, seems very interesting from a compute-to-memory offloading perspective.
I just realized that this blog site is pretending to be malware. I opened the tab and was constantly switching between the blog and writing this HN comment (I deleted the rest of the comment after realizing it), kept wondering where the tab went, and kept reopening it over and over again. Then I realized that it completely rewrites the tab title with NSFW content (one of the titles contained the word "nudes" with a faked Amazon favicon), and when you reopen the tab, it shows you a black overlay with a message intended to induce shock if you ever bother to read it (I didn't read past the first sentence, so I don't know what it was actually about).
Can dang/a moderator please ban the domain from HN? Even if it's not exactly malware, it's pretending to be malware to grab your attention, and it's obviously intending to fill your browser history with inappropriate content, which didn't work on my browser because I opened the blog in a private browsing session. The operator clearly doesn't run his blog in good faith.
fishbacon 2 hours ago [-]
The pop-up is about disabling javascript, to avoid this kind of website doing this kind of thing.
I thought it was clever. But it also seems ham-fisted, and in poor taste.
highmango 2 hours ago [-]
The whole point is to grab your attention and bully you to turn off JavaScript. It links to another page: https://disable-javascript.org/
I opened the tab on my work laptop, and having an NSFW title and icon in the office is unacceptable. I understand the intent, but the implementation and this way of forcing people to do something are ridiculous. I do not own or control this machine; I trust the links on the front page of HN to be somewhat safe and not put me in an uncomfortable position.
Yes, the site is not necessarily malware, but it is a dark pattern, and that's not how you teach your average day-to-day user.
jaen 3 hours ago [-]
It doesn't write anything extra to the browser history. How about actually checking before exaggerating. If you are bothered by a single wrong title with the right URL, well... I think something else is wrong.
You are also completely speculating on the intent. Less drama please.
3 hours ago [-]
Tongue in cheek: we urgently need fusion power plants. For the AI and the helium.
There are several challenges, not least of which is storage. We have considerable leakage in most of our current helium storage solutions on earth because it’s so light. Our national reserves are literally in underground caverns because it’s better than anything we can build. Space just means any containment system will need to work in a wider range of pressure/temperatures.
Non-helium hard drives are basically limited by their bearing spin hours. If one only spins a few hours a week, it'll probably run for decades. Not so with helium.
War aside, I also bet there's going to be a huge demand for edge-compute for other kinds of robotics: self-driving cars, delivery robots, factory robots, or general-purpose humanoids (Tesla Optimus, Boston Dynamics Atlas, 1X NEO, etc). Moving that kind of compute to the cloud is too laggy and unreliable. I know researchers who've tried it, the results were mixed.
Also, the engineers working on these platforms aren't going to reinvent the wheel every time they need to connect hardware together and they're going to use interoperable standards, like PCIe for storage or GPUs, DIMM slots for memory, ATX for power, etc. So I don't see general-purpose computing dying.
Is that likely? History says it's inevitable, but timeframe is an open question.
If this does occur, unfortunately it isn’t like any of the production capacity is going to immediately shift or be repurposed. A lot of the hardware isn’t usable outside of datacenter deployments. I would guess a more realistic recalibration is 2-3 years of immense pain followed by gradual availability of components again.
The capital from the Gulf is already disrupted. It's no longer a matter of if or when.
Stuff like that already exists for flash memory; I can harvest eMMC chips from ewaste and solder them to cheaply-available boards to make USB flash drives. But there the protocols are the same, there's no firmware work needed...
The current AI-induced shortages aside, the times have never been better in my opinion. There is overwhelming choice; ordinary consumers can access anything from Raspberry Pis all the way up to enterprise servers and AI accelerators. The situation was very different in the 1990s when I built my first PC.
- "Stare into this hole to verify your age.
- "Stick your finger in the box.
- "Ignore the pain to get your AI token bucks and unlock access to the shiny new attestation accelerated internet."
- "Sync ALL of your usernames and passwords into this secure enclave."
Every packet and data stream will be analyzed locally by the AI to determine intentions and predict future behavior. The AI-summarized behavior will be condensed into an optimized encoded table to be submitted hourly to the Central Nanny Overseer. I might be slightly exaggerating and a bit hyperbolic, but it will be something in this spirit, and people will sleepwalk right into it.
My only question is which country will control the behavior of these chips.
It will be a scarcity mindset from here on out; I'll always buy the top-tier thing.
I got my first PC circa 1992 (a 2nd hand IBM PS/2, 80286 processor with 2MB RAM and 30MB HDD) and the "golden age" was already there. We are well over 40 years into almost uninterrupted "pay less for more performance" in the home/personal computing space, and that's because that space started around 50 years ago. There was some fluctuation (remember the earthquake affecting HDD prices a few years ago?) but demand was there and manufacturing tech became more efficient.
The actual important change is that for most consumer uses, the perf improvements stopped making sense, what, over 10 years ago?
For gaming, I have a dedicated device - a Nintendo Switch, but I also play indie PC games like Slay the Spire, Forge MTG, some puzzle games e.g. TIS-100.
Linux with i3 is fast and responsive. I write code in the terminal, no fancy debuggers, no million plugins, no Electron mess.
It’s enough for everything I need, and I don’t see a reason to ever upgrade. Unless my hardware starts failing, of course.
The experience is quite immersive and well worth the upgrade that happened very progressively (WiFi 5 1080p then WiFi 6/7 4K).
"don't create the torment nexus, etc."
That said.... hopefully at least on Android side you can get a free (as in unchastified) OS to run on it.
Until they come for the HW.
a couple of my favorites: "rust programming socks - Google", "Amazon.com: waifu pillow", "Rick Astley - Never Gonna Give You Up", "censorship on hacker news - Google"
Cha cha cha ...
Phones and tablets only get replaced when they die.
Why should I throw away stuff that still works as intended?
The tipping point for MCUs was WiFi - which not only allows you to speak multiple protocols (UDP/Zigbee/HTTP/etc) and have audio IO, but also P2P communication and novel new form factors. There's been incredible progress with the miniaturisation of sensors and how we're able to understand and perceive our environment.
So yes, whilst traditional hardware is getting more expensive and locked down, there's a strong counter movement towards computing for everyone - and by that I also mean that there's going to be less abstraction in the entire stack. Good times ahead!
Does all this not apply to businesses buying computers for their employees?
There could be a swing in the future where people will demand local AI instead and resources could shift back to affordable local AI devices.
Lastly, this thesis implies that we will be supply constrained forever such that prices for personal devices will always be elevated as a percentage of one's income. I don't believe that.
Consumer hardware will always be a market worth serving for companies who don't see their stock price as their product.
If the existing companies are unwilling to make a sale, I am sure new players will arise picking up their slack.
https://www.youtube.com/watch?v=SrX0jPAdSxU
They will let the hyperscalers buy their supply at a premium and wait for the bust. Then they will shift back to the consumer space.
Hardware is going to be expensive for a while, but it's not as dire as the article makes it out to be.
https://web.archive.org/web/20180513133803/https://www.techr...
Prices went down again after that.
To me this is just a temporary swing in the other direction - they're riding the gravy train while they can, because once it ends it's back to low prices.
At the same time, the article’s argument that the value of personal computer ownership is only going to rise, in terms of the value of speech, not strictly in terms of the value of lunch, is important to call out.
I’m glad I held on to my 2009 MacBook, for example, as it still functions today as an active part of my homelab, at an amortized cost of roughly one nice steak dinner a year.
The US is headed for a cataclysmic crash at this point and it's not clear what will trigger it, but all those companies pushing underpriced tokens and Rust ports of existing tools by agents aren't going to survive it.
But also consider that PCs have been an anomaly for a very long time. I don't think there's an equivalent market where you, as a consumer, can buy off-the-shelf cutting-edge technical pieces in your local mall and piece them together into a working device. It's a fun model, for sure, but I'm not sure it's an efficient one. It was just profitable enough to keep the lights on, thanks primarily to a bunch of Taiwanese companies in that space, but it wasn't growing anywhere, and the state of software is a mess.
Apple ate the PCs' collective lunch before DCs did. So have gaming consoles. So I weep for consumer choice, but as things become more advanced, maybe PCs and their entire value chain don't make a lot of sense any more.
Obviously at the end there will still be consumer devices, because someone needs to consume all of this AI (at least until people are thrown entirely out of the loop, and even then all those redundant meat sacks will need entertainment to keep them content). We have the consumer-device hyperscaler Apple doing rather OK even with these supply crunches, although I'm not sure for how long.
Oh bubbles... they're so bubbly. Remember when there was an unlimited demand for fibre optics because - The Internet? So Nortel and other manufacturers lent the money to their clients building the Internet, because the growth was unlimited forever? Except they actually didn't have any money, just stock valuations?
"This is a critical step in our effort to unleash the full potential of our high-performance optical component solutions business," said Clarence Chandran, COO of Nortel Networks. "This acquisition really strengthens Nortel Networks' leadership position in high-performance optical components and modules which are essential to delivering the all-optical Internet."
Whatever happens, it's crazy, and I hope the AI madness is worth it.
For example, my current ThinkPad T14 Gen 5 was bought with 8GB RAM and a 256GB NVMe drive, then upgraded to 64GB RAM and 2TB NVMe for the same price as the 16GB/512GB configuration would have cost at Lenovo. And I still have the 8GB/256GB parts to re-use or re-sell.
The Trump/anti-America phase has gone on way longer than I thought but it won’t last forever.
Even if we have to wait for this old world cabal to die and fade away, time is still on our side.
Boomers are stupid for using time as a weapon.
I’m chillin. Waiting for people to die while growing my businesses.
Travel to a functional place off the beaten path to see nobody can really stop forward progress. Even in these places where time has stopped.
Those who are best able to use a resource are willing to pay the most for it, thus pricing out unproductive uses of it.
This is pure Capitalism.
If one is in general against Capitalism, yes, one can complain.
But saying "I want free markets" and "I want capitalism", but then complaining when the free markets increase the price of your RAM is utterly deranged.
Some will say "but Altman is hoarding the RAM, he's not using it productively". It's irrelevant, he is willing to pay more than you to hoard that RAM. In his view he's extracting more value from that than you do, so he's willing to pay more. The markets will work. If this is unproductive use of Capital, OpenAI will go bankrupt.
And the RAM sellers make more money, which is good in Capitalism. It would be irresponsible for them to sell to price sensitive customers (retail), when they have buyers (AI companies) willing to pay much more. And if this is a bad decision, because that AI market will vanish and they will have burned the retail market, Capitalism and Free Markets will work again and bankrupt them.
Survival of the fittest. That is Capitalism. And right now AI companies are the fittest by a large margin.
AI and Capitalism are the exact same thing, as famously put. We are in the first stages of turning Earth into Computronium, you either become Compute or you will fade away.
You might have a DVD collection, ten external drives, three laptops, and a workstation. You may still, for all intents and purposes, be wholly dependent on cloud computing, say, because it is the only practical way to run whatever AI-driven software exists three years from now.
Edit: That's an example; it goes beyond AI. And...
Liberty goes beyond that.
https://www.reddit.com/r/LocalLLaMA/comments/1s0czc4/round_2...
Can dang or a moderator please ban the domain from HN? Even if it's not exactly malware, it's pretending to be malware to grab your attention, and it's obviously intending to fill your browser history with inappropriate content, which didn't work on my browser because I opened the blog in a private session. The operator clearly doesn't run this blog in good faith.
I thought it was clever. But it also seems ham-fisted, and in poor taste.
I opened the tab on my work laptop, and having an NSFW title and icon up in the office is unacceptable. I understand the intent, but the implementation and this way of forcing people to do something is ridiculous. I do not own or control this machine; I trust the links on the front page of HN to be somewhat safe and not put me in an uncomfortable position. Yes, the site isn't necessarily malware, but it is a dark pattern, and that's not how you teach your average day-to-day user.
You are also completely speculating on the intent. Less drama please.