30 comments

  • torginus2 hours ago
    It's somewhat alarming to see that companies (owned by a very small slice of society) producing these AI thingies (whose current economic value is questionable and whose actual future potential is up for hot debate) can easily price the rest of humanity out of computing goods.
    • palmotea2 hours ago
      > It's somewhat alarming to see that companies (owned by a very small slice of society) ... can easily price the rest of humanity out of computing goods.
      If AI lives up to the hype, it's a portent of how things will feel to the common man. Not only will unemployment be a problem, but prices of any resources desired by the AI companies or their founders will rise to unaffordability.
      • torginus2 hours ago
        I think "living up to the hype" needs to be defined.
        A lot of AI 'influencers' love wild speculation, but let's ignore the most fantastical claims of techno-singularity, and let's focus on what I would consider a very optimistic scenario for AI companies - that AI capable of replacing knowledge workers can be developed using the current batch of hardware, in the span of a year or two.
        Even in this scenario, the capital gains on the lump sum invested in AI far outpace the money that would be spent on the salaries of these workers, and if we look at the scenario with investor goggles, due to the exponential nature of investment gains, the gap will only grow wider.
        Additionally, AI does not seem to be a monopoly, either wrt companies or geopolitics, so monopoly logic does not apply.
      • m_mueller1 hour ago
        And if you're unlucky enough to live close to a datacenter, this could include energy and water? I sure hope regulators are waking up, as free markets don't really seem equipped to deal with this kind of concentration of power.
      • BoredPositron22 minutes ago
        AI probably will end up living up to the hype. It won't be on the generation of hardware they are now mass deploying. We need another tock before we can even start to talk about AGI.
      • walterbell2 hours ago
        > rise to unaffordability
        Or require non-price mechanisms of payment and social contract.
    • testbjjl1 hour ago
      > It's somewhat alarming to see that companies (owned by a very small slice of society) producing these AI thingies (whose current economic value is questionable and whose actual future potential is up for hot debate)
      Some might conclude the same for funds (hedge funds/private equity) and housing.
      • lpapez1 hour ago
        Stop right there you terrorist antifa leftie commie scum! You are being arrested for thought crime!
    • jstanley1 hour ago
      You don't think that as prices go up, the supply might also go up, and the equilibrium price will be maintained?
      And possibly even a lower equilibrium will be reached due to greater economies of scale.
      • marcyb5st1 hour ago
        That assumes that production capacity can scale up instantly. Fabs for high end chips don't, and usually take years from foundations being laid to the first chip coming out of the production line.
        In the interim, yeah, they will force prices up.
        Additionally, those fabs cost billions. Given the lead time I mentioned, a lot of companies won't start building them right away, since the risk of demand going away is high and the ROI in those cases might become unreachable.
      • gehatare1 hour ago
        Is there any reason to believe that this will happen? Prices of graphics cards only came down after the crypto boom died down.
    • UltraSane1 hour ago
      Supply should increase as a response to higher prices, thus bringing prices down.
      • arnaudsm42 minutes ago
        That's economic theory, but the real world is often non-linear.
        Crucial is dead. There's a finite amount of rare earths. Wars and floods can bankrupt industries, and supply chains are tight.
        • tirant8 minutes ago
          Those are all temporary events and circumstances.
          If the market is big enough, competitors will appear. And if the margins are high enough, competitors can always price-compete down to capture market-share.
        • lotsofpulp31 minutes ago
          The business that owns Crucial is producing more chips than ever.
          Rare earth metals are in the dirt around the world.
          Supply and demand curves shifting, and hence prices increasing (and decreasing), is an expected part of life, due to the inability to see the future.
          • mschuster916 minutes ago
            > Rare earth metals are in the dirt around the world.
            They are. The problem is, the machinery to extract and refine them, and especially to make them into chips, takes *years* to build. We're looking at a time horizon of almost a decade if you include planning, permits and R&D.
            And given that almost everyone but the AI bros expects the AI bubble to burst sooner rather than later (given that the interweb of funding and deals more resembles the Habsburg family tree than anything healthy) and the semiconductor industry is infamous for pretty toxic supply/demand boom-bust cycles, they are all preferring to err on the side of caution - particularly as we're not talking about single billion dollar amounts any more. TSMC Arizona is projected to cost 165 billion dollars [1] - other than the US government and cash-flush Apple, I don't even know *anyone* willing to finance such a project under the current conditions.
            [1] https://www.tsmc.com/static/abouttsmcaz/index.htm
      • abenga46 minutes ago
        How does this square with some companies just stopping sales to consumers altogether?
        • baq42 minutes ago
          This is exactly it: supply of high margin products is increasing at the cost of low margin products. Expect the low end margin to catch up to the high end as long as manufacturing capacity is constrained (at least 1 year).
      • atq211915 minutes ago
        As usual, the problem is: how fast does this happen?
      • dandanua1 hour ago
        in a fairy world
        • YetAnotherNick1 hour ago
          No, just stop being cynical. The reason almost every electronic item is cheaper now than 2 decades back is just because the demand (and thus supply) is higher.
    • 01HNNWZ0MV43FF2 hours ago
      Without regulation, money begets money and monopolies will form.
      If the American voter base doesn't pull its shit together and revive democracy, we're going to have a bad century. Yesterday I met a man who doesn't vote and I wanted to go ape-shit on him. "My vote doesn't matter". Vote for mayor. Vote for city council. Vote for our House members. Vote for State Senate. Vote for our two Senators.
      "Voting doesn't matter, capitalism is doomed anyway" is a self-fulfilling prophecy and a fed psy-op from the right. I'm so fucking sick of that attitude from my allies.
      • lanyard-textile31 minutes ago
        Jovially -- you simultaneously believe that they're a victim of a psy-op *and* that their attitude is self-formed?
        ;) And you wanted to go ape shit on him... For falling for a psy-op?
        My friend, morale is very very low. There is no vigor to fight for a better tomorrow in many people's hearts. Many are occupied with the problems of today. It doesn't take a psy-op to reach this level of hopelessness.
        Be sick of it all you want, it doesn't change their minds. Perhaps you will find something more persuasive.
      • Yizahi28 minutes ago
        It is very likely that his vote for the parliament literally and legally doesn't matter, depending on the party allegiance of the candidates and the state he is in. All because of the non-democratic ancient first past the post system. Though in his place I would go to the station and at least deface a ballot as a sign of contempt.
      • tirant10 minutes ago
        What regulation are you expecting to be passed and why do you believe monopolies are bad?
        If a monopoly appears due to superior offerings, better pricing and quicker innovation, I fail to see why it needs to be a bad thing. They can be competed against and historically that has always been the case.
        On the other hand, monopolies appearing due to regulations, permissions, patents, or any governmental support, are indeed terrible, as they cannot be competed against.
      • llmslave22 hours ago
        This is a common sentiment but it doesn't make any sense. Voting for the wrong politician is worse than not voting at all, so why is it seen as some moral necessity for everyone to vote? If someone doesn't have enough political knowledge to vote correctly, perhaps they shouldn't vote.
        • maeln50 minutes ago
          Someone, I can't remember who, explained it better than me, but the gist of it is that by not voting, you are effectively checking yourself out of politicians' consideration.
          If we see a politician as just a machine whose only job is to get elected, they have to get as many votes as possible. Pandering to the individual is unrealistic, so you usually target groups of people who share some common interest. As your aim is to get as many votes as possible, you will want to target the "bigger" (in number of potential votes) groups. Then it is a game of trying to get the bigger groups which don't have conflicting interests. While this is theory and a simplification of reality, all decent political parties absolutely look at statistics and surveys to form a strategy for the election.
          If you are part of a group that, even though it might be big in population, doesn't vote, politicians have no reason to try to pander to you. As a concrete example, in a lot of "western" countries right now, a lot of elected politicians are almost completely ignoring the youth. Why? Because in those same countries the youth is the age group which votes the least.
          So by not voting, you are making absolutely sure that your interests won't be defended. You can argue that once elected, you have no guarantee that the politician will actually defend your interests, or even that they won't do the opposite (as an example, soybean farmers and Trump in the U.S.). But then you won't be satisfied and possibly won't vote for the same guy / party next election (which is what a lot of swing voters do).
          But yeah, in an ideal world, everyone would vote, see through communication tactics and actually study the party, program and the candidate they vote for, before voting.
      • layer850 minutes ago
        The thing is that while voting matters collectively, it's insignificant individually: https://en.wikipedia.org/wiki/Paradox_of_voting
        Nonvoters aren't being irrational.
      • irjustin2 hours ago
        Just so we're clear, the current voter base says this is exactly how it should be.
        • i80and2 hours ago
          Just so we're clear, the voter base of *over a year ago* asked for this because they were actively lied to, and were foolish enough to believe said lies.
          *Current* polling however says the current voter base is *quite unhappy* with how this is going.
          • nemomarx1 hour ago
            People spend a lot more effort and money lying to the voter base during election years than during the rest of the time.
  • vee-kay8 hours ago
    For the last 2 years, I've noticed a worrying trend: typical budget PCs (especially laptops) are being sold at higher prices with lower RAM (just 8GB) and lower-end CPUs (and no dedicated GPUs).
    Industry mandate should have become 16GB RAM for PCs and 8GB for mobile years ago, but instead it is as if the computing/IT industry is regressing.
    New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6, UFS2.2). Meanwhile, features that were being offered in budget phones, e.g., wireless charging, NFC, UFS3.1, have silently been moved to the premium mobile segment.
    Meanwhile the OSes and software are becoming more and more complex, bloated, more unstable (bugs) and insecure (security loopholes ready for exploits).
    It is as if the industry has decided to focus on AI and nothing else.
    And this will be a huge setback for humanity, especially the students and scientific communities.
    • SXX5 hours ago
      No dedicated GPU is certainly unrelated to whatever has been happening for the last two years.
      It's just that in the last 5 years integrated GPUs have become good enough even for mid-tier gaming, let alone running a browser and hardware acceleration in a few work apps.
      And even before 5 years ago, the majority of dedicated GPUs in relatively cheap laptops were garbage, barely better than the integrated one. Manufacturers mostly put them in there for the marketing value of having e.g. an Nvidia dGPU.
      • amiga-workbench5 hours ago
        A dedicated GPU is a red flag for me in a laptop. I do not want the extra power draw or the hybrid graphics silliness. The Radeon Vega in my ThinkPad is surprisingly capable.
        • iancmceachern2 hours ago
          For me it's a necessity to run the software I need to do my work (CAD design)
        • silon4252 minutes ago
          Same here... I do not wish for a laptop with >65W USB-C power requirements.
        • trinsic23 hours ago
          Yea, I agree it's not worth it to have an iGPU and a dedicated GPU, if I'm correct in what you are talking about. There are always issues with that setup in laptops. But I'd stay away from all laptops at this point until we get an administration that enforces antitrust. All manufacturers have been cutting so many corners, you're likely to have hardware problems within a year unless it's a MacBook or a business-class laptop.
      • Fabricio203 hours ago
        I'm gonna be honest, that's not my experience at all. I got a laptop with a modern Ryzen 5 CPU four years ago that had an iGPU because "it's good enough for even mid-tier gaming!" and it was so bad that I couldn't play 1440p on YouTube without it skipping frames. Tried Parsec to my desktop PC and it was failing that as well. I returned it and bought a laptop with an Nvidia dGPU (low end still, I think it was like a 1050-refresh-refresh equivalent) and haven't had any of those problems. That AMD Vega GPU just couldn't do it.
        • copx48 minutes ago
          No Ryzen 5 system should have any trouble playing YouTube videos, there must have been something wrong with your system.
        • Iulioh3 hours ago
          What processor are we talking about?<p>Your experience is extremely weird
      • trinsic23 hours ago
        Yea mid-tier is a stretch. Maybe low-end gaming
        • SXX1 hour ago
          Low end gaming is 2D indie titles, and they now run on toasters.
          All the popular mass market games work on an iGPU: Fortnite, Roblox, MMOs, arena shooters, battle royales. A good chunk of cross-platform console titles also work just fine.
          You can play Cyberpunk or BG3 on a damn Steam Deck. I won't call this low end.
          The number of games that don't run to some extent without a dGPU is limited to heavy AAA titles and niche PC-only genres.
    • koito171 hour ago
      This is what I find a bit alarming, too. My M3 Max MacBook Pro takes 2 full seconds to boot Slack, a program that used to literally be an IRC client. Many people still believe client-side compute is cheap and worrying about it is premature optimization.
      Of course, some performance-focused software (e.g. Zed) does start near-instantly on my MacBook, and it makes other software feel sluggish in comparison. But this is the exception, not the rule.
      Even as specs regress, I don't think most people in software will care about performance. In my experience, product managers never act on the occasional "[X part of an app] feels clunky" feedback from clients. I don't expect that to change in the near future.
    • 9999000009994 hours ago
      Or, we can expect better from software. Maybe someone can fork Firefox and make it run better, hard cap how much a browser window can use.
      The pattern of lazy, almost non-existent optimization, combined with blaming consumers for having weak hardware, needs to stop.
      On my 16GB RAM Lunar Lake budget laptop, CachyOS (Arch) runs so much smoother than Windows.
      This is very unscientific, but using htop, running Chrome/YouTube playing music, 2 browser games and VS Code having Git Copilot review a small project, I was only using 6GB of RAM.
      For the most part I suspect I could do normal consumer stuff (filing paperwork and watching cat videos) on an 8GB laptop just fine. Assuming I'm using Linux.
      All this Windows 11 bloat makes computers slower than they should be. A part of me hopes this pushes Microsoft to at least create a low RAM mode that just runs the OS and display manager. Then lets me use my computer as I see fit instead of constantly doing a million other weird things.
      We don't *need* more RAM. We need better software.
      • walterbell4 hours ago
        > hopes this pushes Microsoft to at least create a low RAM mode
        Windows OS and Surface (CoPilot AI-optimized) hardware have been combined in the "Windows + Devices" division.
        > We don't *need* more RAM
        RAM and SSDs both use memory wafers and are equally affected by wafer hoarding, strategic supply reductions and market price manipulation.
        Nvidia is re-inventing Optane for AI storage with higher IOPS, and paid $20B for Groq LPUs using SRAM for high memory bandwidth.
        The architectural road ahead has tiers of memory, storage and high-speed networking, which could benefit AI & many other workloads. How will industry use the "peace dividend" of the AI wars? https://www.forbes.com/sites/robtoews/2020/08/30/the-peace-dividends-of-the-autonomous-vehicle-wars/
          The rapid growth of the mobile market in the late 2000s and early 2010s led to a burst of technological progress.. core technologies like GPS, cameras, microprocessors, batteries, sensors and memory became dramatically cheaper, smaller and better-performing.. This wave of innovation has had tremendous second-order impacts on the economy. Over the past decade, these technologies have spilled over from the smartphone market to transform industries from satellites to wearables, from drones to electric vehicles.
        • PunchyHamster2 hours ago
          > RAM and SSDs both use NAND flash and are equally affected by wafer hoarding, strategic supply reductions and market price manipulation.
          Why on earth do you think RAM uses NAND flash?
          • walterbell2 hours ago
            Sorry, still editing long comment, s/NAND flash/memory wafers/.
      • viccis4 hours ago
        Yeah I'm sure that will happen, just like prices will go back down when the stupid tariffs are gone.
      • dottjt4 hours ago
        That's not happening though, which is why we need more RAM.
        • ssl-33 hours ago
          Eh? As I see it, we've got options.
          Option A: We do a better job at optimizing software so that good performance requires less RAM than might otherwise be required
          Option B: We wish that things were different, such that additional RAM were a viable option like it has been at many times in the past.
          Option C: We use our time-benders to hop to a different timeline where this is all sorted more favorably (hopefully one where the Ballchinians are friendly)
          ---
          To evaluate these in no particular order:
          Option B doesn't sound very fruitful. I mean: It can be fun to wish, but magical thinking doesn't usually get very far.
          Option C sounds fun, but my time-bender got roached after the last jump and the version of Costco we have here doesn't sell them. (Maybe someone else has a working one, but they seem to be pretty rare here.)
          That leaves option A: Optimize the software once, and duplicate that optimized software to whomever it is useful using that "Internet" thing that the cool kids were talking about back in the 1980s.
      • MangoToupe2 hours ago
        > I was only using 6GB of RAM.
        Insane that this is seen as "better software". I could do basically the same functionality in 2000 with 512MB. I assume this is because everything runs through Chrome with dozens more layers of abstraction but
        • kasabali1 hour ago
          More like 128MB.
          512MB in 2000 was like HEDT level (though I'm not sure that acronym existed back then)
          • anthk35 minutes ago
            512MB wasn't that odd for multimedia from 2002, barely a few years later. By 2002, 256MB of RAM was the standard for almost any new low-end PC.
            64MB = w98se OK, XP will swap a lot on high load, nixlikes really fast with fvwm/wmaker and the like. KDE3 needs 128MB to run well, so go a bit lower. No issues with old XFCE releases. Mozilla will crawl, other browsers will run fine.
            128MB = w98se really well, XP will run fine, SP2-3 will lag. Nixlikes will fly with wmaker/icewm/fvwm/blackbox and the like. Good enough for Mozilla.
            192MB = Really decent for a full KDE3 desktop or for Windows XP at real life speeds.
            256MB = Like having 8GB today for Windows 10, GNOME 3 or Plasma 6. Yes, you can run them with 2GB and ZRAM, but realistically, and for the modern bloated tools, 8GB for a 1080p desktop is mandatory. Even with uBlock Origin for the browser. Ditto back in the day. With 256MB, XP and KDE3 flew and ran much faster than even Win98 with 192MB of RAM.
    • GuB-421 hour ago
      I think it is the result of specs like RAM and CPU no longer being the selling point they once were, except for gaming PCs. Instead people want thin laptops, good battery life, nice screens, premium materials, etc... We have got to the point where RAM and CPU are no longer a limiting factor for most tasks, or at least until software becomes bloated enough to matter again.
      If you want a powerful laptop for cheap, get a gaming PC. The build quality and battery life probably won't be great, but you can't be cheap without making compromises.
      Same idea for budget mobiles. A Snapdragon Gen 6 (or something by MediaTek) with UFS2.2 is more than what most people need.
    • zahlman7 hours ago
      I'm reading this thread on an 11-year-old desktop with 8GB of RAM and not feeling any particular reason to upgrade, although I've priced it out a few times just to see.
      Mint 22.x doesn't appear to be demanding any more of my machine than Mint 20.x. Neither is Firefox or most websites, although YouTube chat still leaks memory horrendously. (Of course, download sizes *have* increased.)
      • Groxx5 hours ago
        I've been enjoying running Mint on my terrible spec chromebook - it only has 3GB of RAM, but it rarely exceeds 2GB used with random additions and heavy firefox use. The battery life is obscenely good too, I easily break 20 hours on it as long as I'm not doing something obviously taxing.
        Modern software is fine for the most part. People look at browsers using tens of gigabytes *on systems with 32GB+* and complain about waste rather than being thrilled that it's doing a fantastic job caching stuff to run quickly.
      • callc5 hours ago
        Mint is probably around 0.05% of desktop/laptop users.
        I think the root comment is looking at the overall picture of what all customers can get for their money, and sees it getting worse.
        This wasn't mentioned, but it's a new thing for everyone to experience, since the general trend of computer hardware is that it gets cheaper and more powerful over time. Maybe not exponentially any more, but at least linearly cheaper and more powerful.
        • OGEnthusiast4 hours ago
          > I think the root comment is looking at the overall picture of what all customers can get for their money, and sees it getting worse.
          A $999 MacBook Air today is vastly better than the same $999 MacBook Air 5 years ago (and even more so once you count inflation).
          • chii3 hours ago
            The OP is not looking at the static point (of the price of that item), but the trend - a la the derivative of the price vs quality. It was on a steep upward incline, and now it's flattening.
    • chii4 hours ago
      > Industry mandate should have become 16GB RAM for PCs
      It was only less than 10 yrs ago that a high end PC would have this level of RAM. I think the last decade of cheap RAM and increasing core counts (and Hz) has spoiled a lot of people.
      We are just getting back on trend. Maybe software will be written better now that you cannot expect the average low budget PC to have 32G of RAM and 8 cores.
    • linguae8 hours ago
      I wonder what we can do to preserve personal computing, where users, not vendors, control their computers? I'm tired of the control Microsoft, Apple, Google, OpenAI, and some other big players have over the entire industry. The software has increasingly become enshittified, and now we're about to be priced out of hardware upgrades.
      The problem is coming up with a viable business model for providing hardware and software that respect users' ability to shape their environments as they choose. I love free, open-source software, but how do developers make a living, especially if they don't want to be funded by Big Tech?
      • Saris8 hours ago
        Run a lightweight Linux distro on older hardware maybe?
        • ta90007 hours ago
          This is it. Buy used Dell and HP hardware with 32 GB of RAM and swap the pcie ssd for 4 TB.
        • thatguy09005 hours ago
          Exclusively using an ever-dwindling stock of old hardware is not really a practical solution to preserving hardware rights in the long term.
    • deadbabe5 hours ago
      I think it will be a good thing actually. Engineers, no longer having the luxury of assuming that users have high end system specs, will be forced to actually write fast and efficient software. No more bloated programs eating up RAM for no reason.
      • rurp5 hours ago
        The problem is that higher performing devices will still exist. Those engineers will probably keep using performant devices and their managers will certainly keep buying them.
        We'll probably end up in an even more bifurcated world where the well off have access to a lot of great products and services that most of humanity is increasingly unable to access.
        • anigbrowl4 hours ago
          Have the laws of supply and demand been suspended? Capital is gonna pour into memory fabrication over the next year or two, and there will probably be a glut 2-3 years from now, followed by retrenchment and wails that innovation has come to a halt because demand has stalled.
          • kasabali1 hour ago
            We're not talking about growing tomatoes in your backyard.
          • re-thc3 hours ago
            > Have the laws of supply and demand been suspended?
            There is the law of uncertainty overriding it, e.g. trade wars, tariffs, etc.
            No one is going all in with new capacity.
        • charcircuit4 hours ago
          If "performant" devices are not widespread, then telemetry will reveal that the app is performing poorly for most users. If a new feature uses more memory and significantly increases the crash rate, it will be disabled.
          Apps are optimized for the install base, not for the engineer's own hardware.
          • DeepSeaTortoise3 hours ago
            What is the point of telemetry if your IDE launching in under 10s is considered the pinnacle of optimization?
            That's like 100B+ instructions on a single core of your average superscalar CPU.
            I can't wait for map loading times being measured in percentage of trip time.
            • charcircuit1 hour ago
              Because you don't want any of the substeps of such a loading process to regress and turn it back into 10+ seconds of loading.
        • hansvm5 hours ago
          Can confirm, I'm currently requesting as much RAM as can fit in the chassis and permission to install an OS not too divorced from what we run in prod.
          On the bright side, I'm not responsible for the UI abominations people seem to complain about WRT laptop specs.
        • incompatible4 hours ago
          How many "great products and services" even need a lot of RAM, assuming that we can live without graphics-intensive games?
          • rhdunn1 hour ago
            Image, video, and music editing. Developing, running, and debugging large applications.
            • Ekaros19 minutes ago
              The last three sound to me like self-inflicted issues. If applications weren't so large, wouldn't fewer resources be needed?
          • TheDong4 hours ago
            Some open source projects use Slack to communicate, which is a real RAM hog. GitHub, especially for viewing large PR discussions, takes a huge amount of memory.
            If someone with a low-memory laptop wants to get into coding, modern software-development-related services are incredible memory hogs.
      • Insanity5 hours ago
        I'm not optimistic that this would be the outcome. You'll likely just have poorly running software instead. After all, a significant part of the world is already running lower powered devices on terrible connection speeds (such as many parts of Africa).
        • chii3 hours ago
          > a significant part of the world is already running lower powered devices
          but you cannot consider this in isolation.
          The developed markets have vastly higher spending consumers, which means companies cater to those higher spending customers proportionately more (as profits demand it). Therefore, the implication is that lower spending markets get less investment and are less catered to; after all, R&D spending is still a limiting factor.
          If the entirety of the market were running on lower powered devices, then it would get catered for - because there'd be no (or not enough) customers with high powered devices to profit off.
      • TheDong4 hours ago
        At the same time, AI has made it easier than ever to produce inefficient code, so I rather expect to see an explosion of less efficient software.
      • trinsic23 hours ago
        I don't think it's going to happen in this day and age. Some smart people will, but most barely know how to write their own code, let alone write efficient code.
      • laterium4 hours ago
        Why are you celebrating compute becoming more expensive? Do you actually think it will be good?
      • thatguy09005 hours ago
        I think the actual outcome is they will expect you to rent servers to conduct all your computing on and your phone and pc will be a dumb terminal.
      • jghn4 hours ago
        Good luck with that.
  • goku1246 minutes ago
    I understand the issue with all the devices. But what about the rest of the things that depend on these electronics, especially DRAM? Automotive, aircraft, marine vessels, ATC, shipping coordination, traffic signalling, rail signalling, industrial control systems, public utility (power, water, sewage, etc.) control systems, transmission grid control systems, HVAC and environment control systems, weather monitoring networks, disaster alerting and management systems, ticketing systems, e-commerce backbones, scheduling and rostering systems, network backbones, entertainment media distribution systems, defense systems, and I don't know what else. Don't they all require DRAM? What will happen to all of them?
    • synack32 minutes ago
      Industrial microcontrollers and power electronics use older process nodes, mostly >=45nm. These customers aren't competing for wafers from the same fabs as bleeding edge memory and TPUs.
      The world ran just fine on DDR3 for a long time.
      • goku1227 minutes ago
        Okay, but what about the rest? The ones that aren't embedded in some way and use industrial grade PCs/control stations? Or ones with large buffers, like network routers? I'm also wondering about the supply of the alternate nodes and older technologies. Will the manufacturers keep those lines running? Was it Micron that abandoned the entire retail market in favor of supplying the hyperscalers?
    • lysace35 minutes ago
      A $100k EV has roughly the same amount of DRAM as a $1k phone.
      The EV is therefore, on the whole, a lot less sensitive to DRAM price increases.
      • goku1224 minutes ago
        Okay, accepted. But are you sure that the supply won't be a problem as well? I mean, even if these products use different process nodes than the hyperscalers, will the DRAM manufacturers even keep those nodes running in favor of these industries?
  • elthor892 hours ago
    If all manufacturers jump into serving the AI market segment, can this not be an opportunity for new entrants to start serving the other market segments?
    How hard is it to start up and manufacture memory for embedded systems in cars, or PCs?
    • lexicality1 hour ago
      If it were easy there would be more memory manufacturers, rather than 2-3 wholesalers who sell to the people who put badges & RGB on it.
    • nice_byte1 hour ago
      No, because if you have the capacity to make e.g. RAM chips, it makes more economic sense to sell them for AI bucks. Serving the other market segment is an opportunity cost unless you're selling to them at the same prices. In the long run, though, if enough players emerge the price will eventually come down just due to oversupply.
      • SunlitCat10 minutes ago
        Not quite. Making specialized DRAM chips for AI hardware needs requires high tech components. Making low(er) end DRAM chips for consumer needs might be easier to get started with.
        I am pretty sure that in the next year we will see a wave of low end RAM components coming out of China.
  • jazzyjackson9 hours ago
    Question: are SoCs with on die memory effected by this?
    Looks like the frame.work desktop with Ryzen 128GB is shipping now at the same price it was on release, Apple is offering 512GB Mac Studios
    Are Snapdragon chips the same way?
    • nrp8 hours ago
      We've been able to hold the same price we had at launch because we had buffered enough component inventory before prices reached their latest highs. We will need to increase pricing to cover supplier cost increases though, as we recently did on DDR5 modules.
      Note that the memory is on the board for Ryzen AI Max, not on the package (as it is for Intel's Lunar Lake and Apple's M-series processors) or on die (which would be SRAM). As noted in another comment, whether the memory is on the board, on a module, or on the processor package, they are all still coming from the same extremely constrained three memory die suppliers, so costs are going up for all of them.
      • mips_avatar4 hours ago
        How do suppliers communicate these changes? Are they just like "yep, now it's 3x higher"? I'm surprised you don't have longer contracts.
        • appellations4 hours ago
          Longer contracts are riskier. The benefit of having cheaper RAM when prices spike is not strong enough to outweigh the downside of paying too much for RAM when prices drop or stay the same. If you're paying a perpetual premium on the spot price to hedge, then your competitors will have pricing power over you and will slowly drive you out of the market. The payoff when the market turns in your favor just won't be big enough and you might not survive as a business long enough to see it. There's also counterparty risk: if you hit a big enough jackpot, your upside is capped by what would make the supplier insolvent.
          All your competitors are in the same boat, so consumers won't have options. It's much better to minimize the risk of blowing up by sticking as closely to spot as possible. That's the whole idea of lean. Consumers and governments were mad about supply chains during the pandemic, but companies survived *because* they were lean.
          In a sense this is the opposite risk profile of futures contracts in trading/portfolio management, even though they share some superficial similarities. Manufacturing businesses are fundamentally different from trading.
          They certainly have contracts in place that cover goods already sold. They do a ton of preorders, which is great since they get paid before they have to pay their suppliers. Just like airlines trade energy futures because they've sold the tickets long before they have to buy the jet fuel.
        • baq35 minutes ago
          If you're Apple, maybe that works. In this case, though, we're seeing 400% increases in price; instead of your RAM you'll be delivered a note to pay up, or you'll get your money back with interest and termination fees while the supplier is still net positive.
        • chii3 hours ago
          > longer contracts
          The risk is that such longer contracts would then lock you into a higher cost component for longer, if the price drops. Longer contracts only look good in hindsight if RAM prices increased (unexpectedly).
    • addaon9 hours ago
      > Question: are SoCs with on die memory effected by this?
      SoCs with on-die memory (which is, these days, exclusively SRAM, since I don't think IBM's eDRAM process for mixing DRAM with logic is still in production) will not be effected. SiPs with on-package DRAM, including Apple's A and M series SiPs and Qualcomm's Snapdragon, will be effected -- they use the same DRAM dice as everyone else.
      • layer826 minutes ago
        *affected
      • pixelpoet9 hours ago
        The aforementioned Ryzen AI chip is exactly what you describe, with 128 GB on-package LPDDR5X. I have two of them.
        To answer the original question: the Framework Desktop is indeed still at the (pretty inflated) price, but for example the Bosgame mini PC with the same chip has gone up in price.
        • plagiarist5 hours ago
          Are your two chips in Framework Desktops, or some other package? I'm interested in a unified memory setup and curious about the options.
      • zahlman7 hours ago
        <a href="https:&#x2F;&#x2F;en.wiktionary.org&#x2F;wiki&#x2F;die#Noun" rel="nofollow">https:&#x2F;&#x2F;en.wiktionary.org&#x2F;wiki&#x2F;die#Noun</a><p>&quot;dice&quot; is the plural for the object used as a source of randomness, but &quot;dies&quot; is the plural for other noun uses of &quot;die&quot;.
    • piskov9 hours ago
      Apple secured at least a year's worth of memory supply (not in actual chips but in prices).
      The bigger the company, the longer the contract.
      However, it will eventually catch up even to Apple.
      It is not prices alone due to demand, but the manufacturing redirection from something like LPDDR in iPhones to HBM and what have you for servers and GPUs.
      • mirsadm58 minutes ago
        Apple charges so much for RAM upgrades that they could probably absorb this without increasing prices and still be fine. They won't, but they probably could.
        • layer824 minutes ago
          At the cost of reduced margins, which shareholders may not like.
      • greesil4 hours ago
        Apparently Google fucked up
        https://www.google.com/amp/s/www.indiatoday.in/amp/technology/news/story/ram-shortage-so-bad-google-told-employees-to-secure-supply-in-korea-or-else-they-will-be-fired-2842016-2025-12-26
        • magicalhippo1 hour ago
          Non-AMP link:
          https://www.indiatoday.in/technology/news/story/ram-shortage-so-bad-google-told-employees-to-secure-supply-in-korea-or-else-they-will-be-fired-2842016-2025-12-26
          • SunlitCat7 minutes ago
            To be honest, it starts to look more and more like a single company (we all know which one) is just buying up all DRAM capacity to keep others out of the (AI) game.
      • trollbridge8 hours ago
        I have a feeling every single supplier of DRAM is going to be far more interested in long-term contracts with Apple than with (for example) OpenAI, since there's basically zero possibility Apple goes kaput and reneges on their contracts to buy RAM.
        • saagarjha8 hours ago
          Yes, but OpenAI wants $200 billion in RAM and Apple wants $10.
    • dehrmann4 hours ago
      I would think so, because fab capacity is constrained, and if you make an SoC with less on-die memory, it uses fewer transistors, so you can fit more on a wafer.
      • hvb231 minutes ago
        But bigger chips mean lower yields because there's just more room for errors?
  • zdc11 hour ago
    Thankfully we're at a stage where a 4 year old second-hand iPhone is perfectly usable, as are any M-series Macs or most Linux laptops. Sucks for anyone needing something particularly beefy for work; but I feel that a lot of purchases can be delayed for at least a year or two while this plays out.
  • Ekaros2 hours ago
    Outside of, say, video and image editing and maybe lossless audio, why is this much RAM even needed in most use cases? And I mean actually thinking about using it. Computer code, unless you are actually working on the whole Linux kernel, is just text. So a lot of projects would probably fit in cache. Maybe software companies should be billed for the user's resources too...
    • system244 minutes ago
      I have multiple apps using 300 GB+ PostgreSQL databases. For some queries, high RAM is required. I enable symmetrical NVMe swaps, too. Average Joe with gaming needs wouldn't need more than 64 GB for a long time. But for the database, as the data grows, RAM requirements also grow. I doubt my situation is relatable to many.
      • Ekaros24 minutes ago
        I understand servers. But why does the average user actually need more than 2 or 4GB? For what actual data in memory at one time?
  • l9o2 hours ago
    It feels like a weird tension: we worry about AI alignment but also want everyone to have unrestricted local AI hardware. Local compute means no guardrails; fine-tune for whatever you want.
    Maybe the market pricing people out is accidentally doing what regulation couldn't? Concentrating AI where there's at least some oversight and accountability. Not sure if that's good or bad, to be honest.
    • walterbell2 hours ago
      > market pricing people out
      For now. Chinese supply chains include DRAM from CXMT (sanctioned) and NAND from YMTC (not sanctioned, holds patents on 3D stacking that have been licensed by Korean memory manufacturers).
  • loudandskittish2 hours ago
    Love all the variations of "8GB of RAM should be enough for anybody" in here.
    • walterbell2 hours ago
      AI PacMan eats memory, then promises to eat/write software so we need less memory.
  • compounding_it4 hours ago
    Software has gotten bad over the last decade. Electron apps were the start, but these days everything seems to be so bloated, right from the operating systems to browsers.
    There was a time when Apple was hesitant to add more RAM to its iPhones and app developers would have to work hard to make apps efficient. The last few years have shown Apple going from 6GB to 12GB so easily for their 'AI' while I consistently see the quality of apps deteriorating on the App Store. iOS 26 and macOS 26 are so aggressive towards memory swapping that loading Settings can take time on devices with 6GB RAM (absurd). I wonder what else they have added that apps need purging so frequently. The 6GB iPhone and 8GB M1 felt incredibly fast for a couple of years. Now apparently they are slow like they are really old.
    Windows 11 and Chrome are a completely different story. Windows 10 ran just fine on my 8th gen PC for years. Windows 11 is very slow and Chrome is a bad experience. Firefox doesn't make it better.
    I also find that GNOME and COSMIC DE are not exactly great at memory. A bare minimum desktop still takes up 1.5-1.6GB RAM on a 1080p display, and with some tabs open, a terminal and VS Code (again Electron) I easily hit 8GB. Sway is better in this regard. I find Alacritty, Sway and Firefox together make it a good experience.
    I wonder where we are heading on personal computer software. The processors have gotten really fast and storage and memory even more so, but the software still feels slow and glitchy. If this is the industry's idea of justifying new hardware each year, we are probably investing in the wrong people.
    • heavyset_go4 hours ago
      Firefox is set to allocate memory until a certain absolute limit or memory pressure is reached. It will eat memory whether you have 4GB of RAM or 40GB.
      Set this to something you find reasonable: `browser.low_commit_space_threshold_percent`
      And make sure tab unloading is enabled.
      Also, you can achieve the same thing with cgroups by giving Firefox a slice of memory it can grow into.
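      For example, a minimal sketch of the cgroup route, assuming a systemd-based Linux desktop (the 4G/6G caps are arbitrary values for illustration, not recommendations):
        systemd-run --user --scope -p MemoryHigh=4G -p MemoryMax=6G firefox
      MemoryHigh is the soft limit where the kernel starts reclaiming and throttling the scope; MemoryMax is the hard cap.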
    • anigbrowl4 hours ago
      Fair. I installed a MIDI composition app recently that was 1.2 GB! Now, it does have some internal synthesis that uses samples, but only a limited selection of sounds, so I think 95% of the bulk is from Electron.
  • xbmcuser3 hours ago
    Might not be the best thing for the US, but the rest of the world needs China to reach parity on node size with TSMC to crash the market.
  • arjie4 hours ago
    DRAM spot prices are something like what they were 4 years ago. Having RAM for cheap is nice. But it doesn't cost an extraordinary amount. I recently needed some RAM and was able to pick up 16x32 DDR4 for $1600. That's about twice as expensive as it used to be but $1600 is pretty cheap for 512 GiB of RAM.
    A 16 GiB M4 Mac Mini is $400 right now. That covers any essential use-case which means this is mostly hitting hobbyists or niche users.
    • p0w3n3d50 minutes ago
      > A 16 GiB M4 Mac Mini is $400 right now
      Where do you live? In Poland it's 740 USD.
      However, 16GB is very little for a Mac where the OS itself uses 7GB
    • sgerenser1 hour ago
      $400 or $499?
    • chzblck4 hours ago
      Are you running a server or local LLMs with a need for 512?
      • arjie4 hours ago
        Server stuff. Nothing interesting. Supermicro H11 + Epyc 7xxx + RAM. I have a 6x4090 setup for local LLMs and I got myself a 128 GB M4 Max laptop thinking I'd do that, but if I'm being honest I need to get rid of that hardware. It's sitting idle because the SOTA ones are so much better for what I want.
  • memoriuaysj10 hours ago
    The first stages of the world being turned into computronium.
    The next stage is paving everything with solar panels.
  • Culonavirus22 minutes ago
    I mean yea, but this is THE wrong site to post stuff like this. Half the people here are the AI cock and the other half is riding it.
  • czhu124 hours ago
    It seems… fine? Hasn’t DRAM always been a boom and bust industry with no real inflation — in fact massive deflation — over the past 30 years?<p>Presumably the boom times are the main reason why investment goes into it so that years later, consumers can buy for cheap.
  • agilob2 hours ago
    It's going to be interesting for the Google Chrome team when new laptops are equipped with 8GB RAM by default.
  • netbioserror10 hours ago
    Positive downstream effect: The way software is built will need to be rethought and improved to utilize efficiencies for stagnating hardware compute. Think of how staggering the step from the start of a console generation to the end used to be. Native-compiled languages have made bounding leaps that might be worth pursuing again.
    • yooogurt9 hours ago
      Alternatively, we'll see a drop in deployment diversity, with more and more functionality shifted to centralised providers that have economies of scale and the resources to optimise.
      E.g. IDEs could continue to demand lots of CPU/RAM, and cloud providers are able to deliver that cheaper than a mostly idle desktop.
      If that happens, more and more of their functionality will come to rely on having low datacenter latencies, making use on desktops less viable.
      Who will realistically be optimising build times for use cases that don't have sub-ms access to build caches, and when those build caches are available, what will stop the median program from having even larger dependency graphs?
      • linguae8 hours ago
        I'd feel better about the RAM price spikes if they were caused by a natural disaster and not by Sam Altman buying up 40% of the raw wafer supply, other Big Tech companies buying up RAM, and the RAM oligopoly situation restricting supply.
        This will only serve to increase the power of big players who can afford higher component prices (and who, thanks to their oligopoly status, can effectively set the market price for everyone else), while individuals and smaller institutions are forced to either spend more or work with less computing resources.
        The optimistic take is that this will force software vendors into shipping more efficient software, but I also agree with this pessimistic take, that companies that can afford inflated prices will take advantage of the situation to pull ahead of competitors who can't afford tech at inflated prices.
        I don't know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don't know how effective the latter is.
    • piskov9 hours ago
      Some Soviet humor will help you understand the true course of events:
      A dad comes home and tells his kid, "Hey, vodka's more expensive now." "So you're gonna drink less?" "Nope. You're gonna eat less."
    • ip266 hours ago
      I have some hope for transpiling to become more commonplace. What would happen if you could write in Python, but trivially transpile to C++ and back?
  • johnea11 hours ago
    "May rise"?
    Prices are already through the roof...
    https://www.tomsguide.com/news/live/ram-price-crisis-updates
    • piskov9 hours ago
      Big companies secure long-term pricing (multi-year), so iPhones probably won't feel this in 2026 (or even 2027).
      2028 is another story, depending on whether this frenzy continues / fabs being built (I don't know whether they are as hard to build as CPU fabs)
    • Imustaskforhelp10 hours ago
      Asus is ramping up production of RAM...
      So let's see if they might "save us"
      • jazzyjackson10 hours ago
        Asus doesn't operate fabs and has denied the rumor
        https://www.tomshardware.com/pc-components/dram/no-asus-isnt-going-into-memory-manufacturing-taiwanese-tech-giant-issues-statement-smashing-rumor
        • Imustaskforhelp3 hours ago
          Hey sorry, I didn't know that. I had watched the short form content (https://www.youtube.com/shorts/eSnlgBlgMp8) [Asus is going to save gaming] and I didn't know that it was a rumour.
          My bad
      • CamperBob210 hours ago
        Asus doesn't make RAM. That's the whole problem: there are plenty of RAM retail brands, but they are all just selling products that originate from only a couple of actual fabs.
        • nrp10 hours ago
          Three major ones: Micron, Samsung, SK Hynix
          And a couple of smaller ones: CXMT (if you're not afraid of the sanctions), Nanya, and a few others with older technology
          • whaleofatw20227 hours ago
            Is this glofo's time to shine?
            • bee_rider6 hours ago
              Do they make DRAM? I thought they made compute chips mostly.
              If I recall correctly, RAM is even more niche and specialized than the (already quite specialized) general chip manufacturing. The structure is super-duper regular, just a big grid of cells, so it is super-duper optimized.
              • FastFT5 hours ago
                They (GF) do not make DRAM. They might have an eDRAM process inherited from IBM, but it would not be competitive.
                You're correct that DRAM is a very specialized process. The bit cell capacitors are a trench type that is uncommon in the general industry, so the major logic fabs would have a fairly uphill battle to become competitive (they also have no desire to enter the DRAM market in general).
      • shevy-java9 hours ago
        So far all I am seeing is an increase in prices, so any company claiming it will "ramp up production" here is, in my opinion, just lying for tactical reasons.
        Governments need to intervene here. This is a mafia scheme now.
        I purchased about three semi-cheap computers in the last ~5 years or so. Looking at the RAM prices, the very same units I bought (!) now cost 2.5x as much as before (here I refer to my latest computer model, from 2 years ago). This is a mafia now. I also think these AI companies should be taxed extra because they cause us economic harm here.
        • trinsic22 hours ago
          Taxed extra is a good idea. But they bought our current administration, so we all know that's not going to happen unless something big happens, like Trump getting impeached and all the criminals in Congress going to prison. I'm wondering how likely that is. People need to get more directly involved in putting pressure on senators.
  • deadbabe5 hours ago
    Are we finally going to be forced to use something like CollapseOS, when the supply chains can no longer deliver chips to the masses?
  • cglan8 hours ago
    At this current pace, if "the electorate" doesn't see real benefits to any of this, 2028 is going to be a referendum on AI, unfortunately.
    Whether you like it or not, AI right now is mostly
    - high electricity prices
    - crazy computer part prices
    - phasing out of a lot of formerly high paying jobs
    and the benefits are mostly
    - slop and ChatGPT
    Unless OpenAI and co produce the machine god, which genuinely is possible. If most people's interactions with AI are the negative externalities, they'll quickly be wondering if ChatGPT is worth this cost.
    • caconym_4 hours ago
      > they'll quickly be wondering if ChatGPT is worth this cost
      They should be, and the answer is obviously no—at least to them. No political or business leader has outlined a concrete, plausible path to the sort of vague UBI utopia that's been promised for "regular folks" in the bullish scenario (AGI, ASI, etc.), nor have they convincingly argued that this isn't an insane bubble that's going to cripple our economy when AGI *doesn't* happen—a scenario that's looking more and more likely every day.
      There is no upside and only downside; whether we're heading for sci-fi apocalypse or economic catastrophe, the malignant lunatics pushing this technology expect to be insulated from consequences whether they end up owning the future light-cone of humanity or simply enjoying the cushion of their vast wealth while the majority suffers the consequences of an economic crash a few rich men caused by betting it all, even what wasn't theirs to bet.
      Everybody should be fighting this tooth and nail. Even if these technologies are useful (I believe they are), and even if they can be made into profitable products and sustainable businesses, what's happening now isn't related to any of that.
    • zaptheimpaler8 hours ago
      I hope they do. We live in a time of incredibly centralized wealth & power, and AI - particularly "the machine god" - has the potential to make things 100x worse and return us to a feudal system if the ownership and profits all go to a few capital owners.
      • trinsic22 hours ago
        IMHO this is exactly what is happening. Everyone should be on the phone with their senators, putting pressure on them to enforce antitrust and deal with Citizens United.
    • OGEnthusiast4 hours ago
      > At this current pace, if "the electorate" doesn't see real benefits from any of this, 2028 is going to be a referendum on AI, unfortunately.
      Not saying this is necessarily a bad prediction for 2028, but I'm old enough to remember when the 2020 election was going to be a referendum on billionaires and big tech monopolies.
    • stefan_7 hours ago
      For good measure, a bunch of this is funded through money taken directly from the electorate's taxes and given to a few select companies, whose leaders then graciously donate to the latest Ballroom grift. Micron, so greedy they thought nothing of shutting down their consumer brand even though it cost them nothing at all, got $6B in CHIPS Act money in 2024.
  • arnaudsm38 minutes ago
    [deleted]
    • fartfeatures18 minutes ago
      This take is pure Luddite nonsense. AI "lowering labor value while boosting capital" ignores centuries of automation: productivity gains cut costs, expand markets, create new jobs, and raise real wages despite short-term disruption.
      Steam engines, electricity, and computers displaced workers but spawned far more opportunities through new industries and cheaper goods. Same pattern now.
      The "jobless masses stuck with 1GB phones eating slop" fantasy is backwards. Compute keeps getting vastly cheaper and more capable; AI speeds that up.
      "Terrible for indie creators and startups"? The opposite: AI obliterates barriers to building, shipping, and competing. Solo founders are moving faster than ever.
      It's the same tired doomer script we get with every tech wave. It ages poorly.
  • shmerl9 hours ago
    > She said the next new factory expected to come online is being built by Micron in Idaho. The company says it will be operational in 2027.
    Isn't Micron stopping all consumer RAM production? So their factories won't help anyway.
    • terribleperson9 hours ago
      Micron is exiting direct-to-consumer sales. That doesn't mean their chips couldn't end up in sticks or devices sold to consumers, just that the no-middleman Crucial brand is dead.
      Also, even if no Micron RAM ever ended up in consumer hands, the new capacity would still reduce prices for consumers by increasing the supply to other segments of the market.
      • walterbell2 hours ago
        > no-middleman Crucial brand is dead
        It could be restarted in the future by Micron.
        Crucial SSDs offer good firmware (e.g. NVMe Sanitize for secure erase) and hardware (e.g. power-loss capacitors).
  • vittore10 hours ago
    I've been ruminating on this for the past two years. Before AI, most compute stayed cheap and sat pretty much 90% idle; we are finally getting to the point of using all of that compute. We will probably find more algorithms to improve the efficiency of all the matrix computations, and the AI bubble will likely play out the way the telecom bubble did, with all the fiber-optic capacity that turned out to be drastically over-provisioned. Fascinating times!
    • shevy-java9 hours ago
      I don't think any of this is "fascinating" - it is more of a racket. They push the prices up. Governments failed the people here.
      • yooogurt9 hours ago
        Isn't this more easily explained by supply and demand? Supply can't scale quickly, so with increased demand there will be increased prices.
        • ozgrakkurt4 hours ago
          Imagine someone goes to the supermarket and buys all the tomatoes. Then the supermarket owner says, "I don't know, he bought them all at once, so it's a better sale." And he sells the remaining 10% of the tomatoes at a huge markup.
          • vittore3 hours ago
            I think it is better compared to the Dutch buying up all the tulip bulbs, and the price skyrocketing.
            • Ekaros30 minutes ago
              Tulips were, by my understanding, more like NFTs: rich people gambling when bored, with promises of tulips in the future... futures contracts for tulips. And prices were high because the buyers were insanely rich merchants.
              The RAM situation looks like cornering the market. Probably something OpenAI should be prosecuted for if they end up profiting from it.
    • squibonpig5 hours ago
      Except it's still sitting idle in warehouses while the datacenters get built; they aren't running yet. Unlike with fiber, GPUs degrade rapidly with use, and for now datacenters need to be practically rebuilt to fit new generations, so we shouldn't expect much reusable hardware to come from this.
  • 29athrowaway8 hours ago
    AI needs data, and that data comes from consumer devices.
  • shevy-java9 hours ago
    I now consider this a mafia that aims to milk us for more money. This includes all the AI companies, but also the manufacturers who happily benefit from it. It is a de facto monopoly. Governments need to stop allowing this milking scheme to happen.
    • DamnInteresting8 hours ago
      When it's more than one company working together in a monopoly-like fashion, the term is "oligopoly".
      https://www.merriam-webster.com/dictionary/oligopoly
      • vee-kay8 hours ago
        There is another word for it: cartel.
        e.g., the Phoebus cartel: https://en.wikipedia.org/wiki/Phoebus_cartel
    • throwaway942759 hours ago
      "Monopoly" means one seller, so you can't say that multiple X make a monopoly and make sense. You probably mean collusion.
      If demand exceeds supply, either prices rise or supply falls, causing shortages. Directly controlling sellers (prices) or buyers (rationing) results in black markets unless enforcement has enough strength and integrity. The required strength and integrity seem to scale exponentially with the value of the good, so it's typically effectively impossible to prevent out-of-spec behavior for anything that isn't cheap.
      If everyone wants chips, semiconductor manufacturing supply should be increased. Governments should subsidize domestic semiconductor industries and the conditions for them to thrive (education, etc.) to meet both goals of domestic and economic security, and do it in a way that works.
      The alternative is decreasing demand. Governments could run bounty and incentive programs for building electronics that last a long time or are repairable or recyclable, but it's entirely possible the market will eventually do that on its own.
      • rileymat25 hours ago
        > If everyone wants chips, semiconductor manufacturing supply should be increased. Governments should subsidize domestic semiconductor industries and the conditions for them to thrive (education, etc.) to meet both goals of domestic and economic security, and do it in a way that works.
        If there is already demand at this inflated price, shouldn't we first ask why more capacity is not coming online naturally?
    • yupyupyups9 hours ago
      Why would government officials and politicians want to stop making money?
    • yowlingcat8 hours ago
      Technically it's a lot closer to a monopsony (Sam Altman/OAI cornering 40% of the DRAM market in a way that is clever for his interests but harms the rest of the world that would want to use it). I keep hoping that necessity will somehow spur China to become the mother of invention here and supply product to serve the now lopsided, constrained market amid increasing demand, but I just don't know how practical it will be.
    • klooney5 hours ago
      I mean, if you were the Micron CEO, would you bet the company on AI demand being sustained? It seems like it could all go belly up very fast.
    • ekianjo7 hours ago
      There is no monopoly in AI. I can name at least 10 big actors worldwide.
      • ggm5 hours ago
        If they collude on pricing and restrict new entrants, that's what the Sherman antitrust laws are about.
        • bdangubic5 hours ago
          The fines that would be levied for potential Sherman Act violations would be negligible, so that is for sure not a deterrent.
          • ggm1 hour ago
            It would be understood that any action under the Sherman Act is unlikely and, as you say, the financial penalties are tokenistic.
            The non-financial parts, however, which include mandated restructuring and penalties for directors up to incarceration, are not tokenistic. They'd be appealed and delayed, but at some point the shareholders would seek redress from the board. Ignoring judicially mandated instructions isn't really a good idea, current WH behaviour aside. If the defence here is "courts don't matter any more", that's very unhelpful, if true. At some point, a country which cannot enforce judicial outcomes has stopped being a civil society.
            My personal hope that the EU tears holes in FAANG aside, the collusive pricing of chips has been a problem for some time. The cost/price disjunction here is strong.
  • appreciatorBus7 hours ago
    [flagged]
    • dehrmann4 hours ago
      I remember the HDD shortage after flooding in Thailand. There was a price surge for a year or so, capacity came back online, and the price slowly eased. If AI crashes, prices might quickly collapse this time. If it doesn't, it'll take time, but new capacity will come online.
      https://en.wikipedia.org/wiki/Bullwhip_effect
    • bigbadfeline6 hours ago
      > has more demand than forecast
      Nah. The demand is driven by corporations that hoard hardware in datacenters, starving the market and increasing prices in the process - otherwise known as *scalping*. Then they sell you back that same hardware, in the form of cloud services, for even higher prices.
      More explanations here:
      https://news.ycombinator.com/item?id=46416934
      • callc5 hours ago
        Gosh. This sounds a lot like https://en.wikipedia.org/wiki/Silver_Thursday with extra steps.
        A bunch of problems pop up when a handful of super-wealthy corporations can control markets.
      • appreciatorBus6 hours ago
        If it's that easy to make infinite money, maybe the govt could just do that too, so we'd have enough for healthcare and other programs?
        • bigbadfeline5 hours ago
          The government can't, and doesn't, make infinite money. Govt debt can't grow to infinity either, so if revenue is not increased, unpredictable and unpleasant events will follow. However, the taxation issue is so hopelessly misrepresented and misunderstood that I'm not going to discuss it here.
        • squibonpig5 hours ago
          What? It's just a business you can only execute with lots of resources, and in a space where supply changes slowly.
      • hn_throwaway_994 hours ago
        Wow, this is an economics-free take if I've ever heard one. And calling it scalping feels laughable.
        There are multiple RAM providers. Datacenters, many competing ones, are gobbling up RAM because there is currently a huge demand. And, unlike actual ticket scalpers, datacenters perform a very valuable service beyond just reselling the RAM. After all, end users could buy up RAM and GPUs themselves and build their own systems (which is basically what everybody did as recently as the early 00s), but they'd rather rent it because it's much less risky with much less capex.
        This is simply run-of-the-mill supply and demand. You might convince me there was something nefarious if there was widespread collusion or monopoly abuse, and while some of the circular dealing is worrisome, it's still a robust market with multiple suppliers and multiple competitors in the datacenter space.
        • bigbadfeline3 hours ago
          > This is simply run-of-the-mill supply and demand.
          That's definitely NOT run-of-the-mill demand, because it comes from companies buying hardware at an *operating loss*, which can only be recouped by scalping higher prices at the expense of a starved market.
          Demand funded by circular financial agreements and off-the-book debt isn't "run-of-the-mill" by any stretch.
      • derekdahmer5 hours ago
        Well, that's one interpretation.
        Another way to think about it: a good that we once bought for private use, where it sat around underutilized the majority of the time, is instead being allocated in data centers where we rent slices of it, allowing RAM to be allocated and used more efficiently.
        Yes, it sucks that demand for RAM has led to scarcity and higher prices, but those resources moving to data centers is a natural consequence of a shareable resource becoming expensive. It doesn't have to be a conspiracy.
      • kortilla5 hours ago
        That's not scalping. These companies are neither selling nor renting back the memory.
        • yoyohello134 hours ago
          Azure/GCP/AWS and all the AI companies are selling compute. That's renting back hardware.
          • hn_throwaway_994 hours ago
            And in the process of doing so they're providing a hugely valuable service by building these complex systems out of lots of different components, and people and companies find a lot of value in that service.
            Calling this "scalping" is just economically illiterate conspiracy theorizing.
            • bigbadfeline3 hours ago
              The majority of these companies have long-term purchase agreements for hardware at prices far lower than today's market prices; that's front-running the hardware market, the same as scalpers buying up GPUs to restrict supply and then selling at higher prices.
              > And in the process of doing so they're providing a hugely valuable service
              The value is in the hardware itself; the added services aren't essential - just packaging. The GPU scalpers also sell "a hugely valuable" product, but that has nothing to do with anything; manipulating supply for a price differential is what matters.
              > Calling this "scalping" is just economically illiterate conspiracy theorizing.
              Scalping is not a conspiracy; it's a well-known phenomenon observed since antiquity. Conspiracy, or the lack thereof, isn't among my concerns; the economic fundamentals enabling it are.
  • kankerlijer10 hours ago
    Well, thank the FSM that the article opens right up with "buy now!" No thanks, I'm kind of burnt out on mindless consumerism; I'll go pot some plants or something.
    • johnea10 hours ago
      I didn't see any of that.
      I highly recommend disabling JavaScript in your browser.
      Yes, it makes many sites "look funny", or maybe you have to scroll past a bunch of screen-sized "faceplant", "twitverse", and "instamonetize" icons, but there are far fewer ads (like none).
      And of course some sites won't work at all. That's OK too, I just don't read them. If it's a news article, it's almost always available on another site that doesn't require JavaScript.
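      For anyone who wants to try this, a minimal sketch assuming Firefox (treat it as an illustration, not the only way): flip the standard javascript.enabled pref in about:config, or set it in a user.js file in your profile directory; extensions such as NoScript or uBlock Origin let you do the same thing per-site instead of globally.

        // user.js in the Firefox profile directory, read at startup.
        // Turns JavaScript off everywhere; set it back to true to undo.
        user_pref("javascript.enabled", false);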
      • metadope9 hours ago
        I whole-heartedly agree with your recommendation and join in encouraging more adopters of this philosophy and practice.
        Life online without JavaScript is just better. I've noticed an increase in sites that are useful (readable) with JavaScript disabled. Better than 10 years ago, when broken sites were rampant. Though there are still the lazy ones that are just blank pages without their JavaScript crutch.
        Maybe the hardware/resource austerity that seems to be upon us now will result in people and projects refactoring, losing some glitter and glam, and getting lean. We can resolve to slim down, drop a few megs of bloat, and use less RAM and bandwidth. It's not a problem; it's an opportunity!
        In any case, *Happy New Year!* [alpha preview release]
      • zahlman7 hours ago
        I would not be able to handle that due to video streaming, web clients for things like email, etc. And some sites I trust (including HN) provide useful functionality with JS (while degrading gracefully).
        But I use NoScript and it is definitely a big help.
      • piskov9 hours ago
        Probably using reader mode by default would be a less jarring experience (and you'll have an easy fallback).