Where is the music video for those of us who only want to learn about low-level hardware through that medium?<p>Get perpendicular: <a href="https://www.dailymotion.com/video/x62mja" rel="nofollow">https://www.dailymotion.com/video/x62mja</a>
Off-topic, but 574 analytics and advertising partners is a few too many. Like… can you make money with just 200 ad partners?
That song is etched into my memory and brings back a flood of great feelings. Dailymotion is giving me a playback error, but here's a copy from YouTube: <a href="https://www.youtube.com/watch?v=xb_PyKuI7II" rel="nofollow">https://www.youtube.com/watch?v=xb_PyKuI7II</a>
But will this go the way of a “non-core” product like Optane (or modems, for that matter)?
The article says nothing about the construction or special qualities of ZAM, as compared to HBM :(
There doesn't seem to be much detail anywhere else either. All I was able to gather was that the memory dies are stacked (not new), but that the vias connecting the stack are angled instead of straight up and down, and this is better because... reasons?
Indeed, what is it? The article doesn't say; it only touts the supposed benefits.
It’s crazy that we have stalled on the structure of the basic DRAM cell for decades now.
This wccf article also doesn't do a great job of describing it, but the third slide it shows is very illustrative: rather than stacking the dies horizontally, it stacks the DRAM on its side. <a href="https://wccftech.com/intel-zam-memory-threatens-hbms-ai-throne-with-2x-the-bandwidth-of-hbm4/" rel="nofollow">https://wccftech.com/intel-zam-memory-threatens-hbms-ai-thro...</a><p>I thought this was going to mean each stack was able to talk directly to the controller, since all the stacks are resting on an interposer thing. But actually there is still a logic controller slice at the bottom of the stack, not at a right angle to the stack.<p>Instead of HBM microbumps between layers there is a more compact/dense TSV ("fusion bonded via-in-one") system. Intel once more showing their strong chiplet packaging prowess! The claim is that thermals are still much better somehow, in spite of volumetric cell density increasing (from thinner layers). The demo has 8+1 DRAM+controller layers.
Intel makes these "throw spaghetti at the wall" kinds of investments in potentially interesting companies/technologies all the time - and has done so for decades.<p>Every time, the recipient hypes the shit out of it, of course.
The main problem is that they often don't stick with it.<p>As far as I can tell, Intel more or less pioneered the idea of SSDs being the <i>best</i> storage rather than the <i>cheap</i> storage, for instance. The X25-M and X25-E were absurdly good. Then, once the market was established... they pulled out of it.
I’m still waiting for the Intel Arc B770, since the 5060 Ti and 9060 XT are already overpriced, and if they just committed to something for once it wouldn’t be marginally worse.<p>Not that releasing the GPU would be something super innovative; they already have the B70.
An extreme and related example: <a href="https://en.wikipedia.org/wiki/Opticom_(company)" rel="nofollow">https://en.wikipedia.org/wiki/Opticom_(company)</a><p>A popular-science kind of backgrounder (can't vouch for the accuracy/relevance - details are very scarce): <a href="https://www.geeksforgeeks.org/digital-logic/polymer-memory/" rel="nofollow">https://www.geeksforgeeks.org/digital-logic/polymer-memory/</a>
Most of the big hits in tech had a trendy, index-swinging moment, and Intel has been searching for one ever since AMD64 undercut the Itanium. Hype drives a currently multi-billion-dollar bubble. It's not always a bad idea to throw our holy noodles at the wall. You might find they hover in the sky and grow meatballs; could be big.
Well, peak weirdness was the thing involving will.i.am from the Black Eyed Peas as a 'Futurist'/spokesperson/IDEK.<p>I think what's semi-unfortunate is all the swings and misses, especially the cases where it wasn't necessarily a bad idea but Intel gives up too soon:<p>- Massively parallel, simple-ish x86 cores à la Xeon Phi; okay, maybe not the best idea on the surface, but I feel like nowadays the opportunities to reuse parts of that tech could be more forthcoming (and maybe they do, but are just quiet about it... e.g. GPU acceleration)<p>- Optane. I think the tech would have been cheaper if they had made the licensing terms easier, but maybe I'm missing part of the equation...<p>- This thing where they keep half-assing the GPU strategy: imagine if the B70 had launched last year alongside the B60 and B50, <i>before</i> DRAM prices went sideways. Or if they hadn't taken so long to release a >16GB GPU in the first place; <i>that</i> would have built a lot of interest, but instead they finally release a 32GB GPU alongside more bad news for the overall roadmap. The whole situation instead becomes a jarring rollercoaster that makes everyone worry that Intel is gonna kill the project the way everything but CPUs gets killed lately.