9 comments

  • genpfault 2 hours ago
    600 GB/s of memory bandwidth isn't anything to sneeze at. ~$1000 for the Pro B70, if Microcenter is to be believed:

    https://www.microcenter.com/product/709007/intel-arc-pro-b70-single-fan-32gb-gddr6-pcie-50-graphics-card

    https://www.microcenter.com/product/708790/asrock-intel-arc-pro-b70-creator-32gb-gddr6-pcie-50-graphics-card
    • hedgehog 1 hour ago
      Recent kernels have SR-IOV support for these chips too. B&H has them listed for $950.

      https://www.bhphotovideo.com/c/product/1959142-REG/intel_33p01ib0bb_arc_pro_b70_32gb.html

      When 32GB NVIDIA cards seem to start at around $4000, that's a big enough gap to be motivating for a bunch of applications.
    • qingcharles 2 hours ago
      I think the B65 is priced at $650. Both are supported by llama.cpp, I believe. With that power draw you could run two of them.
    • jauntywundrkind 29 minutes ago
      I tend to agree that the VRAM size and bandwidth are the core thing, but this B70 Pro allegedly has 387 INT8 TOPS vs. a 5090's 3,400 INT8 TOPS, and 600 GB/s compares against 1,792 GB/s. I'm delighted to see an option at a quarter of the price! But man, a tenth the performance?

      https://www.techpowerup.com/347721/sparkle-announces-intel-arc-pro-b70-and-intel-arc-pro-b65-series-graphics-cards

      https://www.tomshardware.com/pc-components/gpus/nvidia-announces-rtx-50-series-at-up-to-usd1-999
    • giancarlostoro 2 hours ago
      Intel GPU prices have stayed fine, but I do wonder: if they turn out to be viable for inference, will they wind up like Nvidia GPUs, severely overpriced?
  • tbyehl 17 minutes ago
    Where's the A310 / A40 successor? Gimme some SR-IOV in a slot-powered, single-width, low-profile card.
  • pjmlp 47 minutes ago
    New cards in 2026, and targeting Vulkan 1.3?!
  • nickthegreek 1 hour ago
    Both have 32GB of VRAM. Could be a pretty compelling choice.
    • cptskippy 1 hour ago
      They certainly look viable as replacements for my Tesla P40 for virtual workloads.
  • whalesalad 1 hour ago
    Anyone running an Arc card for desktop Linux who can comment on the experience? I've had smooth sailing with AMD GPUs but have never tried Intel.
    • oakpond 1 hour ago
      Running dual Pro B60s on Debian stable, mostly for AI coding.

      I was initially confused about which packages were needed (backports kernel + the Ubuntu kobuk team PPA works for me). After getting that right, I'm now running vLLM mostly without issues (though I don't run it 24/7).

      At first I had major issues with model quality, but the vLLM XPU guys fixed it fast.

      Software capability isn't as good as Nvidia's yet (i.e. no FP8 KV cache support, last I checked), but with this price difference I don't care. I can basically run a small FP8 local model with almost 100k tokens of context, and that's what I wanted.
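      A launch command in this spirit (the model name and context length below are placeholders, not the exact setup described above; flag behavior depends on the vLLM/XPU build):

      ```shell
      # Serve a model split across two GPUs with ~100k token context.
      # Model name and lengths are illustrative examples only.
      vllm serve Qwen/Qwen2.5-7B-Instruct \
          --tensor-parallel-size 2 \
          --max-model-len 98304 \
          --dtype float16
      ```

      `--tensor-parallel-size 2` shards the weights across both cards, which is how a model larger than one card's 24GB can still fit.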
    • robertVance 22 minutes ago
      I've run Arc on Fedora for years, and for general desktop use it's been perfect. For LLMs/coding it's getting better, but it's rough around the edges. Had a bug where trying to get VRAM usage through PyTorch would crash the system, etc.
    • wyre 1 hour ago
      There was a video a little while back where LTT built a computer for Linus Torvalds and put an Intel Arc card inside, so I'd imagine Linux support is, at the very least, acceptable.

      [1] https://www.youtube.com/watch?v=mfv0V1SxbNA
  • DiabloD3 58 minutes ago
    Since they fired the entire Arc team, a lot of the senior engineers have already updated their LinkedIns to reflect new positions at AMD, Nvidia, and others, and they laid off most of their Linux driver team (GPU and non-GPU), uh...

    WTF?
    • staticman2 52 minutes ago
      You are exaggerating, right? They didn't really fire the entire Arc team, did they? I couldn't find a source saying that.
      • DiabloD3 29 minutes ago
        Nope, no exaggeration.

        The news that Celestial is basically canceled already hit the HN front page, and Druid has been canceled before tapeout.

        Celestial will only be issued in the variant that comes in budget/industrial embedded Intel platforms with a combined IO+GPU tile, but the high-performance desktop/laptop parts that have a dedicated graphics tile will ship an Nvidia-produced tile.

        There will be no Celestial dGPU variant, nor a dedicated-tile variant. Drivers will cease support for dGPUs of all flavors, and no new bug fixes will happen for B-series GPUs (as there are no B-series iGPUs; A-series iGPUs will remain unaffected).

        They signed the deal like 2-3 months ago to cancel GPUs in favor of Nvidia. The other end of this deal is that future Nvidia SBCs will ship as big-boy variants with Xeon CPUs: Rubin (replacing Blackwell) for the GPU, Vera (replacing Grace) as the on-SBC GPU babysitter, and newest-gen Xeons to do the non-inference tasks that Grace can't handle.

        There is also talk that this deal may lead to Nvidia moving to Intel Foundry, away from TSMC. There is also talk that Nvidia may just buy Intel entirely.

        For further information, see Moore's Law Is Dead's coverage off and on over the past year.
    • wtallis 52 minutes ago
      This is a chip they've had lying around for a while. It's the same architecture as used in the Arc B580 that launched at the end of 2024; this is just a slightly larger sibling. Intel clearly knew that their larger part wouldn't make for a competitive gaming GPU (hence the lack of a consumer counterpart to these cards), but must have decided that a relatively cheap workstation card with 32GB might be able to make some money.
      • DiabloD3 27 minutes ago
        Still seems crooked to sell a GPU that has already lost its driver team and will get no meaningful new updates.
        • wtallis 20 minutes ago
          Does it *need* a huge driver team pushing out big updates in order to be suitable for the kind of Pro use cases it's targeted at? They're explicitly *not* going after the gaming market, so they don't need to be on the treadmill of constant driver updates delivering workarounds and optimizations for the latest game releases.

          They're still going to be employing *some* developers for driver maintenance for the sake of their iGPUs, and that might be enough for these cards.
  • WarmWash 2 hours ago
    Wake me when they wake up and release a middling card with 128GB memory.
    • Weryj 1 hour ago
      Buy 4?
      • electronsoup 1 hour ago
        Which mainboards are cheap, have 4 PCIe x16 (electrical) slots, and don't need weird risers to fit 4 GPUs?
        • irishcoffee 43 minutes ago
          If your actual gripe is risers, sounds like a "you" problem, not a technical problem.
      • WarmWash 12 minutes ago
        Because I don't want to spend $4k.

        I want to spend $1500 for a card that can run a proper large model, even if it can only do 25 tok/s.

        Intel is squandering a golden opportunity to kneecap AMD and Nvidia, under the totally delusional pretense that Intel enterprise cards still have a fighting chance.
  • vessenes 1 hour ago
    Not sure why you'd want this over an Apple setup. The M4 Max has 545 GB/s of memory bandwidth: $2k for an entire Mac Studio with 48GB of RAM vs. 32GB for the B70.
    • hedgehog 1 hour ago
      Being able to keep infrastructure on Linux is a big advantage.
      • RestartKernel 1 hour ago
        How many compatibility issues is macOS realistically expected to cause? The Windows DX felt unusable to me without a Linux VM (and later WSL), but on macOS most tooling just kinda seems to work the same.
        • einr 1 hour ago
          It's not the *tooling* for me; macOS is just bad as a server OS for many reasons. Weird collisions with desktop security features, aggressive power saving that you have to fight against, root not being allowed to do root stuff, no sane package management, no OOB management, ultra-slow OS updates, and generally, but most importantly: the UNIX underbelly of macOS has clearly not been a priority for a long time and is rotting, with weird, inconsistent, and undocumented behaviour all over the place.
        • hedgehog 1 hour ago
          Provisioning, remote management, containers, virtualization, networking, graphics (and compute), storage, all very different on Mac. The real question is what you would expect to be the same.
        • bigyabai 1 hour ago
          For server usage? macOS is the least-supported OS in terms of filesystems, hardware, and software. It uses multiple gigabytes of memory to load unnecessary user runtime dependencies, wastes hard drive space on statically linked binaries, and regularly breaks package management on system upgrades.

          At a certain point, even WSL becomes a more viable deployment platform.
    • protimewaster 25 minutes ago
      My thinking is that I'd pick this, because I can't just plug a Mac into a slot in my server and have it easily integrate with all my other hardware across an ultra-fast bus.

      If they made an M4 on a card that supported all the same standards and was price-competitive, though, that might be a good option.
    • fvv 1 hour ago
      With those $2k you can have 2x B70, with 1.2 TB/s and 64GB of VRAM, on Linux (and you can scale further, while Mac price increases are not linear).
      • Reubend 1 hour ago
        You're absolutely right. And these Intel GPUs will also be much faster in terms of actual math than the M-series GPUs that the Apple setup would have.
    • cptskippy 1 hour ago
      Support for Single Root I/O Virtualization (SR-IOV) enables compute and graphics workloads in virtualized environments.
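      On Linux, SR-IOV virtual functions are typically created through sysfs; a rough sketch (the PCI address is a placeholder, and an SR-IOV-capable driver for the card is assumed):

      ```shell
      # Find the card's PCI address; a placeholder is used below
      lspci | grep -i display

      GPU=0000:03:00.0   # placeholder address

      # Ask the driver to create two virtual functions
      echo 2 | sudo tee /sys/bus/pci/devices/$GPU/sriov_numvfs

      # Each VF appears as its own PCI device that can be passed through to a VM
      ls -d /sys/bus/pci/devices/$GPU/virtfn*
      ```

      Each guest then sees what looks like its own dedicated GPU, which is what makes this useful for virtualized compute workloads.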
    • wyre 1 hour ago
      One can upgrade and swap parts in a computer running an Intel GPU, and Linux is very well supported compared to Mac hardware.
    • 2OEH8eoCRo0 1 hour ago
      Funny, I'm not sure why anyone would use Apple over Linux.