7 comments

  • in-silico · 7 hours ago
    This is great to see.

    I trained some research models using the existing PyTorch/XLA on TPUs, and it was a mess of undocumented behavior and bugs (silently hanging after 8 hours of training!).

    If anyone is trying to use PyTorch on TPU before TorchTPU is released, you can check out the training pipeline I ended up building to support my research: https://github.com/aklein4/easy-torch-tpu
  • Reubend · 7 hours ago
    Sounds good, but my main question is: is this a fork, or a new backend they're building in (like MPS)?
    • musebox35 · 2 hours ago
      I attended the related session at Next '26 yesterday. From my understanding it is a new backend, and they will release the torch tpu source on GitHub in one or two months. It will not support all ops initially, but they are moving fast. Still, for now torchax is mature enough to run torch models on TPUs by translating to JAX.
  • MASNeo · 33 minutes ago
    Now all that’s missing is an actual chip that can be purchased. Any ideas?
  • noracists · 3 hours ago
    Very excited for this.