6 comments

  • sneilan1 · 1 minute ago
    This is exactly what I needed. I've been thinking about making this tool. For running and experimenting with local models this is invaluable.
  • kamranjon · 1 hour ago
    This is a great idea, but the models seem pretty outdated - it's recommending things like Qwen 2.5 and StarCoder 2 as perfect matches for my M4 MacBook Pro with 128 GB of memory.
  • dotancohen · 39 minutes ago
    In the screenshots, each model is labeled with a use case of General, Chat, or Coding. What might be the difference between General and Chat?
  • castral · 55 minutes ago
    I wish there were more support for AMD GPUs on Intel Macs. I saw some people on GitHub getting llama.cpp working with them - could this be added in the future if they make the backend support it?
  • andsoitis · 40 minutes ago
    Claude is pretty good at making recommendations if you input your system specs.
  • fwipsy · 1 hour ago
    Personally, I would have found a website where you enter your hardware specs more useful.
    • user_7832 · 11 minutes ago
      Same - I opened HN on my phone and was hoping to get an idea before I booted up my computer.
    • greggsy · 25 minutes ago
      I was hoping for the same thing.