I understand this could be posted in a hardware forum, or I could use a stats comparison tool (and I’ve poked around a fair bit as is), but I’m curious, specifically from the self-hosted, roll-your-own-NAS perspective: does the Minisforum N5 Pro seem like a decent machine for self-hosting? Any impressions? What percentage of this is the marketing hype-train, and what percentage would still be good if it shipped unbranded in a cardboard box? What would you expect this to cost?

https://www.minisforum.com/pages/n5_pro

Currently I’m running one of the DS-series Synology NAS units, but I want to remove the Synology dependency because I don’t fully trust them to keep delivering and not remove features. I would rather give TrueNAS (or something in that direction) a try now, so I’m prepared to jump ship when I need to. I’m lucky enough to be able to buy a decent NAS and hang onto it for a while, but I want to come in below the point of diminishing returns, where an extra $100 doesn’t really get me much anymore.

I am specifically interested in the hardware because I don’t plan to use the default OS.

  • Skunk@jlai.lu
    21 hours ago

    I have one of those Minisforum AMD HX 370 machines (the X1 AI Pro). It’s very powerful, awesome hardware. I use the mini PC as a work computer for 3D and dev on openSUSE, and as a lightweight, low-power gaming machine (e.g. long-haul X-Plane 12 flights during the night).

    Everything is well made and beautifully built.

    As for this NAS version: if money is not an issue, I wouldn’t hesitate. 10 Gb/s networking, tons of RAM, the AMD HX 370. It sure is overkill for a plain NAS; it’s more tailored to be a very beefy Docker server and/or virtualization station while being a multimedia NAS at the same time.

    I built my own Synology replacement with second-hand ITX parts in a Jonsbo N3 case, but if I hadn’t, or just had plenty of cash to spare, I would definitely go for a server like this one (my use case is NAS + Docker + virtualization + eventual game server, all in one).

    As a side note, the “AI” part is just marketing for now. Those chips are not yet supported for local LLMs on Linux (Windows only at the moment): they need ROCm support for the RDNA 3.5 iGPU, plus integration of the new AMD NPU into the local frameworks (llama.cpp etc.).

    https://github.com/amd/gaia

    It will come for sure, it’s just not ready yet.
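
    For what it’s worth, while ROCm support for the RDNA 3.5 iGPU is pending, llama.cpp’s Vulkan backend can often drive AMD iGPUs on Linux already. A rough sketch, assuming Vulkan drivers are installed; the model path is a placeholder, and performance won’t match what proper ROCm/NPU support should eventually deliver:

    ```shell
    # Build llama.cpp with the Vulkan backend instead of ROCm
    git clone https://github.com/ggerganov/llama.cpp
    cd llama.cpp
    cmake -B build -DGGML_VULKAN=ON
    cmake --build build --config Release -j

    # Run with layers offloaded to the iGPU (-ngl); model.gguf is a placeholder
    ./build/bin/llama-cli -m model.gguf -ngl 99 -p "Hello"
    ```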

    • Kyle@lemmy.ca
      41 minutes ago

      Thanks for that. I almost pulled the trigger so I could have an all-in-one solution for running Immich with AI, Perplexica, and other self-hosted AI things. Currently I just provide the processing power from my MacBook and desktop PC.