
I thought my post was already too long to include this, but to your point, you can run AI inference in this setup, and the performance can be pretty good.


There are definitely some use cases where it works out and others where it doesn't; I spent a bit of time testing that side of things late last year: https://www.jeffgeerling.com/blog/2025/big-gpus-dont-need-bi...


A great post that definitely inspired this one. I link to it in the first paragraph of my blog post.


I feel like maybe by the end of this year someone with access to a bunch of RTX Pro 6000s will have them running on a Pi or RK3588 lol.


We can only hope.


I appreciate you making the post not about AI.


Yeah, it is obviously useful for AI inference, where the PCIe speed isn't particularly important, and a single-board computer gets you a small system.



