How much would you pay for local LLMs?

4 points by brody_slade_ai 15 hours ago

I want to build a private AI setup for my company. I'm thinking of hosting our model locally instead of in the cloud, using a server at the office that my team can access. Has anyone else done this and had success with it?

This setup will be used internally for uncensored chat, coding, image gen and analysis.

We're thinking of using a combo of hardware:

- RTX 4090 GPU

- Threadripper Pro 5955WX (anyone used this one before?)

- 1TB NVMe SSD

What are your picks for a local AI setup? And what's the minimum budget to achieve it?
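Since budget mostly comes down to how much VRAM the models need, here's a rough back-of-the-envelope check against the 24 GB on an RTX 4090. This is just a sketch: the bytes-per-parameter figures and the flat 2 GB overhead are assumptions based on common quantization formats, not measurements, and real usage varies with context length and runtime.

```python
# Rough VRAM estimate for hosting a local LLM. The per-parameter
# byte counts are assumed from common quantization formats; actual
# usage adds KV-cache and runtime overhead beyond the flat allowance.

BYTES_PER_PARAM = {
    "fp16": 2.0,   # full half-precision weights
    "int8": 1.0,   # 8-bit quantized
    "q4":   0.5,   # 4-bit quantized
}

def vram_gb(params_billions: float, fmt: str, overhead_gb: float = 2.0) -> float:
    """Approximate VRAM needed: weight size plus a flat overhead allowance."""
    weights_gb = params_billions * 1e9 * BYTES_PER_PARAM[fmt] / 1024**3
    return weights_gb + overhead_gb

for size in (7, 13, 70):
    for fmt in ("fp16", "q4"):
        print(f"{size}B @ {fmt}: ~{vram_gb(size, fmt):.1f} GB")
```

By this rough math a 7B or 13B model fits comfortably on a single 4090 even at fp16, while a 70B model needs aggressive quantization plus CPU/RAM offload, or a second GPU.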

GianFabien 15 hours ago

Have you considered a Mac mini M4 Pro?

blinded 15 hours ago

I got an NVIDIA Jetson Orin. Works great; it's not super beefy, but it's a nice little dev rig.