A lemmy nomad. Wish there was a way to migrate posts and comments from .world to .ml to here… 😪

  • 2 Posts
  • 25 Comments
Joined 5 months ago
Cake day: March 14th, 2025

  • Sure, I run OpenWebUI in a Docker container on my TrueNAS SCALE home server (it’s one of their standard packages, so basically a 1-click install). From there I’ve configured API access to OpenAI, Gemini, Anthropic and DeepSeek (part of my job involves evaluating the performance of these big models on various in-house tasks), along with pipelines for some of our specific workflows and MCP via mcpo.

    I previously had my Ollama installation in another Docker container but didn’t like having a big GPU in my NAS box, so I moved it to its own box. I’m mostly interested in testing small/tiny models there. I again have Ollama running in a Docker container (just the official Docker image), but this time on a bare-metal Debian server, and I configured another OpenWebUI connection to point at it (OpenWebUI lets you select which LLM(s) you want to use on a conversation-by-conversation basis, so there’s no problem having a bunch of them hooked up at the same time).
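    If anyone wants to replicate a setup like this, here’s a rough sketch of the two-box arrangement using the official images. The container names, volume names, ports and the GPU box’s IP are all placeholders — adjust for your own network:

    ```shell
    # On the GPU box: run the official Ollama image, exposing its default API port.
    docker run -d --name ollama \
      --gpus=all \
      -v ollama:/root/.ollama \
      -p 11434:11434 \
      ollama/ollama

    # On the NAS (or wherever OpenWebUI lives): point it at the remote Ollama
    # instance via the OLLAMA_BASE_URL environment variable.
    # 192.168.1.50 is an example IP for the GPU box.
    docker run -d --name open-webui \
      -e OLLAMA_BASE_URL=http://192.168.1.50:11434 \
      -p 3000:8080 \
      -v open-webui:/app/backend/data \
      ghcr.io/open-webui/open-webui:main
    ```

    On TrueNAS SCALE the OpenWebUI side is handled by the app catalog instead of a raw `docker run`, but the same `OLLAMA_BASE_URL` setting is what ties the two boxes together.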



  • My team uses Expensify, and I have to say, I don’t hate it. The website is full-featured, the mobile app is actually pretty good, and you can even just forward receipts to an email address and it will parse them out properly the vast majority of the time. Management-wise it has all of the approval chains, grouping, etc. that you might expect. The company I work for is only about 50 people and my team is only 10, so I couldn’t say how well it scales, but I imagine unless you have some particularly unique requirements it’d do the job.