

I don’t see why not. Again, the resource footprint is so tiny that you can just throw Mumble in anywhere. You can make it tinier still by limiting image messages in the text chat and capping per-user bandwidth in the config.
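To be concrete, those knobs live in the server config (on Debian-family installs that’s /etc/mumble-server.ini, a.k.a. murmur.ini); the key names below are from the stock file, the values are just illustrative, not recommendations:

    bandwidth=72000            # per-user voice bandwidth cap in bits per second (~9 kB/s)
    textmessagelength=5000     # max length of a plain text message, 0 = no limit
    imagemessagelength=131072  # max length of a text message carrying image data; lower it to curb picture spam

Restart the service afterwards for changes to take effect.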


If Pi Zero, you’re serving 12 users low latency over WiFi? Does it route the actual audio?
Yes, it’s sufficient. I wouldn’t advise it due to the extra overhead of wireless packet loss, but it’s absolutely technically possible. Don’t overestimate how much bandwidth voice chat really needs: it’s something like 10-50 kB/s per person, and you’re unlikely to ever have more than 2 or 3 people talking at a time.
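For a rough sense of scale (ballpark figures only; as far as I know the server doesn’t mix audio, it just relays each active speaker’s stream to every other connected client):

    echo $(( 3 * 11 * 10 ))   # 3 speakers x 11 receivers x ~10 kB/s ≈ 330 kB/s upstream from the server
    echo $(( 3 * 11 * 50 ))   # worst case at 50 kB/s per speaker ≈ 1650 kB/s ≈ 13 Mbit/s

Even the worst case stays in the low tens of Mbit/s, which is why WiFi works but isn’t something I’d bank on.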


So, I’ve been having issues with voice chat on Discord and I’m looking for alternatives. In my search I came across Mumble, here. Does anyone here have experience with, or information regarding, Mumble, or know a better alternative to Discord with lower latency? Is it relatively easy to set up? Is it safe? Any advice and help is greatly appreciated.
Been running a server for my friends for over a decade now. Can recommend. It’s just one apt-get to set up, runs on a Pi Zero for a dozen people, has clients available for pretty much any platform and doesn’t really require any maintenance. Latency will depend on the routing between you and your friends’ ISPs, of course, but the whole purpose of the software is to provide a low-latency voice chat server for gaming.
But: that’s it. You don’t get anything else. It’s a barebones voice chat server. You can set up rooms and have basic text functionality, but you don’t get any fancy user management, no full-fledged chatrooms, no persistence beyond the room setup and only limited backend options. Keep that in mind.
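For reference, the “one apt-get” really is about all there is to it on a Debian-flavoured system (package name may differ on other distros):

    sudo apt install mumble-server
    sudo dpkg-reconfigure mumble-server   # optional debconf wizard: autostart, process priority, SuperUser password

After that, clients just connect to your IP or hostname on the default port (64738).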
I have no way to gainfully apply this in my life but I am glad you made it!


Basically what the title says. I know online providers like GPTzero exist, but when dealing with sensitive documents I would prefer to keep it in-house. A lot of people like to talk big about open source models for generating stuff, but the detection side isn’t discussed nearly as much, I feel.
I wonder if this kind of local capability can be stitched into a browser plugin. Hell, doesn’t even need to be a locally hosted service on my home network. Local app on-machine should be fine. But being able to host it as a service to use from other machines would be interesting.
I’m currently not able to give it a proper search, but the first-glance results are either from people trying to evade these detectors or from people trying to locally host language models.
In general it’s a fool’s errand, I’m afraid. What’s the specific context in which you’re trying to apply this?
I read about Ollama, but it’s all unclear to me.
There’s really nothing more to it than the initial instructions tell you. Literally just a “curl -fsSL https://ollama.com/install.sh | sh”. Then you’re just an “ollama run qwen3:14b” away from having a chat with the model in your terminal.
That’s the “chat with it” part done.
After that you can make it more involved: serve the model via its API, manually add .gguf quantizations (usually smaller or special-purpose modified bootleg versions of big published models) to your Ollama library with a Modelfile, ditch Ollama altogether for a different environment or, the big upgrade, give your chats a shiny frontend in the form of Open-WebUI.
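To illustrate the API and the .gguf route (model and file names here are just examples):

    # the Ollama service listens on localhost:11434 by default
    curl http://localhost:11434/api/generate -d '{"model": "qwen3:14b", "prompt": "Why is the sky blue?"}'

    # import a downloaded .gguf quantization into your library via a Modelfile
    echo 'FROM ./some-quantized-model.gguf' > Modelfile
    ollama create my-local-model -f Modelfile
    ollama run my-local-model

Open-WebUI then simply points at that same API endpoint.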


Nothing beats skipping from Bach to TMNT theme tune
I agree with that sentiment, but that’s not what happens at all. It’s especially funny since you explicitly mention Bach. My damn “Bach, Johann Sebastian” artist folder contains 226 different albums. Albums, not songs. And boy, that guy wrote some stinkers, too.
I mean, I guess I could roughly see the system working if you had the same number of songs for every artist, that would somewhat balance it. Otherwise your playlist will always be dominated by the prolific writers and you’ll get a few dozen Händel concertos and a handful of random Zelda dungeon sounds before the next TMNT theme tune plays.
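If you did want to shuffle the whole library without the prolific composers drowning everything out, you’d have to pick the artist first and only then a track, something like this (assuming a Music/<Artist>/… folder layout, .flac purely as an example extension):

    artist=$(find Music -mindepth 1 -maxdepth 1 -type d | shuf -n 1)   # every artist equally likely
    find "$artist" -type f -name '*.flac' | shuf -n 1                  # then a uniformly random track by them

Plain shuffle, by contrast, weights every artist by their track count.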


What kind of mad person shuffles their whole collection? Do you preemptively purge all albums/artists of the songs you don’t like before adding them to your collection? 😮


Curious to know what the experiences are for those who are sticking to bare metal. Would like to better understand what keeps such admins from migrating to containers, Docker, Podman, Virtual Machines, etc. What keeps you on bare metal in 2025?
If it ain’t broke, don’t fix it 🤷
Apples and oranges.
Package managers only install a package with defaults. These helper scripts are designed to take the user through a final config that isn’t provided by the package defaults.
Whether there’s a setup wizard doesn’t have anything to do with whether the tool comes from a package manager or not. Run “apt install ddclient”, for example, and it’ll immediately guide you through all configuration steps for the program instead of just dumping a binary and some config text files into /etc/.
So that’s not the bottleneck or contradiction here. It’s just very unfortunate that setup wizards aren’t very popular as soon as you leave the Windows and OSX ecosystems.
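To stick with that example (the exact debconf questions vary by package version, so take the specifics with a grain of salt):

    sudo apt install ddclient        # debconf immediately asks for your dynamic DNS provider, credentials and interface
    sudo dpkg-reconfigure ddclient   # re-runs the same wizard later if you want to change your answers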
There’s literally no good reason to replace it with a shell script on a website.
I fully agree that a package manager repository with all those tools would be preferable, but it doesn’t exist, does it? I mean… content is king. If the only way to get a certain program or functionality is a shell script on a website, then of course that’s what is going to be used.
Oh, if only I had that kind of power :D
Imagine arguing that ‘solutions’ like NAT444 aren’t broke as fuck
Well… yeah, why wouldn’t that be “broke as fuck”?
Basically too many professionals who haven’t learned a new technology since 2005 and refuse to try new things keep holding the world back
If it ain’t broke…
Probably talking about a thighfuck, pretty standard
I guess that’s plausible. “Hold” made it sound rather static though.
yeah BUT some knowledge you regret learning, and oþer people’s kinks teeter dangerously on þat brink.
While I was mainly trying to be vaguely poetic in a silly context, I can honestly not think of anything that I’ve ever regretted learning and I likewise can’t think of anything I wouldn’t want to know.
At best I would factor opportunity costs into it, but that still leaves me with only wanting to learn some things more than other things.
Learning about random internet-users’ kinks…? I guess there are worse ways to spend a Friday :D
Never let that toddler’s joy and wonder, the joy of learning something new and then suddenly understanding one more aspect of the big and complicated world around you and all the other people living in it, fade away.
Here’s hoping OP will chime in and lift our veil of ignorance 🤷
when in doubt, assume it’s a sex thing. Seems to work with basically all of these “I LIED, instead of [x] we do [y]”.
Oh, I have no doubt in my mind that it’s a sex thing, but I’m clearly lacking either the imagination, experience or cultural background to solve for [x]. It seems awfully specific, that’s why I’m asking.
Thank you for sharing this!