It would. But it’s a good option when you have computationally heavy tasks and communication is relatively light.
Once configured, Tor Hidden Services also just work (though you may need fresh bridges in countries where ISPs block Tor). You don’t have to trust any specific third party in this case.
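A minimal hidden-service setup really is just two lines in torrc; the directory and ports below are illustrative, not from the original comment:

```
# /etc/tor/torrc - expose a local web server as a hidden service
# (directory and ports are example values; adjust to your setup)
HiddenServiceDir /var/lib/tor/my_service/
HiddenServicePort 80 127.0.0.1:8080
```

After restarting Tor, the generated .onion address appears in the `hostname` file inside the service directory.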
Audalin@lemmy.world to Technology@lemmy.world • Researchers claim GPT-4 passed the Turing test (English)
16 · 1 year ago
If config prompt = system prompt, its hijacking works more often than not. The creators of a prompt injection game (https://tensortrust.ai/) have discovered that system/user roles don’t matter too much in determining the final behaviour: see appendix H in https://arxiv.org/abs/2311.01011.
Audalin@lemmy.world to Technology@lemmy.world • Chrome: 72 hours to update or delete your browser. (English)
161 · 2 years ago
xkcd.com is best viewed with Netscape Navigator 4.0 or below on a Pentium 3±1 emulated in Javascript on an Apple IIGS at a screen resolution of 1024x1. Please enable your ad blockers, disable high-heat drying, and remove your device from Airplane Mode and set it to Boat Mode. For security reasons, please leave caps lock on while browsing.
Audalin@lemmy.world to Technology@lemmy.world • Chrome: 72 hours to update or delete your browser. (English)
47 · 2 years ago
CVEs are constantly found in complex software; that’s why security updates are important. If not these, it would have been others a couple of weeks or months later. And government users can’t exactly opt out of security updates, even if they come with feature regressions.
You also shouldn’t keep using software with known vulnerabilities. You can find a maintained fork of Chromium with continued Manifest V2 support, or choose another browser like Firefox.
Audalin@lemmy.world to Technology@lemmy.world • ‘Let yourself be monitored’: EU governments to agree on Chat Control with user “consent” [updated] (English)
2 · 2 years ago
Very cool and impressive, but I’d rather be able to share arbitrary files.
And it looks like you can only send images in DMs, not in groups/forums.
Audalin@lemmy.world to Selfhosted@lemmy.world • Any of you have a self-hosted AI "hub"? (e.g. for LLM, stable-diffusion, ...) (English)
5 · 2 years ago
If your CPU isn’t ancient, it’s mostly about memory speed. VRAM is very fast, DDR5 RAM is reasonably fast, and swap is slow even on a modern SSD.
8x7B is Mixtral, yeah.
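The "memory speed" point can be made concrete with a back-of-the-envelope bound: single-batch generation has to stream roughly the whole model through memory per token, so throughput is capped by bandwidth divided by model size. The bandwidth figures below are illustrative assumptions, not measurements:

```python
# Rough upper bound on single-batch LLM generation speed: each token
# requires reading (approximately) every model weight once, so
# tok/s <= memory bandwidth / model size.

def max_tokens_per_second(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Bandwidth-bound ceiling on generation throughput."""
    return bandwidth_gb_s / model_size_gb

# Example figures (assumed, not measured): a GPU with ~500 GB/s VRAM
# versus dual-channel DDR5 at ~60 GB/s, for an 8B model at ~5 GB.
vram_ceiling = max_tokens_per_second(500, 5.0)
ddr5_ceiling = max_tokens_per_second(60, 5.0)
print(f"VRAM ceiling: ~{vram_ceiling:.0f} tok/s, DDR5 ceiling: ~{ddr5_ceiling:.0f} tok/s")
```

Real numbers come in below these ceilings, but the ratio between them explains why offloading layers to RAM hurts so much.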
Audalin@lemmy.world to Selfhosted@lemmy.world • Any of you have a self-hosted AI "hub"? (e.g. for LLM, stable-diffusion, ...) (English)
7 · 2 years ago
Mostly via terminal, yeah. It’s convenient when you’re used to it - I am.
Let’s see, my inference speed now is:
- ~60-65 tok/s for an 8B model in Q5_K/Q6_K (entirely in VRAM);
- ~36 tok/s for a 14B model in Q6_K (entirely in VRAM);
- ~4.5 tok/s for a 35B model in Q5_K_M (16/41 layers in VRAM);
- ~12.5 tok/s for an 8x7B model in Q4_K_M (18/33 layers in VRAM);
- ~4.5 tok/s for a 70B model in Q2_K (44/81 layers in VRAM);
- ~2.5 tok/s for a 70B model in Q3_K_L (28/81 layers in VRAM).
As for quality, I try to avoid quantisation below Q5, or at least Q4. I also don’t see any point in using Q8/f16/f32 - the difference from Q6 is minimal. Other than that, it really depends on the model - for instance, llama-3 8B is smarter than many older 30B+ models.
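To see what those quant levels mean for file size (and hence what fits in VRAM), you can estimate size from parameter count and bits per weight. The bits-per-weight figures below are ballpark values for llama.cpp GGUF quant types; exact numbers vary slightly with the tensor mix:

```python
# Approximate bits-per-weight for common llama.cpp quantisation types
# (ballpark figures; the exact average depends on the model's tensor mix).
BITS_PER_WEIGHT = {
    "Q2_K": 2.6,
    "Q4_K_M": 4.8,
    "Q5_K_M": 5.7,
    "Q6_K": 6.6,
    "Q8_0": 8.5,
    "F16": 16.0,
}

def gguf_size_gb(n_params_billion: float, quant: str) -> float:
    """Estimated model file size in GB: params * bits-per-weight / 8."""
    return n_params_billion * BITS_PER_WEIGHT[quant] / 8

for q in ("Q4_K_M", "Q5_K_M", "Q6_K", "F16"):
    print(f"8B model at {q}: ~{gguf_size_gb(8, q):.1f} GB")
```

This is why Q5/Q6 is a sweet spot on a 16GB card: an 8B model fits entirely in VRAM with room for context, while F16 would not.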
Audalin@lemmy.world to Selfhosted@lemmy.world • Any of you have a self-hosted AI "hub"? (e.g. for LLM, stable-diffusion, ...) (English)
10 · 2 years ago
Have been using llama.cpp, whisper.cpp and Stable Diffusion for a long while (most often the first one). My “hub” is a collection of bash scripts and a running SSH server.
I typically use LLMs for translation, interactive technical troubleshooting, advice on obscure topics, sometimes coding, sometimes mathematics (though local models are mostly terrible for this), sometimes just talking. Also music generation with ChatMusician.
I use the hardware I already have - a 16GB AMD card (using ROCm) and some DDR5 RAM. ROCm might be tricky to set up for various libraries and inference engines, but then it just works. I don’t rent hardware - don’t want any data to leave my machine.
My use isn’t intensive enough to warrant measuring energy costs.
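A "collection of scripts plus SSH" hub can be as small as a wrapper that assembles a llama.cpp command line. A minimal sketch, where the binary and model paths are hypothetical placeholders:

```python
# Minimal sketch of a local hub script that shells out to llama.cpp.
# LLAMA_CLI and MODEL are hypothetical paths -- point them at your own
# build and GGUF file.
import subprocess

LLAMA_CLI = "/opt/llama.cpp/build/bin/llama-cli"   # hypothetical path
MODEL = "/models/llama-3-8b-instruct-Q6_K.gguf"    # hypothetical path

def build_cmd(prompt: str, n_predict: int = 256) -> list[str]:
    """Assemble a one-shot llama.cpp generation command."""
    return [
        LLAMA_CLI,
        "-m", MODEL,
        "-p", prompt,
        "-n", str(n_predict),
        "-ngl", "99",  # offload as many layers to the GPU as possible
    ]

def ask(prompt: str) -> str:
    """Run generation and return raw stdout (requires a local llama.cpp build)."""
    result = subprocess.run(build_cmd(prompt), capture_output=True, text=True, check=True)
    return result.stdout
```

Invoked over SSH, this gives remote access to the models without exposing any web service.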
Audalin@lemmy.world to Technology@lemmy.world • Why mathematics is set to be revolutionized by AI (English)
6 · 2 years ago
The article isn’t about automatic proofs, but it’d be interesting to see an LLM that can write formal proofs in Coq/Lean/whatever and call external computer algebra systems like SageMath or Mathematica.
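The appeal of targeting a proof assistant is that output can be machine-checked. A trivial Lean 4 example of the kind of artifact an LLM would have to produce (this fact is already in the library as `Nat.add_comm`; it’s restated purely for illustration):

```lean
-- A machine-checkable Lean 4 theorem: addition on naturals commutes.
-- Restated for illustration; the library already provides Nat.add_comm.
theorem add_comm' (a b : Nat) : a + b = b + a := Nat.add_comm a b
```

If the proof term is wrong, the kernel rejects it - so hallucinated proofs can’t slip through the way hallucinated prose does.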
Disabling root login and password auth, using a non-standard port, and updating regularly works for me for this exact use case.
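Those first three measures map to a handful of sshd_config directives; the port number here is an arbitrary example:

```
# /etc/ssh/sshd_config - hardening for an internet-facing host
Port 2222                        # non-standard port (pick your own)
PermitRootLogin no               # no direct root login
PasswordAuthentication no        # key-based auth only
KbdInteractiveAuthentication no  # no interactive password fallback
```

Make sure your key works before closing the existing session, and remember to reload sshd for the changes to take effect.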
Audalin@lemmy.world to Selfhosted@lemmy.world • Welcome to !selfhosted@lemmy.world - What do you selfhost? (English)
3 · 3 years ago
I have a MediaWiki instance on my laptop (I’ve found the features of all other wikis/mindmaps/knowledge databases decisively insufficient after having a taste of MW templates, Semantic MediaWiki and Scribunto).
Also some smaller things like pihole-standalone, Jellyfin and dictd.
Audalin@lemmy.world to Asklemmy@lemmy.ml • How many users here do you think are going to get bored and end up back on Reddit as soon as the blackout ends?
4 · 3 years ago
What will make me return to reading Reddit is all those old posts and comments on very specific topics. It doesn’t mean I’ll stop using Lemmy (especially if some communities I follow migrate here entirely), but there’s no proper replacement for Reddit yet.
Also, I’ve found no app for Lemmy that works on Android 7. For Reddit, I’m using Stealth: it’s incredibly useful that you can create multiple pseudo-accounts with different subscription lists and saved posts without ever logging in or having an account. The API changes are a sad development, but Stealth has an option to work by scraping old.reddit.com - unless they happen to delete it, of course.
Have been using Neo Launcher since it had the features I needed from Nova (mostly hiding apps from the app list while keeping them in a folder on the home screen, so it isn’t a mess when you want to find something specific). It hasn’t been updated in a while, but it works perfectly fine for me.