The Picard Maneuver@piefed.world to Microblog Memes@lemmy.world · 23 hours ago: Did you know GIMP could do that instead? (media.piefed.world)
Lazycog@sopuli.xyz: Where can I find open source friends? I’d like to compile them myself
You can fork them too 😏
And pull them too.
sp3ctr4l@lemmy.dbzer0.com: OpenLlama. Alpaca. Run a local friend model, today!
I… actually futzed around with this just to see if it would work, and… yeah, there actually are models that will run on a Steam Deck, with Bazzite.
EDIT: Got my angry spitting long-necked quadrupeds mixed up.
Alpaca is a flatpak, literally could not be easier to set up a local LLM.
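(For anyone wanting to try that: a minimal setup sketch, assuming the Flathub app ID is still com.jeffser.Alpaca:

    flatpak install flathub com.jeffser.Alpaca

then download whichever local model you want from inside the app.)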
Midnight Wolf@lemmy.world: I’m already available as an executable, a few known bugs, but they won’t accept pull requests for patches on oneself; something about eternal life or something.
I mean uhhhhhh beep boop.
Leon@pawb.social: Ah neat, so I can just curl you?
🥺👉👈
Calm down, Dr. Frankenstein