• 2 Posts
  • 80 Comments
Joined 1 year ago
cake
Cake day: June 12th, 2023




  • The first one kinda works, but I think it’d be more clear, when used without “selbst”/self, as this would be read to reference the invention instead of the inventor.

    On the other hand, that then feels like “yeah, it didn’t work. The invention misfired and is crap”. Maybe “Erfindungserschafferzerstörer”? (Invention’s creator destructor) but that sounds off, too.

    There’s not really a word that I can come up with that really conveys this meaning. There’s a german saying “wer Andern eine Grube gräbt, fällt selbst hinein” (he, who digs a hole for others, will fall into it by itself). Then there’s the humorous “Rohrkrepierer” (along the lines of “died in the barrel”) which basically means something like “dead on arrival” / that went wrong and didn’t work. So it’d be probably something that references one of those, which would make it work culturally?













  • elvithtoComic Strips@lemmy.worldNo scurvy here
    arrow-up
    25
    arrow-down
    1
    ·
    6 months ago
    link
    fedilink

    When life gives you lemons, don’t make lemonade. Make life take the lemons back! Get mad! I don’t want your damn lemons, what the hell am I supposed to do with these? Demand to see life’s manager! Make life rue the day it thought it could give Cave Johnson lemons! Do you know who I am? I’m the man who’s gonna burn your house down! With the lemons! I’m gonna get my engineers to invent a combustible lemon that burns your house down!

    Cave Johnson



  • elvithtoTechnology@lemmy.worldSomebody managed to coax the Gab AI chatbot to reveal its promptEnglish
    arrow-up
    38
    arrow-down
    0
    ·
    6 months ago
    edit-2
    6 months ago
    link
    fedilink

    Yeah, basically you have three options:

    1. Create and train your own LLM. This is hard and needs a huge amount of training data, hardware,
    2. Use one of the available models, e.g. GPT-4. Give it a special prompt with instructions and a pile of data to get fine tuned with. That’s way easier, but you need good training data and it’s still a medium to hard task.
    3. Do variant 2, but don’t fine-tune the model and just provide a system prompt.