ugjkato to Technology@lemmy.world · English · 6 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
297 comments
Natanael · English · 6 months ago
Regular humans and old-school encyclopedias have been allowed to lie with very few restrictions ever since free speech laws were passed. While it would be a nice idea, it's not likely to happen.