mozz@mbin.grits.dev to Technology@beehaw.org · 7 months agoSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devimagemessage-square201fedilinkarrow-up1487arrow-down10file-text
arrow-up1487arrow-down1imageSomeone got Gab's AI chatbot to show its instructionsmbin.grits.devmozz@mbin.grits.dev to Technology@beehaw.org · 7 months agomessage-square201fedilinkfile-text
minus-squaresweng@programming.devlinkfedilinkarrow-up3·7 months agoThe point is that the second LLM has a hard-coded prompt
minus-squaremeteokr@community.adiquaints.moelinkfedilinkarrow-up1·7 months agoI don’t think that can exist within the current understanding of LLMs. They are probabilistic, so nothing is 0% or 100%, and slight changes to input dramatically change the output.
The point is that the second LLM has a hard-coded prompt
I don’t think that can exist within the current understanding of LLMs. They are probabilistic, so nothing is 0% or 100%, and slight changes to input dramatically change the output.