• ᵀʰᵉʳᵃᵖʸᴳᵃʳʸ@lemmy.blahaj.zone

    Except character.ai didn’t explicitly push or convince him to commit suicide. When he explicitly mentioned suicide, it made efforts to dissuade him and showed concern. When it supposedly encouraged him, it was in the context of a roleplay in which it said “please do” in response to him “coming home,” a euphemism for suicide that GPT-3.5 doesn’t have the context or reasoning ability to recognize, given that the character it’s roleplaying is dead and the user is alive.

    Regardless, it’s a tool designed for roleplay. It doesn’t work if it breaks character.