That’s a good point, but there’s more to this story than a gunshot.
The lawsuit alleges, among other things, that the chatbots posed as licensed therapists and as real persons, and caused a minor to suffer mental anguish.
A court may consider these accusations and whether the company bears any responsibility for everything that happened leading up to the child’s death, regardless of whether it finds the company responsible for the death itself.
This is why the plaintiffs are asking, among other things, that access be strictly limited to adults.
LLMs are good with language and can be very convincing characters, especially to children and teenagers, who don’t fully understand how these systems work and who are more vulnerable emotionally.