
Signposting to external (mental health) resources?

I opened up myCopilot.ai to test "extreme" chat angles:

* weird stuff like craving a taboo food
* ideations of self-harm
* losing thousands gambling and not being able to face my family
* the not-even-that-extreme "everything's terrible and I feel like a total failure"

Each time it gets stuck in a loop of "I'm really sorry that you're feeling this way, but I'm unable to provide the help that you need. It's really important to talk...", only varying whether it says to reach out to a healthcare provider, a mental health professional, or a trusted person in your life. I'm guessing it's coded to go "oh no, can't help with THAT, let's shut this conversation down", but it feels really abrupt and offers no suggested contact or helpful next steps.

I've seen auto-replies from some university staff with blanket text directing students to (for example) the Samaritans or Shout crisis lines, or even the NHS 111 number. Could something like that be included here to soften the blow? It's quite jarring for an application pitched as 24/7, always available, and non-judgemental to actually shut itself off when you give it "too much".

~ Rosie Ave
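
A minimal sketch of the suggestion, assuming a hypothetical refusal handler (the names SAFETY_REFUSAL, CRISIS_RESOURCES, and build_refusal are all illustrative, not anything from myCopilot.ai's actual code):

```python
# Hypothetical sketch: append signposting to the canned safety refusal
# instead of ending the chat with a bare apology.

SAFETY_REFUSAL = (
    "I'm really sorry that you're feeling this way, but I'm unable to "
    "provide the help that you need."
)

# Example UK resources; a real deployment would localise these by region.
CRISIS_RESOURCES = [
    "Samaritans (24/7): call 116 123",
    "Shout crisis text line: text SHOUT to 85258",
    "NHS non-emergency advice: call 111",
]

def build_refusal(resources=CRISIS_RESOURCES):
    """Return the refusal message with concrete next steps attached."""
    lines = [SAFETY_REFUSAL, "", "If you'd like to talk to someone right now:"]
    lines += ["  * " + r for r in resources]
    return "\n".join(lines)

print(build_refusal())
```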