When algorithms offer comfort: what does that mean for our psychological wellbeing?
INTITEC researcher Lisa was interviewed by Lena Fuhrmann from Deutschlandradio / Deutschlandfunk to explore this question. They discussed why people turn to chatbots when facing emotional challenges and where the risks lie when AI systems step into the role of emotional support.
🎧 Listen to the interview (in German) here 👉 https://lnkd.in/eCvyRrBG
Why this matters: OpenAI recently announced updates to how ChatGPT responds in emotionally vulnerable conversations, claiming better detection of emotional dependence and improved support during distress (https://lnkd.in/e_E66HHM). Based on these changes, OpenAI states that "serious mental health issues" have been addressed and, as a result, plans to allow erotic interactions in ChatGPT. But is that confidence justified?
Steven Adler, who led product safety at OpenAI, raises exactly this concern in his recent New York Times opinion piece, questioning whether these issues have truly been addressed and calling for transparency and proof (read the article here: https://lnkd.in/e7EEvtsB).