Welcome to Caint!

Issues? Post in Comments & Feedback
You can now view, reply, and favourite posts from the Fediverse. You can click here or use the link on the navigation bar on the left.
  • 0 Votes
    2 Posts
    0 Views
    Christian Stöcker
    Personally, I consider that the bare minimum. Who is going to sue over here? https://www.spiegel.de/wissenschaft/mensch/ki-trainingsdaten-hat-metas-llama-meine-buecher-gelesen-kolumne-a-80365448-7759-45b2-8e7e-b479ca3853a9?giftToken=7dc757c7-b8fd-49a2-b00e-edd1882fbbac
  • 0 Votes
    1 Post
    5 Views
    Miguel Afonso Caetano
    "Claude’s update relies on a striking pop-up with a large, black "Accept" button. The data sharing toggle is tucked away, switched on by default, and framed positively ("You can help..."). A faint "Not now" button and hard-to-find instructions on changing the setting later complete the manipulative design.These interface tricks, known as dark patterns, are considered unlawful under the General Data Protection Regulation (GDPR) and by the European Court of Justice when used to obtain consent for data processing. Pre-checked boxes do not count as valid consent under these rules.The European Data Protection Board (EDPB) has also stressed in its guidelines on deceptive design patterns that consent must be freely given, informed, and unambiguous. Claude’s current design clearly fails to meet these standards, making it likely that Anthropic will soon draw the attention of privacy regulators."https://the-decoder.com/anthropic-uses-a-questionable-dark-pattern-to-obtain-user-consent-for-ai-data-use-in-claude/#EU #AI #GenerativeAI #Anthropic #LLMs #Chatbots #Claude #DarkPatterns #Privacy #DataProtection