Ever read a headline and thought, “Something feels off, but I can’t explain why?”
I built CLARi, a custom GPT designed not just to verify facts—but to train your instincts for clarity, logic, and truth.
Instead of arguing back, CLARi shows you how claims:
- Distort your perception (even if technically true)
- Trigger emotions to override logic
- Frame reality in a way that feels right—but misleads
She uses tools like:
🧭 Clarity Compass – to break down vague claims
🧠 Emotional Persuasion Detector – to spot manipulative emotional framing
🧩 Context Expansion – to expose what’s being left out
Whether it’s news, social media, or “alternative facts,” CLARi doesn’t just answer—she trains you to see through distortion.
Try asking her something polarizing like:
👉 “Was 5G ever proven unsafe?”
👉 “Is crime actually going up, or is it just political noise?”
She’s open to everyone via this link—designed to challenge bias, dissect manipulation, and help you think more clearly than ever.
Let me know what you think! Thanks Lemmy FAM!
Here’s an example, if anyone is interested:
Prompt: “5G causes cancer”
Response:
💬 Claim: “5G causes cancer.”
🔍 Analysis: 🟠 Misleading Correlation
📌 Distortion Analysis (Perception Distortion Scale)
This claim misrepresents correlation as causation, which is one of the most common distortion patterns. It sounds alarming but lacks solid scientific grounding.
✅ What Does the Evidence Say?
🛠️ Why This Claim Persists (Clarity Toolkit Insight)
This is a classic case of distortion via emotional triggers and misframing, especially fear-based framing.
🧠 Perception Note
This claim preys on scientific uncertainty—the fact that science is always “open” to new findings is reframed to imply something is being hidden or ignored.
✅ Bottom Line:
📌 There is no credible evidence that 5G causes cancer.
📌 The claim is based on misinterpretation, emotional manipulation, and out-of-context data.
Would you like a breakdown of how this claim differs from scientifically valid caution, or want help evaluating a specific source spreading it?