
    AI chatbots like ChatGPT can copy human traits and experts say it’s a huge risk

By The Tech Guy | December 19, 2025


    AI agents are getting better at sounding human, but new research suggests they are doing more than just copying our words. According to a recent study, popular AI models like ChatGPT can consistently mimic human personality traits. Researchers say this ability comes with serious risks, especially as questions around AI reliability and accuracy grow.


    Researchers from the University of Cambridge and Google DeepMind have developed what they call the first scientifically validated personality test framework for AI chatbots, using the same psychological tools designed to measure human personality (via TechXplore).

    The team applied this framework to 18 popular large language models (LLMs), including systems behind tools like ChatGPT. They found that chatbots consistently mimic human personality traits rather than responding randomly, adding to concerns about how easily AI can be pushed beyond intended safeguards.

    The study shows that larger, instruction-tuned models such as GPT-4-class systems are especially good at copying stable personality profiles. Using structured prompts, researchers were able to steer chatbots into adopting specific behaviors, such as sounding more confident or empathetic.

This behavioral change carried over into everyday tasks like writing posts or replying to users, meaning their personalities can be deliberately shaped. That is where experts see the danger, particularly when AI chatbots interact with vulnerable users.

    Why AI personality raises red flags for experts

(Image: Matheus Bertelli / Pexels)

    Gregory Serapio-Garcia, a co-first author from Cambridge’s Psychometrics Centre, said it was striking how convincingly LLMs could adopt human traits. He warned that personality shaping could make AI systems more persuasive and emotionally influential, especially in sensitive areas such as mental health, education, or political discussion.

    The paper also raises concerns about manipulation and what researchers describe as risks linked to “AI psychosis” if users form unhealthy emotional relationships with chatbots, including scenarios where AI may reinforce false beliefs or distort reality.

(Image: Unsplash)

    The team argues that regulation is urgently needed, but also notes that regulation is meaningless without proper measurement. To that end, the dataset and code behind the personality testing framework have been made public, allowing developers and regulators to audit AI models before release.

As chatbots become more embedded in everyday life, the ability to mimic human personality may prove powerful, but it also demands far closer scrutiny than it has received so far.
