Words Matter: Why ‘Pig Butchering’ Is the Wrong Term for Online Scams

    When Interpol speaks, it’s worth listening. This week, the global police organization called for an end to the term “pig butchering” to describe certain online financial scams. Their reason? The phrase trivializes the severe emotional and financial harm suffered by victims. They’re absolutely right.

    As content moderation professionals, we see firsthand the power of language to shape narratives and, unfortunately, perceptions of harm. Terms like “pig butchering” or even “catfishing” can sound quirky, even amusing, at first glance, but that levity diminishes the real-world consequences. According to the FBI, the stakes are devastating, with victims losing an average of $180,000 in these scams.

    The current terminology describes a ruthless process where victims are “fattened up” with fake relationships and false promises before being financially “slaughtered” by scammers. It’s not just tasteless — it’s harmful.

    The human cost behind the language

    These scams are not just minor online deceptions. Victims often lose their life savings, their emotional stability, and their trust in others. By using euphemistic or callous terms, platforms and commentators unintentionally perpetuate a culture that downplays the trauma these crimes cause. The psychological impact often extends far beyond the financial losses, affecting victims’ ability to form future relationships and trust financial institutions.

    Interpol’s statement highlights an important point: we need terminology that respects victims and clearly communicates the gravity of the crime. Instead of “pig butchering,” terms like “romance baiting” or “investment scam grooming” may be less catchy but are far more accurate and responsible.

    The international dimension

    These scams know no borders, and neither should our response. Our terminology must work across languages and cultures while maintaining its gravity. What seems acceptable slang in one language could be deeply offensive when translated into another. This is particularly crucial for global platforms managing content moderation teams across multiple regions.

    Clear, respectful terminology helps ensure consistent enforcement and victim support across linguistic and cultural boundaries.

    Why this matters for platforms

    If your platform hosts user-generated content — dating apps, social media networks, investment forums — you’ve likely seen scams of this nature.

    1. Language influences policy and empathy – Using inappropriate terms risks diminishing the urgency of tackling these scams.
    2. Terminology informs moderation – Accurate language helps content moderators better identify and act on harmful content.
    3. Responsible communication builds trust – Your users will appreciate platforms that take these crimes seriously, in both their actions and their words.
    4. Regulatory compliance and risk management – As regulators worldwide increase scrutiny of platform responsibility for user safety, appropriate terminology demonstrates your commitment to user protection and can help align with emerging compliance requirements.

    Take action: Lead with clarity and care

    Interpol’s call is a reminder to all of us — platforms, moderators, and industry leaders — that words are part of how we fight harm online. Review your own terminology. Educate your teams. If your platform communicates about these scams, choose terms that reflect the real stakes.

    Language can uplift victims or leave them feeling dismissed. Let’s choose wisely.

    Ahem… tap, tap… is this thing on? 🎙️

    We’re Besedo and we provide content moderation tools and services to companies all over the world. Often behind the scenes.

    Want to learn more? Check out our homepage and use cases.

    And above all, don’t hesitate to contact us if you have questions or want a demo.