TrustCon 2025: Why Every Platform Must Rethink Trust and Safety Now

10 August 2025

By Ling Quek
VP of Trust & Safety (Global Client Solutions)
 
At TrustCon 2025, one of the largest global gatherings of trust and safety professionals, one message stood out. Research has shown that customers who trust a brand are 88% more likely to buy again, a clear signal that trust is not just a feel-good metric but a measurable growth driver. The traditional model of content moderation and basic fraud checks is no longer enough. Businesses must now take a more strategic, integrated approach, combining AI, human expertise, and cross-functional design to build safer digital experiences and drive long-term success.

Here are the key takeaways from the event, and what they mean for platforms today.

1. Trust and safety has outgrown the back office.

As Charlotte Willner, Executive Director of the TSPA (Trust & Safety Professional Association) put it, “Our job is like looking into a flashlight that tells us bad news.” T&S professionals continue to operate in high-pressure environments, often with shrinking teams and rising expectations. Businesses cannot afford to underinvest in safety infrastructure just as threats become more complex and public scrutiny intensifies. Platforms need scalable, resilient trust and safety services — not just to reduce harm, but to retain users and meet growing compliance demands.

2. Collaboration is the new advantage.  

One of the most talked-about moments came when Roblox announced it was open-sourcing its voice moderation AI. This move reflects a larger shift in the T&S community toward transparency and collaboration. Rather than treating safety capabilities as proprietary, more companies are now sharing tools and best practices.

This spirit of openness benefits everyone. A voice moderation model built for a gaming platform, for example, could help a marketplace monitor for verbal harassment or abuse. Working with partners who participate in this ecosystem can accelerate innovation and improve safety outcomes across industries.

3. Regulation is reshaping platform responsibility.  

Regulatory discussions were prominent at TrustCon 2025. Government representatives from Singapore, the UK, Australia, and Ireland all highlighted a common priority: protecting minors through better age assurance.

Australia’s upcoming regulation will require social media platforms to block users under 16 by the end of 2025. Yet, as Edward Wee, Director of Online Safety & Content Regulation at Singapore’s Infocomm Media Development Authority, noted, this is a difficult goal. Research shows that around 60% of children on social platforms falsify their age. Therein lies the challenge: how do we protect these users if we don’t even know who they are?

This shift will impact every platform. Meeting these new standards will require stronger fraud detection and prevention systems, smarter onboarding flows, and privacy-conscious age verification tools.

Early movers will benefit — not just by avoiding penalties, but by building ecosystems that attract users who value safety and transparency.

4. Moderator well-being is business-critical.  

During workshop sessions, we discussed a common flaw in wellness programs: too often they are a “check the box” exercise that moderators are made to attend, turning them into a chore rather than something that genuinely improves their professional or personal well-being.

Truly effective well-being support needs to be embedded into operations, not tacked on as an afterthought. This includes thoughtful workflow design, meaningful mental health support, and clear escalation frameworks.

At TDCX, we’ve seen the value of meaningful wellness programs firsthand. In a project for a social media platform supporting accounts across Southeast Asia, East Asia, and South Asia, we delivered over 90% quality scores, maintained less than 20% voluntary attrition, and improved moderator well-being through tailored interventions.

For companies relying on outsourcing, this is a critical insight. Operational excellence depends not only on speed or scale, but on sustaining the humans behind the work.

5. AI is reshaping roles, not replacing them.  

The future of content moderation services is an increasingly hybrid one. AI systems are already automating straightforward cases, such as detecting nudity or hate speech. Roblox shared that its automated moderation tools now outperform humans on repeatable tasks.

But AI doesn’t replace people; it changes their roles. Humans are still essential for edge cases, ethical decisions, and crises that require empathy and context.

Forward-looking platforms are already hiring AI policy managers and AI product managers to govern how AI is used. These roles ensure alignment with safety, privacy, and regulatory expectations.

TDCX’s AI solutions are designed around this hybrid approach. We deploy machine learning models for speed and scale, while our expert teams handle complex and sensitive issues, especially in areas such as fraud prevention and child safety.

This is the future: humans and AI, working together to deliver safer digital spaces at scale.

6. Trust is a growth driver.  

Across industries, one truth is clear: Safety and integrity drive user behavior. In e-commerce, over half of consumers won’t buy if they suspect fake reviews. In FinTech, users will abandon an app they believe is vulnerable to scams. In marketplaces, unchecked content risks reputational damage.

Trust isn’t just a regulatory issue. It’s a customer experience issue, and, ultimately, a business issue.

That’s why leading companies are investing in full-stack trust and safety services — from onboarding to escalation, from policy to operations. And they’re seeing results: higher retention, stronger brand equity, and faster growth.

Final word: Trust is the new baseline

The message from TrustCon 2025 is clear. Building trust isn’t about avoiding the next PR crisis. It’s about building platforms that users want to stay on, regulators want to support, and partners want to grow with.

If you’re leading a platform, now is the time to act. Safety must be embedded, not bolted on. And that means working with partners who can deliver more than headcount or automation alone.

Ready to future-proof your trust and safety strategy? Let’s talk about how to build safer, stronger platforms — starting today.

Speak with our experts