Social echo chambers and the brand risk of unchecked communities
Jan 29, 2025
Paramark News Desk
Credit: JD Lasica via Wikimedia Commons
Key Points
X's and Meta's changes to online speech policies create uncertainty in the marketing sphere during Trump's second administration.
Execs at tech-enabled influencer marketing company NeoReach warn that a lack of moderation poses brand risks as communities struggle to self-police.
The biggest risk to regulation is no regulation.
James Michalak, CEO, NeoReach
As X and Meta make sweeping changes to what can be said online, the relationship between free speech, community moderation, and brand safety hangs in the balance. For companies and professionals working in social media marketing, a tense uncertainty looms over how speech will be governed during Trump's second administration.
Community guidelines and deregulation: While deregulation may seem liberating, NeoReach CEO James Michalak argues that the absence of moderation poses dangers for users. "The biggest risk to regulation is no regulation," Michalak says. "Communities have a very difficult time policing themselves in terms of things like hate speech. Mainly because those communities have become echo chambers—not really by fault of the platforms, but just by human nature. You engage with people that have similar viewpoints, and if you do that enough, some style of discussion becomes the norm and not the exception."
NeoReach is a tech-enabled influencer marketing agency that helps brands seamlessly search for, manage, and track user-generated content campaigns. It works with some of the world's best-known brands, including Honda, Airbnb, and Amazon, to deliver memorable social experiences through top content creators.
Misinformation and brand safety: The rollback of community guidelines heightens concerns over misinformation. "I think that does cause an issue for misinformation," notes NeoReach CMO Steph Payas. "We do need some moderation and moderators within those platforms to ensure that things are going smoothly."
This extends to brand safety, as companies rely on social media to convey their messages without distortion. "From a brand safety side of things, I think it is important to have some moderation and moderators within those platforms to ensure that messaging is getting across the way it's intended to," she says.
Self-policing: Without community guidelines, social media platforms and their communities are left to police conversations themselves. "Platforms like Rumble and X have proven that they can [moderate] pretty effectively, and self-policing can happen in some capacities," says Michalak. "Meta is going to be different because there are so many disparate groups of demographics and individuals on that platform, which will be a challenge."
Company burden: These changes may force companies to take on a role they aren't prepared for: policing creators to enforce brand values. "The responsibility of content policing may fall on the company and its brand partners," Michalak explains. "This is an undesirable position, as it involves enforcing brand values on creators, which should not be the company's role."