I thought this part was interesting. People are roughly correct about the amount of toxic content (~33%) but misattribute it to an equivalent share of the population.
These findings reveal a striking pattern in how people misunderstand online toxicity. Participants drastically overestimated how many users post toxic comments on social media—believing it was 38% when it is actually 3% (a roughly 13-fold overestimation). However, they were nearly accurate about how much of the total content this small group produces—estimating 38% when it is actually 33% (a 1.15-fold overestimation). This suggests people encounter toxic content at roughly the expected volume but attribute it to far more widespread participation than actually occurs. Rather than recognizing a small set of highly active accounts, people appear to imagine toxic behavior as broadly distributed across the user base.
TL;DR:
On average, they believed that 43% of all Reddit users have posted severely toxic comments and that 47% of all Facebook users have shared false news online. In reality, platform-level data shows that most of these forms of harmful content are produced by small but highly active groups of users (3–7%).