Media Violence: A Problem We Cannot Ignore


By Anita Kartchner

On February 26, 2025, Instagram's parent company Meta experienced a brief glitch that flooded users' Reels feeds with disturbing real-life violent content; users reported highly graphic videos appearing without warning. After the shooting of Charlie Kirk on September 10, footage of the incident spread rapidly across platforms such as X (formerly Twitter), Instagram, and TikTok, showing how easily violent material, often referred to as "gore," can circulate online whether or not users seek it out.

These videos typically originate in the darkest corners of the internet and gradually spread across platforms under vague or misleading captions, slipping past moderation systems and appearing on the feeds of unsuspecting users, many of whom are teenagers or children. A study by the Youth Endowment Fund found that 70% of internet users under 18 had encountered real-life violent content online in the past year, with the highest rates reported on X and TikTok. X in particular has minimal content restrictions, often placing violent videos behind a simple "see anyway" button, alongside large amounts of sexual content that also bypasses moderation.

Pornographic material is similarly under-regulated across social media: 65% of surveyed teenagers reported having seen explicit images while scrolling, and 15% of all content on X is officially tagged as adult content, a figure that does not include untagged posts. Even though only 22% of teens report using X, many minors still encounter content their brains are not equipped to process, and TikTok and Instagram also host significant amounts of explicit material.
Research shows that exposure to such content can produce symptoms similar to PTSD and increase real-life aggression; neuroimaging studies reveal changes in brain regions related to emotional response and impulse control, and additional studies identify exposure to pornography during adolescence as a statistical risk factor for sexual aggression. A USC study found that youth exposure to traumatic media online leads to declines in mental health, and this traumatic content is not always illegally produced material; sometimes it is news footage.

When I asked my peers whether they had seen the close-up video of Charlie Kirk being shot, 80% said yes, and all reported severe distress afterward. None had searched for the video; it simply appeared on their feeds. Although TikTok and Instagram guidelines require immediate removal of such footage, the platforms' algorithms pushed the videos widely because the topic was trending, outpacing moderation.

It is clear that content moderation alone cannot stop the spread of graphic media online, and although many apps require users to be 16 or 18 to join, there is no meaningful age verification. Parents must take responsibility for limiting their children's access to the internet and protecting them from harmful content.

