Social media firms including Meta’s (META.O) Facebook and ByteDance’s TikTok are required to take steps to tackle criminal activity on their platforms and make them safer by design under Britain’s online safety law, which took effect on Monday.
Media regulator Ofcom said it had published its first codes of practice on tackling illegal harms such as child sexual abuse and assisting or encouraging suicide.
Websites and apps have until March 16, 2025, to assess the risks that illegal content poses to children and adults on their platforms, Ofcom said.
From that date, Ofcom said, they will need to start implementing measures to reduce those risks, including better moderation, easier reporting, and built-in safety checks.
Ofcom Chief Executive Melanie Dawes said the safety spotlight was now firmly on tech firms.
“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year,” she stated.
The Online Safety Act, which became law last year, imposes tougher requirements on platforms such as Facebook, YouTube, and TikTok, with a focus on protecting children and removing illegal content.
Under the new codes, reporting and complaint functions must be easier to find and use. High-risk providers will be required to use automated tools known as hash-matching and URL detection to identify child sexual abuse material, Ofcom said.