Proposed codes let IMDA direct social media platforms to block access to harmful content

Facebook, TikTok and Twitter apps are seen on a smartphone in this illustration taken July 13, 2021. (Photo: REUTERS/Dado Ruvic)

SINGAPORE — Social media platforms will have to abide by directions from the Infocomm Media Development Authority (IMDA) to take action against harmful content, such as content promoting sexual harm or self-harm, under a proposed set of rules to protect users, Minister for Communications and Information Josephine Teo said on Monday (20 June).

Meanwhile, some social media platforms will also have to provide mechanisms for users to flag harmful or inappropriate content, such as content involving self-harm or intimate videos, under another set of proposed rules to protect users.

The IMDA is engaging tech companies on the mooted "Content Code for Social Media Services" and "Code of Practice for Online Safety", said Teo on her Facebook page.

A public consultation exercise will begin next month.

According to media reports, Singapore's Ministry of Communications and Information (MCI) gave further information on the two proposed codes of practice at a press conference on Monday.

Under the proposals, the actions the IMDA could direct social media platforms to take include disabling access to certain content for users in Singapore, or preventing specific online accounts on these platforms from interacting with or communicating content to Singapore users.

The MCI also said that the proposals would cover only social media platforms that allow the posting of content online aimed at generating interaction and links, and would not include messaging applications.

In her post, Teo said online safety is a growing concern and acknowledged that social media platforms have put in place measures to ensure user safety over the years.

"Still, more can be done given the evolving nature of harms on these platforms and the socio-cultural context of our society," she said.

Teo cited a survey by the Sunlight Alliance for Action (AfA) in January, which found that one in two respondents had experienced "online harms", with teens and young adults forming the majority of those harmed.

There are calls to develop "more support mechanisms and infrastructure for victims of online harms", she said.

She also noted that many countries have enacted or are in the process of enacting laws to protect users online.

"Singapore’s preferred approach in strengthening our online regulatory approach is to do so in a consultative and collaborative manner," she said.

"This means learning from other countries’ experiences, engaging tech companies on the latest tech developments and innovations, and understanding our people’s needs.

"These will allow us to develop requirements that are technologically feasible, can be effectively enforced and that are fit for our purpose," she added.
