‘Platforms must be more responsible for online safety’

Security has to be designed in from start, says expert

KUALA LUMPUR: WeProtect Global Alliance executive director Iain Drennan said social media platforms must shoulder greater responsibility in preventing online harm involving children as offenders increasingly exploit artificial intelligence (AI), encrypted apps and cross-platform movement to evade detection.

Speaking to theSun during the Asean ICT Forum on Child Online Protection on Nov 18, Drennan said the scale and sophistication of online child exploitation have entered “a phase no country could manage alone”.

“Platforms control the environment, so the responsibility for safety rests with them. It is not optional. It is fundamental.

“This cannot be left to parents or children. You cannot expect parents to sift through 10 different safety menus across 10 different apps. The burden cannot be on families. Safety has to be designed in from the start.”

He said the accountability gap is evident even in the most basic layer of platform governance: the platforms’ own terms and conditions (T&Cs).

“Most people don’t read T&Cs in detail; we all just click. But the T&Cs set out what behaviour is acceptable. The problem is that those commitments are not reflected in how the platforms actually operate.”

On Nov 18, Unicef highlighted the scale and speed of AI-generated child sexual abuse material (CSAM), adding that global reports had surged from nearly 7,000 cases in 2024 to about 440,000 in the first half of 2025.

Drennan said the alliance’s upcoming Global Threat Assessment, to be released next month, shows a sharp rise in grooming, sextortion and AI-generated CSAM worldwide, with offenders exploiting technology “faster than governments, laws or parents could respond”.

“We did a report last month looking at this issue, and it is something that is really challenging to deal with. In the UK, for example, they found that 40% of cases dealt with by the National Society for the Prevention of Cruelty to Children involved peer-to-peer harmful sexual behaviour.”

He said forward-leaning jurisdictions are already shifting towards mandatory safety-by-design requirements, red-team testing and stronger cross-platform cooperation, measures he believes Malaysia would eventually need to adopt.

“Australia and the eSafety Commissioner; they’re one of the first governments to implement an Online Safety Act. A lot of the world is looking at how that is going and what lessons could be learned. They are making an effort to drive accountability and transparency.

“Every platform has a responsibility to ensure that if children are able to access their services, then they need to think about child safety.

“Do things such as red teaming, in which you have people whose job is to find problems. You get white-hat hackers (ethical hackers who test systems for weaknesses) going in and trying to pretend to be the people who are going to exploit this. So, you then get a service that has been tested rigorously.”

He said prevention must also include early digital literacy in schools.

“We can’t arrest our way out of this. You have got to harness technology, but you’ve also got to start early in terms of education.

“The market is not going to do it single-handedly. Governments need to set the expectations; you (platforms) are operating in this jurisdiction, therefore these are the standards.

“Children today live fully integrated digital lives. They move across gaming platforms, social media, messaging apps and AI tools without distinction, so the vulnerabilities move with them.”
