Experts call for mandatory safety trials for AI tools and platforms in Malaysia to combat harmful content and protect vulnerable users online.
PETALING JAYA: Malaysia should require a digital “crash test” trial before allowing AI tools and social media platforms to operate in the country, as weak safeguards are increasingly enabling the spread of explicit and harmful content online.
Universiti Malaya Computer Systems and Technology lecturer Prof Dr Ainuddin Wahid Abdul Wahab warned that platforms deploying AI tools without rigorous safety testing expose users, particularly women and minors, to serious risks, including harassment, exploitation and blackmail.
He stressed that companies must prove their systems are safe rather than expect users to rely on voluntary assurances.
“At minimum, this means pre-launch safety testing, strict age protections and clear channels for victims to report abuse.
“If a platform cannot demonstrate fast response times and regular independent audits to prevent harm, it should not be allowed on our digital roads.”
His comments come amid growing scrutiny of X, where explicit material and AI-generated content, including output from its AI tool Grok, have drawn criticism from Malaysian authorities and prompted discussions on possible legal action.
Ainuddin Wahid highlighted “jailbreaking” as a major vulnerability in AI systems, in which users deliberately manipulate tools to bypass built-in restrictions.
“It is like tricking a librarian into handing over a forbidden book.
“Many tools also lack proper checks on who is using them or how often. This turns the AI into an unsupervised printer for abuse.”
He warned that once safety locks are breached, harmful content could be generated and spread at massive scale before platforms are able to respond.
“Weak safeguards are like leaving a loaded gun on a park bench—anyone can pick it up and use it.
“Just one photo and a name can be turned into fake explicit images or blackmail material in seconds. The damage spreads fast and is incredibly hard to remove.”
He said that while platforms often rely on detection and filtering systems, threat actors, individuals or groups who intentionally cause digital harm, consistently stay ahead.
“Attackers use slang, misspellings or coded language to evade filters. Every time one trick is blocked, a new one appears and dangerous material continues to leak through.”
He also said that blocking access or taking legal action is not a perfect solution but remains necessary.
“Blocks are not a perfect cure, as users can bypass them with VPNs, but they are important speed bumps.
“They force tech giants to slow down and face accountability. The goal is not to push harm underground, but to make it harder to create and impossible to spread.”
From a sociological perspective, Universiti Kebangsaan Malaysia Anthropology and Sociology senior lecturer Dr Velan Kunjuraman warned that poorly moderated platforms risk reshaping social norms and values.
“When a platform such as X is widely used for pornographic content and fails to moderate harmful material, it may normalise harmful behaviour, weaken shared moral norms and increase tolerance towards misinformation, harassment and exploitation.”
He added that Malaysia's international image could suffer, as its reputation as a religious nation may be undermined.
He said young people and other vulnerable groups are especially at risk.
“Unmoderated explicit content could affect mental wellbeing, distort understanding of relationships and sexuality, and increase exposure to online risks and exploitation.
“Young adults and underage users are easily influenced and may be drawn into immoral activities.”
He warned that prolonged exposure to explicit material could desensitise users and shift attitudes towards sexuality and relationships, with public protests possible if such content continues unchecked.
He urged authorities to act decisively, saying regulators must enforce accountability to protect users, uphold local values and ensure safer digital spaces.
“Stronger enforcement could build public trust and social wellbeing,” he added, noting that the Malaysian Communications and Multimedia Commission may introduce new policies to address similar issues in the future.