If We Don’t Teach Teens To Navigate The Internet, Who Will?
Spend five minutes on your feed, and it becomes painfully obvious that the internet and social media can be a dangerous place.
On the extreme end, there are predators hiding behind anonymous DMs, and assault cases sparked by a single chat. Then there’s the rise of toxic ideologies spreading hate and radicalising impressionable minds on the same platforms meant for creativity and connection.
Even on a normal day, the internet is messy with scams that trick people out of money, catfishing schemes that betray trust, cyberbullying that eats away at confidence, fake news and misinformation that warps reality, and AI-generated content that just adds to the noise.
Parents are understandably worried. And honestly, who wouldn’t be? Today, the internet feels less like a sandbox of ideas and more like a maze, where one wrong turn can send you deep into uh-oh territory.

The Big Question: Should the Government Step In?
This year, the government floated a bold idea: a blanket ban on social media for anyone under 16, starting next year. Alongside that, platforms might be required to implement eKYC verification, using MyKad, passports, or MyDigital ID to confirm users’ ages.
The goal is simple enough: protect younger Malaysians from online harm. However, the implications are anything but.
See, once social media accounts are tied to government IDs, the conversations inevitably shift to the uncomfortable stuff, like surveillance, privacy, freedom of expression, and the very real fear of doxing and data leaks (something we’ve seen many times before).
Add in the new regulation requiring platforms with over eight million users to obtain a licence to operate in the country, and it’s clear the digital landscape is heading for a major shake-up.
The Smarter Approach: Teaching Digital Literacy
We recently dropped by Universiti Teknikal Malaysia Melaka (UTeM) for TikTok Malaysia’s Surf’s Up: Deputies of Digital Literacy event, where a refreshing conversation was taking place that focused on something a bit different from the usual chatter about laws and restrictions.
Now in its fourth instalment, the program aims to encourage youths (and everyone else, of course) to develop the skills needed for safer, healthier and smarter interactions online.
This stop at UTeM is part of TikTok Malaysia’s campus series promoting responsible digital citizenship, following engagements at Universiti Teknologi MARA (UiTM), University of Nottingham Malaysia and International Islamic University Malaysia (IIUM) — with over 1,000 students pledging to be responsible netizens.

The event was centred around a think tank that pulled in voices from across the digital ecosystem—TikTok Malaysia, the Communications and Multimedia Content Forum (CMCF), academics, and even local content creators—to wrestle with a bigger question:
“Instead of shutting kids out of online spaces, how do we teach them to surf the internet safely and become responsible netizens?”

The 1st Layer of Control Comes From Ourselves
For Mediha Mahmood, Executive Director of CMCF, addressing online safety doesn’t start with bans or sweeping regulations, but begins with something far more manageable, yet often overlooked: personal responsibility.
The internet’s biggest challenge, she explained, isn’t trolls or algorithms — it’s complacency.
In an age where outrage drives clicks and rage-bait gets rewarded with engagement, her message hits home: while platforms and policymakers can set boundaries, it’s users who ultimately decide what kind of internet culture we’re creating.
The responsibility remains with us. For example, if I, as a speaker here, and I suddenly use foul language or say something seditious, I can’t blame others. The first layer of control always comes from ourselves. It’s the same online — what we post, comment, or share is our own responsibility.
CMCF Executive Director Mediha Mahmood.
Mediha stressed that as an always-online society, we need to “commit to being credible and purposeful creators, and responsible sharers”. And her call for accountability extends to parents, too.
As a mother, she sees a worrying gap between children’s online access and parents’ awareness. “Our kids don’t buy their own devices — we do. That means we’re responsible for what they’re exposed to,” she said.
It’s a collaborative effort. Platforms can help, but they can’t control our children for us.
CMCF Executive Director Mediha Mahmood.
Mediha reminded parents that danger can lurk anywhere, and it’s up to them to protect their kids from harmful influences, online or otherwise. “Many parents don’t realise that where there are kids, there are also predators,” she warned.
For parents who feel they “can’t control” their kids’ online habits, she makes a firm point: self-regulation starts at home, and if supervision fails, temporarily taking away internet access or devices may be necessary.
As she puts it, at the end of the day, digital literacy starts at home, and before we even begin talking about laws, moderation, or censorship, self-regulation plays a critical role in keeping everyone, especially our youth, safe online.
The foundation of digital safety and credibility lies in personal accountability.
CMCF Executive Director Mediha Mahmood.
Shared Responsibility Across the Community

Online safety isn’t just a parent’s or a platform’s responsibility — it’s a shared effort involving everyone in the digital space, explained Aliff Zakaria, Public Policy Manager at TikTok Malaysia.
He made it clear that users, especially the young ones, can’t afford to scroll mindlessly anymore.
Instead, he pointed out the importance of “#ThinkTwice”—a motto the platform has been championing since 2024—urging netizens to pause before reacting, question whether the content is true, and consider if something is even worth engaging with.
We want our community to feel safe in expressing themselves online, protected by our Community Guidelines, and encouraged to practice positive online behaviours by using our safety tools like Family Pairing, Manage Topics, Creator Care Mode and many more.
TikTok Malaysia Public Policy Manager Aliff Zakaria.
He admitted that moderation on a platform with millions of users worldwide is a challenging feat, and reiterated that safety isn’t just the platform’s responsibility. Everyone, from users to creators, and the community as a whole, has a role to play in keeping our online spaces safe.
“People often ask why TikTok is so strict with its content guidelines,” Aliff explained. “It’s because we want our platform to stay safe — not just for users, but for the platform itself. The first layer of protection is always the user.”
He stressed that the platform’s community guidelines must be followed alongside laws and cultural expectations. “Because we operate in Malaysia, we ensure compliance with our local legal and cultural standards”.
Aliff also touched on TikTok’s rules around AI-generated content, which require creators to disclose when they use artificial intelligence.
“As long as it doesn’t go against social norms or harm others, AI-generated content is fine,” he said. “But users must label such content. Transparency is key”.
And while TikTok is famous for its fun, entertainment-first culture, Aliff pointed out that the platform is also evolving into a learning space through initiatives like #LearnOnTikTok, #EduTok, and #TikTokSTEM.
“Think of TikTok as a big creative canvas. We want to inspire creativity and create joy, but also promote education and learning,” he said. These are opportunities many youths would miss out on if a simple, blanket ban were enforced to keep them out of these online spaces.
Digital Literacy as a Life Skill

From the academic side, Ts Dr Muhammad Noorazlan Shah Zainudin, Deputy Dean (Student Development and Alumni) of the Faculty of Artificial Intelligence and Cybersecurity at UTeM, stressed that education is central to shaping responsible online behaviour, which is a skill that needs to be learned.
Digital literacy should be seen as a life skill, not just a classroom topic. We emphasise to our students that visibility comes with responsibility.
UTeM Faculty of Artificial Intelligence and Cybersecurity Deputy Dean Ts Dr Muhammad Noorazlan Shah Zainudin.
Noorazlan believes that responsible posting starts with examining motivations: why do students—or anyone—create content in the first place? “Is it for visibility, or is it for value?” he asked, emphasising that the digital ecosystem should ultimately be a tool for growth, not a place for idle noise or negativity.
Employers today routinely check social media profiles as part of their hiring decisions, and every post leaves a permanent digital trace — We encourage our students to use platforms to build meaningful, skill-based content that reflects their capabilities, rather than risking their professional impression with mindless sharing.
UTeM Faculty of Artificial Intelligence and Cybersecurity Deputy Dean Ts Dr Muhammad Noorazlan Shah Zainudin.
The Deputy Dean also touched on the growing influence of AI tools like ChatGPT and Gemini, and cautioned against relying on them blindly.
I always tell my students, the biggest challenge in AI and Cybersecurity isn’t just the technology itself, it’s the critical thinking required to wield it ethically and the purpose for which we use it. We build powerful systems, but those systems reflect our values. That’s why we emphasise so much on ethics, to ensure the technical skills they gain here translate into responsible action online.
UTeM Faculty of Artificial Intelligence and Cybersecurity Deputy Dean Ts Dr Muhammad Noorazlan Shah Zainudin.
His stance is clear: treat AI as an aid, not a shortcut. It’s fine to be creative and resourceful, but vigilance, critical thinking, and ethical awareness are essential to using these tools responsibly.
The Internet as a Force for Good

The panel also heard from Ts Aqilah Zainuddin, an engineer, content creator, and entrepreneur who shared how the internet and social media have allowed creators like herself to teach, learn, and build communities that grow alongside them.
“I used to be scared to show myself online because I’m very introverted,” she said. “But I realised I have a skill that’s valuable to others. People told me to share my knowledge, so I picked up a camera and started recording.”
For her, the internet is what you make of it. “With our technical or niche skills, we shouldn’t be shy to share them — as long as it’s positive,” she explained. “If it’s not, don’t spread it.”
It’s a simple principle that goes to the heart of netizenship — using your voice responsibly, whether you’re speaking to 50 followers or 500,000. Because as her story showed, the internet and social media aren’t inherently harmful — they simply become what their users make of them.
So, Should Malaysia Have Stricter Laws For The Internet?
If you boil down everything the panel said, being a good netizen and keeping our online spaces safe isn’t that complicated.
It simply means pausing before you post, checking before you share, contributing instead of criticising, and using the internet to learn and grow rather than just mindlessly scrolling. And most importantly, it’s about being aware of the kind of digital space we’re all shaping together.
So ask yourself: what truly protects our kids online — locking the doors and hiding the key, or giving them the skills to navigate the digital world confidently and safely?
Because let’s be real, bans alone won’t keep kids safe. What actually works is education, supervision, digital know-how, and accountability.
Share your thoughts with us via TRP’s Facebook, Twitter, Instagram, or Threads.



