What is the Online Safety Act and how can you keep children safe online?

Tech companies need to take more action to keep children safe on the internet, following the introduction of the Online Safety Act.

But the new rules won’t come into force until 2025 – and critics say they don’t go far enough.

How much time do UK children spend online?
Children aged eight to 17 spend between two and five hours online each day, research by communications regulator Ofcom suggests. Time spent online increases with age.

Almost every child over the age of 12 has a mobile phone and almost all of them watch videos on platforms like YouTube or TikTok.

Four out of five teenagers who go online say they have used artificial intelligence (AI) tools, such as OpenAI’s ChatGPT or Snapchat’s My AI.

Around half of children over 12 think being online is good for their mental health, according to Ofcom.

But a significant minority do not. One in eight eight- to 17-year-olds say someone has been nasty or hurtful to them on social media or a messaging app.

The Children’s Commissioner said that half of the 13-year-olds surveyed by her team reported seeing “hardcore, misogynistic” porn on social media sites.

What online parental controls are available?
Two-thirds of parents say they use controls to limit what their children see online, according to Internet Matters, an online safety organization set up by several major UK-based internet companies.

It has a list of available parental controls and a step-by-step guide on how to use them.

For example, parents who want to reduce the likelihood of their children viewing inappropriate material on YouTube – the most popular platform for young people in the UK – can set up a “kids” version, which filters out adult content.

For older children who use the main site, parents can set up a supervised account, which allows them to check the sites their child visits.

Supervision can also be set up on Facebook Messenger, through its Family Center.

TikTok says its family pairing tool lets parents decide whether to make a teen’s account private.

Instagram’s parental controls include daily time limits and scheduled breaks, and can show parents the accounts their child has reported.

But these controls are not foolproof. Ofcom data suggests around one in 20 children find ways around them.

What controls are there on mobile phones and consoles?
Phone networks may block some explicit websites until users have indicated they are over 18 years of age.

Some also have parental controls that can limit which websites kids can visit on their phone.

Android and Apple phones and tablets have apps and systems that parents can use.

These can block or limit access to certain apps, restrict explicit content, prevent purchases and monitor browsing.

Apple has Screen Time and Google has Family Link. There are similar apps available from third-party developers.

Broadband services also have parental controls to filter certain types of content.

Game console controls also allow parents to ensure children play age-appropriate games and to manage in-game purchases.

How should you talk to your kids about online safety?
Talking to children about online safety and taking an interest in what they do online are also important, according to the NSPCC.

It recommends making discussions about it part of everyday conversations, just like chats about their day at school, which can make it easier for children to share any concerns they have.

What are the new rules for tech companies?
The government says the Online Safety Act – which will come into force in the second half of 2025 – puts the onus on social media firms and search engines to protect children from some legal but harmful material.

Platforms also need to show they are committed to removing illegal content, including:

child sexual abuse
controlling or coercive behavior
extreme sexual violence
promoting or facilitating suicide or self-harm
animal cruelty
selling illegal drugs or weapons
violence
Pornographic sites must stop children viewing content by verifying the age of their users.

Other new offenses have been established, including:

cyber-flashing – sending unsolicited sexual imagery online
sharing “deepfake” pornography, where artificial intelligence is used to insert a person’s likeness into pornographic content
The act also makes it easier for bereaved parents to obtain information about their children from tech companies.

Regulator Ofcom has been given extra enforcement powers to ensure companies comply with the new rules and has published a draft code for them to follow.

It says companies must reconfigure the algorithms that decide what content users see, to ensure the most harmful material does not appear in children’s feeds, and to reduce the visibility and prominence of other harmful content.

Ofcom chief executive Dame Melanie Dawes warned that any platform failing to comply could have its minimum user age raised to 18.

And Technology Secretary Michelle Donelan urged big tech to take the code seriously.

“Get involved with us and be ready,” she said.

“Don’t wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”

What are critics saying about the new rules?
Some parents of children who have died after being exposed to harmful online content have called the new rules “inadequate” and criticized the delay before they come into effect.

Ian Russell, father of Molly, and Esther Ghey, mother of Brianna, were among a group of bereaved parents who signed an open letter to Prime Minister Rishi Sunak and Opposition Leader Sir Keir Starmer, calling for further action.

They want a commitment to strengthen the Online Safety Act in the first half of the next Parliament and for mental health and suicide prevention to be added to the school curriculum.

“While we will be examining Ofcom’s latest proposals carefully, we are so far disappointed by their lack of ambition,” they added in the letter.

What are tech companies saying about the new rules?
Meta and Snapchat say they already have additional protections for children under 18 and highlight their existing parental tools.

“As a platform popular with young people, we know we have an extra responsibility to create a safe and positive experience,” a Snapchat representative said.

A Meta representative said it wanted young people to “connect with others in an environment where they feel safe”.

“Content that incites violence, promotes suicide, self-harm or eating disorders violates our rules – and we remove that content when we find it,” they said.

A number of other technology companies contacted by BBC News declined to comment on the draft code.
