Social media experts find flaws in Meta’s new protections for teens

Can we trust tech companies to keep vulnerable Australians safe?
Well, social media company Meta would like you to think it’s up to the task.
The company has begun attempting to reshape the narrative about the impact of its Facebook and Instagram platforms on children, with Meta regional policy director Mia Garlick announcing that teenage users will now face new restrictions on their Instagram accounts.
“Today, we’re excited to announce that we’re introducing teen accounts to Instagram that will automatically place teens in built-in safeguards and reassure parents that teens are having safe experiences. Starting today, we will automatically begin the process of placing teens in these new age-appropriate protection settings, which limit who can contact them, what content they can see, and also give them new ways to safely explore their interests and passions. Teens under 16 will need a parent’s permission to change any of the settings to be less strict.”
The announcement follows growing pressure from the Albanese government to address concerns from parents and academics about the harm children are exposed to on social media.
Meta was also found to be the third most distrusted brand in Australia, according to a new survey of 2,000 people by Roy Morgan.
Dr Karley Beckman, senior lecturer in digital technologies for learning at the University of Wollongong, says Meta is trying to regain community trust and avoid the government’s proposed ban on young teenagers accessing its platforms.
“I think it’s definitely a PR move on Meta’s part. It seems to me that they are trying to get ahead of the law and are really trying to regain the trust of parents. A lot of the conversations and media around teens on social media are really focused on the adults in their lives and mostly on the parents who really care about the well-being of their children and the effects of social media.”
Lisa Given, a professor of information science at RMIT University, says Meta hopes to ease parents’ minds, but the move could actually give them more responsibility for monitoring their children.
“So what they’re basically saying is that this teen account will give parents more control, more visibility. So, for example, parents will now be able to see who their kids are messaging with. The challenge is that it actually puts the onus on parents to monitor and control the content instead of saying the tech company is responsible for controlling it on the other side.”
These new accounts will also set teen users’ profiles to private by default, have an automatic sleep mode that turns off notifications between 10 p.m. and 7 a.m., and include a notification to exit the app after 60 minutes of use each day.
Carly Dober, a practicing psychologist and director of the Australian Psychological Association, says the changes are welcome, but the new announcement doesn’t solve many problems on the Instagram platform.
“So I think it’s a start. I don’t think it goes far enough. There’s a blocking period, which is great, because teenagers need sleep; without sleep, your mental health won’t be good. It’s also really good to filter words for bullying and discrimination and other sensitive terms. But things like content on eating disorders and misinformation haven’t been discussed, and that remains a major concern, not only for young people, but for everyone using these applications.”
The Albanese government’s proposed social media ban promises to cut off children’s access, relying on new technologies that could accurately determine their age.
The government has not yet set an exact age at which children would gain access, but it will likely be between 14 and 16 years old.
Academics have criticized the plan as coming far too late and failing to pressure platforms to tackle harmful content that can affect Australians of all ages.
RMIT’s Professor Given says it’s one of many quick solutions to a complex problem.
“A lot of people are looking, I think, for a quick fix. None of us want kids to be hurt online, but does that mean we kind of throw the baby out with the bathwater and say that they can’t be on a social media platform? It’s not really technically possible, but more importantly, there are a lot of good things on social media that we would exclude these children from. It can also be the same with this new Instagram for teens, because if kids are using social media especially to educate themselves and be part of communities that fill a need in their lives, and now they have this kind of big brother parental monitoring, they may actually choose to walk away and not participate, which could actually end up becoming more harmful.”
Dr Joanne Orlando, a researcher in digital literacy and digital wellbeing at Western Sydney University, says she’s not convinced the Albanese ban or these new teen accounts will actually stop tech-savvy kids from using VPN technology to bypass local policies.
“I think platforms continue to push out these revamped parental features, and one of the glaring shortcomings of this feature is that it’s only been tested in four countries, I think, which means a VPN can again easily be used by young people. They register an account using access through another country which has no such laws, so children can easily bypass it.”
So, what form of regulation can get to the heart of the problem?
Psychologist Carly Dober says self-regulation by profit-driven companies is never reliable and the government must challenge tech companies to make meaningful changes to content moderation on their platforms, not just for children but for all users.
“I really don’t think the self-regulation model is helpful or effective. Again, we saw today that these tech giants make changes when they feel like their hands are being forced. And even then, personally and professionally, I don’t think it goes far enough, and I strongly support and believe that the government needs to take a stronger approach. The research tells us that the world doesn’t change like magic at 18. Like you said, you can be 18 and have full freedom with these apps, and then what happens? You fall into an echo chamber, you become an incel? When you don’t have the tools to learn how to navigate the technology space in a healthy and safe way, again, without anything changing about misinformation or disinformation, how does that protect you?”
Experts also point to better education of adolescents so that they can acquire the skills and abilities necessary to safely navigate social networks and distinguish between safe and potentially harmful content.
Dr Orlando says this education must avoid tired moralizing that casts social media as a grave cultural threat, because teens are likely to ignore it.

“The way we approach it right now is kind of the way we used to approach sex education. It’s this taboo thing: don’t do it, here are all the downsides, and of course young people were still doing it. But now it’s really changed; we’re discussing consent and it’s really explained so that young people have a really good understanding and can make decisions with information and evidence. That’s how we must approach social media, with this real holistic approach that reveals everything, without judgment, but helps young people to really understand the space so that they can make good decisions about it.”