Social Media
Portugal Approves Law Restricting Social Media Access for Children Under 16
Under 16? In Portugal, social media now comes with conditions — and parental approval.
Portugal’s parliament has approved new legislation restricting social media access for children under 16, joining a growing list of countries tightening digital rules for minors.
Under the new law, the minimum age for autonomous access to social networking platforms, video-sharing services and open communication services will rise from 13 to 16. Children aged 13 to 16 will be allowed to create and use accounts only with express and verified parental consent. Those under 13 will be barred entirely from accessing platforms, services, games and applications covered by the legislation.
The restrictions apply to major platforms such as Instagram, Facebook and TikTok. Messaging services such as WhatsApp, widely used by families for communication, are not included.
The bill requires companies to implement age-verification mechanisms. For users between 13 and 16, account creation must be linked to Portugal’s Digital Mobile Key or a similar identification system that confirms age without exposing additional personal data.
Lawmakers from the Socialist Party said platforms must also introduce features to reduce minors’ exposure to violence, explicit content, addictive gaming elements and manipulated media.
Oversight of the law will fall to Portugal’s National Communications Authority and the National Data Protection Commission. During parliamentary debate, opposition lawmakers raised concerns about enforcement, privacy protections and the possibility that young users could bypass restrictions using VPNs. Some critics also warned that the measure could infringe on personal freedoms.
Portugal follows similar moves elsewhere. Australia has enacted strict age-verification requirements for under-16s, while France recently approved limits for users under 15. Legislative efforts are also underway in Denmark, Italy and Spain, reflecting broader European concern over the impact of social media on children’s mental health and development.
Social Media
Instagram CEO Pushes Back as Families Confront Meta in Court
Is Instagram a product — or a psychological trap? A California courtroom is now wrestling with that question.
The chief executive of Instagram on Wednesday rejected claims that users can be clinically addicted to social media, as he testified in a closely watched California trial that could reshape how courts view the design of digital platforms.
Adam Mosseri, appearing on behalf of parent company Meta, sought to draw a distinction between medical addiction and what he described as “problematic use.” Under questioning from plaintiffs’ attorney Mark Lanier, Mosseri acknowledged that he may have used the word “addicted” casually in the past — likening it to binge-watching a television series — but said he was not qualified to diagnose clinical dependency.
“I’ve never claimed being able to diagnose addiction clinically,” Mosseri told the jury.
The case centers on allegations that major social media platforms, including Instagram and YouTube — owned by Google — deliberately engineered their products to hook young users for profit. The plaintiff, a 20-year-old woman identified in court filings as Kaley G.M., argues that she suffered severe mental harm after years of heavy use beginning in early childhood.
According to testimony, she began watching YouTube at age six and joined Instagram at 11, later expanding to other platforms. Plaintiffs contend that features such as algorithmic feeds and engagement loops function like dopamine triggers, encouraging compulsive behavior in vulnerable adolescents.
Mosseri countered that Instagram has evolved significantly since Facebook acquired it in 2012. He pointed to safety tools introduced over the years, some of which, he said, reduced engagement and revenue. He also disputed the notion that teenagers are the company’s most lucrative demographic, noting that younger users tend to generate less advertising income.
Attorneys for Meta and YouTube have argued that the plaintiff’s mental health struggles stem from complex personal circumstances rather than platform design. YouTube’s legal team further maintained that the service operates more as a video-viewing platform than a traditional social network.
The emotional weight of the proceedings has been palpable. In the gallery, parents who say their children suffered severe harm — including suicide — listened quietly as executives defended their companies’ practices.
More than a thousand lawsuits nationwide accuse social media firms of fostering addiction, depression, eating disorders and self-harm among young users. Legal observers say this case could serve as a bellwether, potentially influencing how courts evaluate responsibility in the digital age.
Mosseri’s testimony precedes the scheduled appearance of Meta founder Mark Zuckerberg next week — a moment that may sharpen an already defining legal confrontation over whether the platforms shaping modern adolescence were built as tools, or as traps.