The move comes as several European countries consider tougher measures to restrict minors’ access to digital platforms.
On the same day, Spain announced plans to ban social media use for people under 16, while also pushing legislation that would make social media executives personally liable for hate speech on their platforms.
Digital age of adulthood at the heart of the debate
The concept of a “digital age of adulthood” has moved to the forefront in Europe following an initiative announced a few months ago by Greek Prime Minister Kyriakos Mitsotakis. At the time, the prime minister said that “Greece is considering banning the use of social media for those aged 16 and under, as Australia has done.”
The Greek government has already taken concrete steps in this direction. It has banned mobile phone use in schools and launched parco.gov.gr, a new tool on gov.gr that allows parents to verify their child's age when buying the child's first mobile phone and to access parental control applications for social media.
As revealed by THEMA, Greek government agencies have engaged in extensive talks with social media companies and major internet players such as Google on how restrictions could be implemented.
According to sources at the Ministry of Digital Governance, the technical know-how comes from the European Union, proposals from online companies have already been assessed, and the final decisions on the scope of restrictions will be taken exclusively by the prime minister, who has been personally involved in the issue.

How the Greek plan differs from Australia’s model
Government sources note that while the Greek—and by extension European—plan shares the same goals as Australia’s ban, it will not operate in the same way.
In Australia, authorities, working with the platforms, will bar people under 16 from holding accounts on 16 designated social media services, including Facebook, TikTok, and YouTube.
By contrast, the Greek plan does not impose a blanket ban on access. Instead, it shifts control to the source: the devices used by minors.
Filtering, monitoring, and even full blocking of access to social media—without the need for logging in—will be handled by the Kids Wallet application, introduced by the Ministry of Digital Governance.
Because minors’ details are registered in the state-run app, Kids Wallet will block social media platforms (Facebook, TikTok, Instagram, X, and others), removing them from search results and preventing direct access, whenever the device user is under 15 (rather than 16, as in Australia).
The app will likewise block minors’ access to platforms and websites prohibited for users under 18, including online gambling sites.

Platforms under scrutiny
Authorities are primarily targeting online betting companies and gambling platforms, as well as websites selling tobacco and alcohol products. Platforms hosting pornographic content will also be filtered.
Access to online dating and sex platforms—such as Tinder, the world’s most popular dating app—will also be automatically blocked for minors.
The “heavy lifting” will be done by the device used by the minor, provided parents or guardians have activated all security protocols, including Kids Wallet. The device will automatically identify whether the user is over 15 or 18 and restrict content accordingly. For example, a 14-year-old attempting to access an online gambling platform will be blocked.
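In rough terms, this kind of device-level gating amounts to a simple age-tier check. The sketch below is purely illustrative, assuming hypothetical content categories and the age thresholds described above (15 for social media, 18 for gambling and other adult content); it does not reflect the actual Kids Wallet implementation.

```python
# Illustrative age-tier gating sketch; category names and thresholds are
# assumptions for the example, not the real Kids Wallet logic.

SOCIAL_MEDIA_MIN_AGE = 15      # article: social media blocked for under-15s
ADULT_CONTENT_MIN_AGE = 18     # gambling, tobacco, alcohol, adult content, dating

MIN_AGE_BY_CATEGORY = {
    "social_media": SOCIAL_MEDIA_MIN_AGE,
    "gambling": ADULT_CONTENT_MIN_AGE,
    "tobacco_alcohol": ADULT_CONTENT_MIN_AGE,
    "adult_content": ADULT_CONTENT_MIN_AGE,
    "dating": ADULT_CONTENT_MIN_AGE,
}

def is_allowed(user_age: int, category: str) -> bool:
    """Return True if a user of the given age may access content in `category`."""
    min_age = MIN_AGE_BY_CATEGORY.get(category)
    if min_age is None:        # uncategorised content is not restricted here
        return True
    return user_age >= min_age

# Example from the article: a 14-year-old opening a gambling site is blocked,
# as is access to social media; a 16-year-old may use social media.
assert not is_allowed(14, "gambling")
assert not is_allowed(14, "social_media")
assert is_allowed(16, "social_media")
```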
A second line of defense from platforms
In addition to device-level controls, a second barrier will come from social media platforms themselves. Platforms will be required to strengthen safety measures and increase pressure on users to declare their real age when there is suspicion that minors are using their services.
According to THEMA, Brussels has already demonstrated how the technical ban can be implemented, guiding EU member states, including Greece, which is further along than most, toward a harmonized approach across the bloc. The goal is common solutions, shared expertise, and easier implementation for countries that choose to introduce protections for minors.
European authorities have maintained open communication with social media platforms to find mutually acceptable solutions. Greek officials also highlight strong cooperation, noting that some internet giants worked intensively on child protection measures.
Google stands out as it submitted a comprehensive package of proposals to both European and Greek authorities. While officials deemed these proposals technically sound and highly interesting, the EU ultimately did not adopt them, as national apps—such as Greece’s Kids Wallet—were already in place.
One unresolved issue remains whether, and how, messaging platforms like WhatsApp—widely used for educational purposes and parent-child communication—will be included in the restrictions.

Putting the brakes on doomscrolling
Why are governments rushing to impose bans and restrictions on minors’ use of social media?
The answer lies in a growing body of scientific research linking doomscrolling, the compulsive consumption of endless short videos and posts, to symptoms associated with ADHD (Attention Deficit Hyperactivity Disorder), other learning difficulties, and broader neurodevelopmental disorders.
Experts warn that social media addiction contributes to attention deficits, impaired short-term memory, anxiety, depression, eating disorders, reduced reading ability, social isolation, lower productivity, poor academic performance, sleep problems, and overexposure to blue light.
The TikTok effect and dopamine addiction
TikTok is often cited as the most striking example, frequently described as having “the world’s best AI algorithm—one that reads your mind and predicts what you want to see.”
Scientists explain that these algorithms do not control the mind directly but affect it by triggering dopamine release with every video or post viewed. This leads to overstimulation and withdrawal symptoms when exposure stops—creating addiction.
Each swipe delivers a small “dopamine hit,” similar to nicotine. The process is fast, instinctive, and highly rewarding, pushing users into endless scrolling, often at the expense of sleep and meals.
Algorithms analyze language, faces, backgrounds, viewing time, replays, volume changes, comments, and interactions, building a dynamic psychological profile for each user through deep learning techniques.
The platform does not wait for users to say who they are—it infers it. It suggests content users did not even ask for, maximizing engagement and minimizing decision fatigue.
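As a rough illustration of how such engagement signals can feed a ranking model, the sketch below scores candidate videos by weighting a few of the signals mentioned above (watch time, replays, interactions). The signal names and weights are invented for the example; real recommendation systems are far more complex and typically rely on deep learning.

```python
# Toy illustration of engagement-weighted ranking; signal names and weights
# are invented for this example and do not describe any real platform.

from dataclasses import dataclass

@dataclass
class EngagementSignals:
    watch_ratio: float   # fraction of the video actually watched (0.0-1.0)
    replays: int         # how many times the user replayed it
    interacted: bool     # liked, commented, or shared

def engagement_score(s: EngagementSignals) -> float:
    """Combine a few engagement signals into a single ranking score."""
    score = 1.0 * s.watch_ratio + 0.5 * min(s.replays, 3)
    if s.interacted:
        score += 0.8
    return score

# Content resembling what the user lingered on ranks higher, which is how a
# feed keeps serving "one more" video without the user asking for it.
candidates = {
    "cooking_clip": EngagementSignals(0.95, 2, True),
    "news_clip": EngagementSignals(0.30, 0, False),
}
ranked = sorted(candidates, key=lambda k: engagement_score(candidates[k]), reverse=True)
print(ranked)  # ['cooking_clip', 'news_clip']
```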
The Australian precedent
Warnings about social media’s impact on children have echoed worldwide for years. FBI Director Christopher Wray once warned U.S. lawmakers that China could potentially manipulate recommendation algorithms for influence operations.
At the same time, China has designed its domestic platforms to promote educational and health-related content, boosting productivity and well-being rather than endless doomscrolling.
A 2022 cybersecurity report by Australian firm Internet 2.0 found that TikTok engages in “excessive data collection,” including location data, device details, and installed apps.
Combined with rising youth violence, online predators, and sexual exploitation risks, these concerns pushed Australia to act.
Australia’s eSafety Commissioner, Julie Inman Grant, has presented a list of platforms likely to fall under the incoming age restrictions. The law, which takes effect on December 10, will limit account access for users under 16 across a wide range of platforms.
Companies must restrict access when:
- content is available to Australian users,
- users can post content,
- users can interact with others,
- social interaction is a primary function of the platform.
The eSafety Commissioner has already notified an initial list of 16 companies—including Facebook, TikTok, and YouTube—that they may fall under the new rules.