Europe is now effectively the first jurisdiction in the world where online platforms no longer benefit from a ‘free pass’ and set their own rules. It was time to turn the tables and ensure that no online platform behaves as if it were ‘too big to care’.
Reshaping users’ online experience.

Some of the requirements companies face include swiftly removing illegal content; stopping the use of people’s sensitive data, like their health information and sexual orientation, to show them personalized ads; and revealing previously secret information about how they operate. Companies will have to tell users if they remove their content, limit its visibility or stop its monetization, and explain why.
Social media networks like Instagram and Facebook have already announced that European users will be able to tailor their feeds to see posts shared by accounts they follow, or in chronological order.
TikTok said its users could choose to be shown videos based on their location, or on worldwide popularity, instead of based on the company’s own algorithm.
Other companies like Snapchat announced how they were complying, making it impossible for advertisers to use teenagers’ data to show them personalized ads. Companies will also have to identify, and implement concrete measures to counter, major long-term risks that their platforms pose for society, such as disinformation and negative effects on mental health, under the scrutiny of the Commission, auditors and vetted researchers.
My expectation is that throughout the DSA enforcement saga, we will see a change in the business structures of platforms.
Commission plays big cop.
It’s one thing to come up with an ambitious rule book. It’s another to successfully enforce it.
The content-moderation law has serious potential to bite. The law provides for stronger fines than its GDPR sister rulebook — 6 percent of companies’ annual revenue, compared with 4 percent.
Led by the team that wrote the law — and knows it inside and out — the Commission will have broad enforcement powers, similar to antitrust investigators’, to oversee and ensure the compliance of the biggest tech firms.
It will also receive extra yearly funding — an estimated €45 million for 2024 — funded through an annual levy from the Big Tech firms themselves.
The teams in Brussels will be backed by dozens of artificial intelligence experts and computer scientists at the Commission’s European Centre for Algorithmic Transparency (ECAT).
The Commission will also cooperate with national EU digital regulators, including in Ireland, where most of the affected tech firms have their EU headquarters.
Priorities will include checking whether the designated companies are doing enough to protect children online and to fight disinformation campaigns, especially ahead of crucial national elections in Slovakia and Poland next year, as well as those for the European Parliament in June 2024.

After years marred by tech scandals and criticism that the world’s biggest companies lack proper accountability, the Commission will face widespread demands to show its EU law has teeth.
The list of obstacles should not be underestimated.
For one, observers fear that the Commission’s enforcement teams could lack the competence, staff and cash to confront Big Tech firms.
The Commission plans to have 123 full-time staff to enforce the DSA in 2024 and estimates it will need roughly 30 more. Staff at the algorithmic center number 30.
For reference, the British regulator estimates it will need 350 people to oversee between 30 and 40 tech companies under its own content law, the Online Safety Bill.
Already, Amazon and European fashion company Zalando have challenged the Commission in court, arguing that they are not ‘very large online platforms’ and shouldn’t face the ensuing extra obligations.
Facebook and Twitter have previously fought activists and gone to court to avoid opening up about how they operate.
Still under construction.
Some pieces of the DSA’s enforcement puzzle are not yet fully in place, which could arguably make the Commission’s work harder in the first months.
EU countries still have until February 2024 to designate their national watchdogs, who will be in charge of parts of the law, like vetting researchers who will be able to access platforms’ data.
The network of national regulators, the Board of Digital Services Coordinators, will also approve more detailed standards for platforms when it comes to fighting disinformation under the DSA.
While the DSA empowers users to challenge potential suspensions or the takedown of their content, the full process hasn’t yet taken shape.
Google has warned that the EU’s out-of-court dispute settlement bodies have yet to be finalized.
Some of the more detailed rules and processes laying out how large companies need to assess and limit major societal risks have yet to be decided. And the Commission is still hashing out details on the auditing of companies’ assessment and mitigation reports.
Which bans on internet platforms do states support?
The state’s perspective on bans and regulations of internet platforms varies widely from one country to another and can change over time due to shifts in government policies and public opinion.
However, there are several common types of bans and regulations.
Censorship of Content.
This could include blocking or removing content that is deemed illegal, harmful, or offensive according to local laws and regulations.
China’s Great Firewall is an example of a comprehensive content censorship system; Russia operates a similar regime.
Social Media Shutdowns.
In response to civil unrest or political protests, some governments have shut down access to social media platforms or the entire internet to prevent the spread of information and to maintain control. This has been observed in countries like Iran and Sudan.
Data Localization.
Some states require internet platforms to store user data within their borders, ostensibly for national security and data privacy reasons. Russia has implemented such regulations, requiring companies to store Russian user data on servers located within the country.
Encryption Backdoors.
Governments in some countries have called for bans on end-to-end encryption or demanded backdoors to be built into messaging apps. They argue that this is necessary for law enforcement and national security purposes.
Social Media Regulation.
Some states seek to regulate social media platforms to combat issues like misinformation, hate speech, and cyberbullying. These regulations may involve requiring platforms to remove certain types of content or adhere to specific reporting and moderation guidelines.
Germany’s Network Enforcement Act (NetzDG) is an example of such regulation.

Banning Certain Apps or Platforms.

Governments may ban specific internet platforms or apps altogether due to concerns about their impact on national security or society.
For example, India has banned certain Chinese-owned apps citing data security concerns.
Regulation of Online Advertising.
States may impose regulations on online advertising, including bans on certain types of ads or requirements for transparency in political advertising to prevent foreign interference.
It’s important to note that the reasons for and extent of these bans and regulations can vary significantly depending on the country’s political, social, and cultural context.
While some states argue that these measures are necessary for security and societal well-being, others view them as infringements on free speech and privacy rights.
Public opinion and legal challenges can also influence the enforcement and effectiveness of these policies.

Are the European institutions doing the right thing?
All The Best!