With the Online Safety Bill, the UK government wants to be able to fine companies up to £18 million, or 10 percent of global annual revenue, for failing to remove harmful content or for not removing it in time. The regulator must also be able to block sites for this.
With the bill, the British government wants to ‘protect children’ and combat the ‘worst abuses on social media, such as racist hate crime’. In addition, the law must give regulators more options to stop financial fraud on social media and dating apps. The government states that the law guarantees freedom of speech and freedom of the press and says that the law “will not lead to unnecessary censorship of sites and platforms.”
Under the proposal, all sites and platforms covered by the new rules must take action to combat illegal abuse. The government says that companies must take ‘quick and effective’ action against hate crimes, intimidation and threats against individuals. The sites must also adhere to their own standards and conditions. They should also consider the risks their sites pose to ‘the youngest and most vulnerable people’ and protect children from inappropriate messages and harmful activities.
The bill will soon also include provisions obliging companies to report child abuse images. This should, among other things, enable police services to better investigate cases of child abuse.
Senior executives at tech companies could also be sued personally if they fail to comply with Ofcom’s requests for information. This is not yet included in the current bill, but could be added later if the current measures prove insufficient.
The government emphasizes that the bill will safeguard freedom of expression and that pluralistic online conversations will still be possible. To that end, platforms and sites must employ safeguards that protect freedom of expression. The government cites the example of human moderators who, in complex cases where context matters, must decide whether or not something should be removed. In addition, people must be able to appeal effectively against decisions made by sites and platforms. Users can also complain to Ofcom about a decision made by a platform.
Category 1 services must also publish ‘up-to-date reports’ on their impact on freedom of expression. In these reports, they must also demonstrate that they have taken steps to counteract any negative impact on freedom of expression.
These Category 1 sites and platforms are also required by law to protect ‘democratically important’ content. They must not discriminate against particular political viewpoints and must treat all political opinions equally. Their terms regarding political discussions must be clearly stated, and Ofcom will ensure that they comply with their own terms.
Messages on journalistic sites are excluded from this law, including comments under news articles. Articles shared on sites and platforms are also exempt from the law, with Category 1 services being given additional responsibility for ensuring that UK users can access journalistic content shared on their platforms. Journalists must be able to appeal quickly against any deleted posts. The bill does not distinguish between citizen journalism and professional journalism.
Companies and services that do not comply with the new legislation risk fines of up to 21 million euros or ten percent of annual worldwide turnover, whichever is higher. The full bill will be published later on Wednesday and will then go to the UK Parliament.