Tech Firms Told To Do Better On Child Abuse Images

The government is giving Ofcom extra tools to ensure tech companies act to prevent, identify and remove child sexual abuse and exploitation content. The powers will be introduced through an amendment to the Online Safety Bill (OSB), which aims to police the internet. If tech firms do not comply, Ofcom will be able to impose fines of up to £18m or 10% of a company’s global annual turnover, whichever is higher. However, there is growing concern over how this will work in practice, and questions over exactly what extra tools the media regulator will get.
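As a rough illustration of how that cap works (a sketch based only on the figures quoted above, not the Bill’s legal wording), the maximum fine is simply the greater of the fixed £18m sum and 10% of global annual turnover:

```python
# Illustrative sketch of the fine cap described above: the greater of a fixed
# £18m sum and 10% of global annual turnover. Figures are from the article,
# not the Bill's legal text.
FIXED_CAP_GBP = 18_000_000
TURNOVER_SHARE = 0.10

def maximum_fine(global_annual_turnover_gbp: float) -> float:
    """Return the maximum fine Ofcom could impose under the cap described above."""
    return max(FIXED_CAP_GBP, TURNOVER_SHARE * global_annual_turnover_gbp)

print(f"£{maximum_fine(1_000_000_000):,.0f}")  # a firm with £1bn turnover: £100,000,000
```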

The government says it is supporting the development of tools which could detect child sexual abuse imagery within or around an end-to-end encrypted (E2EE) environment while respecting user privacy. It says this will further inform the wider debate around user privacy and user safety.

But Prof Alan Woodward, from the University of Surrey, told the BBC that current techniques for detecting child abuse imagery or associated text only work on unencrypted data. “If the OSB insists on discovering such material in encrypted data, it can be achieved only by examining the sending and receiving devices, i.e. where it is decrypted for consumption,” he said. “The implication is some form of universal ‘client-side scanning’, which many will see as overly intrusive and liable to… be used to detect other items unrelated to child safety.”

Client-side scanning refers to technology that scans message contents for matches against a database of content (for example child sexual abuse images) before the message is sent to the intended recipient. If Ofcom is to get the power to impose scanning technologies, experts are calling on the government to flesh out the detail on technical feasibility, security implications and the impact on privacy.
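To make the mechanism concrete, here is a minimal, hypothetical sketch of the matching step: the sending device hashes an attachment and checks it against a database of known content before the message is encrypted and sent. Real deployments use perceptual hashes rather than the exact cryptographic hash shown here, and the database and function names below are illustrative only.

```python
import hashlib

# Hypothetical database of hashes of known abuse imagery. In practice this
# would be supplied by a body that maintains such a list; it is left empty
# here. Exact SHA-256 matching is a simplification: real systems use
# perceptual hashing so that re-encoded copies of an image still match.
KNOWN_IMAGE_HASHES: set[str] = set()

def matches_known_content(attachment: bytes) -> bool:
    """Return True if the attachment's hash appears in the known-content database."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_IMAGE_HASHES

def send_message(attachment: bytes) -> bool:
    """Scan on the sender's device, then encrypt and send only if no match is found."""
    if matches_known_content(attachment):
        return False  # flagged before the content ever leaves the device
    # ... encrypt and transmit as normal ...
    return True
```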

“The big issue will be that any technology that can be used to look at what is otherwise encrypted could be misused by bad actors to conduct surveillance,” Prof Woodward said. One of the fundamental premises of E2EE is that only the sender and intended recipients of a message can know or infer the contents of that message. It is one of the reasons that people like using WhatsApp and Signal.
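As a minimal sketch of that premise (using generic X25519 key agreement and AES-GCM from Python’s cryptography package, not the actual WhatsApp or Signal protocols), the server relaying a message only ever sees ciphertext, because the symmetric key is derived from key material held solely on the two devices:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party keeps its private key on-device; only public keys are exchanged.
sender_key = X25519PrivateKey.generate()
recipient_key = X25519PrivateKey.generate()

def derive_shared_key(own_private: X25519PrivateKey, peer_public) -> bytes:
    """Derive a 32-byte symmetric key from an X25519 key agreement."""
    shared_secret = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"e2ee-sketch").derive(shared_secret)

# Sender encrypts; whatever relays the ciphertext never holds the key.
nonce = os.urandom(12)
ciphertext = AESGCM(derive_shared_key(sender_key, recipient_key.public_key())).encrypt(
    nonce, b"hello", None)

# Recipient derives the same key from its own private key and decrypts.
plaintext = AESGCM(derive_shared_key(recipient_key, sender_key.public_key())).decrypt(
    nonce, ciphertext, None)
assert plaintext == b"hello"
```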

Susie Hargreaves, chief executive of the charity the Internet Watch Foundation (IWF), wants provisions in the OSB to enable Ofcom to co-designate with the IWF to regulate child sexual abuse material online. She said: “Our unparalleled expertise in this area would make the response strong and effective from day one. We have strong collaborative relationships with industry and law enforcement, as well as the world-leading expertise which can make sure no child is left behind or their suffering left undetected.”

Proportionality:

The government says it will not be enough for a big tech company to say it simply cannot deploy certain technologies on its platform because of the way the platform is configured. Where it is proportionate and necessary, Ofcom will be able to issue a notice requiring a company to take steps to demonstrate it is using its best endeavours to develop or source tools which will remove child sexual abuse imagery. But this all depends on the regulator’s assessment of the risk of child exploitation.

Prof Woodward said: “Ofcom have a steep hill to climb. They will need to attract a lot of rare talent… to come up with the technical solutions demanded by the OSB. That’s not to mention the skills they will need to navigate the secondary legislation… It’s a truly huge task ahead of them.”

Ofcom told the BBC it was preparing to take on the new role, bringing in skills and expertise from across the tech sector, as well as experts from child protection and advocacy bodies. Its spokesperson said: “Tackling child sexual abuse online is central to the new online safety laws – and rightly so. It’s a big and challenging job, but we’ll be ready to put these ground-breaking laws into practice once the OSB is passed.”

The National Crime Agency estimates there are between 550,000 and 850,000 people in the UK who pose a sexual risk to children. Access to such content online can lead to offenders normalising their own consumption of it, sharing methods with each other on how to evade detection, and escalating to committing actual child sexual abuse offences. Digital minister Nadine Dorries said: “Tech firms have a responsibility not to provide safe spaces for horrendous images of child abuse to be shared online. Nor should they blind themselves to these awful crimes happening on their sites.”

Maeve Hanna, partner at law firm Allen & Overy, told the BBC: “While the objectives of the amendment are laudable, it’s not clear what a tech company will have to do in practice to comply with notices issued by Ofcom or to avoid the large fines threatened. This lack of clarity will also present real challenges to any Ofcom enforcement action. For example, how will Ofcom show that tech companies could have – but failed to – develop any particular new technology where that technology doesn’t exist yet?”
