Minister defends online safety law targeting chat apps

The technology secretary has defended a controversial part of the Online Safety Bill that would force messaging apps to give the regulator Ofcom access to the content of private conversations.

She said it was a necessary measure to protect children from abuse.

But messaging services such as WhatsApp and Signal have said they would leave the UK rather than be made to weaken the security of their platforms.

The bill is expected to pass in the autumn.

Michelle Donelan spoke to the BBC during a visit to University College London, where she announced £13m of funding for healthcare projects that use AI.

Both the tech industry and cyber-security experts have criticised the government's plan to allow access to the text of encrypted messages when they are thought to pose a risk to children.

Under end-to-end encryption, only the person who sent a message and the person who received it can read it; the tech companies themselves cannot.

iMessage and WhatsApp, two of the most widely used messaging services, enable this security feature by default.

But critics argue that once a way in exists, it will not be used only by the good guys, and some companies say they will stop operating in the UK rather than compromise security.

Ms Donelan said the government was not against encryption, and that access would be requested only as a last resort.

“Like you, I want my privacy, because I don’t want my private texts read by other people,” she said. “They would be very bored, but I don’t want them to do it.

“However, we do know that some of these sites are sometimes hotspots for sexual exploitation and abuse of children.

“And if that problem occurs, we have to be able to access that information.”

She also said that tech companies would be required to invest in technology to address the problem.

“Technology is being developed that will allow both encryption and access to this information, and the safety mechanism we have makes it clear that it can only be used in relation to child exploitation and abuse.”

The current frontrunner for achieving this is client-side scanning, which involves installing software on devices that can scan content and send alerts when it detects suspect material. But the approach has not caught on: Apple halted a trial of it after a backlash, and critics have dubbed it “the spy in your pocket”.

Research by the children's charity the NSPCC suggests the public is “overwhelmingly supportive” of efforts to tackle child abuse on encrypted platforms.

Richard Collard, head of child safety online policy at the NSPCC, said: “Tech firms should show industry leadership by listening to the public and investing in technology that protects both the safety and privacy rights of all users.”

But Ryan Polk, who leads internet policy at the Internet Society, a global non-profit focused on internet policy, technology and development, is not convinced the technology is ready.

“The government’s own Safety Tech Challenge Fund, which was meant to find a magic technical solution to this problem, failed to do so,” he said.

Researchers from the UK’s National Research Centre on Privacy, Harm Reduction and Adversarial Influence Online found serious problems with the proposed technologies, including that they “undermine the end-to-end security and privacy necessary to protect the security and privacy of UK citizens”.

“If the UK government cannot see that the Online Safety Bill will, in effect, ban encryption, they are willingly blinding themselves to the dangers ahead,” Mr Polk added.

The bill is due to return to the House of Commons in September.
