Introduction to the Case Study Collection

The Cyber Trust
Part of The Cyber Trust Family Internet Monitoring Project

NEW: FAMILY MONITORING PROJECT VIDEOS

The Cyber Trust has released three videos in a series covering different products that families can use to monitor online activity. To access them, visit the Trust's YouTube channel here.

This collection of case studies explores real-world news stories highlighting how children and young people can be placed at risk through their online activities.

The collection is drawn from real cases investigated by the Cyber Choices team at the National Crime Agency and stories reported in the press.

All of these cases could have been prevented had parents been able to monitor their child's online activity and intervene.




Meta and YouTube designed addictive products that harmed young people, jury finds

Source: The Guardian

 

 

A significant outcome of the US trial of Meta brought by an unnamed plaintiff using the nom de plume KGM. KGM claimed that she had become addicted to YouTube at age six and Instagram at nine, which she said had deleterious effects on her wellbeing. By age 10, she said, she had become depressed and was engaging in self-harm as a result. Her social media use allegedly strained her relationships with her family and at school. When she was 13, KGM's therapist diagnosed her with body dysmorphic disorder and social phobia, which KGM attributes to her use of Instagram and YouTube.

Meta and YouTube have been found liable for deliberately designing addictive products that hooked the young user and led to her being harmed, a jury ruled on Wednesday. Jurors found that the tech companies were both negligent and had failed to provide adequate warnings about the potential dangers of their products.

The jury awarded the plaintiff compensatory damages of $3m, with Meta to pay 70% and YouTube the remainder. Punitive damages were also awarded; deliberations over their amount will begin later on Wednesday.

 Read the full story here

Six families sue TikTok after their kids die trying viral ‘choking challenge’

Source: The Independent

  

 
 
This story is incredibly sad. Young people were set a challenge that might sound like the sort of thing any child would try: 'choke yourself for as long as you can'. Six families have lost children this way, after they blacked out and never recovered.
 
Six families have sued TikTok, which hosted the 'Blackout Challenge' on its platform. One family from the USA and five from the UK are leading the legal action. The mother of one UK teenager has led a campaign to force social media platforms to release data on a child's social media activity in the event of the child's death, but access to that data has been denied by the companies, which cite privacy laws that prevent them from releasing the information. They have also stated that the data is deleted after a short period and is no longer available to clarify just what the children were actually watching.
 
Ellen Roome has led the campaign in the UK for 'Jools' Law', named after her son, which would require social media companies to retain data for a set period so that it can be accessed by parents and law enforcement.
 
This is one more reason why monitoring what your child is accessing, with their agreement, is so important. Adults can make judgements about such challenges that might elude young people, and building a digital trust relationship through monitoring could have saved some of these families from such awful consequences.

 
Read the full story here 
 
 

Midlands parents asked 'are your kids safe' as police warn of new risk online

Source: Birmingham Live

 

 

Birmingham Live reports that police have issued a warning to parents over the online dangers facing their children every day, from cyberbullying to sexual exploitation.

In 2026, AI is only worsening the issue by introducing "content based on their searches," West Mercia Police said.

We are aware of the power of AI: it can be used in very productive ways by professionals and children alike, but it also has the power to carry out searches that would take hours to undertake on your own. As its learning algorithms improve, it will become an even more powerful tool.

The police report advises parents to ensure safety settings are in place, including parental controls on all devices, browsers and apps, to "filter out inappropriate material". It also advises: "Set strong privacy settings to make sure personal information is only visible to trusted individuals i.e. 'Friends only'." The police also recommend introducing your child to smartphones and other devices gradually, in a monitored way, before giving them fuller access.

Our Cyber Trust Internet Monitoring Project aims to support parents in putting such controls in place and making best use of them within the family.


Children being 'failed by tech companies' amid rise in online sex abuse images

Source: ITVX

 


Efforts to protect children from some of the worst aspects of online abuse and dangerous online content have made some progress through legislation passed by the UK government last year.

Even so, reports continue to appear about the continued rise in child sex abuse image crimes logged by police forces in the UK. Such reports have risen by nearly 10% in the past year, the children's charity NSPCC has said.

The NSPCC said that of the 10,811 crimes where police forces recorded which social media platforms perpetrators used in relation to child sex abuse image crimes, 43%, or a total of 4,615, took place on Snapchat.

Meanwhile, Meta, which owns Facebook, Instagram and WhatsApp, accounted for almost a quarter of all offences (24%), the charity said.

In other reports it is clear that restrictions on access to video pornography, through approaches such as age verification, are still not totally effective. Many of the lesser-known pornography sites have not implemented age verification, and these are not being blocked by service providers.

Read the full story here.  

 


Regulator contacts Meta over workers watching intimate AI glasses videos

Source: BBC News

 

Digital glasses have been around for some time. Early attempts tried to make them almost a replacement for your mobile phone and other devices, delivering media directly to your eye. Early AI developments produced ideas such as wearing the glasses in an unfamiliar city, with maps and directions appearing as you moved around. Advances in AI are now producing much more complex functions.

This story results from concerns by the UK data watchdog, which has approached Meta following a "concerning" report claiming outsourced workers were able to view sensitive content filmed by the company's AI smart glasses.

Meta said subcontracted workers might sometimes review content, including films and images, captured by its AI smart glasses for the purpose of improving the "experience".

Videos, including of glasses-wearers using the toilet or having sex, are sometimes reviewed by a Kenya-based Meta subcontractor, according to an investigation by the Swedish newspapers Svenska Dagbladet (SvD) and Goteborgs-Posten (GP).

You might ask why this would be of interest to anyone, but if you wear such devices at work or while reading documents, all of that data can be reviewed by Meta, or by anyone it may sell such data to. At a minimum, privacy issues arise; the capture of commercially sensitive or even national-security information could lead to very dangerous outcomes.

 Read the full story here