The Scottish startup behind a system that scans devices for evidence of child sexual exploitation and terrorism-related activity is taking its technology to America after a successful deployment in the Home Office.
Cyan Forensics produces digital forensics tools that can detect the presence of previously known child abuse images, bomb-making manuals or propaganda videos by building “contraband filters” from original databases of illegal material that can find matches in seized computers, hard drives and media devices.
The company claims that this system is at least 20 times faster than the more established MD5 scans, which use hash functions to assign unique numerical identifiers to files and then search for duplicates.
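To make the comparison concrete, here is a minimal sketch of how a conventional hash-based scan works: compute a full MD5 digest for every file on the device and check it against a database of digests of known illegal files. All names and data here are illustrative, not Cyan Forensics' actual API; the point is that hashing every byte of every file is what makes full-disk scans slow.

```python
import hashlib

def md5_of_file(data: bytes) -> str:
    """Return the MD5 hex digest of a file's full contents."""
    return hashlib.md5(data).hexdigest()

def scan(files: dict[str, bytes], known_hashes: set[str]) -> list[str]:
    """Return names of files whose full-file MD5 matches a known hash."""
    return [name for name, data in files.items()
            if md5_of_file(data) in known_hashes]

# Toy data standing in for a contraband database and seized media.
contraband = b"known illegal file contents"
database = {md5_of_file(contraband)}

seized = {"a.jpg": b"harmless holiday photo",
          "b.jpg": contraband}

print(scan(seized, database))  # ['b.jpg']
```

A "contraband filter" can be much faster because it does not need to read every byte: sampling only a fraction of each file's blocks and matching those against the database trades a small risk of missed matches for a large speed gain on a first-pass triage.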
This added speed can be decisive in law enforcement investigations. In one recent police raid, a quick on-scene scan of the home's devices revealed that not only was the father of the family downloading the material, but a teenager who lived with him had been too.
“Without our technology, it would have been six to eight months until they discovered that fact, because all of those laptops would have been seized and gone into a queue for analysis at a later date,” Cyan Forensics CEO Ian Stevenson told Techworld.
That delay could have had major consequences. It could have given the suspect time to commit further offenses, or convinced him to take his own life before the evidence of his crime was discovered.
Taking Cyan Forensics global
Cyan Forensics was founded in 2016 as a spinout from Edinburgh Napier University, where Stevenson’s cofounder Bruce Ramsay was researching methods of enhancing the tools that he had used for nearly a decade in his previous job as a forensic computer analyst for the Lothian and Borders Police force in Scotland.
In Edinburgh, he met Stevenson, a seasoned startup executive who was hired by the university to commercialise the technology – but ended up cofounding a company based on the research.
The software can run on the investigator’s computer via a direct connection to the suspect device or through a memory stick that is plugged into the suspect computer. The results of the scans are then divided into traffic light colours. Green signals that no illegal material was found. Red alerts users that something matches the material on the connected database, which a human can then manually verify by a click of a button. Amber indicates that someone is using sophisticated encryption that the system can’t break, and may require further investigation.
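The triage logic described above can be sketched as a simple mapping from scan outcomes to traffic-light statuses. This is an illustrative reconstruction of the workflow, not Cyan Forensics' actual code:

```python
from enum import Enum

class Status(Enum):
    GREEN = "no known material found"
    RED = "match found; a human verifies it manually"
    AMBER = "unreadable encryption present; may need further investigation"

def triage(match_found: bool, unreadable_encryption: bool) -> Status:
    """Map the outcome of a device scan to a traffic-light status."""
    if match_found:
        return Status.RED
    if unreadable_encryption:
        return Status.AMBER
    return Status.GREEN

print(triage(match_found=True, unreadable_encryption=False).name)  # RED
```

The key design point is that red is never treated as a conviction on its own: it flags material for a human operator to verify with a click, keeping the final judgment with the investigator.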
British police are already using the system in the UK, where the Home Office recently announced that it will deploy it on the Child Abuse Image Database (CAID), the national repository where all abuse images discovered by British police are aggregated to help law enforcement identify the victims and the perpetrators.
Cyan Forensics has also been selected for the latest iteration of the GovStart accelerator programme for startups with public sector applications, which was founded in the UK and has since expanded to France, Germany and Denmark.
Stevenson hopes the accelerator will help his company deploy its technology more widely across Europe and win further business from counterterrorism investigations, an area in which Cyan Forensics already has European customers, though it won't reveal any details about them.
Cyan Forensics is also expanding across the Atlantic. Yesterday, the company announced that its software will be used by the National Center for Missing & Exploited Children (NCMEC) in the US, which Stevenson believes could lead to widespread adoption by American law enforcement.
He also wants the system to be used by social media moderators to identify illicit material and block it before it is uploaded, but as in every deployment of the software, it will support human operators rather than replace them.
Stevenson compares it to a breathalyser system, which gives an indication of behaviour that a police officer can use alongside their observations of the suspect.
“It’s always going to be used as part of a risk-based process,” he said. “It’s not a substitute for investigating officers thinking about what they’re doing.”