AI for improved cancer detection


Rationalization · Unlock sales potential · Plug and Play

The problem

25% of cancers in the gullet are missed using standard endoscopy

Medical experts want to improve early detection of cancer in the oesophagus (gullet). Lesions are found with an endoscope passed down the gullet, but they are very hard to spot and can be life threatening if overlooked. Cancer Research UK funded a young team to address this problem using machine learning. The team began training an algorithm on video footage containing many cancer and non-cancer examples.
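To make the training step concrete, here is a minimal, hedged sketch of how such a frame classifier could be fine-tuned in Python with PyTorch. The directory layout, model choice, and hyperparameters are illustrative assumptions, not details taken from the project:

```python
# Hedged sketch (assumed, not the team's actual code) of fine-tuning a
# binary frame classifier on cancer / non-cancer endoscopy frames.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Assumption: frames have been exported as images into
# frames/cancer/ and frames/non_cancer/ (hypothetical layout).
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
data = datasets.ImageFolder("frames", transform=tfm)
loader = DataLoader(data, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the final layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # cancer vs. non-cancer

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for frames, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(frames), labels)
        loss.backward()
        opt.step()
```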

The solution

By using machine learning to identify cancers in real time, highlighting the lesion as the clinician passes a video endoscope down the gullet, early screening can be improved.

Doctors and medical experts can use the solution in conjunction with the Amethyst annotation tool to rapidly train AI algorithms to distinguish cancerous from non-cancerous lesions in the gullet. With the algorithm built into the endoscope, lesions can be detected early, even when they are barely visible to the human eye. Historically, 25% of squamous cancers in the oesophagus are missed; the new solution will identify patients with cancerous lesions faster, more accurately, and with less discomfort for the patient.
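As an illustration of the real-time idea, the sketch below runs a detector over each frame of a live feed and highlights suspected lesions on screen. `detect_lesions` is a hypothetical placeholder for the trained model, and the video source is assumed; this is not the project's actual code:

```python
# Illustrative real-time overlay loop using OpenCV.
import cv2

def detect_lesions(frame):
    """Placeholder for the trained model: returns (x, y, w, h) boxes."""
    return []

cap = cv2.VideoCapture(0)  # assumed: the endoscope appears as a video device
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for (x, y, w, h) in detect_lesions(frame):
        # Draw a red rectangle around each suspected lesion.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.imshow("endoscopy", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to stop
        break
cap.release()
cv2.destroyAllWindows()
```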

Insights

Stumbling blocks

Ensuring the data is anonymized to avoid privacy issues, and collecting enough examples.
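One common way to handle the anonymization step, shown here purely as an illustration (the file layout, salt, and mapping file are assumptions, not the project's pipeline), is to replace patient identifiers with opaque IDs before upload and keep the key mapping on-site:

```python
# Illustrative anonymization sketch: hash patient IDs into opaque study
# IDs and keep the mapping locally, so uploaded files carry no identifiers.
import csv
import hashlib
import shutil
from pathlib import Path

SALT = b"local-secret-salt"  # assumed per-site secret, never uploaded

def anon_id(patient_id: str) -> str:
    return hashlib.sha256(SALT + patient_id.encode()).hexdigest()[:12]

src = Path("raw_videos")   # e.g. raw_videos/PATIENT123.mp4 (hypothetical)
dst = Path("anonymized")
dst.mkdir(exist_ok=True)

with open("key_mapping.csv", "w", newline="") as f:  # stays on-site only
    writer = csv.writer(f)
    writer.writerow(["patient_id", "anon_id"])
    for video in src.glob("*.mp4"):
        aid = anon_id(video.stem)
        writer.writerow([video.stem, aid])
        shutil.copy(video, dst / f"{aid}.mp4")
```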

What had the customer tried before?

Just looking for lesions by eye.

What criteria were important to the customer?

That the algorithm can be trained quickly and accurately, so that the best possible model can be built.

Business

Benefits

It is hoped the tool can mark up images for training data much faster than traditional methods.

The project schedule

  • Video footage of cancer and non-cancer cases is collected from patients
  • The footage is anonymized and uploaded to the annotation tool (a sketch of preparing frames for annotation follows this list)
  • The customer annotates many examples
  • A model is trained on the data to distinguish cancerous from non-cancerous samples
  • The algorithm is deployed for real-time use in the video system
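As a hedged sketch of the second step, frames could be sampled from the anonymized footage before upload to the annotation tool. The paths and sampling rate below are illustrative assumptions:

```python
# Illustrative frame extraction: keep roughly one frame per second
# (assuming 30 fps footage) for upload to the annotation tool.
import cv2
from pathlib import Path

out = Path("frames_for_annotation")
out.mkdir(exist_ok=True)

for video in Path("anonymized").glob("*.mp4"):
    cap = cv2.VideoCapture(str(video))
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % 30 == 0:  # sample every 30th frame
            cv2.imwrite(str(out / f"{video.stem}_{idx:06d}.jpg"), frame)
        idx += 1
    cap.release()
```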

Project maturity level

POC (Pilot)

Project duration

1 year, finishing in March 2020

Project cost
(number of digits)

5

Involved employees
(Operating phase, FTE)

0.5 FTE per month

Running costs
(per month, number of digits)

4

Technical

Skills required by the customer

Doctors can continue using the endoscope as usual; lesions will be highlighted on the screen. No specific training is required, since the algorithms are built into the endoscope, provided the doctor has the domain knowledge to recognize a cancerous lesion.


Technical facts

Still under testing and development. The technology is React (JavaScript) and Python.
A client called Amethyst handles the marking up of the cancer lesions.

Where is the data stored?

Locally and in the cloud. All data is anonymized and stored in a secure cloud environment.
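For illustration only, an upload to an encrypted cloud bucket might look like the following. The use of AWS S3, the bucket name, and the encryption setting are assumptions; the source states only that data is kept in a secure cloud environment:

```python
# Illustrative upload of anonymized files to an encrypted cloud bucket.
import boto3
from pathlib import Path

s3 = boto3.client("s3")
BUCKET = "example-secure-endoscopy-data"  # hypothetical bucket name

for video in Path("anonymized").glob("*.mp4"):
    s3.upload_file(
        str(video),
        BUCKET,
        f"videos/{video.name}",
        ExtraArgs={"ServerSideEncryption": "AES256"},  # server-side encryption
    )
```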

Providers

Stephen Taylor (Founder and CSO of Zegami)

Steve's mission is to disrupt the analysis of complex data. He wants to put powerful but easy-to-use analysis methods into the hands of everyone, not just computer experts, by using innovative visualisation and interaction technologies. As well as being CSO at Zegami, he heads the Analysis, Visualisation and Informatics research group at Oxford, working on understanding big data in genomics and healthcare.


The information may of course vary in individual cases. Please contact the provider for an assessment of your project.