Unlock sales potential
Plug and Play
Medical experts want to improve early detection of cancer in the oesophagus (gullet). Lesions are found with an endoscope passed down the gullet, but they are very hard to spot and can be life-threatening if overlooked. Cancer Research UK funded a young team to address this challenge using machine learning.
25% of cancers in the gullet are missed using standard endoscopy
The team began by training an algorithm on video footage containing many examples of cancerous and non-cancerous tissue.
Doctors can use the solution in conjunction with the Amethyst annotation tool to rapidly train AI algorithms to distinguish cancerous from non-cancerous lesions in the gullet. An endoscope with the algorithm built in can detect lesions early, even when they are barely visible to the human eye. Historically, 25% of squamous cancers in the oesophagus are missed; the new solution will identify patients with cancerous lesions faster, more accurately, and with less discomfort for the patient.
By using machine learning to identify cancers in real time, highlighting the lesion as the clinician records video down the gullet, early screening can be improved.
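As a rough sketch of what this real-time highlighting involves: the model produces a per-pixel lesion probability for each frame, and the display outlines any region above a threshold. The `lesion_probability` map below is a hypothetical model output, since the project's actual algorithm is not public.

```python
import numpy as np

def highlight_lesion(prob_map: np.ndarray, threshold: float = 0.5):
    """Return the bounding box (top, left, bottom, right) of the region
    where the per-pixel lesion probability exceeds `threshold`, or None
    if nothing is suspicious. The clinician's display would draw this
    box over the live endoscopy frame."""
    suspicious = prob_map >= threshold
    if not suspicious.any():
        return None
    rows = np.any(suspicious, axis=1)
    cols = np.any(suspicious, axis=0)
    top, bottom = np.where(rows)[0][[0, -1]]
    left, right = np.where(cols)[0][[0, -1]]
    return int(top), int(left), int(bottom), int(right)

# Hypothetical model output for one 8x8 frame: a small high-probability patch.
prob_map = np.zeros((8, 8))
prob_map[2:5, 3:6] = 0.9
print(highlight_lesion(prob_map))  # -> (2, 3, 4, 5)
```

In a deployed system this loop would run once per video frame, so the overlay tracks the lesion as the endoscope moves.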
Ensuring the data is anonymized to avoid privacy issues, and collecting enough examples.
What had the customer tried before?
Just looking for lesions by eye.
What criteria were important to him?
That the algorithm could be trained quickly and accurately, to ensure building the best model.
It is hoped the tool can mark up images for training data much faster than traditional methods.
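The speed-up comes from turning the annotator's markup directly into labelled examples. A minimal sketch, assuming a hypothetical annotation export of frame regions with cancer/non-cancer labels (not the real Amethyst format, which is not shown in this article):

```python
import numpy as np

def annotations_to_dataset(frames, annotations):
    """Crop each annotated region out of its video frame and pair it
    with its label, producing (patch, label) training examples.
    `annotations` is a list of dicts with a frame index, a
    (top, left, bottom, right) box, and a 'cancer'/'non_cancer' label
    -- an illustrative export format, not Amethyst's actual one."""
    patches, labels = [], []
    for ann in annotations:
        t, l, b, r = ann["box"]
        patches.append(frames[ann["frame"]][t:b, l:r])
        labels.append(1 if ann["label"] == "cancer" else 0)
    return patches, labels

# Two dummy 16x16 greyscale frames and three annotated regions.
frames = [np.zeros((16, 16)), np.ones((16, 16))]
annotations = [
    {"frame": 0, "box": (0, 0, 8, 8), "label": "cancer"},
    {"frame": 0, "box": (8, 8, 16, 16), "label": "non_cancer"},
    {"frame": 1, "box": (4, 4, 12, 12), "label": "cancer"},
]
patches, labels = annotations_to_dataset(frames, annotations)
print(len(patches), labels)  # -> 3 [1, 0, 1]
```

Because each annotated frame yields labelled patches automatically, a single reviewed video can produce many training examples without per-image manual curation.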
The project schedule
1 year, finishing March 2020

Skills required by the customer
Doctors can carry on using the endoscope as usual; it will highlight lesions on the screen. No specific training is required, as the algorithms are built into the endoscope, provided the doctor has the domain knowledge to recognise a cancerous lesion.

Project maturity level
Still under testing and development. The technology is React (JavaScript) and Python.
Where is the data stored?
Locally and in the cloud. All data is anonymized and held in a secure cloud environment.
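One common way to keep such data privacy-safe before it leaves the hospital, sketched here with hypothetical field names rather than Zegami's actual pipeline, is to strip directly identifying fields and replace the patient ID with an irreversible pseudonym:

```python
import hashlib

# Illustrative field names; a real deployment would follow the
# hospital's own data governance rules.
IDENTIFYING_FIELDS = {"patient_name", "date_of_birth", "hospital_number"}

def anonymize(record: dict, salt: str = "site-secret") -> dict:
    """Drop directly identifying fields and replace the patient ID with
    a salted hash, so footage can be pooled for training without
    exposing who it came from."""
    clean = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
    digest = hashlib.sha256((salt + record["patient_id"]).encode()).hexdigest()
    clean["patient_id"] = digest[:12]
    return clean

record = {
    "patient_id": "P-1029",
    "patient_name": "Jane Doe",
    "date_of_birth": "1960-04-01",
    "video_file": "endoscopy_0042.mp4",
}
clean = anonymize(record)
print("patient_name" in clean, clean["video_file"])  # -> False endoscopy_0042.mp4
```

The salted hash lets the same patient's videos stay linked for training while making the original identity unrecoverable from the uploaded data.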
Steve's mission is to disrupt the analysis of complex data. He wants to put powerful but easy-to-use analysis methods into the hands of everyone, not just computer experts, by using innovative visualisation and interaction technologies. As well as being CSO at Zegami, he heads the Analysis, Visualisation and Informatics research group at Oxford, working on understanding big data in genomics and health care.
Stephen Taylor (Founder and CSO of Zegami)
The information may of course vary in individual cases. Please contact the provider for an assessment of your project.