Detecting cancer faster with a pen and smartphone camera diagnostics

Two newly developed devices promise to radically improve cancer detection: the first during surgery, and the second by spotting jaundice, an early symptom of pancreatic cancer, with applicability to both telehealth and telemedicine.

  • The MasSpec Pen is a mass spectrometry device (not the pen in the picture) intended for use during surgery to better determine the boundary between cancerous and normal tissue. The current approach, frozen section analysis, takes about 30 minutes (during which the surgeons and sedated patient wait for the pathologist’s results) and isn’t always accurate in answering the question ‘is it all out?’ Using mass spectrometry analysis of a drop of water after three seconds of tissue contact, the MasSpec Pen returns results in about 10 seconds, with 96 percent accuracy in a test of 253 cancer patients; it also detected cancer in marginal regions between normal and cancerous tissue with mixed cellular composition. It was tested on cancerous and normal breast, lung, thyroid, and ovarian tissue. The team expects to start testing the new technology during oncologic surgeries in 2018. Futurity, Science Translational Medicine.
  • Over at the University of Washington’s Ubiquitous Computing Lab, researchers expanded their jaundice detection system for babies, BiliCam, into BiliScreen, which examines the eyes for the earliest signs of jaundice. Jaundice is an early sign of pancreatic cancer as well as hepatitis and related diseases, and is conventionally screened for with a professionally administered blood test and analysis. The BiliScreen app is used with a smartphone camera and a 3D-printed box that controls the eye’s exposure to light. It correctly identified cases of concern 89.7 percent of the time, compared with the blood test currently used. As a non-invasive test, it can be used repeatedly for high-risk individuals, and remotely. Futurity, paper (PDF, 26 pages) presented September 13 at UbiComp 2017, the Association for Computing Machinery’s International Joint Conference on Pervasive and Ubiquitous Computing.

Hat tip on both to former Ireland Editor Toni Bunting.
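The paper describes BiliScreen’s actual pipeline (calibration box, machine learning model) in detail; as a purely illustrative toy sketch of the underlying idea — scoring how yellow the sclera appears from camera pixel samples — one might compute a simple “yellowness index” like this. The function and sample values below are hypothetical and not BiliScreen’s method:

```python
# Toy sketch only: BiliScreen's real pipeline is far more sophisticated.
# This just illustrates scoring scleral "yellowness" from RGB samples.

def yellowness_index(pixels):
    """Mean blue-channel deficit relative to red/green: higher = more yellow.

    `pixels` is a list of (r, g, b) tuples (0-255) sampled from the sclera.
    """
    if not pixels:
        raise ValueError("no pixels sampled")
    scores = [((r + g) / 2 - b) / 255 for r, g, b in pixels]
    return sum(scores) / len(scores)

# Hypothetical samples: a pale (white) sclera vs. a yellow-tinged one
white_sclera = [(230, 228, 225), (240, 238, 236)]
jaundiced = [(235, 220, 150), (228, 210, 140)]

assert yellowness_index(jaundiced) > yellowness_index(white_sclera)
```

In practice this is exactly why the 3D-printed box matters: without controlled lighting, raw pixel colors are dominated by ambient illumination rather than bilirubin levels.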

Artificial intelligence with IBM Watson, robotics pondered on 60 Minutes

This Sunday, the long-running TV magazine show 60 Minutes (CBS) aired a long Charlie Rose-led segment on artificial intelligence. It concentrated mainly on the good, with a little bit of the ugly thrown in. The longest part was on IBM Watson massively crunching oncology and genomics data and applying it to diagnosis. In a study of 1,000 cancer patients reviewed by the University of North Carolina at Chapel Hill’s molecular tumor board, Watson confirmed 99 percent of the doctors’ diagnoses as accurate, and found ‘something new’ in 30 percent. As a tool, it is still considered to be in its adolescence. Watson and data analytics technology have been a $15 billion investment for IBM, which can afford it, but by licensing the technology and through various partnerships, IBM has started to recoup it.

The ‘children of Watson’ are also starting to grow. Over at Carnegie Mellon, robotics is king and Google Glass is reading visual data to give clues on speeding up reaction time. At Imperial College, Maja Pantic is taking the early steps into artificial emotional intelligence with a huge database of facial expressions and interpretations. In Hong Kong, Hanson Robotics is developing humanoid robots, which may be part of the ‘ugly’, along with fears that AI may outsmart humans in the not-so-distant future. 60 Minutes video and transcript

Speaking of recouping, IBM Watson Health’s latest partnership is with Siemens Healthineers to develop population health technology and services to help providers operate in value-based care. Neil Versel at MedCityNews looks at that as well as the 60 Minutes segment. Added bonus: a few chuckles about Siemens Healthcare’s Disney-lite rebranding as Healthineers.

The Future of Medicine – Technology & the Role of the Doctor in 2025 – a brief summary

The following is a brief summary of a joint Royal Society of Medicine/Institute of Engineering & Technology event held at the Academy of Medical Sciences on 6th May. The event was organised, extremely professionally, by the IET events team. The last ticket was sold half an hour before the start, so it was a genuine sell-out.

The speakers for the event were jointly chosen by this editor and by Prof Bill Nailon, who leads the Radiotherapy Physics, Image Analysis and Cancer Informatics Group at the Department of Oncology Physics, Edinburgh and is also a practising consultant radiologist. As more of those invited by Prof Nailon were available than those invited by this editor, the day naturally ended up with a strong focus on advances in the many aspects of radiology as applied to imaging & treating cancer, as a surrogate for the wider examination of how medicine is changing.

The event began with a talk by Prof Ian Kunkler, Consultant Clinical Oncologist & Professor in Clinical Oncology at the Edinburgh Cancer Research Centre. Prof Kunkler began by evidencing his statement that radiotherapy delivers a 50% reduction in breast cancer recurrence compared with surgery alone. He stressed the importance of careful targeting of tumours with radiotherapy – not an easy task, especially if the patient is unavoidably moving (e.g. breathing) – CyberKnife enables much more precise targeting of tumours as it compensates for such movement. Apparently studies have shown that 55% of cancer patients will require radiotherapy at some point in their illness.

This was followed by Prof Joachim Gross, Chair of Systems Neuroscience, Acting Director of the Centre for Cognitive Neuroimaging & Wellcome Trust Senior Investigator, University of Glasgow, talking about magnetoencephalography (MEG), which enables imaging of the brain with excellent spatial & temporal resolution. However, it currently uses superconducting sensors that in turn require liquid helium, so it is very expensive to run. He then showed an atomic magnetometer which apparently is developing fast and will be a much cheaper alternative – he expects people will soon be able to wear such sensors embedded in a cap. He went on to show truly excellent graphics on decoding brain signals with incredible precision; he explained that the 2025 challenge is understanding how the different brain areas interact. Finally he described neurostimulation, using an alternating magnetic field with the same frequency as brain waves to change behaviour; whence the emergence of neuromodulation as a new therapy. Both exciting, and just a little scary.

Dr David Clifton, Lecturer, Dept of Engineering Science & Computational Informatics Group, University of Oxford, followed with a talk on real-time patient monitoring. He began by explaining the challenge clinicians face with the wall of patient data coming towards them: only ‘big data in healthcare’ approaches enable all the data generated by patients to be analysed, to identify the early warning signals that are so important to minimise death and maximise recovery.
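The summary above doesn’t specify Dr Clifton’s actual methods. As a minimal, hedged sketch of what turning a stream of vitals into an early warning signal can look like, here is a threshold-band scorer loosely in the spirit of NEWS-style early warning scores; the bands and point values below are simplified illustrations, not clinical guidance:

```python
# Illustrative sketch only: the scoring bands are simplified examples,
# not clinical guidance and not Dr Clifton's methods.

def score_vital(value, bands):
    """Return the points for the first (low, high, points) band containing value."""
    for low, high, points in bands:
        if low <= value <= high:
            return points
    return 3  # outside all listed bands: maximal concern

# Simplified example bands: (low, high, points) per vital sign
HEART_RATE_BANDS = [(51, 90, 0), (41, 50, 1), (91, 110, 1), (111, 130, 2)]
RESP_RATE_BANDS = [(12, 20, 0), (9, 11, 1), (21, 24, 2)]

def early_warning_score(heart_rate, resp_rate):
    """Aggregate score across vitals; higher totals trigger escalation."""
    return (score_vital(heart_rate, HEART_RATE_BANDS)
            + score_vital(resp_rate, RESP_RATE_BANDS))

assert early_warning_score(72, 16) == 0   # normal vitals
assert early_warning_score(118, 22) == 4  # tachycardia + tachypnoea: escalate
```

The research interest is precisely that fixed bands like these are crude: applied continuously to dense monitoring data they generate many false alarms, which is where data-driven, patient-adaptive models come in.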