This week’s list of data news highlights covers May 2-8, 2020, and includes articles about a new technique to train image classification algorithms and a wearable that detects COVID-19 symptoms.
The Houston Methodist Hospital in Texas is using an AI-enabled system to monitor COVID-19 patients remotely. The system feeds data from ventilators, electrocardiogram machines, oxygen pumps, and electronic health records to machine learning algorithms that assess the status of patients. The system reduces the number of times healthcare personnel must come into contact with patients, and thus how often they must dispose of or sanitize protective equipment.
Researchers from Carnegie Mellon University have created a new technique that trains image classification neural networks by increasing the specificity of data labels in stages. The researchers trained the algorithms repeatedly on the same images, progressing from using broad to specific labels for objects. This training process increased classification accuracy by up to seven percent.
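The staged coarse-to-fine idea can be sketched with a toy classifier. This is an illustrative assumption, not the researchers' actual method: the two-level label hierarchy, the 2-D "image" features, and the nearest-centroid model are all hypothetical stand-ins, and the coarse stage influences the fine stage only through a simple centroid blend (a loose analogue of reusing network weights across stages).

```python
# Hedged sketch of staged coarse-to-fine label training. The hierarchy,
# sample features, and nearest-centroid "model" are illustrative assumptions.
from collections import defaultdict

# Hypothetical two-level label hierarchy: fine label -> broad label.
HIERARCHY = {"dog": "animal", "cat": "animal", "car": "vehicle", "truck": "vehicle"}

def train_centroids(samples, labels):
    """Fit a nearest-centroid classifier: one mean 2-D vector per label."""
    sums = defaultdict(lambda: [0.0, 0.0])
    counts = defaultdict(int)
    for (x, y), lbl in zip(samples, labels):
        sums[lbl][0] += x
        sums[lbl][1] += y
        counts[lbl] += 1
    return {lbl: (s[0] / counts[lbl], s[1] / counts[lbl]) for lbl, s in sums.items()}

def predict(centroids, point):
    """Return the label whose centroid is closest to the point."""
    return min(centroids, key=lambda lbl: (centroids[lbl][0] - point[0]) ** 2
                                        + (centroids[lbl][1] - point[1]) ** 2)

def staged_training(samples, fine_labels):
    """Train on the same samples twice: first with broad labels, then with
    fine labels, nudging each fine centroid toward its broad parent."""
    coarse_labels = [HIERARCHY[lbl] for lbl in fine_labels]
    coarse_model = train_centroids(samples, coarse_labels)   # stage 1: broad
    fine_model = train_centroids(samples, fine_labels)       # stage 2: specific
    blended = {}
    for lbl, (cx, cy) in fine_model.items():
        px, py = coarse_model[HIERARCHY[lbl]]
        blended[lbl] = ((cx + px) / 2, (cy + py) / 2)
    return blended
```

The same images are seen in both stages; only the label granularity changes, mirroring the broad-to-specific progression described above.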
Argonne National Laboratory is using a deep learning model and a specialized computer to predict which molecules may be able to bind to the coronavirus's proteins and block its spread. The computer uses an AI chip that is roughly 60 times larger than most chips. The chip's size allows the researchers to run a neural network solely on the chip, instead of dispersing it across multiple processors. This allows data to travel shorter distances, speeding up how quickly the network can process information.
Kernel, a California-based startup, has developed helmet-shaped devices that can analyze brain activity. One device measures electromagnetic activity while another uses light to gauge blood movement. The devices, which can identify activity such as what song a person is listening to, use custom microchips and algorithms.
Researchers from the University of Washington have developed an AI-enabled system that can recreate an object's surrounding environment. The researchers trained the system using videos from a handheld camera, teaching it to infer an object's surroundings from the patterns and motion of light reflecting off it. The system, which can approximate what an object will look like in different lighting, could help researchers develop more realistic augmented reality environments, such as showing how a piece of furniture would look in a room.
VetNow, a veterinary telemedicine company, and the Smithsonian National Zoo are partnering to detect animal disease outbreaks remotely. VetNow has developed a toolkit and software platform that allows veterinarians to share data, including blood sample results and images of sick animals' eyes and ears. Smithsonian scientists will analyze the data in real-time in cases where the veterinarians cannot make a diagnosis, and researchers will analyze the data to detect patterns in symptoms and diagnoses.
Ibex Medical Analytics, a startup based in Israel, has developed an AI-enabled system that helps pathologists spot cancer in tissue biopsies. The system analyzes cases in parallel with pathologists, alerting them if it reaches a different diagnosis. The firm is also developing AI-enabled software, which it trained on more than 60,000 biopsy slides, that can detect prostate cancer before a pathologist reviews a slide.
Researchers from Northwestern University and the Shirley Ryan AbilityLab, a rehabilitation hospital in Chicago, have developed a wearable device that can detect the early symptoms of COVID-19. The device, which sits at the bottom of a patient’s throat, detects an individual’s coughing intensity, coughing patterns, heart rate, and body temperature.
France is using AI-enabled software to detect whether people are wearing masks on public transportation. The software analyzes security camera footage, generating statistics about the percentage of individuals wearing masks in 15-minute intervals. The software can help authorities anticipate areas that could see a growth in COVID-19 cases.
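The reporting step described above amounts to bucketing per-person observations into 15-minute windows. A minimal sketch, assuming the detection events (a timestamp plus a mask/no-mask flag per person) have already been extracted from the camera footage; the function name and input format are hypothetical, not the actual software's API:

```python
# Illustrative sketch: aggregate per-person mask detections into the
# percentage of people wearing masks in each 15-minute interval.
from collections import defaultdict

INTERVAL_SECONDS = 15 * 60  # 15-minute reporting window

def mask_rate_by_interval(detections):
    """detections: iterable of (unix_timestamp, is_wearing_mask) pairs.
    Returns {interval_start_timestamp: percent_of_people_masked}."""
    masked = defaultdict(int)
    total = defaultdict(int)
    for ts, wearing in detections:
        # Floor the timestamp to the start of its 15-minute bucket.
        bucket = int(ts) // INTERVAL_SECONDS * INTERVAL_SECONDS
        total[bucket] += 1
        if wearing:
            masked[bucket] += 1
    return {bucket: 100.0 * masked[bucket] / total[bucket] for bucket in total}
```

Per-interval percentages like these could then be compared across stations to flag areas with low mask usage.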
Apple has developed a new feature that will allow iPhone or Apple Watch users to automatically share their health data when they make an SOS call to an emergency service. The data can include information such as a person's medical conditions, allergies, and medications taken. Healthcare personnel can use the data to choose proper treatments, such as whether to use a particular drug.
Image: Kurt Klinner