This week’s list of data news highlights covers February 29-March 6, 2020, and includes articles about tracking the spread of the coronavirus and improving the accuracy of mammograms.
Researchers from the University of Michigan have developed a system that uses machine learning to allow amputees to control their prosthetic hands. The researchers trained the system to interpret an individual amputee's nerve signals within minutes, allowing the amputees to move each finger and pick up objects as small as miniature play bricks. The prosthetic hand worked without recalibration throughout the study's entire 300 days of testing.
Researchers from Harvard University are using an AI system to help organizations such as the World Health Organization track the spread of coronavirus. The system, called HealthMap, uses data it gathers from the Internet—including Google searches, social media, blog posts, and chat room interactions—to track and predict the spread of infectious diseases. The system also helps track public perception of the virus.
Researchers led by a scientist from Sage Bionetworks, a Seattle-based nonprofit that produces biomedical research, have found that radiologists can interpret mammogram results more accurately with the assistance of an AI system. The researchers analyzed results from the Digital Mammography DREAM Challenge, a crowdsourced competition that challenged participants to train an AI system to screen for breast cancer.
A YMCA in eastern Pennsylvania is testing a robotic lifeguard that uses AI to detect when someone might be drowning in a swimming pool. The robot, called the Coral Manta 3000, can recognize a human head under the water and emits an alarm when a human has been motionless underwater for longer than 15 seconds. Other YMCAs across the country plan to use the robot if it succeeds.
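The Coral Manta 3000's actual detection algorithm is not public, but the alarm logic the article describes—sound an alert once a detected head has been motionless underwater for longer than 15 seconds—can be sketched as a simple timer check. The function names here are hypothetical, standing in for the robot's vision system:

```python
# Minimal sketch of the alarm timing described above. 'motionless_since' is
# the timestamp (in seconds) when a submerged head was first seen motionless,
# or None when no motionless head is currently detected by the vision system.

MOTIONLESS_ALARM_SECONDS = 15  # threshold described in the article

def should_alarm(motionless_since, now, threshold=MOTIONLESS_ALARM_SECONDS):
    """Return True once the motionless interval exceeds the threshold."""
    return motionless_since is not None and (now - motionless_since) > threshold
```

For example, `should_alarm(10.0, 20.0)` is False (only 10 seconds have elapsed), while `should_alarm(10.0, 26.0)` is True.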
An Oregon-based nonprofit, Wild Me, has developed an AI system that can identify individual animals using their unique markings, allowing conservationists and scientists to monitor threatened or endangered species. The staffers at Wild Me train the AI system using photos of different animals until it can reliably tell them apart. The technology is especially useful for animals like whales that migrate long distances, making them difficult for a single research team to monitor.
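Systems like Wild Me's typically work by converting each photo into a numeric feature vector and matching new sightings against a catalog of known individuals. Wild Me's specific pipeline isn't detailed in the article, so the following is only a generic nearest-match sketch with made-up data:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(embedding, catalog):
    """Return the ID of the cataloged individual whose reference vector
    is most similar to the new photo's embedding (catalog is hypothetical)."""
    return max(catalog, key=lambda animal_id: cosine_similarity(embedding, catalog[animal_id]))

catalog = {"whale_a": [1.0, 0.0], "whale_b": [0.0, 1.0]}
identify([0.9, 0.1], catalog)  # matches "whale_a"
```

Matching against a shared catalog is what lets separate research teams recognize the same migrating individual from photos taken thousands of miles apart.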
Engineers from Rutgers University have created a robot that can draw blood or insert catheters more accurately than humans. The robot uses AI and near-infrared and ultrasound imaging to accurately pinpoint blood vessels and insert a needle or catheter to draw blood or deliver drugs and fluids. This could help reduce the risk of complications and delays in treatment, especially for children, the elderly, the chronically ill, and trauma patients, who are more likely to have small, twisted, rolling, or collapsed blood vessels that make access difficult.
Researchers from Google have created an AI-enabled robot that taught itself how to walk. The four-legged robot learned to walk forward, backward, and turn to the left and right within hours. The robot uses a deep reinforcement learning algorithm and was able to walk on flat ground, a mattress, and a doormat with crevices.
Facebook has created a new machine learning system called Deep Entity Classification to detect fake accounts. The system analyzes 20,000 features to characterize an account, including the average age and gender distribution of a profile’s friends. The system can detect four types of fake profiles, including spammers who trick users into providing personal information.
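The aggregate friend statistics mentioned above—Facebook calls them "deep features" because they are computed over an account's neighbors rather than the account itself—can be illustrated with a toy example. The field names and data below are hypothetical; the real system computes roughly 20,000 such features:

```python
from statistics import mean

def deep_features(account):
    """Compute a few aggregate features over an account's friends, in the
    spirit of Deep Entity Classification (field names are hypothetical)."""
    friends = account["friends"]
    ages = [f["age"] for f in friends]
    return {
        "friend_count": len(friends),
        "avg_friend_age": mean(ages) if ages else 0.0,
        "share_female_friends": (
            sum(f["gender"] == "f" for f in friends) / len(friends)
            if friends else 0.0
        ),
    }

account = {"friends": [{"age": 20, "gender": "f"}, {"age": 40, "gender": "m"}]}
deep_features(account)  # avg_friend_age: 30, share_female_friends: 0.5
```

A classifier trained on features like these can flag accounts whose friend networks look statistically abnormal—for instance, spammers whose "friends" skew implausibly toward one demographic.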
Exyn Technologies, a startup based in Pennsylvania, has developed autonomous drones that can create 3D maps of mines in real time, allowing firms to avoid sending humans into potentially dangerous areas. The drones, which use a series of sensors and AI to navigate, mapped a historic Finnish gold mine in less than four days. The map makes it easier to assess the remaining ore in heavily restricted areas.
Researchers from Case Western Reserve University have developed an AI system that can predict which prostate cancer patients will respond to chemotherapy or immunotherapy and whether their cancer will return. The system analyzes images of cancer tissues to make its predictions and revealed that there are cellular differences between black and white cancer patients. The researchers found that developing a race-specific model for predicting which black patients would have a recurrence of cancer increased the system's accuracy.
Image: Jim Clark