This week’s list of data news highlights covers October 12-18, 2019, and includes articles on using smart speakers to monitor the health of infants and using AI to prevent harmful drug interactions.
Researchers from AI research group OpenAI have developed an AI system that designed its own training regime and taught a robot hand to solve a Rubik’s cube. The system trained using reinforcement learning in virtual simulations, automatically altering parameters of the virtual environments, such as the strength of gravity, once it reached a level of mastery in a particular environment. This process helped the robot hand solve the cube in real life under conditions it did not train on, such as with a few of its fingers bound together.
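The core loop described above, widening the range of randomized simulation parameters each time the policy masters the current range, can be sketched as follows. This is a toy illustration of the idea, not OpenAI’s actual implementation; the function names, thresholds, and step sizes are invented for the example.

```python
import random

def adr_step(rand_range, success_rate, widen=0.1, threshold=0.8):
    """Widen the randomization range once the policy masters it.
    Toy sketch of automatic domain randomization: if the policy's
    success rate passes the threshold, the environment gets harder
    by sampling parameters from a wider range. Numbers are illustrative."""
    lo, hi = rand_range
    if success_rate >= threshold:
        lo -= widen
        hi += widen
    return (lo, hi)

def sample_gravity(rand_range, base=-9.81):
    """Sample a perturbed gravity value from the current range."""
    lo, hi = rand_range
    return base + random.uniform(lo, hi)

# Start with no randomization; widen only after the policy succeeds.
r = (0.0, 0.0)
r = adr_step(r, success_rate=0.9)  # mastered, so the range widens
r = adr_step(r, success_rate=0.5)  # not mastered, range unchanged
```

Training across ever-wider parameter ranges is what lets the resulting policy cope with real-world conditions, such as bound fingers, that never appeared in training.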
Researchers from the University of Washington have developed a smart-speaker app that can monitor an infant’s breathing. The app plays white noise and records its reflections to detect the location and movement of the infant, analyzing the changes in the reflected noise caused by the infant’s breathing. During testing on infants in a neonatal intensive care unit, the app detected respiratory rates that closely matched those detected by standard vital sign monitors.
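Once the reflected noise has been converted into a chest-motion signal, estimating a respiratory rate reduces to finding the signal’s dominant periodicity. The sketch below shows one simple way to do that by counting breath cycles; it is illustrative only, and the demodulation of the reflected white noise into a motion signal (the hard part of the real system) is not shown.

```python
import math

def respiratory_rate(samples, fs):
    """Estimate breaths per minute from a chest-motion signal by
    counting rising zero crossings (one per breath cycle).
    Illustrative sketch: the real app first recovers this motion
    signal from reflected white noise, which is not modeled here."""
    mean = sum(samples) / len(samples)
    x = [s - mean for s in samples]
    rises = sum(1 for a, b in zip(x, x[1:]) if a < 0 <= b)
    return rises / (len(samples) / fs) * 60.0

# Simulated chest motion breathing at 0.8 Hz (48 breaths per minute),
# sampled at 40 Hz for 30 seconds.
sim = [math.sin(2 * math.pi * 0.8 * k / 40 - 0.1) for k in range(1200)]
print(respiratory_rate(sim, fs=40))  # → 48.0
```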
Researchers from Pennsylvania State University have developed a system that uses machine learning to help healthcare providers identify dangerous interactions between drugs. The system uses a neural network to analyze U.S. Food and Drug Administration adverse event reports, allowing it to learn how combinations of drugs affect the liver. The system only flags interactions that have severe effects, such as hospitalization, death, or disability, to reduce the chance that providers experience alarm fatigue from a high number of benign alerts.
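The severity-filtering step described above can be illustrated with a simple sketch: surface only drug pairs that repeatedly co-occur in reports with serious outcomes. This is not the PSU team’s neural network, and the field names are hypothetical; it only demonstrates how filtering on outcome severity keeps benign co-occurrences from generating alerts.

```python
from collections import Counter
from itertools import combinations

# Outcome labels treated as serious; illustrative, not FDA terminology.
SERIOUS = {"hospitalization", "death", "disability"}

def flag_pairs(reports, min_reports=3):
    """Return drug pairs that co-occur in at least min_reports
    adverse event reports with a serious outcome. Reports with
    benign outcomes are ignored, reducing alarm fatigue."""
    counts = Counter()
    for report in reports:
        if report["outcome"] in SERIOUS:
            for pair in combinations(sorted(report["drugs"]), 2):
                counts[pair] += 1
    return {pair for pair, n in counts.items() if n >= min_reports}

reports = [
    {"drugs": ["drug_a", "drug_b"], "outcome": "death"},
    {"drugs": ["drug_a", "drug_b"], "outcome": "hospitalization"},
    {"drugs": ["drug_a", "drug_b"], "outcome": "disability"},
    {"drugs": ["drug_a", "drug_c"], "outcome": "rash"},  # benign, ignored
]
print(flag_pairs(reports))  # → {('drug_a', 'drug_b')}
```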
The City of Bristol in England is using an algorithm from IBM to assess citizens’ risk of being exposed to harms such as being sexually exploited or going missing. The algorithm analyzes data from the police, National Health Service, and the Department for Work and Pensions to create 0 to 100 scores for individuals. For example, the algorithm analyzes 80 factors from the records of previously abused individuals, such as whether they had previously gone missing, to produce risk scores for sexual exploitation. The algorithm helps social workers identify individuals and families that should receive more support.
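The article does not describe IBM’s model, but a 0 to 100 score built from many case-record factors can be pictured as a normalized weighted sum of risk indicators. The sketch below is purely illustrative: the factor names and weights are invented, and the real system’s scoring method is not public.

```python
def risk_score(factors, weights):
    """Hypothetical 0-100 risk score: a weighted sum of binary risk
    factors, normalized by the maximum possible weighted sum. This
    is an invented illustration, not IBM's actual model."""
    total = sum(weights.values())
    raw = sum(weights[f] for f, present in factors.items() if present)
    return round(100 * raw / total)

# Invented factor names and weights for illustration only.
weights = {"previously_missing": 3, "school_absence": 2, "known_contact": 5}
factors = {"previously_missing": True, "school_absence": False, "known_contact": True}
print(risk_score(factors, weights))  # → 80
```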
The U.S. Department of Agriculture (USDA) has partnered with Microsoft on a pilot using sensors, drones, satellite data, and IoT-enabled farm equipment to provide farmers and researchers near-real-time data on farm conditions. The pilot is taking place at a 7,000-acre farm in Maryland, and will track soil acidity, crop heights, and precipitation, among other types of data. Until this pilot, USDA researchers recorded data points in field books, a practice that not only cut into the time researchers had for analysis but also caused the USDA to lose data when researchers retired or left for other positions.
Foodvisor, a French start-up, has developed a mobile app that uses AI to analyze images of individuals’ meals to automatically log nutritional information, such as the number of calories, fat, and protein in a meal. The app uses image recognition to detect the type of food and uses a smartphone’s camera autofocus data to estimate the distance between the phone and the meal to calculate the volume of each food item.
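The distance-based volume estimate described above rests on standard pinhole-camera geometry: a region’s real-world size equals its pixel size times the camera-to-subject distance divided by the focal length (in pixels), so area scales with the square of that ratio. The sketch below shows only this geometric step with invented parameter values; Foodvisor’s actual pipeline is not public.

```python
def region_area_cm2(pixel_area, distance_cm, focal_px):
    """Convert a detected food region's pixel area to real-world
    area using the pinhole-camera relation
        real_length = pixel_length * distance / focal_length,
    so area scales with (distance / focal_length) squared.
    Illustrative sketch; parameter values below are invented."""
    return pixel_area * (distance_cm ** 2) / (focal_px ** 2)

# A 200x200 px food region, 40 cm from the lens, 1000 px focal length.
print(region_area_cm2(200 * 200, 40.0, 1000.0))  # → 64.0
```

Multiplying such an area by an estimated depth for the recognized food type would give a volume, which is then mapped to calories and macronutrients.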
Guiding Eyes for the Blind, a group that trains guide dogs, IBM, and North Carolina State University have collaborated to use data to identify the best guide dogs. The dogs wear collars with a sensor that collects data on how the dog is moving and how often it makes noises. The researchers have also developed a vest with heart monitors and other sensors to assess temperament.
Researchers from Carnegie Mellon University have developed a system that uses AI to help autonomous drones film visually appealing scenes. The researchers trained the system using data from a study in which subjects rated scenes based on their visual appeal. The system learned that constant backshots, a popular type of drone shot, bore viewers after a while, but that switching between shots too often led to viewer fatigue.
The City of Hull in England is implementing a smart city dashboard that integrates real-time data from sensors it has distributed across the city. This data includes information on congestion, air pollution, and waste levels in smart trash bins, which Hull previously stored in siloed systems. The city will use the dashboard to improve programs to collect trash, manage parking and congestion, and control street lighting. The city also plans to open the dashboard to the public.
Eyenuk, a start-up based in Los Angeles, has developed software that uses AI to diagnose diabetic retinopathy, which can lead to blindness, with 96 percent accuracy. The software achieved this accuracy while screening 893 patients with diabetes at 15 different medical locations. The system analyzes images of undilated pupils to detect the disease, which it can do in 60 seconds.
Image: Honza Groh