This week’s list of data news highlights covers October 19-25, 2019, and includes articles about monitoring head impacts in real time and automatically tracking forest fires.
Researchers from multiple U.S. universities have developed a neural network that can read the thoughts of paralyzed patients. The researchers trained the neural network using data from microelectrodes that recorded the brain activity of an individual imagining he was moving his arm and hand to write letters of the alphabet. When the researchers incorporated the algorithm into a brain-computer interface, the computer read the volunteer’s thoughts at 66 characters per minute with 92 percent accuracy.
Researchers from the University of California, San Francisco, and the University of California, Berkeley, have developed software that uses AI to detect acute brain hemorrhages, which can be difficult for doctors to spot. The researchers trained the system on 4,400 CT scans with known diagnoses, enabling it to identify scans showing an acute hemorrhage with 99 percent accuracy.
Spanish car maker SEAT has developed a drone that detects road hazards and communicates them to connected vehicles. The drone uses sensors, cameras, and AI to detect obstacles ahead of drivers, such as cyclists, and display them on the cars’ screens. SEAT is trialing the system, which could help reduce accidents in rural areas with poor visibility, in Robledillo de la Jara, a small Spanish village.
OPRO+, a UK firm that creates mouthguards, has developed a mouthguard that uses sensors to measure the forces of impacts to athletes’ heads. The mouthguard’s sensors track linear and rotational acceleration and can transmit the data to a computer in real time. British rugby teams are using the mouthguards, which have helped coaches adjust practice drills according to the level of forces players are experiencing.
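The core computation behind such impact tracking is straightforward: combine the readings from a three-axis accelerometer into a single magnitude and watch for peaks. The sketch below illustrates this with made-up sample values; it is not based on OPRO+’s actual firmware or data format.

```python
import math

# Hypothetical 3-axis accelerometer samples in units of g (made-up values,
# not real mouthguard data). Each tuple is one (x, y, z) reading.
samples = [
    (0.1, 0.0, 1.0),   # at rest: roughly 1 g from gravity
    (5.2, 3.1, 0.8),   # a sharp impact
    (0.2, 0.1, 1.0),   # back to rest
]

# Peak linear-acceleration magnitude across all samples.
peak = max(math.sqrt(x * x + y * y + z * z) for x, y, z in samples)
print(round(peak, 2))  # prints 6.11
```

A real system would also integrate gyroscope readings for rotational acceleration and stream both over a radio link, but the peak-magnitude calculation is the same idea.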
Philips and the U.S. Department of Defense have developed an AI system that can detect infections 48 hours before symptoms appear. The system analyzes 165 biomarkers, and the researchers trained it on a dataset of 41,000 hospital infection cases. The system predicts whether an infection is present with 85 percent accuracy.
Liwonde National Park in Malawi has used an AI-enabled program called EarthRanger to reduce poaching. The program analyzes data on the location of elephants, snares, and human footprints to find patterns in poaching behavior, such as an increase in poaching before the holidays. This process has helped the park better allocate its resources, such as where to set up checkpoints. The park also uses cameras equipped with an algorithm to detect if a moving object is a human or animal, which helped the park catch a well-known poacher.
Fujitsu has developed an AI-powered tool that can accurately detect subtle changes in expression. This tool improves on other facial expression systems by converting pictures taken from different angles to frontal shots, which allows the system to detect facial movements such as a raised cheek more easily. The tool can detect expressions that other systems have found challenging, such as nervousness or confusion, including nervous laughter. The tool could help develop robots that are capable of recognizing human emotions.
Fannie Mae has used an AI-enabled tool developed by Moogsoft, a start-up based in San Francisco, to reduce the number of monthly issues its IT staff handles by one third. The tool uses machine learning to track patterns and isolate the causes of crashes or other malfunctions, allowing the IT staff to fix the underlying issue, rather than repeatedly respond to the errors it causes.
Researchers from CrowdAI, a start-up based in Silicon Valley, the U.S. Department of Defense, and the California Air National Guard have developed a system that uses AI to track the spread of wildfires. The researchers trained the system on thousands of frames from videos of wildfires, and the system can analyze 20 frames per second with 92 percent accuracy. This speed allows the system to plot the perimeter of a fire in near-real-time, which is significantly faster than humans.
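Once a model has labeled each pixel of a video frame as fire or not-fire, plotting the perimeter reduces to finding the fire pixels that border non-fire pixels. The sketch below shows that post-processing step on a made-up grid; it is an illustration of the general technique, not CrowdAI’s implementation.

```python
def perimeter_cells(mask):
    """Return (row, col) fire cells that have at least one non-fire
    4-neighbor (or sit on the frame edge) -- i.e., the fire's perimeter."""
    rows, cols = len(mask), len(mask[0])
    edge = []
    for r in range(rows):
        for c in range(cols):
            if not mask[r][c]:
                continue  # skip non-fire pixels
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(
                nr < 0 or nr >= rows or nc < 0 or nc >= cols or not mask[nr][nc]
                for nr, nc in neighbors
            ):
                edge.append((r, c))
    return edge

frame_mask = [  # hypothetical per-pixel output: 1 = classified as fire
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(perimeter_cells(frame_mask))  # every fire cell except the center (2, 2)
```

Running this per frame, at the frame rates the article describes, is what turns pixel classifications into a near-real-time fire perimeter.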
Google has used BERT, its state-of-the-art language model, to improve how its core search algorithm handles difficult queries. Google trained BERT by having it guess missing words in chunks of text, a task that teaches the model to use context. The changes will help the search engine provide better results for roughly 10 percent of search queries, such as “How old was Taylor Swift when Kanye went onstage during the 2009 VMAs and interrupted the award ceremony?”
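The missing-word (“cloze”) objective can be illustrated without a neural network at all: score candidate words for a blank by how often they appear in the same context in training text. The toy sketch below uses context-triple counts over a made-up corpus; BERT learns the same kind of signal, but with deep contextual embeddings over billions of words.

```python
from collections import Counter

# Tiny made-up corpus (illustrative only; BERT trains on billions of words).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat slept on the mat",
]

# Count (left_word, word, right_word) context triples seen in the corpus.
triples = Counter()
for sentence in corpus:
    words = sentence.split()
    for i in range(1, len(words) - 1):
        triples[(words[i - 1], words[i], words[i + 1])] += 1

def fill_mask(left, right, vocab):
    """Pick the vocabulary word most often observed between `left` and `right`."""
    return max(vocab, key=lambda w: triples[(left, w, right)])

vocab = {w for s in corpus for w in s.split()}
print(fill_mask("sat", "the", vocab))  # prints 'on' -- seen twice in that slot
```

A model trained this way learns that the words around a blank constrain what fits in it, which is the “context” the article says helps the search algorithm interpret queries.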