This week’s list of data news highlights covers August 17-23, 2019, and includes articles about an algorithm that is helping refugees find the best places to live and an AI system that spots sharks in real time.
Researchers from Worcester Polytechnic Institute in the United States, Lund University in Sweden, and the University of Oxford in the UK have developed a machine-learning algorithm that matches refugees to locations where they will most likely find employment and enjoy a good quality of life. The algorithm uses data on the success of previous placements and a refugee’s health, age, education level, and language to increase an individual’s chance of finding a job within three months by 20 percent.
Airbus saved more than $50,000 in 2018 by using AI to review travel-and-expense reports from employees in three U.S. states and Mexico. The AI system can analyze reports and receipts in 100 languages to discern the vendor, type, and size of the expense. Airbus says the system, which reduces the time it takes to approve a report from weeks to days, could save the firm millions of dollars annually if implemented worldwide.
Researchers from the University of California, Berkeley, have created a 3D map of how the brain responds to words. The researchers used functional magnetic resonance imaging to measure the brain activity of nine individuals who read and listened to a story. The researchers then used natural language processing to identify relationships between words, finding that social words, such as “husband” and “sister,” sparked activity in the same area of the brain. The researchers also found significant similarities between brain activity when reading and when listening to stories, which contradicts the traditional assumption that the two would show more pronounced differences.
Researchers from Stanford University have developed stretchable stickers with sensors that individuals can place on their skin to wirelessly monitor basic data about their health. The stickers have tiny motion sensors that can measure respiration, a person’s pulse, and arm and leg movements. The stickers transmit data to a battery-powered receiver that individuals clip onto their clothing.
The Ripper Group, an Australian technology company, and the University of Technology Sydney have developed an AI system that automatically spots sharks and crocodiles in video in real time. The system has helped spot crocodiles in Queensland by analyzing data from cameras mounted on drones. The system can help lifeguards and other rescuers identify when swimmers may be in danger.
Researchers from Google have developed an AI system that can understand some sign language symbols. The researchers trained the system on 30,000 images of hands, each labeled with the 3D coordinates of 21 key points. The system uses these coordinates to determine the pose of the hand and compares it to sign language symbols for letters, numbers, and words. Unlike other similar systems, this system is lightweight enough to run in real time on mobile phones.
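To illustrate the general idea of matching a set of hand landmarks against known signs, here is a minimal sketch that classifies a pose by nearest-neighbor distance to stored templates. This is purely illustrative: the template values are random placeholders, and the real system is a trained neural model, not a template matcher.

```python
import math
import random

# Hypothetical sketch: classify a hand pose by nearest-neighbor matching of
# 21 hand landmarks (each an (x, y, z) coordinate, flattened to 63 numbers)
# against stored templates. Template values are random placeholders, not
# real sign-language poses.
random.seed(0)
templates = {sign: [random.random() for _ in range(63)] for sign in ["A", "B", "C"]}

def classify_pose(landmarks):
    """Return the sign whose template landmarks are closest in Euclidean distance."""
    return min(templates, key=lambda s: math.dist(templates[s], landmarks))

# A pose slightly perturbed from template "B" should still match "B".
noisy_b = [v + random.uniform(-0.01, 0.01) for v in templates["B"]]
print(classify_pose(noisy_b))  # "B"
```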
Cogito, a Boston start-up that creates emotional intelligence software, has developed an AI system that assists call center employees in their conversations with customers. The system analyzes cues such as the pitch, tone, and rhythm of voices to provide tips that make employees more effective, such as speaking more slowly and showing more empathy. The system offers its suggestions in real time during calls.
Researchers from the University of Science and Technology of China and the University of Vienna have sent a record-breaking amount of data in a quantum form. The researchers sent the data using quantum teleportation, in which a sender and receiver each have one of a pair of entangled qubits, and the receiver deciphers the data by using measurements of the interactions between qubits. The researchers were able to send more information by using qutrits, which can exist in a superposition of three states (0, 1, and 2), whereas the qubits more traditionally used in quantum computing can only be in a superposition of two states (0 and 1).
Researchers from multiple groups, including MIT, have developed a machine learning system that can analyze particle collisions significantly faster than existing methods. The researchers trained the system to accurately recognize top quarks, the heaviest known elementary particle, and it can process 600 images per second, compared to less than one image per second for other methods. The system can help researchers analyze data from the Large Hadron Collider, which creates 40 million collisions every second.
Researchers from the University of Kentucky are developing autonomous drones that use AI to identify and measure the health of individual cattle, which is difficult for farmers to do regularly because of the vast fields cattle roam. The researchers have created a cattle pen that has 40 cameras to take 360-degree images, which they are using to teach the drone’s AI system to identify individual cattle and estimate body mass. More than two million cattle die each year in the United States due to health issues, and this system could help farmers more quickly identify cattle with health issues.
Image: Albert Kok