This week’s list of data news highlights covers August 8 – August 14, 2020, and includes articles about translating languages with a smart mask and employing supercomputers to simulate high-frequency earthquakes.
An international team of researchers from Northwestern University, Ben Gurion University, Harvard University, and the Massachusetts Institute of Technology has developed an AI-enhanced medical approach to identify a specific autism subtype, dyslipidemia-associated autism, early on. The team first created a map of the genes that work together during brain development and then used an AI algorithm to identify genetic mutations that contribute to the development of the subtype. Previously, doctors could diagnose this type of autism only on the basis of symptoms.
Researchers from Columbia University and the University of Pittsburgh have developed a machine learning tool that can predict the likelihood of a premature infant developing a life-threatening intestinal disease called necrotizing enterocolitis (NEC). The researchers trained the tool to identify shifts in the gut microbiome that signal NEC using stool samples from 161 premature infants, 45 of whom were known to have developed NEC.
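The study's actual model is not described in detail above; purely as a minimal sketch, the approach of learning a risk score from microbiome composition can be illustrated with a tiny logistic-regression classifier trained on synthetic data (the feature matrix, labels, and hyperparameters below are all invented stand-ins, not the researchers' method):

```python
import numpy as np

# Synthetic stand-in data: 161 "infants", each described by the relative
# abundance of 10 hypothetical bacterial taxa in a stool sample.
rng = np.random.default_rng(0)
X = rng.random((161, 10))
y = (X[:, 0] > 0.7).astype(float)  # invented label: 1 = developed NEC

# Plain gradient descent on a logistic-regression risk model.
w, b = np.zeros(10), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted NEC probability
    grad = p - y                          # gradient of log-loss
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

# Per-infant risk scores from the trained model.
risk = 1 / (1 + np.exp(-(X @ w + b)))
print(round(float(((risk > 0.5) == y).mean()), 2))  # training accuracy
```

In practice a clinical tool would use real microbiome sequencing features, a held-out test set, and calibration, but the core idea of mapping sample composition to a risk probability is the same.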
Google has updated Lookout, a mobile app that uses smartphone cameras to identify surroundings for people with low vision or blindness, to assist in the grocery store. Users can take pictures of food labels or their barcodes, and the app will read aloud the brand, product type, and flavor. The update also allows users to scan paper forms and other long documents to be read aloud.
Researchers at Nanyang Technological University in Singapore have developed an AI system that can better detect and recognize hand gestures, such as a thumbs up. AI gesture recognition systems typically combine data from wearable sensors with visual data to train computer vision models, but existing systems often lack accuracy because bulky wearable sensors produce poor data and inefficient computer vision models poorly match visual and sensory inputs. The researchers’ new system improves on this by using nanotechnology to build thinner wearable sensors and using AI to better integrate the visual and sensory inputs. In a test setting, the system recognized hand gestures with 96.7 percent accuracy.
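The researchers' architecture is not detailed above; as a hedged illustration of the general idea of integrating visual and sensory inputs, the sketch below shows a simple "late fusion" scheme in which features from each modality are extracted separately and concatenated for a downstream gesture classifier (both feature extractors are invented placeholders, not the NTU system):

```python
import numpy as np

def visual_features(frame: np.ndarray) -> np.ndarray:
    # Placeholder vision features: per-channel mean intensity of a camera frame.
    return frame.mean(axis=(0, 1))

def sensor_features(readings: np.ndarray) -> np.ndarray:
    # Placeholder wearable-sensor features: mean and spread of a strain trace.
    return np.array([readings.mean(), readings.std()])

def fuse(frame: np.ndarray, readings: np.ndarray) -> np.ndarray:
    # Late fusion: concatenate both modalities into one feature vector
    # that a gesture classifier would score per gesture class.
    return np.concatenate([visual_features(frame), sensor_features(readings)])

frame = np.zeros((64, 64, 3))   # dummy camera frame
readings = np.ones(50)          # dummy wearable-sensor trace
features = fuse(frame, readings)
print(features.shape)           # (5,)
```

The design choice here is that fusing at the feature level lets one modality compensate when the other is noisy, which is the benefit the researchers attribute to better integration of visual and sensory data.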
Japanese startup Donut Robotics has created a smart face mask that can translate spoken Japanese into English, Chinese, French, Indonesian, Korean, Spanish, or Vietnamese. The smart mask is made of white plastic and silicone, and has an embedded microphone that links to a user’s smartphone.
Researchers from the Massachusetts Institute of Technology have collaborated with Microsoft to develop an AI algorithm called MosAIc that can identify connections between art pieces from different cultures, artists, and mediums. MosAIc uses machine learning to spot similarities between images and can help curators create exhibitions and help historians study patterns in art history.
Researchers at the University of Saskatchewan and Agriculture and Agri-Food Canada have decoded the entire genome of the black mustard plant. Using a genome sequencing technique, researchers pinpointed which genes are responsible for specific traits in the black mustard plant, such as fungal resistance, enabling researchers to more easily select genes that are favorable for breeding and increase crop production.
Scientists at the University of California, Irvine have created a public website that provides up-to-date statistics on coronavirus infections in California counties. Using information collected from the California Open Data Portal, the website provides county-to-county comparisons on the number of COVID-19 hospitalizations and patients in the intensive care unit, and the number of daily cases and deaths. The website also contains color-coded maps by county that illustrate patterns of case growth and the rate of positive tests.
Agricultural equipment company Blue River Technology in California is using machine learning to train robotic crop sprayers to identify weeds. Using high-resolution cameras and computer vision, the robotic crop sprayers take pictures of cropland, analyze the pictures for any weeds, and spray herbicides only on identified weeds. This can reduce the amount of herbicide used to control weeds and promote more sustainable agricultural practices.
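Blue River's production system relies on trained deep-learning models; as a minimal sketch only, the spray-on-detected-weeds-only logic can be illustrated with a stand-in detector that flags green-dominant pixels and maps them to nozzle activations (the thresholding detector and nozzle layout below are hypothetical simplifications):

```python
import numpy as np

def detect_weeds(image: np.ndarray, threshold: float = 0.4) -> np.ndarray:
    # Stand-in "weed detector": flag pixels whose green channel dominates.
    # A real sprayer would use a trained computer vision model instead.
    green = image[:, :, 1].astype(float)
    total = image.sum(axis=2).astype(float) + 1e-6
    return (green / total) > threshold  # boolean mask of likely-weed pixels

def spray_plan(mask: np.ndarray, nozzle_width: int = 8) -> list:
    # Map detected weed pixels to nozzle activations: a nozzle fires only
    # if its vertical strip of the image contains any flagged pixels, so
    # herbicide is applied solely where weeds were detected.
    n = mask.shape[1] // nozzle_width
    return [bool(mask[:, i * nozzle_width:(i + 1) * nozzle_width].any())
            for i in range(n)]

img = np.zeros((16, 32, 3), dtype=np.uint8)
img[4:8, 0:8, 1] = 200                  # a green "weed" patch in strip 0
plan = spray_plan(detect_weeds(img))
print(plan)  # [True, False, False, False] — only the first nozzle fires
```

This spot-spraying structure, detect then actuate per region, is what lets such systems cut total herbicide use compared with blanket spraying.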
A team at the Lawrence Livermore National Laboratory (LLNL) has published the highest-resolution simulation of a magnitude 7.0 earthquake to date. Using LLNL’s supercomputer resources and state-of-the-art simulation software, researchers modeled the seismic shaking that would occur from an earthquake on the Hayward Fault in Northern California. The team doubled the resolution of previous simulations and covered a wider domain than previously possible, including additional analysis of the effect earthquakes have on soft-soil urban areas in the Bay Area. With the simulations, researchers can assess the seismic hazards of high-frequency earthquakes and the risk of damage to existing buildings, homes, and utility lifelines.
Image: Nathália Rosa