This week’s list of data news highlights covers January 7-13, 2017 and includes articles about a new service that crowdsources the collection of training data for AI and a car seat headrest that monitors brain activity.
Genetic diagnostic company FDNA has developed a smartphone app called Face2Gene that uses machine learning to identify subtle facial features in photographs that could be linked with genetic conditions. Many genetic conditions produce distinct physical characteristics, such as an abnormal eye slant or low-set ears. These characteristics, called dysmorphic features, can confirm a diagnosis, but they can be very subtle. Face2Gene analyzes a person’s face and compares specific features to the dysmorphic features associated with over 7,500 different genetic disorders to calculate the likelihood that a person has a particular condition.
Mighty AI, a startup in Seattle, has developed a service that crowdsources the task of providing AI systems with data about new topics, reducing the time it takes for a business to deploy these systems on new tasks. Machine learning algorithms often need a substantial amount of training data to make reliable predictions or classifications, but gathering relevant training data can be time consuming and require subject-matter expertise. Mighty AI’s service pairs companies that want an AI system trained in a particular area, such as image categorization, with humans who cull relevant data from the Internet in exchange for a small payment.
IBM Research’s Physical Analytics group has launched a project to develop what it calls “macroscopes”—systems that can organize and analyze complex data from a large number of sources to extract high-level insights about the physical world. Though current analytics systems are adept at analyzing individual datasets, combining different kinds of data from different sources in the same analysis can be challenging. The group’s goal is to develop systems that can do this to make it easier to analyze difficult-to-predict phenomena, such as weather and asteroid collisions.
New York City startup Augury has developed machine learning technology that analyzes the sounds a piece of machinery makes to predict and diagnose mechanical problems, allowing operators to perform preventive maintenance before the machinery breaks down. Augury affixes vibration and ultrasonic sensors to machinery to monitor performance and has trained machine learning algorithms to differentiate between the sounds a machine makes when functioning normally and the sounds it makes when it starts to malfunction. Augury’s systems share these audio recordings with each other so that, as new machines are equipped with the sensors, the system can learn to identify new kinds of malfunctions and recommend corrective action.
Swiss architecture firm Herzog & de Meuron has completed work on the Elbphilharmonie concert hall in Hamburg, Germany, which it created using parametric design—the process of using algorithms to generate designs based on desired specifications. The Elbphilharmonie’s auditorium is covered in 10,000 interlocking acoustic panels, each with a unique shape generated by an algorithm to meet the architects’ acoustic requirements for the room. Designing these panels by hand would have been impossibly labor intensive because they collectively contain one million divots that shape sound in specialized ways and vary in size and shape based on the acoustic needs of different areas of the room.
Researchers at the University of California, Berkeley, have used machine learning to create an “atlas” of where the human brain physically stores over 10,000 individual words. The researchers had test subjects listen to the same audio recordings and recorded data about their brain activity. With machine learning software, the researchers were able to pinpoint specific locations in the brain that exhibited the same patterns of activity whenever a person heard a certain word in the audio. Using this analysis, the researchers built a map of how the human brain stores individual words, providing neuroscientists with a useful model to better understand language comprehension. For example, the atlas shows that similar words, such as “poodle” and “dog,” are stored close to each other in the brain.
Thai conglomerate Charoen Pokphand (CP) Group has deployed a fleet of sensor-laden robots called “nanny robots” in its chicken production facility in Beijing to monitor the health of 3 million chickens. The autonomous nanny robots navigate through the facility’s many chicken coops for 12 hours a day, monitoring the chickens’ movement levels and temperatures to warn human workers of any chickens that might be sick so they can be removed. CP Group developed the nanny robot system to reduce the risk of selling tainted meat or eggs that could cause foodborne illness.
Quantum computing company D-Wave has released its programming tool Qbsolv as open source. Because quantum computing is still an emerging technology and very technically challenging, the pool of computer scientists with the necessary expertise to program these machines is very limited. D-Wave designed Qbsolv to make programming its machines possible for people without knowledge of quantum physics, and by making the software available as open source, D-Wave hopes to grow the field and advance quantum computing research.
The U.S. Food and Drug Administration (FDA) has partnered with IBM Watson Health to research how blockchain technology, which can track and authenticate digital transactions, could be used to securely share health data from electronic health records, precision medicine sources such as genetic databases, and Internet of Things devices throughout the health-care system. Because blockchain creates an auditable register of transactions, it could make it easier to track the exchange of health data as patients and providers create and share it. The research, which will last two years, will focus initially on oncology data.
Private research firm Changhong Research Labs and brain-monitoring technology company Freer Logic have developed a car seat headrest with a built-in electroencephalogram (EEG), which monitors brain activity, to automatically detect and warn a driver if he or she is starting to get distracted while driving. The headrest EEG can wirelessly monitor brain activity while a driver’s head is up to eight inches away. Algorithms identify when a driver’s brain waves indicate a loss of focus, such as when he or she is starting to fall asleep, and trigger an auditory or haptic warning to alert the driver.