This week’s list of data news highlights covers August 19 – 25, 2017, and includes articles about a trial of platooning self-driving trucks and Amazon’s AI fashion designer.
A startup called Novel Effect has developed a smartphone app of the same name that uses voice recognition to follow along as a person reads a children’s book aloud and plays sound effects synced to actions in the story. Users select a children’s book in the app and begin reading it aloud, and Novel Effect automatically detects where the reader is in the story and plays the corresponding sound effects, such as animal noises or eating sounds, to make reading aloud more engaging for children.
Researchers at the Massachusetts Institute of Technology have developed a machine learning system called ICU (intensive care unit) Intervene that analyzes a wide variety of patient data and historical data about ICU outcomes to make real-time recommendations about treatment options for critical care. ICU Intervene regularly monitors patients’ vital signs and other data, such as doctors’ notes, to make hourly recommendations about the best treatment options, and it can even make predictions, such as whether a patient will need a ventilator, up to six hours in advance. ICU Intervene can also explain to doctors the reasoning behind its recommendations.
The UK Department for Transport has approved a trial of platoons of self-driving trucks on public roads. Trucks form a platoon by driving closely behind one another to reduce wind resistance, similar to how cyclists draft off of each other in a race, which cuts fuel consumption and traffic but can be difficult or dangerous to coordinate manually. The test will use a lead truck with a human driver followed by two self-driving trucks that travel closer behind than human drivers safely could and communicate wirelessly with the lead truck to accelerate, steer, and brake in sync. The trials will expand to major roads by the end of 2018.
Google has partnered with the University of California, Berkeley to use machine learning to compile crowdsourced images of the eclipse into a timelapse movie to help researchers better study the sun’s corona, the outer atmosphere of the sun visible as a rim of light during a solar eclipse. Google helped develop a smartphone app people could use to take geotagged, timestamped photos of the eclipse across the United States, and it used machine learning to align the photos into a sequence depicting totality for the entire time the eclipse was over land, giving Berkeley researchers a continuous record of the corona.
Food safety scientists in Germany have developed a protein database to serve as a reference to authoritatively identify fish in an effort to fight fish fraud—when fish sellers misrepresent their fish as more desirable or expensive types. The scientists used mass spectrometry, which can differentiate between the physical characteristics of different molecules, to determine the unique protein profiles of 54 species of fish commonly sold in grocery stores and restaurants. The database can serve as a resource for buyers who want to ensure they are not buying mislabeled fish, which can be dangerous because different fish carry different allergens and parasites that affect people differently.
Cancer diagnostics company Cambridge Cancer Genomics has developed a machine learning test for assessing how well a cancer treatment is working. The test uses advanced sensing techniques to identify the genetic information in a patient’s blood and a machine learning system to monitor changes in this information over time, which indicate an increase or decrease in cancerous mutations. With this approach, doctors could more quickly determine whether patients are responding to chemotherapy and seek an alternative treatment if necessary.
Amazon is using AI to analyze fashion styles in images and generate clothing designs in similar styles. The system uses a technique called a generative adversarial network to learn the properties of different styles by studying examples and then apply these properties to other articles of clothing.
The Federal Ministry of Transport and Digital Infrastructure in Germany has published a report detailing 15 ethical guidelines for driverless cars. The guidelines strongly emphasize personal safety, including requiring that protecting humans take precedence over all other considerations and that in the event of an unavoidable accident, a self-driving car cannot make decisions about what to do based on a human’s personal features such as age.
Researchers at the University of Technology Sydney will deploy drones equipped with video cameras and AI software capable of detecting sharks to patrol Australian beaches in September. The software can identify sharks in real time with 90 percent accuracy, whereas humans are only 20 to 30 percent accurate at identifying sharks in aerial images. When the drone detects a shark, it will use a megaphone to play a warning for any nearby swimmers.
The European Space Agency’s (ESA’s) Sentinel-1A satellite is providing crop insurers in India with the data they need about drought-stricken areas to ensure farmers can collect insurance payouts. The southern Indian state of Tamil Nadu is experiencing its worst drought in 140 years, and while farmers often have crop insurance, the claims process can be time-consuming. The data from Sentinel-1A, which can monitor moisture in crops from space, lets insurers speed up this process and has helped authenticate approximately 200,000 claims, resulting in over 10,000 payments to farmers.