
10 Bits: The Data News Hotlist

by Joshua New

This week’s list of data news highlights covers October 3-9, 2015 and includes articles about the legislation to promote interoperability in healthcare and a robot that can teach itself to pick up objects.

1. Putting Health Data to Meaningful Use

The U.S. federal government has finalized new criteria for certifying electronic health record systems. The goal is to improve the quality, safety, and efficiency of care and to reduce disparities in healthcare, and health-care entities must use certified systems to receive incentive payments under federal health programs. Under the new rules, certified systems will eventually have to provide application programming interfaces that allow patients to access their own health data.

2. Cooperating with the Private Sector for Better Open Data in Taiwan

Taiwan’s National Development Council is encouraging the private sector to improve open government data with two initiatives: the Taiwan Open Data Center (TODC) and Sheethub.com. TODC will allow the private sector, academics, and the public to improve the quality of government data by editing and contributing to it. Sheethub.com will allow users to pull government data directly into their own websites and applications.

3. Making Health Data More Interoperable

Senators Bill Cassidy (R-LA) and Sheldon Whitehouse (D-RI) have introduced the TRUST IT Act to prevent electronic health record system vendors from deliberately blocking the exchange of health information. Should allegations of health information blocking arise, the Department of Health and Human Services Inspector General would have the authority to investigate and penalize offending vendors by decertifying the system or imposing fines. Additionally, the bill would establish a standards rating body to score vendors on the interoperability of their systems.

4. Charting the Course for an Interoperable National Health System

The U.S. Office of the National Coordinator for Health Information Technology (ONC) has published its interoperability roadmap to guide the health-care sector to make it easier to share health data. ONC’s roadmap encourages cooperation between the private sector and health agencies and outlines the process for achieving nationwide interoperability by 2024. The goal of the roadmap is a learning health system in which health-care entities can use real-time data to continuously improve care and public health.

5. Keeping Customers Happy with Machine Learning

Customer service software company Zendesk has developed its Satisfaction Prediction tool, which uses machine learning to help companies detect customer service issues before they become a problem. Satisfaction Prediction analyzes historical data about customer interactions that could indicate dissatisfaction, such as the language used in customer service tickets, and assigns new service tickets scores indicating their likely outcome. With this score, customer service workers can give extra attention to the tickets most likely to end with an unhappy customer and resolve them more effectively.
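Zendesk has not published the internals of Satisfaction Prediction, but the general idea of scoring new tickets by the language seen in past tickets can be sketched with a simple word-level log-odds model. Everything here, including the sample tickets, is a hypothetical illustration, not Zendesk's actual method:

```python
import math
from collections import Counter

# Hypothetical historical tickets: (text, customer_ended_unhappy).
HISTORY = [
    ("still waiting for a reply this is unacceptable", True),
    ("refund never arrived very frustrated", True),
    ("thanks for the quick fix works great", False),
    ("issue resolved appreciate the help", False),
]

def train(history):
    """Count how often each word appears in unhappy vs. happy tickets."""
    bad, good = Counter(), Counter()
    for text, unhappy in history:
        (bad if unhappy else good).update(text.split())
    return bad, good

def risk_score(text, bad, good):
    """Sum smoothed log-odds per word; higher means more likely unhappy."""
    return sum(math.log((bad[w] + 1) / (good[w] + 1)) for w in text.split())

bad, good = train(HISTORY)
# A ticket using dissatisfied language scores higher than a calm one.
print(risk_score("very frustrated still waiting", bad, good) >
      risk_score("thanks works great", bad, good))  # True
```

A production system would use far richer features and models, but the workflow is the same: learn from labeled historical tickets, then rank incoming tickets by predicted risk.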

6. Taking Guesswork Out of Diagnostics

Researchers at the Washington University School of Medicine in St. Louis have developed a diagnostic test called ViroCap that can detect the presence of a wide variety of viruses in a patient, even if the viruses are only present at very low levels. The researchers identified specific genomic sequences from every virus known to affect humans that they could scan for in patient blood, stool, or nasal secretion samples. In initial testing, ViroCap was able to detect 32 different viruses in trial subjects, whereas traditional methods could only identify 21. The researchers say they could easily modify ViroCap to detect new viruses and eventually other types of pathogens.
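The core computational step, matching known viral signature sequences against a patient sample, can be sketched in a few lines. The probe sequences and virus names below are made up for illustration; the real ViroCap panel draws on roughly two million probes spanning known human-infecting viruses:

```python
# Hypothetical viral signature probes (the real panel uses millions of
# sequences drawn from the genomes of every virus known to infect humans).
PROBES = {
    "influenza_A": "AGCAAAAGCAGG",
    "rhinovirus":  "TTAAAACAGCCT",
}

def detect_viruses(sample_sequence, probes):
    """Return names of viruses whose signature appears in the sample."""
    return sorted(
        name for name, probe in probes.items() if probe in sample_sequence
    )

sample = "GGGTTAAAACAGCCTCCC"  # contains the rhinovirus signature above
print(detect_viruses(sample, PROBES))  # ['rhinovirus']
```

In practice the matching is done by hybridization capture and sequencing rather than exact string search, but the principle of screening a sample against a library of known signatures is the same.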

7. Scouting the Seas for Illegal Fishing

The Obama administration has announced a series of new efforts to use modern technologies to combat illegal fishing, which costs the world economy up to $23 billion per year. The administration’s new Sea Scout program will focus on facilitating coordination and information sharing among international efforts to identify and prosecute illegal fishing operations. The National Oceanic and Atmospheric Administration (NOAA) will provide data and new tools to aid these efforts, including the space-based Visible Infrared Imaging Radiometer Suite, a sensing technology that can identify boats engaging in potentially illegal fishing. NOAA will also seek partnerships with other countries to help build their capacity to fight illegal fishing with data technologies.

8. Scoring the Performance of an Entire City

The city of Boston is developing a performance metric called CityScore that can incorporate a wide variety of city data, ranging from crime statistics to housing for veterans, into a single numerical score for the city. The mayor’s office will use this score to assess how well the government is achieving its goals and to identify areas for improvement. The city government will make this score, as well as the algorithms behind it, publicly available to help other cities pursue more data-driven government.
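One plausible way to roll many heterogeneous city metrics into a single number is to compare each metric against its target and average the ratios. The metrics, values, and targets below are invented for illustration and are not Boston's actual CityScore formula:

```python
# Hypothetical metrics: (current value, target value, higher_is_better).
METRICS = {
    "on_time_permits": (92.0, 90.0, True),
    "crime_incidents": (110.0, 100.0, False),
    "veteran_housing_placements": (45.0, 50.0, True),
}

def city_score(metrics):
    """Average each metric's ratio to its target; 1.0 means on target."""
    ratios = []
    for current, target, higher_is_better in metrics.values():
        # Invert the ratio when lower values are better (e.g. crime).
        ratios.append(current / target if higher_is_better else target / current)
    return sum(ratios) / len(ratios)

score = city_score(METRICS)
print(round(score, 3))  # 0.944 — slightly below target overall
```

Normalizing against targets lets very different units (permits, incidents, placements) contribute comparably, which is the key design problem any single-number city score has to solve.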

9. Teaching Robots to Teach Themselves

Researchers at Carnegie Mellon University have developed a deep learning system—a form of machine learning—that allows a robot to teach itself how to pick up objects in an unstructured environment. The researchers trained their software with a robot equipped with mechanical arms and a variety of imaging and force sensors. After the researchers programmed the robot with basic object recognition skills, they left it to learn how to grasp new and familiar objects in a cluttered environment through 700 hours of trial and error without human input. By the end of the experiment, the robot was able to grasp objects successfully 80 percent of the time—considerably more successful than preprogrammed approaches.
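The trial-and-error loop described above can be illustrated, in heavily simplified form, as a bandit problem: the robot tries grasp strategies, observes successes and failures, and gradually favors what works. The strategies and success rates below are invented, and the CMU system actually learns from raw images with a deep network, not from a fixed strategy list:

```python
import random

random.seed(0)

# Hypothetical grasp strategies with unknown success probabilities;
# the learner only observes successes and failures, never these numbers.
TRUE_SUCCESS = {"pinch": 0.3, "wrap": 0.8, "scoop": 0.5}

attempts = {g: 0 for g in TRUE_SUCCESS}
successes = {g: 0 for g in TRUE_SUCCESS}

def choose(epsilon=0.1):
    """Mostly exploit the best-known grasp, occasionally explore."""
    if random.random() < epsilon:
        return random.choice(list(TRUE_SUCCESS))
    return max(TRUE_SUCCESS, key=lambda g: successes[g] / (attempts[g] + 1))

for _ in range(2000):  # stand-in for 700 hours of trial and error
    g = choose()
    attempts[g] += 1
    successes[g] += random.random() < TRUE_SUCCESS[g]

best = max(TRUE_SUCCESS, key=lambda g: successes[g] / (attempts[g] + 1))
print(best)  # converges on the most reliable grasp
```

The same explore-versus-exploit tension, scaled up to continuous grasp parameters and visual input, is what the 700 hours of unsupervised practice resolves.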

10. Detecting Earthquakes on Twitter

The United States Geological Survey (USGS) is using tweets to augment its capacity to detect earthquakes in real time. By using Twitter’s public application programming interface, USGS can analyze geotagged tweets that reference earthquakes to fill gaps in its seismological sensor networks and avoid false alarms. USGS hopes to eventually incorporate Twitter data into its seismic detection algorithms to detect earthquakes and issue alerts even faster.
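The detection signal USGS looks for is a sudden burst of earthquake-related tweets clustered in time and place. A minimal sketch of that idea, with a made-up tweet stream and threshold standing in for the real feed and USGS's actual algorithms:

```python
from collections import Counter

# Hypothetical stream of geotagged tweets: (minute, region, text).
TWEETS = [
    (0, "tokyo", "nice weather today"),
    (1, "tokyo", "earthquake!! everything is shaking"),
    (1, "tokyo", "did anyone else feel that earthquake?"),
    (1, "tokyo", "earthquake just now, big one"),
    (1, "osaka", "lunch was great"),
]

def detect_bursts(tweets, threshold=3):
    """Flag (minute, region) pairs where earthquake mentions spike."""
    counts = Counter(
        (minute, region)
        for minute, region, text in tweets
        if "earthquake" in text.lower()
    )
    return [key for key, n in counts.items() if n >= threshold]

print(detect_bursts(TWEETS))  # [(1, 'tokyo')]
```

Because people tweet within seconds of feeling shaking, even this crude burst counting can flag an event in regions where seismometers are sparse, which is exactly the gap-filling role the agency describes.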

Image: Robert Brigham.
