This week’s list of data news highlights covers February 14-20, 2015, and includes articles about powering data centers with fuel cells and a database that can translate the news it collects in real time.
1. The White House Hires Its First Chief Data Scientist
The White House has hired industry veteran DJ Patil as its first chief data scientist and deputy chief technology officer for data policy. Patil’s role will focus on improving how the government uses data, particularly in the field of healthcare, where he will help oversee the administration’s precision medicine initiative. Patil previously held data science positions at major technology companies including eBay, PayPal, and LinkedIn.
2. Creating 3D Models of Hearts to Fight Heart Disease
Researchers at Hammersmith Hospital’s Medical Research Council Clinical Science Center in London are scanning the hearts of 1,600 patient volunteers and creating digital 3D models of them. The models will be combined with patients’ genetic information to help researchers better understand the links between heart disease and people’s genomes. The project is designed to accelerate the development of new treatments, which typically relies on clinical trials that use only small amounts of health information and take several years.
3. Walgreens Taps Patient Experience Data
Walgreens has partnered with patient community support company PatientsLikeMe to educate pharmacy customers about how drugs have affected other patients. PatientsLikeMe will share its database of 300,000 patient experiences covering 2,300 medical conditions with Walgreens, making it accessible through Walgreens’ Health Dashboard, a personalized website for pharmacy customers.
4. Translating the World’s News in Real Time
The Global Database of Events, Language, and Tone (GDELT), a database of over 250 million news stories dating back to 1979, can now translate the news it monitors in 65 languages, which account for 98.4 percent of its non-English content, in real time. GDELT is a freely available resource designed to give users a full picture of what is going on in the world each day, and the new real-time translation feature extends that picture to news published around the globe.
5. Utah Agencies Team Up to Share Geospatial Data
The new Utah Mapping and Information Partnership, consisting of local, regional, and state government organizations, is developing a framework to allow agencies to share geospatial data. The partnership was created to facilitate better use of geospatial data in decision making and to save government resources with improved data flows. The data sharing framework will allow for data from different databases to be viewed as layers on the same map, which is expected to help Utah better manage issues like air quality, transportation, water supply, and public safety.
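The core idea of viewing data from different databases as layers on one map can be sketched in a few lines. The datasets and attribute names below are purely illustrative, not from the Utah partnership’s actual systems:

```python
# Hypothetical example: two agencies' datasets, each keyed by the same
# (latitude, longitude) coordinates, merged so both can be viewed as
# layers at the same points on a single map.

air_quality_layer = {
    (40.76, -111.89): {"pm25": 12.4},   # Salt Lake City (illustrative values)
    (41.22, -111.97): {"pm25": 8.1},    # Ogden
}
transport_layer = {
    (40.76, -111.89): {"daily_traffic": 145000},
    (41.22, -111.97): {"daily_traffic": 62000},
}

def overlay(*layers):
    """Merge the attribute records from each layer at matching coordinates."""
    combined = {}
    for layer in layers:
        for coords, attrs in layer.items():
            combined.setdefault(coords, {}).update(attrs)
    return combined

merged = overlay(air_quality_layer, transport_layer)
# merged[(40.76, -111.89)] now holds both agencies' attributes for that point.
```

Real geospatial frameworks use standardized formats (such as GeoJSON or shapefiles) rather than raw dictionaries, but the layering principle is the same: shared coordinates let independently maintained datasets be combined on one map.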
6. Tech Companies Launch a Big Data Standards Initiative
A group of 15 companies, including General Electric, IBM, and Verizon, has created the Open Data Platform initiative to spur the development of big data projects. Open Data Platform members will use common standards when developing products for tools like Hadoop, the widely used data management framework. The group hopes that common standards will reduce redundant or fragmented development, speeding the creation of new applications and driving adoption of data tools throughout the industry.
7. Connecting the Internet of Things via Satellite
A partnership between industrial Internet of Things company Sigfox, aerospace company Airbus Defense and Space, French research institute CEA-Leti, and engineering firm Sysmeca will develop a framework to connect the Internet of Things with satellites. The project, dubbed Mustang, will rely on satellites and ground transmitters to relay data from connected devices. Though using satellites to transmit data is nothing new, Mustang is the first project of its kind to use satellites to connect everyday Internet of Things devices, such as smart alarm systems and GPS trackers on dog collars.
8. Powering Data Centers with Fuel Cells
Microsoft and energy startup Redox Power Systems are developing a method to power data centers with fuel cells, thanks to grant funding from the Department of Energy. The project will focus on integrating fuel cells directly into server racks to power them, circumventing the high infrastructure costs of the traditional approach: installing fuel cell clusters offsite and then converting and transmitting the energy they generate. Microsoft and Redox estimate that if the project is successful, the cost of operating a data center would fall dramatically.
9. Cataloguing Whole Grain Foods in Australia
A dietitian and PhD candidate at the University of Wollongong’s Smart Foods Center has developed a database that catalogues information about products containing whole grains sold in the Australian market. The database is designed to help researchers study the nutritional content of foods being sold as well as act as a resource for consumers seeking foods higher in whole grains. The database, which is the first of its kind in Australia and one of the few of its kind in the world, consists of data on 385 food products from 46 companies.
10. Fighting Benefits Fraud with Data Science
State and municipal agencies have found data analytics to be a valuable tool in the fight against benefits fraud. Data scientists in agencies like the New York City Human Resources Administration run data about government benefits recipients through pattern-recognition systems to detect anomalies that could indicate fraud. In New York City, integrating data analytics into fraud investigations has greatly improved their effectiveness, identifying millions of dollars more in fraud from fewer investigations than before the program was introduced. These agencies use services from companies such as IBM, SAS, and LexisNexis that the private sector has traditionally used to detect financial fraud.
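A very simple form of the anomaly detection described above is flagging values that fall far outside a dataset’s normal range. The sketch below uses made-up claim amounts and a basic z-score test; the agencies’ actual systems are far more sophisticated, so this only illustrates the general principle:

```python
# Minimal sketch of statistical anomaly detection (illustrative data only):
# flag benefit claim amounts that lie far from the rest of the distribution.
import statistics

# Hypothetical monthly claim amounts; one is suspiciously large.
claims = [210, 195, 205, 220, 198, 202, 870, 215, 190, 208]

mean = statistics.mean(claims)
stdev = statistics.stdev(claims)

# Flag claims more than two standard deviations from the mean.
anomalies = [c for c in claims if abs(c - mean) / stdev > 2]
# anomalies -> [870], the outlier a fraud investigator might review first.
```

In practice, flagged records would feed into an investigator’s queue rather than being treated as proof of fraud; the analytics narrow the field so that fewer investigations find more fraud, as the New York City results suggest.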
Image: PD Photo.