
10 Bits: The Data News Hot List

by Travis Korte
Formula One race car manufacturers have developed extensive sensing and data processing expertise

This week’s list of data news highlights covers October 19-25 and includes articles on data-driven lingerie design and a new study on the epidemiology of divorce.

1. Formula One Companies Enter the Data Speedway The pressure of racing has forced Formula One car manufacturers to eke millisecond performance improvements out of their technology in recent years, building deep expertise in sensing and data processing along the way. McLaren Electronic Systems, which supplies electronic systems for F1 cars, has even branched out from the racetrack to offer its data-driven optimization capabilities to clients in public transit, drug manufacturing and data center design.

2. Building a Better Bra With Data The founder of data-driven bra matchmaking service True&Co started assembling her dataset by hosting fittings in her living room and having her friends fill out questionnaires about their preferences. The database, which today includes data from over 200,000 women, was used to design the brand’s first lingerie collection, released this week. Much like online dating site OKCupid, which asks users questions and recommends potential matches based on their answers, True&Co lets women enter more and more data to zero in on their perfect bra.

3. The Epidemiology of Divorce Divorce can be contagious. At least that is what researchers using network analysis and other techniques borrowed from epidemiology reported in a paper this week. Using data from the famous Framingham Heart Study, which tracked residents of a small town over several decades, the researchers found that participants were 75% more likely to divorce if one of their friends had divorced and 33% more likely if a friend of a friend had divorced.

4. Why Development Groups Must Embrace Open Data International development organizations would benefit from an ethic of open data. Mobile devices and crowdsourcing tools may make collecting data easier, but development organizations may need to invest in enterprise data management systems to reap the full benefits. This investment will pay off, however, by allowing organizations to build on work others have already done rather than reinventing the wheel each time new data is needed.

5. Learning from Text at Facebook Facebook is exploring natural language processing to give its algorithms a better understanding of the huge amounts of unstructured text data contained on the site. One major challenge for these efforts is determining the sentiment of a piece of text: not just whether it is positive or negative, but how angry or happy or excited it is. This requires models more complex than those used in traditional text mining, but could ultimately provide a much more accurate picture of what users are really saying.

6. Dynamic Mapping Comes Courtside Soon, all of the NBA’s arenas will be outfitted with high-speed cameras to capture players’ every move and map them in real time for analysis. The video provider, Chicago-based company Stats, hopes to help the league feed its post-game analysis with a combination of video and custom statistics. The firm also plans to work with marketers around the 2014 Winter Olympics.

7. Data Mining for the Mining Business Anglo-Australian mining firm BHP Billiton is using data to maintain its profits as global metal prices decline. From routing trucks to physically loading the ore onto trains, the company has applied data science throughout its business. It has also implemented a company-wide operations data sharing system. The initiative seems to be working: BHP said this week that its iron ore output was up 23% from last year.

8. Plug-and-Play Sensing for Your Car San Francisco start-up Automatic Labs began offering its vehicle data sensor in 250 U.S. Apple Stores this week. The sensor, called Link, is a $100 cartridge that measures speed, acceleration and geolocation, mapping out a car’s route while it calculates fuel economy and expenses. The firm has big plans for even broader uses of the sensor, including a beta program called Crash Alert, which detects collisions and automatically alerts emergency services and family members.

9. Supercomputers Still Shine in Some Applications Cray, the supercomputing company that helped governments and large companies tackle data analysis starting in the 1970s, is experiencing a resurgence of popularity in the “big data” era. While Hadoop and other distributed technologies might be able to undertake certain big computations more cheaply, supercomputers are still the state of the art when it comes to applications (such as weather modeling) that require the full dataset to be analyzed simultaneously.

10. Logistics Firm Embraces Common Data Platform Logistics firm DHL announced at a conference this week that it was using “big data” tools to keep its global operations running smoothly. Although it took some time to implement, the company switched over three years ago from a wide variety of legacy cost-evaluation systems to a single point of data collection. This improvement alone was estimated to save 7,000 days of labor per year on unnecessary cost-evaluation exercises.
