This week’s list of data news highlights covers September 2-8, 2017, and includes articles about a neural network that can interpret the dance of honeybees and a new partnership to send sensors into the depths of the ocean.
Auto insurance company Esurance is using predictive analytics to speed the processing of insurance claims related to Hurricane Harvey. Normally, to pay out an insurance claim, an insurer would send an inspector to assess damages and determine the appropriate reimbursement. Esurance is instead combining policyholder data about vehicle make and model with aerial imagery of water depth to predict the damage to a policyholder's vehicle and issue a payment without the need for an inspection.
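A payout model like this could be as simple as mapping estimated water depth against damage thresholds for a vehicle. The sketch below is purely illustrative, not Esurance's actual model; the depth thresholds and partial-damage fraction are assumptions chosen for the example.

```python
def estimate_flood_payout(vehicle_value, water_depth_m,
                          total_loss_depth_m=0.6, partial_loss_depth_m=0.25):
    """Illustrative flood-damage payout rule (all thresholds assumed).

    Water above roughly dashboard height (assumed 0.6 m) totals the car;
    water above the floor pan (assumed 0.25 m) pays a fixed fraction of
    the vehicle's value; shallower water pays nothing.
    """
    if water_depth_m >= total_loss_depth_m:
        return float(vehicle_value)  # treat as a total loss
    if water_depth_m >= partial_loss_depth_m:
        return round(vehicle_value * 0.4, 2)  # assumed partial-damage fraction
    return 0.0
```

In practice an insurer would feed in per-vehicle valuations from the make and model data and per-address depth estimates from the aerial imagery, rather than fixed thresholds.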
Researchers at the Free University of Berlin have developed an artificial neural network that can automatically interpret honeybee dances, which honeybees use to communicate with each other about the location of food. The dances, which involve a honeybee waggling back and forth at a rate of 13 hertz while moving in certain patterns, are simple enough for humans to interpret, but it is time-consuming to analyze even just a few bees at a time. The neural network can analyze a video recording of a hive and interpret the dances of many bees simultaneously with 90 percent accuracy.
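Once a system has extracted the angle and duration of a waggle run from video, decoding it into a food location is straightforward: the run's angle relative to vertical encodes the food's bearing relative to the sun, and the run's duration scales with distance. The sketch below shows that decoding step only, with an assumed distance calibration (real calibrations vary by colony and terrain); the hard part, which the Berlin network automates, is recognizing the runs in video.

```python
def decode_waggle_dance(waggle_angle_deg, waggle_duration_s,
                        sun_azimuth_deg, metres_per_second=750.0):
    """Translate one waggle run into an estimated food location.

    waggle_angle_deg: angle of the waggle run relative to vertical on the comb,
    which maps to the food's bearing relative to the sun's azimuth.
    waggle_duration_s: duration of the run; longer runs mean farther food.
    metres_per_second is an illustrative calibration constant, not a
    measured value from the study.
    """
    bearing_deg = (sun_azimuth_deg + waggle_angle_deg) % 360
    distance_m = waggle_duration_s * metres_per_second
    return bearing_deg, distance_m

# Example: a run 30 degrees right of vertical with the sun due south
bearing, distance = decode_waggle_dance(30, 2.0, sun_azimuth_deg=180)
```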
Google has begun deploying a new version of the cameras it uses to make Street View on Google Maps so it can gather higher resolution data about the real world, such as street signs and the names of storefronts. Google already uses machine learning to analyze Street View imagery to extract this information, but it had not significantly upgraded its Street View cameras in eight years, which made it challenging to reliably identify meaningful data in these images. With the new cameras, Street View can automatically identify more subtle details, such as whether a store is open based on the hours posted in its window.
Irish telecommunications provider VT Networks has partnered with fuel tank monitoring firm Dunraven Systems to connect 250,000 fuel tank sensors to a low-power network. By remotely monitoring all of its fuel tanks, Dunraven will be able to automatically dispatch refills before a customer runs out, and it expects to reduce the costs of operating fuel tanks by over 50 percent.
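The scheduling logic behind "refill before a customer runs out" can be sketched as a simple projection: flag any tank whose level, extrapolated over the delivery lead time, would fall below a reserve margin. This is a minimal stand-in for whatever Dunraven actually runs; the lead time and reserve fraction are assumptions for illustration.

```python
def tanks_to_refill(levels, daily_usage, lead_time_days=3, reserve=0.1):
    """Return IDs of tanks that should be scheduled for a refill now.

    levels: dict mapping tank ID -> current fill fraction (0.0 to 1.0).
    daily_usage: dict mapping tank ID -> fill fraction consumed per day.
    A tank is flagged if its projected level after the delivery lead time
    would drop below the reserve fraction (both values assumed here).
    """
    flagged = []
    for tank, level in levels.items():
        projected = level - daily_usage.get(tank, 0.0) * lead_time_days
        if projected < reserve:
            flagged.append(tank)
    return flagged
```

With sensors reporting levels over the low-power network, a check like this can run continuously across all 250,000 tanks instead of relying on customers to call in.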
Medication management company MedMinder has partnered with Cyft, an AI firm that focuses on identifying patients that would benefit from medical interventions, to improve medication adherence for chronic conditions. MedMinder will share data from its automated pill dispensers, which monitor and remind patients about their medication regimens, with Cyft, which will analyze this data and generate daily predictions about which patients are most at risk.
The U.S. House of Representatives has unanimously passed a bill called the SELF DRIVE Act to reduce regulatory obstacles to the development and deployment of autonomous and semi-autonomous vehicles. The bill would allow car manufacturers to eventually deploy up to 100,000 vehicles per year that do not meet current safety standards designed for human drivers, such as requirements that a car have a steering wheel and brake pedal. The bill would also prohibit states from passing their own regulations about the design or performance of self-driving cars to prevent the creation of a patchwork of inconsistent regulations across the country.
Florida startup Luminar Technologies has developed a new kind of LIDAR system, which uses lasers to map nearby environments for applications such as self-driving cars, that uses a longer wavelength of light than other LIDAR, giving it 10 times the range and 50 times the resolution of existing LIDAR systems. LIDAR systems also normally require multiple lasers for each line of resolution in a 3D map, while Luminar's only needs one, which can make it cheaper to develop.
The U.S. National Oceanic and Atmospheric Administration (NOAA) has entered into a public-private partnership with Microsoft co-founder Paul Allen to deploy 33 deep-water robotic sensors to help monitor the ocean. NOAA operates an array of floating sensors that collect valuable temperature data, but the sensors cannot withstand the pressure below 2,000 meters deep. The partnership will deploy 33 specialized versions of the sensors capable of reaching 6,000 meters deep. Every 15 days, the sensors will rise to the surface to transmit their data via satellite and then dive back down to collect more data.
The U.S. Food and Drug Administration (FDA) has published new guidance detailing its procedures for evaluating real-world data about the performance of medical devices, which could speed regulatory decision-making. The guidance clarifies how the FDA considers real-world data, such as electronic health records, clinical evidence, and patient-generated data, to make it easier for device manufacturers to plan studies and clinical trials, as well as aid the FDA in post-approval device surveillance.
Researchers at the Massachusetts Institute of Technology have developed a prototype system called RFly that uses algorithms and radio-frequency identification (RFID) to help a drone navigate through a warehouse and find a specific package. RFID is common in warehouses, but it requires workers to manually scan an RFID tag on a package up close. RFly instead uses a drone to amplify and relay the RFID signals from package tags so a scanner can reliably identify each package up to 50 meters away. As the drone moves about a warehouse, RFly uses a localization algorithm to identify how a specific RFID tag's signal changes at different locations, allowing it to pinpoint individual items.
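The core idea of localizing a tag from signal measurements taken at many drone positions can be illustrated with a deliberately simple stand-in: weight each measurement position by the received signal strength and take the centroid. This is not RFly's actual algorithm, which uses far more sophisticated signal processing; the sketch only shows how movement turns many weak readings into one position estimate.

```python
def locate_tag(samples):
    """Estimate a tag's (x, y) position from RSSI readings.

    samples: list of ((x, y), rssi_dbm) pairs, one per drone position.
    Each position is weighted by its received power converted from dBm
    to a linear scale, so positions with stronger readings pull the
    estimate toward them. A toy weighted centroid, not RFly's method.
    """
    weights = [10 ** (rssi_dbm / 10.0) for _, rssi_dbm in samples]
    total = sum(weights)
    x = sum(w * pos[0] for (pos, _), w in zip(samples, weights)) / total
    y = sum(w * pos[1] for (pos, _), w in zip(samples, weights)) / total
    return x, y
```

The drone's contribution in this picture is simply coverage: by flying a path through the warehouse it gathers readings from enough distinct positions for the estimate to converge on a single shelf.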
Image: Secum Bahia.