This week’s list of data news highlights covers June 2-9, 2018, and includes articles about the world’s new fastest supercomputer and an initiative to track plastic in the ocean.
The U.S. Department of Energy’s Oak Ridge National Laboratory has announced that its new supercomputer, named Summit, is capable of a peak speed of 200 petaflops, or 200,000 trillion calculations per second, making it the most powerful supercomputer in the world. Summit is considerably faster than the former leading supercomputer, China’s TaihuLight, which had a peak speed of 125 petaflops, and eight times faster than the United States’ next-fastest system, Titan. China still leads the world in overall computing power, with 202 of the world’s 500 fastest supercomputers, while the United States is second with 143.
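The unit conversion above can be sanity-checked with a few lines of arithmetic (a sketch for illustration, not part of the announcement): one petaflop is 10^15 calculations per second and one trillion is 10^12, so 200 petaflops is indeed 200,000 trillion calculations per second, and Summit’s peak speed is 1.6 times TaihuLight’s.

```python
# Sanity check on the supercomputer figures reported above.
PETA = 10**15      # one petaflop = 10^15 calculations per second
TRILLION = 10**12  # one trillion = 10^12

summit_flops = 200 * PETA
taihulight_flops = 125 * PETA

# 200 petaflops expressed in trillions of calculations per second.
print(summit_flops / TRILLION)         # 200,000 trillion calculations/sec
# Ratio of Summit's peak speed to TaihuLight's.
print(summit_flops / taihulight_flops) # 1.6
```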
Researchers at the Massachusetts Institute of Technology and the University of Toronto have developed a virtual system for training AI to learn to perform household tasks, such as making coffee or turning on appliances. The system uses a simulated 3D house and natural language descriptions of household chores. Each task includes descriptions of prerequisite steps, such as “enter the kitchen,” that the AI system must complete before it can carry out the main task, such as “grab the juice.”
A team of researchers from six universities, including Auburn University and Harvard University, has developed a machine learning system that can analyze images from wildlife camera traps and identify, count, and describe wildlife with 96.6 percent accuracy. The researchers trained their system on 3.2 million images from a citizen science project called Snapshot Serengeti, which encourages volunteers to submit annotated images of wildlife. The system relies on motion-sensing cameras that automatically take pictures when an animal walks by, and it then annotates each image with which species are present, in what numbers, and what the animals are doing.
The U.S. Food and Drug Administration has approved a robotics system called the NeuralBot System, developed by medical technology company Neural Analytics, to automatically orient ultrasound monitors to improve screening accuracy for strokes. Strokes require rapid intervention to save a patient’s life or prevent brain damage, and doctors use ultrasound to measure blood flow in patients’ brains to identify signs of a stroke. However, inaccurate blood-flow data from faulty ultrasound readings can lead to a misdiagnosis, further reducing the time doctors have to treat acute strokes.
UK environmental organization Plastic Tide is developing a system to use AI and drone-mounted cameras to identify plastic in photos of the ocean. Plastic Tide launched a program called Marine Litter DRONET to crowdsource images of ocean trash and allow volunteers to identify the presence of plastic in these images. Plastic Tide will use this data to train a machine learning system that can track the spread of plastic in waterways from drone imagery.
DeepMind has developed a method for training an AI system to play simple Atari video games just by having it watch YouTube videos of gameplay. Teaching an AI system to solve problems with clear rules and straightforward options, such as board games, is easier than teaching an AI to solve problems that require exploration, such as video games in which progression requires a player to interact with the environment to figure out what to do next. To overcome this, AI researchers have traditionally relied on large amounts of training with detailed datasets. Instead, DeepMind’s method allows an AI system to learn from watching a human play.
Researchers at the University of Cambridge, India’s National Institute of Technology, and the Indian Institute of Science have developed a machine learning system that can analyze drone footage of crowds in real time and spot signs of violent behavior. The system has limited accuracy and can only identify five straightforward signs of violence (strangling, punching, kicking, shooting, and stabbing), but it could eventually allow law enforcement to quickly identify when violence breaks out in large gatherings, which can be difficult to police.
Health researchers at New York University have developed a machine learning system that can analyze clinical data about patients receiving treatment for breast cancer and identify if a patient has lymphedema, a chronic side effect of breast cancer treatment, with 94 percent accuracy. Lymphedema is not curable, but early detection and treatment can help reduce symptoms and stop it from progressing. However, lymphedema can occur years after cancer surgery, making it difficult to spot quickly. The system analyzes demographic and clinical data about patients, including whether they are experiencing any of 26 lymphedema symptoms, and can diagnose the condition significantly more accurately than existing methods.
An IBM researcher named Chai Wah Wu has developed a machine learning algorithm that can identify sequences of numbers and mathematical structures that humans think of as elegant. Patterns in mathematics, such as the Fibonacci sequence or large prime numbers, have long fascinated mathematicians, and a project called the On-Line Encyclopedia of Integer Sequences (OEIS) compiles these interesting sequences, many of which relate to challenges in mathematics. Wu codified mathematical principles related to interesting patterns into his machine learning system, such as Benford’s Law, which states that in many naturally occurring sets of numbers, the leading digit is more likely to be 1 than any other digit. In a test on a mixture of 40,000 randomly generated sequences and 40,000 sequences from OEIS, Wu’s algorithm identified the “interesting” OEIS sequences with near-perfect accuracy.
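Benford’s Law has a precise form: the probability that a number’s leading digit is d is log10(1 + 1/d), which gives the digit 1 a probability of about 30.1 percent. As an illustrative sketch (not Wu’s actual system), the leading digits of the Fibonacci sequence, one of the patterns mentioned above, track this distribution closely:

```python
import math
from collections import Counter

# Benford's Law: P(leading digit = d) = log10(1 + 1/d), for d in 1..9.
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

# Tally the leading digits of the first 1,000 Fibonacci numbers.
a, b = 1, 1
leading_digits = []
for _ in range(1000):
    leading_digits.append(int(str(a)[0]))
    a, b = b, a + b

counts = Counter(leading_digits)
for d in range(1, 10):
    print(d, counts[d] / 1000, round(benford[d], 3))
```

The observed frequency of the leading digit 1 comes out very close to the theoretical value of log10(2) ≈ 0.301.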
Researchers at the Massachusetts Institute of Technology working on a project called Roboat have developed a prototype system of autonomous boats. The boats rely on hardware similar to that in self-driving cars and can link up with each other to carry freight or passengers, as well as serve as temporary bridges or platforms. Autonomous boats have an advantage over autonomous cars that may accelerate their deployment: commercial vessels already rely on transponders that broadcast their precise location to other ships, making it easier to avoid obstacles.
Image: Oak Ridge National Laboratory.