This week’s list of data news highlights covers April 13-19, 2019, and includes articles about AI analyzing genes to detect one of the causes of cancer and an AI system that summarizes scientific papers in plain English.
Researchers from Duke University have developed an AI system that can chart the firing of neurons significantly faster than humans can. Manually mapping 30 minutes of neuronal activity can take as long as 24 hours, but the AI system can complete the process in under 30 minutes by analyzing video recordings of neuronal activity. The system can help researchers gather information for real-time behavioral studies that reveal how neurons fire in relation to different behaviors.
Researchers from Harvard University have developed an algorithm that analyzes genes to identify patients with HR deficiency, a defect in cells’ DNA repair mechanisms that causes cancer. Though treatments for HR deficiency exist, current algorithms detect it with only 30 to 40 percent accuracy. The researchers’ new algorithm achieved 74 percent accuracy by training on thousands of fully sequenced genomes to find the molecular signature of the HR defect.
NASA is sending a pair of autonomous robots to the International Space Station. The robots, which have cameras, a touch screen, a speaker, a microphone, signal lights, and a laser pointer, will help perform research, monitor radiation and air quality, and even find lost items. The cube-shaped robots, which can work together, also provide a platform for guest scientists on Earth to perform zero-gravity experiments.
AKQA, a company that creates digital products and services, has created a new sport called Speedgate using AI. AKQA trained a neural network on data relating to 400 sports to generate Speedgate’s concepts and rules based on common themes in other sports. The sport features six-player teams and three gates, and a team can score on one of the end gates only after it has kicked the ball through the center gate.
Researchers from MIT and the nonprofit Qatar Computing Research Institute have developed a neural network that can provide plain-English summaries of scientific papers. Neural networks can struggle to correlate information across a long string of data, such as a research paper, but the researchers’ system addresses this challenge by representing each word in a paper as a vector pointing in a particular direction in a theoretical space. The vector’s positioning changes with each subsequent word, and after analyzing each word in a paper, the system translates the vector back into its corresponding words.
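The vector idea described above can be illustrated with a toy sketch. This is not the researchers’ actual model: the vocabulary, embedding size, and update rule here are all invented for illustration. Each word maps to a unit vector, a running state vector shifts toward each word it reads, and decoding translates the final state back into the nearest vocabulary word.

```python
import numpy as np

# Toy sketch of word-as-direction encoding (illustrative only, not the
# MIT/QCRI model): every word maps to a unit vector, and reading a word
# nudges the running state vector toward that word's direction.
rng = np.random.default_rng(0)
vocab = ["gene", "protein", "cell", "cancer", "repair"]
embeddings = {w: v / np.linalg.norm(v)
              for w, v in zip(vocab, rng.normal(size=(len(vocab), 8)))}

def encode(words, alpha=0.5):
    """Fold a word sequence into a single direction vector, word by word."""
    state = np.zeros(8)
    for w in words:
        state = (1 - alpha) * state + alpha * embeddings[w]
        state /= np.linalg.norm(state)  # keep only the direction
    return state

def decode(state):
    """Translate a state vector back into the closest vocabulary word."""
    return max(vocab, key=lambda w: float(embeddings[w] @ state))

summary_word = decode(encode(["cell", "repair", "gene"]))
```

The final state is a blend of every word seen, weighted toward recent ones, which is the rough intuition behind summarizing a long document as a point in vector space.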
Firefighters used drones to track and stop the fire at the Notre Dame Cathedral in Paris. The drones provided the firefighters real-time data about the intensity, positioning, and spread of the fire. This data helped the firefighters decide how to allocate their resources.
Communications analytics company Behavox has developed Broker Vote, a machine-learning tool that analyzes a hedge fund’s communications to determine which of its brokers are the most helpful. The tool uses natural language processing to analyze emails, calendar invites, and call logs. Broker Vote, which already helps four hedge funds determine how to administer $200 million in commissions, can help funds objectively decide how to distribute commissions.
The Masters golf tournament worked with IBM to use AI to automatically produce highlight videos for each round of golf as soon as it ended. In addition, the Masters used AI to analyze the hand, arm, and facial movements of the crowd and players to provide excitement scores for each shot. Tiger Woods, who won the Masters, achieved the highest possible crowd reaction score on at least four of his shots.
A researcher from Harvard University has developed an AI system that predicts the 3D structure of proteins significantly faster than previous state-of-the-art methods. A protein’s shape can influence its function, and the system analyzes a protein’s amino acid sequence to predict its structure. The system requires less computational power and runs faster than other methods for predicting protein structure because it consists of a single mathematical function, allowing it to run on a few thousand, instead of a few million, lines of code.
Facebook has developed an AI system called Vid2Play that can extract a person performing an action, such as playing tennis, from a video and recreate the individual as a virtual character that other individuals can control with a joystick. The system uses two neural networks that work together to isolate a person from a video’s background and place that person into a new scene. Depending on the action, the AI system can be trained on as little as three minutes of video.