Continuing our look back at our most popular stories of 2019, today we count down the top five. (See #10-#6 here).
5. Hacked by a machine
Hollywood has warped our view of hacking. Films portray cybercriminals sitting in dark rooms, plotting the downfall of our digital lives. In reality, they’re outsourcing their dirty work—to other machines.
Ian Molloy, an artificial intelligence security researcher at IBM, says that many hackers rely heavily on machine learning when they attempt to compromise a person or system. For instance, machine learning tools can scan a person's Twitter feed for personal information to exploit.
On the other hand, machine learning is also an important defense. That’s why IBM created the Adversarial Robustness Toolbox, an open-source software library, to help developers defend their networks. The reality is, any tool or technology can be used for nefarious objectives. It’s up to us to also adopt machine learning tools to improve security and privacy.
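To make the adversarial machine learning threat concrete, here is a minimal sketch of a gradient-style evasion attack against a toy linear classifier. It is not drawn from IBM's toolbox; the "detector," its weights, and the step size are all made up for illustration:

```python
import numpy as np

# Toy linear "threat detector": score = w . x; a positive score means flagged.
# The weights and input below are random stand-ins, not a real model.
rng = np.random.default_rng(0)
w = rng.normal(size=8)          # model weights
x = rng.normal(size=8)          # an input the model currently flags
if w @ x < 0:                   # ensure the clean input starts out flagged
    x = -x

# FGSM-style evasion: nudge each feature against the sign of the gradient.
# For a linear score w.x, the gradient with respect to x is simply w.
eps = 2.0 * (w @ x) / np.abs(w).sum()   # step just large enough to flip the score
x_adv = x - eps * np.sign(w)

print(w @ x > 0)      # True  - clean input is flagged
print(w @ x_adv > 0)  # False - perturbed input evades the detector
```

Defenses of the kind the Adversarial Robustness Toolbox provides work by detecting or blunting exactly this sort of small, targeted perturbation.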
4. Virtual reality nursing
Nursing is a tough profession. Long hours coupled with life-or-death situations put a lot on a single person's plate, which is why nursing schools are always looking for better ways to prepare their students for the situations they will encounter in the real world.
One current tactic at the IUPUC Simulation Center in Columbus, Indiana is virtual reality. VR scenarios let students experience and react to emotional situations – such as diagnosing a child born to a mother who is addicted to drugs – they may encounter in professional practice.
VR simulations can present multiple scenarios and include additional family members that the nurse must interact with. "It prepares us for real-life situations because we talk to not only the patient, but also another person in the room," said second-year nursing student Braelyn Evans.
3. Real-time answers for traffic jams
Preventing a traffic jam is a lot harder than it might seem. Adding a traffic light to an intersection, for example, might help waiting cars turn left, but it could also lead to gridlock on another block.
City planning is no easy task, which is why the US Department of Energy’s National Renewable Energy Laboratory is introducing supercomputing to improve efficiency. Scientists there have created a digital twin of Chattanooga, Tennessee that simulates over 170,000 different roads.
The digital twin is informed by real-time traffic conditions captured by sensors on Chattanooga roads. By studying traffic issues over a large area, the researchers hope to optimize infrastructure and reduce energy (fuel) use by 20 percent.
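The basic idea behind simulating signal timing can be sketched with a toy, single-intersection queue model. Everything here (arrival rate, cycle length, green split) is an illustrative assumption and has nothing to do with NREL's actual digital twin:

```python
import random

def simulate_queue(green_share, arrivals_per_step=0.6, steps=10_000, seed=1):
    """Toy intersection model: cars arrive at random each time step, and the
    light serves one car per step only during its green share of a 10-step
    cycle. Returns the average queue length over the run. All parameters
    are illustrative, not calibrated to any real road network."""
    rng = random.Random(seed)
    queue = 0
    total = 0
    for t in range(steps):
        if rng.random() < arrivals_per_step:
            queue += 1                      # a car arrives
        if (t % 10) < green_share * 10 and queue > 0:
            queue -= 1                      # green light: one car gets through
        total += queue
    return total / steps

# A longer green phase on the congested approach shortens the average queue.
print(simulate_queue(green_share=0.5))
print(simulate_queue(green_share=0.8))
```

A city-scale digital twin runs this same cause-and-effect reasoning across tens of thousands of interacting road segments at once, which is why supercomputing is needed.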
2. The end of an era
For more than 50 years, the tech industry has relied on Gordon Moore's prediction that the number of transistors on a microchip would double roughly every two years, while the cost to produce them would halve. This rule held true for decades, but we're getting to the point where we simply can't fit any more transistors on a microchip, and Moore's Law seems to be coming to an end.
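The exponential pace Moore described is easy to appreciate with quick arithmetic. Starting from the widely cited figure of about 2,300 transistors on Intel's 1971 4004, a doubling every two years projects counts in the billions within four decades:

```python
# Transistor count under a doubling every two years, starting from the
# Intel 4004 (about 2,300 transistors in 1971 -- a widely cited figure).
base_year, base_count = 1971, 2_300

def projected_transistors(year):
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1981, 1991, 2001, 2011):
    print(year, round(projected_transistors(year)))
```

Forty years is twenty doublings, so the projection for 2011 is about 2.4 billion transistors, which is roughly where flagship chips of that era actually landed.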
But it doesn’t have to mean the end of progress in computing. John Shalf, department head for computer science at Berkeley Lab, thinks there are several promising paths to move forward.
One possibility is new high-performance architectures that rely on chips customized for specific workloads. Discovering new materials to replace the ubiquitous silicon also holds promise. And then there are new models of computation, such as quantum and neuromorphic, which could completely change how we view and use computers.
1. What machine learning can tell us about climate change
It’s perhaps no surprise that our most read article of 2019 delves into a topic that’s on a lot of people’s minds: climate change.
Specifically, we spoke with NASA Ames senior research scientist Kamalika Das about her work studying the Amazon rainforest. Das noticed that a lot of research on climate change had contradictory results. She hypothesized that the discrepancies might arise from the segmentation of individual research variables.
To test her theory, Das conducted a large-scale regression study and used machine learning to analyze the relationships between independent and dependent variables. She looked at precipitation, vegetation growth, elevation of the region, and other factors.
Das was able to identify which factors are most important and to develop a model that can be applied to other complex environmental studies, such as air quality monitoring.