
Writing the map for secure research

Speed read
  • Major research facilities support advanced instruments and collect large datasets
  • Pilot study for a Cyberinfrastructure Center of Excellence investigates data lifecycle at major facilities
  • Trusted CI Fellowship brings cybersecurity know-how to scientific communities

Cosmic-ray neutrinos detected by the IceCube Observatory in Antarctica. Einstein's theory of relativity confirmed at the Laser Interferometer Gravitational-Wave Observatory (LIGO). These amazing scientific breakthroughs and many more were made possible by major research facilities funded by the National Science Foundation (NSF).

The IceCube Neutrino Observatory at the South Pole is a National Science Foundation major research facility. The observatory collects far more raw data every day than its satellite bandwidth allows it to transfer. Courtesy IceCube/National Science Foundation.

Such facilities rely on advanced instruments like telescopes, interferometers, gravitational wave detectors, and sensors, sometimes distributed across many environments. Researchers from all over the world access these facilities to help them answer fundamental questions about the world we live in.

But these complex instruments depend on a sophisticated backbone of computing systems, software, and advanced networks to store, transfer, and distribute the enormous amounts of data they collect. The need for this advanced cyberinfrastructure (CI) will only increase as scientists push forward and attempt to solve some of humanity’s grand challenges.

Because of this dependence on CI, the NSF has developed a pilot study to explore the creation of a Cyberinfrastructure Center of Excellence (CI CoE). This future Center of Excellence would facilitate CI improvements for existing facilities and exploit advanced CI architecture designs and tools for new facilities.

No one-size-fits-all solution

One of the most advanced radio telescope arrays on Earth, the Very Large Array consists of 27 radio antennas in New Mexico. The observatory relies on a supercomputer to process wide fields of spectral data simultaneously. Courtesy Alex Savello/National Radio Astronomy Observatory.

“The ultimate goal of this project is literally to write a map for what the pilot team thinks would make a good Cyberinfrastructure Center of Excellence,” says Laura Christopherson, CI facilitator and data analyst for the Renaissance Computing Institute (RENCI) at the University of North Carolina at Chapel Hill.

RENCI is dedicated to developing data science infrastructure for researchers, helping them make informed decisions around data and capitalize on the promise data science offers to society. 

Christopherson is part of a team investigating the NSF’s major facilities to learn everything they can about each facility’s cyberinfrastructure throughout the entire data lifecycle. 

Many facilities have instruments and sensors in different locations that gather data. That raw data goes through multiple stages of filtering and processing before it is considered ‘science-ready’ and made available to researchers and the public.

“We look at everything that happens from the moment they get the data to the moment they hand it off to other people,” says Christopherson.

Ships in the Academic Research Fleet collect data in remote parts of the ocean. One of their biggest challenges is transferring large datasets while at sea. Courtesy National Science Foundation.

For example, ship-based facilities like the University-National Oceanographic Laboratory System (UNOLS) collect data in remote parts of the ocean. One of their biggest challenges is the lack of a reliable internet connection.

These mid-ocean research vessels collect large amounts of data. Ideally, scientists would like to perform quality control immediately and transfer the data in real time. But they don’t want to risk losing or corrupting data because of low bandwidth or an unreliable connection. A CI CoE would help such facilities devise solutions and improve their internet connectivity.

Christopherson recognizes that there is no one-size-fits-all solution for the NSF’s many research facilities. That’s why the team conducts interviews with facility staff to make sure they get it right. “We ask: What are your needs? How do you work? What are your values? What do you need to accomplish?” says Christopherson. “And then we map it all out to figure out how to improve their work situation.”

Unexpected direction

Thanks to her involvement with the CI CoE Pilot, Christopherson has also plunged headfirst into the world of cybersecurity. She was named one of the six Trusted CI Fellows for 2020. Fellows receive six months of cybersecurity training with the expectation that they will take that knowledge back to their scientific community and advocate for security practices.

Trusted CI 2020 Fellows. Clockwise, from top left: Smriti Bhatt, Texas A&M University-San Antonio; Laura Christopherson, RENCI; Tonya Davis, Alabama A&M University; Luanzheng “Lenny” Guo, University of California Merced; Jerry Perez, University of Texas at Dallas; Songjie Wang, University of Missouri.

Christopherson didn’t originally see herself as a cybersecurity person. Coming from a background in the arts and information science, she says that she has always been intimidated by computer security.

But at the same time, Christopherson admits, she did find the field interesting. So when she found out that Trusted CI (the NSF Cybersecurity Center of Excellence), a partner on the CI CoE Pilot, offered the annual fellowships, she decided to apply. 

“It seemed like a nice, gentle introduction,” Christopherson says. “If I’m going to serve on projects like the CI CoE Pilot, it would be good for me to understand the security in these types of environments, particularly in terms of the threats to research organizations.”

During the course of the fellowship, Christopherson’s view of cybersecurity has expanded. In addition to weekly webinars that introduce her to new topics, she has benefited from conversations with cybersecurity experts such as Anita Nikolich, former NSF program director for cybersecurity. 

“It does expose me to different things,” says Christopherson. “I would say I had a narrow view of security, and it’s getting expanded. It’s getting applied more to a research context.”

Now that she’s more than halfway through her one-year fellowship, Christopherson is realizing that the same traits that she brings to solving data problems for major facilities can also be successfully applied to cybersecurity.

“I like for there to be a challenge or gap in information for me to try to solve,” says Christopherson. “I like to find the information and then synthesize it in a way that provides good decision-making ability to whoever needs it.”


Copyright © 2023 Science Node™