
Science smells fishy

Speed read
  • NSF-funded longitudinal fish study discovers population variability.
  • 15 years of fish surveys crunched in computer simulations.
  • Modern ecologists must have technological know-how.

“A budget requires knowledge of what future income will be. If you find out your company is closing its doors, you anticipate a decrease in spending because you know what’s coming down the line.”

That anticipatory insight is what University of California, Santa Barbara (UCSB) biologists Daniel Okamoto (now at Simon Fraser University), Russell Schmitt, and Sally Holbrook are offering to natural resource management. Their longitudinal study uses computer models to uncover variability in fish populations, giving fisheries the information they need to adapt conservation methods and improve species' chances of survival.

Okamoto’s team focused on the black surfperch, a small reef fish found off the west coast of the US. Scientists like to study the black surfperch because its mode of reproduction allows for generational analysis. Whereas most fish produce eggs that hatch into larvae and drift in the ocean for a time, surfperch bear live young. This attribute allows researchers to track cohorts of fish through time to see how the population on a given reef changes.

<strong>Data dive in the deep.</strong> Scientists went fishing in a supercomputer to learn about fish populations. What they've learned will help fishery managers balance profitability against fish mortality. Courtesy Clint Nelson.

Okamoto’s team benefited from resources provided by the National Science Foundation’s (NSF) Santa Barbara Coastal Long-Term Ecological Research (LTER) project, and specifically from colleagues who started counting fish back in the early 1980s. Schmitt and Holbrook have conducted annual surveys of these fish at 11 different sites along Santa Cruz Island in California’s Channel Islands since 1992. In addition to surveying the fish, the team surveyed the habitat in which they forage to estimate the total biomass of the food these fish eat.

“It’s quite laborious and takes quite a crew,” Okamoto says. “The surveys take several days underwater, then the material is brought back and analyzed in the lab, which takes a very large team of undergraduate researchers who count and identify all the different species of micro crustaceans that these fish consume.”

To identify the most probable hypothesis explaining these observations (a task known as the inverse problem), Okamoto’s team turned to computational models. They focused on surveys made from 1993–2009 and ran statistical models for days, simulating combinations of competing hypotheses to find which best explained the variability in population they were seeing.
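The logic of that hypothesis comparison can be sketched in miniature. The toy model below is purely illustrative: the dynamics, parameter values, and grid-search fitting are invented for demonstration and are not the study's actual models. It generates synthetic annual counts whose survival tracks prey abundance, fits two competing hypotheses (constant survival versus prey-driven survival) by grid search, and compares them with AIC, the kind of inverse-problem reasoning the team applied at far larger scale.

```python
import math

# Hypothetical synthetic data: 17 annual counts whose adult survival
# rises and falls with a fluctuating prey index (values are invented).
years = 17
prey = [0.5 + 0.4 * math.sin(t / 2.0) for t in range(years)]

def simulate(survival_fn):
    """Project population forward: N_{t+1} = survival(t) * N_t + recruits."""
    counts = [1000.0]
    for t in range(1, years):
        counts.append(counts[-1] * survival_fn(t) + 400.0)
    return counts

# "Observed" data generated by prey-driven survival: s(t) = 0.3 + 0.5 * prey.
observed = simulate(lambda t: 0.3 + 0.5 * prey[t])

def neg_log_lik(predicted, sigma=50.0):
    """Gaussian negative log-likelihood of the observed counts."""
    return sum(
        0.5 * ((o - p) / sigma) ** 2 + math.log(sigma * math.sqrt(2 * math.pi))
        for p, o in zip(predicted, observed)
    )

# Hypothesis A: constant adult survival (one free parameter, grid-searched).
nll_a, s_a = min(
    (neg_log_lik(simulate(lambda t, s=s: s)), s)
    for s in (i / 100 for i in range(20, 90))
)

# Hypothesis B: survival rises with prey abundance (one free parameter).
nll_b, b_b = min(
    (neg_log_lik(simulate(lambda t, b=b: 0.3 + b * prey[t])), b)
    for b in (i / 100 for i in range(0, 100))
)

# AIC = 2k + 2 * NLL; the hypothesis with lower AIC explains the data better.
aic_a = 2 * 1 + 2 * nll_a
aic_b = 2 * 1 + 2 * nll_b
print(f"constant-survival AIC: {aic_a:.1f}")
print(f"prey-driven AIC:       {aic_b:.1f}")
```

Because the synthetic data were generated with prey-driven survival, the prey-driven hypothesis recovers the generating slope and wins the AIC comparison; with real survey data, the same machinery run over many candidate processes is what makes the fitting take days.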

These models led Okamoto’s team to conclude that surfperch adult survival is widely variable rather than static, as most fisheries have assumed, and that this fluctuation is driven by variability in prey and by shifts in competition for resources when predator numbers are very low.

When it came time to see how their empirical findings would apply to fish populations more generally, they turned to the Eos server (since replaced by the Aurora server) at the National Center for Ecological Analysis and Synthesis (NCEAS). Using eight cores, they ran hundreds of thousands of parallel simulations on Eos to gauge how increasing the mortality rate of adult fish can magnify population fluctuations for different species.

<strong>Seeing underwater.</strong> Rather than assume fish populations are constant, fisheries now have evidence to help them plan harvests more intelligently.
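The core idea of those simulations can be illustrated with a stochastic toy model (the model form, parameter values, and replicate counts here are invented, not taken from the study). Each replicate projects a population with noisy recruitment; long-lived adults buffer that noise, so when adult mortality rises, the population's year-to-year fluctuations grow. A thread pool stands in for the multi-core parallelism of the real runs.

```python
import math
import random
import statistics
from concurrent.futures import ThreadPoolExecutor

def population_cv(args):
    """One replicate: N_{t+1} = survival * N_t + R_t, where recruitment R_t
    is lognormal noise; returns the coefficient of variation of N."""
    survival, seed = args
    rng = random.Random(seed)
    n, series = 500.0, []
    for t in range(2000):
        recruits = rng.lognormvariate(math.log(100.0), 0.6)
        n = survival * n + recruits
        if t >= 500:                  # discard burn-in before measuring
            series.append(n)
    return statistics.stdev(series) / statistics.mean(series)

def mean_cv(survival, replicates=100):
    # The study ran hundreds of thousands of replicates across cores;
    # here a small thread pool maps a modest batch in parallel.
    with ThreadPoolExecutor(max_workers=8) as pool:
        cvs = list(pool.map(population_cv,
                            [(survival, i) for i in range(replicates)]))
    return statistics.mean(cvs)

low_mortality_cv = mean_cv(survival=0.8)    # most adults survive each year
high_mortality_cv = mean_cv(survival=0.3)   # e.g. under heavier harvest
print(f"population CV, low adult mortality:  {low_mortality_cv:.3f}")
print(f"population CV, high adult mortality: {high_mortality_cv:.3f}")
```

Raising adult mortality (lowering survival) removes the buffer that long-lived adults provide against recruitment swings, so the high-mortality scenario consistently shows a larger coefficient of variation, which is the magnified-fluctuation effect the Eos simulations quantified.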

These findings are important because, as in the budgeting analogy, fisheries assume a constant adult survival rate and plan harvest strategies accordingly. “The message we are trying to send with our research,” Okamoto says, “is that if you can monitor the kind of processes that are governing the fluctuations in survival and birth rates of fish, you might be able to forecast shifts in those processes and change your conservation measures to anticipate the kinds of shifts that are coming down the line.”

Okamoto stresses that the work would not be possible without these computer simulations, and that this reliance on computational power is quickly becoming the norm in contemporary ecological research.

“At least we can say, given the complexity of the systems we are studying, there is a need for almost all ecologists to have a background in computing and programming. The modern era of ecology is a system where we need people out in the field who understand the biological systems but who can also perform model analysis and work with technology.”


Copyright © 2021 Science Node ™


