The reproducibility crisis meets stock assessment science: Sources of inadvertent bias in the stock assessment prioritization and review process

Filetype: PDF (453.31 KB)



Details:

  • Journal Title:
    Fisheries Research
  • Description:
    The broader scientific community is struggling with a reproducibility crisis brought on by numerous factors, including “p-hacking” or selective reporting that may increase the rate of false positives or generate misleading effect size estimates from meta-analyses. This arises when multiple modeling approaches or statistical tests can be brought to bear on the same problem and there are pressures or rewards for finding “significant” results. Fisheries science is unlikely to be immune to this problem, with numerous opportunities for bias to enter the process inadvertently through the prioritization of stocks for assessment, decisions about competing model approaches or data treatments within complex assessment models, and decisions about whether to adopt assessments for management after they are reviewed. I present a simple simulation model of a system in which many assessments are performed each management cycle for a multi-stock fishery, and show how asymmetric selection of assessments for extra scrutiny or re-assessment within a cycle can turn a process generating unbiased advice on fishing limits into one that is biased high. I show similar results when sequential assessments receive extra scrutiny if they show large proportional decreases in catch limits compared to a prior assessment for the same stock, especially if there are only small changes in true stock size or status over the interval between assessments. The level of bias introduced by a plausible level of asymmetric scrutiny is unlikely to fundamentally undermine scientific advice, but may be sufficient to compromise the nominal “overfishing probabilities” used in a common framework for accommodating uncertainty and to introduce a level of bias comparable to the difference between buffers corresponding to commonly applied levels of risk tolerance. (A minimal simulation sketch illustrating the asymmetric-scrutiny mechanism follows the details list.)
  • Source:
    Fisheries Research, 266, 106763
  • ISSN:
    0165-7836
  • Rights Information:
    Accepted Manuscript
  • Compliance:
    Submitted
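To make the asymmetric-scrutiny mechanism described in the abstract concrete, the Python sketch below is a hypothetical, highly simplified illustration of the general idea rather than the paper's actual model or code. Each of many stocks receives an assessment whose catch-limit estimate is a mean-unbiased lognormal draw around the true limit; estimates that fall far below the prior assessment are selectively re-run and the re-run value is accepted. All parameter values (number of stocks, error CV, re-assessment trigger) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumptions (not values from the paper):
N_STOCKS = 1000          # stocks assessed each management cycle
CV = 0.3                 # assessment error CV (lognormal, unbiased in the mean)
SIGMA = np.sqrt(np.log(1 + CV**2))
DROP_TRIGGER = 0.30      # re-assess if new estimate is >30% below the prior one

# True catch limit, assumed unchanged between assessments.
true_limit = np.full(N_STOCKS, 100.0)

def assess(truth):
    """Mean-unbiased lognormal assessment error around the true limit."""
    return truth * rng.lognormal(-SIGMA**2 / 2, SIGMA, truth.size)

# Prior-cycle assessments for each stock.
prior_est = assess(true_limit)

# Symmetric process: every new assessment is accepted as produced.
symmetric = assess(true_limit)

# Asymmetric scrutiny: only large *decreases* relative to the prior
# assessment trigger a re-assessment, and the re-run value is accepted.
asymmetric = assess(true_limit)
flagged = asymmetric < (1 - DROP_TRIGGER) * prior_est
asymmetric[flagged] = assess(true_limit[flagged])

print(f"mean relative error, symmetric scrutiny:  {symmetric.mean() / 100 - 1:+.3f}")
print(f"mean relative error, asymmetric scrutiny: {asymmetric.mean() / 100 - 1:+.3f}")
```

Under these illustrative settings the symmetric process averages close to zero error while the asymmetric process comes out biased high; the size of the effect depends on the assumed error CV and the trigger threshold, consistent with the qualitative result described in the abstract.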

Supporting Files

  • No Additional Files