Objectivity and Trust

Slides: https://silver-choux.netlify.app

Dan Hicks (they/them)

Philosophy, UCM

Disciplines and methods


  • Philosophy of science
    • 2012: PhD
    • Roles for values in science; science and society; esp. science-for-policy
    • Case study method
  • Computational STS
    • 2015-17: AAAS Science & Technology Policy Fellow
      • 2015-16: USEPA Chemical Safety for Sustainability
    • Research portfolio evaluation -> Bibliometrics
    • Text mining, online survey experiments

Overview

  • Objectivity
    • Value-freedom …
    • and a philosophical critique
  • Trust
    • Does trust require value-freedom?
    • Empirical research: no

Objectivity

Objectivity: Four Basic Meanings

  • Objective occurrence:
    public; available, observable, or accessible to multiple people
    The plot of Greta Gerwig’s Barbie vs. the dream I had last night

  • Objective entity:
    existing independently of us; not socially constructed
    Pluto vs. money

  • Objective proposition:
    corresponds to the way the world really is; true
    “Black Americans have a lower life expectancy than white Americans” vs. 
    “Ivermectin is an effective treatment for Covid-19”

  • Objective researcher:
    detached, disinterested, unbiased, impersonal, neutral among controversial points of view

(Lloyd 1995)

The value-free ideal (VFI)

Social and political values should not influence the epistemic core of scientific research.

A challenge for VFI:
USEPA’s ToxCast Estrogen Receptor Model

Dix et al. (2007)

A challenge for VFI:
USEPA’s ToxCast Estrogen Receptor Model

Browne et al. (2015)

Researcher degrees of freedom in developing the ER model

  • What endpoint to predict?
    (ER binding; animal model pathology; human pathology; Plutynski 2017)
  • What sample of chemicals to use to train, test, and validate?
  • What studies/data to use to establish ground truth? (Douglas 2000; vom Saal et al. 2024)


  • Continuous or discretized model output?
  • Where to set the active/inactive threshold? (Douglas 2000)
  • How to classify chemicals close to that threshold? (Hicks 2018)
  • How to calculate accuracy?
    (sensitivity, specificity, F1, balanced accuracy, etc.)
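
As a minimal illustration of the choices just listed (simulated scores and labels, not the actual ToxCast ER model or its validation data), the R sketch below discretizes a continuous model output at different thresholds and reports several common accuracy metrics for the same underlying predictions:

```r
# Minimal sketch with simulated data; not the ToxCast ER model.
set.seed(42)

n <- 500
truth <- rbinom(n, 1, 0.3)                 # hypothetical ground-truth activity calls
score <- rnorm(n, mean = ifelse(truth == 1, 0.7, 0.3), sd = 0.2)  # continuous model output

# Discretize the continuous score at a chosen threshold and compute
# several of the accuracy metrics named above.
metrics_at <- function(threshold) {
  pred <- as.integer(score >= threshold)
  tp <- sum(pred == 1 & truth == 1)
  tn <- sum(pred == 0 & truth == 0)
  fp <- sum(pred == 1 & truth == 0)
  fn <- sum(pred == 0 & truth == 1)
  sens <- tp / (tp + fn)                   # sensitivity (recall)
  spec <- tn / (tn + fp)                   # specificity
  prec <- tp / (tp + fp)                   # precision
  c(threshold = threshold,
    sensitivity = sens,
    specificity = spec,
    F1 = 2 * prec * sens / (prec + sens),
    balanced_accuracy = (sens + spec) / 2)
}

# The same model looks better or worse depending on where the
# active/inactive threshold is set and which metric is reported.
round(t(sapply(c(0.3, 0.5, 0.7), metrics_at)), 3)
```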


  • Is the ER model fit-for-purpose?

The social aims of science

Q: Why are you writing code in R?
A1: To fit a linear regression model.
Q: Why are you fitting a linear regression model?
A2: To understand how air pollution exposure is correlated with race.
Q: Why are you trying to understand how air pollution is correlated with race?
A3: To reduce racial disparities in asthma.

(Hicks 2022; see also Hicks 2018; Elliott and McKaughan 2014; Potochnik 2017; Fernández Pinto and Hicks 2019; Lusk and Elliott 2022)
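
To make A1 and A2 concrete, here is a minimal R sketch using simulated data and hypothetical variable names (pct_black, pm25); it is not drawn from any real dataset or from the studies cited here, and the association is built into the simulation purely for illustration:

```r
# Illustrative only: simulated data and hypothetical variable names,
# not an analysis of any real air pollution, census, or health dataset.
set.seed(1)

n <- 500
pct_black <- runif(n)                        # hypothetical tract-level share of Black residents
pm25 <- 8 + 3 * pct_black + rnorm(n)         # hypothetical PM2.5 exposure; association is simulated

# A1: fit a linear regression model ...
fit <- lm(pm25 ~ pct_black)

# A2: ... to describe how air pollution exposure is associated with race.
summary(fit)$coefficients
```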

the objective of epidemiology … is to create knowledge relevant to improving population health and preventing unnecessary suffering, including eliminating health inequities (Krieger 2011, 31)

Objectivity on the ropes

  • Scientists are taught that they need to be unbiased, disinterested, and neutral in political and social controversies
  • The value-free ideal is a widespread norm in academia and science-based policy

But

  • Scientists are regularly confronted by researcher degrees of freedom
  • Choosing among researcher degrees of freedom should be responsive to the social and political purposes of research

Trustworthiness

Trust in science is not low

(Compare Lupia et al. 2024; Većkalov et al. 2024)

Trust vs. trustworthiness

trust
The occurrence:
Members of the public actually do trust the CDC on Covid-19
trustworthiness
Whether it’s appropriate:
Should members of the public trust the CDC on Covid-19?

Objectivity and trustworthiness: Two models

VFI-based
For science to be trustworthy, it needs to be value-free.
Bright (2017); Havstad and Brown (2017); John (2017); Kovaka (2021); Holman and Wilholt (2022); Menon and Stegenga (2023); Metzen (2024)


Social aims of science
For science to be trustworthy, it needs to promote appropriate social aims.

VFI-based trust in the wild

Nichols (2024); compare Lupia (2023) and contrast https://tinyurl.com/3wnjfthp

Riley Spence and BPA

Hicks and Lobato (2022)

Riley Spence and BPA

Re-analysis of data from Hicks and Lobato (2022)

  • Disclosure of an inappropriate aim (economic growth) reduced perceived trustworthiness
  • But disclosure of an appropriate aim (public health) had no effect

Comments on Strengthening Transparency

  • Trump EPA open data rule (“Strengthening Transparency,” ST)
  • 21k public comments with text available
  • Manual + ML sorting into supporting vs. opposing comments
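
As a rough sketch of what the “ML sorting” step can look like (simulated word counts and labels, not the pipeline or data actually used in this study), the base-R example below trains a logistic regression on hand-labeled comments and applies it to unlabeled ones:

```r
# Toy illustration of supervised comment classification; the counts,
# labels, and cue words are simulated for illustration only.
set.seed(7)

n_labeled <- 200
labeled <- data.frame(
  n_health       = rpois(n_labeled, 2),      # e.g., mentions of "health" in a comment
  n_transparency = rpois(n_labeled, 2)       # e.g., mentions of "transparency"
)
# Simulated hand labels: 1 = opposing comment, 0 = supporting.
p_oppose <- plogis(-0.5 + 0.8 * labeled$n_health - 0.8 * labeled$n_transparency)
labeled$oppose <- rbinom(n_labeled, 1, p_oppose)

# Train a classifier on the manually coded comments.
fit <- glm(oppose ~ n_health + n_transparency, data = labeled, family = binomial)

# Apply it to (simulated) unlabeled comments and tally predicted labels.
unlabeled <- data.frame(n_health = rpois(1000, 2), n_transparency = rpois(1000, 2))
unlabeled$pred_oppose <- predict(fit, newdata = unlabeled, type = "response") > 0.5
table(unlabeled$pred_oppose)
```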


  • Opponents
    • Environmentalists
    • High confidence in the EPA and in science
    • Connected science with promoting health
  • Supporters

Hicks et al., in preparation (A); see also Hicks (2022); Hicks (2023)

VFI and generalized trust

Hicks et al., in preparation (B); item text in the extra slides

VFI and trustworthiness

  • The VFI-based model of trustworthiness is widely assumed, but untested
  • In an experimental study, which values were disclosed mattered, not whether values were disclosed
  • In comments on ST, confidence in science and the EPA correlated with social aims for science
  • In VISS development, cynicism, not the VFI, predicted generalized trust

Conclusions


  • A traditional understanding of objectivity — the value-free ideal — is a poor fit for actual scientific practice in fields like public health
  • Trustworthiness — deserving the public’s trust — is more fundamental than the occurrence of trust
  • Scientists might be hesitant to move away from value-freedom because of concerns about loss of trust(worthiness)
  • But, in the limited empirical research so far,
    the social aims of science provide a better model of trustworthiness

References

Almassi, Ben. 2022. “Relationally Responsive Expert Trustworthiness.” Social Epistemology 36 (5): 576–85. https://doi.org/10.1080/02691728.2022.2103475.
Baier, Annette. 1986. “Trust and Antitrust.” Ethics 96 (2): 231–60. http://www.jstor.org/stable/2381376.
Bright, Liam Kofi. 2017. “Du Bois’ Democratic Defence of the Value Free Ideal.” Synthese, March, 1–19. https://doi.org/10.1007/s11229-017-1333-z.
Browne, Patience, Richard S. Judson, Warren M. Casey, Nicole C. Kleinstreuer, and Russell S. Thomas. 2015. “Screening Chemicals for Estrogen Receptor Bioactivity Using a Computational Model.” Environmental Science & Technology 49 (14): 8804–14. https://doi.org/10.1021/acs.est.5b02641.
Dix, David J., Keith A. Houck, Matthew T. Martin, Ann M. Richard, R. Woodrow Setzer, and Robert J. Kavlock. 2007. “The ToxCast Program for Prioritizing Toxicity Testing of Environmental Chemicals.” Toxicological Sciences 95 (1): 5–12. https://doi.org/10.1093/toxsci/kfl103.
Douglas, Heather. 2000. “Inductive Risk and Values in Science.” Philosophy of Science 67 (4): 559–79. https://doi.org/10.1086/392855.
Elliott, Kevin C., and Daniel J. McKaughan. 2014. “Nonepistemic Values and the Multiple Goals of Science.” Philosophy of Science 81 (1): 1–21. https://doi.org/10.1086/674345.
Fernández Pinto, Manuela, and Daniel J. Hicks. 2019. “Legitimizing Values in Regulatory Science.” Environmental Health Perspectives 127 (3): 035001. https://doi.org/10.1289/EHP3317.
Goldenberg, Maya J. 2021. Vaccine Hesitancy: Public Trust, Expertise, and the War on Science. University of Pittsburgh Press.
Havstad, Joyce C., and Matthew J. Brown. 2017. “Neutrality, Relevance, Prescription, and the IPCC.” Public Affairs Quarterly 31 (4): 303–24. https://www.jstor.org/stable/44732800.
Hendriks, Friederike, Dorothe Kienhues, and Rainer Bromme. 2015. “Measuring Laypeople’s Trust in Experts in a Digital Age: The Muenster Epistemic Trustworthiness Inventory (METI).” PLOS ONE 10 (10): e0139309. https://doi.org/10.1371/journal.pone.0139309.
Hicks, Daniel J. 2017. “Scientific Controversies as Proxy Politics.” Issues in Science and Technology, January 2017. https://www.jstor.org/stable/24891967.
———. 2018. “Inductive Risk and Regulatory Toxicology: A Comment on de Melo-Martín and Intemann.” Philosophy of Science 85 (1): 164–74. https://doi.org/10.1086/694771.
———. 2022. “When Virtues are Vices: ‘Anti-Science’ Epistemic Values in Environmental Politics.” Philosophy, Theory, and Practice in Biology 14 (0). https://doi.org/10.3998/ptpbio.2629.
———. 2023. “Open Science, the Replication Crisis, and Environmental Public Health.” Accountability in Research 30 (1): 34–62. https://doi.org/10.1080/08989621.2021.1962713.
Hicks, Daniel J., and Emilio Jon Christopher Lobato. 2022. “Values Disclosures and Trust in Science: A Replication Study.” Frontiers in Communication 7. https://doi.org/10.3389/fcomm.2022.1017362.
Holman, Bennett, and Torsten Wilholt. 2022. “The New Demarcation Problem.” Studies in History and Philosophy of Science 91 (February): 211–20. https://doi.org/10.1016/j.shpsa.2021.11.011.
John, Stephen. 2017. “Epistemic Trust and the Ethics of Science Communication: Against Transparency, Openness, Sincerity and Honesty.” Social Epistemology 32 (2): 75–87. https://doi.org/10.1080/02691728.2017.1410864.
Kovaka, Karen. 2021. “Climate Change Denial and Beliefs about Science.” Synthese 198 (3): 2355–74. https://doi.org/10.1007/s11229-019-02210-z.
Krieger, Nancy. 2011. Epidemiology and the People’s Health: Theory and Context. New York: Oxford University Press.
Lloyd, Elisabeth A. 1995. “Objectivity and the Double Standard for Feminist Epistemologies.” Synthese 104 (3): 351–81. http://www.jstor.org/stable/20117438.
Lupia, Arthur. 2023. “Political Endorsements Can Affect Scientific Credibility.” Nature 615 (7953): 590–91. https://doi.org/10.1038/d41586-023-00799-3.
Lupia, Arthur, David B. Allison, Kathleen Hall Jamieson, Jennifer Heimberg, Magdalena Skipper, and Susan M. Wolf. 2024. “Trends in US Public Confidence in Science and Opportunities for Progress.” Proceedings of the National Academy of Sciences 121 (11): e2319488121. https://doi.org/10.1073/pnas.2319488121.
Lusk, Greg, and Kevin C. Elliott. 2022. “Non-Epistemic Values and Scientific Assessment: An Adequacy-for-Purpose View.” European Journal for Philosophy of Science 12 (2): 35. https://doi.org/10.1007/s13194-022-00458-w.
Menon, Tarun, and Jacob Stegenga. 2023. “Sisyphean Science: Why Value Freedom Is Worth Pursuing.” European Journal for Philosophy of Science 13 (4): 48. https://doi.org/10.1007/s13194-023-00552-7.
Metzen, Hanna. 2024. “Objectivity, Shared Values, and Trust.” Synthese 203 (2): 60. https://doi.org/10.1007/s11229-024-04493-3.
Nichols, Tom. 2024. “Scientific American Didn’t Need to Endorse Anybody.” The Atlantic, September 20, 2024. https://web.archive.org/web/20240920234606/https://www.theatlantic.com/newsletters/archive/2024/09/scientific-american-harris-endorsement-science-covid/679931/.
Plutynski, Anya. 2017. “Safe or Sorry? Cancer Screening and Inductive Risk.” In Exploring Inductive Risk: Case Studies of Values in Science, edited by Kevin C. Elliott and Ted Richards. New York: Oxford University Press.
Potochnik, Angela. 2017. Idealization and the Aims of Science. Chicago and London: University of Chicago Press.
Saal, Frederick S. vom, Michael Antoniou, Scott M. Belcher, Ake Bergman, Ramji K. Bhandari, Linda S. Birnbaum, Aly Cohen, et al. 2024. “The Conflict Between Regulatory Agencies over the 20,000-Fold Lowering of the Tolerable Daily Intake (TDI) for Bisphenol A (BPA) by the European Food Safety Authority (EFSA).” Environmental Health Perspectives 132 (4): 045001. https://doi.org/10.1289/EHP13812.
Većkalov, Bojana, Sandra J. Geiger, František Bartoš, Mathew P. White, Bastiaan T. Rutjens, Frenk van Harreveld, Federica Stablum, et al. 2024. “A 27-Country Test of Communicating the Scientific Consensus on Climate Change.” Nature Human Behaviour, August, 1–14. https://doi.org/10.1038/s41562-024-01928-2.

Extra Slides

Epistemic interdependence

  • The world is far too complex for any one individual to know and understand more than a tiny part.
  • So we are epistemically interdependent:
    We need other people to know things on our behalf.
  • Epistemic representatives are simply the people we entrust to know and understand things on our behalf.
  • Often these people are scientists.

Trust: The 3-place relation

  • \(X\) trusts \(Y\) to do \(Z\). (Baier 1986)
  • Dan trusts the cat sitter to take care of Jag and Jasper.
  • The public trusts public health experts to understand and communicate the risks of Covid-19.

Three dimensions of trustworthiness

Should \(X\) trust \(Y\) to do \(Z\)?


competence
Is \(Y\) capable of doing \(Z\)?
Are the public health experts appropriately trained in epidemiology, disease surveillance, immunology, etc.?
integrity
Will \(Y\) actually do \(Z\) the way they indicate that they will?
Are the public health experts honest?
responsiveness
Also called benevolence and good will. In doing \(Z\), will \(Y\) be responsive to \(X\)’s concerns and interest in having \(Z\) done?
Are the public health experts responsive to the public’s concerns and interests in understanding Covid-19 risks?


(Hicks 2017; Goldenberg 2021; Almassi 2022;
compare Hendriks, Kienhues, and Bromme 2015)

ST supporters and opponents

VISS subscales

cynicism
coi.1 Scientists will report conclusions that they think will get them more funding even if the data does not fully support that conclusion.
coi.2 Special interests can always find a scientist-for-hire who will support their point of view.
consensus.1 The consensus of the scientific community is based on social status and prestige rather than evidence.
ir Scientists should be more cautious about accepting a hypothesis when doing so could have serious social consequences.
objectivity
nonsubj.1 When analyzing data, scientists should let the data speak for itself rather than offering their own interpretation.
nonsubj.2 Good scientific research is always free of assumptions and speculation.
vfi.1 The evaluation and acceptance of scientific results must not be influenced by social and ethical values.
textbook
consensus.2 Scientists never disagree with each other about the answers to scientific questions.
fallible.1 Once a scientific theory has been established, it is never changed.
pluralism.1 Scientific investigations always require laboratory experiments.
pluralism.2 All scientists use the same strict requirements for determining when empirical data confirms a tested hypothesis.