

  • Comment

How to use generative AI more responsibly

Researchers in the psychological sciences can use generative AI systems for tasks such as generating simulated data, creating new stimuli, and gaining insights into data. Responsible use of these systems requires consideration of how sociocultural systems such as racism are embedded in their development and training.



Acknowledgements

The author thanks members of The Human in Computing and Cognition (THiCC) Lab for helpful comments. This work was supported by the NSF AI institute for Societal Decision Making (AI-SDM) under grant number IIS 2229881.

Author information

Corresponding author

Correspondence to Christopher L. Dancy.

Ethics declarations

Competing interests

The author declares no competing interests.


About this article


Cite this article

Dancy, C.L. How to use generative AI more responsibly. Nat Rev Psychol (2024). https://doi.org/10.1038/s44159-024-00339-4

