The Ethics of Research and Teaching in an Age of Big Data

Authors

  • David Lundie University of Glasgow

DOI:

https://doi.org/10.32674/jcihe.v16i2.5779

Keywords:

Research Governance, Research Ethics, Big Data, Higher Education Governance, Large Language Models, Machine Learning

Abstract

Big Data, understood as high-volume, high-velocity and/or high-variety information assets that enable insight, decision-making, and process automation (Gartner, 2015), offers both opportunities and challenges in all aspects of human life. As Higher Education serves as preparation not only for economic participation but also for health, welfare, social, and civic participation, these changes are imbricated in many aspects of academic endeavor.

In relation to research ethics, this change represents a normative difference in degree rather than in kind. Big Data is messier, faster-moving, and harder to predict, and its ownership is harder to identify, but the principles of informed consent, confidentiality, and prevention of harm apply equally to digital and to traditional research data. Central to applying these principles, however, is the recognition that technologies are not inherently value neutral, and that data collection, aggregation, and use in decision-making can both create and intensify inequities and harms. A data justice approach to research ethics extends concern with voice and authenticity into the digital domain.

The transparency and ethics of our research processes have wider significance, as they shape the creation of new knowledge and the processes by which it is disseminated to students. Universities play an important role as gatekeepers to professional accreditation in a number of fields, including software engineering, and the relations among academic freedom of enquiry and state and corporate interests in the Big Data age raise important questions about power and control in the academy, which in turn have implications for the norms of research governance.

References

British Educational Research Association. (2019). Ethical guidelines for educational research (4th ed., 2018). https://www.bera.ac.uk/publication/ethical-guidelines-for-educational-research-2018-online

Bologna Follow-Up Group. (2005). An overarching framework of qualifications for the EHEA. http://www.ehea.info/cid102059/wg-frameworks-qualification-2003-2005.html

Buzan, B., Waever, O., & de Wilde, J. (1997). Security: A new framework for analysis. Lynne Rienner Publishers.

Carta, S. (2019). Big data, code, and the discrete city: Shaping public realms. Routledge. https://doi.org/10.4324/9781351007405

Easterbrook, F. H. (1996). Cyberspace and the law of the horse. University of Chicago Legal Forum, 207–216. https://chicagounbound.uchicago.edu/cgi/viewcontent.cgi?article=2147

Flanagan, M., Howe, D. C., & Nissenbaum, H. (2009). Embodying values in technology: Theory and practice. In J. van den Hoven & J. Weckert (Eds.), Information technology and moral philosophy (pp. 322–353). Cambridge University Press. https://doi.org/10.1017/CBO9780511498725.017

Floridi, L. (2004). Information. In L. Floridi (Ed.), The Blackwell guide to the philosophy of computing and information (pp. 40–61). Blackwell. https://doi.org/10.1002/9780470757017

Floridi, L. (2011). Children of the fourth revolution. Philosophy & Technology, 24, 227–232. https://doi.org/10.1007/s13347-011-0042-7

Floridi, L. (2014). The fourth revolution: How the infosphere is reshaping human reality. Oxford University Press. https://www.oii.ox.ac.uk/research/publications/the-fourth-revolution/

Floridi, L. (2016). On human dignity as a foundation for the right to privacy. Philosophy & Technology, 29, 307–312. https://doi.org/10.1007/s13347-016-0220-8

Fountain, J. E. (2022). The moon, the ghetto and artificial intelligence: Reducing systemic racism in computational algorithms. Government Information Quarterly, 39(2), 101645. https://doi.org/10.1016/j.giq.2021.101645

French, R. M. (2000). The Turing Test: The first 50 years. Trends in Cognitive Science, 4(3), 115–122. https://doi.org/10.1016/S1364-6613(00)01453-4

Gartner. (2015). Information technology glossary. https://www.gartner.com/en/information-technology/glossary/big-data

Ghosh, D. (2020). Terms of disservice: How Silicon Valley is destructive by design. Brookings Institution Press.

Hartong, S. (2016). Between assessments, digital technologies and big data: The growing influence of ‘hidden’ data mediators in education. European Educational Research Journal, 15(5), 523–536. https://doi.org/10.1177/1474904116648966

Hartzog, W., & Stutzman, F. (2013). The case for online obscurity. California Law Review, 101, 1–50. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1597745

Ivančík, R. (2021). Security theory: Security as a multidimensional phenomenon. Vojenske Reflexie, 16(3), 32–53. https://doi.org/10.52651/vr.a.2021.3.32-53

Lundie, D. (2016). Authority, autonomy and automation: The irreducibility of pedagogy to information transactions. Studies in Philosophy and Education, 35, 279–291. https://doi.org/10.1007/s11217-016-9517-4

Lundie, D. (2022). School leadership between community and the state: The changing civic role of schooling. Palgrave Macmillan. https://doi.org/10.1007/978-3-030-99834-9

Lundie, D., Zwitter, A., & Ghosh, D. (2022, January 31). Corporatized education and state sovereignty. https://www.brookings.edu/blog/techtank/2022/01/31/corporatized-education-and-state-sovereignty/

McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies. I/S: A Journal of Law and Policy for the Information Society, 4(3), 543–568. https://www.technologylawdispatch.com/wp-content/uploads/sites/26/2013/02/Cranor_Formatted_Final1.pdf

McEwen, A., & Cassimally, H. (2013). Designing the internet of things. John Wiley & Sons. https://www.wiley.com/en-us/Designing+the+Internet+of+Things-p-9781118430620

Miller, V., Fernandez, F., & Hutchins, N. H. (2023). The race to ban race: Legal and critical arguments against state legislation to ban critical race theory in higher education. Missouri Law Review, 88(1), 1–46. https://scholarship.law.missouri.edu/mlr/vol88/iss1/6/

Mulligan, D. K., Koopman, C., & Doty, N. (2016). Privacy is an essentially contested concept: A multi-dimensional analytic for mapping privacy. Philosophical Transactions of the Royal Society A, 374(2083), 1–17. https://doi.org/10.1098/rsta.2016.0118

Nietzel, M. T. (2022, May 31). Britain opens up its visas for graduates of world’s top universities. https://www.forbes.com/sites/michaeltnietzel/2022/05/31/britain-opens-up-its-visas-for-graduates-of-worlds-top-universities/?sh=464e8a827fcf

Noorman, M. (2012). Computing and moral responsibility. http://plato.stanford.edu/archives/fall2012/entries/computing-responsibility

O’Donnell, R. M. (2019). Challenging racist predictive policing algorithms under the equal protection clause. New York University Law Review, 94(3), 544–580. https://www.nyulawreview.org/wp-content/uploads/2019/06/NYULawReview-94-3-ODonnell.pdf

Papacharissi, Z. (2012). Without you, I’m nothing: Performances of the self on Twitter. International Journal of Communication, 6, 1989–2006. https://ijoc.org/index.php/ijoc/article/view/1484/775

Pitidis, V., de Albuquerque, J. P., Coaffee, J., & Lima-Silva, F. (2022). Enhancing community resilience through dialogical participatory mapping. In Proceedings of the ISCRAM Conference (pp. 495–503). https://www.idl.iscram.org/files/vangelispitidis/2022/2435_VangelisPitidis_etal2022.pdf

Reid, D. (2016). Man vs. machine: The battle for the soul of data science. In A. Bunnik, A. Cawley, M. Mulqueen, & A. Zwitter (Eds.), Big data challenges: Society, security, innovation and ethics (pp. 11–22). Palgrave. https://doi.org/10.1057/978-1-349-94885-7_2

Solove, D. J. (2008). Understanding privacy. Harvard University Press. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1127888

Staton, B. (2023, April 3). Universities express doubt over tool to detect AI-generated plagiarism. https://www.ft.com/content/d872d65d-dfd0-40b3-8db9-a17fea20c60c

Sumartojo, S., Pink, S., Lupton, D., & Heyes LaBond, C. (2016). The affective intensities of datafied space. Emotion, Space and Society, 33–40. https://doi.org/10.1016/j.emospa.2016.10.004

Tse, J., Schrader, D. E., Ghosh, D., Liao, T., & Lundie, D. (2015). A bibliometric analysis of privacy and ethics in IEEE Security and Privacy. Ethics and Information Technology, 17, 153–163. https://doi.org/10.1007/s10676-015-9369-6

Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433–460. http://www.jstor.org/stable/2251299

UK Research Integrity Office. (2019). Concordat to support research integrity. https://ukrio.org/research-integrity/the-concordat-to-support-research-integrity/

University of Glasgow. (2023). Online information links for internet based research. https://www.gla.ac.uk/colleges/socialsciences/students/ethics/ethicstrainingresources/onlinedatainformationlinks/

Weinberger, D. (2011). Too big to know: Rethinking knowledge now that the facts aren’t the facts, experts are everywhere, and the smartest person in the room is the room. Basic Books.

Wicker, S. B., & Schrader, D. E. (2010). Privacy-aware design principles for information networks. Proceedings of the IEEE, 99(2), 330–350. https://doi.org/10.1109/JPROC.2010.2073670

Woolcock, N., Zeffman, H., & Geddes, D. (2017, October 25). Tory whip ‘wanted names of Brexit lecturers for book research.’ https://www.thetimes.co.uk/article/i-want-names-of-brexit-lecturers-tory-whip-chris-heaton-harris-tells-universities-6sv98nn0x

World Economic Forum. (2020). Education 4.0. https://initiatives.weforum.org/reskilling-revolution/education-4-0

Published

2024-05-22

Section

Empirical Article

How to Cite

The Ethics of Research and Teaching in an Age of Big Data. (2024). Journal of Comparative & International Higher Education, 16(2). https://doi.org/10.32674/jcihe.v16i2.5779