The Ethics of Research and Teaching in an Age of Big Data


  • David Lundie, University of Glasgow



Keywords: Research Governance, Research Ethics, Big Data, Higher Education Governance, Large Language Models, Machine Learning


Big Data, understood as high-volume, high-velocity and/or high-variety information assets that enable insight, decision-making, and process automation (Gartner, 2015), offers both opportunities and challenges in all aspects of human life. As Higher Education serves as preparation not only for economic but also for health, welfare, social and civic participation, these changes are imbricated in many aspects of academic endeavor.

In relation to research ethics, this change represents a normative difference in degree rather than a difference in kind. Data are messier, faster-moving, harder to predict, and their owners harder to identify, but the principles of informed consent, confidentiality, and prevention of harm apply to digital research data just as they do to traditional research data. Central to applying these principles, however, is the recognition that technologies are not inherently value neutral, and that data collection, aggregation, and use in decision-making can both create and intensify inequities and harms. A data justice approach to research ethics extends concern with voice and authenticity into the digital domain.

The transparency and ethics of our research processes have wider significance, as they shape the creation of new knowledge and the processes by which it is disseminated to students. Universities play an important role as gatekeepers to professional accreditation in a number of fields, including software engineering. The relation between academic freedom of enquiry and state and corporate interests in the Big Data age raises important questions about power and control in the academy, which in turn have implications for the norms of research governance.




British Educational Research Association. (2019). Ethical guidelines for educational research, fourth edition (2018).

Bologna Follow-Up Group. (2005). An overarching framework of qualifications for the EHEA.

Buzan, B., Waever, O., & de Wilde, J. (1997). Security: A new framework for analysis. Lynne Rienner Publishers.

Carta, S. (2019). Big data, code, and the discrete city: Shaping public realms. Routledge.

Easterbrook, F. H. (1996). Cyberspace and the law of the horse. University of Chicago Legal Forum, 207–216.

Flanagan, M., Howe, D. C., & Nissenbaum, H. (2009). Embodying values in technology: Theory and practice. In J. Van Den Hoven & J. Weckert (Eds.), Information technology and moral philosophy (pp. 322–353). Cambridge University Press.

Floridi, L. (2004). Information. In L. Floridi (Ed.), The Blackwell guide to the philosophy of computing and information (pp. 40–61). Blackwell.

Floridi, L. (2011). Children of the fourth revolution. Philosophy & Technology, 24, 227–232.

Floridi, L. (2014). The fourth revolution: How the infosphere is reshaping human reality. Oxford University Press.

Floridi, L. (2016). On human dignity as a foundation for the right to privacy. Philosophy & Technology, 29, 307–312.

Fountain, J. E. (2022). The moon, the ghetto and artificial intelligence: Reducing systemic racism in computational algorithms. Government Information Quarterly, 39(2), 101645.

French, R. M. (2000). The Turing Test: The first 50 years. Trends in Cognitive Sciences, 4(3), 115–122.

Gartner. (2015). Information technology glossary.

Ghosh, D. (2020). Terms of disservice: How Silicon Valley is destructive by design. Brookings Institution Press.

Hartong, S. (2016). Between assessments, digital technologies and big data: The growing influence of ‘hidden’ data mediators in education. European Educational Research Journal, 15(5), 523–536.

Hartzog, W., & Stutzman, F. (2013). The case for online obscurity. California Law Review, 101, 1–50.

Ivančík, R. (2021). Security theory: Security as a multidimensional phenomenon. Vojenske Reflexie, 16(3), 32–53.

Lundie, D. (2016). Authority, autonomy and automation: The irreducibility of pedagogy to information transactions. Studies in Philosophy and Education, 35, 279–291.

Lundie, D. (2022). School leadership between community and the state: The changing civic role of schooling. Palgrave Macmillan.

Lundie, D., Zwitter, A., & Ghosh, D. (2022, January 31). Corporatized education and state sovereignty.

McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies. I/S: A Journal of Law and Policy for the Information Society, 4(3), 543–568.

McEwen, A., & Cassimally, H. (2013). Designing the internet of things. John Wiley & Sons.

Miller, V., Fernandez, F., & Hutchins, N. H. (2023). The race to ban race: Legal and critical arguments against state legislation to ban critical race theory in higher education. Missouri Law Review, 88(1), 1–46.

Mulligan, D. K., Koopman, C., & Doty, N. (2016). Privacy is an essentially contested concept: A multi-dimensional analytic for mapping privacy. Philosophical Transactions of the Royal Society A, 374(2083), 1–17.

Nietzel, M. T. (2022, May 31). Britain opens up its visas for graduates of world’s top universities.

Noorman, M. (2012). Computing and moral responsibility.

O’Donnell, R. M. (2019). Challenging racist predictive policing algorithms under the equal protection clause. New York University Law Review, 94(3), 544–580.

Papacharissi, Z. (2012). Without you, I’m nothing: Performances of the self on Twitter. International Journal of Communication, 6, 1989–2006.

Pitidis, V., de Albuquerque, J. P., Coaffee, J., & Lima-Silva, F. (2022). Enhancing community resilience through dialogical participatory mapping. In ISCRAM (pp. 495–503).

Reid, D. (2016). Man vs. machine: The battle for the soul of data science. In A. Bunnik, A. Cawley, M. Mulqueen, & A. Zwitter (Eds.), Big data challenges: Society, security, innovation and ethics (pp. 11–22). Palgrave.

Solove, D. J. (2008). Understanding privacy. Harvard University Press.

Staton, B. (2023, April 3). Universities express doubt over tool to detect AI-generated plagiarism.

Sumartojo, S., Pink, S., Lupton, D., & Heyes LaBond, C. (2016). The affective intensities of datafied space. Emotion, Space and Society, 33–40.

Tse, J., Schrader, D. E., Ghosh, D., Liao, T., & Lundie, D. (2015). A bibliometric analysis of privacy and ethics in IEEE Security and Privacy. Ethics and Information Technology, 17, 153–163.

Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433–460.

UK Research Integrity Office. (2019). Concordat to support research integrity.

University of Glasgow. (2023). Online information links for internet based research.

Weinberger, D. (2011). Too big to know: Rethinking knowledge now that the facts aren’t the facts, experts are everywhere, and the smartest person in the room is the room. Basic Books.

Wicker, S. B., & Schrader, D. E. (2010). Privacy-aware design principles for information networks. Proceedings of the IEEE, 99(2), 330–350.

Woolcock, N., Zeffman, H., & Geddes, D. (2017, October 25). Tory whip ‘wanted names of Brexit lecturers for book research.’

World Economic Forum. (2020). Education 4.0.




How to Cite

Lundie, D. (2024). The Ethics of Research and Teaching in an Age of Big Data. Journal of Comparative & International Higher Education, 16(2).



Empirical Article