Data literacy and historical data criticism

Poster: Books to Bytes

Data literacy and, specifically, data criticism, understood as “the ability to collect, manage, evaluate, and apply data in a critical manner” (Ridsdale et al. 2015, 2), are essential skills across nearly all sectors and disciplines of the global knowledge-based economy. Data literacy is of particular relevance for historically oriented scholarship, however, because it directly affects the discipline’s methodological self-conception – the historical method of source criticism. Source criticism is the process of evaluating the qualities of an information source, such as its validity, reliability and relevance to the subject under investigation. Sound historical scholarship depends on a rigorous and careful application of this critical historical method as the basis for historical hermeneutics and the intersubjective validation of historical knowledge production.

Historical source criticism takes into account that each individual source has a specific origin and history of transmission. The reconstruction and critical evaluation of this history is a central element of source contextualisation, which always stands at the beginning of the historical research process. The digitalisation and datafication of historical source material add further layers to a source’s history of origin and transmission. Data modelling complicates historical source criticism even more, because the critical evaluation of a data model requires skills that historians usually do not acquire during their scholarly education, such as familiarity with data modelling standards, scripting languages and information technologies. The method of historical source criticism must therefore be expanded to include skills in data criticism: historians must be able to critically analyse the data models, metadata, infrastructures, IT systems, interfaces and algorithms used in the process of creating historical data, and to evaluate them with regard to their implications for information retrieval and their impact on research results.

Together with Marina Lemaire (Servicezentrum eSciences) and Stefan Schmunk (Hochschule Darmstadt), Ursula Lehmkuhl coordinates the efforts of the NFDI consortium 4memory to enhance data literacy in historical research and teaching and to establish guidelines for historical data criticism. This includes the development of specialised methodological and curricular innovations to enhance data literacy (DL) in historically oriented research and teaching. We see this as a core prerequisite for the much-needed cultural change in our profession.

In cooperation with a large network of participants, we promote data literacy by focusing on competence development and skills acquisition. We aim to integrate (i.e. “mainstream”) the teaching of data literacy at all levels of history education – in regular BA, MA and PhD programmes as well as in area history courses, e.g. in African, Asian, European and East European, Latin American and Russian history – thus counterbalancing current tendencies to compartmentalise data literacy training by creating ever more specialised Digital History programmes. We also aim to normalise global history approaches by systematically integrating chronological and geographical perspectives into our expert discussions in order to innovate the historical method of source criticism under the digital condition.

Research projects

NFDI4Memory

NFDI4@Uni Trier