
Being able to read and write opens up the world of education and knowledge. When and why did more people become literate? How can progress continue?

By Max Roser and Esteban Ortiz-Ospina

This article was first published in 2013 and was last revised in March 2024.

Literacy is a key skill and a key measure of a population’s education. In this topic page, we discuss historical trends, as well as recent developments in literacy.

From a historical perspective, literacy levels of the world population have risen drastically over the last couple of centuries. While only one in ten people in the world could read and write in 1820, the share has since reversed: today, only one in ten remains illiterate.

Despite large improvements in the expansion of basic education and the continuous reduction of education inequalities, there are substantial challenges ahead. The poorest countries in the world, where basic education is most likely to be a constraint for development, still have very large segments of the population who are illiterate.

See all interactive charts on literacy ↓

Related topics


Global Education

See all our data, visualizations, and writing on global education.

Other research and writing on literacy on Our World in Data:

  • Access to basic education: Almost 60 million children of primary school age are not in school
  • Education spending
  • How is literacy measured?
  • Millions of children learn only very little. How can the world provide a better education to the next generation?

Global literacy today

Of the world population older than 15 years, the majority are literate. This interactive map shows how literacy rates vary around the world.

In many countries, more than 95% of the population has basic literacy skills. As we show below, such widespread literacy is a modern achievement.

Globally, however, large inequalities remain, notably between some countries in sub-Saharan Africa and the rest of the world. In some countries in sub-Saharan Africa, fewer than one in three adults (aged over 15 years) are able to both read and write.

Historical change in literacy

Global literacy has grown substantially in the last two centuries.

While the earliest forms of written communication date back to about 3,500-3,000 BCE, for centuries literacy remained a very restricted technology closely associated with the exercise of power. It was only during the Middle Ages that book production started growing, and literacy among the general population slowly started becoming important in the Western World. 1

While the ambition of universal literacy in Europe was a fundamental reform born from the Enlightenment, it took centuries for it to happen. It was only in the 19th and 20th centuries that rates of literacy approached universality in early-industrialized countries.

The following visualization presents estimates of world literacy from the early nineteenth century to the present day. As we can see, literacy rates grew steadily but rather slowly until the beginning of the twentieth century. Growth accelerated after the middle of the 20th century, when the expansion of basic education became a global priority. You can read more about the expansion of education systems around the world on our topic page on Financing Education.

When did literacy start increasing in Europe?

The following visualization shows the spread of literacy in Europe since the 15th century, based on estimates from Buringh and Van Zanden (2009). 2

As can be seen, the rising levels of education in Europe foreshadowed the emergence of modern societies.

Particularly fast improvements in literacy took place across Northwest Europe in the period 1600-1800. As we discuss below, widespread literacy is considered a legacy of the Age of Enlightenment.

The ambition of universal literacy in Europe was a reform born of the Enlightenment

The next chart shows historical estimates of literacy in England over the last five centuries.

The historical estimates are based on the percentage of men and women who could sign documents, a very basic definition of literacy that is often used in historical research on education. 3

The first observations refer to men and women in the diocese of Norwich, which lies to the northeast of London. Here, the majority of men (61%) could not sign their names in the late 16th century; among women, the share who could sign was lower still.

By 1840, two-thirds of men and about half of women were literate in England. The expansion of education led to a reduction in education gender inequality. Towards the end of the 19th century, the share had increased to almost three-quarters for both genders.

As the center of the Industrial Revolution and one of the first countries to establish democratic institutions, England was, in important respects, at the center of the development of modernity.

The data shows that improvements in literacy preceded the Industrial Revolution, and in many ways, the rise in living standards became possible only thanks to an increasingly better-educated public. Economic growth is possible when we better understand how to produce the things we need and translate these insights into technological improvements that allow us to produce them more efficiently. Both the development of new technologies (innovation) and their use in production relied on a much better-educated population.

Widespread school education and even basic skills like literacy are very recent achievements that were enabled and, at the same time, required by the progress achieved in recent generations.


In the US, the expansion of literacy helped reduce within-country inequalities

The expansion of literacy in early-industrialized countries helped reduce within-country inequalities. In the preceding visualization, we showed that England virtually closed literacy gender gaps by 1900. Here, we provide evidence of literacy gaps across races in the US.

The following visualization shows illiteracy rates by race for the period 1870-1979. As we can see, in order to reach near-universal levels of literacy, the US had to close the race gap. This was eventually achieved around 1980.


Literacy and access to education have increased around the world

The next visualization shows, in two panels, a side-by-side comparison of long-term trends in school attendance and literacy.

We can see that in 1870, only one in four people in the world attended school, and only one in five were able to read. Global inequalities in access to education were very large.

Today, in contrast, the global estimates of literacy and school attendance are above 80%, and the inequality between world regions – while still existing – is much lower.

We can see that two centuries ago, only a small elite of the world population had the ability to read and write – the best estimate is that 12% of the world population was literate. Over the course of the 19th century, global literacy more than doubled. And over the course of the 20th century the world achieved rapid progress in education. More than 4 out of 5 people are now able to read. Younger generations are better educated than ever before.


Latin America has made huge improvements in literacy in the last century

As pointed out above, Europe pioneered the expansion of basic education – but global literacy rates only started really climbing in the second half of the 20th century, when the expansion of basic education became a global priority. Here, we present evidence of important recent achievements in Latin America, where literacy has dramatically increased in the past century.

As can be seen, many nations have gained 40-50 percentage points in literacy during this period.

Despite these improvements, however, there is still a wide disparity between nations. Here, you can see that, at the turn of the 21st century, half of the population in poor countries such as Haiti remained illiterate. This motivates the next visualization, where we discuss cross-country heterogeneity in more detail.

Literacy around the world today

Literacy by generation

To assess the extent to which progress can be expected in the years to come, it is convenient to break down literacy estimates by age groups. The following map, using data from UNESCO, shows such estimates for most countries in the world.

As can be seen, in the majority of nations, there is a large difference in literacy rates across generations.

These large differences across generations point to a global trend: the high literacy rate among the youth indicates that as time passes, the literacy rate for the overall population will continue to increase.


Northern Africa and the Middle East have drastically improved literacy in just one generation

We highlighted above that most low- and middle-income countries feature large differences in literacy rates across generations. The visualization shows how remarkably large these differences are in Northern Africa and the Middle East. Using UNESCO data, these maps show that in many countries in these regions, less than a third of the older generation is literate, while more than 90% of the younger generation is.


The scatter plot emphasizes the same point. As you can see, younger generations are more likely to be literate than older generations around the world. In some countries, the gaps are dramatic: in countries in the bottom right of the chart, the youth literacy rate is over 90% while the literacy rate among the elderly is below 40%.

In the chart you can use the slider at the bottom to check how these generational gaps have changed in recent decades. You can see that throughout Africa, the changes have been mainly horizontal (i.e., gaps have widened as radical recent improvements specifically benefited the younger population). This is in contrast to richer regions, such as Europe, where the expansion of education started earlier and, as a consequence, changes have been mainly vertical.

Literacy rates by sex

The visualization shows the literacy rates for young men and women. In countries above the diagonal dashed line, literacy rates are higher for young men than for young women; this is the case in many poorer countries.

This chart shows the literacy rate by sex over time.

This visualization shows the ratio of the literacy rate between young women and men around the world.

Numeracy skills over the long run

Numeracy is the ability to understand and work with numbers. The visualization shows how this ability became more common in populations around the world based on a very basic definition of numeracy, the ability to state one's own age correctly.
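In the economic history literature, this is typically quantified through "age heaping": people who do not know their exact age tend to round it to a multiple of five. A standard summary statistic is Whipple's index, computed on reported ages between 23 and 62. Whether the chart above uses exactly this index is not stated here, so the following Python sketch, run on made-up reported ages, is purely illustrative:

```python
def whipple_index(ages):
    """Whipple's index of age heaping: 100 means ages ending in 0 or 5
    appear at the expected one-in-five rate (no heaping); 500 means
    every reported age in the 23-62 window ends in 0 or 5."""
    window = [a for a in ages if 23 <= a <= 62]
    heaped = sum(1 for a in window if a % 5 == 0)
    return 100 * 5 * heaped / len(window)

# Made-up reported ages: the strong clustering on multiples of five
# suggests many respondents did not know their exact age.
reported = [25, 30, 30, 34, 35, 40, 40, 40, 47, 50, 55, 60]
print(round(whipple_index(reported)))  # ~417, far above the no-heaping value of 100
```

The higher the index, the more respondents rounded their ages, and the lower the implied basic numeracy of the population.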

Numeracy skills today

Compared to the data on literacy, we have less information on numeracy skills in the world today. Some information comes from PIAAC (the Programme for the International Assessment of Adult Competencies), the OECD's survey of adult skills.

The scatter plot shows how adults in OECD countries scored in the literacy and numeracy dimensions. We see that the two are closely correlated: countries with high literacy also have high numeracy.

PIAAC is only available for the recent past, but it can still give us some insights into how numeracy skills in the world have changed. If we compare the numeracy scores of the young cohort with the older cohort in a scatterplot, we find that in most countries, numeracy skills have recently increased.

Definitions & Measurement

Measurement today: common methods and data sources

In the chart, we present a breakdown of UNESCO literacy estimates, showing the main methodologies used and how these have changed over time. (To explore changes across time, use the slider underneath the map.)

The breakdown covers four categories: self-reported literacy declared directly by individuals, self-reported literacy declared by the head of the household, tested literacy from proficiency examinations, and indirect estimation or extrapolation.

In most cases, the categories covering 'self-reports' correspond to estimates of literacy that rely on answers provided to a simple yes/no question asking people if they can read and write.

The category 'indirect estimates' corresponds mainly to estimates that rely on indirect evidence from educational attainment, usually based on the highest degree of completed education.

In this table, you will find details regarding all literacy definitions and sources, country by country, and how we categorized them for the purpose of this chart.

This chart tells us that:

  • There is substantial cross-country variation, with recent estimates covering all four measurement methods.
  • There is variation within countries across time (e.g., Mexico switches between self-reports and extrapolation).

Another way to dissect the same data is to classify literacy estimates according to the type of measurement instrument used to collect the relevant data. In the next chart, we explore this, splitting estimates into three categories: sampling (including data from literacy tests and household surveys), census data, and other instruments (e.g., administrative data on school enrollment).

Here, we can see that most countries use sampling instruments (coded as 'Survey' on the map), although in the past, census data was more common. Literacy surveys have the potential to be more accurate – when the sampling is done correctly – because they allow for more specific and detailed measurement than the short, generic questions in population censuses.

Data quality: Challenges and limitations

As mentioned above, recent data on literacy is often based on a single question included in national population censuses, or household surveys presented to respondents above a certain age, where literacy skills are self-reported. The question is often phrased as "Can you read and write?". These self-reports of literacy skills have several limitations:

  • Simple questions such as "Can you read and write?" frame literacy as a skill you either possess or do not when, in reality, literacy is a multi-dimensional skill that exists on a continuum.
  • Self-reports are subjective in that the question depends on what each individual understands by "reading" and "writing". The form of a word may be familiar enough for a respondent to recall its sound or meaning without actually ‘reading’ it. Similarly, writing out one’s name can be accomplished by ‘drawing’ a familiar shape rather than producing a written text with meaning.
  • In many cases, surveys ask only one individual to report literacy on behalf of the entire household. This indirect reporting potentially introduces further noise, in particular when it comes to estimating literacy among women and children since these groups are less often considered 'heads of household' in the surveys.

Similarly, inferring literacy from data on educational attainment is also problematic, since schooling does not produce literacy in the same way everywhere: proficiency tests show that in many low-income countries, a large fraction of second-grade primary school students cannot read a single word of a short text, and in these countries, even four or five years of schooling guarantees basic literacy for very few people.

Even at a conceptual level, there is a lack of consensus – national definitions of literacy that are based on educational attainment vary substantially from country to country. For example, in Greece, people are considered literate if they have finished six years of primary education, while in Paraguay, you qualify as literate if you have completed two years of primary school. 6
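To see how much these definitional differences can matter, consider a toy calculation: applying the two thresholds just mentioned (two years of primary school versus six) to the same hypothetical distribution of completed schooling produces very different "literacy" rates. The numbers below are invented purely for illustration:

```python
# Hypothetical years of completed schooling for ten adults.
years_of_schooling = [0, 1, 2, 3, 3, 4, 5, 6, 8, 11]

def attainment_literacy_rate(years, threshold):
    """Share of people counted as 'literate' under an attainment-based
    definition: literate if completed years >= threshold."""
    return sum(y >= threshold for y in years) / len(years)

print(attainment_literacy_rate(years_of_schooling, 2))  # 0.8 under a Paraguay-style 2-year rule
print(attainment_literacy_rate(years_of_schooling, 6))  # 0.3 under a Greece-style 6-year rule
```

The same population is 80% "literate" under one national definition and 30% under the other, which is why attainment-based estimates are hard to compare across countries.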

Reconstructing estimates from the past

Trends over time

The correlation between educational attainment and literacy holds across countries and over time. The next chart shows this by plotting changes in literacy rates and average years of schooling.

Each country in this chart is represented by a line, where the beginning and end points correspond to the first and last available observation of these two variables over the period of available data, which varies from country to country. Before 1990, almost all observations correspond to census data.

As we can see by the arrows pointing to the top-right corner, literacy rates tend to be much higher in countries where people tend to have more years of education. As average years of education go up in a country, literacy rates also increase.

Countries with high literacy rates also tend to have higher results in the basic literacy test included in the DHS surveys (this test requires survey respondents to read a sentence shown to them). As we can see in the chart, these two variables are closely related.

Other historical sources

Another historical approach to estimating literacy is to calculate the share of people who could sign official documents (e.g., court documents, marriage certificates). 7

As the researcher Jeremiah Dittmar explains , this approach only gives a lower bound of the estimates because the number of people who could read was higher than the number who could write.

Other methods have been proposed that instead rely on historical estimates of the share of people who could read. For example, researchers Eltjo Buringh and Jan Luiten van Zanden deduce literacy rates from estimated per capita book consumption. 8 As Buringh and Van Zanden show, their estimates based on book consumption are different but still fairly close to alternative estimates based on signed documents.
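Their exact specification is given in the paper (see the endnote below); in stylized form, a log-linear demand relation of the kind they describe might be written as

$$\ln b_{it} = \alpha + \beta \ln \ell_{it} + \eta \ln p_{it} + \gamma^\top x_{it},$$

where $b_{it}$ is per capita book consumption in country $i$ at time $t$, $\ell_{it}$ is the literacy rate, $p_{it}$ is the price of books, and $x_{it}$ collects other demand shifters. Given estimates of book consumption, prices, and the other factors, such a relation can be inverted to back out the implied literacy rate. This form is an illustration, not the authors' actual equation.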

Interactive charts on literacy

An overview of the academic literature on the historical origins and spread of literacy can be found in Easton, P. (2014). Sustaining Literacy in Africa: Developing a Literate Environment . United Nations Educational, Scientific and Cultural Organization. Paris, France.

Buringh, E., & Van Zanden, J. L. (2009). Charting the “Rise of the West”: Manuscripts and Printed Books in Europe, A long-term perspective from the sixth through eighteenth centuries . The Journal of Economic History, 69(02), 409-445.

The graph is based on Clark (2008), who, in turn, relies on the sources indicated in the chart. Gregory Clark (2008). A farewell to alms: a brief economic history of the world. Princeton University Press.

National Center for Education Statistics. From the original source, we have excluded some years to have equal time differences on the x-axis (and interpolated the values for 1950), but the data is shown at the linked source.

The data is taken from the UNESCO statistics. The data refers to both genders and to the latest available data in the period between 2000 and 2012.

You will find more details about this in Chapter 6 of the Education for All Global Monitoring Report (2006).

Since estimates of signed documents tend to rely on small samples (e.g., parish documents from specific towns), researchers often rely on additional assumptions to extrapolate estimates to the national level. For example, Bob Allen provides estimates of the evolution of literacy in Europe between 1500 and 1800 using data on urbanization rates. For more details, see Allen, R. C. (2003). Progress and poverty in early modern Europe . The Economic History Review, 56(3), 403-443.

They use a demand equation that links book consumption to a number of factors, including literacy and book prices. For more details, see Buringh, E. and Van Zanden, J.L., 2009. Charting the “Rise of the West”: Manuscripts and Printed Books in Europe, a long-term Perspective from the Sixth through Eighteenth Centuries. The Journal of Economic History, 69(2), pp.409-445.


Open access | Published: 15 March 2022

A review of academic literacy research development: from 2002 to 2019

Dongying Li (ORCID: orcid.org/0000-0002-6835-5129)

Asian-Pacific Journal of Second and Foreign Language Education, volume 7, Article number 5 (2022)


Academic literacy, as an embodiment of higher-order language and thinking skills within the academic community, bears huge significance for language socialization, resource distribution and even power disposition within the larger sociocultural context. However, although the notion of academic literacy was introduced more than twenty years ago, a clear definition and operationalization of the construct is still lacking. This study conducted a systematic review of academic literacy research based on 94 systematically selected papers published between 2002 and 2019, drawn from multiple databases. These papers were coded in terms of their research methods, types (interventionist or descriptive), settings and research focus. Findings demonstrate (1) the multidimensionality of the academic literacy construct; (2) a growing number of mixed-methods interventionist studies in recent years; and (3) a gradual expansion of academic literacy research in ESL and EFL settings. These findings can inform the design and implementation of future academic literacy research and practice.

Introduction

Academic literacy, as an embodiment of higher-order thinking and learning, not only serves as a prerequisite for knowledge production and communication within the disciplines but also bears huge significance for individual language and cognitive development (Flowerdew, 2013; Moje, 2015). Recent research on academic literacy has gradually moved from regarding literacy as discrete, transferable skills to viewing it as a social practice, closely associated with disciplinary epistemology and identity (Gee, 2015). The view of literacy learning as both a textual and contextual practice is largely driven by the changing educational goals of the twenty-first-century knowledge economy, which requires learners to be active co-constructors of knowledge rather than passive recipients (Gebhard, 2004). Academic literacy development in this sense is considered a powerful tool for knowledge generation, communication and transformation.

However, to date, a clear definition and operationalization of the academic literacy construct that can guide effective pedagogy still seems to be lacking (Wingate, 2018). This risks reducing academic literacy to an umbrella term, with few specifications of the construct's potential to inform actual teaching and learning practices. In this sense, a systematic review of how the construct has been defined, operationalized and approached in actual research settings has great potential to bridge the gap between theory and practice.

Based on these concerns, this study conducts a critical review of academic literacy research over the past twenty years in terms of the construct of academic literacy, research methods, approaches, settings and keywords. A mixed-methods approach is adopted, combining qualitative coding with quantitative analysis to investigate diachronic changes. The results can enrich our understanding of the academic literacy construct and its relation to pedagogical practice while shedding light on future directions for research.

Literature review

Academic literacy, as a set of literacy skills specialized for content learning, is closely associated with individual higher-order thinking and advanced language skill development (Shanahan & Shanahan, 2008). Recent research suggests that the development of advanced literacy skills can only be achieved via students’ active engagement in authentic and purposeful disciplinary learning activities, imbued with meaning, value and emotions (Moje et al., 2008). Therefore, contrary to the ‘autonomous model’ of literacy development, which views literacy as a set of discrete, transferable reading and writing skills, academic literacy development is viewed as participation, socialization and transformation achieved via individuals’ expanding involvement in authentic and meaningful disciplinary learning inquiries (Duff, 2010; Russell, 2009). Academic literacy development in this sense is viewed as a powerful mediation for individual socialization into the academic community, which is in turn closely related to issues of power disposition, resource distribution and social justice (Broom, 2004). Academic literacy development is thus by no means only a cognitive issue but a set of situated social and cultural practices, widely shaped by power, structure and ideology (Lillis & Scott, 2007; Wenger, 1998).

The view of literacy learning as a social practice is typically reflected in genre theory and the ‘academic literacies’ model. Genre, as a series of typified, recurring social actions, serves as a powerful semiotic tool for individuals to act together meaningfully and purposefully (Fang & Coatoam, 2013). Academic literacy development in this sense is viewed as individuals’ gradual appropriation of the shared cultural values and communicative repertoires of the disciplines. These routinized practices of knowing, doing and being not only serve to guarantee the tacit quality of disciplinary knowledge production but also entail a frame of action for the functioning of the academic community (Fisher, 2019; Wenger, 1998). Therefore, academic literacy development empowers individual thinking and learning in pursuit of effective community practices.

Complementary to the genre approach, the ‘academic literacies’ model “views student writing and learning as issues at the level of epistemology and identities rather than skill or socialization”, through the lens of critical literacy, power and ideology (Lea & Street, 1998, p. 159). Drawing from ‘New Literacies’, the ‘academic literacies’ model approaches literacy development within the power of social discourse, in the hope of opening up possibilities for innovation and change (Lea & Street, 2006). Academic literacy development in this sense is regarded as a powerful tool for access, communication and identification within the academic community, and is therefore closely associated with issues of social justice and equality (Gee, 2015).

The notions of genre and ‘academic literacies’ share multiple resemblances with English for Academic Purposes (EAP), which, according to Charles (2013, p. 137), ‘is concerned with researching and teaching the English needed by those who use the language to perform academic tasks’. Both approaches regard literacy learning as highly purposeful and contextual, driven by the practical need of ‘foregrounding the tacit nature of academic conventions’ (Lillis & Tuck, 2016, p. 36). However, while EAP is more text-driven, ‘academic literacies’ is more practice-oriented (Lillis & Tuck, 2016). That is, rather than focusing on ‘normative’ descriptions of academic discourse, the ‘academic literacies’ model lays more emphasis on learner agency, personal experiences and sociocultural diversity, regarded as valuable sources for individual learning and the transformation of community practices (Lillis & Tuck, 2016). This view of literacy learning as meaningful social participation and transformation is now gradually being adopted in the approach of critical EAP (Charles, 2013).

In sum, all these approaches regard academic literacy development as multi-dimensional, encompassing linguistic, cognitive and sociocultural practices (Cumming, 2013). However, to date, a clear definition and operationalization of the academic literacy construct that can guide concrete pedagogies still seems to be lacking. Short and Fitzsimmons (2007, p. 2) provided a tentative definition of academic literacy along the following aspects:

  • Includes reading, writing, and oral discourse for school
  • Varies from subject to subject
  • Requires knowledge of multiple genres of text, purposes for text use, and text media
  • Is influenced by students’ literacies in contexts outside of school
  • Is influenced by students’ personal, social, and cultural experiences.

This definition specifies the main features of academic literacy as both a cognitive and a sociocultural construct; however, more elaboration is needed to further operationalize the construct in real educational and research settings. Drawing from this, Allison and Harklau (2010) and Fang (2012) specified three general approaches to academic literacy research, namely the language, cognitive (disciplinary) and sociocultural approaches, which are elaborated in the following.

The language-based approach is mainly text-driven and lays special emphasis on the acquisition of language structures, skills and functions characteristic of content learning (Allison & Harklau, 2010, p. 134; Uccelli et al., 2014), highlighting explicit instruction on academic language features and discourse structures (Hyland, 2008). This notion is widely influenced by Systemic Functional Linguistics, which specifies the intricate connections between text and context, or between linguistic choices and a text's meaning-making potential under specific communicative intentions and purposes (Halliday, 2000). This approach often highlights explicit consciousness-raising activities in text deconstruction, as embodied in genre pedagogy, facilitated by corpus-linguistic research tools that unveil structures and patterns of academic language use (Charles, 2013).

One typical example is data-driven learning (DDL), or ‘any use of a language corpus by second or foreign language learners’ (Anthony, 2017, p. 163). This approach encourages ‘inductive, self-directed’ language learning, under the guidance of the teacher, to examine and explore language use in real academic settings. These inquiry-based learning processes not only make language learning meaningful and purposeful but also help form more strategic and autonomous learners (Anthony, 2017).

In sum, the language approach intends to unveil the linguistic and rhetorical structure of academic discourse to make it accessible and available for reflection. However, academic literacy development entails more than the acquisition of academic language skills: it also involves the use of academic language as a tool for content learning and scientific reasoning (Bailey et al., 2007), which is closely connected to individual cognitive development, knowledge construction and communication within the disciplines (Fang, 2012).

Therefore, the cognitive or disciplinary-based approach views academic literacy development as higher-order thinking and learning in academic socialization, in pursuit of deep, contextualized meaning (Granville & Dison, 2005). This notion highlights the cognitive functions of academic literacy as deeply related to disciplinary epistemologies and identities, widely shaped by discipline-specific ways of knowing, doing and thinking (Moje, 2015). As Shanahan (2012, p. 70) notes, ‘approaching a text with a particular point of view affects how individuals read and learn from texts’; academic literacy development is an integrated language and cognitive endeavor.

One typical example of this approach is the Cognitive Academic Language Learning Approach (CALLA) initiated by Chamot and O’Malley (1987), which proposes a curriculum that integrates mainstream content subject learning, academic language development and learning strategy instruction. This approach embeds language learning within an authentic, purposeful content learning environment, facilitated by strategy training. Another example is the Sheltered Instruction Observation Protocol (SIOP) model developed by Echevarría et al. (2013). Sheltered instruction, according to Short et al. (2011, p. 364), refers to ‘a subject class such as mathematics, science, or history taught through English wherein many or all of the students are second language learners’. This approach integrates language and content learning and highlights language learning for subject-matter learning purposes (Allison & Harklau, 2010). More specifically, the SIOP model promotes the use of instructional scaffolding to make content comprehensible while advancing students’ skills in a new language (Echevarría et al., 2013, p. 18). Over the past decade, this notion of integrating language and cognitive development within the disciplines has gradually gained prominence in bilingual and multilingual education (Goldenberg, 2010).

Complementary to the language and cognitive approaches, the sociocultural approach contends that literacy learning is a social issue, widely shaped by power, structure and ideology (Gee, 2015; Lea & Street, 2006). This approach highlights the role of learner agency and identity in transforming individual and community learning practices (Lillis & Scott, 2007). Academic literacy in this sense is viewed as a sociocultural construct imbued with meaning, value and emotions, serving as a gateway to social access, power distribution and meaning reconstruction (Moje et al., 2008).

However, despite these various approaches to academic literacy teaching and learning, there is still a paucity of research that integrates these dimensions into effective intervention and research practices. Current research on academic literacy development takes either an interventionist or a descriptive approach. The former usually takes place within a concrete educational setting, with the intention of uncovering effective community teaching and learning practices (Engestrom, 1999). The latter, by contrast, often takes a more naturalistic or ethnographic approach, in the hope of providing an in-depth account of individual and community learning practices (Lillis & Scott, 2007). These descriptions are often aligned to larger sociocultural contexts and the transformative role of learner agency in collective, object-oriented activities (Engeström, 1987; Wenger, 1998).

These different approaches to academic literacy development are influenced by the varying epistemological stances of researchers and their specific research purposes. However, all of them point to a common conception of academic literacy as a multidimensional construct, widely shaped by sociocultural and historical contexts. This complex and dynamic nature of literacy learning not only enables the constant innovation and expansion of the academic literacy construct but also opens up possibilities to challenge preconceived notions in relevant research and pedagogical practices.

Based on these concerns, this study conducts a critical review of twenty years of academic literacy research in terms of the definition of the academic literacy construct, research approaches, methodologies, settings and keywords, in the hope of uncovering possible developmental trends and their interactions. Critical reflections are drawn from this systematic review to shed light on possible future research directions.

Through this review, we intended to address the following three research questions:

1. What is the construct of academic literacy in different approaches to academic literacy research?

2. What are the possible patterns of change in terms of academic literacy research methods, approaches and settings over the past twenty years?

3. What are the main focuses of research within each approach to academic literacy development?

Methodology

The study adopts a mixed-methods design to provide a systematic review of academic literacy research over the past twenty years. The rationale for choosing mixed methods is to integrate qualitative text analysis of the features of academic literacy research with quantitative corpus analysis applied to the initial coding results, to unveil possible developmental trends.

Inclusion criteria

To locate academic literacy studies over the past twenty years, the researcher conducted a keyword search for ‘academic literacy’ across a wide range of databases in linguistics and education. For quality control, only peer-reviewed articles from the Social Sciences Citation Index (Web of Science) were selected. These initial selection criteria yielded 127 papers containing the keyword ‘academic literacy’ from a range of high-quality journals in linguistics and education, drawn from a series of databases, including: Social Science Premium Collection, ERIC (U.S. Dept. of Education), ERIC (ProQuest), Taylor & Francis Online—Journals, Linguistics and Language Behavior Abstracts, Informa—Taylor & Francis (CrossRef), Arts & Humanities Citation Index (Web of Science), ScienceDirect Journals (Elsevier), ScienceDirect (Elsevier B.V.), Elsevier (CrossRef), ProQuest Education Journals, Sage Journals (Sage Publications), International Bibliography of the Social Sciences, JSTOR Archival Journals, and Wiley Online Library. Among the results, papers from Journal of Second Language Writing, Language and Education, English for Specific Purposes, Teaching in Higher Education, Journal of English for Academic Purposes and Higher Education Research & Development are the most frequent.

Based on these initial results, the study conducted a second, more detailed round of sample selection. The researcher manually excluded irrelevant papers: review articles, papers written in languages other than English, and papers not directly related to literacy learning in educational settings. After this second round of selection, a final database of 94 high-quality papers on academic literacy research spanning 2002 to 2019 was generated. Note that only research published before October 2019 (the time of observation for this study) was included, which leads to a slight undercount for that year.

Coding procedure

Coding was conducted from multiple perspectives. Firstly, the study specified three different approaches to academic literacy research based on their different understandings and conceptualizations of the construct (Allison & Harklau, 2010). Based on this initial classification, the study then conducted a new round of coding of the definitions of academic literacy, research methods, and settings within each approach to look for possible interactions. Finally, a quantitative keyword frequency analysis was conducted for each approach to reveal possible similarities and differences in research focus. The specific coding criteria are as follows.

Firstly, drawing from Allison and Harklau (2010), the study classified all the studies in the database into three broad categories: language, disciplinary and sociocultural. While the language approach mainly focuses on the development of general or discipline-specific academic language features (Hyland, 2008), the disciplinary approach views academic literacy development as deeply embedded in the inquiry into discipline-specific values, cultures and epistemologies, achievable only via individuals’ active engagement in disciplinary learning and inquiry practices (Moje, 2015). The sociocultural approach, largely influenced by the ‘academic literacies’ model (Lea & Street, 1998), contends that academic literacy development entails more than individual socialization into the academic community and is also closely related to issues such as power, identity and epistemology (Gee, 2015; Lillis, 2008).

Based on this initial coding, the study then identified the research methods of all studies within each approach as quantitative, qualitative or mixed methods. Drawing from Creswell (2014), quantitative research is defined as ‘an approach for testing objective theories by examining the relationship among variables’ (p. 3), where data is quantified and analyzed using specific statistical procedures. In academic literacy studies, this approach is often closely associated with corpus-driven text analysis, developmental studies, academic language assessment or large-scale intervention studies. It is particularly useful for unveiling the developmental effects of interventions but may fall short of accounting for the process of development, which is often highly idiosyncratic and contextual. The use of qualitative methods can to some extent address this concern, as they often intend to explore the deep, contextualized meanings that individuals or groups ascribe to a social problem (Creswell, 2014). Drawing from the notion of literacy learning as a social practice, qualitative methods, and especially linguistic ethnographies, were highly encouraged in early academic literacy studies for their potential to provide detailed descriptions of a phenomenon through prolonged engagement (Lillis, 2008). In complement, mixed methods integrate both quantitative and qualitative data to ‘provide a more complete understanding of a research problem than either approach alone’ (Creswell, 2014, p. 3). This approach holds great potential in academic literacy research, as it can align teaching and learning processes with developmental outcomes, which not only preserves the contextualized and practice-oriented nature of academic literacy research but also makes results more generalizable.

Secondly, the study classified all the studies into two types: interventionist and descriptive. The former entails an intentional pedagogical intervention aimed at improving individual and community learning practices. The latter tends to adopt a more naturalistic approach, with the intention of unveiling the complex and dynamic interactions between academic literacy development and the wider sociocultural context. These two types complement each other in academic literacy research in real educational settings, serving distinct purposes.

Thirdly, for a closer inspection of the context of research, the study specifies three general research settings: English as a native language (ENL), English as a second language (ESL) and English as a foreign language (EFL) (Kirkpatrick, 2007). According to Kirkpatrick (2007, p. 27), ‘ENL is spoken in countries where English is the primary language of the great majority of the population’, where ‘English is spoken and used as a native language’. ESL, in contrast, ‘is spoken in countries where English is an important and usually official language, but not the main language of the country’ (Kirkpatrick, 2007, p. 27). These are often countries formerly colonized by English-speaking countries, with diverse linguistic landscapes and complicated language policies (Broom, 2004). Language choices in these countries are therefore often closely connected to issues such as power, identity and justice, and academic literacy development serves both to guarantee social resource distribution and to empower individuals to change. Finally, ‘EFL occurs in countries where English is not actually used or spoken very much in the normal course of daily life’ (Kirkpatrick, 2007, p. 27). Within these settings, for example in China, English language education used to be pursued largely for its own sake (Wen, 2019); however, dramatic changes are now underway in pursuit of a language-content integrated curriculum that develops advanced literacy and cognitive skills (Zhang & Li, 2019; Zhang & Sun, 2014).

Finally, the study conducted a detailed keyword analysis of frequencies within each approach (language, disciplinary and sociocultural). The researcher then merged the raw frequencies of similar constructs (for example, testing and assessment, or teaching and pedagogy) to better represent the results. This analysis reveals the focus of research within each approach and helps promote further operationalization of the academic literacy construct.

The coding was conducted by two independent coders, with coder one in charge of coding all the data and coder two responsible for 30% of it. Coder one, also the main researcher, trained coder two in the coding procedures in detail, with ample practice, until the threshold of intercoder reliability was reached. Coder two then coded their 30% of the data independently, with an interrater reliability of over 80%. The coding was done in an Excel worksheet, which made data access and retrieval straightforward. The statistical software R was used for the keyword frequency analysis.
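The review does not reproduce its analysis scripts; as a rough Python illustration of the two computations described in this section, the sketch below merges near-synonymous keywords before counting frequencies and computes simple percent agreement on the doubly coded 30% sample. All records and labels here are hypothetical:

```python
from collections import Counter

# Hypothetical coded records: one dict per paper (field names invented).
papers = [
    {"keywords": ["academic writing", "assessment"], "approach": "language"},
    {"keywords": ["testing", "genre"], "approach": "language"},
    {"keywords": ["identity", "power"], "approach": "sociocultural"},
]

# Pool near-synonymous constructs before counting, as the review describes
# (e.g., 'testing' merged into 'assessment', 'pedagogy' into 'teaching').
MERGE = {"testing": "assessment", "pedagogy": "teaching"}

def keyword_frequencies(records):
    counts = Counter()
    for paper in records:
        for kw in paper["keywords"]:
            counts[MERGE.get(kw, kw)] += 1
    return counts

def percent_agreement(codes_a, codes_b):
    """Share of items on which the two coders assigned the same code."""
    assert len(codes_a) == len(codes_b)
    return sum(a == b for a, b in zip(codes_a, codes_b)) / len(codes_a)

print(keyword_frequencies(papers))
print(percent_agreement(["lang", "disc", "socio"], ["lang", "disc", "disc"]))  # ~0.67
```

Percent agreement is the simplest intercoder statistic; chance-corrected measures such as Cohen's kappa are often reported alongside it, though the paper does not specify which statistic was used.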

Data analysis in the study involves three parts: (1) specifying the construct and operationalization of academic literacy research; (2) investigating the dynamic interactions among research approaches, methods and settings; and (3) identifying the focus of research within each approach through keyword analysis. The following sections deal with these questions respectively.

Definition and operationalization of the academic literacy construct

The study extracted all the explicit definitions of academic literacy within each approach (language, disciplinary and sociocultural) and conducted a detailed thematic analysis, recategorizing them into different themes (see Table 1).

Table 1 shows that the definitions of academic literacy vary with the different conceptualizations and epistemologies of academic literacy development within each approach. For instance, the language-based approach mainly defines academic literacy in two ways: (1) language use in academic settings; and (2) the language competence required for academic study (Baumann & Graves, 2010; Sebolai, 2016). The former takes a relatively narrow view of academic literacy development as learners’ gradual appropriation of the linguistic and rhetorical features of academic discourse (Schleppegrell, 2013; Uccelli et al., 2014). The latter, in complement, frames academic literacy development in terms of content learning, that is, the kind of competence students need for academic study (Kabelo & Sebolai, 2016). Academic language learning in this sense does not serve its own sake but is considered a tool for content learning and cognitive development. Overall, the language-based approach lays much emphasis on the acquisition of academic language features, which serves as a prerequisite for learners to examine and explore the meaning-making potential of academic language (Schleppegrell, 2013).

The disciplinary-based approach, on the other hand, focuses on the integrated development of advanced language and cognitive skills within the disciplines, with language learning closely intertwined with the appropriation of discipline-specific values, cultures and practices. In this sense, academic literacy development is viewed as a dynamic process of higher-order language socialization in pursuit of deep, collaborative, contextual meaning (Lea & Street, 2006). During this process, academic literacy development goes hand in hand with cognitive development and knowledge production within the disciplines, along with learners’ gradually expanding involvement with discipline-specific ways of doing, knowing and thinking (Granville & Dison, 2005). Other studies within this approach regard academic literacy development as more than language socialization, being widely shaped and constrained by issues of power, epistemology and identity (Lea & Street, 1998). This definition is also widely used in the sociocultural approach, which regards academic literacy development as a sociocultural enterprise, closely related to the identification, reification and transformation of social practices (Wenger, 1998).

The sociocultural approach, also known as the ‘academic literacies’ model, views literacy learning at the level of power struggles, structural reconstruction and social justice (Gee, 2015). Academic literacy development in this sense is not only a shared repertoire for individual access to social communities but also a tool for emancipation and transformation, which is object-oriented, practice-driven and value-laden (Lillis & Scott, 2007).

Academic literacy research approaches, methods and settings

The study also analyzed changes in the approaches, methods and settings of academic literacy research over the past twenty years. Table 2 and Fig. 1 present the number of quantitative, qualitative and mixed-methods studies within the language-based, disciplinary-based and sociocultural approaches respectively.

Figure 1: Methods-approach interaction in academic literacy studies

Table 2 and Fig. 1 show that the research methods chosen tend to vary with the approach. To begin with, the number of qualitative studies generally surpassed the quantitative ones in both the disciplinary and sociocultural approaches, especially the latter, where qualitative studies dominated. However, their numbers tended to decrease over the past five years, giving way to the rise of mixed-methods research. This was particularly evident in the growing number of mixed-methods language and disciplinary studies observed after 2015, which may also indicate the emergence of more robust designs in relevant educational research. Finally, while the sociocultural approach was mainly characterized by qualitative research, research methods in the language approach were more evenly distributed, which can possibly be accounted for by its relatively longer research tradition and more well-established research practices.

In addition, the study also examined changes in the number of descriptive and intervention studies each year (see Table 2, Fig. 2). Results showed that: (1) there were generally more qualitative studies than quantitative ones in both the intervention and descriptive types, although their numbers decreased in the past five years, especially qualitative intervention studies; and (2) a growing number of mixed-methods intervention studies appeared in recent years. These findings echo Scammacca et al.’s (2016) review of a century of progress in reading intervention studies, which noted the emergence of more ‘standard, structured and standardized group interventions’ with ‘more robust design’ compared to earlier ‘individualized intervention case studies’ (p. 780). This trend may indicate a methodological shift towards larger-scale intervention studies based on recursive and reflective pedagogical practices. For a more detailed description of the methods-approach interaction, the study further investigated changes in the number of descriptive and intervention studies within each approach (see Table 3, Fig. 3).

Figure 2: Diachronic changes in academic literacy research methods

Figure 3: Methods-approach interaction in academic literacy studies

Table 3 suggests that while the sociocultural approach tended to be more descriptive, the language and disciplinary approaches were more likely to be interventionist. Another developmental trend was a dramatic decrease in descriptive language studies after 2015, giving way to an evident increase in intervention studies. This points to an intricate connection among academic literacy development, education and pedagogy, indicating that language socialization does not come naturally and that well-designed, explicit pedagogical interventions are often needed.

Furthermore, the study tracked diachronic changes in the settings of academic literacy research. Among the 94 selected studies, 81 took place in a higher education context, accounting for about 86% of the total. Of the 13 remaining studies, 10 took place in secondary school settings and 3 in elementary school settings. These results suggest that, to date, discussions of academic literacy development have mainly been restricted to higher education settings, closely linked to the learning of advanced language and thinking skills. Future research may also need to attend to academic literacy development in secondary and primary school settings, especially in the face of growing disciplinary learning demands on adolescents (Dyhaylongsod et al., 2015).

Finally, the study recorded the specific countries where academic literacy studies took place, among which South Africa ranks highest with 22 studies (20.95% of the total), followed by the United States (17.14%), United Kingdom (12.38%), Australia (11.43%) and China (9.64%). These results suggest that academic literacy research most often takes place in ENL or ESL settings with relatively long traditions of literacy teaching and learning and prominent demands for academic literacy development within subject areas. The study attributes the high number of academic literacy studies in the South African context to its complex linguistic realities and historical legacies, where literacy development is closely associated with issues of power, identity and equality (Broom, 2004; Lea & Street, 2006). Based on this, the study specified the approaches of academic literacy research within the ENL, ESL and EFL settings respectively (see Table 4, Fig. 4).

Figure 4: Academic literacy research settings

Table 4 shows that while ENL settings accounted for most academic literacy research, studies in ESL and EFL settings gradually increased in recent years, indicating the expanding influence of the academic literacy construct across educational settings. Another pattern was the emergence of more balanced research approaches, i.e., more evenly distributed language, disciplinary and sociocultural studies, in all three settings. This suggests increasing flexibility in recent academic literacy research aimed at addressing specific contextual issues. All these developmental trends reinforce the notion of academic literacy as a multi-dimensional construct (Cumming, 2013).

Focus of academic literacy research

To investigate the focus of academic literacy research within each approach, the study conducted a detailed keyword analysis of all studies (see Fig. 5). Results show that academic literacy development is a situated educational practice, closely linked to issues such as content learning, teacher education, assessment and pedagogy. Another feature that stands out is the frequent appearance of ‘writing’ and its related practices, such as academic writing and student writing. This suggests that, compared to reading, writing receives greater emphasis in academic literacy research, which can possibly be accounted for by the intricate connections among writing, language and content learning and the gradual shift of focus from learning to write to writing to learn in higher education settings (Prain & Hand, 2016).

Figure 5: Keywords analysis of academic literacy research

Fig. 5 shows that the different approaches have both common and distinct research focuses. For instance, the disciplinary approach is mainly characterized by content learning and the development of subject-matter knowledge and skills, with close relations to situated educational practices such as genre and pedagogy, discipline-specific teaching and learning, reading interventions and teacher education. The language approach, on the other hand, tends to be more text-oriented, focusing on the development of advanced cognitive and academic language skills, widely influenced by the notions of Systemic Functional Linguistics (SFL) and genre pedagogy. In addition, assessment and testing are key issues in the language-based approach, indicating that language testing practices today are still largely text-oriented, focusing on the acquisition of specific academic language skills. Finally, the keyword analysis for the sociocultural approach reveals its deeply held assumption of academic literacy development as a situated, complex sociocultural practice. One emerging feature is its growing attention to multilingualism, multiculturalism and international students. In an era of rapid globalization and academic exchange, academic literacy development has gradually become a global issue, as manifested in the rapid expansion of international student numbers in ENL countries (Caplan & Stevens, 2017). These students, however, often face double barriers in language and content learning, especially in terms of the advanced literacy skills required for content learning and inquiry (Okuda & Anderson, 2018). In this sense, more attention is needed to the implementation and innovation of effective community learning practices.

Data analysis in the study reveals that: (1) academic literacy development is a multidimensional construct (Cumming, 2013); (2) there is a growing number of mixed-methods intervention studies in recent years, especially within the language approach; and (3) academic literacy research is gradually expanding in ESL and EFL settings, with increasing attention to international and multilingual students. The following parts of the discussion and conclusion provide detailed analyses of these aspects.

The definition and keyword analyses of the academic literacy studies reveal that academic literacy is a multidimensional construct, embodying both textual and contextual practices and bearing great significance for individual language and cognitive development. Drawing from this, future research may need to cross disciplinary boundaries to integrate the language, disciplinary, and sociocultural aspects of academic literacy development within a holistic view of literacy teaching and learning. In this respect, academic literacy development can draw widely from research domains such as language acquisition, language socialization, genre and pedagogy, and critical literacy (Duff, 2010; Gee, 2015; Hyland, 2008; Lea & Street, 2006; Russell, 2009). Future research may need to pay more attention to these multiple aspects, which closely intertwine and mutually shape one another, in order to inform the design of effective practices.

Data analysis in the study also demonstrated the intricate connections between literacy learning and pedagogical interventions. The development of academic literacy does not come naturally but often calls for explicit instruction and interventions to address situated learning needs (Shanahan, 2012). Larger-scale interventions with more rigorous designs will be necessary in the pursuit of more effective pedagogical practices (Scammacca et al., 2016). This position, however, does not contradict the dynamic and contextual nature of academic literacy development, as more sophisticated designs can generally provide a more detailed account of the practice-driven and contextualized learning processes, which are often cyclical and recursive in nature.

Lastly, the results of the study revealed a growing trend of academic literacy research in EFL settings, especially with respect to English language learners and international students. Compared to the ENL and ESL settings, academic literacy research in EFL settings, although a relatively recent development, holds great potential. Driven by the demand to promote higher-order thinking and learning and by the need to innovate traditional form-focused, skill-based EFL pedagogy, the notion of academic literacy development as a discipline-based, socioculturally constructed, dynamic academic socialization process offers a sensible option for pedagogical innovation and curriculum development in these contexts. In this sense, the notion of academic literacy as a multidimensional construct provides a possible solution to the long-standing problems concerning the efficacy and efficiency of EFL education, the alignment of language and content learning, and the challenges in curriculum design and material development in EFL settings (Wen, 2019).

Conclusion and implications

The results of the study suggest a relatively straightforward agenda for the development of effective academic literacy pedagogies. Firstly, the study revealed an intricate connection between academic literacy development and discipline-specific knowledge construction and inquiry activities. Academic literacy development is by no means only a textual issue; it consists of agentive, scaffolded learning activities that are meaningful, purposeful, and authentic. Literacy activities such as reading and writing are, in this sense, often object-oriented, serving real knowledge production and communication needs. Therefore, effective academic literacy instruction often aligns language development with content learning within meaningful disciplinary and social inquiries.

Secondly, in an era of rapid globalization and communication, the development of academic literacy often plays a critical role in resource distribution and power reconstruction. This has led to increasing attention to the academic literacy development of international students in multilingual contexts, who often face multiple challenges in learning disciplinary literacy. However, contrary to the traditional ‘deficit model’, which seeks to remediate their supposedly ‘disadvantaged’ language backgrounds, the notion of academic literacy highlights the role of teacher and learner agency in the development of new pedagogical practices. These innovative approaches often acknowledge and build on students’ diverse language and cultural backgrounds to make literacy learning a cognitively meaningful and culturally valuable practice.

The study can shed light on future research from both an empirical and a pedagogical perspective. From an empirical perspective, future research may need to pay more attention to the multidimensionality of the construct of academic literacy. As revealed in the current study, academic literacy development embodies multiple dimensions, such as language learning, cognitive development, and social transformation. Future research may need to transcend epistemological boundaries to seek a more integrated definition of academic literacy in which language, cognitive, and social development mutually transform one another. From a pedagogical perspective, an activity-based, integrated pedagogy should be adopted for academic literacy development. In such a pedagogy, students use language to engage in authentic communication and practices relating not only to the advancement of disciplinary knowledge but also to the betterment of society. It is through these practices that students’ engagement in complex meaning-making and higher-order thinking is ensured, and that the internalization of language knowledge and the transformation of social practices gradually occur.

The study also has some limitations. Although it seeks to provide a comprehensive overview of the general trends, methods, and focuses of academic literacy research over nearly two decades, it does not examine the findings and implications of individual studies in depth. Future studies could narrow their scope of investigation to provide a more thorough analysis of specific research findings.

Availability of data and materials

The studies reviewed are listed in the reference citations in the supplementary materials.

Abbreviations

CALLA: Cognitive Academic Language Learning Approach

DDL: Data-driven learning

EAP: English for Academic Purposes

ENL: English as a native language

ESL: English as a second language

SFL: Systemic Functional Linguistics

SIOP: Sheltered Instruction Observation Protocol

Allison, H., & Harklau, L. (2010). Teaching academic literacies in secondary school. In G. Li & P. A. Edwards (Eds.), Best practices in ELL instruction. The Guilford Press.


Anthony, L. (2017). Introducing corpora and corpus tools into the technical writing classroom through Data-Driven Learning (DDL). In J. Flowerdew & T. Costley (Eds.), Discipline-specific writing: Theory into practice. Routledge.

Bailey, A. L., Butler, F. A., Stevens, R., & Lord, C. (2007). Further specifying the language demands of school. In A. L. Bailey (Ed.), The language demands of school: Putting academic English to the test. Yale University Press.

Basturkmen, H. (2017). Developing writing courses for specific academic purposes. In J. Flowerdew & T. Costley (Eds.), Discipline-specific writing: Theory into practice. Routledge.

Baumann, J. F., & Graves, M. F. (2010). What is academic vocabulary? Journal of Adolescent & Adult Literacy, 54 (1), 4–12.


Bigelow, M., & Vinogradov, P. (2011). Teaching adult second language learners who are emergent readers. Annual Review of Applied Linguistics, 31 , 120–136.

Broom, Y. (2004). Reading English in multilingual South African primary schools. International Journal of Bilingual Education and Bilingualism, 7 (6), 506–528.

Caplan, N. A., & Stevens, S. G. (2017). “Step out of the cycle”: Needs, challenges, and successes of international undergraduates at a U.S. University. English for Specific Purposes, 46 , 15–28.

Chamot, A. U., & O’Malley, J. M. (1987). The cognitive academic language learning approach: A bridge to the mainstream. TESOL Quarterly, 21 (2), 227–249.

Charles, M. (2013). English for academic purposes. In B. Paltridge & S. Starfield (Eds.), The handbook of English for specific purposes (pp. 137–155). Wiley-Blackwell.

Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches . Sage.

Cumming, A. (2013). Multiple dimensions of academic language and literacy development. Language Learning, 63 (1), 130–152.

Duff, P. A. (2010). Language socialization into academic discourse communities. Annual Review of Applied Linguistics, 30 , 169–192.

Duhaylongsod, L., Snow, C. E., Selman, R. L., & Donovan, M. S. (2015). Toward disciplinary literacy: Dilemmas and challenges in designing history curriculum to support middle school students. Harvard Educational Review, 85 (4), 587–608.

Echevarria, J., Vogt, M., & Short, D. J. (2013). Making content comprehensible for English learners: The SIOP model . Pearson Education.

Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research . Cambridge University Press.

Engeström, Y. (1999). Activity theory and individual and social transformation. In Y. Engeström, R. Miettinen, & R.-L. Punamäki (Eds.), Perspectives on activity theory. Cambridge University Press.


Fang, Z. (2012). Approaches to developing content area literacies: A synthesis and a critique. Journal of Adolescent & Adult Literacy, 56 (2), 103–108.

Fang, Z., & Coatoam, S. (2013). Disciplinary literacy: What you want to know about it. Journal of Adolescent & Adult Literacy, 56 (8), 627–632.

Fisher, R. (2019). Reconciling disciplinary literacy perspectives with genre-oriented Activity Theory: Toward a fuller synthesis of traditions. Reading Research Quarterly, 54 (2), 237–251.

Flowerdew, J. (2013). Introduction: Approaches to the analysis of academic discourse in English. In J. Flowerdew (Ed.), Academic discourse. Routledge.

Gebhard, M. (2004). Fast capitalism, school reform, and second language literacy practices. The Modern Language Journal, 88 (2), 245–265.

Gee, J. P. (2015). Literacy and education . Routledge.

Goldenberg, C. (2010). Improving achievement for English learners: Conclusions from recent reviews and emerging research. In G. Li & P. A. Edwards (Eds.), Best practices in ELL instruction (pp. 15–44). The Guilford Press.

Granville, S., & Dison, L. (2005). Thinking about thinking: Integrating self-reflection into an academic literacy course. Journal of English for Academic Purposes, 4 , 99–118.

Halliday, M. A. K. (2000). An introduction to functional grammar . Foreign Language Teaching and Research Press.

Hyland, K. (2008). Genre and academic writing in the disciplines. Language Teaching, 41 (4), 543–562.

Kirkpatrick, A. (2007). World Englishes: Implications for international communication and English language teaching . Cambridge University Press.

Lea, M. R., & Street, B. V. (1998). Student writing in higher education: An academic literacies approach. Studies in Higher Education, 23 (2), 157–172.

Lea, M. R., & Street, B. V. (2006). The “Academic Literacies” model: Theory and applications. Theory into Practice, 45 (4), 368–377.

Lillis, T. (2008). Ethnography as method, methodology, and “deep theorizing” closing the gap between text and context in academic writing research. Written Communication , 25 (3), 353–388.

Lillis, T., & Scott, M. (2007). Defining academic literacies research: Issues of epistemology, ideology and strategy. Journal of Applied Linguistics, 4 (1), 5–32.

Lillis, T., & Tuck, J. (2016). Academic literacies: A critical lens on writing and reading in the academy. In K. Hyland & P. Shaw (Eds.), The Routledge handbook of English for academic purposes (pp. 30–44). Routledge.

Lillis, T., & Turner, J. (2001). Student writing in higher education: Contemporary confusion, traditional concerns. Teaching in Higher Education , 6 (1), 57–68.

Moje, E. B. (2015). Doing and teaching adolescent literacy with adolescent learners: A social and cultural enterprise. Harvard Educational Review, 85 (2), 254–278.

Moje, E. B., Overby, M., Tysvaer, N., & Morris, K. (2008). The complex world of adolescent literacy: Myths, motivations, and mysteries. Harvard Educational Review, 78 (1), 107–154.

Okuda, T., & Anderson, T. (2018). Second language graduate students’ experiences at the writing center: A language socialization perspective. TESOL Quarterly, 52 (2), 391–413.

Prain, V., & Hand, B. (2016). Coming to know more through and from writing. Educational Researcher, 45 (7), 430–434.

Russell, D. R. (2009). Texts in contexts: Theorizing learning by looking at genre and activity. In R. Edwards, G. Biesta, & M. Thorpe (Eds.), Rethinking contexts for learning and teaching: Communities, activities and networks. Routledge.

Scammacca, N. K., Roberts, G. J., Cho, E., Williams, J., Roberts, G., Vaughn, S. R., et al. (2016). A century of progress: Reading interventions for students in grades 4–12, 1914–2014. Review of Educational Research , 86 (3), 756–800.

Schleppegrell, M. J. (2013). The role of metalanguage in supporting academic language development. Language Learning, 63 (1), 153–170.

Sebolai, K. (2016). Distinguishing between English proficiency and academic literacy in English. Language Matters, 47 (1), 45–60.

Shanahan, C. (2012). How disciplinary experts read. In T. L. Jetlon & C. Shanahan (Eds.), Adolescent literacy in the academic disciplines: General principles and practical strategies. The Guilford Press.

Shanahan, T., & Shanahan, C. (2008). Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review, 78 (1), 40–59.

Short, D. J., Echevarría, J., & Richards-Tutor, C. (2011). Research on academic literacy development in sheltered instruction classrooms. Language Teaching Research, 15 (3), 363–380.

Short, D., & Fitzsimmons, S. (2007). Double the work: Challenges and solutions to acquiring language and academic literacy for adolescent English language learners—A report to Carnegie Corporation of New York . Alliance for Excellent Education.

Street, B. (2003). What’s “new” in New Literacy Studies? Critical approaches to literacy in theory and practice. Current Issues in Comparative Education, 5 (2), 77–91.

Uccelli, P., Barr, C. D., Dobbs, C. L., Galloway, E. P., Meneses, A., & Sanchez, E. (2014). Core academic language skills: An expanded operational construct and a novel instrument to chart school-relevant language proficiency in preadolescent and adolescent learners. Applied Psycholinguistics, 36 (5), 1077–1109.

Wen, Q. (2019). Foreign language teaching theories in China in the past 70 years. Foreign Language in China, 16 (5), 14–22.

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity . Cambridge University Press.


Wingate, U. (2018). Academic literacy across the curriculum: Towards a collaborative instructional approach. Language Teaching, 51 (3), 349–364.

Zhang, L., & Li, D. (2019). An integrated development of students’ language and cognition under the CLIL pedagogy. Foreign Language Education in China, 2 (2), 16–24.

Zhang, L., & Sun, Y. (2014). A sociocultural theory-based writing curriculum reform on English majors. Foreign Language World, 5 , 2–10.

Zhao, K., & Chan, C. K. K. (2014). Fostering collective and individual learning through knowledge building. International Journal of Computer-Supported Collaborative Learning , 9 , 63–95.


Acknowledgements

I would like to express my sincere thanks to the two anonymous reviewers for their insightful comments on the original manuscript.

The study was supported by start-up research funding for young scholars at Nanjing Normal University (No. 184080H202A135).

Author information

Authors and Affiliations

School of Foreign Languages, Nanjing Normal University, Wenyuanlu #1, Qixia District, Nanjing, 210023, Jiangsu, China

Dongying Li


Contributions

All authors read and approved the final manuscript.

Corresponding author

Correspondence to Dongying Li .

Ethics declarations

Competing interests.

The author declares that there is no conflict of interest.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Li, D. A review of academic literacy research development: from 2002 to 2019. Asian. J. Second. Foreign. Lang. Educ. 7 , 5 (2022). https://doi.org/10.1186/s40862-022-00130-z


Received : 20 September 2021

Accepted : 01 February 2022

Published : 15 March 2022

DOI : https://doi.org/10.1186/s40862-022-00130-z


Keywords

  • Academic literacy
  • Academic language
  • Cognitive development
  • Intervention
  • Sociocultural context


Purdue Online Writing Lab (Purdue OWL), College of Liberal Arts

Writing a Literature Review


A literature review is a document or section of a document that collects key sources on a topic and discusses those sources in conversation with each other (also called synthesis ). The lit review is an important genre in many disciplines, not just literature (i.e., the study of works of literature such as novels and plays). When we say “literature review” or refer to “the literature,” we are talking about the research ( scholarship ) in a given field. You will often see the terms “the research,” “the scholarship,” and “the literature” used mostly interchangeably.

Where, when, and why would I write a lit review?

There are a number of different situations where you might write a literature review, each with slightly different expectations; different disciplines, too, have field-specific expectations for what a literature review is and does. For instance, in the humanities, authors might include more overt argumentation and interpretation of source material in their literature reviews, whereas in the sciences, authors are more likely to report study designs and results in their literature reviews; these differences reflect these disciplines’ purposes and conventions in scholarship. You should always look at examples from your own discipline and talk to professors or mentors in your field to be sure you understand your discipline’s conventions, for literature reviews as well as for any other genre.

A literature review can be a part of a research paper or scholarly article, usually falling after the introduction and before the research methods sections. In these cases, the lit review just needs to cover scholarship that is important to the issue you are writing about; sometimes it will also cover key sources that informed your research methodology.

Lit reviews can also be standalone pieces, either as assignments in a class or as publications. In a class, a lit review may be assigned to help students familiarize themselves with a topic and with scholarship in their field, get an idea of the other researchers working on the topic they’re interested in, find gaps in existing research in order to propose new projects, and/or develop a theoretical framework and methodology for later research. As a publication, a lit review usually is meant to help make other scholars’ lives easier by collecting and summarizing, synthesizing, and analyzing existing research on a topic. This can be especially helpful for students or scholars getting into a new research area, or for directing an entire community of scholars toward questions that have not yet been answered.

What are the parts of a lit review?

Most lit reviews use a basic introduction-body-conclusion structure; if your lit review is part of a larger paper, the introduction and conclusion pieces may be just a few sentences while you focus most of your attention on the body. If your lit review is a standalone piece, the introduction and conclusion take up more space and give you a place to discuss your goals, research methods, and conclusions separately from where you discuss the literature itself.

Introduction:

  • An introductory paragraph that explains what your working topic and thesis is
  • A forecast of key topics or texts that will appear in the review
  • Potentially, a description of how you found sources and how you analyzed them for inclusion and discussion in the review (more often found in published, standalone literature reviews than in lit review sections in an article or research paper)

Body:

  • Summarize and synthesize: Give an overview of the main points of each source and combine them into a coherent whole
  • Analyze and interpret: Don’t just paraphrase other researchers – add your own interpretations where possible, discussing the significance of findings in relation to the literature as a whole
  • Critically evaluate: Mention the strengths and weaknesses of your sources
  • Write in well-structured paragraphs: Use transition words and topic sentences to draw connections, comparisons, and contrasts.

Conclusion:

  • Summarize the key findings you have taken from the literature and emphasize their significance
  • Connect it back to your primary research question

How should I organize my lit review?

Lit reviews can take many different organizational patterns depending on what you are trying to accomplish with the review. Here are some examples:

  • Chronological : The simplest approach is to trace the development of the topic over time, which helps familiarize the audience with the topic (for instance if you are introducing something that is not commonly known in your field). If you choose this strategy, be careful to avoid simply listing and summarizing sources in order. Try to analyze the patterns, turning points, and key debates that have shaped the direction of the field. Give your interpretation of how and why certain developments occurred (as mentioned previously, this may not be appropriate in your discipline — check with a teacher or mentor if you’re unsure).
  • Thematic : If you have found some recurring central themes that you will continue working with throughout your piece, you can organize your literature review into subsections that address different aspects of the topic. For example, if you are reviewing literature about women and religion, key themes can include the role of women in churches and the religious attitude towards women.
  • Methodological : If your sources come from fields that use a variety of research methods, you can compare the results and conclusions that emerge from different approaches, for example:
      • Qualitative versus quantitative research
      • Empirical versus theoretical scholarship
      • Research divided by sociological, historical, or cultural sources
  • Theoretical : In many humanities articles, the literature review is the foundation for the theoretical framework. You can use it to discuss various theories, models, and definitions of key concepts. You can argue for the relevance of a specific theoretical approach or combine various theoretical concepts to create a framework for your research.

What are some strategies or tips I can use while writing my lit review?

Any lit review is only as good as the research it discusses; make sure your sources are well-chosen and your research is thorough. Don’t be afraid to do more research if you discover a new thread as you’re writing. More info on the research process is available in our "Conducting Research" resources .

As you’re doing your research, create an annotated bibliography (see our page on this type of document). Much of the information used in an annotated bibliography can also be used in a literature review, so you’ll not only be partially drafting your lit review as you research, but also developing your sense of the larger conversation going on among scholars, professionals, and any other stakeholders in your topic.

Usually you will need to synthesize research rather than just summarizing it. This means drawing connections between sources to create a picture of the scholarly conversation on a topic over time. Many student writers struggle to synthesize because they feel they don’t have anything to add to the scholars they are citing; here are some strategies to help you:

  • It often helps to remember that the point of these kinds of syntheses is to show your readers how you understand your research, to help them read the rest of your paper.
  • Writing teachers often say synthesis is like hosting a dinner party: imagine all your sources are together in a room, discussing your topic. What are they saying to each other?
  • Look at the in-text citations in each paragraph. Are you citing just one source for each paragraph? This usually indicates summary only. When you have multiple sources cited in a paragraph, you are more likely to be synthesizing them (not always, but often).
  • Read more about synthesis here.

The most interesting literature reviews are often written as arguments (again, as mentioned at the beginning of the page, this is discipline-specific and doesn’t work for all situations). Often, the literature review is where you can establish your research as filling a particular gap or as relevant in a particular way. You have some chance to do this in your introduction in an article, but the literature review section gives a more extended opportunity to establish the conversation in the way you would like your readers to see it. You can choose the intellectual lineage you would like to be part of and whose definitions matter most to your thinking (mostly humanities-specific, but this goes for sciences as well). In addressing these points, you argue for your place in the conversation, which tends to make the lit review more compelling than a simple reporting of other sources.


J Med Internet Res. 2021 Feb; 23(2)

Measurement of Digital Literacy Among Older Adults: Systematic Review

Sarah Soyeon Oh

1 Mo-Im Kim Nursing Research Institute, College of Nursing, Yonsei University, Seoul, Republic of Korea

Kyoung-A Kim

2 Department of Nursing, Yeoju Institute of Technology, Yeoju, Gyeonggi-do, Republic of Korea

3 College of Nursing, Yonsei University, Seoul, Republic of Korea

Sang Hui Chu

Jiyeon Choi

Associated Data

Multimedia Appendix 1: Summary of database (DB) search terms.

Multimedia Appendix 2: Crowe Critical Appraisal Tool (CCAT) form.

Background

Numerous instruments are designed to measure digital literacy among the general population. However, few studies have assessed the use and appropriateness of these measurements for older populations.

Objective

This systematic review aims to identify and critically appraise studies assessing digital literacy among older adults and to evaluate how digital literacy instruments used in existing studies address the elements of age-appropriate digital literacy using the European Commission’s Digital Competence (DigComp) Framework.

Methods

Electronic databases were searched for studies using validated instruments to assess digital literacy among older adults. The quality of all included studies was evaluated using the Crowe Critical Appraisal Tool (CCAT). Instruments were assessed according to their ability to incorporate the competence areas of digital literacy as defined by the DigComp Framework: (1) information and data literacy, (2) communication and collaboration, (3) digital content creation, (4) safety, and (5) problem-solving ability, or attitudes toward information and communication technology use.

Results

Searches yielded 1561 studies, of which 27 studies (17 cross-sectional, 2 before and after, 2 randomized controlled trials, 1 longitudinal, and 1 mixed methods) were included in the final analysis. Studies were conducted in the United States (18/27), Germany (3/27), China (1/27), Italy (1/27), Sweden (1/27), Canada (1/27), Iran (1/27), and Bangladesh (1/27). Studies mostly defined older adults as aged ≥50 years (10/27) or ≥60 years (8/27). Overall, the eHealth Literacy Scale (eHEALS) was the most frequently used instrument measuring digital literacy among older adults (16/27, 59%). Scores on the CCAT ranged from 34 (34/40, 85%) to 40 (40/40, 100%). Most instruments measured 1 or 2 of the DigComp Framework’s elements, but the Mobile Device Proficiency Questionnaire (MDPQ) measured all 5 elements, including “digital content creation” and “safety.”

Conclusions

The current digital literacy assessment instruments targeting older adults have both strengths and weaknesses, relative to their study design, administration method, and ease of use. Certain instrument modalities like the MDPQ are more generalizable and inclusive and thus, favorable for measuring the digital literacy of older adults. More studies focusing on the suitability of such instruments for older populations are warranted, especially for areas like “digital content creation” and “safety” that currently lack assessment. Evidence-based discussions regarding the implications of digitalization for the treatment of older adults and how health care professionals may benefit from this phenomenon are encouraged.

Introduction

Adopting digital technology is becoming imperative for all areas of service and business including health care. In the era of global aging, digital technology is viewed as a new opportunity to overcome various challenges associated with aging, such as reduced physical and cognitive function, multiple chronic conditions, and altered social networking [ 1 ]. Consistent with this trend, the proportion of older populations using digital technology has increased exponentially [ 2 ], although this proportion is still smaller than that of younger generations. According to the latest Digital Economy Outlook Report from the Organization for Economic Cooperation and Development (OECD), 62.8% of 55–74-year-olds are now connected to the internet, as are 96.5% of 16–24-year-olds [ 3 ].

Improving the inclusion and engagement of older adults in digital technology is becoming increasingly important for the promotion of their health and function [ 4 ]. While numerous studies have measured the digital literacy of younger generations [ 5 , 6 ], few have examined the inclusion of older adults in the research and design of digital technologies. Moreover, existing measures of digital literacy for older adults are generally focused on acceptance models and barriers to adoption [ 7 - 9 ], which fail to consider heterogeneity in user ability. As emphasized by Mannheim et al [ 10 ], designs that focus heavily on barriers may be marginalizing older adults by assuming that they are less capable of utilizing digital technologies than their younger counterparts.

For health care professionals, the rapid digitalization of social and health care services has various implications for providing older adults with improved access, knowledge, and behavior [ 11 ]. Telehealth platforms are a solution for frailer, older adults to receive medical support remotely [ 12 ], while GPS can be used to mine personalized data to locate older patients and track or predict their needs [ 13 ]. Internet use is associated with reduced likelihood of depression among the retired, and social networking sites represent an opportunity for older adults to reduce feelings of loneliness through online interactions with family and friends [ 14 ]. The increasing number of Alzheimer’s disease forums on the microblogging system, Twitter, for example, shows how social networking systems serve as a platform for older individuals to share the latest health-related information with others [ 13 ].

Quantifying the digital literacy of older adults is the first step to assist older adults to take advantage of this trend of digitalization in health care. However, when measuring digital literacy among older adults, measures must consider how basic competencies among one age cohort can be harder to achieve for another cohort with fewer information-and-communication-technology experiences and opportunities [ 15 ]. In the case of older adults, other age-related factors including life transitions, personal health, attitudes, and economic incentives must also be considered during instrument research and design [ 16 ].

To our knowledge, few systematic reviews to date have evaluated instruments of digital literacy for older adults in general, although 1 systematic review of digitally underserved populations attributed poor eHealth literacy to age, as well as language, educational attainment, residential area, and race [ 17 ]. Furthermore, the compatibility between these instruments and older adults has not been measured according to a validated framework.

Goal of This Study

Therefore, this systematic review aimed to (1) identify and critically appraise studies that involved the assessment of digital literacy among older adults and (2) evaluate how digital literacy instruments used in existing studies address the elements of age-appropriate digital literacy using the European Commission’s Digital Competence (DigComp) Framework [ 18 ]. According to DigComp, digital literacy is defined in 5 areas: (1) information and data literacy, (2) communication and collaboration, (3) digital content creation, (4) safety, and (5) problem solving [ 18 ]. For this review, we chose the DigComp over other frameworks, such as the International Computer and Information Literacy Study [ 19 ] and OECD’s Program for the International Assessment of Adult Competencies [ 20 ] because the DigComp Framework is the most generalizable across different regions [ 21 ] and age groups [ 15 ].

Search Strategy and Data Sources

This systematic review was conducted by searching multiple electronic databases according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [ 22 ]. Electronic databases and search engines employed in the initial screening period included PubMed, CINAHL, Embase, and MeSH. The combinations of search keywords for each database are summarized in a table (see Multimedia Appendix 1 ). Keywords were matched to database-specific indexing terms, and searches were not limited to a specific region or study design. However, we limited the search to studies conducted after 2009 to capture a more recent conceptualization of digital literacy.

The reference lists of identified studies were manually reviewed by a team of academics to ensure that no relevant studies were missed. EndNote X9 was used for database management.
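Electronic deduplication of records pooled from several databases typically keys on a normalized title plus year. The snippet below is a minimal sketch of that idea, not the actual EndNote procedure; the record fields and sample titles are invented.

```python
import re

# Hypothetical records exported from PubMed, CINAHL, and Embase;
# field names are illustrative, not the actual EndNote export schema.
records = [
    {"title": "eHealth literacy in older adults.", "year": 2015, "source": "PubMed"},
    {"title": "eHealth Literacy in Older Adults", "year": 2015, "source": "Embase"},
    {"title": "Mobile device proficiency and aging", "year": 2018, "source": "CINAHL"},
]

def dedup_key(record):
    # Normalize the title (lowercase, strip punctuation) so the same
    # article exported by two databases collapses to one key.
    title = re.sub(r"[^a-z0-9 ]", "", record["title"].lower()).strip()
    return (title, record["year"])

unique = {dedup_key(r): r for r in records}
print(f"{len(records)} records -> {len(unique)} after deduplication")
```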

Eligibility Criteria

We included studies that (1) were published in English, (2) targeted older adults, and (3) used a validated instrument to assess digital literacy. Publications were excluded if older adults were not the study’s main target population; studies of the general population, for example, were therefore not eligible.

Exceptions to this rule were studies that compared older populations to younger populations with the aim of addressing the age-related digital divide, like the study by Schneider and colleagues [ 23 ] comparing the digital literacy of “baby boomers” (50-65 years old) to that of millennials (18-35 years old).

Study Selection

Using these eligibility criteria, 3 independent investigators (SO, MK, and JO) examined all studies reporting the use of a digital literacy instrument in the databases and search engines. All studies were screened according to their title and excluded if the main target population did not consist of older adults.

Subsequently, abstracts were screened to exclude non-English studies and studies not assessing digital literacy through a validated instrument. During this process, studies that did not provide the required information on general characteristics were also excluded.

Last, full-text reviews were performed to ensure that all articles measured the digital literacy of older adults through validated instruments. In this process, investigator-developed questionnaires were included only if authors mentioned that they had been evaluated by experts for face validity. The instruments mentioned in each article were checked to ensure that they were accessible for our quality assessment. All processes were supervised by 2 independent reviewers (SC and JC), and any disagreement was resolved through discussions.

Data Collection

Data extracted on the general characteristics of the included studies comprised the year of publication, study design, region where the study was conducted, age of the older adults studied, and the main literacy instrument used. Regarding the region where the study was conducted, 2 studies were international collaborations: 1 between Italy and Sweden [ 24 ] and another between the United States, United Kingdom, and New Zealand [ 25 ]. For these 2 studies, the first author’s region was used in our general characteristics summary.

Quality Assessment

Three independent reviewers (SO, JC, and KK) assessed the quality of each included study using the Crowe Critical Appraisal Tool (CCAT) [ 26 ]. The CCAT is a validated quality assessment tool developed to rate research papers in systematic reviews on criteria relating to research design, variables and analysis, sampling methods, and data collection ( Multimedia Appendix 2 ) [ 26 ]. Many systematic reviews targeting older adults have used this tool for quality appraisal [ 27 , 28 ].
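The CCAT's scoring logic is simple enough to sketch: it scores eight categories (preliminaries, introduction, design, sampling, data collection, ethical matters, results, discussion, as listed later in this review) from 0 to 5 each, for a maximum of 40 points. The category scores below are invented purely to show how a total such as 34/40 (85%) arises; they are not taken from any reviewed study.

```python
# Hypothetical CCAT category scores (0-5 each, 8 categories, max 40).
ccat_scores = {
    "preliminaries": 5, "introduction": 4, "design": 4, "sampling": 4,
    "data_collection": 4, "ethical_matters": 4, "results": 5, "discussion": 4,
}
total = sum(ccat_scores.values())   # -> 34
percent = 100 * total / 40          # -> 85.0
print(f"CCAT total: {total}/40 ({percent:.0f}%)")
```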

Instruments were also assessed against the DigComp definition of the 5 areas of digital literacy: (1) information and data literacy (browsing, searching, filtering data), (2) communication and collaboration (interacting, sharing, engaging in citizenship, collaborating), (3) digital content creation (developing, integrating, and re-elaborating digital content; copyright; licenses; programming), (4) safety (protecting devices, protecting personal data and privacy, protecting health and well-being), and (5) problem solving (solving technical problems, identifying needs and technological responses, creatively using digital technologies, identifying digital competence gaps) [ 18 ].

The PRISMA flow diagram in Figure 1 summarizes the search results and the selection process of all studies included in our synthesis. Overall, 1561 records were identified through database searches (PubMed: 931; CINAHL: 147; Embase: 483), and 435 additional records were identified through other sources (MeSH: 434; hand search: 1). Of these records, 1412 remained after duplicates were electronically removed. A further 1026 articles were removed after title screening, and 308 after abstract screening.

Figure 1. PRISMA flowchart of the literature search and study selection process.

Study Characteristics

Of the 78 articles assessed for eligibility, 51 were excluded for the following reasons: (1) no instrument for digital literacy was reported despite the title or abstract alluding to measures of digital literacy (n=23); (2) the instrument presented was not validated (n=13); (3) the study was mainly a population assessment and measured digital literacy only as part of a wider assessment of multiple factors (n=10); (4) the instrument was not available in English or in a publicly accessible format (n=4); and (5) the study did not specifically target older adults (n=1). Ultimately, 27 articles were included in our review.
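As a check on the numbers reported above and in the PRISMA diagram, the selection flow can be reconstructed step by step, using only counts stated in this review:

```python
# Reconstruct the PRISMA flow from the reported counts.
identified = 1561 + 435            # database records + other sources = 1996
after_dedup = 1412                 # remaining after electronic deduplication
after_title = after_dedup - 1026   # 386 remain after title screening
assessed = after_title - 308       # 78 full texts assessed for eligibility
included = assessed - (23 + 13 + 10 + 4 + 1)  # 51 full-text exclusions
assert included == 27              # matches the 27 articles included
print(identified, after_dedup, after_title, assessed, included)
```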

Table 1 provides a general summary of the included studies. While publication years ranged from 2009 to 2020, most articles reviewed were conducted between 2015 and 2020. The majority (17/27, 63%) of included studies were cross-sectional, but 2 studies were pre- and post-test studies, 2 were randomized controlled trials (RCTs), 1 was longitudinal, and 1 was a mixed-method study with both surveys and focus group interviews. Most studies were conducted in the United States (18/27), but some studies were also conducted in Europe (Germany, 3/27; Italy, 1/27; Sweden, 1/27). Studies mostly defined older adults as aged ≥50 years (10/27) or ≥60 years (8/27).

Table 1. Summary of included studies (n=27).

a Survey and focus group interviews.

Table 2 presents the detailed characteristics of all 27 included studies. Overall, the eHealth Literacy Scale (eHEALS) [ 29 ] was the most frequently used instrument to measure digital literacy among older adults (16/27, 59%). The Unified Theory of Acceptance and Use of Technology (UTAUT) was also used by 2 studies from Germany [ 9 , 30 ] and 1 study from Bangladesh [ 31 ]. Loyd and Gressard’s Computer Attitude Scale (CAS) was used in 2 studies that focused heavily on computer anxiety and confidence [ 32 , 33 ]. There was little variation in the quality of the studies assessed via the CCAT, with scores ranging from 34 (34/40, 85%) to 40 (40/40, 100%) of a total of 40 points.

Table 2. Characteristics of included studies.

a CCAT: Crowe Critical Appraisal Tool.

b Total CCAT score is 40 points.

c MDPQ: Mobile Device Proficiency Questionnaire.

d CPQ: Computer Proficiency Questionnaire.

e ICTs: information and communication technologies.

f S-ZTPI: Swedish Zimbardo Time Perspective Inventory.

g ATTQ: Attitudes Toward Technologies Questionnaire.

h RCT: randomized controlled trial.

i APOI: Attitudes towards Psychological Online Interventions.

j UTAUT: Unified Theory of Acceptance and Use of Technology.

k CAS: Computer Attitude Scale.

l eHEALS: eHealth Literacy Scale.

m HIT: health information technology.

n TAM: Adapted Technology Acceptance Model.

o ETUQ: Everyday Technology Use Questionnaire.

p ATC/IQ: Attitudes Toward Computer/Internet Questionnaire.

As seen in Table 3 , all instruments were analyzed for quality assessment to assess which DigComp elements of digital literacy were met [ 18 ]. Studies mostly satisfied 1 or 2 aspects of the information and data literacy criteria, but the Mobile Device Proficiency Questionnaire (MDPQ) satisfied all 5 elements, including those related to safety and data creation.

Table 3. Inclusion of the European Commission’s Digital Competence (DigComp) Framework criteria and quality assessment of the included studies.

a European Commission’s Digital Competence (DigComp) Framework criteria of (1) information and data literacy (browsing, searching, filtering data), (2) communication and collaboration (interacting, sharing, engaging in citizenship, collaborating), (3) digital content creation (developing, integrating, and re-elaborating digital content; copyright; licenses; programming), (4) safety (protecting devices, protecting personal data and privacy, protecting health and well-being), and (5) problem solving (solving technical problems, identifying needs and technological responses, creatively using digital technologies, identifying digital competence gaps) [ 18 ].

b O: included in the questionnaire.

c X: not included in the questionnaire.
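The O/X coding described in these footnotes lends itself to a simple coverage matrix. The sketch below is hypothetical: only the MDPQ row mirrors the full five-element coverage reported in the text, while the eHEALS row is an assumed illustration of a typical partial profile, not the table's actual entries.

```python
# DigComp competence areas used as columns of the coverage matrix.
DIGCOMP = ["information and data literacy", "communication and collaboration",
           "digital content creation", "safety", "problem solving"]

# O = included in the questionnaire, X = not included (per the footnotes).
coverage = {
    "MDPQ": dict.fromkeys(DIGCOMP, "O"),       # reported: all 5 elements
    "eHEALS": {                                # assumed partial profile
        "information and data literacy": "O",
        "communication and collaboration": "X",
        "digital content creation": "X",
        "safety": "O",
        "problem solving": "X",
    },
}

for instrument, row in coverage.items():
    met = sum(1 for area in DIGCOMP if row[area] == "O")
    print(f"{instrument}: {met}/5 DigComp elements covered")
```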

Principal Findings

In this systematic review, we highlighted the importance of digital literacy among older adults and provided a comprehensive overview of the instruments that are being employed to measure their digital literacy. We also illustrated the various strengths and limitations of each instrument, relative to age-appropriateness and suitability for older adults, in accordance with the components of a validated, digital competency framework [ 18 ]. Our review is timely because, to the best of our knowledge, few systematic reviews to date have evaluated measurements of digital literacy for older adults specifically.

In the digital era, providing education for patients regarding management of their physical or mental illness or injury, explaining posttreatment home care needs, and managing their diet, nutrition, and exercise are all duties that are beginning to be “digitalized” [ 58 ]. Moreover, digital technologies are providing practitioners with more effective and user-centered ways to educate, inform, and treat older patients. For example, in a systematic review of “virtual visits” in home care for older patients, both service users and providers found online visits to be more flexible, easier to arrange, and more personal than offline visits [ 59 ]. In another study of an internet-based videoconferencing system for frail elderly people in Nordic countries, telehealth was associated with reduced loneliness among 88% of users, while also reducing the odds of catching a cold by sparing users trips outside the house during winter months [ 60 ].

Overall, we discovered that while the eHEALS is most frequently used to measure digital literacy among older adults, the MDPQ may be more appropriate for measuring the literacy of older adults. Unlike the eHEALS, the MDPQ attempts to measure older adults’ digital content creation capacity (developing, integrating, and re-elaborating digital content; copyright; licenses; programming), which according to the European Commission, can give valuable information regarding an individual’s ability to add value to new media for self-expression and knowledge creation [ 18 ].

Also, the MDPQ contains numerous items related to data protection and privacy such as “passwords can be created to block/unblock a mobile device” or “search history and temporary files can be deleted” despite the fact that security was the least measured element of the DigComp Framework among the instruments in our study. Only the CAS, eHEALS, and MDPQ provide items related to data protection and privacy, which is concerning given that older adults comprise a significant proportion of the target population for internet scams or email attacks [ 61 ].

In our review of the 27 selected articles, more than half (16/27, 59%) used the eHEALS to measure the digital literacy of older adults. Several reasons can be posited: the instrument is short (8 items), and the questions are simple to understand (eg, “I know how to use the Internet to answer my health questions”). Scholars claim that it is easy to administer to older adults [ 48 ]. It should be noted that, because of its simplicity, there has been some debate regarding the validity of the eHEALS [ 62 - 64 ]. As described by Jordan and colleagues [ 64 ], the eHEALS has a “lack of an explicit definition of the concept that many health literacy indices were developed to measure [which] limit[s] … its ability to make fully informed judgments … about a person’s ability to seek, understand, and use health information.”

Studies focusing on similar research aims also employ similar instruments. For example, the CAS was used in 2 studies that focused on computer anxiety and confidence. In the existing body of literature, the CAS has often been used for studies targeting individuals in highly stressful environments such as business graduate students [ 65 ], psychiatric inpatients [ 66 ], and students studying at a 2-year technical college experiencing “technostress” [ 67 ]. As explained by Kelley and Charness [ 68 ], older adults “commit more errors in post-training evaluations” than the general population, which may result in greater stress and anxiety. This may demonstrate the suitability of the CAS for older adult populations.

Regarding the overall quality of the included studies evaluated using the CCAT, some variation existed among the studies reviewed. Studies that were cross-sectional or lacked acquisition of written informed consent and used alternate approaches, such as telephone or self-reported, web-based or email surveys, scored poorly in the “design” and “ethical matters” category. Studies also lost marks if there was no flow diagram, there was no mention of design methods in the title of their manuscript, or they had biased sampling methods (convenience sampling, pertaining only to 1 or 2 ethnic groups).

By contrast, 2 RCTs in our review received a score of 100% on the CCAT, as they had excellent preliminaries, introductions, study design, sampling methods, data collection methods, ethical matters, results, and discussions. These studies employed performance-based measures like the Attitudes Toward Computer/Internet Questionnaire (ATC-IQ; semistructured interview) and the UTAUT model (face-to-face interview), which are more reliable data collection methods than self-administered questionnaires. Performance-based measures like these may be suitable for studies targeting older adults, but it should be noted that clinical environments and personal fitness can greatly influence outcomes, especially if environments contain learners of mixed ability [ 69 ], rapid progression [ 34 ], and the possibility of embarrassment or discomfort [ 70 ]. Positive clinical settings are associated with improved performance, as observed in 1 of the RCTs in our review, where “a combination of patience, perseverance, and peer-to-peer or instructor encouragement, whether with words or a pat on the shoulder” was successful in reducing older adults’ stress and anxiety during digital learning [ 32 ].

As noted above, it is important that the research and design of digital technologies for older adults encompass the heterogeneity of their capacities. While we believe that instructions should be “clear and understandable” to study participants [ 34 ], we also believe that literacy elements that are generalizable to the rest of the population (relative to communication, safety, problem solving, and competence) should be measured for this population as well. As described by Hänninen et al [ 16 ], the digital capacity of older adults lies on a continuum and can range from actively independent to limited.

Previous studies recommend that, instead of employing the full MDPQ or technology acceptance model (TAM), the shorter, 16-question version [ 34 ], or senior version of the TAM (Senior Technology Acceptance & Adoption Model), may be more appropriate for relatively older and frailer populations [ 7 ]. User-centeredness in instrument development and measurement is crucial for this population, as the functional status of older adults varies immensely. Furthermore, scales and scoring methods are encouraged to be as inclusive as possible, so that they encompass the diversity in functionality that exists among study subjects.

Limitations

Our review has several limitations. First, it is important to mention that the association between age and digital capacity is controversial among certain scholars, who argue that age-based divisions are too simplistic [ 23 ] and unclear [ 71 ] to explain the digital divide. In the Netherlands, for example, “digital natives” do not appear to exist, and other factors like life stages and socialization are considered to be more relevant proxies of digital literacy than age [ 71 ]. Also, in a German study, perceptions of threat due to technologization were the main predictors of digital capacity, rather than age itself [ 23 ]. Older adults with lower perceptions of threat could be digitally fluent, just as younger adults with higher perceptions of threat could be digitally illiterate. Future questionnaires should consider measuring this factor in depth, along with its possible interaction with age in predicting digital capacity outcomes.

Likewise, digital literacy is a process-oriented skill, and measuring it in isolation may be inaccurate for quantifying an individual’s skillset [ 72 ]. In the Lily Model, Norman and Skinner [ 72 ] posit that there are 6 core skills of literacy: traditional, media, information, computer, scientific, and health. Not only are these skills heavily interconnected with one another but also only an in-depth analysis of all 6 can fully contextualize an individual’s personal, social, and environmental contexts [ 72 ]. For example, computer literacy may be heavily influenced by an individual’s ability to understand and read passages (traditional literacy) as well as their ability to find information on a topic (information literacy) and understand certain scientific terms (science literacy). Because these literacy types are interconnected, only an in-depth analysis of all 6 may accurately measure an individual’s knowledge.

Also, as observed in our review, many of the investigated instruments, including the Attitudes Toward Technologies Questionnaire, TAM, ATC-IQ, APOI, and CAS, measured attitudes or perceptions toward technology rather than digital aptitude itself. While studies on attitudes are important, the lack of measures examining older adults’ abilities to use information and communications technology was an unexpected limitation of the studies reviewed.

Last, even though previous studies have argued that the DigComp Framework is one of the broadest and most generalizable frameworks for assessing digital literacy measures [ 15 , 21 ], certain types of survey error are more likely to occur among older populations owing to memory loss, health problems, sensory and cognitive impairments, and personal or motivational factors that influence their ability to participate in an investigation [ 73 ]. The authors and editors of this framework specifically mention in their proposal that, because they adopted a “general” rather than “individual” approach, the framework should be considered only as a starting point for interpreting digital competence among different age groups [ 18 ].

In conclusion, more studies are required so that the measurement of digital literacy among older adults can become more elaborate and specific. Digital literacy evidently has strong associations with the utility of information and communications technologies that promote physical and mental well-being among older adults. Further assessments and studies of digital literacy among older adults that overcome the limitations of existing research and measurement designs would allow for better allocation of support and resources to address the diverse health care needs of this growing but vulnerable population.

Acknowledgments

This research was supported by the Brain Korea 21 FOUR Project funded by the National Research Foundation of Korea, Yonsei University College of Nursing. This study received funding from the National Research Foundation of Korea (grant number 2020R1A6A1A0304198911 [SO, KK, and JC]), the Ministry of Education of the Republic of Korea, and Yonsei University College of Nursing Faculty Research Fund (6-2020-0188 [SHC and JC]).


Authors' Contributions: JC and SC conceptualized and supervised the study. JC, SC, and SO developed the methodology. SO, MK, and JO screened the studies and performed the formal analysis. KK and JC performed the validation. SO, KK, SC, and JC wrote, reviewed, and edited the manuscript. All authors have read and agreed to the published version of the manuscript.

Conflicts of Interest: None declared.

Dimensions of teachers’ data literacy: A systematic review of literature from 1990 to 2021

  • Open access
  • Published: 06 May 2024


  • Jihyun Lee   ORCID: orcid.org/0000-0001-5896-0686 1 ,
  • Dennis Alonzo 1 ,
  • Kim Beswick 1 ,
  • Jan Michael Vincent Abril 1 ,
  • Adrian W. Chew 1 &
  • Cherry Zin Oo 2  


The current study presents a systematic review of teachers’ data literacy, arising from a synthesis of 83 empirical studies published between 1990 and 2021. Our review identified 95 distinct indicators across five dimensions: (a) knowledge about data, (b) skills in using data, (c) dispositions towards data use, (d) data application for various purposes, and (e) data-related behaviors. Our findings indicate that teachers' data literacy goes beyond addressing the needs of supporting student learning and includes elements such as teacher reflection, collaboration, communication, and participation in professional development. Considering these findings, future policies should acknowledge the significance of teacher dispositions and behaviors in relation to data, recognizing that they are as important as knowledge and skills acquisition. Additionally, prioritizing the provision of system-level support to foster teacher collaboration within in-school professional development programs may prove useful in enhancing teachers’ data literacy.


1 Introduction

In recent years, there has been a growing recognition of the importance of teachers’ data literacy for educational policy, research, and practice. This trend was ignited in 2009 when Arne Duncan, then the US Secretary of Education, advocated evidence-driven practices in schools to enhance student performance (Mandinach & Gummer, 2016 ). Since then, there has been an increasing expectation for teachers to engage in data-informed practices to guide teaching and decision-making in schools. Following this trend, educational researchers have also increasingly directed their attention towards offering conceptual and theoretical foundations for teachers’ data literacy.

Various organizations and researchers have provided definitions of teachers’ data literacy. For example, drawing on the opinions of diverse stakeholder groups, Data Quality Campaign ( 2014 ) defined teachers’ data literacy as teachers’ capabilities to “continuously, effectively, and ethically access, interpret, act on, and communicate multiple types of data from state, local, classroom, and other sources to improve outcomes for students in a manner appropriate to educators' professional roles and responsibilities” (p. 1). Kippers et al. ( 2018 ) defined teachers’ data literacy as “educators’ ability to set a purpose, collect, analyze, and interpret data and take instructional action” (p. 21). Similarly, teachers’ data literacy has been defined as “one’s ability, or the broad knowledge and skills, needed to engage in data use or implement a data use inquiry process” (Abrams et al., 2021 , p. 100868).

The data literacy for teachers (DLFT) framework proposed by Mandinach and Gummer defined teachers’ data literacy as “… the ability to transform information into actionable instructional knowledge and practices by collecting, analyzing, and interpreting all types of data to help determine instructional steps” (Gummer & Mandinach, 2015 , p. 2). In recent years, much of the research effort to provide a theoretical framework on teachers’ data literacy has been led by Mandinach and Gummer (Gummer & Mandinach, 2015 ; Mandinach & Gummer, 2012 , 2013a , 2016 ; Mandinach et al., 2015 ). As far as we can ascertain, their work presents the most comprehensive framework of teachers’ data literacy in the current literature. The primary sources of Mandinach and Gummer’s DLFT framework were their previous works, Mandinach and Gummer ( 2012 ) and Mandinach et al. ( 2015 ). The DLFT framework was developed from an analysis of teacher licensure documents across US states (Mandinach et al., 2015 ) and a text analysis of the perspectives and definitions provided by 55 researchers and professional development providers during a brainstorming session at a conference held in 2012 (cf. Mandinach & Gummer, 2012 ). There are five components in the framework: (a) identifying problems and framing questions, (b) using data, (c) transforming data into information, (d) transforming information into decisions, and (e) evaluating outcomes. The framework aimed to identify “the specific knowledge, skills, and dispositions teachers need to use data effectively and responsibly” (Mandinach & Gummer, 2016 , p. 366). However, a potential sixth dimension, “dispositions, habits of mind, or factors that influence data use” (Mandinach & Gummer, 2016 , p. 372), was mentioned but not included in the framework.

2 The present study

In the present study, we conducted a systematic review of the empirical studies on teachers’ data literacy and data use published in academic journals between 1990 and 2021. Our primary purpose was to enhance the conceptual clarity of teachers’ data literacy by providing its updated definition, indicators, and dimensions.

We argue that there are several reasons justifying the need for this systematic review. Firstly, we update, complement, and compare our review outcomes with the DLFT framework of Mandinach and Gummer ( 2016 ). A systematic review of research studies on teachers’ data use was conducted by Mandinach and Gummer ( 2013b ), but the study selection was limited to studies published between 2001 and 2009. Therefore, one of the aims of the present study is to compare our systematic review outcomes against the dimensions and specific indicators identified in the DLFT framework (Mandinach & Gummer, 2016 ). The present literature search spans the period from 1990 to 2021. We set 1990 as the lower-boundary year because “during the 1990s, a new hypothesis – that the quality of teaching would provide a high-leverage policy target – began to gain currency” (Darling-Hammond et al., 2003 , p. 5).

Secondly, it appears that much work on teachers’ data literacy, including that of Mandinach and Gummer, has tended to focus on teachers’ data use in relation to teaching (e.g., Beck et al., 2020 ; Datnow et al., 2012 ) and instructional improvement (e.g., Datnow et al., 2021 ; Kerr et al., 2006 ; Wachen et al., 2018 ) or in relation to student academic performance (e.g., Poortman & Schildkamp, 2016 ; Staman et al., 2017 ). However, we argue that classroom teachers’ tasks and responsibilities go beyond teaching itself and include many other tasks such as advising/counselling, organising excursions, and administrative work (e.g., Albiladi et al., 2020 ; Kallemeyn, 2014 ). Our review, therefore, examines how teachers’ data use practices may be manifested across a range of teacher responsibilities beyond teaching and teaching-related tasks.

Thirdly, there has been a relative lack of attention to teachers’ personal dispositions in data literacy research. Dispositions refer to a person's inherent tendencies, attitudes, approaches, and inclinations towards ways of thinking, behaving, and believing (Lee & Stankov, 2018 ; Mischel & Shoda, 1995 ). According to Katz ( 1993 ), a disposition can be defined as “a tendency to exhibit frequently, consciously, and voluntarily a pattern of behavior that is directed to a broad goal” (p. 2). In the context of education, disposition refers to the attitudes, beliefs, and values that influence a teacher’s actions, decision-making, and interactions with various stakeholders including students, colleagues, and school leaders (Darling-Hammond et al., 2003 ). While teachers’ dispositions were mentioned in Mandinach and Gummer ( 2016 ), dispositions were not included in their DLFT framework. Teacher educators have long emphasized that accomplished teachers need to possess extensive knowledge, skills, and a range of dispositions to support the learning of all students in the classroom, engage in on-going professional development, and continuously strive to enhance their own learning throughout their careers (Darling-Hammond et al., 2003 ; Sykes, 1999 ). Therefore, we aim to identify a range of teachers’ dispositions in relation to data literacy and data use in the school contexts.

Fourthly, we argue that teachers’ data literacy may be more important in the current context of the rapidly evolving data and digital landscape influenced by the technical advancements in artificial intelligence. Teachers may encounter significant challenges in comprehending and addressing a wide array of issues, both anticipated and unforeseen, as well as observed and unobserved situations, stemming from various artificial intelligence tools and automated machines. In this sense, comprehending the nature, types, and functions of data is crucial for teachers. Without such understanding, the educational community and teaching workforce may soon find themselves in an increasingly worrisome situation when it comes to evaluating data and information.

Finally, we argue that there is a need to update conceptual clarity regarding teachers’ data literacy in the current literature. Several systematic review studies have focused on features in professional development interventions (PDIs) aimed at improving teachers’ data use in schools (e.g., Ansyari et al., 2020 ; 2022 ; Espin et al., 2021 ), emphasizing the need to understand data literacy as a continuum spanning from pre-service to in-service teachers and from novice to veteran educators (Beck & Nunnaley, 2021 ). Other systematic review studies have given substantial attention to data-based decision-making (DBDM) in the schools (e.g., Espin et al., 2021 ; Filderman et al., 2018 ; Gesel et al., 2021 ; Hoogland et al., 2016 ). For example, Hoogland et al. ( 2016 ) investigated the prerequisites for data-based decision-making (DBDM) in the classroom, highlighting nine themes that influence DBDM, such as collaboration, leadership, culture, time, and resources. These systematic reviews are highly relevant to the current review, as the PDIs, understanding the continuum, or data-based decision-making would require a clear and updated understanding of what teachers’ data literacy should be. We hope that the current study’s definition, indicators, and dimensions of teachers’ data literacy may be useful in conjunction with other systematic review studies on teachers’ data use and factors influencing teachers’ data use.

3.1 Data sources and selection of the studies

Our literature search strategies were based on the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), a framework for reporting and synthesizing literature reviews (Moher et al., 2009 ). In accordance with PRISMA suggestions, we followed four steps in locating and reviewing the relevant studies. First, we conducted initial searches to identify relevant studies, using three databases: Scopus, ProQuest, and Web of Science. Keywords in our search were teacher, school, data, data use, data literacy, evidence-based, and decision-making (see Table  1 for the detailed search strategy syntax). This initial search, using combinations of the identified keywords, yielded 2,414 journal articles (see Fig.  1 ). After removing duplicates, 1,976 articles remained.

figure 1

Study selection flow using PRISMA guidelines

Secondly, we set and applied inclusion criteria to screen the studies. The inclusion criteria were: (a) topics relating to the keywords, (b) school context of primary or secondary school settings (i.e., excluding studies focusing on university, vocational education, and adult learning), (c) full text written in English (excluding studies whose full text is in another language or where only the abstract is in English), (d) peer-reviewed empirical studies (quantitative, qualitative, or mixed-methods) published in academic journals (excluding book chapters, conference papers, and theses) to ensure the inclusion of published work that has undergone the peer-review process, and finally, (e) studies published from 1990 onwards. The titles and abstracts of the studies were reviewed to assess their eligibility against these criteria. Applying the criteria left 117 articles for the next step, full-text review.

Thirdly, we evaluated the eligibility of the full-text versions of the published studies. This full-text review resulted in a further exclusion of 34 studies as they were found to not meet all the inclusion criteria. We also examined whether the studies included data literacy or data-driven decision-making. Following these assessments, we identified 83 articles that met all the inclusion criteria.
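Taken together, the selection procedure amounts to a simple filtering funnel. The minimal sketch below illustrates that bookkeeping under our own assumptions; the record fields, helper names, and toy data are invented, and the actual screening was performed manually against titles, abstracts, and full texts.

```python
# A self-contained sketch of the screening funnel in Section 3.1.
# Record fields and sample data are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    doi: str
    year: int
    language: str
    venue_type: str       # "journal", "book_chapter", "conference", ...
    school_context: bool  # primary or secondary school setting?

def deduplicate(records):
    """Drop duplicate DOIs, keeping the first occurrence (2,414 -> 1,976)."""
    seen, unique = set(), []
    for r in records:
        if r.doi not in seen:
            seen.add(r.doi)
            unique.append(r)
    return unique

def include(r: Record) -> bool:
    """Title/abstract screen against inclusion criteria (b)-(e)."""
    return (r.school_context and r.language == "en"
            and r.venue_type == "journal" and r.year >= 1990)

hits = [  # toy hits from Scopus, ProQuest, and Web of Science
    Record("10.1/a", 2016, "en", "journal", True),
    Record("10.1/a", 2016, "en", "journal", True),    # duplicate across databases
    Record("10.1/b", 1987, "en", "journal", True),    # fails year criterion (e)
    Record("10.1/c", 2019, "en", "conference", True), # fails venue criterion (d)
]
screened = [r for r in deduplicate(hits) if include(r)]
print(len(screened))  # 1 in this toy funnel; 117 passed to full-text review
```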

Finally, we reviewed, coded, and analyzed the final set of the selected studies. The analysis approaches are described below.

3.2 Approach to analysis

We employed a thematic synthesis methodology, following the framework outlined by Thomas & Harden ( 2008 ). The coding and analysis process consisted of three main stages: (a) conducting a line-by-line reading and coding of the text, (b) identifying specific descriptive codes, and (c) generating analytical themes by grouping conceptually inter-related descriptive codes. The final analytic process was, therefore, categorizing and naming related descriptive codes to produce analytical themes. During the development of the analytic themes, we utilized an inductive approach, organizing conceptually interconnected codes into broader themes.

The first author developed the descriptive and analytical themes, which were then reviewed by two other authors. To ensure coding rigor and consistency, three authors independently coded the same two articles and then compared their coding to address any inconsistencies and reach a consensus. This process was repeated over four iterations. Once the three authors involved in the initial coding reached consensus, the remaining authors double-checked the final outputs of the thematic analysis (i.e., codes and themes). We labelled descriptive codes as ‘indicators’ of teachers' data literacy, while the broader groups of descriptive codes, referred to as analytic themes, represent ‘dimensions’ of teachers’ data literacy.
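The bookkeeping behind this code-to-theme grouping can be pictured as a simple mapping from indicators to dimensions. The sketch below is illustrative only: the indicator names echo entries in Table 3, but the specific mapping shown is our assumption, not the authors' coding sheet.

```python
# Illustrative bookkeeping for the thematic synthesis: descriptive codes
# ("indicators") are grouped inductively into analytic themes ("dimensions").
from collections import defaultdict

code_to_dimension = {
    "understanding data analysis procedures": "D1: knowledge about data",
    "selecting data appropriate for different purposes": "D2: skills in using data",
    "confidence in discussing data with colleagues": "D3: dispositions towards data use",
    "using data to inform lesson plans": "D4: data application for various purposes",
    "participating in data-focused professional development": "D5: data-related behaviors",
}

dimensions = defaultdict(list)
for indicator, dimension in code_to_dimension.items():
    dimensions[dimension].append(indicator)

for dimension in sorted(dimensions):
    print(dimension, "->", dimensions[dimension])
```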

4.1 Characteristics of the reviewed studies

The main purpose of the present study was to examine the conceptualization of teachers’ data literacy across 83 peer-reviewed empirical studies. Table 2 presents the studies included in our systematic review, along with a summary of study characteristics such as country, school level, study focus (i.e., main constructs), study purposes/objectives, research method, data collection tools, and sample size. Figure  2 presents the number of reviewed studies by publication year. We found that the number of published empirical studies on teachers' data literacy has increased since 2015.

figure 2

Number of the reviewed studies by publication year

Out of 83 studies, 50 were conducted in the United States. Thirteen studies were from the Netherlands, four from Belgium, three from Australia, two each from Canada and the United Kingdom, and one from each of the following ten countries: China, Denmark, Germany, Indonesia, Ireland, Kenya, Korea, Norway, South Africa, and Sweden. Most of the studies (58 studies, 70%) were therefore conducted in English-speaking countries. In terms of school settings, 36 studies were conducted in primary school settings, 16 in secondary school settings, and 30 in both primary and secondary school settings. The most common design was qualitative (n = 35 studies), followed by mixed methods (n = 30) and quantitative (n = 18). Multiple sources of data collection (e.g., interview and survey) were used in 22 studies. The most commonly used data collection tool was the interview (n = 55), followed by surveys (n = 37) and observation (n = 25). A smaller set of studies used focus group discussions (n = 18) and document analysis (n = 19). A few studies used students’ standardised assessment data (n = 4), field notes (n = 4), and teacher performance on a data literacy test (n = 4).

We also reviewed the study topics and found seven foci among the reviewed studies: (a) factors influencing teachers’ data use (n = 29), (b) specific practices in teachers’ data use (n = 27), (c) teachers’ data use to enhance teaching practices (n = 25), (d) teachers’ data use for various purposes (n = 24), (e) approaches to improve teachers’ data literacy (n = 22), (f) approaches to improve teachers’ assessment literacy (n = 19), and (g) teachers’ data use to improve student learning outcomes (n = 19). A single study could address multiple foci, so these counts sum to more than 83; a toy version of this multi-label tally is sketched below.
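The per-study focus lists in this sketch are invented for illustration; only the counting logic reflects how overlapping foci produce totals exceeding the number of studies.

```python
# Sketch of the multi-label tally behind the seven study foci: a single
# study may address several foci, so the focus counts sum to more than 83.
from collections import Counter

study_foci = [
    ["factors influencing data use", "specific practices in data use"],
    ["data use to enhance teaching", "data use for various purposes"],
    ["approaches to improve data literacy"],
    # ... one list per reviewed study (83 in total)
]
tally = Counter(focus for foci in study_foci for focus in foci)
for focus, n in tally.most_common():
    print(f"{focus}: {n}")
```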

4.2 Dimensions and indicators of teachers’ data literacy

Our thematic analysis identified 95 descriptive codes (see Table  3 ). Careful review of the identified descriptive codes suggested that they can be viewed as indicators of teachers’ knowledge, attitudes, behaviors, and dispositions in data use. These indicators were further organized into inter-related concepts, which formed analytic themes; we refer to these as ‘dimensions’ (see Table  3 ). There were five broad dimensions that emerged from the indicators: knowledge about data (Dimension 1), skills in using data (Dimension 2), dispositions towards data use (Dimension 3), data application for various purposes (Dimension 4), and data-related behaviors (Dimension 5).

It is worth pointing out that Dimension 1 pertains to understanding the nature of data itself, focusing on knowledge about data. Dimension 2, on the other hand, revolves around data-related skills in the actual use of data , encompassing a spectrum of sequences including data generation, processing, and production. These two dimensions, i.e., knowledge and skills, are highly interconnected and complement each other. Proficiency in data-use skills (Dimension 2) may not develop without a solid understanding of how data can be utilised, for instance, in teaching practices or school improvement (Dimension 1). Conversely, teachers' understanding of how data can enhance teaching practices (Dimension 1) can guide them in determining specific approaches to analysing particular datasets (Dimension 2). While we acknowledge the complementary nature of knowledge and skills, it is important to note that certain aspects of knowledge and skills may not completely overlap. For instance, a teacher who understands the process of creating state-level assessment data may not necessarily possess the technical expertise required to analyze state-level data, taking into account measurement errors. Therefore, we maintain knowledge and skills as two distinct dimensions to highlight both as core components of teachers’ data literacy.

Within each of the five broad dimensions, we also uncovered sub-themes to illuminate the constituent elements of those dimensions. Under Dimension 1, four sub-themes emerged: “knowledge about data”, knowledge about data for “teaching practices”, understanding “data culture in the school”, and understanding the use of “external assessment”. Dimension 2 featured sub-themes highlighting the sequential stages of data utilization: “data generation & collection”, “data analysis”, “data interpretation”, “data integration”, “evaluation”, and “reporting”. Within Dimension 3, we identified dispositions towards data use, encompassing sub-themes such as confidence, values/beliefs, trust/respect, and anxiety. Dimension 4 revealed various purposes of data applications, categorized into three sub-themes: “teaching,” “student learning,” and “school improvement.” Lastly, Dimension 5 delineated teachers’ behaviors related to data into two sub-themes: “communication & discussion” and “participation & engagement.”

In the following passages, we provide detailed descriptions of the indicators and their associated dimensions. Figure  3 presents a visual summary of these indicators and dimensions.

figure 3

A summary of the dimensions and indicators of teachers’ data literacy

4.2.1 Dimension 1. Knowledge about data

The first dimension of teachers’ data literacy pertains to teachers’ knowledge about data . Many studies recognized the importance of data-related knowledge to be utilized in the schools (e.g., Jacobs et al., 2009 ; Omoso et al., 2019 ; Schildkamp et al., 2017 ). Our review revealed four major ways that teachers' data-related knowledge can be manifested. Firstly, teachers’ knowledge about data involves their understanding of the necessary steps in data analysis procedures (Ebbeler et al., 2016 ; Snodgrass Rangel et al., 2016 ; Vanlommel et al., 2021 ; Wardrip & Herman, 2018 ) and understanding of different data types to be used for varying purposes (Abdusyakur & Poortman, 2019 ; Beck et al., 2020 ; Howley et al., 2013 ; Reeves et al., 2016 ).

Secondly, teachers’ knowledge about data involves their capability to relate the insights gleaned from data to inform their teaching practices (Abrams et al., 2016 ; Jimerson et al., 2016 ). Specifically, data-literate teachers leverage student assessment data to evaluate learning progress (Abrams et al., 2016 ; Jimerson, 2014 ; Jimerson & Wayman, 2015 ; Jimerson et al., 2016 ; Snodgrass Rangel et al., 2016 ), to tailor classroom instruction based on data insights (Mokhtari et al., 2009 ; Poortman & Schildkamp, 2016 ; Staman et al., 2017 ; van der Scheer & Visscher, 2018 ), and to ensure alignment between instructional approaches and appropriate assessment methods (Howley et al., 2013 ; Marsh & Farrell, 2015 ; van der Scheer & Visscher, 2018 ).

Thirdly, teachers’ data literacy extends to understanding of the school culture surrounding data utilization (e.g., Andersen, 2020 ; Schildkamp, 2019 ; Wachen et al., 2018 ). This encompasses recognizing the conditions that may facilitate or hinder teachers’ data use (Abdusyakur & Poortman, 2019 ; Anderson et al., 2010 ; Keuning et al., 2017 ) and navigating various challenges associated with using assessment data in the school (Datnow et al., 2012 ; Ford, 2018 ; Kanjee & Moloi, 2014 ; Thomas & Huffman, 2011 ).

Lastly, teachers’ knowledge about data includes an understanding of externally administered assessment data and data systems, such as state-level assessment policies related to data use (Copp, 2017 ; Hardy, 2019 ; Reed, 2015 ), and an understanding of the broader state-level contexts that shape data utilization within the school (Datnow et al., 2013 ; Dunn et al., 2013a ; Ford, 2018 ; Omoso et al., 2019 ; Powell et al., 2021 ). Teachers may need thorough knowledge of government education policies to ensure alignment between state-level curriculum initiatives and school-level assessment policies (Anderson et al., 2010 ; Copp, 2017 ; Gelderblom et al., 2016 ; Hardy, 2015 ).

In summary, existing literature highlights that data-literate teachers would have a comprehensive understanding of a diverse range of data sources and purposes, regularly reviewing and evaluating student outcomes from various channels. Consequently, if teachers face excessive pressure to meet accountability measures and improve standardized testing results, it could potentially hinder their overall development and growth in a broad spectrum of data-related knowledge.

4.2.2 Dimension 2. Skills in using data

Skills in using data is the second key dimension of teachers’ data literacy. A wide range of specific data skills is mentioned in the literature, spanning from data generation and collection (Farley-Ripple et al., 2019 ; Jimerson & Wayman, 2015 ) to data analysis (Farley-Ripple et al., 2019 ; Jimerson & Wayman, 2015 ; Marsh et al., 2010 ), data interpretation and integration (Jimerson & Wayman, 2015 ; Marsh et al., 2010 ), evaluation (Andersen, 2020 ; Dunn et al., 2013b ; Thomas & Huffman, 2011 ), and report writing (Farley-Ripple et al., 2019 ; Jimerson & Wayman, 2015 ). These indicators (see Table  3 ) emphasize that teachers’ data literacy requires proficiency across the entire sequence of data generation, processing, and production.

Teachers’ skills in data use also involve selecting specific data types appropriate for different purposes (Anderson et al., 2010 ; Jimerson et al., 2016 ; Kanjee & Moloi, 2014 ), analysing multiple sources of data on student learning outcomes (Datnow et al., 2012 ; Vanlommel et al., 2021 ; von der Embse et al., 2021 ), and integrating multiple data sources to arrive at a holistic assessment of student progress (Brunner et al., 2005 ; Farley-Ripple et al., 2019 ; Ford, 2018 ; Jacobs et al., 2009 ; Mausethagen et al., 2018 ). For example, teachers may need to apply different data analytic approaches when evaluating student outcomes based on school-based versus externally administered standardized assessments (Copp, 2017 ; Curry et al., 2016 ; Omoso et al., 2019 ; Wardrip & Herman, 2018 ; Zeuch et al., 2017 ). Data-literate teachers may also plan data analysis for targeted purposes, such as analyzing students’ social-emotional outcomes (Abrams et al., 2021 ; Jimerson et al., 2021 ; von der Embse et al., 2021 ; Wardrip & Herman, 2018 ), identifying individual students’ learning needs, making recommendations for curriculum revisions, or evaluating pedagogical approaches (Dunn et al., 2013a ; Snodgrass Rangel et al., 2016 ; Wolff et al., 2019 ; Young, 2006 ).

In summary, this “skills” dimension highlights the importance of teachers possessing a diverse array of competencies to leverage data effectively. The literature reviewed identified various aspects of teachers’ data use, spanning the spectrum from data collection and generation to analysis, interpretation, integration across multiple sources, evaluation, and reporting.

4.2.3 Dimension 3. Dispositions towards data use

While somewhat overlooked in data literacy literature, teachers’ disposition is a crucial component of their data literacy. Our review identified four major types of such dispositions in the context of teachers’ data literacy (see Table  3 ). Firstly, studies have underscored that teachers’ confidence in using data may be necessary when making data-driven school-level decisions, for example, to design intervention programs (Andersen, 2020 ; Keuning et al., 2017 ; Staman et al., 2017 ; Thompson, 2012 ), or to develop strategic plans for school improvement (Dunn et al., 2013b ; Poortman & Schildkamp, 2016 ). Researchers also claimed that teachers may need to feel confident in many steps of data processes, across accessing, analyzing, interpreting, evaluating, and discussing data within the school environment (Abrams et al., 2021 ; Dunn et al., 2013a ; von der Embse et al., 2021 ).

The second disposition pertains to teachers valuing and believing in the importance of data use in schools. Data-literate teachers would recognize the usefulness of data in informing school improvement and enhancing student performance (Howley et al., 2013 ; Poortman & Schildkamp, 2016 ; Prenger & Schildkamp, 2018 ). They would also place value on collaboration among colleagues and actively seek institutional support for effective data use (Kallemeyn, 2014 ; Marsh & Farrell, 2015 ; Nicholson et al., 2017 ; Poortman & Schildkamp, 2016 ). Furthermore, they would appreciate the pivotal role of school leaders in supporting and promoting teachers’ data use within the school (Albiladi et al., 2020 ; Curry et al., 2016 ; Joo, 2020 ; Young, 2006 ).

A third type of teacher disposition that our review identified is trust in and respect towards colleagues and school leaders . Teachers often work collaboratively in the school environment when they learn about and utilise school-level data. In this sense, teacher collaboration and sustaining trusting relationships are fundamental in fostering a school culture that appreciates data-driven decision-making, as well as for encouraging teachers to further develop their own data knowledge and skills (Abrams et al., 2021 ; Andersen, 2020 ; Keuning et al., 2017 ). Mutual trust and respect among teachers can allow them to have open and honest conversations about their experiences and share any concerns arising from data use in the school context (Andersen, 2020 ; Datnow et al., 2013 ; Ford, 2018 ; Wachen et al., 2018 ).

Lastly, data anxiety may play a role when teachers use or are expected to use data in the school (Abrams et al., 2021 ; Dunn et al., 2013b ; Reeves et al., 2016 ). Teachers may experience data anxiety when they are expected to effectively analyze student assessment outcomes (Dunn et al., 2013b ; Powell et al., 2021 ), when they are introduced to new data management systems in the school, when they feel pressured to quickly grasp the school’s data management system (Andersen, 2020 ; Dunn et al., 2013a ), or when they are tasked with developing specific strategies to assess and enhance student learning outcomes (Dunn et al., 2013a , b ; Jimerson et al., 2019 ). These types of teacher responsibilities demand proficient data skills and knowledge, which not all teachers may possess, and thus, anxiety may hinder their ability to further develop their data literacy.

In summary, teacher dispositions towards data use can impact their effective utilization of data or impede the capacity to further develop their own data literacy. Our review also illuminated that it is not just individual teachers’ confidence or anxiety towards data use, but also the social dynamics within the school environment, including teacher collaboration, trust and respect, and relationships with the school management team, that can influence teachers’ data literacy. Therefore, fostering a collaborative climate within the school community and creating more opportunities for data use may strengthen a data-driven culture within the school.

4.2.4 Dimension 4. Data applications for various purposes

Our review suggests that teachers' data literacy can be manifested in their use of data for multiple purposes, primarily in three areas: (a) to enhance teaching practices (e.g., Datnow et al., 2012 , 2021 ; Farrell, 2015 ; Gelderblom et al., 2016 ; Wachen et al., 2018 ), (b) to support student learning (e.g., Joo, 2020 ; Lockton et al., 2020 ; Staman et al., 2017 ; Vanlommel et al., 2021 ; van der Scheer & Visscher,  2018 ), and (c) to make plans and strategies for school improvement (e.g., Abdusyakur & Poortman, 2019 ; Jimerson et al., 2021 ; Kallemeyn, 2014 ).

With respect to teaching enhancement purposes, teachers use data to inform their lesson plans (Ford, 2018 ; Gelderblom et al., 2016 ; Snodgrass Rangel et al., 2016 ; Reeves et al., 2016 ), set lesson objectives (Kallemeyn, 2014 ; Snodgrass Rangel et al., 2016 ; Reeves et al., 2016 ), develop differentiated instruction (Beck et al., 2020 ; Datnow et al., 2012 ; Farley-Ripple et al., 2019 ), and provide feedback to students (Gelderblom et al., 2016 ; Andersen, 2020 ; Jimerson et al., 2019 ; Marsh & Farrell, 2015 ). Furthermore, teachers use data to reflect on their own teaching practices (Datnow et al., 2021 ; Ford, 2018 ; Jimerson et al., 2019 ; Snodgrass Rangel et al., 2016 ) and evaluate the impact of using data on teaching and learning outcomes (Gelderblom et al., 2016 ; Marsh & Farrell, 2015 ).

In relation to supporting student learning, teachers use data to recognize individual students’ learning needs (Curry et al., 2016 ; Gelderblom et al., 2016 ), guide students to learning new or challenging concepts (Abrams et al., 2021 ; Keuning et al., 2017 ; Marsh et al., 2010 ; Reeves et al., 2016 ), set learning goals (Abdusyakur & Poortman, 2019 ; Curry et al., 2016 ), and monitor learning progress (Curry et al., 2016 ; Gelderblom et al., 2016 ; Marsh et al., 2010 ).

In terms of guiding school improvement strategies, teachers use data to develop school-based intervention programs (Abdusyakur & Poortman, 2019 ; Jimerson et al., 2021 ; Kallemeyn, 2014 ; Thompson, 2012 ), make decisions about school directions (Huffman & Kalnin, 2003 ; Prenger & Schildkamp, 2018 ; Schildkamp, 2019 ), and evaluate school performance for meeting the accountability requirements (Hardy, 2015 ; Jacobs et al., 2009 ; Jimerson & Wayman, 2015 ; Marsh et al., 2010 ; Omoso et al., 2019 ; Snodgrass Rangel et al., 2019 ).

In summary, the literature indicates that data-literate teachers use data for multiple purposes and consider it essential in fulfilling their various roles and responsibilities within the school. Teachers’ data use for supporting student learning tends to focus primarily on helping students achieve better learning outcomes; in contrast, teachers’ data use for teaching enhancement includes a broader range of data processes and practices.

4.2.5 Dimension 5. Data-related behavior

The fifth and final dimension we identified pertains to teachers' data-related behaviors within and outside the school context. Within this dimension, there appear to be two distinctive sets of teacher behaviors: (a) teachers’ data use to enhance communication and discussion with various stakeholders such as colleagues (Datnow et al., 2013 ; Van Gasse et al., 2017 ), school leaders (Jimerson, 2014 ; Marsh & Farrell, 2015 ; Nicholson et al., 2017 ), and parents (Jimerson & Wayman, 2015 ; Jimerson et al., 2019 ); and (b) teachers’ participation in and engagement with learning about data use (Schildkamp et al., 2019 ; Wardrip & Herman, 2018 ) and data culture in schools (Datnow et al., 2021 ; Keuning et al., 2016 ). These behaviors were found to be integral aspects of teachers' data literacy. Teacher engagement with data is manifested in multiple ways, such as involvement in team-based approaches to data utilization (Michaud, 2016 ; Schildkamp et al., 2017 ; Wardrip & Herman, 2018 ; Young, 2006 ), active participation in creating a school culture of data use (Abrams et al., 2021 ; Albiladi et al., 2020 ), evaluation of the organizational culture and conditions pertaining to data use (Andersen, 2020 ; Datnow et al., 2021 ; Lockton et al., 2020 ), and participation in professional development opportunities focused on data literacy (Ebbeler et al., 2016 ; O’Brien et al., 2022 ; Schildkamp et al., 2017 ).

In summary, this dimension highlights that teachers’ data literacy includes various forms of their active engagement and behavior to enhance the effective use and understanding of data. Our findings also indicate that teacher communication and discussions regarding data primarily focus on student assessment data with various stakeholder groups including colleagues, school leaders, and parents.

5 Discussion

In the present study, we reviewed 83 empirical studies on teachers' data literacy published in peer-reviewed journals from 1990 to 2021 and identified 95 specific indicators categorized across five dimensions: (a) knowledge about data , (b) skills in using data , (c) dispositions towards data use , (d) data applications for various purposes , and (e) data-related behaviors in the school . Our review of the identified indicators has led to the following definition of teachers’ data literacy:

A set of knowledge, skills, and dispositions that empower teachers to utilize data for various purposes, including generating, collecting, analyzing, interpreting, integrating, evaluating, reporting, and communicating, aimed at enhancing teaching, supporting student learning, engaging in school improvement, and fostering self-reflection. Teachers’ data literacy also involves the appreciation for working together with colleagues and school leaders to (a) assess organizational conditions for data use, (b) foster a supportive school culture, and (c) engage in ongoing learning to optimize the effective utilization of data.

Our analysis also revealed several noteworthy findings that are presented in the following sections.

5.1 Teachers’ data literacy and assessment literacy

There have been concerns expressed by scholars about conceptual fuzziness in teachers’ data literacy and assessment literacy (cf. Brookhart, 2011 ; Ebbeler et al., 2016 ; Mandinach, 2014 ; Mandinach & Gummer, 2016 ). Indeed, student assessment data are the most salient form of data in the school (Mandinach & Schildkamp, 2021 ). The research trend of recognising the importance of teachers’ data literacy is often based on the premise that teachers’ data literacy would enhance teaching and ultimately improve student outcomes (cf. Ebbeler et al., 2016 ; Mandinach & Gummer, 2016 ; Poortman & Schildkamp, 2016 ; Thompson, 2012 ; Van Gasse et al., 2018 ; Zhao et al., 2016 ). Furthermore, the systemic pressure on schools to meet accountability requirements has also impacted their endeavors to utilize, assess, and demonstrate school performance based on student assessment data in recent years (Abdusyakur & Poortman, 2019 ; Farrell, 2015 ; Schildkamp et al., 2017 ; Weiss, 2012 ). In these contexts, it is not surprising that educational practitioners would think about student assessment data when they are expected to improve their data skills.

In this light, we tallied the teacher data literacy indicators that directly relate to student assessment or to students’ learning outcomes . In Table  3 , the symbol “⁑” marks indicators related to student assessment, and “ξ” marks indicators related to students’ learning outcomes. We found that only 19 of the 95 indicators directly related to student assessment (e.g., knowledge about different purposes of assessment, understanding the alignment between instruction and assessment, understanding state-level assessment policies on data use). Similarly, only 13 of the 95 indicators directly related to students’ learning outcomes (e.g., identifying evidence of student learning outcomes, understanding student learning outcomes using multiple sources).
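This tally is simple flag counting over the indicator table. The toy version below uses invented entries standing in for the 95 indicators; only the flags and the counting logic mirror the description above.

```python
# Toy version of the Table 3 tally: "⁑" marks indicators directly related
# to student assessment, "ξ" marks indicators directly related to students'
# learning outcomes. Counting flags over all 95 indicators gives 19 and 13.
indicators = [
    {"name": "knowledge about different purposes of assessment", "flags": {"⁑"}},
    {"name": "identifying evidence of student learning outcomes", "flags": {"ξ"}},
    {"name": "confidence in discussing data with colleagues", "flags": set()},
    # ... 95 indicators in the full table
]
n_assessment = sum("⁑" in ind["flags"] for ind in indicators)  # 19 in the review
n_outcomes = sum("ξ" in ind["flags"] for ind in indicators)    # 13 in the review
print(n_assessment, n_outcomes)
```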

Our review demonstrates that teachers regularly interact with a diverse array of data and undertake various tasks closely associated with its utilization. Therefore, teachers' data literacy encompasses more than just its use in student assessment and learning outcomes; it extends to understanding students’ social-emotional learning and higher-order thinking skills, assessing school conditions for data use, reflecting on teaching practices, and communicating with colleagues. Consequently, limiting the perspective of teachers’ data literacy solely to assessment literacy may impede their full utilization and appreciation of data applications essential to their multifaceted work in supporting and enhancing student and school outcomes.

5.2 Teachers’ data literacy and data-related dispositions

We found that one of the key aspects of teachers’ data literacy is teachers’ dispositions towards data use. As noted by Mandinach and Gummer ( 2012 , 2016 ), this aspect of teacher characteristics has not received as much research attention as data knowledge or data skills. This is perhaps due to ‘literacy’ being traditionally linked to knowledge and skills (Shavelson et al., 2005 ; also see Mandinach & Gummer, 2012 ) or due to the research trend of unpacking teachers’ needs and pedagogical approaches in specific subject/learning domains (Sykes,  1999 ; see Mandinach & Gummer, 2016 ). However, our review suggests that teacher dispositions towards data use are required in virtually all aspects of data use and data analysis processes. We also found that the most important data-related teacher disposition was confidence . The data literacy literature recognizes the importance of teacher confidence with respect to accessing, collecting, analysing, integrating, evaluating, discussing, and making decisions, suggesting that for teachers to be data literate, confidence may be required at every step of data use. Extensive research has demonstrated a strong link between confidence and learning motivation, indicating that individuals tend to gravitate towards domains in which they feel comfortable and confident (e.g., Lee & Durksen, 2018 ; Lee & Stankov, 2018 ; Stankov & Lee, 2008 ). Our review findings contribute to this existing body of research, emphasizing the importance of confidence in teachers’ data utilization. This underscores the necessity for policies and professional development initiatives aimed at enhancing teachers’ data use to also prioritize strategies for building teachers’ confidence in this area.

Our findings also indicate that teachers’ data literacy is associated with their trust in colleagues and school leaders, as well as their respect for the leadership team's role in leading data use and school improvement (Andersen, 2020 ; Ford, 2018 ; Wachen et al., 2018 ). This suggests that for teachers to be effective data users, they need to feel empowered to voice concerns and express frustrations with colleagues (Andersen, 2020 ; Ford, 2018 ; Wachen et al., 2018 ), seek help when necessary (Wardrip & Herman, 2018 ; Young, 2006 ), and collaboratively develop strategies for effective collaboration within the school (Datnow et al., 2013 ; Huffman & Kalnin, 2003 ; Michaud, 2016 ; Van Gasse et al., 2021 ).

Many teacher tasks are deeply intertwined with human relationships (Lee, 2021 ) and often completed through collaborative efforts (Li et al., 2022 ). Therefore, school leaders and policymakers may recognize that fostering teachers’ data literacy may necessitate cultivating open, honest, and trusting school environments conducive to collaboration. Notably, the social aspect of data literacy was not prominently evident in dimensions related to teachers' knowledge and skills, which suggests that teachers may enhance their knowledge and skills independently from others in the school environment. However, fostering teacher dispositions, such as active engagement in effective data use within the school, appears to be influenced by collaborative relationships with colleagues, as well as the supportive roles of school leaders.

5.3 Teachers’ data literacy and data-related behaviors

Our review showed that teachers’ data literacy goes beyond the knowledge, skills, and dispositions required to use data effectively; it also involves a range of behaviors that enhance their ways of using and learning about data. Within this dimension, we noted two sub-categories: communication/discussion and participation/engagement. One core aspect of teacher behaviors related to data was communicating with various stakeholders such as colleagues, parents, and school leaders to discuss instructional approaches (e.g., Datnow et al., 2013 ; Militello et al., 2013 ; van der Scheer & Visscher, 2018 ) and assessment results (e.g., Curry et al., 2016 ; Howley et al., 2013 ). The other aspect—participation and engagement—underscores the importance of teacher involvement in team-based learning regarding data use (e.g., Andersen, 2020 ; Young, 2006 ), active engagement in establishing conducive school conditions and fostering a culture of data use within the school community (e.g., Datnow et al., 2021 ; Keuning et al., 2016 ), and proactive participation in professional development to enhance knowledge and skills (e.g., Ebbeler et al., 2016 ; van der Scheer & Visscher, 2018 ). Existing studies on data literacy have not given substantial attention to the importance of teachers' behaviors related to data. However, we argue that teachers’ behaviors related to data deserve recognition as a distinct category within the concept of teachers’ data literacy.

Dimension 3 (teachers’ dispositions) and Dimension 5 (teachers’ behaviors) would be correlated. For example, teachers who are confident in data use may be more inclined to lead discussions with colleagues about data use in the school, and they may pursue additional learning opportunities to become effective leaders in school data use. Trust and respect within school communities, as mentioned above, would also influence how teachers behave in order to collectively enhance data literacy within the school. Studies (e.g., Ebbeler et al., 2016 ; van der Scheer & Visscher, 2018 ) have highlighted teacher participation in professional development, but there has been relatively little research attention to the collaborative nature of teacher engagement and learning within professional settings. With the rapid evolution of educational tools and applications driven by learning analytics and artificial intelligence, the influx of data generated in this new era poses a significant challenge for teachers and school leaders. Accordingly, teacher collaboration in learning about and addressing data-related challenges in schools will increasingly become a paramount concern, more so than ever before. In this regard, future policies concerning data use may prioritize the expansion of teacher collaboration and mutual learning as essential components of in-school professional development activities.

5.4 Reflections on Mandinach and Gummer’s ( 2016 ) DLFT framework

We have compared the indicators and dimensions arising from the present study and those in Mandinach and Gummer's ( 2016 ) “data literacy for teachers” (DLFT) framework. For this purpose, the conceptually similar indicators of Mandinach and Gummer ( 2016 ) are included in Table  3 alongside the corresponding indicators identified in this study. As can be seen in Table  3 , some indicators were identified in both studies, but there were also notable differences between the two sets of indicators.

Firstly, the present study yielded more fine-grained indicators across the five dimensions than those included in Mandinach and Gummer’s ( 2016 ) DLFT framework. For instance, our study identified the importance of teacher knowledge about externally administered assessments and the associated policies guiding teacher data use, which was not part of Mandinach and Gummer’s ( 2016 ) DLFT framework. Overall, the 95 indicators of the present study, compared with the 59 indicators of Mandinach and Gummer’s ( 2016 ) DLFT framework, indicate the level of detail incorporated in our framework.

Secondly, perhaps the most important discrepancy is articulated in our Dimension 3 “Dispositions towards Data Use”. We identified 25 specific indicators under this dimension, clustered into confidence, values/beliefs, trust/respect, and anxiety. These four constructs were the most prominently featured psychological dispositions when teachers deal with data in the school. In Mandinach and Gummer ( 2016 ), “Dispositions, habits of mind, or factors that influence data use” is mentioned, but they “chose not to include them in the conceptual framework… [due to the nature of] these positions as general to effective teaching, rather than specific to data use. They are likely to influence data literacy but are seen as more general habits of mind of good teaching” (p. 372). As such, their framework did not include dispositions as an integral part of teachers’ data literacy. We argue that teacher dispositions are an essential component of teachers’ data literacy. Perhaps this discrepancy arises because the teacher dispositions identified in Mandinach and Gummer ( 2016 ) are general teacher qualities – such as “belief that all students can learn”, “belief in data/think critically” and “belief that improvement in education requires a continuous inquiry cycle” (p. 372). In contrast, teachers’ dispositions in our framework are all specific to data use – such as “confidence in integrating data from multiple sources”, “confidence in discussing data with colleagues”, “trust in principals’ leadership in data use”, “trust in open and honest discussions about data use with colleagues”, and “anxiety in using data to make decisions”.

On a related point, and thirdly, our framework has two separate dimensions: one focusing on individuals’ psychological dispositions (“Dimension 3: Dispositions towards Data Use”) and the other centered on behaviors (“Dimension 5: Data-Related Behavior”). Most of the indicators under the behavioral dimension were found to involve social interactions, communication, discussion, participation, and engagement, as mentioned above. In Mandinach and Gummer ( 2016 ), psychological dispositions (such as beliefs) and behavioral tendencies (such as ethical use of data, collaboration, and communication skills) were grouped into the one dimension of “Dispositions, habits of mind, or factors that influence data use”. Considering this, it appears that there was less emphasis on dispositions and behavioral tendencies in Mandinach and Gummer ( 2016 ).

On the other hand, Mandinach and Gummer ( 2016 ) offered a fine-grained description of skill-related indicators within their DLFT framework. For example, our indicator of “selecting data appropriate for different purposes” was described with more granularity in the DLFT framework: “understand what data are appropriate”, “use qualitative and quantitative data”, “prioritize data”, and “understand specificity of data to question/problem”. Likewise, our indicator of “describing data patterns” was further divided into “understand data properties”, “access patterns and trends”, “drill down into data” and “examine data” in the DLFT framework. Additionally, two indicators within Mandinach and Gummer’s ( 2016 ) framework—“understand student privacy” and “ethical use of data, including the protection of privacy and confidentiality of data”—did not fit into any of the indicators or dimensions of the present study. This is because we were unable to locate empirical studies that directly examined ethical data management and data use among teachers. Therefore, data ethics issues, which we believe to be an important aspect of teachers’ data literacy, were omitted from our framework.

Finally, we also note the differences between the broad dimensions proposed by Mandinach and Gummer's ( 2016 ) DLFT framework and our framework. The DLFT framework consisted of: (a) identifying problems and framing questions, (b) using data, (c) transforming data into information, (d) transforming information into decisions, and (e) evaluating outcomes. These five dimensions are primarily about data skills, which was just one dimension of our framework. Furthermore, their indicator descriptions suggest heavy emphasis on data use to inform teaching and learning. In contrast, our dimensions and indicators illustrate the broader purposes and contexts of teachers' data use, highlighting the significance of fostering teacher dispositions and data-related behaviors through effective leadership and a collaborative school environment. In particular, the detailed descriptors for each of the indicators under Dimensions 3, 4, and 5 of the present study are the strengths of our framework, as they illustrate a wide range of varied and specific purposes and data-related dispositions and behaviors related to teachers’ data literacy; these descriptions are relatively sparse in Mandinach and Gummer ( 2016 ).

5.5 Limitations of the present study and future research directions

We acknowledge several limitations of the present study. First, our review focused on empirical studies published in journal articles and omitted government documents, books, book chapters, and publications by professional organizations. Second, we did not differentiate the studies based on in-service vs. pre-service teachers. Future studies may look into potential differences between these two groups and suggest policy directions and strategies for teacher preparation. Third, teachers may possess discipline-specific capabilities and inclinations, and thus it may be worthwhile to identify teacher characteristics across different disciplines (e.g., Science vs. English) and examine the influence of discipline contexts on teachers’ data use and data literacy. Fourth, exploring the teacher data literacy required for students at different levels of schooling (e.g., early childhood, primary, and secondary) and for students with diverse needs (e.g., learning difficulties, dyslexia) may provide further insights into the specific expectations within the daily tasks and responsibilities of teachers. Fifth, most of the reviewed studies were conducted in Western or English-speaking countries, and thus our findings may have limited relevance to teacher data literacy in other world regions. Future studies may investigate cross-country characteristics of teachers’ data literacy. Sixth, our research also reveals that current studies of teachers’ data literacy have not explored the possible connections between technological advancements, particularly AI-based systems, and teachers’ data literacy. This suggests a need to investigate the link between teachers’ data literacy and their proficiency in understanding emerging technologies such as AI-based systems. It is anticipated that discussions on data ethics will emerge as a crucial aspect of teachers’ data literacy in the era of artificial intelligence (AI). Finally, our review did not include, and future reviews may examine, system-level contextual factors (e.g., digital technology infrastructure, schools’ socio-economic standing) and their influences on teacher practices in data use.

6 Conclusion

Our review of 83 empirical studies published between 1990 and 2021 produced 95 specific indicators of teachers’ data literacy. The indicators were further categorised into five dimensions: (a) knowledge about data , (b) skills in using data , (c) dispositions towards data use , (d) data applications for various purposes , and (e) data-related behaviors . Our findings suggest that teachers' data literacy encompasses more than just knowledge and skills; it also includes a wide range of dispositions and behaviors. Additionally, teacher data literacy extends beyond assessing student learning outcomes and meeting accountability requirements and includes teachers’ reflection and engagement in professional development.

Data availability

Data sharing not applicable to this article as no datasets were generated or analysed during the current study.

* indicates studies included in this systematic review

*Abdusyakur, I., & Poortman, C. L. (2019). Study on data use in Indonesian primary schools. Journal of Professional Capital and Community, 4 (3), 198-215. https://doi.org/10.1108/JPCC-11-2018-0029

*Abrams, L., Varier, D., & Jackson, L. (2016). Unpacking instructional alignment: The influence of teachers’ use of assessment data on instruction. Perspectives in Education, 34 (4), 15–28. https://doi.org/10.18820/2519593X/pie.v34i4.2

*Abrams, L. M., Varier, D., & Mehdi, T. (2021). The intersection of school context and teachers’ data use practice: Implications for an integrated approach to capacity building. Studies in Educational Evaluation, 69, 100868. https://doi.org/10.1016/j.stueduc.2020.100868

*Albiladi, W. S., Lasater, K., & Bengtson, E. (2020). Data use among principals and teachers: Divergent paths or common ground? Implications for the leadership preparation programs. Journal of School Administration Research and Development, 5 (2), 63–76.

*Andersen, I. G. (2020). What went wrong? Examining teachers’ data use and instructional decision making through a bottom-up data intervention in Denmark. International Journal of Educational Research, 102, 101585. https://doi.org/10.1016/j.ijer.2020.101585

*Anderson, S., Leithwood, K., & Strauss, T. (2010). Leading data use in schools: Organizational conditions and practices at the school and district levels. Leadership and Policy in Schools, 9 (3), 292–327. https://doi.org/10.1080/15700761003731492

Ansyari, M. F., Groot, W., & De Witte, K. (2020). Tracking the process of data use professional development interventions for instructional improvement: A systematic literature review. Educational Research Review, 31, 100362.

Ansyari, M. F., Groot, W., & De Witte, K. (2022). A systematic review and meta-analysis of data use professional development interventions. Journal of Professional Capital and Community, 7 (3), 256–289.

*Beck, J. S., Morgan, J. J., Brown, N., Whitesides, H., & Riddle, D. R. (2020). “Asking, Learning, Seeking Out”: An exploration of data literacy for teaching. The Educational Forum, 84 (2), 150–165. https://doi.org/10.1080/00131725.2020.1674438

Beck, J. S., & Nunnaley, D. (2021). A continuum of data literacy for teaching. Studies in Educational Evaluation, 69, 100871.

*Bianco, S. D. (2010). Improving student outcomes: Data-driven instruction and fidelity of implementation in a response to intervention (RTI) model. Teaching Exceptional Children Plus, 6 (5).

Brookhart, S. M. (2011). Educational assessment knowledge and skills for teachers. Educational Measurement: Issues and Practice, 30 (1), 3–12.

*Brunner, C., Fasca, C., Heinze, J., Honey, M., Light, D., Mandinach, E., & Wexler, D. (2005). Linking data and learning: The Grow Network study. Journal of Education for Students Placed at Risk (JESPAR), 10 (3), 241–267. https://doi.org/10.1207/s15327671espr1003_2

*Chen, L.-L. (2019). Enhancing teaching with effective data mining protocols. Journal of Educational Technology Systems, 47 (4), 500–512. https://doi.org/10.1177/0047239519835846

*Copp, D. T. (2016). The impact of teacher attitudes and beliefs about large-scale assessment on the use of provincial data for instructional change. Education Policy Analysis Archives, 24, 109. https://doi.org/10.14507/epaa.24.2522

*Copp, D. T. (2017). Policy incentives in Canadian large-scale assessment: How policy levers influence teacher decisions about instructional change. Education Policy Analysis Archives, 25, 115. https://doi.org/10.14507/epaa.25.3299

*Curry, K. A., Mwavita, M., Holter, A., & Harris, E. (2016). Getting assessment right at the classroom level: using formative assessment for decision making. Educational Assessment, Evaluation and Accountability, 28 (1), 89–104. https://doi.org/10.1007/s11092-015-9226-5

Darling-Hammond, L., Hightower, A. M., Husbands, J., LaFors, J. R., Young, V. M., & Christopher, C. (2003). Building instructional quality: ‘Inside-out’ and ‘outside-in’ perspectives on San Diego’s school reform. Center for the Study of Teaching and Policy, University of Washington.

Data Quality Campaign. (2014). Teacher data literacy: It's about time.

*Datnow, A., Park, V., & Kennedy-Lewis, B. (2012). High school teachers' use of data to inform instruction. Journal of Education for Students Placed at Risk (JESPAR), 17 (4), 247–265. https://doi.org/10.1080/10824669.2012.718944

*Datnow, A., Park, V., & Kennedy‐Lewis, B. (2013). Affordances and constraints in the context of teacher collaboration for the purpose of data use. Journal of Educational Administration, 51 (3), 341–362. https://doi.org/10.1108/09578231311311500

*Datnow, A., Lockton, M., & Weddle, H. (2021). Capacity building to bridge data use and instructional improvement through evidence on student thinking. Studies in Educational Evaluation, 69, 100869. https://doi.org/10.1016/j.stueduc.2020.100869

*Dunn, K. E., Airola, D. T., Lo, W.-J., & Garrison, M. (2013a). Becoming data driven: The influence of teachers’ sense of efficacy on concerns related to data-driven decision making. The Journal of Experimental Education, 81 (2), 222–241. https://doi.org/10.1080/00220973.2012.699899

*Dunn, K. E., Airola, D. T., Lo, W.-J., & Garrison, M. (2013b). What teachers think about what they can do with data: Development and validation of the data driven decision-making efficacy and anxiety inventory. Contemporary Educational Psychology, 38 (1), 87–98. https://doi.org/10.1016/j.cedpsych.2012.11.002

*Ebbeler, J., Poortman, C. L., Schildkamp, K., & Pieters, J. M. (2016). Effects of a data use intervention on educators’ use of knowledge and skills. Studies in Educational Evaluation, 48, 19–31. https://doi.org/10.1016/j.stueduc.2015.11.002

*Ebbeler, J., Poortman, C. L., Schildkamp, K., & Pieters, J. M. (2017). The effects of a data use intervention on educators’ satisfaction and data literacy. Educational Assessment, Evaluation and Accountability, 29, 83–105. https://doi.org/10.1007/s11092-016-9251-z

Espin, C. A., van den Bosch, R. M., van der Liende, M., Rippe, R. C., Beutick, M., Langa, A., & Mol, S. E. (2021). A systematic review of CBM professional development materials: Are teachers receiving sufficient instruction in data-based decision-making? Journal of Learning Disabilities, 54 (4), 256–268.

*Farley-Ripple, E., & Buttram, J. (2015). The development of capacity for data use: The role of teacher networks in an elementary school. Teachers College Record, 117 (4), 1–34.

*Farley-Ripple, E. N., Jennings, A. S., & Buttram, J. (2019). Toward a framework for classifying teachers’ use of assessment data. AERA Open, 5 (4). https://doi.org/10.1177/2332858419883571

*Farrell, C. C. (2015). Designing school systems to encourage data use and instructional improvement: A comparison of school districts and charter management organizations. Educational Administration Quarterly, 51 (3), 438–471. https://doi.org/10.1177/0013161x14539806

Filderman, M. J., Toste, J. R., Didion, L. A., Peng, P., & Clemens, N. H. (2018). Data-based decision making in reading interventions: A synthesis and meta-analysis of the effects for struggling readers. The Journal of Special Education, 52 (3), 174–187.

*Ford, T. G. (2018). Pointing teachers in the wrong direction: Understanding Louisiana elementary teachers’ use of Compass high-stakes teacher evaluation data. Educational Assessment, Evaluation and Accountability, 30 (3), 251–283. https://doi.org/10.1007/s11092-018-9280-x

*Gelderblom, G., Schildkamp, K., Pieters, J., & Ehren, M. (2016). Data-based decision making for instructional improvement in primary education. International Journal of Educational Research, 80, 1–14. https://doi.org/10.1016/j.ijer.2016.07.004

Gesel, S. A., LeJeune, L. M., Chow, J. C., Sinclair, A. C., & Lemons, C. J. (2021). A meta-analysis of the impact of professional development on teachers’ knowledge, skill, and self-efficacy in data-based decision-making. Journal of Learning Disabilities, 54 (4), 269–283.

Gummer, E., & Mandinach, E. B. (2015). Building a conceptual framework for data literacy. Teachers College Record, 117 (4), 1–22. https://www.tcrecord.org

*Hardy, I. (2015). Data, numbers and accountability: The complexity, nature and effects of data use in schools. British Journal of Educational Studies, 63 (4), 467–486. https://doi.org/10.1080/00071005.2015.1066489

*Hardy, I. (2019). Governing teachers’ work and learning through data: Australian insights. Cambridge Journal of Education, 49 (4), 501–517. https://doi.org/10.1080/0305764X.2018.1557594

*Hodges, C. A. (1996). How valid and useful are alternative assessments for decision‐making in primary grade classrooms? Reading Research and Instruction, 36 (2), 157–173. https://doi.org/10.1080/19388079709558235

Hoogland, I., Schildkamp, K., Van der Kleij, F., Heitink, M., Kippers, W., Veldkamp, B., & Dijkstra, A. M. (2016). Prerequisites for data-based decision making in the classroom: Research evidence and practical illustrations. Teaching and Teacher Education, 60, 377–386.

*Howley, M. D., Howley, A., Henning, J. E., Gilla, M. B., & Weade, G. (2013). Intersecting domains of assessment knowledge: School typologies based on interviews with secondary teachers. Educational Assessment, 18 (1), 26–48. https://doi.org/10.1080/10627197.2013.761527

*Huffman, D., & Kalnin, J. (2003). Collaborative inquiry to make data-based decisions in schools. Teaching and Teacher Education, 19 (6), 569–580. https://doi.org/10.1016/S0742-051X(03)00054-4

*Jacobs, J., Gregory, A., Hoppey, D., & Yendol-Hoppey, D. (2009). Data literacy: Understanding teachers’ data use in a context of accountability and response to intervention. Action in Teacher Education, 31 (3), 41–55. https://doi.org/10.1080/01626620.2009.10463527

*Jimerson, J. B., & Wayman, J. (2015). Professional learning for using data: Examining teacher needs and supports. Teachers College Record, 117 (4), 1–36.

*Jimerson, J. B., Cho, V., & Wayman, J. C. (2016). Student-involved data use: Teacher practices and considerations for professional learning. Teaching and Teacher Education, 60, 413–424. https://doi.org/10.1016/j.tate.2016.07.008

*Jimerson, J. B., Cho, V., Scroggins, K. A., Balial, R., & Robinson, R. R. (2019). How and why teachers engage students with data. Educational Studies, 45 (6), 667–691. https://doi.org/10.1080/03055698.2018.1509781

*Jimerson, J. B., Garry, V., Poortman, C. L., & Schildkamp, K. (2021). Implementation of a collaborative data use model in a United States context. Studies in Educational Evaluation, 69, 100866. https://doi.org/10.1016/j.stueduc.2020.100866

*Jimerson, J. B. (2014). Thinking about data: Exploring the development of mental models for “data use” among teachers and school leaders. Studies in Educational Evaluation, 42, 5–14. https://doi.org/10.1016/j.stueduc.2013.10.010

*Joo, Y. H. (2020). Promoting sustainable data-based decision-making in the Korean educational information disclosure system. Sustainability, 12 (17), 6762. https://doi.org/10.3390/su12176762

*Kallemeyn, L. M. (2014). School-level organisational routines for learning: Supporting data use. Journal of Educational Administration, 52 (4), 529–548. https://doi.org/10.1108/JEA-02-2013-0025

*Kanjee, A., & Moloi, Q. (2014). South African teachers’ use of national assessment data. South African Journal of Childhood Education, 4 (2), 24. https://doi.org/10.4102/sajce.v4i2.206

Katz, L. (1993). Dispositions as educational goals. ERIC Digest #211.

*Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112 (4), 496–520. https://doi.org/10.1086/505057

*Keuning, T., Geel, M. V., Visscher, A., Fox, J.-P., & Moolenaar, N. M. (2016). The transformation of schools’ social networks during a data-based decision making reform. Teachers College Record, 118 (9), 1–33. https://doi.org/10.1177/016146811611800908

*Keuning, T., Van Geel, M., & Visscher, A. (2017). Why a data-based decision-making intervention works in some schools and not in others. Learning Disabilities Research & Practice, 32 (1), 32-45. https://doi.org/10.1111/ldrp.12124

*Kippers, W. B., Poortman, C. L., Schildkamp, K., & Visscher, A. J. (2018). Data literacy: What do educators learn and struggle with during a data use intervention? Studies in Educational Evaluation, 56, 21–31. https://doi.org/10.1016/j.stueduc.2017.11.001

Lee, J. (2021). Teacher–student relationships and academic achievement in Confucian educational countries/systems from PISA 2012 perspectives. Educational Psychology, 41 (6), 764–785.

Lee, J., & Durksen, T. (2018). Dimensions of academic interest among undergraduate students: Passion, confidence, aspiration, and self-expression. Educational Psychology, 38 (2), 120–138. https://doi.org/10.1080/01443410.2017.1342770

Lee, J., & Stankov, L. (2018). Non-cognitive predictors of academic achievement: Evidence from TIMSS and PISA. Learning and Individual Differences, 65, 50–64.

Li, L., Zhu, H., & Li, H. (2022). School leadership enhances secondary students’ achievement in rural China through teacher commitment and collaborative culture. Frontiers in Psychology, 13 , 894688.

*Lockton, M., Weddle, H., & Datnow, A. (2020). When data don’t drive: Teacher agency in data use efforts in low-performing schools. School Effectiveness and School Improvement, 31 (2), 243–265. https://doi.org/10.1080/09243453.2019.1647442

Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47 (2), 71–85. https://doi.org/10.1080/00461520.2012.667064

Mandinach, E. B., & Gummer, E. S. (2013a). Defining data literacy: A report on a convening of experts. Journal of Educational Research and Policy Studies, 13 (2), 6–28.

Mandinach, E. B., & Gummer, E. S. (2013b). A systemic view of implementing data literacy in educator preparation. Educational Researcher, 42 (1), 30–37. https://doi.org/10.3102/0013189x12459803

Mandinach, E. B., & Gummer, E. S. (2016). What does it mean for teachers to be data literate: Laying out the skills, knowledge, and dispositions. Teaching and Teacher Education, 60 , 366–376. https://doi.org/10.1016/j.tate.2016.07.011

Mandinach, E. B., & Schildkamp, K. (2021). Misconceptions about data-based decision making in education: An exploration of the literature. Studies in Educational Evaluation, 69 , 100842. https://doi.org/10.1016/j.stueduc.2020.100842

Mandinach, E. B., Parton, B. M., Gummer, E. S., & Anderson, R. B. (2015). Responsible data use as a skill in educators’ data literacy. Phi Delta Kappan, 96 (5), 25–28.

Mandinach, E. B. (2014). What’s the difference between assessment literacy and data literacy? Presentation at the Data Quality Campaign webinar. Retrieved from http://dataqualitycampaign.org/files/events/Data%20literacy%20event%20slides.pdf

*Marsh, J. A., & Farrell, C. C. (2015). How leaders can support teachers with data-driven decision making: A framework for understanding capacity building. Educational Management Administration & Leadership, 43 (2), 269–289. https://doi.org/10.1177/1741143214537229

*Marsh, J. A., Sloan McCombs, J., & Martorell, F. (2010). How instructional coaches support data-driven decision making: Policy implementation and effects in Florida middle schools. Educational Policy, 24 (6), 872–907. https://doi.org/10.1177/0895904809341467

*Mausethagen, S., Prøitz, T., & Skedsmo, G. (2018). Teachers’ use of knowledge sources in ‘result meetings’: Thin data and thick data use. Teachers and Teaching, 24 (1), 37–49. https://doi.org/10.1080/13540602.2017.1379986

*McDonald, J. P. (2019). Toward more effective data use in teaching. Phi Delta Kappan, 100 (6), 50–54. https://doi.org/10.1177/0031721719834029

*Michaud, R. (2016). The nature of teacher learning in collaborative data teams. The Qualitative Report, 21 (3), 529–545. Retrieved from http://nsuworks.nova.edu/tqr/vol21/iss3/7

*Militello, M., Bass, L., Jackson, K., & Wang, Y. (2013). How data are used and misused in schools: Perceptions from teachers and principals. Education Sciences, 3 (2), 98–120. https://doi.org/10.3390/educsci3020098

*Mills, M., Mockler, N., Stacey, M., & Taylor, B. (2021). Teachers’ orientations to educational research and data in England and Australia: Implications for teacher professionalism. Teaching Education, 32 (1), 77–98. https://doi.org/10.1080/10476210.2020.1843617

Mischel, W., & Shoda, Y. (1995). A cognitive-affective system theory of personality: Reconceptualizing situations, dispositions, dynamics, and invariance in personality structure. Psychological Review, 102 (2), 246–268.

Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ, 339 , b2535. https://doi.org/10.1136/bmj.b2535

*Mokhtari, K., Thoma, J., & Edwards, P. (2009). How one elementary school uses data to help raise students’ reading achievement. The Reading Teacher, 63 (4), 334–337. https://www.jstor.org/stable/30249385

*Nicholson, J., Capitelli, S., Richert, A. E., Wilson, C., & Bove, C. (2017). Teacher leaders building foundations for data-informed teacher learning in one urban elementary school. The New Educator, 13 (2), 170–189. https://doi.org/10.1080/1547688X.2016.1140856

*O’Brien, S., McNamara, G., O’Hara, J., & Brown, M. (2022). Learning by doing: Evaluating the key features of a professional development intervention for teachers in data-use, as part of whole school self-evaluation process. Professional Development in Education, 48 (2), 273–297. https://doi.org/10.1080/19415257.2020.1720778

*Omoso, E., Schildkamp, K., & Pieters, J. (2019). Data use in Kenyan secondary schools. Journal of Professional Capital and Community, 4 (3), 216–231. https://doi.org/10.1108/JPCC-11-2018-0027

*Park, V., & Datnow, A. (2009). Co-constructing distributed leadership: District and school connections in data-driven decision-making. School Leadership & Management, 29 (5), 477–494. https://doi.org/10.1080/13632430903162541

*Poortman, C. L., & Schildkamp, K. (2016). Solving student achievement problems with a data use intervention for teachers. Teaching and Teacher Education, 60, 425–433. https://doi.org/10.1016/j.tate.2016.06.010

*Powell, S. R., Lembke, E. S., Ketterlin-Geller, L. R., Petscher, Y., Hwang, J., Bos, S. E., Cox, T., Hirt, S., Mason, E. N., Pruitt-Britton, T., Thomas, E., & Hopkins, S. (2021). Data-based individualisation in mathematics to support middle school teachers and their students with mathematics learning difficulty. Studies in Educational Evaluation, 69, 100897. https://doi.org/10.1016/j.stueduc.2020.100897

*Prenger, R., & Schildkamp, K. (2018). Data-based decision making for teacher and student learning: A psychological perspective on the role of the teacher. Educational Psychology, 38 (6), 734–752. https://doi.org/10.1080/01443410.2018.1426834

*Reed, D. K. (2015). Middle level teachers’ perceptions of interim reading assessments: An exploratory study of data-based decision making. RMLE Online, 38 (6), 1–13. https://doi.org/10.1080/19404476.2015.11462119

*Reeves, T. D., Summers, K. H., & Grove, E. (2016). Examining the landscape of teacher learning for data use: The case of Illinois. Cogent Education, 3 (1), 1211476. https://doi.org/10.1080/2331186X.2016.1211476

*Reeves, T. D. (2017). School level and other differences in Illinois teachers’ use of data to inform instruction. Mid-Western Educational Researcher, 29 (4), 332–354.

Schildkamp, K. (2019). Data-based decision-making for school improvement: Research insights and gaps. Educational Research, 61 (3), 257–273. https://doi.org/10.1080/00131881.2019.1625716

*Schildkamp, K., Poortman, C., Luyten, H., & Ebbeler, J. (2017). Factors promoting and hindering data-based decision making in schools. School Effectiveness and School Improvement, 28 (2), 242–258. https://doi.org/10.1080/09243453.2016.1256901

*Schildkamp, K., Smit, M., & Blossing, U. (2019). Professional development in the use of data: From data to knowledge in data teams. Scandinavian Journal of Educational Research, 63 (3), 393–411. https://doi.org/10.1080/00313831.2017.1376350

Shavelson, R. J., Ruiz-Primo, M. A., & Wiley, E. W. (2005). Windows into the mind. Higher Education, 49 (4), 413–430.

*Snodgrass Rangel, V., Monroy, C., & Bell, E. (2016). Science teachers’ data use practices: A descriptive analysis. Education Policy Analysis Archives, 24, 86. https://doi.org/10.14507/epaa.24.2348

*Snodgrass Rangel, V. S., Bell, E., & Monroy, C. (2019). Teachers’ sensemaking and data use implementation in science classrooms. Education and Urban Society, 51 (4), 526–554. https://doi.org/10.1177/0013124517727053

*Staman, L., Timmermans, A. C., & Visscher, A. J. (2017). Effects of a data-based decision making intervention on student achievement. Studies in Educational Evaluation, 55, 58–67. https://doi.org/10.1016/j.stueduc.2017.07.002

Stankov, L., & Lee, J. (2008). Confidence and cognitive test performance. Journal of Educational Psychology, 100 (4), 961–976. https://doi.org/10.1037/a0012546

Sykes, G. (1999). Teacher and student learning: Strengthening their connection. In L. Darling-Hammond & G. Sykes (Eds.), Teaching as the learning profession: Handbook of policy and practice (pp. 151–179). Jossey-Bass.

Thomas, J., & Harden, A. (2008). Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Medical Research Methodology, 8 , 45. https://doi.org/10.1186/1471-2288-8-45

*Thomas, K., & Huffman, D. (2011). Navigating the challenges of helping teachers use data to inform educational decisions. Administrative Issues Journal, 1 (2). https://doi.org/10.5929/2011.1.2.7

*Thompson, C. J. (2012). Improving the use of data in early reading intervention programs in northwest Florida. Journal of Educational Research and Practice, 2 (1), 1–14.

*van der Scheer, E. A., & Visscher, A. J. (2018). Effects of a data-based decision-making intervention for teachers on students’ mathematical achievement. Journal of Teacher Education, 69 (3), 307–320. https://doi.org/10.1177/0022487117704170

*Van Gasse, R., Vanlommel, K., Vanhoof, J., & Van Petegem, P. (2017). The impact of collaboration on teachers’ individual data use. School Effectiveness and School Improvement, 28 (3), 489–504. https://doi.org/10.1080/09243453.2017.1321555

*Van Gasse, R., Vanhoof, J., & Van Petegem, P. (2018). Instrumental, conceptual and symbolic effects of data use: The impact of collaboration and expectations. Educational Studies, 44 (5), 521–534. https://doi.org/10.1080/03055698.2017.1382328

*Van Gasse, R., Goffin, E., Vanhoof, J., & Van Petegem, P. (2021). For squad-members only! Why some teachers are more popular to interact with than others in data use. Studies in Educational Evaluation, 69, 100881. https://doi.org/10.1016/j.stueduc.2020.100881

*Vanlommel, K., & Schildkamp, K. (2019). How do teachers make sense of data in the context of high-stakes decision making? American Educational Research Journal, 56 (3), 792–821. https://doi.org/10.3102/0002831218803891

*Vanlommel, K., Vanhoof, J., & Van Petegem, P. (2016). Data use by teachers: The impact of motivation, decision-making style, supportive relationships and reflective capacity. Educational Studies, 42 (1), 36–53. https://doi.org/10.1080/03055698.2016.1148582

*Vanlommel, K., Van Gasse, R., Vanhoof, J., & Van Petegem, P. (2021). Sorting pupils into their next educational track: How strongly do teachers rely on data-based or intuitive processes when they make the transition decision? Studies in Educational Evaluation, 69, 100865. https://doi.org/10.1016/j.stueduc.2020.100865

*von der Embse, N. P., Kilgus, S. P., Eklund, K., Zahn, M., Peet, C., & Durango, S. (2021). Promoting effective decision making: Training educators to collect and use social–emotional skill assessment data to inform tier 2 interventions. School Psychology Review, 51 (5), 574–588. https://doi.org/10.1080/2372966X.2020.1827680

*Wachen, J., Harrison, C., & Cohen-Vogel, L. (2018). Data use as instructional reform: Exploring educators’ reports of classroom practice. Leadership and Policy in Schools, 17 (2), 296–325. https://doi.org/10.1080/15700763.2016.1278244

*Wardrip, P. S., & Herman, P. (2018). ‘We’re keeping on top of the students’: Making sense of test data with more informal data in a grade-level instructional team. Teacher Development, 22 (1), 31–50. https://doi.org/10.1080/13664530.2017.1308428

Weiss, J. (2012). Data for improvement, data for accountability. Teachers College Record, 114 (11), 110307.

*Wolff, A., Wermelinger, M., & Petre, M. (2019). Exploring design principles for data literacy activities to support children’s inquiries from complex data. International Journal of Human-Computer Studies, 129, 41–54. https://doi.org/10.1016/j.ijhcs.2019.03.006

*Young, V. M. (2006). Teachers’ use of data: Loose coupling, agenda setting, and team norms. American Journal of Education, 112 (4), 521–548. https://doi.org/10.1086/505058

*Zeuch, N., Förster, N., & Souvignier, E. (2017). Assessing teachers’ competencies to read and interpret graphs from learning progress assessment: Results from tests and interviews. Learning Disabilities Research & Practice, 32 (1), 61–70. https://doi.org/10.1111/ldrp.12126

*Zhao, X., Van den Heuvel-Panhuizen, M., & Veldhuis, M. (2016). Teachers’ use of classroom assessment techniques in primary mathematics education—An explorative study with six Chinese teachers. International Journal of STEM Education, 3 (1), 19. https://doi.org/10.1186/s40594-016-0051-2

Acknowledgements

This work was supported by the Faculty of Arts, Design & Architecture and the School of Education at UNSW, Sydney.

Open Access funding enabled and organized by CAUL and its Member Institutions.

Author information

Authors and Affiliations

School of Education, University of New South Wales, Sydney, NSW, Australia

Jihyun Lee, Dennis Alonzo, Kim Beswick, Jan Michael Vincent Abril & Adrian W. Chew

Yangon University of Education, Yangon, Myanmar

Cherry Zin Oo

Contributions

Jihyun Lee: Conceptualization; Data curation; Formal analysis; Funding acquisition; Investigation; Methodology; Supervision; Validation; Visualization; Writing—original draft; Writing—review & editing

Dennis Alonzo: Conceptualization; Formal analysis; Funding acquisition; Investigation; Methodology; Project administration; Resources

Kim Beswick: Conceptualization; Funding acquisition; Investigation; Writing—review & editing

Jan Michael Vincent Abril: Data curation; Validation

Adrian W. Chew: Data curation; Visualization

Cherry Zin Oo: Data curation; Formal analysis

Corresponding author

Correspondence to Jihyun Lee.

Ethics declarations

Conflict of interest statement

The authors declare that there is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Data Management Statement

The manuscript has no data attached, as it is a systematic review.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Lee, J., Alonzo, D., Beswick, K. et al. Dimensions of teachers’ data literacy: A systematic review of literature from 1990 to 2021. Educ Asse Eval Acc (2024). https://doi.org/10.1007/s11092-024-09435-8

Received: 06 November 2023

Accepted: 27 March 2024

Published: 06 May 2024

DOI: https://doi.org/10.1007/s11092-024-09435-8


Keywords: Data literacy

U.S. Literacy Rates by State 2024

The literacy rate is the number of literate persons (people able to both read and write) in a given age group, expressed as a percentage of that age group's population. The adult literacy rate measures literacy among persons aged 15 and older.
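
Written as a formula, the definition above is simply:

```latex
\text{Adult literacy rate (\%)} = \frac{\text{literate persons aged 15 and older}}{\text{population aged 15 and older}} \times 100
```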

According to the National Center for Education Statistics, about four out of five U.S. adults (79%) have medium to high English literacy skills. These literacy levels are sufficient to compare and contrast information, paraphrase, and make low-level inferences. This means that about one in five U.S. adults (21%) have low literacy skills, translating to about 43.0 million adults.

Of those who have low English literacy skills, 35% are White (2% of whom were born outside the U.S.), 23% are Black (3% born outside the U.S.), 34% are Hispanic (24% born outside the U.S.), and 8% are of other races/ethnicities. Non-U.S.-born adults make up 34% of the U.S. population with low literacy skills.
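
These national figures can be cross-checked with a few lines of arithmetic. The sketch below is illustrative only: the adult population base is not stated in the text, so it is inferred from the 21% / 43.0 million pairing.

```python
# Cross-checking the national figures quoted above.
low_literacy_share = 0.21        # share of U.S. adults with low English literacy
low_literacy_adults = 43.0e6     # stated headcount of adults with low literacy

# The adult population base implied by the 21% / 43.0 million pairing.
implied_adult_base = low_literacy_adults / low_literacy_share
print(f"Implied adult base: {implied_adult_base / 1e6:.0f} million")  # ~205 million

# Racial/ethnic composition of the 43.0 million adults with low literacy.
composition = {"White": 0.35, "Black": 0.23, "Hispanic": 0.34, "Other": 0.08}
assert abs(sum(composition.values()) - 1.0) < 1e-9  # shares sum to 100%

for group, share in composition.items():
    print(f"{group}: {share * low_literacy_adults / 1e6:.1f} million adults")
```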

Literacy rates vary greatly by state and are affected by several factors.
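
Throughout the rankings below, a state's literacy rate is simply 100 minus the share of adults lacking basic prose literacy skills. A minimal sketch of that derivation and the resulting ranking, using sample values taken from the sections that follow:

```python
# Each state's literacy rate is the complement of the share of adults
# lacking basic prose literacy skills. Sample values from the rankings below.
pct_lacking_prose_literacy = {
    "New Hampshire": 5.8,
    "Minnesota": 6.0,
    "North Dakota": 6.3,
    "New York": 22.1,
    "California": 23.1,
}

literacy_rate = {state: 100.0 - pct for state, pct in pct_lacking_prose_literacy.items()}

# Rank states from most to least literate.
for state, rate in sorted(literacy_rate.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{state}: {rate:.1f}%")
```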

States with the Highest Literacy Rates

1. New Hampshire

New Hampshire has the lowest percentage of adults lacking basic prose literacy skills (5.8%), giving it a literacy rate of 94.2%. About 36.6% of adults in New Hampshire have a Bachelor's degree or higher, the seventh-highest rate in the country, and 60.8% of the population are registered library users, also the seventh-highest share in the country.

2. Minnesota

Minnesota has the second-highest literacy rate of 94.0%, with only 6% of adults lacking basic prose literacy skills. About 69.9% of Minnesota residents are registered library users, the second-highest in the country.

3. North Dakota

North Dakota has the third-highest literacy rate of 93.7%, with 6.3% of adults lacking basic prose literacy skills. North Dakota, however, has the lowest share of registered library users in the country, at 35.9%.

4. Vermont

Vermont has the fourth-highest literacy rate of 93.4%, with 6.6% of adults lacking basic prose literacy skills. Vermont is also the fourth-most educated state in the U.S. It has the most public libraries per 100,000 people, at 29.8, and the eighth-highest share of adults with a Bachelor's degree or higher, at 36.4%.

5. South Dakota

South Dakota has the fifth-highest literacy rate of 93.0%, with 7.0% of adults lacking basic prose literacy skills. South Dakota has 16.4 public libraries per 100,000 residents, the fifth-highest in the country.

States with the Lowest Literacy Rates

1. California

With 23.1% of adults lacking basic prose literacy skills, California has the lowest literacy rate of any state, at 76.9%. The state and its Department of Education have been blamed, and sued, over the failing literacy rate by families and students who believe they are not receiving a quality education in reading and writing.

2. New York

New York has the second-highest percentage of adults lacking basic prose literacy skills, at 22.1%, for a literacy rate of 77.9%. Despite this, New York has the ninth-highest share of adults with a Bachelor's degree or higher, at 35.7%.

3. Florida

Florida has the third-highest percentage of adults lacking basic prose literacy skills, at 19.7%, for a literacy rate of 80.3%. Florida has the fewest public libraries per 100,000 residents, at 2.6.

4. Texas

Texas has the fourth-lowest literacy rate, at 81.0%, with 19.0% of adults lacking basic prose literacy skills. Texas has the fourth-fewest public libraries per 100,000 residents, at 3.2.

5. New Jersey

The fifth-lowest literacy rate in the United States is in New Jersey, where 16.9% of adults lack basic prose literacy skills and 83.1% are literate. Despite this, 38.6% of New Jersey adults have a Bachelor's degree or higher, the fifth-highest share in the country, and New Jersey also has the third-best public schools.

Sources:

  • Adult Literacy State Cards 2019 - Barbara Bush
  • National Reading Proficiency Map 2022 - Reading is Fundamental

Further reading

  1. Literacy

  2. Discovering the literacy gap: A systematic review of

    The present study is a systematic literature review of articles focusing on reading and writing interventions published in the Journal of Adolescent and Adult Literacy. Using the matrix method (Garrard, 2011), which includes detailed coding procedures, multiple coders, and inter-rater reliability checks, we reliably synthesize the ...

  3. How to Write a Literature Review

    Examples of literature reviews. Step 1 - Search for relevant literature. Step 2 - Evaluate and select sources. Step 3 - Identify themes, debates, and gaps. Step 4 - Outline your literature review's structure. Step 5 - Write your literature review.

  4. A historical review of global efforts on adult literacy education, with

    Literacy is a fundamental aspect of development and a step towards freedom and liberation from socio-economic constraints, because it enables all citizens to realise their basic right to learning. However, the global efforts which - according to a 2017 estimate of the United Nations Educational, Scientific and Cultural Organization (UNESCO) - resulted in an increase in literacy from 55.7% ...

  5. Journal of Literacy Research: Sage Journals

    The Journal of Literacy Research (JLR) is a peer-reviewed journal that has contributed to the advancement of literacy and literacy education research for over 50 years. JLR is a forum for sharing innovative research and pedagogy that considers a broad range of topics encompassing instruction and assessment, policy development, understandings of literacies, and relationships of ideology and knowledge.

  6. A review of academic literacy research development: from 2002 to 2019

    Literature review: Academic literacy, as a set of literacy skills specialized for content learning, is closely associated with individual higher-order thinking and advanced language skill development (Shanahan & Shanahan, 2008).

  7. International Literacy Statistics: A Review of Concepts, Methodology and Current Data

    ... literacy rate increased from 56% in 1950 to 70% in 1980, 75% in 1990 and 82% in 2000-2004. Worldwide, the adult literacy rate increased at a faster pace in the 1970s ...

  8. Literacy and Development: A Global Perspective

    Abstract: This study employed a research design centered on the analysis and review of data concerning literacy rates sourced from reputable databases and organizations. The primary sources of ...

  9. Literacy Research: Theory, Method, and Practice: Sage Journals

    An annual review, Literacy Research: Theory, Method, and Practice (LR:TMP) reports contemporary research and theory in literacy and literacy education reflecting the content of the Literacy Research Association (LRA) Annual Meeting. The articles in LR:TMP promote discussion and constructive critique about key areas of research, in addition to informing scholarship in the field.

  10. Literacy Rates Analysis: An International Comparison

    This paper attempts to provide a more complete reckoning of the economics of literacy rates around the world. It consists of a literature review of the economic benefits and costs of literacy rates and then presents two econometric models that attempt to predict literacy rates based on GDP, government expenditure, and several other independent variables.

  11. Writing a literature review

    A formal literature review is an evidence-based, in-depth analysis of a subject. There are many reasons for writing one and these will influence the length and style of your review, but in essence a literature review is a critical appraisal of the current collective knowledge on a subject. Rather than just being an exhaustive list of all that ...

  12. Literacy Rate and Some Variables of Primary Education - A

    Literature review: Dridi, M. (2014) showed a strong link between corruption and secondary school enrollment rates, but the relationship between ... literacy rate, number ...

  13. Literacy Rate by Country 2024

    Literacy—the ability to read and write—is arguably the single most important factor in determining a person's career arc. For those who can read and write, the range of possible vocations is vast—even highly skilled, high-paying careers are within reach.

  14. Critical digital literacies at school level: A systematic review

    The main aim of this systematic literature review (SLR) was to map out which elements or issues of critical digital literacies (CDL) relevant for school education currently feature in the academic literature and international policy documents.

  15. The Science of Reading: A Literature Review

    State-driven initiatives focused on early literacy precede the COVID-19 pandemic, but new legislation passed in 2021 demonstrates the growing momentum around literacy instruction and how it is taught. In the last ... Hanover's approach to this literature review began with a review of the secondary and peer-reviewed ...

  16. Ten Simple Rules for Writing a Literature Review

    Literature reviews are in great demand in most scientific fields. Their need stems from the ever-increasing output of scientific publications. For example, compared to 1991, in 2008 three, eight, and forty times more papers were indexed in Web of Science on malaria, obesity, and biodiversity, respectively. Given such mountains of papers, scientists cannot be expected to examine in detail every ...

  17. The province-wise literacy rate in Pakistan and its impact on the

    Factors affecting literacy rates in the literature review: Boissiere (2004) conducted research on primary school determinants and outcomes in developing countries. According to the author, in today's world merely getting children to school is not enough; the government should also ensure that children complete their primary cycle to gain the basic ...

  18. Writing a Literature Review

    A literature review is a document or section of a document that collects key sources on a topic and discusses those sources in conversation with each other (also called synthesis). The lit review is an important genre in many disciplines, not just literature (i.e., the study of works of literature such as novels ...

  19. Measurement of Digital Literacy Among Older Adults: Systematic Review

    Overall, the eHealth Literacy Scale (eHEALS) was the most frequently used instrument measuring digital literacy among older adults (16/27, 59%). Scores on the CCAT ranged from 34 (34/40, 85%) to 40 (40/40, 100%). Most instruments measured 1 or 2 of the DigComp Framework's elements, but the Mobile Device Proficiency Questionnaire (MDPQ ...

  20. Dimensions of teachers' data literacy: A systematic review of

    The current study presents a systematic review of teachers' data literacy, arising from a synthesis of 83 empirical studies published between 1990 and 2021. Our review identified 95 distinct indicators across five dimensions: (a) knowledge about data, (b) skills in using data, (c) dispositions towards data use, (d) data application for various purposes, and (e) data-related behaviors. Our ...

  21. Low Literacy Rate at Primary Level: Identification ...

    ... overpopulation, increased poverty, slow economic growth, increased child marriage ... identification of the causes of low literacy rate and its impact on the community at primary level ...

  22. Culturally Responsive Literacy Interventions in Speech-Language

    The scoping review encompassed 22 studies selected from a pool of 908 sources, uncovering 34 literacy strategies presented in an evidence map. These strategies, intended for both school-age and preschool children, were implemented across diverse populations, including Indigenous communities, dual language learners, and children with disabilities.

  23. Low Reading Literacy Skills of Elementary Pupils in the

    This meta-synthesis aims to synthesize factors that affect low reading literacy skills among pupils enrolled in elementary schools in the Philippines. Fifteen (15) studies from 2013 to 2022 ...

  24. U.S. Literacy Rates by State 2024

    The literacy rate is the total number of literate persons (people able to both read and write) in a given age group, expressed as the percentage of that age group. The adult literacy rate measures literacy among persons age 15 and ...
