Measuring Media Literacy Around the World

The full papers presented in this Online Conference Paper panel are no longer accessible. You can see the original abstracts of the papers submitted for presentation in the abstract books.

Rationale for the Panel Discussion
The rise of mis- and disinformation has increased interest in media literacy (ML) among policymakers, who clamor for data and evidence to characterize the competencies required for work, life, and citizenship in a digital age. As consensus grows among policymakers and academics that ML is the ability to access, analyze, evaluate, and create media in multiple forms, some progress has been made in measuring ML competencies in local and national contexts.

Some approaches to measuring media literacy have repurposed existing data, or data that can be easily collected, rather than choosing stronger measures identified through empirical research (Bulger, 2012). In Europe, for example, the Media Literacy Index was created in 2017, in response to the 'post-truth' phenomenon, to measure the resilience of EU citizens to 'fake news' and its consequences. The survey ranks 35 European countries using proxy indicators for media freedom, quality of education, interpersonal trust, and e-participation as a way to gauge levels of public trust. But critics point out that such measures do not adequately characterize media literacy competencies in ways that enable policymakers to take action (citation).

New survey research has emerged that begins to paint a portrait of media literacy competencies by asking people to self-report their competencies (Ofcom, 2020). Performance-based measures that ask people to complete tasks have also been used, but these studies involve small samples that are not generalizable to larger populations. Recent survey research with representative samples of Australian adults shows that most people have little confidence in their own use of media for information activities such as safely navigating online environments, changing privacy settings on social media, and identifying misinformation (Notley et al., 2021). Other studies have examined high school teachers' assessments of students' skills (Hamilton et al., 2020) and the perceptions of educators, parents, and community members about the implementation of ML instructional practices in local school districts (Media Education Lab, 2021).

The need for more granular measures of ML competencies that can be assessed cross-nationally remains high. At a time when the vast majority of people in advanced democracies are increasingly reliant on digital media, there is a sense of urgency to develop reliable and valid measures of ML competency that are conceptually distinct from measures of media usage. This panel discussion brings together researchers who are interested in the methodological dimensions of measuring media literacy competencies. Questions to be considered include:

  • What ideas, characteristics or practices distinguish media literacy competencies from digital literacy competencies?
  • Should media literacy competencies be based on values that promote democracy, civic participation or other perceived social benefits?
  • What role should emotional engagement and identification with media play in measuring media literacy competencies?
  • Can exposure to ML instructional practices serve as a proxy measure of media literacy competencies? Could proxy measures of ML be useful for certain research purposes?
Measuring Media Literacy Around the World
Session Code: MER-6
Chair: Tessa Jolls, Center for Media Literacy

Panel overview - Challenges and Opportunities of Measuring Media Literacy Around the World
Tessa Jolls, Center for Media Literacy; Pierre Fastrez, UCLouvain; Michael Dezuanni, Queensland University of Technology; Renee Hobbs, University of Rhode Island; Tanya Notley, Western Sydney University

Media Literacy Measurement for Scalable Programs
Tessa Jolls, Center for Media Literacy

Survey or Test: How Should We Assess Media Literacy on a Global Scale?
Pierre Fastrez, Université catholique de Louvain; Camille Tilleul, Université catholique de Louvain

Measuring Adult Media Literacy in Australia
Tanya Notley, Western Sydney University; Michael Dezuanni, Queensland University of Technology; Sora Park, University of Canberra

Media Literacy Instructional Practices as Proxy Measures of ML Competencies
Renee Hobbs, University of Rhode Island

#IAMCR2022
Tsinghua University · XJTLU · Chinese Association for History of Journalism and Mass Communication · IAMCR