Jill Castek. College of Education, University of Arizona, Tucson, AZ, USA
Gloria E. Jacobs. College of Education, University of Arizona, Tucson, AZ, USA
* Address all correspondence to: Jill Castek, College of Education, University of Arizona, 1430 E. Second St., Tucson, AZ 85721, E-mail: jcastek@email.arizona.edu
Abstract
The purpose of this article is to introduce a practical framework and tools that can be used or adapted to assess the digital problem-solving abilities of library users. The framework and tools were developed through a three-year research study conducted as a university and public library partnership. The library field needs a valid and reliable digital problem-solving assessment, one that reflects the reality that the digital world constantly changes as networks, interfaces, and tools evolve. Such an assessment would help develop digital problem solvers who can fluidly and flexibly navigate an ever-changing digital landscape. A structured digital problem-solving assessment can be used by librarians and other library staff to customize supports for underserved adult learners specifically, and to enhance learning experiences with digital tools in libraries for the wider public more generally.
KEY WORDS: assessment, digital problem solving, assessment framework, underserved populations, digital literacy, digital resources, digital tools, digital equity, library services
Digital technologies have fundamentally transformed the ways we read texts, access information, and interact with one another. The vital role that digital technologies play in our lives has affected the tasks we perform in the workplace, the texts we examine and how we access them, and the ways we learn. Because of the ever-changing nature of digital technologies, it has become necessary for libraries, as community anchor institutions, to find ways to understand patrons' needs and to provide the support patrons need to meet whatever goals they have identified. This article introduces a practical framework and approaches that can be used or adapted to assess the digital problem-solving abilities of library users. Insights gained from an assessment of this type can be used by librarians and other library staff to customize supports for underserved adult learners specifically, and to enhance learning experiences with digital tools in libraries for the wider public more generally.
The assessment framework we introduce stems from the Advancing Digital Equity in Public Libraries: Assessing Library Patrons' Problem Solving in Technology Rich Environments research project [referred to as the Digital Equity in Libraries (DEL) project]. This work acknowledges public libraries as community anchors for adults' digital learning, places the learner at the center of learning supports, and addresses the wider goal of designing engaging and relevant experiences in libraries that prepare people to be full participants in their communities and in a global society.
In the sections that follow, we first define the terms digital literacy and digital problem solving. Next, we discuss results of a global survey of problem solving in technology-rich environments (PSTRE). Then, we examine the digital literacies teaching and assessment tools currently available to libraries before sharing the processes and protocols for the assessment framework we have developed. We end with a discussion of what a digital problem-solving assessment tool could look like and future plans for designing a valid and reliable assessment for adult learners.
Digital technologies encourage wider access to texts and information. However, not all individuals have access to digital devices and connectivity, or possess the digital literacies needed to navigate the digital world. Digital literacy is the ability to use information and communication technologies to find, evaluate, create, and communicate information (American Library Association, 2013). Digital literacies (plural) refer to a composite set of competencies, including basic computer skills, navigating online interfaces, efficiently using digital and online tools, and digital networking.
Digital problem solving differs from digital literacy in that it reflects an individual's ability to fluidly and flexibly navigate and use multiple digital resources to accomplish goals across multiple domains. These domains include professional work, personal interests and hobbies, educational pursuits, social and professional networking, civic engagement, and uses not yet conceptualized. Public libraries are at the forefront of the important work being done in digital problem solving. They have played an essential role in making knowledge and the tools of lifelong learning accessible to the most vulnerable and traditionally underserved populations. Furthermore, libraries are uniquely positioned to take a leadership role in developing evidence-based practices that improve library users' digital literacies. In the process, libraries can tackle the inequities that keep underserved adult populations from bridging the digital divide.
Results from the large-scale, international Program for the International Assessment of Adult Competencies (PIAAC) survey revealed that adults in the United States are less able to problem solve in digital environments than those in many other countries (Organisation for Economic Co-Operation and Development, 2016). The results indicated that people, regardless of age, struggle to solve problems in an online environment (Organisation for Economic Co-Operation and Development, 2015). The PIAAC results indicated that the overall performance of the U.S. adult population was below the international average in all three PIAAC subject areas: literacy, numeracy, and PSTRE.
These results underscore important equity issues facing adult subgroup populations, including immigrants, English learners, disconnected youth, adults with learning disabilities, and dislocated workers. The U.S. skills deficits highlighted by the PIAAC results have important implications for the nation's global economic competitiveness and the quality of civic life in local communities. For example, adults in the United States with low skills are four times more likely than adults in other countries surveyed to report only “fair” or “poor” health status. Moreover, workforce participation rates and wages are lower among adults who report having no experience using digital devices compared to those who have basic problem-solving skills using digital devices (Organisation for Economic Co-Operation and Development, 2015).
Trends surfaced by the PIAAC results demonstrate that participation in the digital world is no longer optional. To access the information needed to be an informed citizen, apply for a job, access public and social services, and participate in health care, digital problem-solving skills are a necessity. However, individuals from underserved communities often lack access to the hardware and Internet services needed to attain, develop, and use these skills. Libraries have long been the "people's university," and thus are the access point for individuals who require support in developing digital problem solving. In an age of wide discrepancies between those who have a firm grasp of digital literacies and those who need support in acquiring them, libraries can level the playing field; the services they provide are essential.
Determining the level of support an individual requires to successfully navigate online tends to happen case by case. Librarians, library staff, and volunteers may meet with learners and determine learning needs in the moment. However, not having access to a structured assessment tool puts library staff and volunteers at a disadvantage, in that they lack a keen sense of the individual's strengths or abilities to navigate online. As a result, they may have to recommend resources and programming based on "gut feel" alone.
A structured assessment, however, must take into account the highly variable abilities of individuals since digital problem solving is applied across different contexts and for different purposes. Customized support could be provided more efficiently if library users' skill levels are known to both the learner and library staff. A consistent framework for assessing an individual's digital problem-solving skills can help librarians confidently provide appropriate resources and digital tools that match a library patron's abilities and needs. Moreover, a consistent assessment framework for digital problem solving also encourages data-informed programming designed to help adult learners acquire stronger digital problem-solving abilities.
However, pointing an individual to the right digital tools might not be enough. A structured assessment can also provide librarians, library staff, and volunteers with knowledge about how much in-person support an individual may need in order to be successful with a given digital tool. In-person support, whether through one-on-one mentoring, groups of three, small drop-in groups in a computer lab, or planned classroom cohorts, has been shown to be instrumental in the online learning success of individuals from underserved populations (Castek et al., 2015). The remainder of this article details specific processes developed from research for assessing digital problem solving (Castek et al., 2018a,b,c). These processes can be used or adapted in libraries and/or in library-sponsored outreach efforts and classes.
To date, few free tools are available to help libraries gain an understanding of the digital skills and experiences of a library user. In this section, we discuss three of the available options and where they fall short of meeting libraries' needs for assessing the digital problem-solving skills of adults, particularly the underserved.
The PSTRE, administered as part of PIAAC, is available for a fee. This valid and reliable scenario-based assessment is made up of nine multi-stem constructed-response items. The items evaluate digital communication and the use of networks to acquire and evaluate information and perform practical tasks in personal, work-related, and community contexts. Completing these tasks requires basic digital literacies, but also the digital problem-solving abilities needed to sort information, interpret search results, navigate databases, and complete functions within and across novel digital interfaces. However, none of the included tasks represents digital problem solving in a library setting.
The PSTRE yields scores ranging from 200 to 400, reported in four levels. Following completion of the PSTRE assessment, adults are provided individual results that are intended to be informative. In practice, however, the score reports were difficult to interpret, and it was unclear how they could be translated into instructional actions. To make the reports more meaningful, we adapted them [see Castek et al. (2018d)] in an attempt to provide concrete follow-up activities that could be used to support digital literacies development in the library. While the aggregate PSTRE scores collected in our study reveal larger trends that are useful for comparing populations, neither the individual scores nor the PSTRE levels were meaningful for librarians seeking to support adults' development of digital problem-solving skills. The scores explained the types of competencies an individual should acquire, but did not suggest how an individual might go about meeting his/her problem-solving goals.
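To illustrate how a numeric score might be translated into something a librarian can act on, consider the minimal sketch below. The cut scores follow the OECD's published PSTRE level boundaries, but the follow-up suggestions are hypothetical illustrations of the idea, not the adapted reports described above.

```python
# A minimal sketch of turning a PSTRE score into an actionable report.
# Cut scores follow the OECD's published PSTRE level boundaries; the
# follow-up suggestions are hypothetical, not the adapted DEL project
# reports (Castek et al., 2018d).

PSTRE_LEVELS = [
    (241, "Below Level 1", "Practice basic navigation: opening a browser, using menus and links."),
    (291, "Level 1", "Try single-step tasks, such as locating one item in the library catalog."),
    (341, "Level 2", "Try multi-step tasks, such as comparing results across two databases."),
    (501, "Level 3", "Try open-ended tasks that require evaluating several novel interfaces."),
]

def score_report(score: int) -> str:
    """Pair a PSTRE score with its level and a suggested next step."""
    for upper_bound, level, suggestion in PSTRE_LEVELS:
        if score < upper_bound:
            return f"{level} (score {score}): {suggestion}"
    raise ValueError("score out of range")

print(score_report(265))
# -> Level 1 (score 265): Try single-step tasks, such as locating one item in the library catalog.
```

A report of this shape keeps the psychometric level for program-level comparisons while giving the librarian a concrete starting point for the conversation that follows.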
Assessment tools such as NorthStar Digital Literacy (https://www.digitalliteracyassessment.org/) provide digital badges and/or a computer skills certificate to users who complete assessment and learning modules. These self-access materials are geared toward workplace skills and cover the use of specific software packages commonly used in education and the workplace, such as Microsoft Word and Microsoft Excel. For those striving to advance in the workforce, the NorthStar assessment and learning materials provide guidance in sequencing digital skills instruction. These skills have been identified as necessary for many jobs and include basic knowledge of computers, Internet basics, email, and Windows and Mac operating systems (Minnesota Literacy Council; digitalliteracyassessment.org/assessment-info). The materials can provide a foundation for training on Microsoft software programs, using social media, or reinforcing information literacy skills; however, they are not geared toward building the capacity to "learn how to learn" with technology—a necessity when it comes to applying what is learned fluidly and flexibly across many digital contexts, domains, and interfaces.
While NorthStar focuses on the development of specific skills needed for employment, GCF Learn Free (https://edu.gcfglobal.org) provides over 125 online tutorials and materials for learning in the areas of technology, work, core skills, reading, and math. Part of the set of lessons includes basic computer skills that help new-to-computer users learn to navigate a keyboard, mouse, windows, and browsers. However, no pre-assessment is built into the program, so learners are left without guidance to determine what to study. Quizzes are available at the end of each learning module and could be used as pre-tests, but doing so requires knowledgeable guidance from a tutor or teacher. Furthermore, the quizzes are not specific to the needs of library users, nor can they be adjusted to meet the needs of individuals' application across different learning contexts.
Each of the assessment resources we described can be a powerful tool for the purposes for which it was intended. For example, the PSTRE is useful for comparing demographic groups or examining correlations between scores and the characteristics identified through PIAAC's background questionnaire. The NorthStar assessment is useful for personal advancement and upskilling in a workforce development context, and the GCF Learn Free tutorials are helpful for individuals who already have some sort of support system in place, such as an adult basic education class. The information obtained from each assessment differs, but unfortunately none of it provides the insights a librarian needs to offer responsive instructional support in the moment and to facilitate life-wide learning (Reder, 2013). The main challenge of any assessment to be used within a library setting—and the main shortcoming of the PSTRE, the NorthStar Digital Literacy Assessment, and the GCF Learn Free tutorials—is that these programs do not meet the goals of adult digital problem solvers and do not necessarily apply to learning within a library context. Work in the field of adult learning (Reder, 2013) has indicated that adults need to be able to set their own goals and to have materials available that immediately address those goals, whether those goals relate to hobbies and personal interests, shopping, navigation (maps), the creative arts, networking, or other purposes. Because library staff play an essential role in supporting learners in finding resources, an assessment that flexibly acknowledges the autonomy of adults in setting goals in a library setting is needed.
Over the course of three years, researchers from the University of Arizona and Portland State University collaborated with librarians from the Multnomah County Library in Portland, Oregon. The purpose of the DEL project was to examine and understand the digital problem-solving processes of vulnerable adults, defined as individuals who are economically insecure, may not have stable housing, may have low educational attainment, and may not have regular access to computers and the Internet beyond that provided by the library. In response to needs in the library and adult education fields, the research questions asked: (a) what are the digital problem-solving skills of individual library users, (b) how do they use those skills, and (c) how can libraries determine how best to support these individuals as they engage in digital problem solving? To address those questions, we developed an assessment framework, performance-based assessment tasks, observation protocols and documentation, and adaptable tools that librarians can use to support digital problem solving.
Data were collected from approximately 450 library users who completed a library-use background survey. Of those participants, 211 completed PIAAC's PSTRE assessment. To better understand the range of strategies involved in digital problem solving, researchers observed and screen-recorded 18 learners as they completed a range of digital problem-solving tasks. These 18 individuals were recruited from a participant pool that included individuals who were living outside, residing in subsidized housing, or otherwise displaced, as well as job seekers, low-income adults, and others. Interviews and observations focused on their digital navigation in order to document, analyze, and better understand a range of digital problem-solving abilities.
With the guidance of an advisory panel made up of public and academic librarians, along with scholars in the fields of digital literacy and adult learning, librarians on the research team developed a set of five library tasks commonly encountered at the Multnomah County Library. The five tasks required navigation of the library's website and linked databases and addressed the four areas outlined in the PSTRE framework (Organisation for Economic Co-Operation and Development, 2012): (1) setting goals and monitoring progress, (2) planning and self-organizing, (3) acquiring and evaluating information, and (4) using information for specific purposes. These tasks were developed with the Multnomah County Library's website resources in mind; however, they can be adapted by librarians for use in their own settings. Ultimately, these tasks became a framework for observing digital problem solving in the library [see Castek et al. (2018a)]. The framework offers libraries a useful tool for observing, tracking, and documenting digital problem-solving strategies and introduces metacognitive scaffolding prompts that suggest what a librarian can say to support digital problem-solving development. For example, participants were asked to find the answer to a medical question using a resource made available through the library. Scaffolding prompts include: Why should you use a database to search for health information? Can you only search databases through the library's website? How do you know the symptoms you found are reliable/accurate? Should you search another source to be sure? Suggested prompts such as these plant the seed for learners to think broadly about how online information is organized and the ways it can be navigated within and across different interfaces; they also promote critical evaluation.
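As a concrete illustration, a task from this framework and its scaffolding prompts could be encoded as a small, reusable data structure so that other libraries can swap in their own tasks. The encoding below is a minimal sketch under our own assumptions: the field names and the mapping of the task to PSTRE areas are our invention, not part of the DEL project's published instruments, though the sample task and prompts are the ones described above.

```python
from dataclasses import dataclass

@dataclass
class ObservationTask:
    """One digital problem-solving task plus the scaffolding a librarian can voice.

    The field names and the PSTRE-area mapping are hypothetical; the sample
    task and prompts below are the examples described in the text
    [see Castek et al. (2018a)].
    """
    description: str                # what the patron is asked to do
    pstre_areas: list[str]          # which of the four PSTRE areas the task addresses (assumed)
    scaffolding_prompts: list[str]  # metacognitive prompts a librarian can offer

health_task = ObservationTask(
    description="Find the answer to a medical question using a library-provided resource.",
    pstre_areas=[
        "acquiring and evaluating information",
        "using information for specific purposes",
    ],
    scaffolding_prompts=[
        "Why should you use a database to search for health information?",
        "Can you only search databases through the library's website?",
        "How do you know the symptoms you found are reliable/accurate?",
        "Should you search another source to be sure?",
    ],
)

for prompt in health_task.scaffolding_prompts:
    print(prompt)
```

Encoding tasks this way keeps the task description, its framework alignment, and its scaffolding language together, which makes local adaptation a matter of editing data rather than rewriting procedures.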
Observational data collected from the problem-solving tasks led us to develop a checklist for observing library users' digital problem solving [see Castek et al. (2018c)]. The checklist can be used as a basis for documenting an individual's experience with digital tools and problem solving, and can be used informally to prompt a targeted dialogue aimed at learning about library patrons' digital experiences. For example, the checklist can shape interactions with a library user when he or she comes in for assistance. The tool can also be used in conjunction with a reference interview, requests for assistance, one-on-one appointments with a librarian, or in the context of a class to identify who needs more support and what type of support to offer. The tool is flexible enough to support individual interactions, small groups, or structured classes. The checklist takes into account four main concepts that arose from our data analysis.
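For libraries that want to capture these observations consistently across staff, the checklist could be digitized along the lines of the sketch below. The items shown are hypothetical placeholders of our own devising; the actual checklist items appear in the published instrument [see Castek et al. (2018c)].

```python
# A minimal sketch of a digitized observation checklist. The items are
# hypothetical placeholders; the actual items appear in Castek et al. (2018c).

CHECKLIST_ITEMS = [
    "Navigates the library website to locate a database",
    "Interprets a page of search results",
    "Judges whether a source is reliable",
    "Transfers a familiar strategy to an unfamiliar interface",
]

def focus_areas(observed: dict[str, bool]) -> list[str]:
    """Return the checklist items not yet demonstrated, i.e., where to offer support."""
    return [item for item in CHECKLIST_ITEMS if not observed.get(item, False)]

# Example: notes from one drop-in session.
session_notes = {
    "Navigates the library website to locate a database": True,
    "Interprets a page of search results": True,
}
print("Suggested focus areas:", focus_areas(session_notes))
```

Recording sessions this way would let staff see at a glance which strategies a returning patron has already demonstrated and which still call for scaffolding.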
This article suggests that librarians, and others who work with adult learners in informal settings, can benefit from examining and using the assessment framework and checklist described here to identify the experiences and strengths of those learners. Once a librarian or other library staff member has a sense of what a learner knows how to do, he/she can direct the individual to the most appropriate tool or learning environment. For example, an individual who has very little experience using computers might benefit from a tutor-supported, curated online learning platform such as the one documented in our work on digital literacy acquisition (Castek et al., 2015) before moving on to more demanding tools such as Lynda.com, databases, or job and ancestry searches. Faculty in schools of Library and Information Science may find the assessment framework useful as a tool for mapping the digital territory and documenting the range of library users' digital skills and experience. It can be used in role-playing exercises to help librarians in training anticipate and document the needs of learners who have limited digital skills and experience. It can also be used to encourage planning for dynamic, responsive scaffolding of digital skills when working with patrons whose needs are not yet known.
The framework and checklist are offered as first steps toward a more structured way of thinking about assessment, yet they are flexible enough to personalize for use in different contexts. We offer them as a way for library staff to better understand the digital problem-solving experiences of their library users. These tools are grounded in insights from research, but they have not been tested across a variety of library settings with diverse library users. We suggest that this work can be expanded to create a stable, valid, and reliable assessment instrument designed to inform and guide library practice. Such a tool would consistently provide meaningful information for helping patrons in a way that goes beyond instinct and experience.
Our work has provided a backdrop for better understanding the digital problem-solving assessments currently available. We argue that a valid, reliable tool would address an unmet need within the field. We envision a tool oriented toward personal, civic, and library/information-search purposes as well as workplace needs. It should be scenario based, administered face to face, and easy to score, and it should include an option for assessing individuals' abilities as well as opportunities to examine collaborative problem solving among pairs or small groups.
This work was supported in part by a National Leadership Grant from the Institute of Museum and Library Services, Washington DC (Grant LG-06-14-0076-14A).
American Library Association (2013), Digital Literacy, Libraries, and Public Policy: Report of the Office for Information Technology Policy's Digital Task Force. Retrieved November 8, 2018, from https://www.districtdispatch.org/2013/01/on-the-front-lines-of-digital-inclusion/.
Castek, J., Gibbon, C., Jacobs, G., Frank, T., Honisett, A., and Anderson, J. (2018a), Blueprint for Designing Digital Problem Solving Tasks. Advancing Digital Equity in Public Libraries: Assessing Library Patrons' Problem Solving in Technology Rich Environments. Retrieved November 8, 2018, from http://archives.pdx.edu/ds/psu/24579.
Castek, J., Jacobs, G., Gibbon, C., Frank, T., Honisett, A., and Anderson, J. (2018b), Executive Summary, Advancing Digital Equity in Public Libraries: Assessing Library Patrons' Problem Solving in Technology Rich Environments. Retrieved November 8, 2018, from http://archives.pdx.edu/ds/psu/24522.
Castek, J., Jacobs, G., Gibbon, C., Frank, T., Honisett, A., and Anderson, J. (2018c), Observing Digital Problem Solving. Advancing Digital Equity in Public Libraries: Assessing Library Patrons' Problem Solving in Technology Rich Environments. Retrieved November 8, 2018, from http://archives.pdx.edu/ds/psu/24581.
Castek, J., Jacobs, G., Gibbon, C., Frank, T., Honisett, A., and Anderson, J. (2018d), Digital Problem Solving: Score Report. Advancing Digital Equity in Public Libraries: Assessing Library Patrons' Problem Solving in Technology Rich Environments. Retrieved November 8, 2018, from https://pdxscholar.library.pdx.edu/digital_equity_toolkit/4/.
Castek, J., Jacobs, G., Pendell, K., Pizzolato, D., Reder S., and Withers, E. (2015), Language Learners: The Learner/Tutor Relationship (Digital Literacy Acquisition in Brief). Retrieved November 8, 2018, from http://archives.pdx.edu/ds/psu/16195.
Minnesota Literacy Council, Northstar Assessment. Retrieved November 8, 2018, from https://www.digitalliteracyassessment.org/assessment-info.
Organisation for Economic Co-Operation and Development (2012), Problem Solving in Technology-Rich Environments, in Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills, Paris: OECD Publishing. DOI: 10.1787/9789264128859-7-en.
Organisation for Economic Co-Operation and Development (2015), Adults, Computers and Problem-Solving: What's the Problem? Paris: OECD Publishing.
Organisation for Economic Co-Operation and Development (2016), Skills Matter: Further Results from the Survey of Adult Skills, Paris: OECD Publishing.
Reder, S. (2013), Lifelong and Life-Wide Adult Literacy Development, Perspectives on Language and Literacy, vol. 39, no. 2, pp. 18–21.