Preparing Academic Librarians to Prioritize Privacy in Learning Analytics Projects: An Evaluation of a Professional Development Course

As institutions of higher education further develop their learning analytics efforts, academic library practitioners are called upon to participate in these efforts and have opportunities to shape their campus strategies. Nonetheless, library practitioners may not be prepared with the knowledge, skills, and strategies to engage with campus stakeholders. This article documents the effectiveness of an online training course that developed librarians’ skills and confidence and discusses opportunities to replicate and extend the course.

Introduction

Since around 2010, researchers, vendors, and higher education institutions have been developing tools, practices, and policies to support learning analytics. Learning analytics “uses analytic techniques to help target instructional, curricular, and support services” to affect an array of educational outcomes, such as personalizing educational programs to student needs and matching resources to improve learning outcomes (van Barneveld et al., 2012, p. 8). As institutions of higher education further develop their learning analytics efforts, academic library practitioners are called upon to participate in these efforts and have opportunities to shape their campus strategies. Nonetheless, library practitioners may not be prepared with the knowledge, skills, and strategies to engage with campus stakeholders. This lack of preparation can mean that library values, such as privacy, are not raised when institutions are planning and designing learning analytics initiatives. While many institutions are still in the early stages of developing learning analytics activities, the need to prepare library practitioners with training and tools is pressing. This article describes the effectiveness of an online training course that develops librarians’ skills and confidence, prepares them to engage with campus stakeholders, and equips them to contribute to institutional learning analytics efforts. After a brief review of library learning analytics literature, the authors describe the evaluation methods they used and how their quantitative and qualitative data analysis led to findings associated with positive assessments of learning and the course’s impact, as well as specific areas for improvement. The authors conclude with a discussion reflecting on the course’s impact, how the course could be replicated by others, and opportunities to extend the course for other educational needs.

Literature Review

Campus Technologies, Learning Analytics, and Student Privacy

Information and educational technologies, which are integral to the higher education ecosystem in which students are enmeshed, create digital traces of student learning and life. They are key tools used in service to the primary mission of higher education: learning. Learning management systems, communication tools, library databases, and other technological artifacts support a student’s ability to access and use information, and to interact in student-to-student and student-to-instructor learning experiences (see Figure 1).

Figure 1

Student-Technology Touchpoints, Sites of Data Creation, and Tracking
Image courtesy of Gabriel Hongsdusit for The Markup.

Other technologies serve notable purposes in higher education, too. Unfortunately, in the context of American higher education, safety and security have increasingly become motivators for adopting arguably more invasive tools to address campus crime and to support protective interventions during active shooter incidents. CCTV paired with facial recognition applications has become more common (Burke, 2020), in addition to the use of RFID chips in—and printed barcodes on—identification cards, which enable and restrict access to physical spaces. Some campuses are also using license plate readers to flag suspicious cars and aid criminal investigations (see “How License Plate Readers Are Helping University Police Solve Crimes,” 2023; “Innovative License Plate Reader Technology Now in Use on CU Boulder Campus,” 2023; Nichols, 2023).

These once distinct technological domains—education and safety technologies—have begun to blur into one. Computer scientists see some value in actively surveilling, identifying, and judging students with facial recognition tools paired with AI to mold student learning behaviors (D’Agostino, 2024). Aspects of this technological approach became evident during the peak of the COVID-19 pandemic when students were forced into online learning environments (Flaherty, 2020). Many campuses licensed invasive proctoring software, arguing that academic integrity was at stake. Tools like Proctorio required students to allow monitoring via “webcam, microphone, browser, desktop, or any other means necessary” (Flaherty, n.d., para. 10).

The data gleaned from campus technologies has enabled myriad analytics of student learning and behaviors. Much of this analytical work stands under the umbrella of learning analytics and educational data mining research, though each field approaches it in distinct ways. Both encompass “data design, aggregation, mining, and analytics (e.g., data visualization, predictive modeling, personalize systems) for myriad purposes, including personalized education, predictive advising, and automated interventions in learning behaviors” (Jones, 2022a, p. 4). But, as alluded to above, the evolving technological environment and the learning analytics that it enables create serious privacy issues. Jones (2022a), citing Nissenbaum (2009), writes:

Privacy is an embedded contextual value built into the overall mission of higher education, and it has normatively moderated the flow and ends of student data and information use for some time. In other words, informational norms mapped to student privacy have served to “regulate the flow of information of certain types about [students] from one actor (acting in a particular capacity or role) to another or others (acting in a particular capacity or role) according to particular transmission principles” (p. 10).

The central problem is that information flows are changing and are being created for the purposes of “dataveillance” (Clarke, 1988), which puts at risk a student’s ability to pursue an education according to their own interests, without undue influence from higher education actors informed by analytics or from analytics built into the systems themselves. As librarians have begun to use learning analytics in their own practices and student-focused services, they, like others, have struggled with privacy concerns as the related technologies evolve.

Library Learning Analytics Ethics

Practitioners and scholars alike have widely documented library learning analytics practices and the related ethical issues. In our previous research on this topic, we noted that “the ethics of learning analytics are nothing but complicated, connecting various nodes such as privacy, autonomy and free will, intellectual property, justice and fairness, and democratic participation” (Jones & Hinchliffe, 2022, p. 2). In that article, we—like others—concentrated on the idea that privacy is a key instrumental value that enables varied pursuits and expressions of intellectual freedom, which are foundational elements of the educational experience and crucial pillars in professional library ethics (see Currier, 2021; Doty, 2020; Hartman-Caverly, 2019; Oliphant & Brundin, 2019).

There is a growing disconnect between values and praxis. Citing Zimmer and Tijerina (2018), who identified gaps in professionals’ privacy literacy, we argued that “practitioners are unable to meet the practical needs that prioritizing privacy requires as a core professional value” (Jones & Hinchliffe, 2022, p. 2). Briney’s (2019) study of published library learning analytics research led them to the conclusion that “academic libraries’ actual data practices are not living up to data best practices” (p. 27), demonstrating “evidence of a conflict between libraries’ commitment to patron privacy and their current data handling practices in learning analytics projects” (p. 28).

Why might this be the case? We previously wrote, citing Jones (2019), “that part of the ethics problem is that most LIS [library and information science] students receive little research methods training and are likely to be ‘under-skilled and unprepared to lead quantitatively rigorous learning analytics projects’” (Jones & Hinchliffe, 2022, p. 3). Huang et al. (2021) argue that “the combination of information ethics, information science, and educational technologies built into LIS programs” (p. 362) should motivate LIS educators to fill the values and praxis gap and to take “a proactive, holistic, and direct interest in artificial intelligence (and machine learning or data science) alongside offerings and contributions in information ethics” (p. 363). So, while current practitioners who are conducting library learning analytics projects are ethically attuned to privacy and its importance generally, they are likely under-skilled and underprepared to meet the demands associated with learning analytics, which raise significant, myriad, and specific privacy problems.

Identified Professional Development Needs

Academic librarians need professional development opportunities to learn ethics strategies and sensitivities that acknowledge the varied issues that learning analytics practices create. It is a pressing problem: “learning analytics can be seen as extending on traditional styles of assessment practiced in the library and that ‘to not participate in learning analytics may limit a library’s ability to serve students’ educational interests’” (Flierl et al., 2023, p. 35 citing Jones & Hinchliffe, 2022, p. 177). Flierl et al. (2023, p. 39), whose work represents an environmental scan conducted on behalf of the Association of College and Research Libraries (ACRL), argued that “academic librarians are increasingly required to gain skills beyond the traditional qualifications” acquired in their master’s degree programs and during on-the-job practice.

In a previous study (Jones & Hinchliffe, 2022) we investigated: 1) what, exactly, academic library practitioners perceived to be the most pressing ethical issues associated with learning analytics and 2) whether they were prepared to address the issues they identified; to date, this is the only research that addresses these questions. Specifically, in 2020, we conducted a survey of academic library practitioners.

The results from the 2020 survey were as follows. While most respondents rated their knowledge of learning analytics ethics, research ethics, and data ethics as “moderately knowledgeable” with a “higher degree of knowledge” for research ethics, “49% of respondents had not received any training for learning analytics ethics; only 6% reported receiving training in a course while pursuing a degree.” Respondents still wanted more education: “88% responded they somewhat or strongly agree they need learning opportunities to better understand ethical issues associated with learning analytics.” A strong majority—90%—of respondents stated they somewhat or strongly agreed learning analytics raises ethical issues. We presented respondents with “29 ethical and practical learning analytics issues identified in our literature search grouped by four themes: privacy, data ethics, data management, and trust. The top five ethical issues respondents identified as being very challenging for higher education were: power imbalances (68%), algorithmic biases (64%), self-fulfilling prophecies (59%), establishing new privacy norms (56%), and maintaining trusting relationships (54%).”

In our discussion, we argued that the ethics training respondents had received “was useful,” but “their direct learning need was for something separate and unique from research ethics,” which is to say that specific ethics training associated with learning analytics would be useful and fill current knowledge gaps.

The remainder of this article concerns our development and evaluation of a professional development course to fill the previously identified gaps and reflects an approach to improve academic library practitioners’ privacy literacy related to learning analytics. Kumar (2023, p. 6) defines privacy literacy as “(1) knowledge about information flows and how to limit them, (2) a process of critical thinking about information flows, and (3) a practice of enacting appropriate information flows.” The course was influenced by this definition, insofar as it aimed to fill participant knowledge gaps; engage them in critical thinking about personal, professional, and institutional values and practices; and provide opportunities for “reflexive engagement” (Kumar, 2023, p. 9) that would enable participants to change the privacy conditions at their workplace via information practices and policies.

Methods

Course Design

We developed an asynchronous, online course using an outcomes-based, or backward design, strategy, following techniques outlined in Biggs (2014). We began by drafting and finalizing course learning outcomes according to thematic areas relevant to the course content. Verbs from Bloom’s Revised Taxonomy informed the construction of the outcomes and ensured that they addressed the cognitive process dimension and the knowledge dimension (Anderson et al., 2000; Iowa State University Center for Excellence in Learning and Teaching, 2022; see Table 1).

Table 1

Course Learning Outcomes and their Bloom’s Revised Taxonomy Alignment

| Course Learning Outcome | Thematic Area | Bloom’s Revised Taxonomy: Cognitive Process Dimension | Bloom’s Revised Taxonomy: Knowledge Dimension |
| --- | --- | --- | --- |
| Describe the social, political, and technological elements of learning analytics in higher education, generally, and academic libraries, specifically. | Learning Analytics | Understand | Conceptual |
| Distinguish between theoretical aspects of information privacy and their connection to learning analytics. | Privacy | Analyze | Conceptual |
| Critique existing learning analytics principles, policies, practices, and recommendations and the ways in which they may create privacy harms. | Privacy | Analyze | Conceptual |
| Adjust a learning analytics practice to strategically minimize privacy harms and maximize specific benefits. | Ethics | Evaluate | Procedural |
| Plan for ethical and evidence-based library learning analytics projects that are based in privacy by design. | Professional Development | Create | Procedural |
| Develop a learning plan for continuing professional development regarding learning analytics, information privacy, and ethical practice. | Professional Development | Create | Metacognitive |

Next, we developed assessments driven by and mapped to the learning outcomes. The major assessment consisted of the Privacy Sourcebook. About the Privacy Sourcebook, we wrote to learners in the introduction:

Opportunities to engage in conversations about learning analytics will likely present themselves in both expected and unexpected settings. Being prepared for these opportunities will enable you to respond confidently and with clarity of thought. By developing your own Privacy Sourcebook you will have an opportunity to bring together your philosophy on learning analytics and privacy with an environmental scan and an analysis of key stakeholders and allies. When these pieces are put together, you will be able to develop key messages that reflect ethical approaches you would like your library and campus to pursue in designing and implementing learning analytics programs and to identify strategies for action (Hinchliffe & Jones, 2022p, p. 3).

The Privacy Sourcebook contains five unique activities, each described with an introductory overview, a purpose statement for the activity, and a description of the required tasks for successful completion (see Table 2).

We also tasked learners with completing a structured multimedia introduction, participating in four guided discussions, and completing a multimedia presentation focused on the learner’s course reflections, growth moments, and construction of their Privacy Sourcebook. To the extent possible and useful, we employed the Transparency in Learning and Teaching (TILT) model to describe an assessment, its purpose, and the relevant tasks (see Winkelmes, 2014). We developed rubrics for each assessment to help learners gauge their engagement and progress in the course, and to help us—the instructors—monitor learner participation and intervene when necessary.

Table 2

Privacy Sourcebook Activity Overviews and Purpose Statements

| Activity | Overview | Purpose Statement |
| --- | --- | --- |
| Environmental Scan | Environmental scanning is a process for observing, reflecting upon, and interpreting data that is relevant to your work. It makes you aware of potential pitfalls while also helping to identify affordances and support structures. As such, it sets the foundation for planning for action. | By conducting an environmental scan you will take stock of the work on your campus with learning analytics, identify leaders and other stakeholders, and strategize where the library is and/or could be involved in these activities. |
| Philosophy | A philosophy statement is an articulation of one’s beliefs and intentions. As a self-reflective statement, it communicates your values and goals as a lens through which to examine the alignment between what is and what you believe should be. Paired with an environmental scan, it enables one to identify actions to further that alignment. | Developing a personal statement of your beliefs about learning analytics will help ground your actions and ensure that your decisions are reflective of your intentions and goals for ethical practice. This activity will provide you with an opportunity to explore various considerations related to learning analytics and articulate your perspectives on these issues. |
| Talking Points | Talking points are prepared messages that you can convey in a clear and concise manner. At the core of a talking point is the message, which may be stating a position, raising an issue or question, presenting an objection, etc. How that message is expressed takes into account the audience for that message. The audience for a talking point may be a particular person or a group. | Developing your messages in advance ensures that when the opening presents itself you will be ready to communicate what you want to say and it guards against missed opportunities. The particular talking points you will craft will be specific to your context and your intended audience(s). |
| Resources | Curating a set of resources allows one to find those items that are particularly useful or relevant for one’s work. The course has provided you opportunities to engage with a variety of literature (e.g., popular press, scholarly articles), as well as interactions with peers that will have brought to the surface other potential resources, such as communities-of-practice. Each resource serves as a potential intellectual tool to use when engaging in learning analytics. | By identifying the resources that you rely on in your work with learning analytics, you will be well-positioned to share useful information with others, call upon the scholarly literature and other documents to support your talking points, and advocate for prioritizing privacy in learning analytics work. |
| Professional Development Plan | Being intentional about ongoing learning enables one to solidify one’s knowledge and skills while also continuing to grow and develop. A professional development plan is a mechanism for identifying one’s goals for continued learning and systematically addressing one’s learning needs. | Learning analytics is a rapidly growing area of practice and research. Through this course you have hopefully developed a foundation of knowledge and skills that will be useful for your ongoing work and serve as a strong basis for ongoing learning. By taking some time to reflect on what you have learned and how you anticipate engaging with privacy and learning analytics going forward, you can then articulate some personal learning goals and identify specific ways to pursue ongoing professional development in this arena. |

Finally, we created learning objects to contextualize and deliver our instruction. We developed six modules respectively entitled: Getting Situated; Learning Analytics in Higher Education and Opportunities for Libraries; Learning Analytics and the Privacy Problem; Critical Lenses on Learning Analytics; Ethics in Action; and Planning for Prioritizing Privacy. Each module contained at a minimum an overview page, a multimedia lecture embedded within a discussion forum to facilitate questions and answers, and a readings and media page to access required and supplementary materials; modules also included a module quality survey. Four modules also included a multimedia interview, “Four Questions With …,” that we conducted with a leading practitioner or scholar on issues related to learning analytics.

Each module overview page contained a description of the module’s content, specific learning outcomes, and activities for learners to complete. Readings and media pages provided guidance on how learners should approach readings; required readings were open access, but supplementary readings were a mix of open and closed access. Modules also had associated relevant assessments. To provide a guide for the course, set expectations, and outline policies, we created a comprehensive syllabus.

With the learning objects drafted, we implemented them in the learning management system (LMS) Canvas. We were both familiar with teaching online and hybrid courses in Canvas at our respective institutions, which made our LMS choice straightforward. Further, Canvas provides a “Free-for-Teacher,” non-institutionally affiliated version, which was important because our learners would come from outside our respective institutions. An additional benefit of selecting Canvas for our LMS was that it enabled us to create an IMS Common Cartridge (IMSCC) export file each time we ran the course for archival purposes. Using this file, others can duplicate and iterate on the course by importing it into an LMS that accepts IMSCC files. After building the course site in Canvas, co-author Jones conducted an informal evaluation of its design according to the Quality Matters general standards and associated specific review standards for higher education courses (Murillo & Jones, 2020; Quality Matters, 2022). As a certified Quality Matters peer reviewer, Jones was familiar with the standards and the evidence that supports whether a standard is met. We made course design adjustments as needed. All learning objects associated with the course, along with IMSCC exports of the course site, are available for access, use, and modification by others in accordance with the associated license by accessing our research repository (Hinchliffe & Jones, 2022n).

Learner Recruitment and Enrollment

With the course design set, we constructed the process for learner recruitment and enrollment. As we intended to collect data for assessment, evaluation, and research purposes, we began this process by discussing our methods and research designs with our respective institutional review board (IRB) offices. Both Indiana University (Jones, 2022b) and the University of Illinois Urbana-Champaign (Hinchliffe, 2022) IRBs classified our activities as exempt from a full review.

Our targeted participants were individuals who identified as academic library professionals working in American higher education institutions. We recruited participants by posting a recruitment message (Hinchliffe & Jones, 2022j) to 20 academic library listservs and online community groups (Hinchliffe & Jones, 2022i) in addition to publishing blog posts on our project website (Hinchliffe & Jones, 2022r) and updates on our Twitter account (Hinchliffe & Jones, 2022o); we posted no follow-up messages. Interested individuals completed a survey of their experiences, interests, and demographics (Hinchliffe & Jones, 2022b).

Because interest in the course was high but enrollment for each cohort was limited, we reviewed the interest surveys to build diverse cohorts based on personal demographics (e.g., age, race/ethnicity, gender), professional demographics (e.g., job classification, years of experience, experience with learning analytics), and institution type (e.g., community college, research-intensive, private). After selecting participants, we sent them an invitation to enroll by email (Hinchliffe & Jones, 2022h), which required them to complete a pre-course knowledge, skills, and abilities assessment survey (Hinchliffe & Jones, 2022m). Those who accepted our invitation by completing the survey were sent a “join code” to enroll in our Canvas course site (Instructure, 2020). A short waitlist was developed for each cohort, and additional invitations were issued if original invitees declined or did not respond. See Table 3 for final enrollment numbers across cohorts.

Table 3

Enrollment and Course Completion

| Cohorts | Classes^a | Invited to Enroll (N) | Enrollment (n) | Incompletes (n)^b | Completes (n) | Completion Rate |
| --- | --- | --- | --- | --- | --- | --- |
| 2021, Fall | 2 | 67 | 46 | 13 | 33 | 71.7% |
| 2022, Spring | 4 | 116 | 108 | 42 | 66 | 61.1% |
| 2022, Fall | 2 | 56 | 47 | 17 | 30 | 63.8% |

^a Learners were evenly divided among classes.

^b Incompletes include learners who: 1) accepted the invitation but never enrolled in a course; 2) enrolled in a course but were never active and were withdrawn by the instructors; 3) became inactive in a course over more than two modules and were withdrawn by the instructors; and 4) proactively withdrew their enrollment due to personal reasons.

Quantitative and Qualitative Data Collection

We collected quantitative data from learners via multiple instruments, starting with the pre-course knowledge, skills, and abilities assessment survey (Hinchliffe & Jones, 2022m). This survey primarily asked interval questions regarding research, data, and learning analytics ethics, along with categorical questions about a learner’s ability to address ethics issues. Other interval questions asked learners to assess their abilities relative to the course’s learning outcomes. Upon completion of the course, learners took a similar post-course knowledge, skills, and abilities assessment survey (Hinchliffe & Jones, 2022l); the design enabled calculating pre- and post-course self-reported learning gains. In total, 194 learners completed the pre-course knowledge, skills, and abilities assessment survey; 100 learners completed both the pre- and post-course surveys for a completion rate of 52% (see Table 4).

Table 4

Pre- and Post-Course Knowledge, Skills, and Abilities Survey Completion

| Cohorts | Pre- (n) | Post- (n) | Completion of Both Pre- and Post- |
| --- | --- | --- | --- |
| 2021, Fall | 47 | 23 | 49% |
| 2022, Spring | 101 | 52 | 51% |
| 2022, Fall | 46 | 25 | 54% |
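To make the paired pre/post design concrete, the following is a minimal sketch of how matched responses could be merged and self-reported gains computed with pandas. The file names and column labels (a learner_id key plus item columns named after the question IDs reported in Table 8) are hypothetical; the actual survey exports and the authors’ analysis scripts may differ.

```python
import pandas as pd

# Hypothetical export files and column names; the actual instruments differ.
pre = pd.read_csv("pre_course_ksa.csv")    # columns: learner_id, q5, q11, q14, ...
post = pd.read_csv("post_course_ksa.csv")  # the same Likert items, post-course

# Keep only learners who completed both surveys (the paired design).
paired = pre.merge(post, on="learner_id", suffixes=("_pre", "_post"))

items = ["q5", "q11", "q14", "q15", "q17", "q18"]
summary = pd.DataFrame({
    "pre_mean": [paired[f"{i}_pre"].mean() for i in items],
    "post_mean": [paired[f"{i}_post"].mean() for i in items],
    "mean_change": [
        (paired[f"{i}_post"] - paired[f"{i}_pre"]).mean() for i in items
    ],
}, index=items)

# Round to two decimals, mirroring the precision used in Table 8.
print(summary.round(2))
```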

In addition to the two knowledge, skills, and abilities survey instruments, learners completed a course evaluation survey (Hinchliffe & Jones, 2022a). This survey addressed the course’s design (i.e., structure, instructional materials) and the instructors’ success using interval and short essay questions. Ninety-three learners completed the evaluation survey for a completion rate of 46.3% (see Table 5).

Table 5

End-of-Course Evaluation Completion

| Cohorts | Enrollment (n) | Completion (n)^a | Completion Rate |
| --- | --- | --- | --- |
| 2021, Fall | 46 | 20 | 43.5% |
| 2022, Spring | 108 | 45 | 41.7% |
| 2022, Fall | 47 | 28 | 59.6% |

^a Evaluations were marked as complete when a learner completed 70% or more of the survey.

Learners also had the option to complete a module quality survey at the end of each of the six modules (Hinchliffe & Jones, 2022k). This survey asked learners to rate their agreement with four statements about how the module’s activities, design, lecture, and materials affected their learning; it also contained one short essay question for general feedback. Learners completed 229 module quality surveys. We used these surveys to make iterative improvements to discrete parts of the course as necessary. See Table 6 for completion rates across cohorts for each module survey.

Finally, we gathered data from the course activities by exporting the cohort gradebooks into CSVs and exporting rubric scores using a Tampermonkey script for the Firefox web browser (University of Colorado Boulder, 2022), which created analyzable CSVs.
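As an illustration of this data-gathering step, here is a minimal sketch of combining per-cohort gradebook exports with rubric-score exports using pandas. The file paths, column names, and the shared learner_id join key are assumptions for illustration; the actual Canvas export columns and the userscript’s output format may differ.

```python
import glob
import pandas as pd

# Hypothetical paths and column names; real Canvas gradebook exports and the
# userscript's rubric CSVs may label columns differently.
gradebook_frames = []
for path in glob.glob("exports/gradebook_*.csv"):
    frame = pd.read_csv(path)
    frame["source_file"] = path  # track which cohort export each row came from
    gradebook_frames.append(frame)
gradebooks = pd.concat(gradebook_frames, ignore_index=True)

rubric_scores = pd.concat(
    (pd.read_csv(path) for path in glob.glob("exports/rubric_scores_*.csv")),
    ignore_index=True,
)

# Join rubric criterion scores to gradebook rows on a shared learner identifier.
combined = gradebooks.merge(rubric_scores, on="learner_id", how="left")
combined.to_csv("exports/combined_activity_data.csv", index=False)
```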

We also collected qualitative data by interviewing 25 learners who completed a course in either the fall 2021 or spring 2022 cohorts.

Table 6

Module Quality Survey Completion

| Cohorts | Module 1 (n) | Module 2 (n) | Module 3 (n) | Module 4 (n) | Module 5 (n) | Module 6 (n) | Total (N) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 2021, Fall | 23 | 16 | 14 | 8 | 9 | 10 | 80 |
| 2022, Spring | 22 | 28 | 16 | 11 | 9 | 14 | 100 |
| 2022, Fall | 6 | 13 | 8 | 8 | 4 | 10 | 49 |

The interviews were scheduled six to twelve months after each cohort’s course ended to allow learners time to act on and reflect upon course material in their professional lives. We solicited interview participants via a Qualtrics contact list and email message (Hinchliffe & Jones, 2022e); learners indicated their willingness to participate and their preferences for interview scheduling by completing a brief survey (Hinchliffe & Jones, 2022g). We sent one reminder message to targeted learners asking them to indicate their participation preference (Hinchliffe & Jones, 2022c). Learners who participated in interviews received an electronic $20 gift code via email for use at Amazon.com (Hinchliffe & Jones, 2022d). Co-author Jones conducted and recorded all interviews via the web-conferencing application Zoom using a semi-structured interview protocol (Hinchliffe & Jones, 2022f). Interviews averaged 30 minutes in duration. We sent recorded audio to AutomaticSync for professional, confidential transcription for analysis.

Data Analysis Procedures

Given the multiplicity of research instruments and the inclusion of quantitative and qualitative data, we used a variety of data analysis procedures. We initially analyzed survey data using descriptive statistics to summarize findings and examine notable changes in means. Next, we worked with the Indiana Statistical Consulting Center at Indiana University-Bloomington (2023) to explore more advanced parametric and non-parametric statistics. We analyzed the pre- and post-course knowledge, skills, and abilities assessment surveys for each cohort to test for significant differences, using tests appropriate to the data. For the 2021, Fall cohort, the data passed a Shapiro-Wilk normality test; for this cohort we ran a paired t-test (also known as the paired samples or dependent samples t-test). Neither the 2022, Spring nor the 2022, Fall cohort passed the Shapiro-Wilk normality test, so we instead ran a paired Wilcoxon signed-rank test using the Bonferroni method to adjust the alpha (p < .05). This non-parametric test compares two related or paired samples and is often used as an alternative to the paired t-test when the assumption of normality is violated or when the sample size is small.
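As an illustration of this test-selection logic, here is a minimal sketch in Python using SciPy. It assumes paired Likert responses for a single survey item and a hypothetical number of simultaneous comparisons; it is not the authors’ actual analysis script.

```python
import numpy as np
from scipy import stats

def pre_post_test(pre, post, n_comparisons, alpha=0.05):
    """Minimal sketch of the test-selection logic described above.

    Checks the normality of the paired differences with a Shapiro-Wilk test,
    then runs a paired t-test if normality holds or a Wilcoxon signed-rank
    test if it does not, comparing the p-value against a Bonferroni-adjusted
    alpha for the number of items tested.
    """
    pre = np.asarray(pre, dtype=float)
    post = np.asarray(post, dtype=float)
    diffs = post - pre

    if stats.shapiro(diffs).pvalue > alpha:
        result = stats.ttest_rel(post, pre)   # paired samples t-test
        test_name = "paired t-test"
    else:
        result = stats.wilcoxon(post, pre)    # Wilcoxon signed-rank test
        test_name = "Wilcoxon signed-rank"

    adjusted_alpha = alpha / n_comparisons    # Bonferroni adjustment
    return {
        "test": test_name,
        "statistic": result.statistic,
        "p_value": result.pvalue,
        "significant": result.pvalue < adjusted_alpha,
    }

# Illustrative, made-up Likert responses for one survey item:
pre_scores = [2, 3, 2, 3, 2, 2, 3, 2, 3, 2, 3, 2]
post_scores = [4, 4, 3, 4, 4, 3, 4, 3, 4, 4, 4, 3]
print(pre_post_test(pre_scores, post_scores, n_comparisons=11))
```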

For qualitative data, we imported the transcripts into MAXQDA qualitative data analysis software to support our coding procedures. We first coded the transcripts in relation to the interview protocol question numbers (e.g., 1.1., 1.2., 2.1., 3.1.), which enabled us to focus on specific questions and their answers across transcripts. Next, we thematically coded participant answers within questions. Where connections between questions emerged, we noted them. We used memoing strategies and MAXQDA code analysis tools to ideate and confirm themes (Charmaz, 2014).

Table 7

End-of-Course Evaluation Questions on Self-Reported Knowledge Gains and Knowledge Use

| Questions | 2021, Fall Average^a | 2022, Spring Average^a | 2022, Fall Average^a | Combined Average^a |
| --- | --- | --- | --- | --- |
| I know significantly more about this subject than I did before I took this course | 4.7 | 4.7 | 4.6 | 4.7 |
| I will use the knowledge/skills gained in this course in my profession | 4.5 | 4.6 | 4.6 | 4.6 |

^a Answer options for each question used the following ranked Likert scale: Strongly disagree (1); Disagree (2); Neither agree nor disagree (3); Agree (4); Strongly agree (5).

Findings

Assessment of Learning

Analysis of course evaluation data and pre- and post-course knowledge, skills, and abilities assessment surveys within and between cohorts indicates strong self-reported learning gains. Ninety-nine percent of respondents across cohorts (N = 93) agreed or strongly agreed that they know significantly more about this subject than they did before they took the course, and 100% of respondents across cohorts (N = 93) agreed or strongly agreed that they will use the knowledge/skills gained in the course in their profession (see Table 7).

The statistical analysis of the pre- and post-course knowledge, skills, and abilities assessment surveys across cohorts revealed 11 questions where mean differences were statistically significant and indicated positive learning gains. Table 8 contains results for questions that specifically address knowledge, skills, and abilities. From this table we see that knowledge concerning data ethics, learning analytics, and the ethics of learning analytics consistently increased across cohorts, with the last increasing the most, by more than a full point on the scale. Learners also expressed an ability to put their new knowledge into action. Prior to the course, learners were neutral regarding whether they were knowledgeable enough to address ethical issues associated with learning analytics; after the course, they indicated greater agreement that they possessed usable knowledge. Further, they indicated that their course training had prepared them to address privacy and related ethical issues associated with learning analytics, and that they felt confident representing library perspectives on a campus learning analytics committee.

Table 8

Statistically Significant Results from Pre- and Post-Course Knowledge, Skills, and Abilities Surveys: General Questions

| ID | Question | Average Measures | 2021, Fall | 2022, Spring | 2022, Fall | Combined |
| --- | --- | --- | --- | --- | --- | --- |
| 5 | How would you rate your knowledge of data ethics?^a | Pre-KSA Average | 2.61 | 2.81 | 2.68 | 2.73 |
|  |  | Post-KSA Average | 3.22 | 3.40 | 3.40 | 3.36 |
|  |  | Average Change | 0.61 | 0.60 | 0.72 | 0.63 |
| 11 | How would you rate your knowledge of learning analytics?^a | Pre-KSA Average | 2.48 | 2.56 | 2.68 | 2.57 |
|  |  | Post-KSA Average | 3.48 | 3.37 | 3.20 | 3.35 |
|  |  | Average Change | 1.00 | 0.81 | 0.52 | 0.78 |
| 14 | How would you rate your knowledge of learning analytics ethics?^a | Pre-KSA Average | 2.30 | 2.33 | 2.36 | 2.33 |
|  |  | Post-KSA Average | 3.48 | 3.54 | 3.52 | 3.52 |
|  |  | Average Change | 1.17 | 1.21 | 1.16 | 1.19 |
| 15 | To what extent do you agree with this statement: I feel knowledgeable enough to address ethical issues associated with learning analytics.^b | Pre-KSA Average | 2.91 | 2.85 | 2.32 | 2.73 |
|  |  | Post-KSA Average | 4.13 | 4.21 | 4.16 | 4.18 |
|  |  | Average Change | 1.22 | 1.37 | 1.84 | 1.45 |
| 17 | To what extent do you agree with this statement: My library ethics training has prepared me to address privacy and related ethical issues associated with learning analytics.^b | Pre-KSA Average | 2.74 | 2.73 | 2.48 | 2.67 |
|  |  | Post-KSA Average | 4.26 | 4.08 | 4.16 | 4.14 |
|  |  | Average Change | 1.52 | 1.35 | 1.68 | 1.47 |
| 18 | How confident would you be in your ability to represent library perspectives on a campus learning analytics committee?^c | Pre-KSA Average | 2.61 | 2.50 | 2.08 | 2.42 |
|  |  | Post-KSA Average | 3.70 | 3.71 | 3.64 | 3.69 |
|  |  | Average Change | 1.09 | 1.21 | 1.56 | 1.27 |
|  |  | Total Average Change | 1.10 | 1.09 | 1.25 | 1.13 |
|  |  | Minimum Average Change | 0.61 | 0.60 | 0.52 | 0.63 |
|  |  | Maximum Average Change | 1.52 | 1.37 | 1.84 | 1.47 |

^a Answer options for this question used the following ranked Likert scale: Not knowledgeable at all (1); Slightly knowledgeable (2); Moderately knowledgeable (3); Very knowledgeable (4); Extremely knowledgeable (5).

^b Answer options for this question used the following ranked Likert scale: Strongly disagree (1); Somewhat disagree (2); Neither agree nor disagree (3); Somewhat agree (4); Strongly agree (5).

^c Answer options for this question used the following ranked Likert scale: Not confident at all (1); Slightly confident (2); Moderately confident (3); Very confident (4); Extremely confident (5).

Table 9 contains results for questions that concern the course’s learning outcomes. Across the outcomes, we see nearly a 1.5-point change on the scale measuring learners’ self-rated ability to perform each stated outcome, moving from near “slightly capable” (a pre-KSA average of 2.3 across outcomes) toward “very capable” (a post-KSA average of 3.8 across outcomes). We see the largest average increase (1.7) in the learning outcome asking learners about their ability to develop a learning plan for continuing professional development regarding learning analytics, information privacy, and ethical practice.

Table 9

Statistically Significant Results from Pre- and Post-Course Knowledge, Skills, and Abilities Surveys: Learning Outcomes

| ID | Question/Learning Outcome^a,b | Average Measures | 2021, Fall | 2022, Spring | 2022, Fall | Combined |
| --- | --- | --- | --- | --- | --- | --- |
| 23 | Describe the social, political, and technological elements of learning analytics in higher education, generally, and academic libraries, specifically. | Pre-KSA Average | 2.4 | 2.4 | 2.4 | 2.4 |
|  |  | Post-KSA Average | 3.9 | 3.6 | 3.8 | 3.8 |
|  |  | Average Change | 1.5 | 1.2 | 1.4 | 1.4 |
| 24 | Distinguish between theoretical aspects of information privacy and their connection to learning analytics. | Pre-KSA Average | 2.4 | 2.3 | 2.3 | 2.3 |
|  |  | Post-KSA Average | 3.8 | 3.7 | 3.6 | 3.7 |
|  |  | Average Change | 1.4 | 1.4 | 1.3 | 1.4 |
| 25 | Critique existing learning analytics principles, policies, practices, and recommendations and the ways in which they may create privacy harms. | Pre-KSA Average | 2.3 | 2.4 | 2.5 | 2.4 |
|  |  | Post-KSA Average | 4.0 | 4.0 | 3.8 | 3.9 |
|  |  | Average Change | 1.7 | 1.6 | 1.4 | 1.5 |
| 26 | Adjust a learning analytics practice to strategically minimize privacy harms and maximize specific benefits. | Pre-KSA Average | 2.1 | 2.2 | 2.2 | 2.1 |
|  |  | Post-KSA Average | 3.4 | 3.6 | 3.6 | 3.5 |
|  |  | Average Change | 1.3 | 1.4 | 1.4 | 1.4 |
| 27 | Plan for ethical and evidence-based library learning analytics projects that are based in privacy by design. | Pre-KSA Average | 2.1 | 2.2 | 2.3 | 2.2 |
|  |  | Post-KSA Average | 3.7 | 3.8 | 3.9 | 3.8 |
|  |  | Average Change | 1.6 | 1.6 | 1.6 | 1.6 |
| 28 | Develop a learning plan for continuing professional development regarding learning analytics, information privacy, and ethical practice. | Pre-KSA Average | 2.4 | 2.4 | 2.6 | 2.4 |
|  |  | Post-KSA Average | 4.1 | 4.4 | 4.0 | 4.2 |
|  |  | Average Change | 1.8 | 2.0 | 1.4 | 1.7 |
|  |  | Total Average Change | 1.5 | 1.5 | 1.4 | 1.5 |
|  |  | Minimum Average Change | 1.3 | 1.2 | 1.3 | 1.4 |
|  |  | Maximum Average Change | 1.8 | 2.0 | 1.6 | 1.7 |

^a The stem for learning outcomes questions was: “How would you rate your ability to…”

^b Answer options for each question used the following ranked Likert scale: Not capable at all (1); Slightly capable (2); Moderately capable (3); Very capable (4); Extremely capable (5).

Evaluation of Course Design

End-of-course evaluation surveys contained two sections, one focused on course design and one focused on instructor effectiveness. The former contained nine Likert scale questions and two open-ended questions, while the latter contained three Likert scale questions and two more open-ended questions. Table 10 contains average scores within and between cohorts regarding course design. With one exception for the 2021, Fall cohort, scores for all questions were at or above four (“Agree”). The consistent, positive rating indicates a stable, successful course design as perceived by the learners.

Table 10

End-of-Course Evaluation Questions on Course Design

| ID | Questions | 2021, Fall Average^a | 2022, Spring Average^a | 2022, Fall Average^a | Combined Average^a |
| --- | --- | --- | --- | --- | --- |
| 1 | The course description accurately reflected the content of the course | 4.4 | 4.6 | 4.4 | 4.5 |
| 2 | Course goals and objectives are clearly specified | 4.2 | 4.7 | 4.5 | 4.6 |
| 3 | The structure for this course is easy to understand and follow | 3.6 | 4.5 | 4.5 | 4.3 |
| 4 | Course materials (required readings, supplemental readings) are accessible, appropriate, and helpful | 4.5 | 4.6 | 4.6 | 4.6 |
| 5 | Course lectures are accessible, appropriate, and helpful | 4.8 | 4.7 | 4.5 | 4.7 |
| 6 | Course activities (discussions, Privacy Sourcebook, virtual symposium) are accessible, appropriate, and helpful | 4.3 | 4.5 | 4.3 | 4.4 |
| 7 | I knew what was expected of me in this course | 4.0 | 4.3 | 4.6 | 4.3 |

^a Answer options for each question used the following ranked Likert scale: Strongly disagree (1); Disagree (2); Neither agree nor disagree (3); Agree (4); Strongly agree (5).

Table 11

End-of-Course Evaluation Questions on Instructor Effectiveness

| ID | Questions | 2021, Fall Average^a | 2022, Spring Average^a | 2022, Fall Average^a | Combined Average^a |
| --- | --- | --- | --- | --- | --- |
| 8 | The instructors explained concepts effectively | 4.5 | 4.6 | 4.4 | 4.5 |
| 9 | The instructors foster an encouraging atmosphere for learning | 4.4 | 4.4 | 4.6 | 4.5 |
| 10 | The instructors let me feel free to ask questions | 4.4 | 4.5 | 4.5 | 4.5 |

^a Answer options for each question used the following ranked Likert scale: Strongly disagree (1); Disagree (2); Neither agree nor disagree (3); Agree (4); Strongly agree (5).

Table 11 contains average scores within and between cohorts regarding instructor effectiveness. Scoring slightly higher than the course design questions, responses for this section were at or above 4.4 (between “Agree” and “Strongly agree”) across all cohorts, averaging 4.5. Again, the consistent, positive rating indicates that learners believed the instruction was successful and that it was consistently delivered across cohorts.

Post-Course Impact

Interview participants reflected on how the course’s learning experiences helped them examine their own professional ethics vis-à-vis library learning analytics and other student-focused analytic practices at their institution. Individuals reconsidered what one participant called “librarian sensibilities” and another labeled “knee-jerk reactions”: professional dispositions to be maximally privacy protecting, even at the expense of potentially useful data collection and analysis. The course helped one participant “rethink and reframe” their philosophy of student privacy and, for another participant, enabled them to “build my thoughts around privacy and […] think a little more critically.” Notably, rethinking and reframing their professional ethics cut two ways. First, participants used course learning experiences to flesh out a more nuanced view of student privacy. Second, they weighed expanded data collection and analysis activities against concepts such as beneficence, transparency, consent, and autonomy, reconsidering those activities to better align them with their newly formed understanding of student privacy boundaries and expectations.

When asked to describe their most impactful learning experience in the course, interviewees largely pointed to the Privacy Sourcebook. We wove the Sourcebook’s five activities throughout most of the course and it was a keystone learning assignment, so it is not surprising that learners would point to it. What was unexpected, however, was how learners continued to use it in support of their professional practice after the course concluded. The Sourcebook was an effective learning experience because its structured approach scaffolded information seeking about institutional practices and stakeholders, while enabling learners to build a personal understanding of student privacy that could help them engage in campus activities. For course participants, such as this interviewee, it “elucidates a lot of blind spots that I had in terms of what’s happening [or not] on my campus.” The Sourcebook also guides participants in asking key questions, like: “What do we do here? Do we have anything like this? Or who’s responsible for this? […] Who would I go to for this kind of thing or that kind of thing?” In that sense, it was highly practical. Interviewees stated that it helped them develop useful knowledge and build confidence to be active participants on committees and to hold dialogues with their peers. The Sourcebook was also a resource: interviewees reported that they would return to it from time to time to review their notes and reflect on what they learned in the course, even using it to jump-start conversations with their peers. Others saw opportunities to use the Sourcebook in part or in whole with library peers or on committees to guide campus conversations about learning analytics and student privacy—though none reported having done so yet.

Analysis of conversations with interviewees suggested that two major gaps still existed in their knowledge concerning learning analytics and student privacy: the practice of learning analytics and mapping information flows. After participating in the course, interviewees reported that learning about the ethics of learning analytics filled a significant gap in their knowledge and prepared them for working with learning analytics, but that they did not know how to take the next step: “So I think the gap,” stated an interviewee, “is now trying to think about the application and what data points should be gathered or could be gathered easily that isn’t going to be as perhaps intrusive for the students in order to move things forward. So, a little bit more of that nuts and bolts [about] how can I start applying something.” To move forward with learning analytics, interviewees indicated that they still needed to do more institutional research about 1) what data access points exist, 2) who manages specific sets of data or data flows, and 3) what institutional policies are in place—if any—to gain access to that data and under what conditions it can be used. But as one participant stated, even though these gaps exist, they now know “the language” to use in more informed conversations with others on campus and to engage in learning analytics in an ethical, informed way.

Discussion

Reflecting on the Course’s Impact

Across the three cohorts, the findings suggest that the course had a positive impact on the professional development of the learners who completed it. Self-reported learning gains were significant. Learners also consistently reported that the course design and instruction were effective. Considered in isolation, these findings lead us to conclude that the course was a success. We argue, however, that what is more important is the reported action that resulted from what was learned. Participants self-reported that they are better prepared to address privacy and related ethical issues associated with learning analytics and that they have the confidence to represent library perspectives regarding learning analytics. The interviews support these beliefs: participants described engaging in conversations, bringing campus actors together, and taking action to develop learning analytics practices while prioritizing privacy. Those who act on the knowledge gained in the course are working to improve the learning environment for students while also considering their privacy needs and committing to ethical practice.

Could the course be improved? Even though we see markers of success in the data, our reflections have highlighted some problematic areas. Online courses often suffer from problems related to learner engagement, persistence, and completion. This is an issue for both online higher education programs (see Hart, 2012; Rovai, 2003; Yang et al., 2017) and adult learners in professional development programs (see Wuebker, 2013). Completion rates across our three cohorts ranged from ~61% to ~72%, leading us to ask what we could have done in our course design or instruction to improve learner persistence to completion. One reason for this lower-than-ideal rate of completion could be that no professional development credits or credentials were offered; learners engaged with the material simply because they were interested. Should the course be replicated in the future, aligning it with an institution or organization that can grant such credits or credentials could be beneficial.

Replicating the Course

The course is completely replicable. As mentioned previously, our research repository (Hinchliffe & Jones, 2022n) contains all learning objects, including the course as an IMSCC file, which can be imported into our LMS of choice—Canvas—or many other major LMSs. Those who wish to replicate the course can pick and choose learning objects to meet their pedagogical needs. Alternatively, individuals can run the course in whole with minor modifications to course logistics (e.g., due dates), grading needs (e.g., changes to rubrics), and instructional responsibility (i.e., making it clear who is in charge of running the course, contact information, etc.). We have documented many of these considerations and other helpful information in The Prioritizing Privacy Course Instructor Handbook (Hinchliffe & Jones, 2023), a 28-page document created to give future instructors insight into our instructional design strategies and support for running the course. Reproducing the course is limited only by the license, which states that it cannot be used for commercial purposes.

Extending the Course

There are several opportunities for extending the Prioritizing Privacy course for different types of learners. Even though the course was not created with LIS graduate students in mind, it can fill some gaps in LIS curricula that Jones and Hinchliffe (2022) and Huang et al. (2021) identified. Course materials are well suited to specific courses; for instance:

  • learning objects focused on understanding learning analytics practices in the context of higher education could inform courses on academic librarianship services and management or assessment and evaluation;
  • learning objects focused on theoretical and practical aspects of information privacy could support learning experiences in an information policy course;
  • and learning objects focused on critical approaches to learning analytics could aid instruction in information ethics or critical data/algorithm studies courses.

As a six-module course developed to span six weeks, we do not foresee the course being used as-is in a traditional spring/fall semester-long format. However, with modification, the course could be a strong addition to January terms (J-term) or summer semesters as a one or two credit course.

The course is most easily replicable as a professional development experience for the original target audience: practitioners. There are two ways it could be successfully run. First, academic library consortia or professional organizations at the state or national level could replicate the course with facilitators. Only minor changes to the course would be needed, and opportunities exist within the course to contextualize it to an organization’s needs or interests. Second, individual academic libraries could run the course as a professional development exercise. In this case, it could also be useful to facilitate the course so that it includes faculty and staff from the library’s wider university, though the course’s emphasis on library issues may need to be reduced. Librarian-led facilitation of the course in this way may have the added political benefit of demonstrating to university actors that librarians are leaders in the learning analytics space.

Finally, the course can be—and has been—distilled into a workshop-style experience. From May 2022 to September 2023, we conducted 24 workshops for academic libraries and consortia across the United States by selecting materials from the course for instruction and using pieces from the Privacy Sourcebook to guide individual and group-based activities. These half-day workshops introduced practitioners to the major foci of the course without requiring participants to commit to a multi-week learning experience. Like the course, all workshop materials are available for reuse (Hinchliffe & Jones, 2022q).

Conclusion

With learning analytics continuing to expand in use across higher education institutions, it is imperative that library practitioners engage with campus efforts in planning for and implementing learning analytics. If the design of such efforts is to be informed by the ethical concerns librarians have about learning analytics, librarians must find ways to participate in the shared governance and policymaking processes that create the parameters for these programs. Avoidance and disengagement will fail to bring library values of privacy, user control, etc., to the forefront and may leave the library mandated to collect and report data in ways that are professionally problematic or ethically suspect. Professional development opportunities are critical to ensuring that library practitioners have the necessary knowledge, skills, and strategies. The openly available, field-tested curriculum made available through the Prioritizing Privacy course can serve the profession as the basis for ongoing education in this realm.

Funding Statement

This project was made possible in part by the Institute of Museum and Library Services (RE-18-19-0014-19). The views, findings, conclusions or recommendations expressed in this paper do not necessarily represent those of the Institute of Museum and Library Services. More information is available at https://osf.io/mfczs/.

Acknowledgments

We thank Bowen Jiang from the Indiana Statistical Consulting Center at Indiana University-Bloomington. We also appreciate the time that learners in the Prioritizing Privacy course gave to participate in our research.

References

Anderson, L., Krathwohl, D., Airasian, P., Cruikshank, K., Mayer, R., Pintrich, P., Raths, J., & Wittrock, M. (2000). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives (1st edition). Pearson.

Biggs, J. (2014). Constructive alignment in university teaching. HERDSA Review of Higher Education, 1, 5–22. https://www.herdsa.org.au/herdsa-review-higher-education-vol-1/5-22

Briney, K. A. (2019). Data management practices in academic library learning analytics: A critical review. Journal of Librarianship and Scholarly Communication, 7(1), 1–39. https://doi.org/10.7710/2162-3309.2268

Burke, L. (2020, February 21). Facial recognition surveillance on campus. Inside Higher Ed. https://www.insidehighered.com/news/2020/02/21/ucla-drops-plan-use-facial-recognition-security-surveillance-other-colleges-may-be

Charmaz, K. (2014). Constructing grounded theory (2nd ed.). SAGE Publications.

Clarke, R. (1988). Information technology and dataveillance. Communications of the ACM, 31(5), 498–512. https://doi.org/10.1145/42411.42413

Currier, C. (2021). Unresolved privacy and ethics issues related to learning analytics in higher education and academic librarianship. Emerging Library & Information Perspectives, 4(1), 117–142. https://doi.org/10.5206/elip.v4i1.13463

D’Agostino, S. (2024, February 27). Facial recognition heads to class. Will students benefit? Inside Higher Ed. https://www.insidehighered.com/news/tech-innovation/teaching-learning/2024/02/27/facial-recognition-heads-class-will-students

Doty, P. (2020). Library analytics as moral dilemmas for academic librarians. The Journal of Academic Librarianship, 46(4), 1–5. https://doi.org/10.1016/j.acalib.2020.102141

Flaherty, C. (n.d.). No More Proctorio. Inside Higher Ed. Retrieved March 11, 2024, from https://www.insidehighered.com/news/2021/02/01/u-illinois-says-goodbye-proctorio

Flaherty, C. (2020, May 11). Big proctor. Inside Higher Ed. https://www.insidehighered.com/news/2020/05/11/online-proctoring-surging-during-covid-19

Flierl, M., Quigley, B., Caswell, T., Costello, L., Li, C., Maher, M., Ness, C., Piorun, M., Prud’homme, P.-A. (Max), Van Diest, K., Walker, G., Wang, M., & Yang, A. (2023). 2023 ACRL environmental scan. http://deepblue.lib.umich.edu/handle/2027.42/175964

Hart, C. (2012). Factors associated with student persistence in an online program of study: A review of the literature. Journal of Interactive Online Learning, 11(1), 19–42. https://eric.ed.gov/?id=EJ976760

Hartman-Caverly, S. (2019). Human nature is not a machine: On liberty, attention engineering, and learning analytics. Library Trends, 68(1), 24–53. https://muse.jhu.edu/pub/1/article/736893

Hinchliffe, L. J. (2022). University of Illinois Urbana-Champaign IRB application. https://osf.io/4r2ex

Hinchliffe, L. J., & Jones, K. M. L. (2022a). Course evaluation survey. https://osf.io/mx5bt

Hinchliffe, L. J., & Jones, K. M. L. (2022b). Interest survey. https://osf.io/jznhr

Hinchliffe, L. J., & Jones, K. M. L. (2022c). Interview follow-up recruitment request. https://osf.io/mq5wu

Hinchliffe, L. J., & Jones, K. M. L. (2022d). Interview incentive message. https://osf.io/zafhq

Hinchliffe, L. J., & Jones, K. M. L. (2022e). Interview initial recruitment request. https://osf.io/6vcxb

Hinchliffe, L. J., & Jones, K. M. L. (2022f). Interview protocol. https://osf.io/c6dqx

Hinchliffe, L. J., & Jones, K. M. L. (2022g). Interview recruitment survey. https://osf.io/j65y2

Hinchliffe, L. J., & Jones, K. M. L. (2022h). Invitation to enroll. https://osf.io/ur4yf

Hinchliffe, L. J., & Jones, K. M. L. (2022i). Listserv distribution. https://osf.io/9y2vt

Hinchliffe, L. J., & Jones, K. M. L. (2022j). Listserv message. https://osf.io/qbr5p

Hinchliffe, L. J., & Jones, K. M. L. (2022k). Module quality survey. https://osf.io/359td

Hinchliffe, L. J., & Jones, K. M. L. (2022l). Post-course knowledge, skills, and abilities (KSA) survey. https://osf.io/qtkxw

Hinchliffe, L. J., & Jones, K. M. L. (2022m). Pre-course knowledge, skills, and abilities (KSA) survey. https://osf.io/35mvq

Hinchliffe, L. J., & Jones, K. M. L. (2022n). Prioritizing privacy Canvas course site. https://osf.io/auywk/

Hinchliffe, L. J., & Jones, K. M. L. (2022o). @priorityprivacy. Twitter. https://twitter.com/priorityprivacy

Hinchliffe, L. J., & Jones, K. M. L. (2022p). Privacy sourcebook (v2—Spring 2022). https://osf.io/wzxks

Hinchliffe, L. J., & Jones, K. M. L. (2022q). Workshop materials. https://osf.io/mytbj

Hinchliffe, L. J., & Jones, K. M. L. (2023). The prioritizing privacy course instructor handbook. https://osf.io/hwdn7

How license plate readers are helping University Police solve crimes. (2023, February 1). University of Illinois Urbana-Champaign Division of Public Safety. https://police.illinois.edu/how-license-plate-readers-are-helping-university-police-solve-crimes/

Huang, C., Samek, T., & Shiri, A. (2021). AI and ethics: Ethical and educational perspectives for LIS. Journal of Education for Library and Information Science. https://doi.org/10.3138/jelis-62-4-2020-0106

Indiana Statistical Consulting Center. (2023). Indiana University-Bloomington. https://iscc.indiana.edu/index.html

Innovative license plate reader technology now in use on CU Boulder campus. (2023, July 17). CU Boulder Today. https://www.colorado.edu/today/2023/07/17/innovative-license-plate-reader-technology-now-use-cu-boulder-campus

Instructure. (2020, July 20). How do I sign up for a Canvas account with a join code or secret URL as a student? https://community.canvaslms.com/t5/Student-Guide/How-do-I-sign-up-for-a-Canvas-account-with-a-join-code-or-secret/ta-p/437

Iowa State University Center for Excellence in Learning and Teaching. (2022). Revised Bloom’s taxonomy. https://www.celt.iastate.edu/instructional-strategies/effective-teaching-practices/revised-blooms-taxonomy/

Jones, K. M. L. (2019). “Just because you can doesn’t mean you should”: Practitioner perceptions of learning analytics ethics. Portal: Libraries and the Academy, 19(3), 407–428. https://muse.jhu.edu/article/729196

Jones, K. M. L. (2022a). The datafied student: Why students’ data privacy matters and the responsibility to protect it. Future of Privacy Forum. https://studentprivacycompass.org/resource/the-datafied-student-why-students-data-privacy-matters-and-the-responsibility-to-protect-it/

Jones, K. M. L. (2022b). Indiana University IRB application. https://osf.io/bn92c

Jones, K. M. L., & Hinchliffe, L. J. (2022). Ethical issues and learning analytics: Are academic library practitioners prepared? The Journal of Academic Librarianship, 102621. https://doi.org/10.1016/j.acalib.2022.102621

Kumar, P. C. (2023). Orienting privacy literacy toward social change. Information and Learning Sciences, ahead-of-print (ahead-of-print). https://doi.org/10.1108/ILS-06-2023-0061

Murillo, A. P., & Jones, K. M. L. (2020). A “just-in-time” pragmatic approach to creating Quality Matters-informed online courses. Information and Learning Sciences, 121(5/6), 365–380. https://doi.org/10.1108/ILS-04-2020-0087

Nichols, S. (2023, October 10). University to install license plate readers to improve campus safety. The Daily Tar Heel. https://www.dailytarheel.com/article/2023/10/university-new-license-plate-readers

Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.

Oliphant, T., & Brundin, M. R. (2019). Conflicting values: An exploration of the tensions between learning analytics and academic librarianship. Library Trends, 68(1), 5–23. https://muse.jhu.edu/pub/1/article/736892

Quality Matters. (2022). Higher ed course design rubric standards. https://www.qualitymatters.org/qa-resources/rubric-standards/higher-ed-rubric

Rovai, A. P. (2003). In search of higher persistence rates in distance education online programs. The Internet and Higher Education, 6(1), 1–16. https://doi.org/10.1016/S1096-7516(02)00158-6

University of Colorado Boulder. (2022, May 27). Export rubric scores. Github. https://github.com/UCBoulder/canvas-userscripts/blob/main/export_rubric_scores.user.js

van Barneveld, A., Arnold, K., & Campbell, J. (2012). Analytics in higher education: Establishing a common language. EDUCAUSE Learning Initiative. https://library.educause.edu/resources/2012/1/analytics-in-higher-education-establishing-a-common-language

Winkelmes, M.-A. (2014). Transparency in learning and teaching project. https://tilthighered.com/transparency

Wuebker, M. P. (2013). Adult learners: Improving persistence and performance in online learning environments. Journal of College Literacy & Learning, 39, 38–46.

Yang, D., Baldwin, S., & Snelson, C. (2017). Persistence factors revealed: Students’ reflections on completing a fully online program. Distance Education, 38(1), 23–36. https://doi.org/10.1080/01587919.2017.1299561

Zimmer, M., & Tijerina, B. (2018). Library values & privacy in our national digital strategies: Field guides, convenings, and conversations. https://cipr.uwm.edu/2018/08/02/project-report-library-values-privacy/

* Kyle M. L. Jones is Associate Professor at Indiana University-Indianapolis, email: kmlj@iu.edu; Lisa Janicke Hinchliffe is Professor & Coordinator for Research Professional Development at the University of Illinois, email: ljanicke@illinois.edu. ©2025 Kyle M. L. Jones and Lisa Janicke Hinchliffe, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC.

