Student Data and Student Safety

by LSA Learning & Teaching Technology Consultants

It can be difficult for instructors to navigate the increasingly broad sea of data about their students. What is useful and what is fluff? What is an invasion of privacy and what is not? What should instructors be aware of, and what can they help students be more aware of?

Data that’s useful to instructors says something about the student’s learning activities. This is often also the data most useful to students themselves! If a student’s progress on knowledge-check quizzes has held steady except for one week, then it’s worth both the instructor’s and the student’s time to find out what happened that week. Was the course material not as clear as usual? Were there too many activities to complete at the usual standard? From the student’s point of view, was there something else happening in their life that week that needs some attention? From the instructor’s point of view, did other students show the same pattern, in which case that material needs to be reviewed with the whole class? Many of our instructional tools offer a couple of ways to access or visualize this kind of information, and we’ll mention those in a bit.

Unfortunately, the last twenty years or so have seen a steep increase in all sorts of platforms collecting more, and more detailed, data that has little to do with learning activities. So what should we be cautious of or avoid?

Particularly Dubious Data

A good basic principle is to avoid platforms or features that collect personal information. Regrettably, it has been demonstrated that we cannot always trust the companies behind even the most everyday, taken-for-granted of our services to keep such data secure. LSA and UofM have firm Data Protection Agreements with the vendors of the ed-tech tools we license, but that doesn’t always help when it’s the network provider who is collecting and selling data. Some examples of tools and features we do not recommend using are:

  • Any tool that uses or collects geolocation, such as auto-attendance in iClicker Cloud, which requires students to turn on location services on their phones. Location data is known to be vulnerable to unscrupulous sale, not by iClicker but by phone service providers. 
  • Remote proctoring software of any kind. All of these services rely on extremely invasive surveillance of students in what are often home environments. The automated versions are known to show persistent racial bias in their flagging algorithms. And some, such as Verificient’s Proctortrack, require local installation of what is effectively spyware on each student’s computer. Personal data collected by these services may include things like knuckle prints and pictures of legal ID, in addition to video of home environments. Data breaches exposing biometric and test data have happened in the recent past. That being so, these tools are best avoided.
  • Any “free” tool that relies on advertisements for its funding. While no ed-tech tool or platform has reached the depths of user-data sale and exposure that Facebook/Meta has, all ad-supported tools expose their users to companies that have no interest in our students’ well-being and are eager to set third-party cookies or fingerprint browsers through the ed-tech tool, collecting every bit of personal information possible. If such a tool is important in your teaching, contact LSA Technology Services for help. We will do our best to help you find an alternative, or to mitigate the risk as much as possible.

If you are looking at a new tool to use in your classes, a good way to check whether it may be doing unfortunate things with course and student data (e.g. selling data to third parties) is to look at the tool’s website. In the site footer, look for links to “Privacy” or “Terms of Service.” Those are the policy documents that should say what information the vendor collects, and who they share it with. Collaborative annotation of one of these documents, in class, can also make an excellent activity in critical reading, and help students learn where to look to evaluate other platforms they may use themselves.

Useful Data

This is not to say, however, that all student data is dangerous or unethical to have. On the contrary, many ed-tech tools have begun to focus on providing visualizations of class performance and engagement, along with tools that let instructors act usefully on that information. This is information that instructors already have access to, but a graph based on hard figures often helps instructors feel more confident in their own sense of how students are doing.

One example of this is the Engagement Insights dashboard in Harmonize, one of LSA’s discussion and collaboration tools. The LSA instructors who participated in the beta testing and development all felt that the dashboard didn’t often provide new information, but that it did confirm their gut feelings were accurate and encouraged them to take action when they sensed a student was starting to withdraw.

Another example is Canvas course analytics. These provide a reasonably non-invasive snapshot of traffic and activity in the course site, including assignment submissions and the grade distribution. Like the Harmonize dashboard, the Analytics tool simply graphs course activity information that instructors inherently have access to but may not have had time to analyze for patterns.

Data tools that help instructors pinpoint places where more than one student is having trouble are also useful. The monitor page in PlayPosit, the tool LSA licenses for instructors to make interactive videos, is an example of this. The monitor page shows how many times each student attempted each video and its quizzes, discussions, and other interactions, and how long each attempt took. Paying attention to such data helps instructors identify areas that need extra review or attention.

Consider pointing your classes to the student-facing version of this here at UofM: My Learning Analytics. In addition to helping students track their own patterns, it makes a very good place for students to learn about data analysis and start understanding how such data may be used elsewhere.

Course activity data still has some potential for misuse, of course; engagement data could reveal at least the presence of sensitive personal issues. So the second major principle to keep in mind is to be aware of who has access to this data. University-licensed tools have to pass data security reviews. Be cautious when dealing with tools that have not been vetted and licensed that way, and check their privacy policy pages.

Course Activities

Some additional activities that LSA instructors can use to encourage awareness of and dialogue about data and privacy among students include:

Compare Experience: Any course that touches on culture (language and literature, cultural studies, anthropology, history, etc.) might ask students to compare their own experience of privacy and surveillance with the experience of the cultural group and time-frame the course deals with.

Cost/Benefit Analysis: Any course that deals with data analysis (statistics, psychology, QMSS, etc.) might use this issue as a sample data set. Provide students with some research on smartphones/smartwatches and privacy issues, and then ask them to analyze the advantages and drawbacks of carrying such a device. 

Debate: Any course that deals with debate or persuasion (communication, literature, philosophy, political science) might include some readings on the history of intellectual property and ask students to debate how far their IP rights over their course-work can or should extend. You might include some educational surveillance tools such as Turnitin and ProctorU as examples.

If you’d like to discuss privacy issues related to your own class, or have questions or concerns, please feel free to request an appointment with the LSA Learning and Teaching Consultants! We will be glad to help or brainstorm ideas with you.



Asher-Schapiro, Avi. “Online exams raise concerns of racial bias in facial recognition.” Christian Science Monitor. November 17, 2020. 

Mackey, Aaron. “Forced Arbitration Thwarts Legal Challenge to AT&T’s Disclosure of Customer Location Data.” Electronic Frontier Foundation. April 14, 2021. 

Slusser, Haley. “Rutgers responds to Proctortrack security breach.” The Daily Targum. October 19, 2020.

TechRepublic Security Staff. “Facebook Data Privacy Scandal: A Cheat Sheet.” TechRepublic. July 30, 2020.

Release Date: 09/21/2023
Category: Learning & Teaching Consulting; Teaching Tips
Tags: Technology Services