Investigating Microsoft Insights and Learning Analytics

K. Park / May 15, 2021 / 1 comment

ISTE Standard for Educators 7

Educators understand and use data to drive their instruction and support students in achieving their learning goals. 

  • 7B – Use technology to design and implement a variety of formative and summative assessments that accommodate learner needs, provide timely feedback to students and inform instruction.
  • 7C – Use assessment data to guide progress and communicate with students, parents and education stakeholders to build student self-direction.

ISTE Standard for Coaches 6

  • 6B – Support educators to interpret qualitative and quantitative data to inform their decisions and support individual student learning.

Question 

How does the use of learning analytics, like Microsoft Insights, impact student learning and care?

Introduction

The introduction of learning analytics in the classroom is complicated. There is an aspect of care rooted in the belief that if I know what my students are doing and how they're using the system, I can see where to help them and step in to provide preemptive rather than reactive support. On the other hand, the collection and use of this data strays into surveillance, which can be problematic, particularly when the data is used in a punitive manner, as at Dartmouth.

So in this post I want to explore the use of learning analytics in the classroom and, specifically, the impact they can have on student learning and the line between care and surveillance.

Using Insights

Microsoft Insights is an app for Microsoft Teams that helps educators analyze the data collected, so they can make meaningful decisions in their classrooms. It's an optional feature available only to education tenants, not business teams.

Overall, it's easy to see the data: you can view it in a tab within a specific course, or you can open Insights across all of your courses. Some of the data you can see includes activity per day and the length of time a student spent in your course. Instructors can even sort their students from most active to least active, see what was accessed in a class, and see what meetings or activities students may have missed.

Chart showing student activity in a course.

One caveat about this data: at this time, data on channels visited is collected only from desktop devices, not mobile devices.

A concern I have with the way this data is presented is that a filter prioritizing most to least active reinforces the view that high participation equals better grades and improved learning. This data also won't show how much or how often a student participated inside an online meeting. So one must ask whether this is an accurate measurement of participation, whether more information is needed, or whether the way an instructor thinks about class participation needs to change.

You can also see how often communications take place in Teams, with the data divided into easy-to-read columns showing posts, replies, and reactions. Instructors can even apply filters to focus on specific students, channels, or time periods.

Chart in Microsoft Insights, showing what types of communication students use.

It is nice that the charts and the data presented are easy to read and understand, and Microsoft also provides use cases for how the data might be used. However, the data here is quantitative, not qualitative. You may be really pleased that someone is posting a lot, but what is the content of their posts? A student who posts rarely may write substantive contributions, while one who posts often may leave only brief interactions, and vice versa; the data can't show you that.

The concern here is that the way this information is presented makes it easier to make a snap judgment about an individual without talking to the student or learning more about how the student works and their learning environment.

Some Additional Concerns…

In general, I have concerns about learning analytics, not about any particular application but about how the data is interpreted, used, and accessed. In addition to what I highlighted above, I also worry about these issues:

  • The data tends to focus on negative behaviors
  • Students not being able to see their own data, or not knowing that their data is being collected
  • Instructors not being able to see their own data regarding how they interact with the content
  • Educators not knowing the policies and processes surrounding the use and protection of this data

This is a simplified list, and there is a lot more to cover with learning analytics; each of these concerns could be its own lengthy post or more. I make this list because these are the most urgent areas that I think need to be addressed when using learning analytics to make decisions about students.

For example, in the EDUCAUSE 2020 Student Technology Report: Supporting the Whole Student, one of the key findings was that "Course-related alerts and nudges overwhelmingly focus on the negative, and few students receive kudos or congratulations for positive achievements" (Gierdowski et al., 2020).

Bar graph showing the percentage of student respondents who received each type of alert:

  • Low scores on assessments, quizzes, or exams: 33%
  • Missing work: 32%
  • Missed classes, labs, workshops, or tutorials: 26%
  • Low scores on course assignments: 19%
  • Other: 19%
  • Lecture content or course resources not viewed: 10%
  • Discussion posts not read: 8%
  • Lack of participation: 8%
  • Not reading class announcements: 7%
  • Not logging into the learning management system (LMS): 6%
  • Deficient number of discussion board postings: 5%
  • Deficient quality of discussion board postings: 3%
  • In-class behavioral problems or issues: 3%

Chart showing the types of nudges students received.

This isn't to say that all alerts are bad, because nearly all (92%) of the students surveyed found them at least moderately useful, but students also need systems in place that celebrate good choices. When I was reading through the Insights documentation, I felt its framing encouraged readers to use the tool to look for problems and take action. While that may be helpful for some, I think educators also need to look at the data for positive behaviors and act on those too.

Lastly, the biggest concern I have is what students know and whether they can see their own data. Insights is viewable only by an instructor, and when adding it as a tab, the interface encourages instructors not to post to the channel that the tool has been added, because students won't have access. On one hand, you could see that as "care," because you don't want to distract students or make things confusing in the class. However, in my opinion, this particular decision in the name of care disempowers students because it takes that data out of their control. In the same way that companies that collect data have a responsibility to inform users and protect the data that is collected, I believe educators have a similar responsibility.

Using Data to Support Students

Regarding learning analytics and how effective they are at helping students, I have mixed feelings about whether they are helpful, because students can't see their own data in Insights. In the EDUCAUSE study, students found most nudges helpful, but the nudges skewed negative. If positive messages were added, then I think you could say the nudges are supportive and caring.

For other aspects, such as identifying students who may need additional support, there are a lot of factors that are "beyond the control of the student, caregiver or educational provider" (Toldson, 2019).

Here are some suggestions of things to look for in terms of "academic health" where the data could help (drawn from multiple sources):

  • Frequent class absences in the first 2-4 weeks
  • Chronic class tardiness or leaving class sessions early in the first 2-4 weeks
  • Missing, late or poor performance on early exams or other assignments
  • Poor grades in multiple courses
  • Lack of interaction or action to implement feedback or learning strategies that were recommended

From this list, you can see that this data won't speak to an individual's mental or emotional health, and you may not be able to see all of the data for some of these points. So while you may need to look at some data, it's only going to be one part of the picture.

Conclusion

The collection of data is here to stay, but how we use the data we collect matters. Yes, some of the data can be helpful in completing a picture or helping an instructor better understand how students interact with their online content, but it can easily be misunderstood and abused. So I find myself asking: what data is actually needed, and when is it appropriate to use it?

If you find that it's best to use the learning data, then I think you also have a duty to inform students that you're using the tool and how you use it. You may also need to discuss whether there are any relevant institutional, state, or federal policies that guide how the data can be used and who can access it.

The values that I set forth in my Digital Mission Statement remind me that I want to be an EdTech leader who focuses on people; someone who recognizes and pays attention to each individual's full personhood. I'm still working out what this means in each situation, especially as more of our lives turn into data points, but I don't want the default mode of care to be a form of surveillance, where caring means watching and tracking every mouse click or eye movement. It's my hope that the person would always come first, even as new use cases and situations arise.

Resources

Educator's guide to Insights in Microsoft Teams. (2021). Microsoft Support. https://support.microsoft.com/en-us/topic/educator-s-guide-to-insights-in-microsoft-teams-27b56255-90c0-47aa-bac3-1c9f50157181?redirectsourcepath=%252fen-us%252farticle%252factionable-analytics-with-class-insights-preview-in-teams-163add4f-997d-4a01-91de-2846fe4e99bc&ui=en-us&rs=en-us&ad=us

Gierdowski, D. C., Brooks, D. C., & Galanek, J. (2020). EDUCAUSE 2020 student technology report: Supporting the whole student. https://www.educause.edu/ecar/research-publications/student-technology-report-supporting-the-whole-student/2020/student-success

Toldson, I. A. (2019). Why it's wrong to label students 'at-risk.' The Conversation. https://theconversation.com/why-its-wrong-to-label-students-at-risk-109621

1 Comment

  1. I share your concern about students not being able to see what data about them is being shared and monitored. It’s only fair that there be transparency in letting students know what they are being judged on.