Monday, November 12, 2012

Reply to "Putting Learners in the Driver's Seat With Learning Analytics

Over at Online Learning Insights, Debbie Morrison gives an overview of learning analytics and writes about a worrisome announcement made by e-textbook vendor Course Smart. The company is now offering instructors data on student engagement with its textbooks. Ms. Morrison's concern is that "eyeballs on textbook" do not equate to learning:

"Yet Course Smart’s [in my opinion] program is an example of learning analytics gone awry. The ‘packaging up’ as mentioned by Ms. Clarke refers to the program Course Smart developed with data on students’ reading patterns. The program looks at how students interact with the e-textbooks, the number of times a student views a page and for how long, highlights made, etc. Course Smart compiles this ‘data’ and sends a Student Engagement Report to professors.  Are these metrics a true measure of a student’s level of engagement? "



I also find this announcement troubling, but for a reason beyond the assumptions it reveals about student engagement. The monitoring of online textbooks troubles me for the same reason that I sometimes feel uncomfortable when someone looks over my shoulder and asks, "What are you reading?" My question is, will this type of surveillance have a chilling effect on learners' intellectual freedom? Learning should include time for low-stakes "play" that is not constantly monitored and evaluated. If a learner knows that her every move in the textbook is being monitored, the stakes rise for that learner. In my opinion, it is not necessarily the monitoring of material that has actually been assigned that is troublesome -- it's the monitoring of material that was not assigned, but that the student found through serendipity or simply finds interesting. What about material that is controversial, or that resonates with a learner for personal reasons? What about the learner who is interacting with text related to a personal health issue or a relationship problem? Do we really want others to see what we have highlighted in our own books, especially if those others might learn something about us that we don't necessarily want them to know? Knowing that our reading is being monitored -- not just by an impersonal vendor like Amazon or Course Smart, but by our own instructors -- will make us more careful about what we read. It impinges on our intellectual freedom.

Thursday, July 26, 2012

Sharing data about library usage, not about subject content

I watched a fantastic presentation on the University of Minnesota's Library Data and Student Success Project.  This project analyzed library usage across 5 types of transactions: book loans, use of library e-resources, use of library computer workstations, online reference transactions, and instructional workshops. Although they had to tie the usage to individual users, the data was only reported in the aggregate. I now have an itch to reproduce their methodology at our library. 

In terms of privacy, they have an interesting graphic showing that they had to shift their practice a little outside the customary library paradigm in which no user-identified data is shared. They are sharing data about library usage, but not about content. So don't worry, Gophers: the library is only sharing the fact that you checked out books, not that you were reading Fifty Shades of Grey.
We kept this:                        But not this:
Checked out X books                  Actual book titles
Attended X workshops                 Actual workshops
Reference interaction                Substance of interaction
Logged into library workstation      Date, location, duration
Used an ejournal                     Actual ejournal title
                  (from University of Minnesota Libraries, http://blog.lib.umn.edu/ldss/2012/05/a-word-about-privacy.html)
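For librarians curious what this kind of reduction might look like in practice, here is a minimal sketch in Python, using a hypothetical record layout (this is not the University of Minnesota's actual pipeline): raw circulation transactions, titles attached, are collapsed into per-student checkout counts before anything leaves the library's own systems.

from collections import Counter

# Hypothetical raw circulation transactions -- student IDs and titles are invented.
# This sensitive form stays inside the library system.
transactions = [
    {"student_id": "S001", "title": "Fifty Shades of Grey"},
    {"student_id": "S001", "title": "Introduction to Statistics"},
    {"student_id": "S002", "title": "Organic Chemistry"},
    {"student_id": "S001", "title": "Learning Analytics Primer"},
]

# Reduce to the "we kept this" column: checkouts per student,
# with the "but not this" column (the titles) dropped entirely.
checkout_counts = Counter(t["student_id"] for t in transactions)

# Only these counts would be passed along for analysis.
for student_id, count in checkout_counts.items():
    print(f"{student_id}: checked out {count} books")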

Do librarians obsess about user privacy to an unnecessary extent? Perhaps. I'll share a true story from my workplace: two of my librarian colleagues approached me with concern, saying that a teaching faculty member had come to the library to ask whether a couple of his students had been there that morning. It seems they had ducked out of a required lecture, saying they had to go to the library to take care of an IT-related task, and the instructor was double-checking their story. The request made my colleagues very uncomfortable. In discussing it with them, it became clear that part of their reluctance to share this information came from their feeling that revealing whether a person had been in the library would infringe on users' privacy.

Would most librarians, with our deeply held values regarding user privacy, confidentiality, and intellectual freedom, agree that this type of user tracking is in keeping with those values? Or do our values need to shift in order to realize the potential value for students and our own institutions that can be uncovered by this type of analysis?

Tuesday, July 24, 2012

Learning analytics at Penn State

What is Penn State planning to do with learning analytics? As a faculty member at Penn State, I'm interested in this question, and I've tried to explore it over the past few months.

In April, several individuals from across Penn State's campuses virtually attended the EDUCAUSE Learning Initiative event on learning analytics.

Also in April, Chris Millet, Simon Hooper, and Bart Pursel, individuals from various educational technology groups at the University, gave a presentation at University Park. It offered the campus community a background on learning analytics and outlined current work being done in the area, which includes:
          --Early Progress Report: this is a low-level use of LA that is nonetheless helpful to students and faculty, as it automates the process of alerting faculty and academic advisers about students who are receiving a C or less in any course as of the third week of the semester. Not only is the faculty alert automated, but students also receive an email message alerting them to their own progress.
          --Examination of the relationship between blog and wiki posting in the learning management system and course GPA. Do students who are more connected and communicative with course colleagues fare better in terms of grades?
          --Simon Hooper is helping faculty design better multiple-choice tests by analyzing student performance on discrete test questions and comparing it to overall GPA and performance on other assignments involving specific learning objectives. In doing so, "bad" test questions -- those that don't discriminate well between students who have mastered a specific learning objective and those who haven't -- can be eliminated or redesigned (a rough sketch of this kind of item analysis follows this list).
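To make that last item concrete, here is a minimal sketch of one common approach, a point-biserial discrimination index: the correlation between getting an item right and a student's score on the rest of the test. This is not Penn State's actual implementation; the response data and the 0.2 flagging threshold are assumptions for the example.

import numpy as np

# Hypothetical response matrix: rows = students, columns = test items,
# 1 = correct, 0 = incorrect. In practice this would come from the LMS gradebook.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
])

def item_discrimination(responses):
    """Point-biserial correlation of each item with the rest-of-test score."""
    indices = []
    for j in range(responses.shape[1]):
        item = responses[:, j]
        # Total score excluding the item itself, to avoid inflating the correlation.
        rest = responses.sum(axis=1) - item
        if item.std() == 0 or rest.std() == 0:
            indices.append(0.0)  # no variance means the item can't discriminate
        else:
            indices.append(float(np.corrcoef(item, rest)[0, 1]))
    return indices

for j, d in enumerate(item_discrimination(responses), start=1):
    flag = "  <-- candidate for redesign" if d < 0.2 else ""
    print(f"Question {j}: discrimination = {d:.2f}{flag}")

Questions that score low or negative on this index are the "bad" questions described above.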

Then in May I met with Chris Millet from Teaching and Learning with Technology, Penn State's educational technologies group. He had just returned from LAK12, the Second Annual Conference on Learning Analytics and Knowledge, and he graciously shared with me some of what he learned there. Millet also described some of the work of a recently formed Learning Analytics group that has been charged by Penn State Provost Robert Pangborn with exploring and implementing the use of learning analytics at the University.

There is still a lot of work to be done in developing capacity in this area at the University. The choice of a new learning management system to replace ANGEL, our current LMS, will also affect the adoption of learning analytics, since many LMSs now have learning analytics components built in. Unfortunately, any University-wide implementation of learning analytics will be hampered by the College of Medicine's choice to adopt Moodle, an LMS that apparently is not being considered by groups elsewhere at the University. Furthermore, according to Millet, only about 75% of faculty across PSU have even adopted ANGEL. Will the numbers improve with a new LMS? Perhaps the 25% of faculty who have not adopted ANGEL have good pedagogical reasons for not doing so -- maybe they're using technology in other ways.

I noted with interest this quote from Chris Millet concerning data sources for LA: "The analysis of this data, coming from a variety of sources like the LMS, the library, and the student information system, helps us observe and understand learning behaviors in order to enable appropriate interventions.” Again, a mention of the library as a source of data. Yet I wonder how many librarians know about learning analytics and are currently considering how libraries might be involved.

Friday, July 20, 2012

A reaction to the ELI Brief, "Learning Analytics: Moving from Concept to Practice"

EDUCAUSE Learning Initiative has just issued a new brief, Learning Analytics: Moving from Concept to Practice. It is a synthesis of discussions at the Learning Analytics and Knowledge Conference (LAK12) and the ELI 2012 Spring Focus Session.  Here are some reactions from an academic librarian:

Learning analytics systems are built around assumptions about the variables that predict and indicate academic success of students.

At academic institutions using learning analytics, one of the most important decisions is which pieces of information about a student will predict his or her success, or indicate that he or she is succeeding. If a student's high school GPA predicts performance in the first year of college, then we need to feed that information into the system and use it in our predictive models. If the number of times a student eats in the cafeteria in week two of the semester is unrelated to academic success, then we don't need data from the cafeteria. But what if we don't yet know -- because we had no way of mining that data until now -- which variables are truly indicative or predictive of academic success? It would seem that getting as much data as possible into your system, and then mining it, would be the way to go. If library usage is correlated with academic success, then we need to put it into the system; but what if we don't really know yet whether it is correlated? Then it would seem that mining library data as part of learning analytics is the way to find out.
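To make the "mine everything, then see what matters" idea concrete, here is a minimal sketch using entirely made-up data (none of these numbers or variable names come from any real system). It simply correlates several candidate variables, including library logins, with first-year GPA to see which ones look predictive and which, like cafeteria visits, could be dropped.

import pandas as pd

# Hypothetical student records -- invented for illustration only.
# In a real project these columns would be joined from the student
# information system, the LMS, and library systems.
students = pd.DataFrame({
    "hs_gpa":           [3.9, 2.8, 3.2, 3.7, 2.5, 3.4, 2.9, 3.8],
    "library_logins":   [12,  1,   5,   9,   0,   7,   2,   15],
    "cafeteria_visits": [20,  22,  18,  25,  19,  21,  23,  20],
    "first_year_gpa":   [3.8, 2.4, 3.0, 3.5, 2.2, 3.3, 2.7, 3.9],
})

# Correlate each candidate variable with the outcome of interest.
# Variables with near-zero correlation are candidates to leave out of
# the predictive model; strongly correlated ones are worth keeping.
outcome = students["first_year_gpa"]
correlations = students.drop(columns="first_year_gpa").corrwith(outcome)
print(correlations.sort_values(ascending=False))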

Visualization tools in learning analytics make the data understandable to users of the system, including students and faculty.

However, Santos and Duval of Katholieke Universiteit Leuven report that some students said they didn't like other students being able to see their activity on the analytics dashboard, in cases where each individual student's effort is compared with that of others in the course as a benchmark. A potential intellectual chilling effect? A violation of privacy? This point connects to library values regarding user privacy and confidentiality.

Learning analytics, in the end, are only as good as the follow-up. I concur.

If institutions do not act on the information they gather from learning analytics, then it is simply surveillance and not truly related to teaching and learning. Perhaps this is where libraries can best be involved in learning analytics: helping at-risk students with learning interventions. More on this later.






Friday, March 2, 2012

#LAK12 upcoming EDUCAUSE learning analytics focus session

The announcement about the ELI 2012 Online Spring Focus Session on learning analytics arrived in my inbox yesterday. Yet again, the reference to libraries:

"The analysis of this data, coming from a variety of sources like the LMS, the library, and the student information system, helps us observe and understand learning behaviors in order to enable appropriate interventions. "

Because EDUCAUSE's focus is the management and use of, and leadership in, information resources, librarians make up a significant portion of the group's constituency. So I'm glad, in fact, that librarians are explicitly listed in the focus session announcement as a target audience.

However, since the sharing of user data (at least personally identifiable data) would seem to be against librarians' professional code of ethics, I'm still stumped as to how libraries came to be cast as possible participants in the learning analytics sphere. Don't get me wrong...I'm glad to have been asked to the party, but I'm not sure I'm going to want to dance. Some libraries are involved in analytics (for example, the Library Impact Data Project in the UK, which I learned about from commenters on this blog), and I'm curious about how their projects work while still honoring values such as intellectual freedom and user privacy.

Wednesday, February 29, 2012

#LAK12 Gamifying SNAPP

In a recent post over at David Jennings' Quickened with the Firewater blog, David asks: "What would happen if we put learners in charge of analysing their own data about their performance?"

Here's how it could happen with SNAPP, a software tool that helps online instructors analyze the communication patterns among students using the LMS's communication tools:




  • The instructor would allow communication to happen using the LMS, perhaps providing some task guidance or parameters to inspire the online discussion.


  • The network visualization (like the example from the SNAPP group) could be shared with the students, who would use it to think critically about how the communication unfolded and what they individually added to the discussion.


  • Learners could then be challenged to change the visualization by changing their own communication patterns as a group, thereby gamifying the system (a minimal network-analysis sketch follows this list).
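To prototype this kind of exercise outside of SNAPP itself, a reply network can be built directly from forum data. The sketch below uses hypothetical names and reply pairs (not SNAPP's actual code or data format) and the networkx library to construct a who-replied-to-whom graph and compute a simple centrality score that students could try to shift in the next round of discussion.

import networkx as nx

# Hypothetical forum reply data: (author_of_reply, author_replied_to).
# SNAPP extracts equivalent pairs from the LMS discussion forum;
# the names here are invented for illustration.
replies = [
    ("Ana", "Ben"), ("Ben", "Ana"), ("Cara", "Ana"),
    ("Dev", "Ana"), ("Ana", "Cara"), ("Eli", "Ben"),
]

# Build a directed graph of who replied to whom.
graph = nx.DiGraph()
graph.add_edges_from(replies)

# Centrality scores give learners a rough picture of their role in the
# discussion: a high score suggests a hub, a low score a peripheral participant.
centrality = nx.degree_centrality(graph)
for student, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{student}: {score:.2f}")

# To "change the visualization," learners would aim to shift these scores --
# for example, peripheral students replying to more classmates next time.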




Learners would gain insight into their group communication processes and play with different communication roles. This might be especially useful in a management or communication course, but such insights would be valuable to anyone, regardless of their course of study. What learners might also gain, in addition to learning about communication principles, is insight into how such systems can be manipulated, and how the systems might also be used to manipulate learners -- both worthy learning goals.


P.S. Thank you, Shane Dawson, for a fantastic presentation yesterday about SNAPP for the learning analytics massive open online course!

Wednesday, February 22, 2012

#LAK12 Word Cloud of Data Privacy & Ethics Chat


We had a lively online chat during Erik Duval's presentation on privacy and ethics in learning analytics.
Here are some details that you can see on the word cloud:
--"brother" as in Big...some of us brought up the "Big Brother" theme when the question was asked, "What do you worry about with sharing your data?"
--Lies vs. truth: is it considered lying when you present yourself as other-than-you? Is it your responsibility to present the truth about yourself online?
--transparency is a concern: who can see the data? who can see the models?
--power: who has it in learning analytics? who doesn't? Do teachers, learners, or administrators have power in the system?
--We discussed Google and Target as two corporations who are mining our data for marketing insights.
--medical: How are learning analytics issues similar to issues encountered with personally identifiable medical information?