Using Data to Improve Your Online Course

When enhancing an online course you’ve previously taught or developed, your experience becomes a valuable asset. You know the student population, have learned from their feedback, and likely picked up on trends from previous iterations. However, one essential resource often goes underused: the data stored within your learning management system (LMS). Hung and Zhang (2008) note that course data can reveal critical patterns, preferences, and progress in student achievement. This data offers an unbiased picture of how students engage with your course elements, making it a powerful tool for refining your course design. In this article, we’ll explore how to mine LMS data and use it to enhance the learning experience for your students.


Before You Begin

Before diving into course data, keep a few things in mind:

  1. Not all LMS features may be available to you. Your LMS might not track everything you’d like, but with a bit of creativity, you can still gather helpful insights.
  2. Correlation is not causation. For example, a drop in student engagement during a certain week may not reflect poorly on your course content. External factors—such as exams in other courses or life circumstances—could play a role. Use context to guide your analysis and avoid jumping to conclusions.

Course Evaluations

Course evaluations and student surveys provide valuable insight into students’ perceptions of the course content and teaching methods. While they may not tell the whole story, evaluations offer helpful feedback about what students feel worked and what could be improved.

How to Use Student Evaluations

  • Look for patterns: If several students report the same issue, it’s worth investigating. For example, if multiple students mention a confusing assignment or rate an aspect of the course poorly, that feedback could signal an area for revision.
  • Consider outliers: If you have a small course, outliers will more heavily impact average scores on evaluations. Consider how these results fit into the overall context of your course and identify adjustments as needed.
  • Review context: If the course is particularly challenging, students may express frustration. This feedback doesn’t necessarily mean you should lower expectations but may suggest opportunities to add resources or clarify instructions.
  • Consider both sets of data: Your institution’s survey may contain both quantitative data (scores) and qualitative feedback (written comments), and both provide value. When reviewing students’ comments, note patterns and outliers, just as you did for the quantitative data. Compare the qualitative and quantitative results together for a complete picture of the course, and consider areas of discrepancy as you rework the next iteration.
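To make the quantitative review concrete, here is a minimal sketch of flagging outliers in evaluation scores. All scores and the two-standard-deviation cutoff are hypothetical assumptions for illustration, not part of any LMS export format.

```python
from statistics import mean, stdev

# Hypothetical end-of-course evaluation scores (1-5 Likert scale) for one survey item.
scores = [5, 4, 4, 5, 1, 4, 5, 4]

avg = mean(scores)
sd = stdev(scores)

# Flag responses more than two standard deviations from the mean as outliers.
outliers = [s for s in scores if abs(s - avg) > 2 * sd]

print(f"mean={avg:.2f}, sd={sd:.2f}, outliers={outliers}")
```

In a small course, a single outlier like the 1 above pulls the mean down noticeably, which is exactly why outliers deserve contextual review rather than an automatic course revision.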

Activity Logs

Activity logs offer detailed insights into how students interact with course elements. Black, Dawson, and Priem (2008) note that the activity log files in LMSs provide one of the most promising sources of automatically gathered e-learning data. Many LMSs report which elements students clicked on, how often they visited each element, and even how long they remained on each one. This information is particularly helpful when compared with how much time you think students should spend on a given resource.

Compare student behavior with your expectations:

  • Short engagement with key materials: If students are spending only a minute on an important resource, consider breaking it into smaller chunks or adding interactive elements to keep them engaged.
  • Excessive time spent on simple tasks: This could indicate that instructions are unclear or the layout is confusing.

Use the strategies below to improve specific behaviors:

Students not clicking on a resource

  • Revise your module introduction.
  • Mention the resource in your assessment directions.
  • Remind yourself to direct students to the resource in discussion forums, LMS chats, etc.

Students spending too much time on a resource

  • Ensure that the resource is cognitively appropriate for your students.
  • Rewrite navigation instructions to ensure clarity.

Students spending too little time on a resource

  • Ensure the module introduction and relevant assessments mention the importance of the resource.
  • Intentionally foster value and expectancy in the resource itself.

Students spending too much time on an assessment

  • Ensure that the assessment is cognitively appropriate for your students.
  • Ensure that the assessment aligns with your learning objectives.
  • Ensure that the assessment has clear instructions.

Students spending too little time on an assessment

  • Ensure that the assessment is not too simple for your students.
  • Ensure students see the alignment of the assessment to learning objectives.
  • Ensure that the assessment (and module) fosters value and expectancy.

Students not posting to forums

  • Include an icebreaker or other tool at the beginning of your course to help build an online community.
  • Revise your module introductions to ensure students know that forums play an integral part in the learning process.
  • Highlight discussion forum requirements.

Students posting too often to forums

  • Ensure that students’ comments are related to the discussion forum topic.
  • Remind students of the discussion forum and netiquette rules in your course (either in the syllabus, rubric, or other discussions).

Quiz Reports

If your course includes auto-graded quizzes, the LMS may store data on quiz performance—such as how long students spent on each question, how many attempts they made, and which incorrect answers they selected. This data helps identify whether assessments are appropriately challenging.

Using Quiz Data to Improve Assessments

  • Quick quiz completion: If students breeze through a quiz, the questions might be too easy. Consider revising them to increase cognitive challenge while staying aligned with learning objectives.
  • Struggles with specific questions: Analyze incorrect answers to identify patterns and adjust instruction. You may need to revise lesson materials or clarify confusing topics.

Striking the right balance between challenge and accessibility is key. As Dick, Carey, and Carey (2015) note, one of the most crucial pieces of information in instructional design is an understanding of learners’ skills, preferences, and attitudes. Overly easy assessments can reduce students’ engagement, while overly difficult tasks may discourage participation. Use quiz reports to hit that “sweet spot.”

In short, auto-graded quizzes give you a snapshot of perceived difficulty: rapid completion suggests questions need more rigor, while widespread misses on a single quiz point you to the specific questions—and the underlying lessons—worth reworking.
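As a rough illustration of mining an item-level quiz report, the sketch below flags questions missed by more than half the class. The question IDs, response data, and the 50% threshold are hypothetical; real LMS quiz exports vary in format.

```python
# Hypothetical item-level quiz report: question id -> per-student correctness.
responses = {
    "q1": [True, True, True, True, True],
    "q2": [False, False, True, False, True],
    "q3": [True, False, True, True, True],
}

# Compute the share of students who missed each question.
miss_rates = {q: 1 - sum(r) / len(r) for q, r in responses.items()}

# Flag questions missed by more than half the class for content or wording review.
to_review = [q for q, rate in miss_rates.items() if rate > 0.5]

print(miss_rates, to_review)
```

A question everyone answers correctly (like q1) may be too easy, while a heavily missed one (like q2) warrants checking both the wording of the question and the lesson material it assesses.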

Rubric Analysis

Rubrics provide clear expectations and consistent grading, which help students succeed. Reddy and Andrade (2010) found that students who have rubrics to guide their work experience higher achievement. Rubrics ensure consistency and objectivity, and they help communicate assignment expectations clearly and concisely. Analyzing rubric data can show where students excel and where they struggle.

How to Use Rubric Data for Course Improvements

  • Align criteria with objectives: Make sure your rubrics reflect the learning goals of the course. If a rubric criterion doesn’t match the objective, students may be unclear on expectations.
  • Provide targeted resources: If students consistently score low on a section of the rubric, revise your learning materials or add supplemental resources and practice activities to help them prepare.
  • Clarify expectations: Ensure that rubric language is specific and easy to understand. For example, instead of saying, “Essay contains few grammatical errors,” specify: “Essay contains no more than 3–5 grammatical errors.”
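The per-criterion analysis above can be sketched as follows. The criteria names, point values, and the 60% cutoff are hypothetical assumptions chosen for illustration.

```python
# Hypothetical rubric scores: criterion -> points earned by each student (out of 4).
rubric_scores = {
    "thesis": [4, 3, 4, 4],
    "evidence": [2, 1, 2, 3],
    "mechanics": [3, 4, 3, 4],
}

MAX_POINTS = 4

# Flag criteria where the class average falls below 60% of the possible points;
# these are candidates for added resources or practice activities.
weak_criteria = [
    criterion for criterion, scores in rubric_scores.items()
    if sum(scores) / (len(scores) * MAX_POINTS) < 0.6
]

print(weak_criteria)
```

Here the "evidence" criterion would be flagged, suggesting students need supplemental materials on sourcing and supporting claims rather than a wholesale rewrite of the assignment.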

Conclusion

When revising an online course, data-driven decisions help you design with student success in mind. Whether you’re reviewing student evaluations, activity logs, quiz reports, or rubrics, each data point offers valuable insights into your students’ experiences. However, remember that changes should align with your learning objectives—adjusting for the sake of change can hinder rather than help.

As you evaluate your course data, ask yourself, “How does this information help me better support my students?” The answer will guide you toward meaningful improvements that create a more engaging and effective learning experience.

References

Black, E. W., Dawson, K., & Priem, J. (2008). Data for free: Using LMS activity logs to measure community in online courses. The Internet and Higher Education, 11(2), 65–70.

Dick, W., Carey, L., & Carey, J. O. (2015). The systematic design of instruction. Boston, MA: Pearson.

Hung, J., & Zhang, K. (2008). Revealing online learning behaviors and activity patterns and making predictions with data mining techniques in online teaching. MERLOT Journal of Online Learning and Teaching, 4(4), 426–437.

Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435–448.