Surveys are powerful tools for gathering meaningful feedback to improve online course design and implementation. While student feedback surveys often take center stage, course facilitator surveys are equally important for identifying opportunities to enhance the online learning experience.
Course facilitators, who engage directly with students in the course shell, often identify flaws that might be missed during the design phase. They can also highlight areas where students struggle with instructions or assignments. Insights from these surveys provide valuable data for course writers, instructional designers, program directors, and department chairs to improve course quality and better support student learning outcomes.
This blog post offers practical tips for surveying course facilitators effectively and interpreting the data to make informed course enhancements.
Best Practices for Course Facilitator Surveys
Survey Timing and Delivery
The ideal time to distribute course facilitator surveys is immediately after the term ends. Surveys should be sent via email, and rather than attaching the survey as a document, include a hyperlink to the survey platform. This eliminates the need for manual data entry and ensures faster, more accurate analysis. Encouraging participation is crucial. A message from the department chair or program director explaining the importance of the survey data and how it informs future course improvements can significantly boost response rates. Sending reminders about the survey’s release date and deadline is another effective way to ensure timely responses.

Selecting the Right Survey Tool
Several platforms and software programs are available for creating and distributing surveys, including SurveyMonkey, Google Forms, and learning management system (LMS) survey features. Using the LMS is often the most efficient option, as it centralizes data collection within the platform already familiar to facilitators. Regardless of the software used, remember that the content of the survey is far more important than the platform itself.

Including Course Identification Questions
To ensure actionable feedback, include required fields for course identification details. This might include the course facilitator’s name, course code and title, session number, and the survey completion date. These questions help tie feedback to specific courses and terms.

Survey Structure
A well-designed survey combines both quantitative and qualitative questions, allowing for diverse insights. Questions should address areas such as course objectives, course assignments and grading, course content (e.g., textbook and resources), and student workload.

Using Likert Scales for Quantitative Feedback
The first step in forming survey prompts is determining what information you are seeking (Stillwagon, 2017). To measure latent constructs (i.e., attitudes and opinions) and sentiment about a specific topic, a Likert scale is recommended. A Likert scale is a five- (or seven-) point scale that allows individuals to express how much they agree or disagree with a statement, typically ranging from “strongly disagree” to “strongly agree” with a neutral midpoint (e.g., 1 = “strongly disagree,” 2 = “disagree,” 3 = “neutral,” 4 = “agree,” and 5 = “strongly agree”). Likert scales are widely used in online education surveys because they provide quantitative, granular feedback, highlighting areas of success as well as areas that need improvement. When writing Likert scale statements:

- Focus each statement on a single idea to avoid confusion (e.g., “The assignments supported course learning objectives” instead of combining topics).
- Avoid negative phrasing, as it can create misunderstanding (e.g., rephrase “The materials were not appropriate” to “The materials were appropriate”).
- Be consistent in your scale design to ensure easy interpretation.
Examples of Likert Scale Questions
- The course objectives aligned with program outcomes.
- The assignments helped students achieve course learning outcomes.
- The workload was appropriate for students.
- Students had meaningful opportunities to interact and learn from each other.
Adding Open-Ended Prompts for Deeper Insights
Per Wasik and Hindman (2013), open-ended prompts are questions or statements that invite multiple, multi-word responses. Course facilitator surveys should conclude with an open-ended section that allows course facilitators to provide qualitative feedback. These surveys should also include open-ended prompts following Likert scale prompts; for example, an open-ended prompt may ask a course facilitator to explain the reasoning behind his or her responses.

Open-ended prompts are useful for obtaining in-depth information on topics with which you may be less familiar. When you create an open-ended prompt, there should not be specifics surrounding the question asked (Wasik & Hindman, 2013). These prompts give a new perspective on course topics, allowing for different interpretations and a variety of responses. This written feedback provides program directors and department chairs with more direction and insight into steps for course improvements.
Examples of Open-Ended Questions
- Which assignment or activity did you feel was the most valuable for students, and why?
- Please provide any additional feedback below.
Analyzing and Interpreting Course Facilitator Survey Data
Making Sense of Quantitative Data
Likert scale responses are easy to analyze using averages or medians. For example, if a prompt scores below 3 on a five-point scale, it signals an area requiring further attention. However, the numbers alone don’t tell the full story; interpreting trends is key to identifying actionable insights. One disadvantage of Likert scale data is that it does not capture the emotional distance between responses (the gap between “agree” and “strongly agree” may not be the same for every respondent), and it does not allow for in-depth feedback. By using Likert scale prompts and open-ended prompts together, however, you optimize feedback for course development and instruction.
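As a concrete illustration, here is a minimal sketch in Python of how this analysis might be scripted. The file name, column names, and flag threshold are hypothetical placeholders; the sketch assumes responses were exported from your survey tool as a CSV with one numeric (1–5) column per Likert prompt.

```python
import csv
from statistics import mean, median

# Hypothetical CSV export: one row per facilitator response,
# one numeric (1-5) column per Likert prompt.
PROMPTS = [
    "objectives_aligned_with_program",
    "assignments_supported_outcomes",
    "workload_appropriate",
]

with open("facilitator_survey.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for prompt in PROMPTS:
    scores = [int(row[prompt]) for row in rows if row[prompt]]
    if not scores:
        continue  # no responses recorded for this prompt
    avg = mean(scores)
    med = median(scores)
    # Flag prompts averaging below the scale midpoint for closer review.
    flag = "  <-- needs attention" if avg < 3 else ""
    print(f"{prompt}: mean={avg:.2f}, median={med}{flag}")
```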
Extracting Insights from Open-Ended Responses
Unlike Likert scale prompts, open-ended prompts are more difficult to interpret. Analyzing open-ended responses requires identifying common themes. Start by scanning for recurring phrases or ideas, which can be grouped into categories for deeper exploration. Tools like Voyant Tools can help visualize word patterns, making it easier to spot trends.

Once you have identified common phrases, determine whether you can group any of them together or break them down further. Creating separate categories for these phrases allows for a deeper examination of a specific topic for further interpretation. Program administrators can then determine how best to address common themes or dismiss anomalous feedback.
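A rough first pass at spotting recurring language can also be scripted. The sketch below is illustrative only (the file name, column name, and stop-word list are assumptions); it counts the most frequent words across open-ended responses as a starting point for manual theme grouping, not a substitute for it.

```python
import csv
import re
from collections import Counter

# Minimal stop-word list; extend it for your own courses and programs.
STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "for", "was",
              "were", "is", "are", "i", "it", "that", "this", "with", "be"}

with open("facilitator_survey.csv", newline="") as f:
    responses = [row["open_feedback"] for row in csv.DictReader(f)]

word_counts = Counter()
for text in responses:
    for word in re.findall(r"[a-z']+", text.lower()):
        if word not in STOP_WORDS:
            word_counts[word] += 1

# The most frequent remaining words are candidate themes to group
# together or break down further during manual review.
for word, count in word_counts.most_common(15):
    print(f"{word}: {count}")
```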
Conclusion
Course facilitator surveys are an invaluable resource for improving online courses. By thoughtfully designing surveys and effectively analyzing the results, institutions can strengthen course design, content, and delivery while improving student learning outcomes.

When facilitators feel empowered to share their insights and administrators act on the feedback, everyone benefits: students, instructors, and the institution as a whole. Thoughtful survey implementation creates a continuous feedback loop that fosters innovation, quality, and a better learning experience for all.

By choosing the right question types and encouraging course facilitators to participate, institutions can leverage these opportunities for feedback to benefit everyone involved in the online course. With proper evaluation techniques, course writers, instructional designers, department chairs, and program directors will know not only what to improve but also how to improve it.
Encourage your course facilitators to participate, and make feedback an integral part of your program’s commitment to excellence!
References
Gaide, S. (2005). Evaluating distance education programs with online surveys. Distance Education Report, 9(20), 4–5.

Stillwagon, A. (2017). How to analyze and interpret survey results. Retrieved from https://smallbiztrends.com/2014/11/how-to-interpret-survey-results.html

Wasik, B. A., & Hindman, A. H. (2013). Realizing the promise of open-ended questions. Reading Teacher, 67(4), 302–311. doi:10.1002/trtr.1218