Review of OERu course evaluation survey - Consultation closed

In preparation for the launch of the OERu 1st year of study and the OERu process evaluation, we have drafted a course evaluation survey. We invite feedback and suggestions from the OERu community, to be posted by 23 February 2018.


The development of the survey instrument was informed by:

The survey instrument includes:

  • A general section with generic items to be reused for all OERu courses to facilitate comparisons.
  • A course specific section for items relating to the respective course being evaluated.
  • A custom section where the OERu partner(s) who contributed to the development of the course can include custom items if needed.

Design notes:

  • All OERu learners will be invited to complete the optional new participant survey. We have avoided duplicating items from the new participant survey in the course evaluation survey. Please review the new participant survey before providing feedback.
  • We plan to incorporate the evaluation of specific course features, for example the assessment of unique learning activities, using the poll feature of the site. Spot poll links can be embedded in situ on the course site to gain feedback on specific pedagogical approaches.
  • The draft survey instruments are licensed to the OER Foundation (OERF) using a CC-BY-SA 4.0 license.
  • The OERF would like to acknowledge the work of Ken Udas and Alma Cervantes who assisted us with the development of the draft survey instrument.

Draft evaluation survey instrument

There are two ways you can access the draft survey:

  • Online version hosted on the OERu instance of LimeSurvey. Removed link to online version - survey is now live after implementing changes.
  • PDF version available for download from the OERu instance of NextCloud. (Retained pre-review link for historical reasons).

Ways to provide feedback

  1. By replying to this topic thread. Please provide the item number when providing feedback on specific questions.
  2. By annotating the online version of the survey. (You will need to sign up for an account to annotate.) (Survey is now live - the period for comment is over.)

Documenting feedback

We have established a wiki post under this topic to document suggestions for refining the evaluation survey.

Thank you in advance for sharing the gift of knowledge in helping the OERu.

Hi Wayne,

Thank you for the chance to contribute - there is a lot of potential for very powerful data sets to be pulled from this, providing a strong evidence base for other work (I can see application to the Course Quality Working Group too).

A few observations, prefaced by the fact that I’m no specialist in methodology, and my most recent survey went through umpteen revisions before being released on an unsuspecting sector. :slight_smile:

Q4. Gender options. Can we double-check that the terminology is consistent with contemporary understandings? Not sure if ‘trans-gender’ should be used instead of just ‘trans’? Also, my limited reading when dealing with questions of gender identity in my survey indicated that ‘Other’ was not considered an appropriate term, mostly because ‘othering’ has a negative past in a whole range of disciplines. Do we also want to add an option ‘I would prefer not to disclose’?

Q7-8. Type of employment. Could we combine these questions into a ‘Tick all that apply’ response? The categories could be ‘not employed’, ‘paid casual’, ‘unpaid volunteer casual’, ‘paid part-time’, ‘unpaid volunteer part-time’, ‘paid full-time’, and ‘unpaid volunteer full-time’. We cover the hours of work in a later question, so this would provide nuance (especially as we ask for an aggregate of hours worked). This could reduce the survey by at least one question.

Q14 Active user accounts. So that we qualify ‘active user account’, could we change the responses next to each platform to indicate frequency of engagement? The question could be reframed as ‘How frequently do you use the following platforms?’, and the responses could be Never / Every day / A few times every week / Monthly / A few times a year - or something more precise. In this way we not only capture the platforms used, but also a frequency of use that isn’t defined by subjective opinions of what constitutes an ‘active user account’. We can then cross-reference activity data on the OERu course to see how it compares to the average frequency of interaction on the other identified platforms.

Lastly, I know this is an entry survey, but when/if we do an exit survey, can we include a question about how likely learners would be to recommend an OERu course (based on their most recent experience) to a friend? And why? I think that intent to refer/recommend - whilst hard to quantify - is a powerful currency in online interactions and does speak to value.

As I said, I’m no specialist, so take my suggestions with that in mind. I’m looking forward to seeing other responses as a way of improving my knowledge of surveys and learning from others.

Thanks for the additional feedback on the new participant survey. Learners are invited to participate in this survey when they join the course. I will look into your feedback comments soon.

For your info, we have drafted a course evaluation survey (the subject of this consultation) which has a question about recommending the course to a friend. You can comment directly on the course evaluation survey from this link. (Survey is now live and comments are closed.)

@AdrianStagg - thanks.

Gender identity. Yes - a “Prefer not to answer” option is missing, and I believe “Other” should be replaced by “Not listed” with a corresponding open text field. I will consult further. Do you have a local colleague at USQ you could also consult for advice?

Q7-8 - Excellent suggestion for replacing these questions with the “Tick all that apply” response. This solves the challenge of a full-time employee who volunteers.

Q14 - In earlier versions of the survey, we had a corresponding scale to gauge frequency of use of various social media, and then decided to remove it from the survey because we were concerned about the additional time required to provide that level of detail. I don’t think the detail regarding frequency of use is that important for us - but a rough idea of the spread of technologies would be useful (e.g. for marketing / OERu promotion). Also, we have other data points - for example in the LiDA104 course we have a spot poll on time spent online. Would this alternative data collection point be adequate?

Thanks again for your valuable feedback!

Hi Wayne. It looks pretty good to me. Only a few minor points:

  1. Phrase the middle option consistently for the Likert scales. It changes from “Neutral” to “Neither Agree nor Disagree” at the end.
  2. Q14 focuses on one possible motivation for taking the course but I wonder if the question can be phrased differently or include more response options that allow learners to share other potential reasons why they took the course?
  3. I found Q17 to be phrased in a slightly confusing fashion. Perhaps switch the order of the two sentences so that Learning by Doing is explained after the statement they are being asked to evaluate?

Hi @rjhangiani

Thanks for the feedback - appreciate the oversight from an experienced researcher.

  • Good catch regarding the inconsistency of the middle option of the Likert scale. I’ve fixed this directly on the LimeSurvey site.
  • Regarding Q14 of the course evaluation survey, we have another data collection point which covers more detailed options. Q20 of the new participant survey, which learners are invited to complete when they join the course, includes a more nuanced question on motivation for joining. Options included in that question: Formal study requirement, Professional development, Personal development, Research, Curiosity, Networking, Academic volunteer to support OERu learners and Other - please specify. Does this address your concern / query?
  • Agreed - Q17 needs refinement. Reading this again, it’s conflating the concepts of “Learning-by-doing” and the “Pedagogy of Discovery”. We need to focus on one or the other with a clearer question statement.

Thanks again for taking the time to respond.


  1. Q4 Gender identity - New participant survey: Remove “Other” and replace with “Not listed” plus a text field. Suggest using “Transgender Female” and “Transgender Male” as replacements for “Trans*”. See feedback from @AdrianStagg
  2. Q4 Gender identity - New participant survey: Add “Prefer not to answer” option. See feedback from @AdrianStagg
  3. Q7-8 Employment / Volunteering - New participant survey: Replace with a single question using a ‘Tick all that apply’ response. Suggested categories could be: ‘not employed’, ‘paid casual’, ‘unpaid volunteer casual’, ‘paid part-time’, ‘unpaid volunteer part-time’, ‘paid full-time’, and ‘unpaid volunteer full-time’. See feedback from @AdrianStagg
  4. Q17 Pedagogy of Discovery - Course evaluation survey: Revise, simplify and improve. See feedback from @rjhangiani and feedback from @Marcsinger1 on improving the item.
  5. Q1, Q2 and Q7 - Course evaluation survey: Revise these items to find a replacement for “complete” and for the past tense in Q2, which is not compatible with the OERu delivery model of lifelong learning where learners can proceed iteratively and go back over the material ad infinitum. See feedback from @Marcsinger1.
  6. Survey introduction. “As an OER collaboration, your responses are dedicated to the public domain” - needs to be refined to clarify meaning. See feedback from @lward
  7. Q9 - correct spelling of “additionally”. See feedback from @lward
  8. Q6 - replace “important” with “relevant”. See feedback from @rachael
  9. Q9 - Add option: To participate in the activities. See feedback from @rachael
  10. Implement solution for addition of partner-specific / course-specific items. See feedback from @rachael

Hello all - I may be looking at the already-updated version of the survey, but several of the questions mentioned in previous comments don’t seem to be on the current LimeSurvey version. There is nothing about gender identification, for example.

Based on what I have reviewed, though, I agree with Rajiv’s comment about Q17: if we are asking learners about the extent to which they agree with the statement about the pedagogy of discovery, it might be best to preface the statement with something that alerts learners to the concept (“Read the following statement. To what extent do you agree with it?” or something like that).

My only other comment has to do with the concept of completion. Completion in a university context usually means that the course is over, the assignments are graded, and your grade has been recorded. What does it mean here when you can proceed at your own pace, go back over the material ad infinitum, and no credit has been earned (if that was even your goal)? The past tense construction of Q2, for instance, suggests that the learner is done. Q1 and Q7 also raise issues for me, though perhaps I am too immersed in the lifelong learner, iterative approach that open courses would permit.

Hi Marc - @AdrianStagg was providing feedback on the new participant survey - another survey. (I provided the link in the instructions merely as a reference to show what data OERu has collected from alternative data collection points, and to avoid duplicating items in the course evaluation survey.)

That would explain why you didn’t see the questions referenced in the feedback.

Hi @Marcsinger1 - thanks for your valuable feedback, and I agree. I’ve added your recommendations to the wiki list of refinements arising from the consultation.

Thanks for providing an opportunity to review the draft survey :slight_smile:
Stephen Linquist and I are hoping that the following comments are somewhat useful.

Building on the question A4, ‘I found the flow of the course logical’, what do you think about including a question on the flexibility of the delivery schedule? Perhaps something like… ‘The delivery schedule was flexible enough to meet my needs, e.g. self-paced, more than one opportunity to participate in real-time interactions when available.’

A6 consider replacing ‘important’ with ‘relevant’

Building on A6, ‘when I was engaged in learning activity, I felt that what I was learning was important’, consider including a question about feedback. Perhaps something like… ‘When I was engaged in a learning activity, I felt that I received an appropriate level of feedback.’

A9 consider including an additional check box, ‘to participate in the activities’

Consider adding a new question that looks at the application of knowledge and skills after course participation. Perhaps something like: ‘Can you please describe how the knowledge and skills you have acquired in this course will assist you in your personal and/or professional context?’

Hi @rachael

Appreciate the constructive feedback from Stephen and yourself.

Q6 - I agree, “relevant” is a better concept than “important” in this context and will recommend the change.

The OERu model does not provide tutorial support (we don’t have the money to pay for tutors). Feedback is limited to the inputs from academic volunteers and peers. That said, we would benefit from getting evaluation data on activities which are purposefully designed for peer feedback. Our plan is to use the poll feature on the course site, and I think this approach would be a good alternative for getting specific data on feedback where relevant to the activity design.

Good suggestion regarding Q9 to add an option “to participate / engage in the activities”. Will add that to the list of refinements.

I like the suggestion of describing how knowledge and skills acquired in courses could assist learners in their personal and/or professional context. Our intention is to have a section in the survey instrument where individual partners can add one or two items relating to their specific interests or the nature of the course. Some courses may not be vocationally or professionally oriented so I think this item would be well suited to the custom section of the survey.

Thanks again for your feedback - I’ll update list of suggested refinements with your contributions.

The OERu model is designed for independent self-study - so self-paced flexibility is a given. Courses remain open and learners can engage at any time, sip and dip as they feel appropriate and retake courses if they like. It will be hard to generate a generic item to distinguish it from cohort based offerings with regards to flexibility of the delivery schedule - the nature of our open courses is such that schedules are self-determined.

Hello Wayne, thanks for considering our feedback. In response to the ‘feedback’ element, this would include automated (typically from a quiz), individual, peer (often discussions) and whole-class feedback. I am guessing that most of the activities would be designed for peer or automated feedback, given the challenges that you have raised. The level and quality of feedback may impact on a student’s enthusiasm to continue, because without a level of feedback it is hard to know if we are able to grasp and apply concepts. This may vary considerably from course to course, e.g. from a correct/incorrect answer to a quiz question through to providing a link to the related concept. Your idea of polls is a good one, triggered or suggested when forums and quizzes are used.

You raise a good point Wayne, I have fallen into the trap of looking through my ‘traditional’ course lens.


Hi @rachael

I agree, the quality of feedback through simulated dialogue, peer interactions etc is an important determinant for motivation, engagement and learning. I think the poll option will work well here because we can tailor the response options to the activity and course context, plus avoid increasing the number of items for the evaluation survey.

Thanks again for your constructive input :-).
