Background To Online Teaching Surveys 4/10/20
Wieman Faculty and Student Surveys for characterizing online teaching April 10, 2020
The primary purpose of these two surveys is to provide the information a Provost, or perhaps a Dean, would need in order to decide among options such as:
- Does one offer online summer classes for revenue and to fill gaps in classes and schedules that were missed in the spring chaos?
- Assuming the campus cannot have dorms full of students in the fall, does one cancel the fall term or go with an online fall term?
- What about the winter term, if it comes to that?
To answer these questions, you do not need to know much about the "why," which as a good researcher you want to know; you just need to know the "what": What teaching practices are being used in courses? How do students feel about the relative educational experience between online and in-person? And how do different online teaching practices affect that student experience (or not)?
So the faculty survey is intended only to capture what instructors are doing in their classes, to be used in interpreting the student responses, and not for any sort of faculty evaluation.
With the student survey, I am focusing primarily on the question, "how bad did students see this compared to in-person teaching?", because that is the question that will matter in the large decisions listed above. It is clear that online courses can be perceived as very bad. The real questions, and where these data will help a lot, are: do most students think courses are OK relative to in-person, or are they OK IF CERTAIN PRACTICES WERE FOLLOWED but fairly awful otherwise? So this is not exploring what instructors thought worked well, only what students thought worked reasonably well, under what conditions, and what could be replicated widely. Recognizing the critical need to minimize the survey time demands on students and faculty, the surveys are as short as absolutely possible.
A critical question in any institutional decision will be: how do students feel about the educational experience that online instruction provided them compared to the in-person instruction they had previously experienced? A student's answer to that question will almost certainly depend on how the online course is taught, as widely different online approaches are being used. So, the surveys were designed to measure how the student experience varies with instructional approach across the full spectrum of university courses, as well as with environmental elements that might be important to that experience, such as the quality of their IT access and the settings in which they were working. Particular attention in the survey was given to probing student-student and student-instructor interaction, as this seems likely to be the most salient difference between the online and in-person educational experience.
Finally, great effort was made to capture all the essential information with surveys that would take as little time to complete as possible, setting less than 5 minutes as an upper limit. My idea was to have these surveys completed for all courses with enrollments larger than a specific size (10-30? students). The student responses would be aggregated for each course, and one could then analyze how those responses depended on the instructional practices used.
Faculty survey guiding questions.
What instructional practices and course activities were used in the online course? To what extent did faculty change these from when they taught the course in person?
Student survey guiding questions.
What environmental factors, including distractions of various types, impacted their online learning?
How were the students spending their time in completing the online course work?
How much did they interact with fellow students and with the instructors and TAs, and what factors influenced those interactions?
What was their overall sense of the educational experience relative to in-person?
Surveys (Note: these are semi-final drafts, with some wording updates pending):
*Copies of the Qualtrics files for these surveys are available by emailing Linda Kim at firstname.lastname@example.org.