After tweaking and adopting a survey, it's time to develop a plan for distributing the survey and collecting responses from students. Several considerations should be addressed in advance to ensure that distribution succeeds and that the right classes, faculty, and students are reached. This section of the toolkit describes those considerations and provides some strategies for reaching students and faculty in OER or ZTC classes.
Considerations prior to survey distribution
Prior to distributing the survey on campus, the creator/distributor should ensure that the appropriate stakeholders have been contacted and engaged, and that they have had input on the survey content and distribution plan. Campus stakeholders might include:
- Public Information Officer
- Deans and/or Vice President of Instruction
- Campus IRB (Institutional Review Board) if applicable
- Institutional Researchers
- OER/Zero Textbook Cost Committees
- Faculty teaching OER/ZTC classes
- Student government group
- Academic Senate or other faculty-centered governance groups
When contacting these groups, emphasize four simple points:
- Why the survey is being distributed
- What the survey aims to capture
- How the survey data could/would be used
- What is being asked of the specific group (approval, input, permission to survey students, or simply informational)
In the case of institutional researchers or faculty teaching ZTC/OER classes, securing their participation and their permission to survey students is critical to the survey's success, so it's beneficial to contact these stakeholders well before the survey is distributed.
Strategies for distribution
Once campus stakeholders are informed and have provided input, permission, or acknowledgement of the survey, it’s time to start developing a distribution plan.
The ten pilot colleges involved in the initial creation and distribution of the survey chose to provide the survey only to courses that were fully ZTC or OER, where no textbook costs were passed along to students. This decision ensured that responses were limited to courses with an evident commitment to completely zero textbook costs. Other adopters can tweak the survey to include classes that use both commercial and ZTC/OER materials to get a broader response base.
There are three major factors to contend with when distributing the survey: (1) identifying students in ZTC/OER classes, (2) developing a timeline for collecting responses, and (3) setting a numerical goal for responses. Strategies for each of these areas are discussed below.
One: Identifying students in ZTC/OER classes
Institutions have different ways to identify students enrolled in ZTC or OER classes. The easiest way to identify these classes is through an icon, symbol, or search attribute in the institution's online schedule. If this is an option, it can be useful to search the online schedule for the semester and contact the instructors of record about administering the survey in their classes.
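For campuses that can export the schedule as a spreadsheet, a short script can pull the ZTC-flagged sections and build a contact list of instructors. Below is a minimal sketch in Python; the file name and column names (`ztc_flag`, `instructor_email`, `section_id`) are assumptions and would need to match the local export format.

```python
import csv

# Minimal sketch: pull ZTC-flagged sections and their instructors from a
# schedule export. The file name and column names are assumptions; adjust
# them to match the institution's actual export.
def list_ztc_instructors(schedule_csv="schedule_export.csv"):
    instructors = {}
    with open(schedule_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("ztc_flag", "").strip().upper() in ("Y", "YES", "ZTC"):
                email = row["instructor_email"]
                instructors.setdefault(email, []).append(row["section_id"])
    return instructors  # {instructor_email: [section_id, ...]}

for email, sections in list_ztc_instructors().items():
    print(f"{email}: {len(sections)} ZTC section(s)")
```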
In the pilot project, one successful college identified ZTC classes through the schedule and then worked with campus IT to push the survey directly to enrolled students through the LMS (Canvas), with the permission of the instructors. This removes the burden on individual faculty of distributing the survey, but it risks being ignored by students because the message arrives as an automated LMS notice rather than coming directly from the teaching faculty. It should be noted that this strategy produced a higher response count than any of the other colleges involved in the pilot achieved.
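For campuses taking this IT-assisted route, Canvas exposes a REST API that can post an announcement to a course. The sketch below assumes an API token authorized to post in the target courses (and, as above, instructor consent); the host name and token are placeholders, and the course IDs would come from the schedule query described earlier.

```python
import requests

CANVAS_BASE = "https://YOUR_COLLEGE.instructure.com"  # assumption: local Canvas host
API_TOKEN = "REPLACE_WITH_TOKEN"  # assumption: a token authorized for these courses

def post_survey_announcement(course_id, survey_link):
    """Post the survey invitation as a Canvas course announcement."""
    resp = requests.post(
        f"{CANVAS_BASE}/api/v1/courses/{course_id}/discussion_topics",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        data={
            "title": "Five-minute survey on zero-cost course materials",
            "message": f"Please share your experience with this ZTC course: {survey_link}",
            "is_announcement": "true",  # flags the topic as an announcement
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]
```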
Another strategy for identifying students is to present the survey at Academic Senate and other shared governance groups and solicit faculty volunteers to administer the survey in their ZTC/OER classes. The benefit of this strategy is that these invested faculty are more likely to get their students to respond to the survey; the downside is that it limits access to the survey to a smaller group of faculty and students. However, building these relationships with individual faculty has a downstream benefit: these faculty might be willing to have their students participate in a video, and providing them the data from the survey could help with course content improvement.
Two: Timeline for collecting responses
The pilot project colleges had distinct start and end dates for their survey. The survey was first opened briefly, for a three-week test run. When the survey launched more broadly in Spring 2022, it was open for a total of sixty days, which gave each campus enough time to contact faculty and students and collect responses. However, this timeline might be too long for an individual campus depending on the campus culture and the way faculty and students are contacted.
In general, an official opening and closing time should be determined before the survey is distributed so that survey takers and faculty know when they can access the survey and plan for completion.
It may go without saying, but the survey addresses the content of the OER or ZTC materials, so it should be distributed AFTER students have had ample time to engage with the material and the instructor teaching the course. For example, if a semester is seventeen weeks long, it wouldn't make sense to ask students questions about course materials before week five or six. In addition, as students become more overwhelmed and midterms and finals tax their time, they are less likely to respond to a voluntary survey, so avoiding the final weeks of the semester is highly encouraged for a successful response rate.
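As a rough illustration of that timing, the sketch below computes a suggested open/close window from a semester start date, opening after week six and closing before the final weeks. The week cutoffs follow the guidance above but are assumptions, not fixed rules; adjust them to the local calendar.

```python
from datetime import date, timedelta

# Rough timing sketch for a seventeen-week semester: open at the start of
# week seven (after students have engaged with the materials) and close at
# the end of week thirteen (before the finals crunch).
def survey_window(semester_start, open_week=7, close_week=13):
    open_date = semester_start + timedelta(weeks=open_week - 1)
    close_date = semester_start + timedelta(weeks=close_week) - timedelta(days=1)
    return open_date, close_date

opens, closes = survey_window(date(2022, 1, 31))  # hypothetical Spring start date
print(f"Open {opens}, close {closes} ({(closes - opens).days} days)")
```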
Three: Setting numerical goals for responses
Responses to surveys will vary based on several factors: the number of students given the survey, when and how the survey is administered, and how important the survey is to both students and faculty. Survey response rates can be low, so be sure to cast a wide net with whatever distribution strategy works best for a particular campus and give ample time for responses to come in. That said, it's typical to set a goal that is either a specific number of responses or a response rate based on the number of surveys distributed. Either way, the validity of the survey will be linked to how many students respond. Getting a large sample of responses, with comments, also helps create a better picture of OER/ZTC on a campus.
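The arithmetic behind a rate-based goal is simple. The sketch below uses made-up enrollment numbers and target rates purely for illustration.

```python
import math

def responses_needed(students_surveyed, target_rate):
    """Minimum responses required to hit a target response rate."""
    return math.ceil(students_surveyed * target_rate)

# Hypothetical numbers: 1,800 students enrolled across all ZTC sections.
print(responses_needed(1800, 0.10))  # 180 responses at a 10% target rate
print(responses_needed(1800, 0.25))  # 450 responses at a 25% target rate
```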
Sample email communications
The email below was used as a template among the ten pilot colleges that participated in the original survey distribution. It contains both an email to faculty/instructors and a message to students, which reached students either directly from the faculty teaching the OER/ZTC class or through their LMS.
Hello, [Colleague/s] –
Our college is working to investigate and publicize the student impact of courses with zero course material costs. I hope that you can help us deploy a survey about this to students in your ZTC class(es). The survey was developed by the ASCCC OERI and is now being used at colleges throughout the California Community Colleges.
The survey asks students to identify which college they are at but does not identify particular course sections or instructors. If you are willing to share this survey with students in your ZTC class(es), please share this link through a Canvas message or announcement in any of your ZTC classes: [SURVEY LINK]
Here is some text you are welcome to use, customize, or ignore if you share the survey with your students:
Dear students,
Faculty across the state of California are trying to learn more about the impact of courses like this one, which do not require you to pay for course materials. Please respond to a survey to share your thoughts. The survey should take no more than five minutes. If you happen to have already responded to the survey for a different class, you can skip this. Thank you!
Thanks for considering this at this busy time! If you do decide to send this out to your students, please let me know how many classes you are sending it to, just so I have an idea of how widely this is going out. Thank you!