3. EDSCLS Administrative Guide

This section of the guide is intended for EDSCLS Survey Administrators. It provides education agencies with an overview of the best practices they should know about prior to launching an EDSCLS administration. Guidance for specific EDSCLS survey populations—students, instructional staff, noninstructional staff (including principals), and parents/guardians—is indicated where appropriate.

Specifically, the following topics are addressed:

  • planning how to survey respondent groups: students, instructional staff, noninstructional staff (including principals), and parents/guardians;
  • setting the dates of the administration window;
  • disseminating survey usernames;
  • conducting a test run of the EDSCLS before taking it live;
  • monitoring participation rates and encouraging the participation of potential respondents who have not yet taken the survey; and
  • overseeing the data when the administration window closes.

It is the responsibility of the education agency that is administering the EDSCLS to check whether additional approval for a data collection is required by the state or locality (e.g., from an Institutional Review Board).

This subsection contains recommended practices for preparing for EDSCLS administrations, beginning with specific recommendations for each survey:

Student survey

  1. The school’s own technology capacity plays a large part in considering whether computer labs, media centers, or classrooms (with computer or tablet access) are the best options for administering the survey. If multiple options are available, consider which venue in your school affords the most privacy to each student.
  2. Regarding parental consent, please use the standard practice of the state/district/school to acquire the proper parental consent for any surveys or testing. The student survey questions are available in paper form in both English and Spanish for parents/guardians to review. It is recommended that parents/guardians be given the opportunity to review the student survey prior to the start of data collection. The student survey questions and parental consent forms can be e-mailed or printed and mailed along with report cards or other school documents. As a reference, two sample parental consent forms (passive and active)[1] are included in Appendix E of this guide.
    • If your state law or school system policies do NOT require parents/guardians to “opt in” for surveys of their child, you can provide them the passive Parental Consent Form I found in E.1.
    • If your state law or school system policies do require parents/guardians to “opt in” for surveys of their child, you will need to provide them the active Parental Consent Form II found in E.2.[2]

Instructional and noninstructional staff surveys

  1. If your school does not provide instructional and noninstructional staff access to computers in their classrooms or offices, consider creating a sign-up sheet for the available computers through which staff can rotate.
  2. Some noninstructional staff, such as custodians, cafeteria workers, and bus drivers, may need to use school computers at designated times to complete their surveys. Schools should provide personnel to assist with this as necessary.
  3. Staff meetings provide an opportune time both to explain the surveys to staff and to have them complete the surveys. Consider setting aside time and/or computers during the meeting for staff to complete the survey.
  4. If staff have designated times in the school day for planning, professional activities, or administrative tasks, consider allowing them to use that time to complete the survey.

Parent/guardian surveys

  1. In-person explanations may be more effective than letters and e-mails at increasing parent response rates, and orientation is an ideal occasion for describing the surveys to parents/guardians. If a significant number of parents/guardians do not have e-mail addresses or computer access, then letters and in-person explanations become even more important. In such an instance, consider making the EDSCLS a cornerstone of orientation and parent-teacher conferences. Allocate more computers for EDSCLS participation during these meetings, provide personnel to assist, and have plenty of paper versions of the student survey questions on hand for those who request to review them.
  2. Consider setting aside a room with computers for parents/guardians to use while they are waiting to start their conference or after they are finished with their conference.

The importance of standardized procedures:

  1. Whether your aim is to compare your school climate results to other schools in your district or state, or to establish your own trend data, applying standardized procedures is critical to producing reliable data. Accurate measurement of the differences between two populations, or the change over time in a single population, cannot be achieved if the measurement process itself is changed in any significant way.

In the context of the EDSCLS, this means maintaining uniform procedures for administering the survey to respondents. For the student survey, this includes the selection and training of Survey Proctors, using the proctor scripts (see Appendix D), and applying strict protocols to ensure privacy. Training sessions should familiarize Survey Proctors with the scripts, procedures, and use of the FAQs to answer students’ questions (see Frequently Asked Questions (FAQs)).

Documentation

  1. It is critical to document decisions made throughout the data collection (e.g., data collection windows, eligibility of respondents, methods used to engage respondents). Whatever procedural decisions are made, the same procedures may need to be followed in subsequent administrations in order to establish valid trend data.

The logistics of administering the EDSCLS:

The EDSCLS platform has been developed to be usable at the school, district, and state levels. Depending on the size and complexity of the population and the education institution, the logistics of administration may require different divisions of labor.

  1. For state-level administrations, consider the following configuration of key staff:
  • State Survey Administrator: The person leading the EDSCLS administration at the state level. This person controls the generation and dissemination of usernames for all respondents, monitors the real-time submission rates of each respondent group, and orchestrates the activities of the District and School Survey Coordinators.
  • District Survey Coordinator(s): The people managing the EDSCLS administration at the district level. They act as liaisons between the State Survey Administrator and the School Survey Coordinators.
  • School Survey Coordinators: The people managing the EDSCLS administration at the school level. They answer respondents’ questions about the EDSCLS, remind all respondents to answer their surveys, and reserve space during the administration window for students to take the surveys. Depending on the size and complexity of the district, either the District Survey Coordinators or the School Survey Coordinators are tasked with recruiting Survey Proctors and with organizing and conducting their training.
  • Survey Proctors: The people supervising the in-school student surveys. They prepare the rooms and computer access for students, read the Survey Proctor Script to the students, take note of absentees, and provide support to students having trouble accessing the survey.
  2. For district-level administrations, consider the following configuration of key staff:
  • District Survey Administrator: The person leading the EDSCLS administration at the district level. This person controls the generation and dissemination of usernames for all respondents, monitors the real-time response rates of each respondent group, and orchestrates the activities of the School Survey Coordinators.
  • School Survey Coordinator(s): The people managing the EDSCLS administration at the school level. They answer respondents’ questions about the EDSCLS, remind respondents to answer their surveys, and reserve space for students to take surveys during the administration window. Depending on the size and complexity of the district, District Survey Coordinators may be necessary. Either the District Survey Coordinators or the School Survey Coordinators are tasked with recruiting proctors and with organizing and conducting their training.
  • Survey Proctors: The people supervising the in-school student surveys. They prepare the rooms and computer access for students, read the Survey Proctor Script to the students, take note of absentees, and provide support to students having trouble accessing the survey.
  3. For school-level administrations, consider the following configuration of key staff:
  • School Survey Administrator: The person leading the EDSCLS administration at the school. This person controls the generation and dissemination of usernames for all respondents and monitors the real-time response rates of each respondent group. This person also answers respondents’ questions about the EDSCLS, reminds all respondents to answer their surveys, and reserves space for students to take surveys. This person also recruits Survey Proctors and organizes and conducts the Survey Proctor Training.
    • Depending on the size of the school and the workload of the School Survey Administrator, a School Survey Coordinator may be necessary.
  • Survey Proctors: The people supervising the in-school student surveys. They prepare the rooms and computer access for students, read the Survey Proctor Script to the students, take note of absentees, and provide support to students having trouble accessing the survey.

Selecting Survey Proctors for the Student Survey:

  1. EDSCLS student survey administrations must be supervised, necessitating Survey Proctors. The Survey Administrator or Survey Coordinator should select the Survey Proctors and furnish them with student usernames (which the Survey Administrator will randomly generate through the EDSCLS platform) and the proctor script. Depending on the size and complexity of your administration, this task can be accomplished by either a school- or district-level Survey Coordinator.
  2. Eligible Survey Proctors may include teachers, student teachers, noninstructional staff, school counselors, school nurses, computer lab technicians, or outside consultants. If instructional staff are used, please consider having them proctor classes of students whom they do not teach. Even with privacy procedures in place, students may be less open to providing honest responses in the vicinity of their regular classroom teacher.

Training Survey Proctors:

  1. Training the Survey Proctors is critical to ensuring that the students finish the survey within a single class period. Provide the Survey Proctors with the Survey Proctor Script (see Appendix D) and the Frequently Asked Questions (FAQs), and hold an in-person or virtual meeting prior to the start of the administration window to review the materials and field any questions the proctors may have.
  2. All individuals involved in administering the EDSCLS, including the Survey Proctors, should sign a Confidentiality Pledge (see the sample in Appendix C). This reinforces the commitment to confidentiality, and the signed form can be shown to parents/guardians to address privacy concerns.

Determining Respondent Eligibility/Ineligibility:

  1. It is recommended that data be collected from all eligible respondents at a school to obtain a full picture of the school climate. This is called a universe or census data collection.
  2. Even with a census or universe data collection, decisions should be made by the education agency regarding respondent eligibility. For example, consider:
    • Students who are new to the school. Students may need time to experience the school building before accurately answering questions about building-level conditions. Consider whether students must be enrolled in the school for a certain number of days prior to being eligible for the survey.
    • Students who are eligible for alternative assessment. The EDSCLS survey is not specifically designed to accommodate students with severe cognitive disabilities who typically require alternative assessments. Consider whether these students should be ineligible.
    • For the instructional staff and noninstructional staff surveys, consider which staff will be invited. Some important questions to consider are as follows: Should only full-time and part-time staff be included, or would you also include occasional staff and substitute teachers? Should noninstructional staff who interact with students in nonacademic ways, such as janitors, bus drivers, and cafeteria staff, be included?

Make sure to document these decisions. Future administrations of the EDSCLS need to replicate these decisions to establish valid trend data.

Information on Response Rates:

  1. A response rate is the number of eligible respondents who complete the survey divided by the total number of eligible respondents. Achieving high response rates is very important for obtaining valid and unbiased data. Education agencies should decide the minimum acceptable response rate for a respondent group’s data to be included when reporting results.
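As a simple illustration of this calculation (the counts below are hypothetical; the EDSCLS platform does not compute response rates itself):

```python
# Hypothetical counts, for illustration only.
eligible = 400    # people eligible for the survey
responded = 310   # eligible people who completed the survey

response_rate = responded / eligible
print(f"Response rate: {response_rate:.1%}")  # Response rate: 77.5%
```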

Overcoming the Challenges Around Communicating with Parents/Guardians:

  1. Parents’/guardians’ opinions about a school’s climate are very important. However, obtaining their interest and support can be challenging. Parents/guardians may not have the technology to access the survey or have enough technical skills to answer the online survey. They may also feel they are too busy to respond to the survey.
  2. Given these challenges, we recommend reaching out to parents/guardians early, informing them about the goals of the survey and providing opportunities to ask questions. Schools may need to employ creative strategies to interest them in the survey. It is important to note that the parent survey is short, and it can be answered on desktops and laptops as well as on any mobile device, such as tablets or smart phones. Those parents/guardians who are not familiar with computers can be invited to use school computers, with assistance provided by school personnel.

Setting the dates of the administration window requires early planning. Spring administrations are recommended because they give respondents a chance to reflect and report on their perceptions of the school over the course of the school year. The dates you select for the survey administration window can affect participation rates, the perceptions of certain school climate factors, and future administrations. Consider the following guidelines:

  1. Establishing trend data
  • If your state or district is interested in comparing school scores across the state or district, your state or district should administer the surveys to all participating schools during the same time frame.
  • If you intend to use the EDSCLS to establish trend data across time, repeated administrations should be conducted cyclically, during the same 2-week to 1-month window, annually or biannually. This prevents conflation of cyclical factors with structural factors.
  2. School year schedule
  • The EDSCLS is best administered in the spring, but no later than April, if possible. Later administrations face the challenge of competing for time with standardized tests, increasingly busy school schedules, and higher absentee rates (an especially acute problem when surveying 12th-grade students).
  • We recommend that sites avoid conducting the survey at the same time as state testing. Both efforts aim to measure school characteristics that have matured over a school year, but past experience suggests that a significant number of schools do not have the administrative and/or technological capacity to conduct concurrent universal data collections. As such, sites should carefully examine the calendar of activities for all participating schools and select the optimal time for administration.
  • If it is not possible to schedule concurrent data collections—at multiple schools, to different respondent groups, or to students at multiple grades—consider using slightly different data collection windows for different sites or populations.
  3. Other considerations
  1. Holidays. It is best to avoid conducting the EDSCLS after long school breaks, especially after the winter holiday and spring break. In general, surveys should not be conducted on the day immediately before or after a holiday because absentee rates may spike.
  2. Days of the Week. If possible, avoid administering the student survey on Mondays and Fridays, as they often have unusually low attendance rates. This is particularly prevalent on Fridays before a Monday holiday.
  3. Submission Rates. If submission rates[3] are low, the Survey Administrator may want to consider extending the data collection window in the EDSCLS platform (see 2.2.4 Data Collection). This option is particularly attractive when a large number of respondents have logged in to the survey but have not submitted it, or when a large number of generated usernames have not been used (see 2.2.8 Respondent Usernames Generation). Those in both groups may be convinced to finish the survey if reminded and given a little more time. For students, this may mean scheduling a make-up time to respond to or finish surveys (this is especially helpful for slow readers).

Closed data collections cannot be reopened. If, during a data collection, you decide that the length of the window should be increased, try to implement the change before the original window expires. If the original end date is reached, you can employ a workaround by creating a new data collection and then importing the results of the second data collection into the first one (see 2.2.11 Exporting and Importing Respondent-Level Survey Results).

Section 2.2.8 above provides instructions for disseminating usernames through the e-mail function of the EDSCLS platform. This is the most efficient method and is particularly useful for large data collections. However, if your host server is not configured to send out e-mails, your site will need to consider the option of manually distributing the usernames on paper or using regular e-mail services outside of the platform.

Manual distribution should also be considered if your site experiences pushback from respondents concerned about the confidentiality of their responses. This process involves generating the usernames in the standard way, exporting the usernames to a PDF or Excel file, and then printing them out instead of e-mailing them. Printed usernames can be distributed anonymously by, for example, having respondents pick one username from a stack of paper strips on which the usernames have been printed. This will ensure that responses cannot be linked back to specific respondents and that the same usernames will not be distributed to multiple persons.

Please note that the instructional, noninstructional (including principal), and parent survey responses cannot be linked back to respondents. Even if usernames are disseminated via the platform, the original usernames are replaced with randomly generated usernames as soon as the data collection closes and the results are made available. Figure 3 shows how usernames are retained or deleted for different respondents.

Figure 3. How usernames are retained or deleted for different respondents in the EDSCLS

Survey Administrators should conduct a test run of the platform and logistics chain to make sure that the platform has been installed properly and the system works. Conducting a test run can also help you become familiar with the survey administration process. The test run should include the following steps:

  1. Set the Data Collection start and end dates.
  2. Generate at least one random username for each respondent group.
  3. Use the usernames to log in and answer the first few questions.
  4. Use the “REPORTS” boxed section of the dashboard to make sure it is showing your username as “Partial.”
  5. Complete the remainder of the survey begun in step 3.
  6. Check the Survey Status Reports/Case Disposition section to make sure it is showing your username as “Completed.”
  7. After the data collection end date, check the Survey Status Reports section to make sure it reflects question-level data (i.e., item frequencies, scale scores). Please note that results will only be shown if there are 10 or more responses due to concerns of disclosure risk.

Survey Administrators, Survey Coordinators, and Survey Proctors should do a test run to access and log in to the survey prior to the start of the data collection. For efficiency, consider folding this test run into the training of Survey Coordinators and/or Survey Proctors.

At least three school days before the survey window starts, the Survey Administrator should distribute the following materials related to the student survey to each Survey Proctor (usually through the school-level Survey Coordinators):

  • classroom number(s) and period(s) of their administrations;
  • class roster (to keep track of absentees);
  • student usernames randomly generated by the EDSCLS platform;
  • proctor instructions; and
  • Survey Proctor Script (see Appendix D).

Several different kinds of activities and considerations are important when a data collection is open; these include communicating with respondents and survey personnel, monitoring submission rates, incentivizing participation, and evaluating potential nonresponse bias.

Guidelines for communication during administration:

  • The EDSCLS platform reports the number of usernames generated for each data collection, the number used to log in, and the submission rate (see 2.2.10.1 Survey Status Reports). Survey Administrators can use submission rates to motivate nonrespondents to participate; for example, the rates can be included in the reminders sent to participants. We recommend displaying the numbers in a visual format (e.g., pie graphs) for added effect. If you are conducting the survey in multiple schools, you can add a competitive aspect by publicizing the submission rates of each school. The same concept can be applied to a whole district or state, depending on the size of your administration.

Monitoring submission rates:

  • Achieving a high response rate is important in order to avoid nonresponse bias. Nonresponse bias occurs when the views expressed by those who respond do not reflect the views of the entire population. For example, the first responders to a school’s parent survey may be the parents who have the most frequent contact with the school and thus the most positive view of the school. If no effort is made to get the remaining parents to respond to the survey (i.e., to increase the response rate), the final parent survey results may be skewed toward more positive views of the school’s climate.
  • Please note that response rates and submission rates are often different from one another, for a variety of reasons. For example, a Survey Administrator may generate extra usernames, resulting in a higher denominator for the submission rate calculation. Since the EDSCLS platform cannot produce response rates, submission rates should be used as a proxy for response rates.
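The distinction can be sketched as follows. All counts below are hypothetical, and the eligibility count must come from your own records rather than the platform:

```python
# Hypothetical counts, for illustration only.
usernames_generated = 500    # includes extra usernames that were never handed out
eligible_respondents = 450   # known only from local records, not the platform
surveys_submitted = 360

# The platform can compute this:
submission_rate = surveys_submitted / usernames_generated
# A true response rate requires the eligibility count:
response_rate = surveys_submitted / eligible_respondents

print(f"Submission rate: {submission_rate:.1%}")  # Submission rate: 72.0%
print(f"Response rate:   {response_rate:.1%}")    # Response rate:   80.0%
```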

Increasing submission rates:

  • Reminder e-mails and/or letters can be sent to respondent groups to increase participation.
  • Who sends the reminders to which respondent groups is best determined by relationship immediacy. As such, instructional staff are the best contact points for reaching out to parents/guardians, principals are best suited to influencing instructional and noninstructional staff, and district leaders are best suited to achieving full participation from principals.

The problem of low submission rates is most acute for parent surveys of school climate. Consider the following strategies to optimize parents’/guardians’ submission rates:

  • Emphasize the value/actionability of the data gathered and ensure confidentiality. The randomly generated usernames are not connected to any individual. The parents’/guardians’ input is valuable because they are the only adult stakeholders in the school system who are not directly part of that school system.
  • Teacher-parent conferences provide an opportune venue for parents/guardians to complete the survey. The instructional staff give feedback on the child’s progress to the parents/guardians, and the parents/guardians can then provide feedback to the school about its climate. We recommend designating a room with computers or tablets where parents/guardians can fill out the survey while they wait for their turn with the teacher or when they are finished with their conference. Provide personnel to assist those parents/guardians who are not familiar with computers or tablets.
    • Be careful about soliciting parent input from only certain groups of parents/guardians (e.g., the parents at a PTA meeting). Such settings draw a narrower band of parents/guardians who are likely to be far more involved in their child’s school than the average parent, leading to an overrepresentation of a subset of the population.

Nonresponse bias:

  • When the response rate, by proxy of the submission rate,[4] is below 80 percent, a nonresponse bias analysis is recommended to determine whether the respondents to your survey are representative of the population in your school, district, or state and to assess the potential magnitude of nonresponse bias. The analysis will help evaluate whether the data, or the reports based on the data, are biased by the missing respondents.
  • EDSCLS administrators can use the frequency distributions of the demographic variables included in the reports (grade [student survey only], race/ethnicity, and sex) and compare them to a data source that includes the frequencies of these demographic variables for the total population. The nonresponse bias worksheet with embedded formulas is included in the EDSCLS package and can be used to carry out basic analysis. A worksheet for the student survey will look like this:
Figure. Nonresponse bias worksheet for the student survey
  • The “Percent of respondents” column indicates the characteristics of the respondents to the survey. The “Percent of students” column contains existing information from the administrative data of the school or school system. The EDSCLS administrator should enter the demographic characteristics included at the end of the item frequency report into the “Percent of respondents” column, and comparable data from the administrator’s records should be entered into the “Percent of students” column. The “Estimated bias” column indicates, in percentage point terms, the differences between the respondents and the overall student population. The “Relative bias” column indicates how large the bias is relative to the estimates in the “Percent of respondents” column. This analysis should also be done for the other respondent groups: instructional staff, noninstructional staff, and parents.
  • In cases in which the administrator has elected to preserve the link between the usernames and the identity of the students, student survey responses can be linked to other data sources the school/district/state may have in order to conduct more detailed bias analyses (using the additional student data to measure bias within the responding population as compared to the full population). Additionally, if the data are being collected at the district or state level, the administrator can incorporate additional school and district data for further analyses. If these additional data are used, the administrator would add the new variables to the “Student Characteristic” column and drag the formulas in the “Estimated bias” and “Relative bias” columns down to the row corresponding to the end of the list of characteristics.
  • The bias is computed by subtracting each value in the “Percent of students” column (e.g., 12.0 for grade 5 students in the table above) from the comparable value in the “Percent of respondents” column (e.g., 12.5 for grade 5 students). The relative bias is the bias estimate for each row divided by the “Percent of respondents.” For any group of respondents, if the estimated bias is larger than 1 percentage point (greater than 1.0 or less than -1.0), the survey data should be used with caution (e.g., 1.8 for grade 8 students). Administrators should also be cautious if the relative bias is larger than 0.3 or less than -0.3 (e.g., 0.4 for Asian students).
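For readers who prefer to script the calculation rather than use the worksheet, the arithmetic described above can be sketched as follows. The rows and numbers are hypothetical examples, and `bias_stats` is an illustrative helper, not part of the EDSCLS package:

```python
# Illustrative sketch of the nonresponse bias worksheet arithmetic.
# Row values are hypothetical examples, not real survey data.

def bias_stats(pct_respondents: float, pct_students: float):
    """Return (estimated bias in percentage points, relative bias)."""
    bias = pct_respondents - pct_students
    # Worksheet convention: divide by the "Percent of respondents" value.
    relative = bias / pct_respondents
    return bias, relative

# (characteristic, % of respondents, % of students)
rows = [
    ("Grade 5", 12.5, 12.0),
    ("Grade 8", 15.0, 13.2),
]
for name, resp, pop in rows:
    b, r = bias_stats(resp, pop)
    # Caution thresholds noted in the text: |bias| > 1.0 point or |relative bias| > 0.3.
    caution = abs(b) > 1.0 or abs(r) > 0.3
    print(f"{name}: bias={b:+.1f} pts, relative bias={r:+.2f}, use with caution: {caution}")
```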

The EDSCLS platform automatically produces a report of the results when the data collection window closes. However, if you wish to analyze the data further, you may export the raw data to a CSV file, which can be opened in Excel and many programming applications.
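As a minimal sketch of such further analysis, an exported CSV can be read with standard tools. The column names below (`username`, `Q1`, `Q2`) are assumptions for illustration only and will not match the actual export layout:

```python
import csv
import io
from collections import Counter

# Hypothetical excerpt of an exported CSV; real EDSCLS column names will differ.
sample = """username,Q1,Q2
u1,Agree,Disagree
u2,Agree,Agree
u3,Disagree,Agree
"""

# Parse rows into dictionaries keyed by the header row.
rows = list(csv.DictReader(io.StringIO(sample)))

# Tally responses to one (hypothetical) survey item.
q1_counts = Counter(row["Q1"] for row in rows)
print(q1_counts)  # Counter({'Agree': 2, 'Disagree': 1})
```

In practice you would open the exported file itself (e.g., `open("export.csv", newline="")`) rather than an inline string.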

Storage of the Data

The education agency that conducts the EDSCLS is responsible for storing the data in a secure manner. Any materials that directly or indirectly identify respondents should be kept in a locked compartment in a locked room when not in use.

Deletion or Preservation of the Data

The EDSCLS platform can be used for multiple cycles without deleting prior data collections, and the platform’s tools allow the data to be sorted by administration. Keeping the data makes multi-administration comparisons easier by establishing trend lines. However, the data are ultimately the responsibility of the education agency that collected them and deleting or preserving them is at the discretion of that agency.


[1]   Passive consent means parents must notify the school if they want their child to not take (opt out of) the survey.
Active consent means parents must notify the school if they want their child to take (opt into) the survey.

[2]   Note that acquiring active parental consent requires more advance notice than passive consent, as schools need to note which parents have sent in forms and send reminders, as necessary, to maximize the number of students who will take the survey.

[3]   The submission rate is the number of surveys completed (i.e., submitted to the EDSCLS system) divided by the number of usernames randomly generated by the system. Submission rates can be different from response rates. For example, a Survey Administrator may generate extra usernames, resulting in a higher denominator for the submission rate calculation. Since the EDSCLS platform cannot produce response rates, submission rates should be used as a proxy for response rates.

[4]   Note that submission rates can differ from response rates (see footnote 3 above). The EDSCLS platform is only capable of tracking submission rates. Users who want to track response rates will need to determine the requirements for defining respondent status and calculate the response rates using the raw data.

American Institutes for Research

U.S. Department of Education

The contents of the National Center on Safe Supportive Learning Environments Web site were assembled under contracts from the U.S. Department of Education, Office of Safe and Supportive Schools to the American Institutes for Research (AIR), Contract Number 91990021A0020.

This Web site is operated and maintained by AIR. The contents of this Web site do not necessarily represent the policy or views of the U.S. Department of Education nor do they imply endorsement by the U.S. Department of Education.

©2024 American Institutes for Research