FAQs

Below is a list of our most frequently asked questions about the product.

Why use SmarterSurveys when a course survey instrument is available within my learning management system?
  • ADVANCED FUNCTIONALITY – SmarterSurveys provides advanced survey building and reporting tools that are not available in the basic, free survey tools within learning management systems.
  • COMPARATIVE METRICS – If you use the standardized survey items within SETE (Student Evaluation of Teaching Effectiveness), we can produce reports that benchmark performance across similar institutions.
  • BIAS CONTROL – Using the proprietary SETE system, we can statistically control for student bias based on 24 factors such as the student’s grade in the course, gender, time of class, delivery mode, etc.
  • DATA BANK – We provide a data bank of over 300 possible end-of-course survey items which you can use in addition to your own items.
  • VALIDATED INSTRUMENT – We make available a bank of survey items that have undergone rigorous item analysis, resulting in items with high reliability and validity.
  • CUSTOMIZATION – Using our custom programming services, you can add desired functionality in a timely and affordable manner.
  • INTEGRATION – We provide many of the same integration features as LMS-based tools, such as single sign-on, deployment of surveys as assignments, deployment of surveys within the learning space of the LMS, and reporting of participation metrics to faculty.
  • APPROPRIATE ACCESS – Our tiered, role-based access to data ensures that only the appropriate persons obtain access to survey results.
  • CONSULTING – We can provide consulting services to help you improve your survey instruments and response rates.
  • EXPERTISE – Providing course surveys is what we do, not just one feature among many.  Measuring student satisfaction is critical to the improvement of your institution.  For that mission-critical role, trust the experts, not a general software company.

Can I customize the look of the SmarterSurveys tool?

Yes, your annual license gives you the right to completely customize the look of the tool to match your school's website.  Text and labels throughout the system can also be modified in the configuration section.

SmarterSurveys seems like a good tool for conducting evaluations.  Is there a way to try the product out for a trial period?

Institutions considering licensing SmarterSurveys may qualify for a trial period.  Contact us to request more information.

Do I need to install any software or buy expensive hardware?

No.  SmarterSurveys operates under a Software as a Service (SaaS) model. Under this model, there is no software your institution or users must install or manage. When your institution signs up, we will create a custom user interface that has the look and feel of your institution's website. We will give you the subdomain, and you can link to it in any way you choose.  SmarterSurveys is housed on dedicated servers in a secure, high-powered, redundant datacenter facility located in Birmingham, AL. The datacenter is monitored and supported by a group of onsite network administrators 24 hours a day, 7 days a week, 365 days a year. SmarterSurveys utilizes both automated network monitoring and human oversight to keep data secure. Complete information about the technical infrastructure of SmarterSurveys is available upon request.

Does SmarterSurveys have support staff to help with implementing the system and possible technical difficulties?

Yes.  As part of your initial setup, a member of our team will assist your administrative group with setup and with developing an implementation plan for the product at your institution.  Support for end users of SmarterSurveys is provided via an online support request ticketing system and a knowledge base. Links to a support request form and to the knowledge base are provided on each screen of the product. During business hours, support requests are answered within two hours; depending on the volume of requests, many are answered within minutes. Requests received prior to 9:00 PM CST on weekdays are typically answered the same night, with requests submitted later in the evening answered at the start of business the following day. On weekends and holidays, a response can be expected within 24 hours.

How soon are survey results available?

Persons with appropriate levels of administrative access can log into SmarterSurveys and view incoming data and reports in real time.  Schools are able to monitor results as they are being submitted.  Schools doing more detailed analysis of returns typically wait until after the survey has closed to complete more advanced computations, such as comparisons to previous terms.

How long are my reports available?

All data and reports are stored in the SmarterSurveys system for 10 years. Data older than 10 years is archived and available for a small fee.

Can data collected in SmarterSurveys be exported for use with other analytical programs?

Yes.  All data is available for export in multiple formats, including CSV, flat text file, and Microsoft Excel (XLS).
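
For illustration, exported data can be loaded directly into most analytical environments.  The short Python sketch below computes per-question averages from a hypothetical CSV export; the file name and column names (survey_export.csv, question_id, response_value) are assumptions for the example, not the actual export layout:

    import csv
    from collections import defaultdict
    from statistics import mean

    # Accumulate numeric responses per question from a hypothetical export file.
    responses = defaultdict(list)
    with open("survey_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            # Column names are illustrative; check your actual export headers.
            responses[row["question_id"]].append(float(row["response_value"]))

    for question_id, values in sorted(responses.items()):
        print(f"{question_id}: mean={mean(values):.2f} (n={len(values)})")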

Who can view the survey results?

Each school identifies one person as the SmarterSurveys primary administrator.  This person then assigns other persons at the school the appropriate level of access to data and reports.  Generally, faculty are only able to see results from their own courses, and administrators such as department chairs can only view reports within the scope of their authority.
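
As a loose sketch of how tiered, role-based scoping works in principle, the Python below checks whether a user may view a course's results.  The role names and data structures are hypothetical illustrations, not SmarterSurveys' actual implementation:

    from dataclasses import dataclass

    @dataclass
    class User:
        name: str
        role: str                 # "faculty", "chair", or "admin" (illustrative roles)
        own_courses: set          # course IDs taught by this user
        department_courses: set   # course IDs within this user's department

    def can_view_report(user: User, course_id: str) -> bool:
        """Return True if the user's role grants access to this course's results."""
        if user.role == "admin":
            return True                      # primary administrator sees everything
        if user.role == "chair":
            return course_id in user.department_courses
        if user.role == "faculty":
            return course_id in user.own_courses
        return False

    chair = User("Dr. Lee", "chair", {"ENG101"}, {"ENG101", "ENG205"})
    print(can_view_report(chair, "ENG205"))  # True: within the chair's department
    print(can_view_report(chair, "BIO110"))  # False: outside the chair's scope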

What results do students see?

Immediately after submitting their data, students receive a confirmation message.  If a school wants students to see some form of generalized summary report of other students' entries, this can be configured; however, this practice is not typical.

What types of questions can be asked on the survey?

Schools determine the types of questions that are asked.  Any objective question type can be asked.  Subjective questions, such as open-ended items, can also be asked, but no calculations are performed on that input; it is simply reported as the students submitted it.  Common objective question types include multiple choice, true/false, Likert-type scales, checkboxes, radio buttons, and drop-down boxes.
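
For illustration only, a survey item might be represented internally as structured data along the following lines; the field names are hypothetical, not the actual SmarterSurveys schema:

    # Hypothetical representations of common question types; field names are
    # illustrative, not the actual SmarterSurveys schema.
    likert_item = {
        "type": "likert",
        "text": "The instructor explained the material clearly.",
        "choices": ["Strongly Disagree", "Disagree", "Neutral",
                    "Agree", "Strongly Agree"],
        "scored": True,   # objective items feed into calculations
    }
    open_ended_item = {
        "type": "open_ended",
        "text": "What could be improved about this course?",
        "scored": False,  # subjective input is reported verbatim, not calculated
    }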

Can SmarterSurveys control for bias and/or provide a standardized score?

Yes.  You have the option of using SmarterSurveys as a basic survey collection and reporting tool or using one of our two methods of scaling scores and controlling for bias.  Our fixed approach uses the SETE (Student Evaluation of Teaching Effectiveness) instrument.  This 12-item instrument, developed by the University of North Texas, is a validated response scale which has evolved from over three years of research and development.  With this instrument, you can control for bias and generate scaled scores for each evaluated user.  Our flexible approach uses the ASA (Advanced Survey Analysis) tool.  This integrated tool allows you to use your existing end-of-course items while still controlling for bias and generating a scaled score for faculty.
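
The SETE and ASA models themselves are proprietary, but the general idea of statistically controlling for a bias factor can be sketched as follows.  This toy Python example regresses raw ratings on a single assumed factor (the student's expected grade) and standardizes the residuals; the real models control for many more factors:

    import numpy as np

    # Toy data: raw instructor ratings (1-5) and one bias factor, the student's
    # expected grade (4=A ... 0=F). The real SETE model controls for 24 factors.
    ratings = np.array([4.5, 4.0, 3.5, 4.8, 2.9, 3.2])
    expected_grade = np.array([4, 3, 2, 4, 1, 2])

    # Fit ratings = b0 + b1 * expected_grade by least squares.
    X = np.column_stack([np.ones_like(expected_grade), expected_grade])
    coeffs, *_ = np.linalg.lstsq(X, ratings, rcond=None)

    # Residuals are the part of the rating not explained by the bias factor;
    # standardizing them yields a scaled, bias-adjusted score (z-score).
    residuals = ratings - X @ coeffs
    adjusted = (residuals - residuals.mean()) / residuals.std()
    print(np.round(adjusted, 2))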

How are students' responses anonymous if reminder emails can be sent to non-participating students?

SmarterSurveys uses an advanced process through which the student first authenticates into the system.  Once our system determines that the student is included in the data file we received from the school, we provide the appropriate survey to the student.  The processes for authentication and data submission are completely separated in our database.  We are technically unable to match student entries with student contact information.
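
One generic way to achieve this kind of separation is sketched below; the stores and function are hypothetical illustrations of the principle, not SmarterSurveys' actual schema.  Participation status is tracked for reminder purposes, while responses are stored under a random token that cannot be traced back to the student:

    import secrets

    # Two logically separate stores: participation status (who has submitted)
    # and the responses themselves. The response store holds no student identity.
    participation = {}   # (student_id, course_id) -> bool, used only for reminders
    submissions = []     # anonymous response records, keyed by a random token

    def submit_survey(student_id: str, course_id: str, answers: dict) -> None:
        # Record that this student has participated (so reminders stop) ...
        participation[(student_id, course_id)] = True
        # ... but store the answers under an unlinkable random token.
        submissions.append({"token": secrets.token_hex(16),
                            "course_id": course_id,
                            "answers": answers})

    submit_survey("s123", "ENG101", {"q1": 5, "q2": "Great course"})
    print(participation)            # shows who has submitted, not what they said
    print(submissions[0]["token"])  # no path from this record back to s123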

How does SmarterSurveys prevent students from submitting more than one response per course?

Using the data file provided by the school to SmarterSurveys, we create a single “digital ticket” for each survey, per course and per student.  Once that “ticket” has been used by a student for a course, the student cannot access the survey for that course again.  Students can still access surveys for any other courses for which they have not yet completed the survey.
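
A minimal sketch of the one-use ticket idea, with hypothetical data, might look like this in Python:

    # A minimal sketch of a one-use "digital ticket" per student per course.
    # Real ticket issuance is driven by the enrollment file the school provides.
    tickets = {("s123", "ENG101"): "unused",
               ("s123", "BIO110"): "unused"}

    def redeem_ticket(student_id: str, course_id: str) -> bool:
        """Grant survey access once; refuse if the ticket was already used."""
        key = (student_id, course_id)
        if tickets.get(key) == "unused":
            tickets[key] = "used"
            return True
        return False

    print(redeem_ticket("s123", "ENG101"))  # True: first access is allowed
    print(redeem_ticket("s123", "ENG101"))  # False: duplicate attempt is blocked
    print(redeem_ticket("s123", "BIO110"))  # True: other courses still open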

Can students submit surveys for more than one course at a time?

SmarterSurveys is designed for students to submit their perceptions about a single course at a time.  However, immediately after a student submits input for one course, they are asked if they would like to submit feedback for another course.  The student is shown a list of the courses for which they have not yet submitted a survey.

How often can schools use SmarterSurveys?

Schools can use SmarterSurveys as often as they would like.  While SmarterSurveys is most often used for end-of-course evaluations, it can also be used for mid-course evaluations or for any other type of survey that the school would like to administer.

How do students access SmarterSurveys?

SmarterSurveys works with each school to provide the access process that is best for its students.  SmarterSurveys provides multiple options for access, including:

  1. SmarterSurveys sends emails to students providing instructions and a unique link to the evaluation (a sketch of how such a link might be generated appears after this list).
  2. A link to the school's SmarterSurveys site is embedded in each online course.  Students click the link to access the surveys.
  3. In either of the above models, students can be automatically logged into the appropriate surveys, or they can use a username/password to authenticate into the system.
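
As a generic illustration of option 1, a unique, tamper-evident survey link might be generated by signing the student and course identifiers; the domain, path, and secret key below are hypothetical, not actual SmarterSurveys values:

    import hmac, hashlib

    SECRET = b"shared-secret-key"  # hypothetical signing key

    def survey_link(student_id: str, course_id: str) -> str:
        """Build a unique, tamper-evident link for one student's survey."""
        payload = f"{student_id}:{course_id}"
        sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()[:16]
        # Domain and path are illustrative, not an actual SmarterSurveys URL.
        return f"https://surveys.example.edu/take?u={payload}&sig={sig}"

    print(survey_link("s123", "ENG101"))
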
Can SmarterSurveys be integrated with existing user authentication systems?

Yes, SmarterSurveys can be integrated with existing user authentication systems such as LDAP, NTLM, Microsoft Active Directory, and others.
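
As a rough sketch of what directory integration involves, the example below performs a simple LDAP bind using the open-source ldap3 Python library; the host name and DN pattern are hypothetical, and the actual integration is configured with your directory team:

    from ldap3 import Server, Connection, ALL

    def authenticate(username: str, password: str) -> bool:
        """Attempt a simple LDAP bind; a successful bind verifies credentials."""
        # Host and DN pattern are hypothetical; substitute your directory's values.
        server = Server("ldap.example.edu", get_info=ALL)
        user_dn = f"uid={username},ou=people,dc=example,dc=edu"
        conn = Connection(server, user=user_dn, password=password)
        return conn.bind()  # True if the directory accepted the credentials

    print(authenticate("jdoe", "correct-horse-battery-staple"))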

How long can the SmarterSurveys survey be active?

Schools determine the dates of availability for all surveys.  Typical windows range from one to three weeks.  A common model is for the survey to open with one week left in the course and remain open for two weeks afterward.  It is also conceivable that a school might have some type of survey, such as a student services survey, which remains open at all times.

Are students required to complete the entire survey?

Schools set the policy on whether or not students are required to complete the survey.  Most schools strongly encourage students to participate but do not require it.  Schools can determine which, if any, of the questions on the survey are required.

How long is the data available?

All data is automatically archived for up to ten years if the school takes no action to delete it.  The primary SmarterSurveys administrator at the school can choose to delete data from the system.  At that point the data is truly deleted and cannot be recovered, so deletion should be performed carefully.

Is SmarterSurveys Section 508 compliant?

Yes, but when creating surveys with SmarterSurveys, you must ensure that any graphic contained in the survey includes an alt tag describing the graphic.

In addition, some users of screen reader applications have difficulty with table-oriented structures such as those used in a matrix question. While matrix questions are technically still 508 compliant, SmarterSurveys recommends that you avoid this question type if you need to adhere strictly to Section 508.
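
As an illustration of the alt-tag requirement, a small lint like the following (built on Python's standard html.parser module) flags images that lack alt text; this is a generic example, not a SmarterSurveys feature:

    from html.parser import HTMLParser

    class AltChecker(HTMLParser):
        """Flag <img> tags that are missing a non-empty alt attribute."""
        def handle_starttag(self, tag, attrs):
            if tag == "img":
                attr_map = dict(attrs)
                if not attr_map.get("alt"):
                    print(f"Missing alt text: {attr_map.get('src', '?')}")

    checker = AltChecker()
    checker.feed('<img src="chart.png">')                   # flagged: no alt
    checker.feed('<img src="logo.png" alt="School logo">')  # passes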

Can results be printed or linked from other websites?

Yes, all score reports are available in PDF format so that users can print them and/or link to them from another website. 

Is technical support available to faculty and students using SmarterSurveys?

Yes, during normal business hours (Central Time), technical support is available via a support ticket system, email, and/or toll-free phone. After-hours support is available via the support ticket system, email, or voice mail. After-hours support requests are responded to before, or shortly after, the start of the next business day.

How is SmarterSurveys priced?

Pricing for SmarterSurveys is based on the number of users.  Users are individual students, faculty, and administrators.

What can schools do to increase response rates on end-of-course evaluations?

One of the reasons schools select SmarterSurveys is that it can increase the response rate on end-of-course evaluations.  We realize that a higher response rate helps schools spot meaningful trends in their data.  Results built on lower response rates are often overly influenced by outliers: the very pleased or very disappointed students.  SmarterSurveys recommends the following actions to increase the response rate:

   1. Promote the end-of-course surveys prior to the end of the term.  Schools can place posters around campus, post announcements on bulletin boards, and stuff reminders in student mailboxes.  Faculty can make announcements to students and add instructions for taking the survey to their course materials.
   2. Give students a long window of opportunity to submit the surveys.  Students are very busy, especially around the end of the term as they are composing papers and studying for exams.  A common model is that schools open the survey with one week left in the course and then leave it open for two weeks after the course has ended.
   3. Regularly remind the students about the survey.  One of the features of SmarterSurveys is that we can send a daily email reminder to students who have not yet submitted their end-of-course surveys (a schematic of how non-respondents are selected appears after this list).  The email provides clear instructions on how to access the surveys and reminds them of the courses for which they have not yet submitted surveys.
   4. Provide an incentive for participation.  Schools can provide rewards ranging from free tuition for one course to a $25.00 bookstore gift certificate.  SmarterSurveys can randomly select one or more students as winners of the drawing.  Each term, schools should publicize the winners.
   5. Assure students that their input is anonymous.  When students know that the school itself is collecting the data, they may fear that their input is not anonymous.  Because SmarterSurveys is a trusted third party, we can assure students that their contact information is never reported with their evaluation input.
   6. Inform students of improvements that have been made based on their input.  When a change in policy or practice is made based on a trend indicated in the SmarterSurveys data, telling students encourages them because they know that their thoughts are making a difference.
   7. Require participation.  Schools can require students to participate before their grades are released for the term.  Caution should be used in considering this method, because students who submit the survey just to meet the requirement may give responses that do not accurately reflect their true perceptions.
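
As a schematic of how non-respondents are identified for the daily reminders described in item 3, the Python sketch below filters an enrollment list against redeemed survey tickets.  The data structures and identifiers are hypothetical; in practice this selection is driven by the enrollment file the school provides:

    # Hypothetical enrollment and submission data; real data comes from the
    # school's data file and the redeemed "digital tickets."
    enrollment = {"s123": ["ENG101", "BIO110"], "s456": ["ENG101"]}
    submitted = {("s123", "ENG101")}

    def pending_courses(student_id: str) -> list:
        """Courses for which this student has not yet submitted a survey."""
        return [course for course in enrollment[student_id]
                if (student_id, course) not in submitted]

    for student in enrollment:
        pending = pending_courses(student)
        if pending:
            # In production, this would send an email with access instructions.
            print(f"Remind {student}: surveys pending for {', '.join(pending)}")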

Thomas M. Archer of Ohio State University published an article titled "Characteristics Associated with Increasing the Response Rates of Web-Based Surveys." Archer made the following recommendations for increasing the response rate on web-based surveys: [1] Increasing the total days a questionnaire is left open, with two reminders, may significantly increase response rates. It may be wise to launch in one week, remind in the next week, and then send the final reminder in the third week; [2] Potential respondents must be convinced of the potential benefit of accessing the questionnaire; and [3] Do not be overly concerned about the length or detail of the questionnaire; getting people to the website of the questionnaire is more important to increasing response rates.

Archer, Thomas M. (2007). Characteristics Associated with Increasing the Response Rates of Web-Based Surveys. Practical Assessment, Research & Evaluation, 12(12). Available online: http://pareonline.net/getvn.asp?v=12&n=12

According to the research literature, what factors impact a student’s end-of-course evaluations?

Studies have shown that students' ratings are affected by characteristics of the course, the instructor, and the student (Dukes & Victoria, 1989). Conventional wisdom among university professors seems to be that instructor evaluations are heavily influenced by the grades the instructor gives (Winer, 1999). Sonner and DeLoach (2003) noted in their study that 28% of non-traditional university students believe that evaluations are influenced by grades. To ensure valid and accurate evaluations from students, a proposed evaluation process should be sensitive to the "difficulty of the first test" (Hewett, 1988) and to final grades for the term. Other characteristics that affect ratings of the instructor include:

  • Students' Interest: "...students' self-reported sleepiness in the class was negatively correlated with the ratings of instructor's ability to explain material clearly and understandably". (Tang, 1987)
  • Student Participation: "Participation in classroom discussions was significantly related to the students’ ratings of the professor’s ability to stimulate students’ interest in the subject matter, willingness to talk with students outside of class, and knowledge in this class". (Tang, 1987)
  • Purpose of Student Evaluation: "...the mean rating was higher when the student was told the purpose [of the evaluation] … The results of this study agree with the findings of Driscoll and Goodwin, Aleamoni and Hexner, Sharon and Bartlett, and Taylor and Wherry which show that certain types of information tend to elevate or increase average student ratings". (Douglas & Carroll, 1987)
  • Socialization with Instructor Outside of Class: "Socializing with students outside of class improved a female Instructor’s [ratings], but social contact did not affect the ratings given to male Instructors". (Kierstead, D’Agostino & Dill, 1988)
  • Perceived Friendliness of Instructor: "Smiling slightly depressed ratings given to male instructors, but it elevated those given to female Instructors". (Kierstead, D’Agostino & Dill, 1988)
  • Grades Received: "The biasing effect of grades on evaluations has been clearly demonstrated in a variety of experimental studies". (Cahn, 1987) "Generally class evaluations are higher in classes with higher student grades". (Shapiro, 1990) "In other words, students satisfied with their grades took credit for it; students dissatisfied with their grades blamed the Instructor". (Benz & Blatt, 1996)
  • Story telling: "If it is the case that story-telling is what it means to be ‘interesting,’ as we concluded in this study, then faculty may want to apply this understanding to their teaching, i.e., tell stories." (Benz & Blatt, 1996)
  • Use of Textbook: "The present study showed that reading the textbook before coming to the class was positively associated with their ratings of the instructor’s knowledge of the subject matter and students’ knowledge in this class..." (Tang, 1987)
  • Teacher Behaviors: "...male and female instructors will earn equal [ratings] for equal professional work only if the women also display stereotypically feminine behavior". (Kierstead, D’Agostino & Dill, 1988)
  • Teacher Delivery: "For two semesters at Cornell University, Stephen J. Ceci taught his developmental psychology course in exactly the same way – with one exception. The second time, he spoke more enthusiastically, varied his pitch, and used more gestures. The result was a major improvement in student ratings of his course." Researchers "were stunned to find that Dr. Ceci had earned much better scores in the second semester for his level of knowledge, organization, fairness, and even quality of the textbook. Yet student performance on the tests was about the same in both courses." (Chronicle, 1997)