2017 Faculty Survey
This summary presents survey data that informed the Request for Proposals (RFP) for a new Learning Management System (LMS) at Texas State University. The survey solicited information from users regarding current system utilization, perceptions of usability, satisfaction or dissatisfaction with current tools, and experience with other systems. The responses informed the development of the RFP, which was sent to vendors in November 2017.
The LMS Project seeks to replace the Sakai environment with a system upon which the university can grow a set of tools that can support its current and future teaching and learning needs. This system is strategically defined as an ecosystem of integrated tools. The LMS is the foundational piece of the strategy, and to fulfill that function it must be robust, easy to use, and flexible, and it must adhere to standards that enable growth and evolution. This survey was developed to ensure that the needs of the instructional community were part of the evaluation process for a new system.
An email notification announcing the survey was sent through the Texas State University email system to 2,507 active faculty members on April 13, 2017. The email linked faculty to a Qualtrics survey, which remained open for 30 days, ending May 13, 2017. Two follow-up communications were sent during the open period to maximize responses.
Questions focused on the user experience with the current system, satisfaction with its tools, and desires for a future LMS. The battery was adapted from a survey conducted by the University of Colorado Boulder's Office of Information Technology, based on an industry-standard instrument developed by the U.S. Department of Health and Human Services, and modified for this instance by Learning Experience Design (LxD) and Learning Applications Solutions (LAS), two internal departments in the Instructional Technologies Support unit.
The resulting sample of 458 respondents gives an overall response rate of just over 18%. Of that group, 92% reported using an LMS within the past two academic years, and of those, 93% reported that TRACS (Sakai) was their primary LMS.
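The headline figures above can be verified with a quick calculation; the counts below are taken directly from the text, and the derived user counts are approximations rounded to whole respondents.

```python
# Quick arithmetic check of the survey figures reported above.
invited = 2507        # active faculty who received the survey email
respondents = 458     # completed responses

response_rate = respondents / invited
print(f"Response rate: {response_rate:.1%}")   # just over 18%

# Approximate respondent counts implied by the reported percentages.
lms_users = round(respondents * 0.92)          # used an LMS in the past two academic years
tracs_primary = round(lms_users * 0.93)        # of those, TRACS (Sakai) as primary LMS
print(f"LMS users: ~{lms_users}, TRACS primary: ~{tracs_primary}")
```

The 18% figure checks out: 458 / 2,507 ≈ 18.3%.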
Among respondents who had not used an LMS in the past two academic years, the most frequently cited reason was a lack of adequate training, followed by insufficient time to learn the system.
Utilization of Tools and User Satisfaction
The current LMS offers a variety of tools to facilitate the needs of the instructional community. Part of this assessment was to determine which tools were most utilized and whether instructors were satisfied with them. A series of questions was asked regarding reasons for usage and for satisfaction or dissatisfaction.
Overall, results showed significant usage of the majority of the tools provided in the system. Seven of the eleven tools were used by more than 50% of respondents, with Gradebook and Email reported as the two most frequently used. The four tools that fell below 50% were Discussion/Forums, Calendar/Schedule, Groups, and Chat/Meetings. For under-used tools, the most frequently reported reason was that the instructor did not need them in their courses. Faculty were also either unfamiliar with certain tools or felt that alternatives outside the LMS were better suited to their instructional needs.
Regular users reported high satisfaction with the system's tools. The main reasons were that the tools meet users' needs, facilitate communication with students, and are consistently reliable. Satisfaction was proportionate to frequency of usage, with the Course Materials and Email tools reported as the most satisfactory. Calendar, Gradebook, and Announcements also received high satisfaction ratings.
Dissatisfaction did not have a proportionate relationship to frequency of usage: Gradebook was the tool respondents most disliked, followed by Assessments/Quizzes. Gradebook is of particular interest here because it is reported both as highly satisfactory and as not meeting faculty needs; further examination is likely necessary.
Overall, reports of dissatisfaction centered on tool features being inadequate and difficult to use; Gradebook and Assessments/Quizzes were the two tools consistently reported as falling short on both counts.
User Requests and Possible LMS Solutions
In addition to usage and satisfaction, the survey collected data regarding tools and functionality that would be desirable in a future LMS. To isolate these responses from the question of dissatisfaction with current tools, the data were collected in a free-response field.
Free-response feedback indicated that the most desirable improvements sought in a future LMS were ease of use, integration with external tools, mobile responsiveness, and reliability. This is reflected in the quantitative data, which showed that ease of use and reliability were the most valued aspects of the current system. Greater control over the customization of tools, flexibility in the system, and improved functionality for group work in online courses were also frequently reported.
Utilization and Perceptions of Users
TRACS is a well-utilized system, and users regularly engage with most of its tools (seven of eleven), as reported in the results section. Free-response data indicate that people are satisfied with the system, though the quantitative data suggest that users are not very familiar with other systems and their capabilities.
Users appreciate the reliability of the current system, the assistance it provides in coordinating classes, and how it facilitates communication with students. Users would prefer that Gradebook and Assessments/Quizzes were more robust in their features and easier to use. Additionally, there is a perception among respondents that a transition to a new system would imperil the content they have developed for the university within the Sakai system.
Suggestions from the free-response analysis informed the development of the feature requirements submitted to vendors in the RFP. Comments were used to determine which tools or features were key requirements of the RFP and which were value-add features. The information provided by faculty was deeply valuable in establishing requirements for the project.
From the data gathered, we learned which tools were important to faculty in an LMS and subsequently made those tools requirements that any chosen LMS must meet. We also learned how satisfied or dissatisfied faculty are with the tools available in TRACS, which helped further refine the requirements for specific tools. For instance, we learned that a majority of respondents use the Gradebook tool in their sites but reported only 71% satisfaction with the tool's performance. On further investigation, we found that respondents were not completely satisfied because the tool is perceived as difficult to use.
These data, and the correlations among them, not only helped us shape the required tools and functionality within an LMS but also educated us on the various pain points faculty experience. These issues, as identified in the responses, were used in the functionality requirements portion of the RFP.