2010 Student Surveys: Complete results

In two major surveys, students get the chance to grade their own universities.

There are many ways by which a university can measure its performance, including asking those on the receiving end of an education—the students—what they think. In recent years, a growing number of universities have been doing exactly that. The following pages contain results from two major student surveys: the National Survey of Student Engagement and the Canadian University Survey Consortium—NSSE and CUSC for short. Between them, these surveys examine how involved students are in various academic and extracurricular activities, how satisfied they are with their university and its faculty, and how connected they feel to their school.


The findings show that while students are generally happy with their university education, there are key areas of discontent. In particular, a significant number of students feel they don’t fit in at their university, more often in the larger schools than the smaller ones.

Commissioned by the universities, the surveys ask more than 150 questions about the undergraduate experience—inside the classroom and beyond. The answers help each university assess the quality of its programs and services, which in turn can aid in designing strategies to improve areas of weakness.

Recognizing that this data can also be useful for prospective students trying to decide which university is right for them, Maclean’s has been publishing CUSC and NSSE results each year since 2006. They provide direct feedback from students on the quality of their education and their general level of satisfaction.

The U.S.-based NSSE began in 1999 and is distributed to first- and senior-year students. Administered by the Indiana University Center for Postsecondary Research, NSSE is not primarily a student satisfaction survey. Rather, it is a study of best educational practices and an assessment of the degree to which each university follows those practices. The survey pinpoints what students are doing while they are in school and on campus.

Research has shown that various forms of engagement are likely to lead to more learning and greater student success. And this link exists not only in the more obvious areas of academic endeavour, such as the number of books read and papers written, but also in curricular extras such as conducting research with a faculty member, community service, internships and studying abroad, as well as in extracurricular involvement with other students.

In 2004, 11 Canadian universities participated in NSSE for the first time, with 14,267 students completing the survey. Participation has grown considerably over the years with 64 institutions having conducted the NSSE survey at least once and 12 more schools set to take part for the first time this year. Fourteen institutions representing 8,965 students participated in the 2009 survey.

The NSSE results are headlined by the Benchmarks of Effective Educational Practice, created by NSSE to compare performance across all universities—American and Canadian. These benchmarks focus on five key areas: level of academic challenge, student-faculty interaction, active and collaborative learning, enriching educational experience, and supportive campus environment. The higher a school’s scores from student responses on the five benchmark topics, the better the chance, according to NSSE, that its undergrads are learning and getting the most out of their university experience. NSSE also asks two important student satisfaction questions; school-by-school results appear on the following pages.

CUSC, a group of Canadian universities working together to examine student issues and experiences, was created in 1994 and administers a Canada-only survey. Unlike NSSE, CUSC’s focus is largely on student satisfaction. Each year, its survey targets one of three student populations: first-year students, graduating students and all undergrads. In 2009, 34 institutions took part, administering an online questionnaire to a random sample of approximately 1,000 graduating students at each university. Institutions with fewer than 1,000 graduating students surveyed them all. In total, more than 12,000 students took part for an overall response rate of 45 per cent. The following pages feature results of two CUSC student satisfaction questions.

In its summary of the 2009 results, CUSC said most graduating students had a positive assessment of their university and its faculty. Nearly nine out of 10 were satisfied with the overall quality of their education and their decision to attend their university. The vast majority agreed that their professors were knowledgeable in their fields, encouraged class discussion and were accessible outside of class. But the report identified one key area of weakness, summed up as “inclusion.” Fully a quarter of students stated that they didn’t feel like they were part of their university. Perhaps not surprisingly, the number was higher for students attending larger institutions (31 per cent) than for those at smaller schools (21 per cent). An even greater number of students—57 per cent—reported feeling they sometimes got the runaround at university. Again, smaller institutions tended to fare better on this measure with 54 per cent of students voicing this complaint compared to 62 per cent at larger schools.

Learning from the feedback these surveys provide, universities are now devoting more time and energy to enhancing the student experience—including students’ need to feel that their identity on campus amounts to more than just their student number. Future survey results will show how well those improvements are progressing. (For details on how some universities are using their NSSE results, see “Applying knowledge.”)

READING THE RESULTS

The charts on the following pages list universities, including affiliates and second campuses, that have taken part in NSSE surveys or the 2009 CUSC. Each chart lists the universities in descending order of achievement. In NSSE benchmark results, universities appear according to their senior year scores; responses to student satisfaction questions are ordered according to the percentage of survey participants who chose the highest level of satisfaction (e.g., “excellent”).

The NSSE and CUSC surveys include more than 150 questions. The results we publish—the five NSSE benchmarks, plus two satisfaction questions each from NSSE and CUSC—are the broadest and most representative of the student experience.

Some universities have completed the NSSE survey four or five times since 2004—others just once. As we are displaying results from the most recent year that an institution participated in the survey, readers should bear in mind that the data are from various years. This year’s NSSE charts include 14 universities that took part in the survey in 2009, 37 from 2008, four from 2007 (Lethbridge, Mount Allison, UNB’s Fredericton campus and UNBC), one from 2006 (St. Thomas) and one from 2005 (Regina). The average NSSE score shown in each chart, however, is derived from 2009 results only. (There were fewer participating schools in 2009 than in 2008, as most institutions do not conduct the survey two years in a row.)

First-year student data are not displayed for Royal Roads University as it does not offer first-year courses. Similarly, there are no results for senior-year students at Mount Royal University as it received university status only last year. Data for the three affiliate colleges at the University of Western Ontario are displayed separately from the main institution’s.



COMMENTS

  1. Pingback: Are Canadian students satisfied with university? « Tony Bates

  2. Universities are no longer places of higher learning but glorified trade schools, with students graduating without the knowledge. Universities should be places for mature learning, not extended high schools.

  3. I was just asked to fill out the NSSE survey, and I only answered about a third of the questions because I was concerned with what I suspected were methodological problems with others.

    For example, the survey asks things like “how often do you write more than one draft of a paper?” “how many books were you assigned this year” and “how often do you introduce alternative opinions (can’t remember the exact language) into your essays and classroom discussions?”

    From what I’ve seen of the results of this survey, this data appears to inform conclusions about how high academic standards are. However, in reality, there are no questions on the survey that determine to what extent these behaviors are motivated by university standards and to what extent they are behaviors that students exhibit DESPITE university standards.

    In the case of the book count, there is also no assessment of the level of difficulty of the books. My course at SFU last term assigned 4 books, but two were novels and all were easy readers. My course at UBC during the summer assigned only one book, but it was extremely dense. If one were judging academic standards by the book count, one would conclude that the SFU course was tougher. Such a conclusion would, in reality, be laughable. The UBC course was 10 times more challenging.

    There are also questions such as “how many times have you worked harder than you thought you could to meet a professor’s expectations”? This kind of question potentially leads to a result in which schools with dumber students (if I may be so unPC as to call them that) score higher in academic standards. In order to get a sense of how this works, one only needs to compare the RateMyProfessors.com ratings of professors who teach at more than one institution. For example, I had one professor at university who also taught at a community college. He was a tough but brilliant prof who got a 4.9 on his university RMP rating. He got a lower rating from the community college students because some found him too hard. If a good school is judged by how hard students *perceive* that they have to work, then the community college would score higher than the university. In reality, however, I would say that the university students were simply used to performing to higher standards, making the university, in fact, the superior institution.

    I think the only credible way to measure academic standards is through standardized testing of students—which of course is virtually impossible at the university level. However, only then can you separate out the true quality of a student’s learning from their subjective interpretation of what they got out of school.

  4. Athabasca U should be included; it’s a shame that, with the number of students enrolled (including myself, a loyal reader), it doesn’t get mentioned on your list.

  5. Pingback: University Of Ottawa Gets A Big Failing Grade | Unambiguously Ambidextrous

  6. Clearly the Jesus freaks love it, but Trinity Western should not be on this list. It is not a real educational institution.

  7. Trinity Western is really good

  8. I just completed a BComm at Royal Roads. Good program – not sure I’d rate it the hardest thing to get through, but I took it seriously and learned a lot – because I wanted to. I had people in my class (not many, but a few) who seemed to just coast, and they got through fine as well. The feedback and marking from the teachers is kind of all over the place – a lot of emphasis on just a few assignments and one exam. Keep in mind school is big business – and pushing people through means they keep their pocketbooks open.

    I have heard that the Masters programs there are quite tough, and the delivery method (distance) has taught me how to be a better online/knowledge worker.

    I am glad I took the program – surprised to see Royal Roads at the top of this list, though.

  9. Nice to see Royal Roads with good rankings. As a “newer” school that differs from the traditional complete-lecture-style education, it still struggles in the eyes of some for credibility. After completing the MBA program I have to say it was an amazing experience, and as the traditional and archaic models of delivery evolve, Royal Roads will be on the leading edge of quality, flexibility and service…

  10. So I guess the student surveys repeatedly show that smaller universities bring you a better educational experience, but “academic” calculations always put UofT, McGill or UBC way up there and the smaller universities are virtually invisible. Massive disconnect between actual experience and “reputation”.

  11. How is that a disconnect? As someone commented on one of the other threads “the best schools, get the best students with the best grades, period”

    In my experience, these surveys rate academic challenge based on a number of factors that are really tests of how competent students are. If a student who was rejected by UofT because of low grades reports that s/he is working his/her butt off at Trent just to keep his/her head above water, does that make Trent a better school than UofT?

    I would say probably not.

  12. I understand that some schools don’t require high academic standards from applicants as much as others, but that does not mean they are inferior. I personally don’t like the mentality of schools that judge submissions based only on academics, since it creates a focus on academics alone instead of a well-rounded person. I’m not trying to insult larger universities, far from it; however, smaller universities have a tendency to look more at the person as an individual than at their grades alone. Combined with the higher amount of student-professor interaction, I prefer small schools in that they are capable of providing a structured form of learning that isn’t just bound to textbook knowledge. I was accepted to larger universities, and I still chose a smaller school despite having good grades. A smaller school isn’t always a fallback for those who were denied elsewhere due to their grades. You have sub-par students everywhere, but because of the lower student population they sometimes become a greater focus at a small-scale school. That’s my view though, not discrediting anyone else’s opinions.

  13. If you notice, larger universities tend to get lower ratings from students on satisfaction, interaction, and personal attention scores because larger universities are more like knowledge manufacturing plants that produce masses, while the smaller or medium-sized ones do the contrary by providing attention and better services to students. Larger and better-branded names tend to be over-hyped and over-rated because they focus more on research activities and less on teaching. Therefore, different levels of students should focus on different unis: masters or PhD candidates should go for the bigger names, while undergrads, or masters students who require teaching, should target the smaller or medium ones that provide a better student-faculty ratio and better attention to the needs of the students. Having attended bigger (branded) US unis and also smaller (less branded) ones, I tend to suggest undergrads go for smaller ones with better student-faculty ratios for a higher-quality education.

  14. Standardized tests, at least in and of themselves, are certainly not the way to establish quality of education, and this is coming from someone who makes their living in the academic arena of language testing and assessment.

    I’ve lived and worked in countries (like Korea and Malaysia) where standardized tests rule education, and it’s an awful scenario, for teachers, students, and employers alike. Students and instructors waste hours upon hours delivering and memorizing stagnant bits of information (bits of information which are increasingly likely to be obsolete a few short months after the exams are finished) to be regurgitated on some test, learning about skills (like language, numeracy, research, analytical and creative skills) rather than actually practicing and developing them. The end result is graduates who are very good at memorization and little else, and employers get stuck hiring students with little practical abilities in their fields of study.

    As a quick example, I’ve met many students who tested extremely highly on English language tests and yet cannot hold a basic conversation with me about what they did yesterday. They’ve been trained how to take the test, to memorize static grammar rules and the like, and can actually _do_ very little with the language. What’s the value in that?

    The answer is in assessment (or Quality Assurance as it’s referred to in the UK) – students, instructors, employers, and other stakeholders deciding together what outcomes of higher education they want and need their graduates to be proficient in or with. These quite often include the skills I mentioned already – literacy, numeracy, analytical skills, etc. – as well as ethics, civic responsibilities, and so on. These can be developed and assessed through a variety of traditional and non-traditional means – quizzes and exams, essays, and, increasingly, portfolios of work from a variety of courses, monitored and guided internships, and a capstone project requiring the synthesis of many of the desired outcomes in some sort of academic project prior to graduation.

  15. That sounds like a better plan than standardized testing alone. However, I would argue that there is a place for rote memorization in university. My comment on standardized testing was inspired, in part, by my experience in a history class where an obviously under-qualified professor told the class that he wasn’t really big on making us memorize names and dates and events like other profs. He wanted us to decide what was important. Well, I would argue that in history, you have absolutely zero ability to make judgements about “what’s important” if you don’t have a pretty detailed chronology and list of key figures etched into your memory.

    The prof in question would never be identified as incompetent through student course evaluations. He has sky-high RateMyProfessors ratings because he’s funny and marks easily and brings donuts to class.

    This is why I think standardized tests would go some way toward catching profs who rely on kindness to make up for lack of preparedness. …at the expense of students who were genuinely hoping to learn something from the class.

    That said, having also had a prof who was big on rote memorization without providing a lot of context, I can sympathize with your point about the down side of standardized tests.

  16. Just a note to Mature Student:

    During the course of my academic career, I have had occasion to study at both Trent and U of T. Trent may have had softer entrance requirements, but that hardly translated, in my experience, to a lower calibre student body. I don’t necessarily disagree with most of what you’ve been saying, but at the same time I caution against putting too much stock into current practices used to evaluate undergraduate applicants. As Mr. Jerema observes, high-school grades are a fickle measure of a student’s academic future.

    http://oncampus.macleans.ca/education/2010/07/08/your-grades-will-drop/

  17. Point taken, UVic, and my comment was not intended as a slight against Trent specifically. I’ve attended neither Trent nor UofT and cannot comment on either from personal experience. My observations are really based on my experience at the various Vancouver area schools.

  18. I am not sure how well the university rankings are being done, or about the way they ask questions on overall student satisfaction and success. I know from experience, when I was studying in Ontario at St Lawrence College, that I noticed a big drop in student success. This is due to a lot of the first-year students being totally unprepared for college, let alone university. I peer-mentored two classmates who were fresh out of high school; both could barely read, let alone write a two-page paper, and their reading comprehension was very poor to nonexistent. So when these surveys come out, I can see these first-year students ranking things on the lower end of the scale. In my final year at St Lawrence College, I heard that nearly half of the first-year students were failing out of school. I think this is because of their total unpreparedness for college and university life, as well as how pop culture portrays college and university in movies and TV as big parties with lots of girls, where the students are never hung over from the night before when they get to class. Also, a lot of students getting out of high school in the last few years have been used to receiving extension after extension on late assignments, and are passed on to the next grade with the hope of picking it up there. A friend who is a child youth worker explained this to me: “holding a student back a grade can be detrimental to their social development and can cause them stress not being in the same class as their friends, but can give them the feeling that they are a failure. And this can cause them to believe that is all they are.” I now study at the UofA, and a short while ago there was debate here about why students who barely passed high school, and are barely passing university, were being accepted despite being unable to meet the entrance requirements, let alone assignment due dates.

    I overheard a first-year student one day complaining that the professor would not give her an extension on her assignment, and that for each day late she has 5% taken off the assignment mark. She was quite mad because she always got extensions in high school. One of the main points made in the debate on university rankings was that students’ lower grades and GPAs, as well as their overall satisfaction, would affect their opinion of their university/college experience and in turn affect the survey rankings.

  19. Then again, a big proportion of people failing first year is not unexpected. I mean, schools do end up filling slots in first year to earn huge dollars, especially at the big research schools. Money for research has to come from somewhere!

    As for why many top students fail, I’ll take the known example of Wateroo (if you think I spelled the name wrong, you are the example).

  20. Looks like all the rabid Neo-con Anne Coulter hate speech supporters are out in force in a full frontal attack on U of O.

  21. I am disappointed that UBC-V and UBC-O are not separated out, as Western’s three campuses are. UBC-O has been up and running fully for some time, so I would like to see how they rate against each other and against others of the same size.

  22. To Observer Guy and others who think that “TWU is not a real school”: why don’t you come and actually observe or participate in a class before you cast judgment on a place you’ve never attended? I suppose the nurses and teachers and other professionals graduating from our school aren’t “real nurses” or “real teachers” either, by your false and inferior judgment.

    I suppose you also believe the YMCA (Young Men’s Christian Association) is not a real place to work out either. Or the US dollar bill that says “in God we trust” is not real currency. Or our Canadian anthem is fake because we say “God keep our land.”

    Keep your anti-Christianity from judging whether a school is a school.

  23. Laurier is an awesome school, and as a first-year I am glad I chose it.

  24. One problem with most international rankings is that they tend to measure historical quality and are not much use for predicting what will happen in the near future. The Shanghai rankings’ alumni and awards criteria allow Oxbridge and some German universities to live off intellectual capital generated decades ago. The surveys of the QS rankings inevitably favour big, old, wealthy universities with years of alumni and endowments behind them. It will take a long time for any rapidly developing school to score well on the eleven-year criteria in the HEEACT rankings.

    No effort is made to verify whether the abilities of the students are in line with the reality of the working world. Being able to use SAP by knowing SAP transaction codes remains one of them!

    Interest in rankings in Asian higher education is undoubtedly high and the introduction of the QS Asian University Rankings in 2009 served to reinforce this. The publication of ranking lists is now greeted with a mixture of trepidation and relief by many university presidents and is often followed by intense questioning from media that are interested to know what lies behind a particular rise or fall on the global or regional stage.

  25. There will always be that timeless argument of where the money should be spent, on academics or sports, and unfortunately it is a halting topic at many large universities, which in turn leads to these unwelcoming societies that don’t appeal to everyone. But that’s life, and that’s how it will always be.
