Barriers to permanent residency are formidable, but can be overcome
Paré, D. E., Collimore, L.-M., Joordens, S., Rolheiser, C., Brym, R., & Gini-Newman, G. (2015). Put Students’ Minds Together and their Hearts Will Follow: Building a Sense of Community in Large-Sized Classes via Peer- and Self-Assessment – Appendix. Toronto: Higher Education Quality Council of Ontario.
Stemming from a series of discussions at recent women's academic conferences in the U.S. and abroad, Women Interrupting, Disrupting, and Revolutionizing Education Policy and Practice is born of the frustration many scholars have expressed over the stagnation of the study of women in educational leadership. Whitney Sherman Newcomb and Katherine Cumings Mansfield
have brought together the works of a broad range of feminist scholars (seasoned and newer academics and students) to address the questions: In what ways is feminism in the field of educational leadership stalled? What can we do to move ahead?
Some scholars have questioned academe’s reliance on letters of recommendation, saying they’re onerous for the professors writing them or speak more about connections to “big-name” scholars than substance, or both.
A recent study explores another concern about letters of recommendation: whether they’re biased against the women they’re supposed to help. The short answer is yes.
Scholarly reading is a craft — one that academics are expected to figure out on their own. After all, it’s just reading. We all know how to do that, right?
Yes and no. Scholarly reading remains an obscure, self-taught process of assembling, absorbing, and strategically deploying the writing of others.
Digital technology has transformed the research process, making it faster and easier to find sources and to record and retrieve information. Like it or not, we’ve moved beyond card catalogs, stacks of annotated books and articles, and piles of 3x5 cards. What hasn’t changed, however, is the basic way we go about reading scholarly work.
OTTAWA, July 4, 2018 – The Canadian Alliance of Student Associations (CASA) released a poll today, revealing that while paid work placements related to a student’s field of study are seen as the best form of experience to help new graduates get a good job, nearly half of students still are not able to participate in them.
Study finds the gender of instructors influences the evaluations they receive, even when students in an online course have been misled about whether their instructor is a man or a woman.
One of the reasons I love teaching is that each semester provides a fresh start: empty grade books, eager students. I also cherished this time when I was a student myself: poring over course syllabi, purchasing new textbooks, meeting my professors. Although I reside on eastern South Dakota’s frigid plains, the first day of class consistently brings me a warm feeling.
But once the newness of the semester fades, it’s not long before I casually share with a colleague something a student did or (more commonly) failed to do. This habit started in graduate school. Years ago, student shaming provided a humorous means of connecting with my fellow TAs: in my early 20s, commiserating over student issues felt normal, even cool. Perhaps, too, a case can be made that swapping stories of students’ shortcomings had little effect on our students themselves. They didn’t hear us laugh at their misspelled words or poorly constructed sentences. Yet, 10 years later, I’m haunted by the thought that I might
have spent more time complaining about my students than championing their success.
Dear parent of a university student,
You might want to sit down because I’ve got news you’ve dreaded for some time: your child has enrolled in a creative writing course.
I know it’s scary. As the course’s instructor, I’ve heard the same stories you have. On the street, they call creative writing the most potent of the humanities’ gateway drugs. Students get their first hit, and before you even have time to threaten to cut them out of the will, they’re writing every text message as a haiku and studying Soviet film.
Your child might have already hinted to you that creative writing was a possibility. They might have mentioned something called a “workshop.” You probably laughed, because the poets and novelists whose photographs you’ve seen in newspapers seldom look like they know how to work much of anything, never mind a drill or power saw.
You might be angry with the university for allowing your child to take a creative writing course. You might be angry with me for teaching it. Let me assure you: in class, I do everything possible to pull back the curtain on creative writing. We talk about how hard it can be to put anything on the page without lapsing into clichés. I explain just how much there is to learn about things like form, style and genre. I tell them what a misery it can be to sit alone at a keyboard for hours, moving words around.
I say these things, but every year, students keep signing up for the course. They just seem to love writing. They seem to love it even though it involves struggle. Maybe because it involves struggle. They seem to relish the challenge of describing the world closely; of imagining how it could be different; of treating language as a puzzle and a game; of discovering new things about themselves. Sometimes, getting the right words in the right order feels impossible, but they seem to think that it can be important work.
This article is concerned with the differences in REB policy and application processes across Canada as they impact multi-jurisdictional, higher education research projects that collect data at universities themselves. Despite the guiding principles
of the Tri-Council Policy Statement 2 (TCPS2), there is significant variation among the practices of Research Ethics Boards
(REBs) at Canada’s universities, particularly when they respond to requests from researchers outside their own institution.
The data for this paper were gathered through a review of research ethics applications at 69 universities across Canada. The
findings suggest REBs use a range of different application systems and require different revisions and types of oversight for
researchers who are not employed at their institution. This paper recommends further harmonization between REBs across
the country and national-level dialogue on TCPS2 interpretations.
Keywords: research ethics, university ethics, higher education, social science research, harmonization
Abstract
Most Canadian universities participate in the US-based National Survey of Student Engagement (NSSE) that measures various aspects of “student engagement.” The higher the level of engagement, the greater the probability of positive outcomes and the better the quality of the school. Maclean’s magazine publishes some of the results of these surveys. Institutions are ranked in terms of their scores on 10 engagement categories and four outcomes. The outcomes considered are how students in the first and senior years evaluate their overall experiences (satisfaction) and whether or not students would return to their campuses. Universities frequently use their scores on measures reported by Maclean’s in a self-congratulatory way. In this article, I deal with levels of satisfaction provided by Maclean’s. Based on multiple regression, I show that of the 10 engagement variables regarded as important by NSSE, at the institutional level, only one explains most of the variance in first-year student satisfaction. The others are of limited consequence. I also demonstrate, via a cluster analysis, that, rather than there
being a hierarchy of Canadian institutions as suggested by the way in which Maclean’s presents NSSE findings, Canadian universities can most adequately be divided into a limited number of different satisfaction clusters. Findings such as these might serve as a caution to parents and students who consider Maclean’s satisfaction rankings when assessing the merits of different universities. Overall, in terms of first-year satisfaction, the findings suggest more similarities than differences between and among Canadian universities.
Keywords: NSSE, Maclean’s, Canadian university rankings, student engagement, student satisfaction
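The abstract’s central statistical claim — that a single engagement variable explains most of the institution-level variance in first-year satisfaction — rests on the familiar R² measure from least-squares regression. The sketch below is purely illustrative: the data are invented, not the NSSE or Maclean’s figures, and it shows only how the variance-explained calculation works for one predictor.

```python
# Illustrative only: how much variance in satisfaction a single
# engagement score explains, measured by R^2 from a one-predictor
# least-squares regression. The numbers below are invented, not NSSE data.

def r_squared(x, y):
    """R^2 of a simple linear regression of y on x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical institution-level scores: one engagement category (x)
# against first-year satisfaction (y) for eight universities.
engagement = [55, 60, 62, 58, 70, 66, 73, 68]
satisfaction = [71, 76, 78, 74, 86, 82, 90, 84]

print(round(r_squared(engagement, satisfaction), 3))
```

An R² near 1 for one predictor, with the remaining nine variables adding little, is the pattern the article reports; a cluster analysis then groups institutions with similar satisfaction levels rather than ranking them.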
Résumé
Most Canadian universities participate in the US-based National Survey of Student Engagement (NSSE). The higher the level of “student engagement,” the greater the probability of positive outcomes, and the better the school’s perceived quality. Maclean’s magazine publishes some of the results of this survey. Institutions are ranked by their scores in ten “engagement” categories and four outcomes. The outcomes considered are how first-year and senior students evaluate their overall experience (satisfaction), and whether they would choose to study at the same place again. Universities frequently use the results reported by Maclean’s for self-promotion. In this article, I examine the satisfaction levels presented by Maclean’s. Based on multiple regression, I show that of the ten engagement variables considered important by NSSE, at the institutional level, only one explains most of the variance in first-year student satisfaction. The others have little effect. I also demonstrate, through a cluster analysis, that rather than being hierarchically ordered as Maclean’s presentation of NSSE results suggests, Canadian universities can more adequately be divided into a limited number of satisfaction clusters. These findings may serve as a caution to parents and students who consult Maclean’s rankings to compare universities. Overall, with respect to first-year student satisfaction, they suggest there are more similarities than differences among Canadian universities.
Keywords: National Survey of Student Engagement, Maclean’s, Canadian university rankings, student engagement, student satisfaction
Each new semester as I walk down the hallway to my classroom, I am a little nervous, even after 27 years of teaching experience…and I’m okay with this. I think when I get to the point where I don’t feel this anxiety, I won’t be as effective a teacher. After all, I will be walking into that classroom for the next four months and it’s important to make a good first impression.
Below are 10 tips to help you get off to a great start.
One of my New Year’s resolutions was to reread some of my favorite teaching and learning resources, especially those I haven’t looked at in a while. I’m enjoying these revisits and decided to share some random quotes with timeless insights.
In 2012, Mohawk College solicited the support of the Education Policy Research Initiative (EPRI) to collect and use administrative and other data on students held by Mohawk as part of a broad initiative to improve student success based on the principle of evidence‐based decision making.
The first project involved analyses to better understand student retention at Mohawk using both descriptive and statistical modelling approaches. This work led to the development of a predictive model to identify students at risk of leaving college early.
In 2015, Mohawk and EPRI applied to and became part of the Higher Education Quality Council of Ontario’s (HEQCO) Access and Retention Consortium (ARC) to undertake a project that would build on this earlier work. The purpose was to update, refine and extensively test the predictive model, which would then be used to inform and assess a set of alternative advising interventions put in place for students entering Mohawk College in Fall 2015.
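Predictive models of this kind typically score each incoming student on a small set of administrative variables and flag those whose estimated risk of early departure crosses a threshold. The toy model below is a hedged sketch of that idea only: the features, coefficients, and threshold are invented for demonstration, since the actual Mohawk/EPRI model is not described here.

```python
import math

# Illustrative sketch only: a toy logistic-style model for flagging
# students at risk of leaving college early. The feature names and
# weights are invented, not the Mohawk/EPRI model's.

WEIGHTS = {"first_term_gpa": -1.8, "credits_attempted": -0.15, "is_first_gen": 0.6}
BIAS = 4.0

def risk_probability(student):
    """Map a student's features to a 0-1 risk score via the logistic function."""
    z = BIAS + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def flag_at_risk(students, threshold=0.5):
    """Return ids of students whose estimated risk meets the threshold."""
    return [s["id"] for s in students if risk_probability(s) >= threshold]

students = [
    {"id": "A1", "first_term_gpa": 3.5, "credits_attempted": 5, "is_first_gen": 0},
    {"id": "B2", "first_term_gpa": 1.2, "credits_attempted": 3, "is_first_gen": 1},
]
print(flag_at_risk(students))  # → ['B2']
```

In practice the weights would be estimated from historical retention data, and the flagged list would feed the kind of advising interventions the ARC project assessed.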
The coronavirus has colleges and universities swinging into action to move courses online. In the coming weeks, we’ll find out just how prepared (or not) academe is to do this on a large scale. Those of us in online teaching and educational technology have moved quickly to help, too, and it’s astonishing how many helpful resources have already been pulled together.
Even just a few weeks into the crisis, and really only a few days since class cancellations started to become a reality, there are top-quality guides free for the taking, created by people who really know their stuff. I will make no claim to have read all or even a fraction of them, but there are several that are clearly share-worthy:
Scenario: A doctoral student comes by your office to ask if you will serve as a reader on her dissertation committee. While a senior professor is chairing her committee, she wants you to help with the "heavy lifting." You start shifting in your seat, wishing there were a pause button you could hit as you figure out the best thing to say. You want to support this student — but as an assistant professor, a few years from tenure, you need to protect your time and avoid stepping on her adviser’s toes. Do you say no and clarify the roles of dissertation chair versus reader? Do you say yes and support the student in the way she is asking? Or do you ask her to first clarify your role with her adviser?
Maybe we should be making a stronger pitch for student-led study groups. There’s all sorts of research documenting how students can learn from each other. But, as regularly noted here and elsewhere, that learning doesn’t happen automatically, and some of us worry that it’s not likely to occur in a study group where there’s no supervision and distractions abound. Recent findings should encourage us to give study groups a second look.
For non-traditional students who are working adults or are returning to school years later, the transition to college can be intimidating. Several of my students have expressed how hard it is to learn new concepts. Many feel their minds aren’t as “sharp” as they were the first time they attended college. Others talk about the stress that comes with having to balance family and work responsibilities with their course requirements. On more than one occasion, I have had to talk a student out of quitting a program because of one or all of these factors.
Whether it’s talking to colleagues, reading the latest research or visiting a teaching and learning center, professors have places to turn to learn about best pedagogical practices. Yet faculty members in general still aren’t known for their instructional acumen. Subject matter expertise? Yes. Teaching? Not so much.
This month, we’ll focus on how to prepare for existing state and national tests. I’ll focus on three things that can help your students improve their chances to score up to their potential. By the way, kids never score above their potential; they’re just not going to randomly make enough lucky right answers time after time after time (in statistics, it’s called regression to the mean).
But students often underperform for a host of reasons, even when they should perform much better. While we could focus on dozens of variables that influence standardized testing, we’ll focus on these three: 1) brain chemistry, 2) priming, and 3) episodic memory triggers. Some of these suggestions got so many rave reviews that they are reproduced from an earlier bulletin!
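The regression-to-the-mean point above can be made concrete with a quick simulation. This is a generic statistical illustration, not anything from the bulletin: treat each score as true ability plus luck, select the luckiest test-takers, and watch their average drop on a retest.

```python
import random

# Illustrative simulation of regression to the mean: each score is true
# ability plus random luck. Students who were luckiest on test 1 score,
# on average, closer to their true ability on test 2.

random.seed(42)

N = 10000
abilities = [random.gauss(70, 10) for _ in range(N)]
test1 = [a + random.gauss(0, 8) for a in abilities]
test2 = [a + random.gauss(0, 8) for a in abilities]

# Take the top 10% of test-1 scorers and compare the group's averages.
cutoff = sorted(test1)[-N // 10]
top = [i for i in range(N) if test1[i] >= cutoff]
avg1 = sum(test1[i] for i in top) / len(top)
avg2 = sum(test2[i] for i in top) / len(top)

print(avg1 > avg2)  # the same group scores lower, on average, the second time
```

Nothing about the students changed between tests; the drop is purely the luck component washing out, which is why a string of above-potential scores is not something to expect.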