This essay is based on an episode of the University of Technology Sydney podcast series “The New Social Contract”. This audio series examines how the relationship between universities, the state and the public might be reshaped as we live through this global pandemic.
Leesa Wheelahan
University Works uses empirical data to report on the outcomes of university graduates in terms of employment levels and earnings, as well as average debt upon graduation.
University graduates experienced the highest employment growth of any educational attainment group over the last decade.
Over the last decade the global economy has become more competitive, and the jobs needed in that new economy have grown more technologically complex. As a result, educators, education researchers, and national and state policymakers have emphasized that students must graduate from high school “ready for college and career.” While college and career readiness has become the goal for all individuals, opinions have recently begun to differ about what college and—especially—career readiness actually means and how best to assess it.
Perhaps the best career advice I ever received came from my Reiki teacher, Marty Tribble, who cautioned, “The absence of a strong yes is actually a no.”
This advice ran counter to the decision-making practices I’d developed over the years, especially during my own academic job search. I’d talk with colleagues and confidants, consider my goals and priorities, create spreadsheets comparing choices and weigh the relevant information. I’d work to make a well-informed decision, taking in others’ advice and ultimately pursuing the pathway that I “should” follow. Though these were useful practices, I found that I’d get into trouble whenever I acted from a place of “should.” I was inadvertently shutting out my own intuitive compass and relying on external guidance systems.
When it comes to keeping tenured professors content in their jobs, you can catch more flies with honey than you can with big faculty-focused strategic initiatives, a new study suggests.
The study, based on survey data from more than 3,600 recently tenured associate professors at doctoral universities, found that their organizational commitment hinged far more on whether they believed they worked in a caring, supportive environment than on their sense that administrators had undertaken broad efforts to support the faculty.
Here’s a reality many business leaders confront at some point: corporate cultures can eat innovation strategies for breakfast.
The inertia and siloing that can settle into any workplace can be antithetical to the boldness and flexibility required to drive innovation. So, what realistically can be accomplished?
Large organizations typically try to be more innovative by setting up initiatives outside the “mothership,” with mixed results. (Many large teaching hospitals, for example, have adopted this approach.) By spurring innovation outside the organization, companies might be able to create incremental change and innovation, but they could have difficulty leveraging those wins in the larger company culture. General Mills, Nestle and Pepsi have recently experimented with outside incubators, again with mixed results. Despite the uncertain evidence, we’re at a tipping point where a business that isn’t linked to an incubator is seen as falling behind.
In November 2013, the Ontario Undergraduate Student Alliance (OUSA) asked students to comment on their experience with summer and in-study employment. Of particular interest were: the number of jobs students were working during these terms; whether or not these opportunities were within a student’s field of study; and whether they positively impacted their academic performance.
Results of OUSA’s 2013 Ontario Post-Secondary Student Survey (OPSSS) were further broken down based on institution and field of study for questions of particular interest. This was done to easily compare the responses from these distinct groups to see how consistent the undergraduate employment experience was across academic disciplines and universities.
My first boss, the chairman of my department when I was a young lecturer, was Wilfrid Harrison. Even though there was approximately 40 years’ difference between our ages, I would have described Wilfrid as a friend.
He was a distinguished and influential figure in many ways: the first person to be appointed a teaching fellow in politics alone at an Oxford college, a former editor of Political Studies and a founding member of the Political Studies Association – which still awards a major prize that bears his name. He was also the founding professor of the department at the University of Warwick in which I spent 35 years.
When Wilfrid retired, properly and traditionally at the age of 65, he sold all his books, severed all substantive contact with universities and devoted himself to his wife, his daughters, his dogs and his cooking (my memories of the latter are centred on the observation that whisky and cream seemed to feature in all his dishes).
Where once a college degree was considered the ticket to a good job, the pathway from campus to career is no longer as straightforward or as certain as it was for previous generations. The world and the job market are changing dramatically, and parents, students, institutions, and employers are all deeply concerned with the question of whether college is preparing graduates for careers—a question that is itself intertwined with the larger question of the ultimate purpose of a college degree. Tuition is an investment—of time as well as money, often a lot of money—and informed consumers want to know that they’re going to see a return on that investment, usually in the form of a good-paying job that leads to a satisfying and lucrative career. Hiring and training new employees is also an investment, and companies want assurances that they are bringing on competent, capable staff with the smarts to succeed and become an asset.
After putting in the time, money and energy to complete a degree, it can be extremely discouraging to realize you no longer want to work in that industry. If you spent the better part of four years in a classroom only to learn you don’t want to pursue the field you’re now qualified for, what do you do? Most people don’t have the time or money to go back to school and start over again — but don’t fret. There are steps to take when trying to change career paths to something not directly associated with your degree. While making the switch may be difficult, it’s not impossible. The following steps will help push you in the direction you want to go.
There is a long-standing debate over the value of certain postsecondary programs in facilitating employment after graduation. The National Graduate Survey (2005) was used to examine how graduates of various programs differ in their pursuits of higher education, employment status, job-program relatedness and job qualifications. Results suggest that graduates from humanities are more likely to pursue higher education, are less likely to be employed full time, are more likely to have jobs unrelated to their program, and are more likely to be overqualified for their jobs. These findings highlight that humanities programs may not provide the knowledge and skills that are in current economic demand.
Just starting out? Worried about your lectures, your students, your time-management skills and more? Eight academics offer up their advice.
There are plenty of mistakes to go around early in one’s academic career. Whether they happen in front of a class or behind the scenes, hindsight shows us how to do better. Here you’ll find a mix of experience and advice from eight professors who’ve been there, done that and lived to share some lessons.
Every March, as faculty interview season gets underway at two-year colleges, I find myself thinking back on some of the memorable train wrecks I’ve witnessed.
There was the extremely promising — not to mention sharply dressed — candidate who, when asked why he was interested in this particular job, replied, "If you’re implying that I don’t really want to teach at a community college, I assure you, you’re mistaken. I’m not wearing this Brooks Brothers suit for nothing."
I had an experience recently that confirmed what I’d already suspected: I am no longer an early career scholar. Perhaps because of my age, or simply because I am pre-tenure, I had still considered myself to be "early" in my career until that moment.
It happened a week before my discipline’s biggest conference. As I was checking the online schedule for pre-meeting workshops, I found an intriguing one for "early career scholars of color." But after reading the agenda, I realized I wouldn’t benefit from the content. The lineup included sessions on developing career goals, publishing a dissertation, preparing for the job market, crafting a strong CV, negotiating a job offer, publishing your first book, finding a mentor. As an assistant professor, I’d already done those things. I read the list multiple times, searching, to no avail, for at least one applicable session. Then I posted on Facebook, asking the world: "When do you stop being an early career scholar?"
The other day, a person I like and trust sent me a text: “(So-and-So) is throwing you under the bus right now.”
“No!” I texted back. “What now?”
Thanks to some fast finger work, I provided the real facts about the current meeting topic and my text partner was able to relay them and defend my honor. The crisis was averted and the benefits of cultivating a guardian-angel network were once again revealed.
But cultivating such a network is hard work. And ensuring that every gathering is populated by at least one person who will have your back is an impossible task. So what are the best ways to manage those people who seem intent on tearing you down?
In our second annual student survey, Maclean’s reached more than 17,000 students at almost every university campus across the country. They told us how often they’ve cheated, as well as how much time they spend on studying, partying, working and extracurricular activities. It is one of the largest surveys of its kind and provides a wide-ranging, real-time snapshot of student life on university campuses.
Respondents also told us whether they feel their school has prepared them for the workplace, offering insight into which universities—and which programs—are doing the best job preparing students for the real world. St. Francis Xavier came out on top for this one measure, with 53% of students strongly agreeing they had the skills and knowledge needed for employment. For some programs, the results were even better, with 71% of St. FX nursing students saying they’d been well prepared. We also asked whether the schools helped with writing ability, with St. Thomas ranking first on that front. In addition, we surveyed professors to see whether incoming university students had the academic skills needed for success.
My father once told me that the genius of Social Security is that it’s inclusive: Every working American — regardless of socioeconomic class or skin color — pays into the system and is entitled to financial benefits. That’s why it’s popular. I didn’t know it then, but my father was describing “interest convergence,” a theory put forward by Derrick Bell, who said that white people support minority rights only when it’s in their self-interest. Bell’s theory might also explain why affirmative-action and campus-diversity programs that seem to focus narrowly on minority groups might stigmatize those groups further and breed resentment among whites who believe others are getting special treatment. But a provocative NPR piece by David Shih, an associate professor of English at the University of Wisconsin at Eau Claire, suggests that we’ve been looking at those diversity initiatives all wrong. He asks, What if those programs actually help white people?
Why we need to promote socioeconomic diversity.
Sarah Green Carmichael, an editor at the Harvard Business Review, recently talked with Joan C. Williams, director of the Center for WorkLife Law at the University of California's Hastings College of the Law, about her new book, White Working Class: Overcoming Class Cluelessness in America, which examines how class divisions affected the recent election. Ms. Williams contends that liberals have long been hung up on identity and cultural issues at the expense of socioeconomic ones. While she believes that eliminating racial and gender inequality is a good thing — she’s a progressive and a feminist, after all — she suggests that there’s been a blindness to class inequality, which is alive and growing in America. As one example, she points to a recent study of fictitious résumés, which found that male law students from privileged backgrounds were far more likely to get callbacks for coveted internships at top law firms than their working-class counterparts were. Socioeconomic bias is all too real, she argues, yet many corporate and college diversity efforts tend to overlook it. It’s time to expand diversity programs to include class, she tells Ms. Carmichael.
It happened in early January, when all my historian friends were at the annual meeting of the AHA, the leading organization in our field.
I was sitting at home, revising my manuscript introduction and feeling jealous of my friends, when I got an email telling me my last (and best) hope for a tenure-track job this year had evaporated. I’d promised myself that this would be my last year on the market. Of course, I’d promised myself that last year, too, and then decided to try again. But this time, I knew it was over.
I’m a tenured professor at a large public research-oriented university. This was my first job straight out of doctoral studies. It was, and still is, the job of my dreams. In contrast to the many tenure-track horror stories we hear (particularly from women), I felt valued and appreciated from Day 1 in my department. I respect my colleagues and feel respected by them.
So what’s the problem? Last fall I physically collapsed.