A young woman I know did everything right in high school, got into a good private college, and landed a position in corporate marketing for a major retail chain after she graduated. While it was a good, stable job—the kind that makes parents happy—she found it stultifying and unsatisfying.
With a solid academic pedigree and good experience, she hit the job market to look for a more fulfilling career. Several months into her search, she was floundering despite a solid job market in Boston. She wasn’t sure why.
This situation is typical of the millennials I talk to. Her job quest reflects a phenomenon distinctive to this generation: an obsession with passion and a misunderstanding of its currency in the job market.
David Verrill, Executive Director, MIT Center for Digital Business
David Verrill, executive director of the MIT Center for Digital Business, sits down with Dave Vellante and Stu Miniman of theCUBE for the pre-show to the MIT Conference on the Digital Economy: The Second Machine Age to discuss the MIT Initiative on the Digital Economy.
On April 10, 2015, the MIT Digital Economy Conference: The Second Machine Age, led by Erik Brynjolfsson, director of the Initiative on the Digital Economy, and Andrew McAfee, its co-director, featured a series of discussions highlighting MIT's role in both understanding and shaping our increasingly digital world.
[Question] How did you get interested in researching long-term unemployment? What motivated you to write your book and start the Institute for Career Transitions (ICT)?
[Professor Sharone] I got interested in this issue as a graduate student. I was doing a PhD in sociology at UC Berkeley, and my initial research was actually about high-tech workers and long work hours. But while I was doing that research, the dot-com bubble burst around the year 2000. What was very surprising to me, and to the workers who got caught up in it more directly, was the number of people who had done everything that society told them they needed to do to be successful: they went to college, they sometimes had master's degrees or PhDs, and they had years of work experience. And yet these individuals found themselves unemployed, sometimes unable to land any job for months or even years.
This was all around me as a graduate student, and even though it was not yet as big or brutal a national event as the Great Recession later became, being in the Bay Area during this time was an early taste of what was to come in 2008. So this is how I got into the issue, and I began interviewing unemployed individuals. I'm a qualitative sociologist, so I do in-depth interviews with people. I began asking people about the experience of job searching and how they understood the obstacles they faced, and I came to realize that looking for work is a kind of work in itself, and probably among the hardest kinds of work that exist. It is extremely emotionally difficult: essentially, it is straight-up rejection. I was very interested in how people dealt with that, and in documenting some of the pain and hardship that people described to me.
I also became interested in comparing the experience of unemployed job seekers cross-nationally. My research became driven by the question, "Is what I'm seeing among American white-collar professionals universal for similar types of workers?" That question led to my book, Flawed System, Flawed Self, a cross-national comparison of the experience of job searching and unemployment among highly educated, skilled workers. In the process I learned how different that experience can actually be: the sense of self-blame and the emotional toll vary considerably depending on how one needs to look for work.
Competition via innovation has long been recognized as essential to the mid- to long-term success, and even survival, of science-based corporations such as pharmaceutical and biotechnology companies. As such, understanding how best to innovate, and how to do it more often and more efficiently, is top of mind for executives at many large corporations.
Traditionally, the approach for pharma companies has been to dedicate a large portion of spending to R&D in the hopes of generating a steady internal stream of innovation to fill pipelines with new, differentiated, and competitive products. However, as is now widely known, these costly efforts have been disappointing at best, resulting in major reductions in R&D expenditures at many leading pharma companies; AstraZeneca, Pfizer, and Sanofi, to name a few, have made headlines recently for their R&D cuts. Though many acknowledge that the majority of those cuts have been completed, the trend exemplifies the major shakeup that has caused the industry to reevaluate its focus on innovation and examine the productivity of R&D. A recent study by the consulting firm Oliver Wyman concluded that "the value generated by $1 invested in pharma R&D has fallen by more than 70%": from 1996 to 2004, drug companies produced $275 million in five-year sales for every $1 billion spent on R&D; from 2005 to 2010, that figure was $75 million.
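As a quick consistency check, the two figures quoted above do support the "more than 70%" finding. A minimal sketch, using only the numbers stated in the text:

```python
# Oliver Wyman figures quoted above: five-year sales generated
# per $1 billion of R&D spend, in millions of dollars.
sales_1996_2004 = 275
sales_2005_2010 = 75

# Fractional decline in R&D value between the two periods.
decline = (sales_1996_2004 - sales_2005_2010) / sales_1996_2004
print(f"Decline in R&D value: {decline:.1%}")  # roughly 72.7%, i.e. more than 70%
```

The roughly 73% drop matches the study's headline claim.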