Busting the College-Industrial Complex
Obstacles to employment are a problem. They impede social mobility, disproportionately harm society's most vulnerable citizens, and hinder the larger economy. That is why efforts to remove such barriers have become a bipartisan cause. It's why more than two dozen states now ban public employers (and sometimes even private ones) from inquiring about applicants' criminal history, due to concerns that capable job candidates will be turned away or otherwise deterred. A number of states and locales are going further: New York City, for example, prohibits public employers from asking about applicants' prior-earnings history; in 2016, Massachusetts became the first state to prohibit the practice for all employers.
Occupational-licensing reform has similarly seen growing, bipartisan support. Reformers on the left and right have surveyed the staggering costs and barriers to entry for quotidian positions such as masseuse, nail technician, exterminator, and florist, and concluded that these need to be reduced or eliminated. In doing so, they are embracing the understanding Milton Friedman famously propounded in his 1962 book, Capitalism and Freedom: "The most obvious social cost," Friedman wrote of occupational registration, certification, and licensure, "is that any one of these measures...almost inevitably becomes a tool in the hands of a special producer group to obtain a monopoly position at the expense of the rest of the public."
Yet even as reformers have pushed to remove a variety of barriers to employment, the biggest and most significant barrier in American life — the use of the college degree as a default hiring device — has gone blithely unremarked. Even as reformers target employment obstacles for felons and florists, the pervasive use of college-degree requirements, despite its dubious legality and profound costs, has escaped serious scrutiny.
At its best, higher education can be a powerful engine of opportunity and socioeconomic advancement. And that's the way it's almost universally described — at least in college brochures, think-tank reports, campaign stump speeches, and legacy media. Nevertheless, for too many Americans, the truth is that post-secondary education is principally a toll: an ever-more-expensive, increasingly mandatory, two-, four-, or, more accurately, six-year pit stop on the way to remuneration.
Federal anti-discrimination law holds that employment practices that disproportionately affect members of a protected group are prohibited unless the practice can be shown to be directly related to job performance and consistent with business need. Nonetheless, thousands of employers now casually flout this standard by screening applicants based on post-secondary credentials and by factoring degrees into hiring decisions, even where degree requirements have a disproportionate effect and bear no obvious relation to job duties or performance.
In a comprehensive October 2017 report, researchers from Harvard Business School documented extensive evidence of increasing "degree inflation," with employers demanding baccalaureate degrees for middle-skill jobs that previously did not require one and for which the work duties have not changed. In fact, 61% of employers surveyed admitted to rejecting applicants with the requisite skills and experience simply because they lacked a college degree. Researchers calculated that this affected an estimated 6.2 million jobs across dozens of industries.
This all raises an obvious yet oft-overlooked question: Why are college-degree requirements treated differently from other employment tests?
The burdens of degree inflation, of course, fall most heavily on those of modest means: low-income and working-class individuals who are less likely to attend college or to complete a degree. Degree requirements summarily disqualify non-credentialed workers with relevant skills and experience from desirable jobs. They impede young workers who could otherwise take entry-level jobs and build the skills and expertise needed to pursue new opportunities. And they hold students and families hostage, forcing them to spend substantial time and money on collecting degrees, regardless of whether students wish to attend college and whether the degree in question actually conveys relevant skills or knowledge. The privileged status of the degree, meanwhile, has insulated colleges from non-degree competition. As the de facto gatekeepers to "good" jobs, colleges have increasingly operated as an employer-sanctioned cartel.
As we consider the kinds of changes to work implied by artificial intelligence, automation, new technologies, and demographic shifts, it's long past time to ask whether the privileged legal treatment of college degrees is defensible — especially when it may be serving to impede opportunity, burden struggling families, aggravate systemic inequities, and stifle economic dynamism.
Employer convenience plays a major role in facilitating and escalating degree inflation. For employers, a college degree is a cheap, easy-to-read signal that an applicant has a particular bundle of behaviors, social capital, and competencies, such as the ability to complete work on deadline and take direction. But employer convenience is only a part of the equation — and, in fact, something that has been exacerbated by the unintended consequences of federal anti-discrimination law.
On the matter of employment tests, the contemporary college-industrial complex has been shaped by two key developments: the Civil Rights Act of 1964 and the U.S. Supreme Court's 1971 decision Griggs v. Duke Power Company. When enacted in 1964, Title VII of the Civil Rights Act prohibited employers from discriminating against workers or applicants on the basis of race, color, religion, sex, or national origin, but specifically allowed employers to use "professionally developed" ability or hiring tests, so long as they were not "designed, intended or used" to discriminate against protected groups. Seven years later, the Supreme Court unanimously ruled in Griggs v. Duke Power that, when minority groups are disproportionately affected — or suffer a "disparate impact" — in personnel decisions, employers must show that any requirements are directly job-related and an accurate predictor of job performance, sharply curtailing the use of ability testing or hiring tests.
Prior to passage of the Civil Rights Act of 1964, it was commonplace for employers to use ability tests to evaluate job applicants. According to one 1963 survey, 84% of U.S. employers were using some form of personnel test in their hiring process. While businesses maintained the necessity of such tests, many in the civil-rights community feared such measures were intended to purposefully disadvantage minority applicants (doubtless, some of them were). These arguments, as Bryan O'Keefe and Richard Vedder pointed out in a 2008 paper, made their way into the debate over the Civil Rights Act, particularly with regard to Title VII's treatment of employment issues.
On the Senate floor, Senator John Tower of Texas raised concerns that the proposed legislation would unfairly limit employers from assessing the qualifications of prospective employees, citing the then-recent example of an Illinois public official ruling that an ability test that Motorola had used for years was illegal because it was unfair to "disadvantaged groups." Senators Hubert Humphrey of Minnesota and Clifford Case of New Jersey dismissed Tower's concern as a "red herring," insisting that the proposed Title VII language was "completely different" from the Motorola scenario.
The debate eventually culminated in the addition of a new section, 703(h), which specifically permitted the use of a "professionally developed ability test," provided that "its administration or action upon the results is not designed, intended or used to discriminate" on the basis of protected characteristics. O'Keefe and Vedder observed, "While it seemed that a reasonable compromise had been struck, some hard-line opponents of the bill were skeptical." These skeptics included Tower, who lamented, "[T]he regulations, lawsuits, and federal pressures placed upon private business by this title are utterly unacceptable in a free economy, particularly since these pressures can be placed upon any firm at any time in presuming the firm guilty until it proves itself innocent."
Less than a decade later, in Griggs v. Duke Power, the first major Supreme Court ruling to address Title VII's employment-testing restrictions, the Court opted for a very broad reading of the anti-discrimination conditions of section 703(h). The plaintiffs averred that the high-school diploma and aptitude-testing requirements used by Duke Power Company, a public utility corporation, discriminated against African-American workers, thus violating Title VII. They also argued that section 703(h) required that employment tests be used only to measure the ability to do a specific job, thereby prohibiting more general tests and requirements. For its part, Duke Power argued that the law said no such thing, as it specifically allowed a "professionally developed ability test" as long as it was not intended or used to discriminate.
In a unanimous ruling, the Supreme Court rejected Duke Power's argument and the lower court's straight-letter interpretation of section 703(h). The Court instead deferred to the expansive interpretation promulgated by Equal Employment Opportunity Commission (EEOC) regulations. The EEOC, the Court noted, "has issued guidelines interpreting 703(h) to permit only the use of job-related tests. The administrative interpretation of the Act by the enforcing agency is entitled to great deference." Using this bureaucrat-invented interpretation, the Court held that the Civil Rights Act "proscribes not only overt discrimination, but also practices that are fair in form, but discriminatory in operation."
Thus, Griggs set forth a new threshold for employment tests based on disproportionate outcome and "business necessity": If an employment test or practice is found to disproportionately affect a protected group, regardless of discriminatory intent, then the burden falls on the employer to prove that the test is directly "job-related" and a "reasonable measure of job performance." On that basis, the Court ruled against Duke Power, holding that "neither the high school completion requirement nor the general intelligence test is shown to bear a demonstrable relationship to successful performance of the jobs for which it was used."
Though briefly revised and limited by the 1989 Supreme Court decision Wards Cove Packing Co. v. Atonio, the Griggs standard and disparate-impact theory of discrimination were codified into law by Congress in the 1991 Civil Rights Act. If an employer uses a test or selection procedure to make employment decisions and "disparate impact is established," that practice is deemed unlawful unless the employer can "demonstrate that the challenged practice is job related for the position in question and consistent with business necessity." Furthermore, an employment practice can also be found unlawful if an "alternative employment practice" with less disparate impact could be used instead.
While education requirements are subject to these conditions (for example, the high-school-diploma requirement that was struck down in Griggs), the reality is that the Griggs standard has been scrupulously applied only to other, non-degree-related employment tests. Employers using I.Q. tests to screen and hire applicants, for instance, must use approved, professionally developed tests and justify I.Q. thresholds. If a company wishes to require job applicants to possess an I.Q. of at least 110, it must be able to demonstrate why an applicant with an I.Q. of 109 is incapable of performing a job that someone with an I.Q. of 110 can. One need only read that sentence to understand why human-resource lawyers quiver in horror when executives contemplate using that kind of screening test.
Even directly applicable employment tests can run afoul of federal regulators. For example, in 2017 the EEOC sued CSX Transportation, a railroad company, for sex discrimination because male applicants passed the company's physical-fitness tests at a disproportionately higher rate than did female applicants. "Companies must refrain from using a test causing adverse impact unless it is job-related and consistent with business necessity," explained EEOC District Director Spencer Lewis. "Even if a test passes that standard, an employer must adopt any comparably effective alternative practices that have less adverse impact."
THE VALUE OF A DEGREE
If a transportation company, whose employees are unavoidably required to move heavy things on the job, can violate federal employment regulations for using physical-fitness tests, it's easy to see why employers have moved away from such tests and sought safer alternatives whenever possible. The question is why, given the legal standard, employers seemingly enjoy carte blanche when it comes to college-degree requirements.
After all, many degrees bear no significant relationship, obvious or otherwise, to the particular job that a candidate is seeking. Proving business necessity would seem harder still. And the data on college completion leave no doubt that the use of degrees poses enormous disparate-impact questions.
In large part, it seems degrees hold such vaunted status simply because college-degree requirements have escaped serious legal scrutiny. As such, risk-averse employers have relied on degrees as an expedient way to screen applicants while avoiding the legal pitfalls accompanying other employment tests. Employers have been content to presume that a college degree signals desirable behavior and confirms the baseline verbal and written skills required for many jobs. Meanwhile, colleges and universities maintain their own admissions criteria that are subject to nondiscrimination rules and put heavy emphasis on diversity, inclusion, and racial representation. Consequently, college degrees have a patina of intellectual capacity and nondiscrimination, making college a "natural" pathway to good employment and shielding businesses from supply-side consequences.
The quid pro quo dynamic underlying the college-industrial complex is fortified by the fact that higher education holds tremendous sway in our national consciousness. In the public discourse, employment and college degrees are almost inextricably linked. Calls for "college for all" and "college and career readiness" have helped sell the notion that college is essential for professional success. This zeitgeist almost invites degree inflation: Since everyone knows you go to college to get a good job, it only makes sense that employers would require college degrees. Even to question this collective college-or-bust mentality is to invite accusations of heresy or suspicion of hostile motives. Taken together, the cozy convenience of the college-industrial complex and higher education's cultural hegemony probably explain why degree requirements have never been subjected to a significant legal challenge.
It is, of course, a fact that the average lifetime income for a college graduate is vastly higher than for those with only a high-school diploma; Georgetown's Center on Education and the Workforce calculates about $1 million more in lifetime earnings for degree holders. Oft ignored, however, is that wages vary dramatically depending on major and field of study. And this emphasis on degree completers ignores the reality that fewer than 40% of high-school graduates earn their bachelor's degree within six years, with the rest weighted down with debt rather than buoyed up by a credential.
More to the point: That college-degree holders earn more than non-degree holders sidesteps the key question, which is whether college degrees should be required for the remunerative jobs that lead to this earnings differential — and, in fact, whether such a practice is even legally permissible. Slamming the employment window on those without degrees, regardless of experience, expertise, or skill, is a sure-fire way to keep the lifetime earnings statistics of college completers artificially high, but it doesn't say much about the innate value of a degree.
THE DISPARATE IMPACT OF DEGREE REQUIREMENTS
For employers, whatever the real costs, college-degree requirements represent an easy, risk-free way to screen applicants without having to worry about the vagaries of federal employment law. And colleges, of course, reap the outsize benefits of acting as the gatekeepers to employment. It's an arrangement that keeps upward pressure on both enrollment and tuition, allowing campus bureaucrats to pull in six-figure salaries while costs soar ever higher and colleges feast on billions in federal student loans and other streams of taxpayer largess.
Yet there's a compelling case that indiscriminate degree requirements, particularly in light of well-documented, widespread credential inflation, violate the same legal standards that helped catalyze their use. Ironically, given all the worry around employment testing, college-degree requirements carry prima facie disparate-impact implications, making their casual acceptance all the more remarkable. Indeed, the Harvard Business School report mentioned above has noted that degree inflation disproportionately harms groups with college-graduation rates below the national average, particularly African Americans and Hispanics.
Just for argument's sake, let's consider how college-degree requirements would fare under Title VII disparate-impact theory and the Griggs standard of business necessity.
When determining disparate impact, enforcement agencies often rely on the "four-fifths" rule of thumb: If a hiring practice yields a selection rate for any race, sex, or ethnic group that is less than four-fifths (or 80%) of the group with the highest selection rate, adverse impact is considered likely, at which point the onus is on employers to justify their practices. According to Bureau of Labor Statistics data from 2017, 61% of Asians in the labor force (age 25 and older) have a bachelor's degree or higher, compared to 40% of whites, 30% of African Americans, and 20% of Hispanics. Therefore, under the four-fifths standard, there are a disproportionate number of Asian college-degree holders compared to white, African-American, and Hispanic workers, and a disproportionate number of white degree holders compared to African-American and Hispanic workers.
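As a rough illustration (not a legal analysis), the four-fifths rule of thumb described above can be sketched in a few lines of Python, treating the BLS degree-attainment shares cited above as hypothetical stand-ins for selection rates under a blanket degree requirement:

```python
def four_fifths_check(selection_rates):
    """Flag groups whose selection rate falls below four-fifths (80%)
    of the rate for the most-selected group -- the enforcement
    agencies' rule of thumb for likely adverse impact."""
    top = max(selection_rates.values())
    return {group: rate / top < 0.8
            for group, rate in selection_rates.items()}

# Illustrative only: the 2017 BLS attainment shares, used as if they
# were selection rates. White workers "select" at 0.40/0.61 (about 66%)
# of the Asian rate -- well under the four-fifths threshold, as do the
# African-American and Hispanic groups.
flags = four_fifths_check({"Asian": 0.61, "White": 0.40,
                           "Black": 0.30, "Hispanic": 0.20})
print(flags)  # every group but the highest-attainment one is flagged
```

Under this crude screen, a blanket degree requirement would flag likely adverse impact against every group except the one with the highest attainment rate, which is the intuition the paragraph above conveys.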
Determinations of disparate impact are not made so sweepingly, of course, but with regard to specific disparities at a particular business or company. But, as proof of concept, when simply considering the educational attainment of the current labor force, these figures would raise potentially grave disparate-impact concerns if cascaded down to individual employers. The size of these disparities means that huge numbers of employers would potentially run afoul of the disparate-impact standard if their use of degrees were subjected to rigorous scrutiny.
Furthermore, employers fail the other prong of the Griggs test when they indiscriminately demand college degrees without evidence they are necessary for — much less related to — the job in question. A 2014 report by Burning Glass Technologies found that employers increasingly require bachelor's degrees for positions where current workers don't have one and, crucially, where the skills necessary for the job haven't changed. For example, 65% of job postings for executive assistant and secretary positions call for a B.A., even though only 19% of those in such roles hold a degree. Employer preference for degrees applies even to entry-level jobs such as IT help-desk technician, where "the skill sets indicated in job postings don't include skills typically taught at the bachelor's level."
Recall that the threshold for establishing that an employment test is "job-related" and an "adequate predictor of job performance" is quite high. And the fact is that employers who require a college degree — as well as those employer practices that may systematically advantage degrees from certain institutions — almost uniformly fail to meet this bar. This means that keyword-based sorting for "B.A." or "bachelor's degree" as part of online job postings (the manner in which most hiring processes now begin), for instance, should clearly be an impermissible employment practice.
It's obvious how those without college degrees are affected by the current college-industrial complex, but there are broader consequences as well. For one thing, while employers do reap benefits from the ability to easily sift applicants based on degrees, there are real-dollar costs from this convenience premium. As researchers from Harvard Business School emphasized in the October 2017 report, college graduates filling middle-skill positions actually cost more to employ, have higher turnover rates, tend to be less engaged, and are no more productive than high-school graduates doing the same job. By requiring a degree for jobs that don't actually require them, businesses effectively bid against themselves, paying a salary premium for little to no return. And since only about a third of American adults have bachelor's degrees, employers also shrink the potential applicant pool, elongating the time needed to fill open positions — all while foreclosing opportunities to those without degrees. Yet these costs often go ignored or else come to seem acceptable to business leaders who want to avoid legal headaches or make their companies appear more prestigious, and who are themselves almost uniformly college graduates.
This state of affairs is not restricted to baccalaureate degrees. Instead, over-credentialing is a compounding phenomenon observable at every post-secondary level, as workers and would-be workers, little by little, credential up in order to distinguish themselves from their peers. In a 2016 nationwide survey of over 2,300 private-sector hiring and human-resource managers, for instance, CareerBuilder found that 27% of employers are recruiting master's degree holders for positions traditionally staffed by those with four-year degrees (37% are hiring college graduates for positions that had been held primarily by those with high-school diplomas). Indeed, for the past two decades, the master's has been the fastest-growing degree, a remarkable development given the paucity of information on its economic value.
Perhaps less obvious, but quite relevant, is how college's status as a quasi-mandatory pathway to desirable work has also helped fuel campus culture wars. After all, if remunerative employment requires a degree, many students will reluctantly feel obliged to pursue one. This dynamic magnifies the stakes of normal campus tensions for ideologues, parents, and policymakers. Academics and pundits who fret about the politicization of higher education will, in the same breath, extol universal college-going. Well, one of the surest ways to depoliticize something is to shrink its size and import. If more professionally ambitious students felt free to bypass higher education, if higher education were no longer the unavoidable gateway to professional success, and if institutions had to work harder at convincing students of the value of their degrees, the talismanic role of college would be tempered and campus culture clashes might feel less existential.
The privileged status of the college degree, and the earnings that attach to it, have also tended to stymie interest in whether college degrees actually add much in the way of skills and capabilities for graduates. For instance, between 2013 and 2016, at least 68 public colleges and universities administered the Collegiate Learning Assessment Plus, a test that measures students' critical-thinking, problem-solving, and reasoning abilities. As the Wall Street Journal reported, the results showed that, "[a]t more than half of schools, at least a third of seniors were unable to make a cohesive argument, assess the quality of evidence in a document or interpret data in a table." Even at some prestigious flagship universities, "the average graduate shows little or no improvement in critical thinking over four years." In their hotly debated 2011 book, Academically Adrift, scholars Richard Arum and Josipa Roksa tracked CLA scores for student cohorts at 24 selective colleges, and found that 45% of students in their sample "did not demonstrate any statistically significant improvement" in test scores during college, and that, on average, students showed minuscule gains in learning and critical thinking.
Thus, we wind up with lots of dutiful, loan-taking college enrollees and burdened students and families, despite clear evidence that at least half of students will end up as indebted non-completers — and amidst the likelihood that taxpayers will be left paying at least some of the bill. In other words, beyond issues of inefficiency and unfairness, the college-industrial complex aggravates an array of existing concerns about higher education.
RIGHT-SIZING THE ROLE OF COLLEGE DEGREES
The problems with out-of-control college credentialing should give businesses, academia, and policymakers pause. Given the legal scrutiny around virtually all other employment tests, college-degree requirements have been turned into a default tool for gauging applicant aptitude, regardless of the reliability and validity of that measure or of the social costs it imposes. Removing the college degree from its privileged pedestal would invite the accelerated development of alternatives that are faster, cheaper, and more user-friendly, but which right now operate at a substantial disadvantage.
The goal should be to level the playing field between college degrees and other employment practices. When college degrees are an appropriate prerequisite for a particular position, it's fine to treat them accordingly. If employers determine that degrees in nursing, electrical engineering, sociology, or anything else meet the established standards for hiring when it comes to particular jobs — and can make that case — then using these degrees as hiring tests is wholly appropriate. When degrees do not meet that standard, though, such requirements are problematic. After all, while a college education can offer great value — and while post-secondary credentials may make obvious sense when it comes to hiring for certain occupations — qualified applicants with requisite skills should not be turned away from broad swaths of jobs simply for want of a piece of paper.
There's an enormous, bipartisan opportunity here. Reducing the employer risk-avoidance and bureaucratic strictures that have enabled degree inflation, exacerbated student debt, and added costs to employment can offer a win-win for employers, those hoping to advance, and those seeking to address out-of-control college costs. The question is how to get there.
While it's a problem that federal policy has encouraged employers to rely so heavily on college degrees, it's also disconcerting to contemplate judges or federal officials getting into the business of dictating which employers or professions can require which kinds of credentials. After all, well-meaning federal regulation is a large part of how we got here. Given this, the optimal course for reformers, federal officials, and state governments is to remove barriers and promote innovation, competition, and experimentation.
The boldest and most straightforward response to all of this would entail Congress amending Title VII of the Civil Rights Act and the corresponding EEOC regulations around employment tests and disparate impact. The goal should be to put college degrees and other hiring qualifications on an equal footing by allowing easier use of employment tests and moving away from the doctrine of discrimination by numerical disproportionality. Opening room for hiring decisions to be made in other ways, without constant threat of lawsuit and federal investigation, would likely both spur experimentation around employee-selection processes and send a strong signal that Washington is willing to come together around workforce development.
Broadly speaking, the playing field can be leveled in one of two ways. The first and far more desirable tack would be to deregulate: broaden the ability of employers to use professionally devised employment tests in the same manner they use degrees, without fear of legal liability based almost entirely on disproportionate outcomes. The second, far less desirable approach would be to regulate further: subject college degrees to the same stringent tests that are applied to other kinds of employment tests. Of course, the prospects for such legislative activity seem dim, to say the least.
Short of legislative action, the executive branch could employ civil-rights enforcement strategies that might help to level the playing field between degrees and other employment tests. Lawyers at the Department of Labor and the Department of Justice could begin to ask the same kinds of questions about degree requirements that they do regarding other hiring practices, making clear that they are willing to examine whether the cavalier use of the college degree as a hiring requirement constitutes a suspect hiring practice. Indeed, the question arises whether the cozy practice of on-campus recruiting, in which college students receive privileged access to employers, is itself problematic. Bureaucratic enforcement of this sort does, of course, raise significant concerns of its own, particularly for those wary of capricious executive action.
Thankfully, there is also substantial room for state and federal policymakers to tackle the challenge in other ways. The college degree's privileged position, and the accompanying financial-aid policies designed to help make college accessible, have stunted the growth of alternatives. Thus, apprenticeships, non-degree programs, and training partnerships have been treated as poor cousins, with restrictions on federal aid limiting the ability of students to afford such programs. Meanwhile, efforts to develop a meaningful system of "micro-badges" to signal specific skills and competencies are still very much in their infancy. This has fostered an unfortunate cycle, as these other tools have been marginalized and made more difficult to obtain, reinforcing employer bias toward college degrees. Reforming the Higher Education Act so that federal aid can be used at a variety of non-college providers, and taking steps to promote the accreditation and validation of non-degree credentials, are useful steps in ameliorating the bottleneck.
Of course, policymakers have been reluctant to challenge the college-industrial complex, and most employers have long since made their peace with the predictability of the status quo. This creates a propitious opportunity for outside advocacy and legal action. If the status quo is buttressed more by convenience, habit, and resignation than by common sense or a defensible legal standard, a legal challenge to degree requirements could be the most promising strategy. For example, the Institute for Justice (IJ), a public-interest law firm, has amassed an impressive track record in litigating on the right to work and economic liberty. They have won lawsuits against states and localities that unduly restrict workers' rights with excessive regulations on a bevy of issues, including on behalf of taxi drivers in Arkansas, home bakers in Minnesota, and makeup artists in North Carolina. IJ, or a similar organization, would likely find no shortage of potential plaintiffs if they were to undertake a Title VII suit against degree requirements.
Other forms of private action, pursued on a sufficient scale, might help capture the public imagination and more fully illuminate the dysfunctional status quo. Back in 2011, for instance, Silicon Valley venture capitalist and noted libertarian Peter Thiel launched a scholarship that awards recipients $100,000 if they are willing to forgo or leave college in order to start a business. What if there were a rash of such efforts, public pressure on employers to not force potential workers to pay colleges a ransom, and advocacy and public-information campaigns that called out the problems with the preferential treatment of college degrees and the slippery slope of disparate-impact theory?
Some employers appear to recognize the costs and perversities of the college-industrial complex and are seeking ways to address them. In 2018, the job-search site Glassdoor compiled a list of 15 major companies that no longer require applicants for certain posts to have a college degree. The list included an array of entry- and mid-level jobs — everything from barista to Apple "Genius" to "senior manager of finance" — at such corporate giants as Apple, Google, Bank of America, Penguin Random House, Home Depot, Costco, Whole Foods, and Starbucks. Glassdoor lauded these firms for opening new pathways to success and recognizing "that book smarts don't necessarily equal strong work ethic, grit and talent." Praiseworthy developments like these on the part of employers should be celebrated and carefully studied.
The business benefits of such changes should be heralded as well. As author Ryan Craig noted in 2017, after the U.K. branch of global accounting firm Ernst & Young removed degrees from job descriptions in favor of micro-assessments determining whether the applicants were a good match for the job, EY was recognized as one of the U.K.'s top-20 employers for social mobility. Craig reports that, in shifting away from degree-based hiring, EY saw a 75% increase in applications, as well as a 10% increase in hires from state schools, and a 7% increase among those who were first-generation university graduates. By highlighting practices like these, which can lead to better business outcomes, policymakers and advocates may encourage employers to reassess their routines and see the value of rethinking their reliance on the college degree. Policymakers intent on pushing open the doors of opportunity should do what they can to make such a course less risky and thus more appealing for employers.
Finally, in higher education, those more intent on furthering opportunity than on providing aid and comfort to corporate human-resources departments would do well to consider the part that colleges play in all of this. There is an overdue opportunity within post-secondary education to develop alternative credentialing models and devise new ways to credibly certify aptitudes and skills. Most important, there's a need to ask when and why colleges are complicit in practices that compel students to unnecessarily enter college — not because those students actually want or need the things a college degree represents, but because they fear being denied good jobs based on their failure to buy a piece of paper.
PATHS TO OPPORTUNITY
College can be a very good thing. A great many Americans are better off for having earned their degree. That's not the issue. Rather, the point is that an inconsistent judicial standard, excessive regard for employer convenience, and a well-fed college cartel ought not oblige Americans to pay the ransom of a college diploma in order to seek remunerative employment and professional success. Compelling Americans to buy a degree, especially when doing so involves onerous debt and yields dubious value, is behavior more typically ascribed to protection rackets than to engines of opportunity.
"Diplomas and tests are useful servants," Chief Justice Warren Burger wrote in his Griggs decision, but "they are not to become masters of reality." If we're serious about expanding the paths of opportunity and enabling social mobility, it's high time to bust the college-industrial complex by tackling employment restrictions masquerading as education requirements.