Going to college used to be something that few people did. In 1940, only 4.6% of Americans had completed a bachelor’s degree or higher. In 2017, that number was 33.4%. Another ten percent of Americans had completed a two-year associate degree, meaning that about nine out of every twenty Americans had at least a two-year college degree. Considering that only about 15 percent of Americans finished high school in 1940 (when the average number of school years completed was about eight and a half), that’s an impressive shift in Americans’ attitude toward education in general and post-high-school education in particular.
Junior colleges started in the early 20th century, offering expanded local access to education beyond high school. These schools initially offered courses that paralleled the classes a student would expect to take during the freshman and sophomore years at a senior college. Completing the two-year program initially resulted in a certificate, though the University of Chicago had begun awarding associate degrees as early as 1899. By the 1920s, junior colleges were offering distinct programs in trades, including business management, engineering, mechanical work, agriculture, and other areas. After World War II, the occupational training component of junior colleges expanded considerably; today, about half of associate degrees are related to a specific trade or occupation, and the other half are in more traditional liberal arts and sciences.
There is an ongoing debate over whether two- and four-year institutions have placed too much emphasis on occupational training and allowed the liberal arts to be diminished. The discussion can quickly veer into arguments over how rigorous college work is today, or whether general education requirements in English composition, math, social sciences, and other areas are needed at all in the 21st century.
It’s important to remember that when you hear about “higher education,” “colleges,” or “universities” in the media, one size doesn’t fit all. To an individual student there is no “higher education” in that sense; there is only the institution they’ve chosen to attend. Most are non-profit, but the for-profit education sector has grown and ebbed in recent years. Many of the best-known universities are operated by states, and nearly all community colleges are public institutions, but there are also many private colleges and universities in the U.S. Some are enormous, some are tiny. Some are sponsored or influenced by religious denominations. Some are expensive, some much less so.
Each student’s experience is unique, though we often try to “standardize” things. Sometimes those standardization efforts benefit the student; other times it seems that the choice to standardize a process benefits the institution, the faculty, or the support staff. Trying to support each student’s goals and dreams is hard work, but there are so many students (20 million or so in 2020)! Shortcuts are taken by the school… and by the student.
This post is by no means an exhaustive survey of the history of higher education in the United States. There are plenty of good books on that subject if you’re interested in more depth (John Thelin’s classic “A History of American Higher Education” is one choice). I wanted to set the stage for the discussion I’m hoping to have over the next few days, weeks, or months about where the (admittedly) broad concept of higher education is now, and how we can begin to respond to the challenges and opportunities that already existed but that 2020 has made even more critical.