Christian universities are becoming harder and harder to find in modern academia, but it wasn’t always that way in America.

__________________________________________________

The history of America’s institutions of higher learning goes all the way back to before the nation was independent.

Believe it or not, religion was integral to colleges and universities in America when higher education first arrived here.

According to an article by Clare Kaufman, a freelance writer specializing in education and career advice, the first colleges established in colonial America were divinity schools.

They were intended to prepare students for careers as Puritan clergymen and to spread the faith of the group that founded them, according to Kaufman.

These schools also taught the traditional liberal arts curriculum, rather than more practical scientific or vocational subjects, according to Kaufman.

The religious foundations of these schools, such as Harvard, gradually eroded, however, according to Kaufman.

In the 1600s, 70 percent of graduates were clergymen; by the second half of the 1800s, only 10 percent were.

The increasing focus on secular education had an early proponent in Thomas Jefferson, who believed in educating citizens for the sake of democracy and building a skilled workforce.

The first great landmark in achieving this type of education came in 1862, when President Abraham Lincoln signed the Morrill Land-Grant Act, according to Kaufman.

The act granted public land to the states and territories so they could establish colleges focused on agriculture and the mechanic arts.

Many of the state colleges and universities that we know today had their origins in the Morrill Act.

The Land-Grant Acts of 1862 and 1890 helped the number of higher education institutions explode from only 23 in 1800 to 821 in 1897.

The new schools embraced science, engineering and professional training, and by the end of the 1800s, only 13 percent of the nation’s institutions of higher education were religiously affiliated, according to Kaufman.

Also during this period, American universities began to develop closer relationships with industry, and private funding became more prevalent.

As higher education moved into the 20th century, economics played an important role in shaping the focus of institutions.

Fields directly linked to industry, such as chemistry and physics, expanded to produce scientists and new technology, and the importance of a versatile education began to emerge.

The next major change in the nature of higher education came with two events: the G.I. Bill and the Civil Rights Movement.

The G.I. Bill made it possible for many war veterans lower on the economic ladder to attend a college or university.

Millions of students who had been too poor to receive higher education had their tuition, books and even housing paid for, enabling them to get the education they wanted.

It was the G.I. Bill that linked the idea of earning a college degree with achieving the “American Dream,” and the government resolved to be more involved in higher education, according to Kaufman.

The Truman Commission Report, a 1947 presidential commission report aimed at expanding this role, brought about the system of community colleges we know today.

The Civil Rights Movement overcame the racial barriers that the G.I. Bill had not, according to Kaufman.

Affirmative action in college admissions helped students from other cultures and ethnicities overcome hurdles to acceptance at colleges and universities, according to Kaufman.

Although several states rejected affirmative action measures in the 1990s, student bodies and curricula had already become quite multicultural, according to Kaufman.

In 1991, the first online university received accreditation, demonstrating the newest trend in the changing world of higher education — the use of technology.

Distance learning has experienced tremendous growth, opening up new opportunities for people such as working adults and stay-at-home mothers, according to Kaufman.

Interactive technology is making learning a more involved, project-based activity and making it easier for professors to tailor material to different learning styles, according to Kaufman.

Higher education in the U.S. has distinguished itself from that of other countries by being driven mainly by state policy rather than federal policy.

The private funding that began to grow in the late 1800s and early 1900s has resulted in a considerable portion of American higher education institutions being private, whether nonprofit or for-profit.

In 2010, out of 4,350 colleges and universities, more than 1,600 were private nonprofit and 1,000 were private for-profit, according to one government education report.

These private schools generally have higher tuition rates, meaning that although the federal government does not exert as much control over them, federal student aid is still an important factor.

Despite growth in both federal and other student aid from the late 1990s into the first decade of the 21st century, studies show the cost of attending private schools grew faster.

The 2006 Report of the Commission on the Future of Higher Education pointed out other areas where America’s higher education was losing its shine.

Many high school students dropped out before reaching postsecondary education, and those who made it past high school often did not go to college because of a lack of information, costs or the confusing federal student aid system, according to the report.

Of those who made it into higher education, many spent time learning skills they should have acquired in high school, and many had not reached the level they should have by the time they earned their degrees, according to the government report.

In fact, literacy levels among college graduates had declined.

No one can say for certain what the future of higher education holds, but one thing is sure — colleges and universities will continue to change and adapt in exciting ways as technology revolutionizes both teaching and learning.