
Can IT Be the Core of Higher Ed?

by Justin Marquis Ph.D.

In his opening remarks at EDUCAUSE 2012, NYU Journalism professor Clay Shirky made the following statement:

“IT [information technology] changes the world by making impossible problems trivial” (Clark, 7 Nov. 2012).

Despite this belief, which I also share, IT remains largely at the periphery of higher education. Technology more often than not functions as a support, or an add-on, rather than as the core of education. This is extremely ironic at a time when most of our society relies on technology, social media, and telecommunications to mediate most of our interactions. Given its importance, why hasn’t information technology become a core subject – a focus, if you will – of higher education? What would students gain from a shift in priorities to emphasize IT? What would they lose? And what would it take to make the change happen?

Why Isn’t IT More Important in Higher Ed?
The crux of this issue lies in understanding why technology is not more central to the mission of higher education. In my post Preparing for the Onrushing Educational Tsunami, I examined this issue specifically and found several causes for the staunch resistance to a full integration of educational technology at the undergraduate level.

  • Technology undermines residency: Driven by an underlying suspicion and misconception that technology isolates people rather than facilitating the connections that bring them together and increase opportunities for collaboration, much of the old guard of higher ed administration views technology as a barrier to establishing the kinds of communities that make learning successful.
  • Technology breaks tradition: Resting on hundreds of years of tradition in the U.S., higher education is set in its ways, and those ways naturally do not include technology. Personal contact, face-to-face interaction, lecture halls, and a well-defined way of doing things make institutions resistant to changing a system that has been successful for a very long time.
  • Technology is viewed as less engaging: Another myth about technology is that it fosters bad habits and anti-intellectual practices that are antithetical to the college experience. In fact, technology can be employed to increase academic rigor by connecting student researchers to a far wider pool of information, and even to actual experts, than would ever have been possible on a residential campus with finite resources and an isolationist bent. Technology may make things faster and easier, but that does not automatically equate with laziness. Rather, it makes students more efficient and productive.
  • Technology is expensive: This is not a myth, and it is the greatest single deterrent to a full and productive centering of technology in higher education. There can be no doubt that the cost of acquiring, equipping, maintaining, supporting, upgrading, and replacing electronic devices is prohibitive. But the benefits of facilitating rich and meaningful uses of the kinds of technologies that will not only make education more effective and efficient, but also provide students with valuable real-world skills, should not be underestimated.

While all of these reasons may not apply to every college or university, some combination of them certainly contributes to the overall marginalization of IT at most institutions. But what exactly would making technology the focus of higher education accomplish?

What Would an IT Focus for Higher Ed Do?
Some of the gains possible from a centering of IT in higher education are obvious, but others are more subtle, though no less important. Here are the three major benefits that students would reap from a more centralized role for IT in higher education.

  • Real-world technological literacy: The most important things students stand to gain from a centering of IT in the higher education curriculum are valuable skills in communication, collaboration, and speaking the language of the 21st century. The first two are essential for working in a hyper-connected global economy; the third ensures that college graduates will be able to adapt to the constantly changing nature of that landscape.
  • A broader and deeper college experience: As mentioned previously, the connections made possible through a rich integration of technology in the classroom can facilitate connections more broadly to resources beyond the ivory tower while simultaneously allowing students the tools to dig more deeply into their areas of interest.
  • The habit of lifelong learning: Finally, the constantly changing nature of technology, which makes an integration of IT challenging, also demands that students develop the ability to adapt to a perpetually shifting world of new connections and new ways of communicating. Technology itself is not the key here; rather, it is cultivating the mindset that the use of technology is critical to success, and that such use depends on an ability to keep up with and incorporate new tools as they come online.

These big-picture benefits of a core integration of IT would not be easy to achieve, and the most obvious objection to pursuing them concerns what might be lost in the move.

What Would We Lose by Moving IT to the Center of Higher Ed?
The most obvious objection to a centering of IT in higher education is that it would dilute the curriculum. This concern rests mainly on the idea that IT would either have to become its own major or minor that all students would be required to take, or that it would become so interwoven into current majors that faculty members’ focus on their disciplines would be disrupted. While I personally advocate for both of these options occurring simultaneously, I do not believe that moving technology to the center of higher education will water down learning, primarily because IT is naturally moving to a more prominent role in all education as it becomes more central to society.

How Can We Make the Change?
There are two things about technology that make its assuming a more important and robust role in higher education inevitable. First, technology makes learning better in all the ways described above. Second, technology is rapidly becoming the most important aspect of our lives. We work, play, communicate, and grow through and because of technology. Barring a major worldwide catastrophe, our reliance on technology for day-to-day living is not going to lessen. Because of the key role IT plays in our lives, it is only a matter of time before it becomes a key component of higher education. All we have to do is sit back and wait.

IT is the single most important thing you learn and need to know to be a participant in our hyper-connected world. Some forward-thinking university is going to catch on to that and develop a curriculum that takes advantage of technology in ways that make it central to every other discipline. When that model emerges, real change in higher ed will be right behind.

Please join the discussion about the future of higher education on Google+ and Twitter.

Image courtesy of nirots / FreeDigitalPhotos.net