On Oct. 3, 2011 I wrote that the U.S. News rankings of online education could hinder the future innovation of online programs. Well, those rankings finally appeared this week with great fanfare, including a nice write-up in the Chronicle of Higher Education stating that the report does not specify which program is No. 1 (DeSantis, Jan. 10, 2012), instead favoring an “honor roll” of the best programs. Was my fear mongering about the possible detrimental effects of a ranking system for online programs justified? Did the revised methodology applied to online programs address concerns about the validity of a ranking system?
A Ranking by Any Other Name...
In an attempt to address concerns that presenting a No. 1 overall online degree program or school would be artificial and contrived, and would fail to do justice to the wide scope of online learning, U.S. News did not rank an overall winner in online education. Instead, it created a numeric "honor roll" that leaves it up to potential students to determine which program is best for them based on some very cosmetic criteria.
The report does contain numeric rankings for what U.S. News has determined are the most important sub-categories necessary to determine the quality of an online degree program. These are: Faculty Credentials and Training, Student Services and Technology, Student Engagement and Assessment, and Admissions Selectivity. So, effectively there is no overall best program, but rather a bunch of best program elements. But what do the rankings in these areas actually tell us about the quality of a program?
I once wrote in to Mike & Mike on ESPN and proposed an NFL Playoff Confidence Ranking system (which they have used for a few years now) that worked much like the new U.S. News online program rankings. In my idea, you rank each team in order of your confidence in how good they are in several areas (coaching, quarterback, defense, etc.), tally up the numbers, and decide which is the best team based on your subjective feelings of "confidence" in those areas. The U.S. News ranking system is a lot like this confidence index for NFL playoff teams. The idea of confidence is as arbitrary and subjective as much of the information used by U.S. News.
For the online program rankings, keep in mind that all of this data is self-reported, based on “typical” data derived from “calculations to the best of their abilities” (U.S. News, Jan. 9, 2012) by the schools that opted in to the survey. This data is then tabulated, with programs placing in the top third in three of the four categories making the honor roll. The results are, by the way, then ranked from 1 to whatever for each program element.
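The tabulation rule described above is simple enough to sketch in a few lines. This is a hypothetical illustration only: U.S. News does not publish its scoring code, and the interpretation of "top third" as a rank cutoff over the number of surveyed programs is my assumption.

```python
# Hypothetical sketch of the honor-roll rule as described: a program makes
# the honor roll if it places in the top third in at least three of the
# four ranked categories. Program ranks below are invented for illustration.

CATEGORIES = [
    "Faculty Credentials and Training",
    "Student Services and Technology",
    "Student Engagement and Assessment",
    "Admissions Selectivity",
]

def makes_honor_roll(ranks, total_programs, needed=3):
    """ranks: dict mapping category name -> 1-based rank among total_programs."""
    cutoff = total_programs / 3  # "top third" of the surveyed field (assumed)
    top_third_count = sum(1 for c in CATEGORIES if ranks[c] <= cutoff)
    return top_third_count >= needed

# Invented example: 30 programs surveyed, so the top third is ranks 1-10.
example_ranks = {
    "Faculty Credentials and Training": 4,
    "Student Services and Technology": 12,
    "Student Engagement and Assessment": 7,
    "Admissions Selectivity": 9,
}
print(makes_honor_roll(example_ranks, total_programs=30))  # three of four in the top third
```

Note that nothing in this rule weighs whether the categories themselves measure quality, which is exactly the objection raised below.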
Is Online Learning a Beauty Contest?
The areas that the U.S. News online program honor roll considers are: Faculty Credentials and Training, Student Services and Technology, Student Engagement and Assessment, and Admissions Selectivity. What do these categories tell us about the actual quality, usefulness, rigor, future employability, or value of any of these programs? Unfortunately, not much.
- Faculty Credentials and Training – This category considers the percentage of faculty with a Ph.D. and at least two years of online teaching experience. The very idea that faculty credentials are important is under attack, with a recent movement toward recognizing informal education as a valid source of learning. I agree that having faculty members with experience is beneficial, so why doesn’t more experience count for more? Some online-only universities may have fewer Ph.D.s teaching for them, but those individuals could have 15 years of experience as online educators.
- Student Services and Technology – The criteria for this are career placement assistance, live tutoring, a smartphone app, and live streaming video. There is no assessment of whether the career placement assistance is effective or even used by a majority of students. Live tutoring is helpful, but not essential to having a quality online program. Whether or not a program has a smartphone app is completely irrelevant – it is far better for a student to be working on a computer, where they can be focused on their tasks, than trying to multitask on a 3.5-inch screen while they are driving to work. Further clarification of "live streaming video" is needed, as I doubt that all of the programs listing it actually have live video feeds for their courses. If they do, it would be good to know what percentage of their classes have it.
- Student Engagement and Assessment – This category considers the expected instructor response time (in hours), the number of weekly office hours, and whether or not the program is accredited by NCATE or TEAC. “Expected” response time is in no way indicative of the reality of response time. This evaluates what the institution’s policy is, not what the reality is. The category of office hours per week is arbitrary and does not indicate the quality of a program or the instructors. For example, while I only have three official “office hours” each week, I am available nearly 24 hours a day, 7 days a week for my students via email or the phone. Finally, NCATE and TEAC are not the only accreditation agencies in this country.
- Admissions Selectivity – Here the evaluation is based on whether or not there is a standardized test requirement, average undergraduate GPA, average GRE score, and acceptance rate. The issues with standardized tests are well documented and requiring them may exclude capable people from higher education for any number of reasons (cost, poor test taking skills, etc.). The acceptance rate of any program is a manipulated statistic. Schools boost their application rates in any number of ways in order to make themselves appear more selective. Name recognition, advertising budget, direct mailings, U.S. News ranking, etc. all influence the acceptance rate of schools. Acceptance rate is in no way indicative of the quality of an institution or program. This category is essentially the swimsuit portion of the U.S. News beauty contest.
Where are the Big Players?
One thing that strikes an informed reader of the U.S. News online program rankings is the absence from the honor roll of some major players in online education such as the University of Phoenix, Kaplan, DeVry, and American Public University. Some of this absence is accounted for by the fact that not all for-profits chose to participate in the voluntary survey (University of Phoenix, for example). Some of it is attributable to the lower percentage of Ph.D.s teaching at these institutions and some of it is attributable to factors such as a lack of NCATE or TEAC accreditation. The absence of these most popular online programs does, however, indicate a serious flaw or bias in the ranking system.
There would seem to be a serious bias in the survey in favor of traditional universities that also happen to have online programs. The accreditation category, for example, which considers only NCATE or TEAC, systematically leaves out institutions that use other agencies such as The Higher Learning Commission – the agency used by the University of Phoenix, DeVry University, Kaplan, and American Public University. This is the most serious flaw in the U.S. News honor roll of online programs, but others are sure to emerge in the coming weeks as people delve deeper into the methodology used in the survey.
“Rank” Online Program Rankings
According to the Urban Dictionary, something rank is "horrible" or "disgusting." That is how I felt about the U.S. News rankings before they came out this week, and actually examining them has not alleviated that feeling. Their methodology seems to pander to programs based in brick-and-mortar universities. It attempts to deflect criticism by changing the semantics regarding what it actually is from a "ranking system" to an "honor roll," which nonetheless still ranks aspects of degree programs. And it still reduces the choices students are faced with to a popularity or beauty contest without any consideration of the real quality or value of programs or how they best match students’ individual needs. As one comment posted on the Chronicle article puts it: "the annual rankings not only sell magazines, they feed into the undergraduate enrollment frenzy that has distorted admissions beyond recognition" (Chronicle.com, Jan. 10, 2012). Students have come to equate the “best” program with the one that is the highest ranked and scurry to apply to those programs.
The college admissions process is not about choosing the best school based on arbitrarily created rankings, but rather about choosing the school that best fits an individual’s needs, expectations, and budget. The U.S. News rankings of online degree programs do not help students determine which programs meet those criteria in any way, only which one looks best in print. Let’s dump the "rankings" and "honor roll" labels and call this effort what it is – a beauty contest.