The WCET Q&A about Online Program Rankings

by Justin Marquis, Ph.D.

In response to the January release of the first U.S. News rankings of online education programs, the WICHE Cooperative for Educational Technologies (WCET) published a fairly critical evaluation of the process and results on its blog. Among the things WCET criticized were the small amount of usable data U.S. News actually received to complete the “Honor Roll,” the viability of the methodology, the focus on inputs rather than outcomes, and U.S. News’ overall lack of knowledge about online education, as reflected in the questions it asked.

U.S. News’ Bob Morse quickly published a rebuttal of the analysis and offered to respond directly to WCET members’ questions about the Honor Roll. Here is a summary of some of the more enlightening questions from WCET members and the answers from the developers of the rankings.

Question: What is the perceived value for the consumer or other audiences of a first round of rankings which so clearly does not reflect the quality of available programs?

Answer: According to the respondents, the value of the rankings is that they now exist. No such useful information was available for prospective students and this fills a gap. In essence, this is a beginning to an ongoing process which starts to fill gaps in comparative and evaluative data.

Analysis: I suppose that this would have been deserving of a follow-up question. That question would be, "Why not wait to publish the rankings if there is so little data available?" I’m trying to imagine what would happen if a researcher in any field submitted a scholarly paper for publication that drew conclusions (that people are supposed to use to make decisions) from the rough beginnings of an imperfect data collection effort. First, no researcher in their right mind would even dream of submitting that paper, and second, they probably wouldn’t even receive a rejection letter. Their effort would go straight into the circular file. The release of the U.S. News Honor Roll seems to be nothing more than a quick money grab. They knew people would buy the ranking edition of their magazine regardless of the value of the information contained in it, so they maximized their profits by pushing out an incomplete product, which they can then follow up with another “updated” ranking in a few months.

Question: In the questionnaire, there seemed to be an assumption that it was desirable for distance education students to commingle with on-campus students online. Can you point to the research findings that support this concept?

Answer: Not true, according to the U.S. News officials. There is only a "single minor indicator about career centers" used in one section of the reporting. No advantage was given to blended programs in the rankings.

Analysis: Strangely enough, both the WCET member asking this question and I got the same impression from the rankings: that blended programs were rated more favorably. In my initial post on the online rankings, I critiqued the definitions being employed to determine what an online program is. This fundamental confusion on the part of U.S. News seems to have carried over into the actual ranking of online programs, which comes across as a bias toward blended programs. Whether the bias is intended or not is irrelevant; either way, people are reading something in the Honor Roll that seems to favor blended options.

Question: How much are organizations listed paying for leads generated by your site, and what are the Web analytics on the site, e.g., visits, unique visits, page visits, etc?

Answer: "We are a private company and do not release that type of data, financial or Web stats." This was the U.S. News response to both questions.

Analysis: Two of the most enlightening answers provided by U.S. News appear here. Given that the online program rankings on the U.S. News website included a prominently displayed lead aggregation tool (WCET blog, Jan. 2012), the refusal to answer these questions is answer enough. The answer must be an affirmative, because there would be no harm in saying "nothing," or "we do not profit from the lead generation." So it seems clear that they do profit, and that the profits must be significant given the amount of traffic the U.S. News site generates. According to the tools at Open Site Explorer, the U.S. News website has page authority scores between 90 and 94, a domain authority of 95, and more than 205,000 links to the home page from other sites (OpenSiteExplorer.org).

Question: In what sense do the rankings they’ve come up with represent "the best advice and information available for your … online education"?

Answer: The rankings are just one tool that students should use in their college search process.

Analysis: True, but the quote from U.S. News’ marketing materials claims that the rankings are the "best advice and information." This claim is not addressed in the answer, and it is undercut by the U.S. News representative’s response to an earlier question: that this ranking merely fills a gap in needed information. Apparently, being the only ranking of online programs makes it the best by default.

Question: What specific steps does USNWR plan to take to increase numbers of participants and ensure accuracy and completeness of survey responses?

Answer: The U.S. News officials are pleased with the response rate (about 10 percent) and expect it to be better next time. They will use this test run to shape future survey questions and plan to include data on outcomes once students graduate and enter the work force.

Analysis: Publishing data from what amounts to a test run of their ranking system as the "best advice and information available for your … online education" is beyond irresponsible. People are relying on these rankings as a serious tool to guide their decision-making process. If this effort is a work in progress, there should be a flashing red disclaimer emblazoned across each page of the Honor Roll explaining that limitation.

Further, if a 10 percent response rate is pleasing, then there is a serious problem with the survey tool. The surveys should have been targeted only at schools that offer online programs rather than being (apparently) randomly sent to 2,000 schools that may or may not have had online programs. This echoes a long-standing criticism of the U.S. News ranking system: they don’t want to do any of the research themselves. All of the data they collect via these surveys is publicly available, but they insist on placing the burden on the schools to compile it for them. This wastes the institutions’ time and resources while providing an easy cash cow for U.S. News. At the very least, they could have determined which schools they should have targeted with their surveys.

As for students needing to "complete entire online degree programs, graduate, and enter the workforce" before data is available, this is simply one more excuse for not wanting to do research for themselves. Online education has been around since 1994. It seems like a few of those students might have completed their degrees by now.

Question: In the survey, USN asked colleges to "guestimate" or give a "composite answer" broadly about all degree programs inside one college rather than look at individual degree programs. There is often very little congruence across online degree programs within one academic division, let alone an entire college. Do you plan to correct that?

Answer: "Unfortunately, allowing schools to report data on more specialized levels would require specialized survey instruments with specialized questions. This would make computing rankings using standardized criteria more challenging." The U.S. News respondents did say that they would be open to considering the possibility of allowing schools to report information at the program-level in the future.

Analysis: This is another long-standing issue with the U.S. News rankings: a desire to paint with very broad strokes. Entire institutions are ranked rather than specific programs within institutions. Certainly, a middle-of-the-road school can have an outstanding science program or a great distance education program. Conversely, an elite school may have very poor online offerings, or lump undergraduates into giant lecture courses. Relying on an overall numerical ranking of institutions to make a decision of this magnitude is ridiculous. An overall ranking isn’t the right approach, and U.S. News is not interested in gathering information with enough detail, from enough sources, to make its system useful.

Conclusions
There are many more questions and enlightening answers from the WCET Q&A, and I would encourage anyone interested in the issue to read the entire transcript on the WCET website. Overall, the session comes off as a half-hearted attempt to placate critics with the possibility of considering some future changes. That possibility is limited, however, given the news magazine’s format and the difficulty of producing a tool that is informative enough to be useful yet concise enough to be easily digestible in a fast-paced, consumer society. Ranking things is part of our culture, and the U.S. News effort is just one more bubblegum, candy-coated product feeding the constant appetite for easy answers in place of serious inquiry. The most logical recommendation to draw from this Q&A with the creators of the U.S. News online education rankings is that students looking for information about online programs should do their own research and avoid the U.S. News rankings altogether.