For Immediate Release
Isabelita Reyes: Asst. Vice President for Public Affairs
Director, UP System Information Office
Contact Numbers: 926-1572 / 0918-9091361
This is the third year that such a survey has been conducted and its results given prominence by local dailies. But according to UP Vice President for Public Affairs Cristina Pantoja Hidalgo, UP has never agreed to participate in this survey. In fact, this year, President Emerlinda R. Roman did not even receive an invitation to be a part of it. Nor did she receive any questionnaire to answer.
What she did receive was an email message from QS Asia Regional Director (Asia Pacific), Mandy Mok, informing her that UP had “gone up in the rankings” for 2008. The email also contained an invitation to buy “an attractive package” from THES-QS. The “package price,” which includes a banner on topuniversities.com, a full-page, full-color ad in Top Universities Guide 2009, and a booth at Top Universities Fair 2009, amounts to $48,930.
Since UP was not invited to participate and therefore had not provided any data, UP officials do not know where and how the figures were obtained on which the ranking was based, Hidalgo said.
“UP can hardly be expected to spend more than 2 million pesos on publicity for itself involving a survey conducted by an organization that refuses to divulge where it obtains its data,” she added.
In 2007, UP was invited to participate in the survey, but when THES-QS refused to explain where it obtained the data used to determine UP’s rank in the 2006 survey, university officials decided not to accept the invitation to be part of the 2007 survey. In 2006, UP was ranked No. 299, and Ateneo was ranked No. 500.
UP wrote THES-QS in July 2007, informing them of UP’s decision not to be a part of the survey; and again in September 2007, requesting the organization to respect UP’s decision. In response, research assistant Saad Shabir wrote back saying that if it did not receive the information it would be “forced to use last year’s data or some form of average.”
Surveys and rankings obviously have their usefulness. But, as the National University—status officially granted to it with its new Charter on its centennial year—UP feels
that before it agrees to participate in such an exercise, it must carefully examine the indices by which it is to be evaluated. It also needs to be convinced of the reliability of the methodology used in the exercise.
The THES-QS ranking is supposedly meant to serve as “the definitive guide to universities around the world which truly excel.” In evaluating institutions, it computes half of the index from an institution’s reputation as perceived by academics (peer review, 40%) and global employers (recruiter review, 10%). Since it does not specify who is surveyed or what questions are asked, the methodology is problematic.
In an earlier statement released in August this year, and carried by several national dailies, UP said: “Even peers require standardized input data to review. But according to the International Ranking Systems for Universities and Institutions: A Critical Appraisal, published by BioMed Central, the Times simply asks 190,000 ‘experts’ to list what they regard as the top 30 universities in their field of expertise without providing input data on any performance indicators (http://www.biomedcentral.com/1741-7015/5/30). Moreover, the survey response rate among selected experts was found to be below 1%. In other words, on the basis of possible selection biases alone, the validity of the measurement is shaky.” (See Pano, “Only Two RP Universities Made It…” UP Newsletter, August 2007, p. 5.)
According to the statement, the other half of the index is based on such indicators as student-to-faculty ratio, the number of foreign faculty and foreign students in the university, and the number of citations in internationally accredited publications. “Data for these indicators depend on the information that participating institutions submit. An institution’s index may be easily distorted if it fails to submit data for the pertinent indicators, or if it chooses not to participate.”
Sergio S. Cao, PhD
Professor of Finance
College of Business Administration
University of the Philippines
Diliman, Quezon City
In the overall rankings released Monday, Ateneo rose from number 451 in 2007 to number 254 this year, while UP rose from 398 last year to 276.
The THE-QS World University Rankings are based on data gathered in the following categories: peer academic review, recruiter review, international faculty ratio, international student ratio, student-faculty ratio, and research citations per faculty.
Ateneo had an overall score of 48.0 out of 100, up from 30.8 last year, while UP posted a 45.9 overall score, up from 34.7 last year.
Ateneo was tied with Spain’s Universidad Autónoma de Madrid, while UP was tied with Germany’s Universität Ulm and Universität Würzburg, and the United States’ Virginia Polytechnic Institute.
The two universities also figured in the subject-specific rankings for the first time.
Ateneo and UP were both ranked in the top 100 Arts and Humanities institutions worldwide: ADMU was ranked number 79, while UP was at number 82, along with the University of Notre Dame in the US.
ADMU, UP, and another local university, De La Salle University, were also part of the 100 institutions with the highest employer review scores.
Ateneo was ranked 76th in employer review, tied with the University of Western Australia, with a score of 88; UP was tied with the University of North Carolina at 82nd, with a score of 87; and DLSU was 92nd, with a score of 84.
In the overall rankings, universities in the United States and the United Kingdom, led by Harvard University at number 1, continued to dominate. The highest-ranked Asian university was the University of Tokyo in Japan (19), while the National University of Singapore (30) was the highest-ranked in Southeast Asia.
Meanwhile, two other Philippine institutions – the University of Santo Tomas and De La Salle University – were part of the group ranked 401-500.
“Further down the rankings, fewer data are available to evaluate each university and the statistical appropriateness of discerning one university from the next begins to decay. Responses for institutions in our survey drop off exponentially from the top of the table, by the time it gets past 400 the results become highly sensitive to error. As a result, precise positions beyond 400, are not published,” QS explained in its rankings tally on topuniversities.com.
Now in its fifth year, the research is conducted and compiled by QS Quacquarelli Symonds and appears in print in Times Higher Education on 9th October and online on the QS website http://www.topuniversities.com on 10th October.