I admit that the quality of education at my beloved UP (and in the Philippines in general) is sliding downhill. We already know the reasons: lack of funds, schools, books, and teachers; endless tuition increases; the corruption of the Arroyo government and of those that came before it; and many, many, many more.
At UP, many faculty members have already left, choosing instead to teach at DLSU, ADMU, UA&P, or at universities abroad. And how could they not, when UP professors earn barely enough for pandesal while their counterparts at other universities earn enough for Auntie Anne's almond-flavored pretzels? The lack of research output from the faculty is also a problem. How could there be any, when there is no funding? Tuition, once P300 per unit at the undergraduate level, is now P1,000. In graduate studies, it went from P500 per unit to P1,500. Where else would you get a deal like that? Students in the lower income brackets are thus at a severe disadvantage. There is the STFAP, yes, but it is full of holes.
So there. Which is why I would not be surprised if UP's position in the world university rankings were to drop.
But what I cannot let pass is an institution that is utterly irresponsible in gathering the data it uses to rank the world's universities. Rankings themselves are fine; at the very least, they let us see how universities in the Philippines compete with universities in other countries. But when the methodology used is deeply problematic, and is never explained, that is a different conversation altogether.
---------------------------------------------
In the 2008 university rankings recently released by the Times Higher Education Supplement (THES) and Quacquarelli Symonds (QS), only two Philippine universities, the University of the Philippines and the Ateneo de Manila University, made it into the top 400. UP rose from 398th in 2007 to 276th this year; Ateneo rose from the 401-500 band to 254th. De La Salle was ranked 415th and UST 470th.
According to UP Vice President for Public Affairs Cristina Pantoja Hidalgo, UP President Emerlinda R. Roman did not receive any invitation to participate in the survey this year or any questionnaire to answer. What President Roman received was an email message from QS Asia Quacquarelli Symonds’ Regional Director (Asia Pacific), Mandy Mok, informing her that UP had “gone up in the rankings.”
Since UP had not been invited to participate in the survey and had not provided any data, UP officials do not know where or how the figures on which the ranking was based were obtained.
Hidalgo revealed that the message also contained this statement: “In view of the good news, would you like to consider signing up the following at a very attractive package price?” The “package price,” which includes a banner on topuniversities.com, a full page full color ad in Top Universities Guide 2009, and a booth at Top Universities Fair 2009, amounts to $48,930.
“UP can hardly be expected to spend more than 2 million pesos on publicity for itself involving a survey conducted by an organization that refuses to divulge where it obtains its data,” Hidalgo said.
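Hidalgo's peso figure is simple arithmetic; assuming an exchange rate of roughly P47 to the US dollar (the approximate late-2008 rate, an assumption on my part rather than a figure from the UP statement):

$$ \$48{,}930 \times \text{P}47/\$ \approx \text{P}2{,}300{,}000 $$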
In 2007, UP was invited to participate in the survey, but when THES-QS refused to explain where it obtained the data used to determine UP’s rank in the 2006 survey (where UP was ranked No. 299), university officials decided not to accept the invitation to participate in the 2007 survey. Moreover, the university was given barely a week to respond to the questionnaire.
UP wrote THES-QS in July 2007, informing them of this decision, and again in September 2007, requesting that the organization respect UP's decision. In response, research assistant Saad Shabir wrote back saying that if THES-QS did not receive the information, it would be "forced to use last year's data or some form of average."
These rankings are supposedly meant to serve as "the definitive guide to universities around the world which truly excel." In evaluating an institution, the survey bases half of the index on the institution's reputation as perceived by academics (peer review, 40%) and global employers (recruiter review, 10%). Since it does not specify who is surveyed or what questions are asked, the methodology is problematic.
An earlier statement, released by UP in August this year, and carried by several national dailies, said: “Even peers require standardized input data to review. But according to the International Ranking Systems for Universities and Institutions: A Critical Appraisal, published by BioMed Central, the Times simply asks 190,000 ‘experts’ to list what they regard as the top 30 universities in their field of expertise without providing input data on any performance indicators (http://www.biomedcentral.com/1741-7015/5/30). Moreover, the survey response rate among selected experts was found to be below 1%. In other words, on the basis of possible selection biases alone, the validity of the measurement is shaky.”
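To put that sub-1% response rate in perspective: of the 190,000 experts the Times reportedly surveys, fewer than

$$ 190{,}000 \times 1\% = 1{,}900 $$

self-selected respondents end up determining the peer review score, the single heaviest component (40%) of the index.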
According to the statement, the other half of the index is based on such indicators as the student-to-faculty ratio, the number of foreign faculty and students in the university, and the number of academic works by university researchers that have been cited internationally. "Data for these indicators, however, typically depend on the information that participating institutions submit. An institution's index may be easily distorted if it fails to submit data for the pertinent indicators, or if it chooses not to participate."
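Putting the two halves together, the composite index can be sketched as a weighted sum of six normalized indicators. Only the 40% peer review and 10% recruiter review weights come from the statement above; the split of the remaining 50% (20% faculty-student ratio, 20% citations per faculty, 5% international faculty, 5% international students) follows the weights THES-QS published for its 2008 methodology and should be read as a reconstruction, not as part of UP's statement:

$$ \text{score} = 0.40\,P + 0.10\,E + 0.20\,F + 0.20\,C + 0.05\,I_f + 0.05\,I_s $$

where $P$ is the peer review score, $E$ the recruiter review score, $F$ the faculty-student ratio score, $C$ the citations-per-faculty score, and $I_f$, $I_s$ the international faculty and student scores. As the statement notes, at least $F$, $I_f$, and $I_s$ depend on data submitted by the institutions themselves, which is exactly where a non-participating university's score can be distorted.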
As Dr. Leticia Peñano-Ho said in an article carried by the UP Forum last year: "The crux of the matter is to identify the indices that can approximate the different landscapes of universities. There might be a need to relate these indicators to the universities' mission statements. UP's constituents can identify their own indicators and decide on their desirability, relevance and reliability. These criteria should, as an added value, provide international comparisons."
- http://www.up.edu.ph/features.php?i=93