Talk:College and university rankings/Archive 2

From Wikipedia, the free encyclopedia

THES?

Why even cite these guys? They're unabashedly biased towards England. They ranked the University of Manchester as tied with the Ecole Polytechnique in Engineering. Do you realize that British engineering companies actively prefer French engineers over British engineers, largely due to the incredibly weak education system in the U.K.? THES isn't fooling anybody.

A better question is how many ordinary French universities beat out Oxbridge in engineering. —Preceding unsigned comment added by 78.190.159.108 (talk) 15:16, 12 December 2009 (UTC)

And your evidence for this against a respected educational journal is... 129.67.110.19 (talk) 18:54, 20 January 2010 (UTC)

Agreed as far as the general methodological favoring of anglophone universities is concerned. But the ranking stresses (the perceived impact of) an institution's scientific output, as opposed to the quality of its education. Thus, X (which is primarily an elite educational institution, not so much a research university) comes out rather mediocre. By the way, that's true for the ARWU and HEEACT rankings as well. 147.142.186.54 (talk) 13:24, 10 March 2010 (UTC)

Inaccurate comment

“But we note also that this survey establishes its rankings by appealing to university staff, even offering financial enticements to participate (see Appendix II). Staff are likely to feel it is in their greatest interest to rank their own institution more highly than others. This means the results of the survey and any apparent change in ranking are highly questionable, and that a high ranking has no real intrinsic value in any case. We are vehemently opposed to the evaluation of the University according to the outcome of such PR competitions.”

This objection is largely invalid - institutions cannot vote for themselves in the academic peer review. Also, the academic peer review would only be a 'PR competition' if the survey were composed of a random sample of people. In fact, the survey asks academics around the world to name the top institutions within their field of expertise (the idea being to get first-hand opinion from experts, and also to cover the gamut of subjects by surveying experts in a wide range of fields). I think it's fair to say that top academics base their opinion on actual knowledge and experience rather than just 'PR'. Similarly, the employer peer review asks employers which universities have provided them with top graduates - so it's actually a way of factoring educational outcomes into the rankings. Of course to an extent general reputation is likely to have an influence on employer responses, but this is arguably just as useful from a student perspective - it tells you which degrees are highly regarded by employers, thus giving an indication of the 'market value' of degrees from different institutions. Dannybyrne (talk) 13:57, 16 July 2010 (UTC)

Merger of THE-QS and QS ranking articles

Since the break between THE and QS, there have been two separate, almost entirely duplicative articles covering the same ranking: Times Higher Education-QS World University Rankings and QS World University Rankings. QS owns the historical data and methodology, and is going to continue to publish the previous ranking under the name "QS World University Rankings". Thus, this ranking essentially just changed names by dropping "THE" from the title. THE is creating its own, brand new ranking, the "Times Higher Education World University Rankings", which has its own, separate, third article. It seems obvious to me that the original Times Higher Education-QS World University Rankings needs to be merged into QS World University Rankings, but it has met resistance from an editor with what appears to be a WP:COI for THE. There seem to be editors on both sides with COIs. Those editors without COIs, and there seem to be few such editors in this topic, please add your comments on the merger to the discussion here. Thank you. CrazyPaco (talk) 20:02, 22 July 2010 (UTC)

G-Factor

I don't think this is notable. It appears to be a one-off publication that is only available on archive.org. The top 300 is also archived. --John Vandenberg (chat) 10:04, 25 July 2011 (UTC)

USN&WR bias

An unregistered editor is insisting on removing information describing the U.S. News & World Report rankings and replacing it with accusations that it is biased. First, I object to removing the description. Second, I object to inserting the bias allegations given the extraordinarily weak sources the unregistered editor has put forth (one is a blog post written by a college student and the other is some sort of crowd-sourced blog-like webpage). Thoughts? ElKevbo (talk) 05:47, 27 August 2011 (UTC)

I agree, highly controversial content such as this requires higher quality sources. Rangoon11 (talk) 13:08, 27 August 2011 (UTC)
ElK, you've been doing the right thing reverting him, but has anyone left a message on his talk page yet? HuskyHuskie (talk) 13:56, 27 August 2011 (UTC)
Thanks. I haven't left him or her a message and at this point it would probably be better if someone else did so it doesn't seem like I'm trying to antagonize him or her. ElKevbo (talk) 15:31, 27 August 2011 (UTC)
Done. By the way, while I certainly understand your hesitancy to approach the anon after several back-and-forths with the edits, this demonstrates why it's so good to welcome an editor the very first time that they edit, so that there is a basis for dialogue with them on issues that come up later. Is this guy a vandal? My guess is no; I think he's just someone with an opinion who has no idea what our policies are, and if we welcome him, we may get a valued editor. If he is a vandal, then our greeting makes it that much more certain that it's okay to string him up quickly. HuskyHuskie (talk) 16:12, 27 August 2011 (UTC)

European Union rankings

The second paragraph under European Union begins with "The report ..." but definitely does not seem to refer to the same report as discussed in the first paragraph. The first paragraph talks about a ranking in which the University of London was ranked first, while the second paragraph talks about a report in which Cambridge and Oxford were on top. There are no links or references in this second paragraph. What on earth is being discussed in this second paragraph? — Preceding unsigned comment added by 75.75.5.239 (talk) 15:12, 22 March 2014 (UTC)

Stanford editor(s) caught again posting fake rankings

Be aware that the Stanford editor(s) has put up another made up ranking on the Rankings of universities in the United States page and may try to add it here again. He calls it the "Acceptance Rate (Selectivity)" ranking. There was a "Top 100 Lowest Acceptance Rates" list put out by US News for 2013, but the Stanford editor is using this and a random website that has some acceptance rate data for 2014 to create his own original research ranking. I tried to appease him by renaming the section "Top 100 Lowest Acceptance Rates" and limiting it to the 2013 US News list, but he keeps reverting it to his original research ranking.--TDJankins (talk) 07:34, 6 January 2015 (UTC)

Belgium

Belgian universities Ghent University and Katholieke Universiteit Leuven are mentioned in almost every list, but Belgium isn't even mentioned here?

"Ghent consistently rates among the top universities not only in Belgium but also throughout the world" "With 55,484 students in 2014–2015, the KU Leuven is the largest university in Belgium and the Low Countries" "As of 2015-2016 academic year, KU Leuven ranks as 35th globally according to Times Higher Education,[5] 82nd according to QS World University Rankings[6] and 90th according to the Academic Ranking of World Universities.[7] KU Leuven is consistently considered to be within the top 100 universities of the world and in contention with Ghent University as the best Belgian university."

Ignoring the enormous bias towards English-speaking universities, both consistently rank among the top universities that are not from a country where the native language is English; the omission of both is strange.

78.23.197.184 (talk) 11:44, 8 December 2015 (UTC)

Keyser42soze (talk) 20:00, 24 January 2018 (UTC)

Thanks for the heads-up. I've 'rescued' the dead link with a link to archive.org. Robminchin (talk) 22:09, 24 January 2018 (UTC)

Various rankings mostly evaluating on institutional output by research.

Not sure what this sentence is trying to say. Rankings rely heavily on research as a measure of institutional output? Various rankings mostly evaluate institutional output by research? Institutional output - as measured by research - is weighted heavily in the ranking? Mathyeti (talk) 19:21, 24 March 2017 (UTC)

SCIMAGO ranking

Can you please add two more parameters: College_and_university_rankings#SCImago_Institutions_Rankings and Webometrics_Ranking_of_World_Universities? Both rankings are easily retrievable from the websites below maintained by the Spanish National Research Council:

http://www.scimagoir.com/

http://www.webometrics.info/en

Thank you! — Preceding unsigned comment added by 97.98.78.51 (talk) 02:02, 30 May 2018 (UTC)

Original Research Rankings

I removed two original research rankings made by Wikipedians, the "YAR" ranking and the "Selectivity" ranking. Not only are these original research, but there's also no way these Wikipedians evaluated every college in the US when they stitched together these rankings. I also removed them from the "Ranking of Universities in the United States" page, but a user named "Download" keeps putting them back.--TDJankins (talk) 20:17, 1 September 2014 (UTC)

Please see Talk:Rankings of universities in the United States for discussion. -download 03:20, 2 September 2014 (UTC)

Top rankings

Apologies if this has been proposed before. The most important worldwide rankings are The Times, the ARWU and QS. Shouldn't they be listed in their own section at the beginning of the article? The current setup gives the same weight to all global rankings, which is simply not the way the world works. Ladril (talk) 18:16, 23 September 2019 (UTC)

Someone did do that after your proposal, and it seems logical. The problem is, people will try to edit their own rankings out of the larger list into the selective one - someone had just done that with the Leiden ranking, for example. If there is a fourth ranking that should be there as the most read and influential, it should definitely be CWUR. Cheers.

213.245.146.78 (talk) 19:16, 28 September 2019 (UTC)

"It should definitely." If there is something that should definitely be in the top category, there definitely should be a reliable source listing it alongside the Times, QS, and Shanghai rankings. There is a source listing the Leiden ranking among those, so I would like to request that you please not take it out of the "most prestigious" list again without proper justification. Ladril (talk) 13:59, 29 September 2019 (UTC)
The phrase "some sources also include the Leiden ranking" remained unsourced. I have worked in international education for years and deal with dozens of these rankings every day, and I had never heard of the "Leiden ranking". I am sorry, but it just isn't one of the 3 or 4 top university rankings in the world. It is so obviously a very, very personal position of yours being defended here that if you add it again among the top rankings I will simply have to ask for external, formal intervention from Wikipedia moderators. Anyone who is somewhat familiar with the topic knows that the "Leiden ranking" isn't one of the top 3 or 4 university rankings in the world. 213.245.146.78 (talk) 20:24, 3 October 2019 (UTC)
You seem to be new to Wikipedia, and I understand (I was also new once). First, some practical advice: it's usually not a good idea to use your credentials as an argument. For one, there is no way to know if you actually have them (there have been widely publicized cases of people pretending to have degrees they did not, for example). For another, there is a Wikipedia guideline about verifiability (WP:V), which means that what matters for the purpose of editing articles is whether you can show that your information is actually accurate according to third-party sources. Professional credentials almost always take a back seat to this, so insisting on them is just wasting everyone's time.
Regarding your statement "The phrase "some sources also include the Leiden ranking" remained unsourced." Here is a sentence from one of the sources currently in the article:
"QS World University Ranking, THE World Ranking, ARWU, and CWTS Leiden Ranking are the leading university rankings frequently encountered in articles, newspapers, and promotional publications."[1]
I will not revert your edit at this time (though I believe I have a pretty good case for doing it, and I make no promises about the future). I can only advise more careful reading of sources in the future, to avoid further misunderstandings. Ladril (talk) 17:39, 4 October 2019 (UTC)

References

  1. ^ Olcay, Gokcen Arkali; Bulu, Melih (1 October 2017). "Is measuring the knowledge creation of universities possible?: A review of university rankings". Technological Forecasting and Social Change. 123: 153–160. doi:10.1016/j.techfore.2016.03.029.

"Tier four university" listed at Redirects for discussion

A discussion is taking place to address the redirect Tier four university. The discussion will occur at Wikipedia:Redirects for discussion/Log/2021 January 26#Tier four university until a consensus is reached, and readers of this page are welcome to contribute to the discussion. Shhhnotsoloud (talk) 09:15, 26 January 2021 (UTC)

"Tier three university" listed at Redirects for discussion

A discussion is taking place to address the redirect Tier three university. The discussion will occur at Wikipedia:Redirects for discussion/Log/2021 January 26#Tier three university until a consensus is reached, and readers of this page are welcome to contribute to the discussion. Shhhnotsoloud (talk) 09:15, 26 January 2021 (UTC)

"Tier two university" listed at Redirects for discussion

A discussion is taking place to address the redirect Tier two university. The discussion will occur at Wikipedia:Redirects for discussion/Log/2021 January 26#Tier two university until a consensus is reached, and readers of this page are welcome to contribute to the discussion. Shhhnotsoloud (talk) 09:15, 26 January 2021 (UTC)

"Tier one university" listed at Redirects for discussion

A discussion is taking place to address the redirect Tier one university. The discussion will occur at Wikipedia:Redirects for discussion/Log/2021 January 26#Tier one university until a consensus is reached, and readers of this page are welcome to contribute to the discussion. Shhhnotsoloud (talk) 09:18, 26 January 2021 (UTC)

GLM internet rankings

Notwithstanding the question of whether internet & blogosphere rankings are appropriate for a Wikipedia article on academic rankings (why not include football rankings as well, then?), the GLM rankings seem to be nothing more than a vehicle for the promotion of their TrendTopper MediaBuzz Reputation Management service for colleges and universities (see here). Is this actually a legitimate ranking to include? Certainly it doesn't warrant the space it currently consumes in the article, and it is in definite need of a conflict-of-interest disclosure. CrazyPaco (talk) 02:43, 24 August 2009 (UTC)

IP edits from 70.116.22.174 suggest WP:PROMOTION is being violated by GLM. I removed the rankings because an organization ranking schools by a proprietary system and then selling services to improve an institution's place in those same rankings does not suggest a neutrality necessary for a non-biased, academically-honest ranking system. CrazyPaco (talk) 07:24, 30 October 2009 (UTC)

GLM uses citations from the top 5,000 global print and electronic news media, as well as the internet, etc. Academics have no problem here; in fact they see this as an unbiased snapshot, since it cannot be influenced because the sample size is so large. See this article in the Harvard Crimson, http://www.thecrimson.com/article.aspx?ref=524332. The ranking, like all the others, is of a college or university's reputation. You can't influence a handful of deans (US NEWS), or change the rules of your SAT score statistics (Baylor, 2009), or enhance your admission selectivity by including spurious apps to increase your yield.

The Harvard article contains three relevant points:

1. "Indicating that prestige and media coverage seem to be correlated, six of the universities on the Global Language Monitor list also made the U.S. News top ten."

2. "'Harvard is a brand name,' said Alex S. Jones, a former New York Times journalist who is the director of the Shorenstein Center on the Press, Politics, and Policy. 'It's the oldest and best known university in the United States.'"

3. "Indeed, the study seems to validate the Harvard Kennedy School's recent decision to rebrand itself. Known as the Kennedy School of Government until last spring, the public policy and administration school changed its shorthand so that it includes the word 'Harvard.' 'The new shorthand reference...is serving to reinforce that we are part of the Harvard University family of premier graduate schools,' said Doug Gavel, a Kennedy School spokesman."

Also, colleges see no link between distinguishing themselves among their six or eight peers and the rankings themselves. By the way, every ranking service has related services, from the College Board to US News.

In summary, GLM is using an established brand-ranking technique for ranking a college's most important asset: its reputation.

If you feel you must cut the size of the citation, fine, but excluding it for issues that colleges find non-existent seems a bit much. —Preceding unsigned comment added by 70.116.22.174 (talk) 13:40, 3 November 2009 (UTC)

The Harvard Crimson is a student newspaper, and none of the points brought forward are relevant to the conflict of interest criticisms or blatant WP:Advertising violations. Further, those points don't legitimize the rankings for use in an academic context. There has been no study to correlate academic reputation/ranking with media saturation, and GLM's method is not established, as it is proprietary and not public. No such correlation is provided by the Crimson article or by GLM on its website (the tiny 6/10 top 10 US News sample is not statistically significant). Further, because the methodology is not publicly available, it cannot be scrutinized. There is no way to know whether a school's mentions in the media take into account positive or negative mentions of the institution, how the rankings correlate with university size or research scope, what media and on-line materials are screened (e.g. how is MySpace vs Pubmed utilized?), if the methodologies screen for multiple names or university nicknames, if the methodology accounts for geographical name/location that may confound results, if paid university advertisements are screened out, etc., etc., etc. In light of the comment about US News' rankings being based on a "handful of deans", where in actuality 25% of the score comes from surveys sent to 4,273 presidents, provosts, and deans of admission, GLM's TrendTopper methodology can only be assumed to be based on Paul JJ Payack's sole preference as to which criteria are to be included in his secret ranking system, which does not seem open to scrutiny or criticism. Further, in the Crimson article, there is no mention that the rebranding decision by Kennedy had anything to do with the GLM ranking. If this student newspaper article is the basis for GLM's claim that "Harvard...[has] used the rankings as a validation of their recent reputation management decisions", that would be, and I'm being generous, a very misleading statement.
Please provide links or evidence that other rankings are selling services to institutions to enhance their place in those same rankings. I'd especially be interested to find out what services US News & World Report, The Center for Measuring University Performance, Forbes, Times Higher Education, etc. have for sale to educational institutions in order to "enhance their standings" in their published rankings. It would be great to provide such information in the article, as it would provide important conflict of interest criticisms in their respective sections of the article. Speaking of which, appallingly, in all of GLM's press releases, not once does it disclose that some of the ranked universities may actually be clients. I guess this is not surprising, as in other Global Language Monitor endeavors, "academics" do have large issues with the organization's use of its proprietary PQI algorithm in word counting; it has been intensely criticized by linguists[1] and has even been referred to as a "complete fraud" by Geoffrey Nunberg of Cal-Berkeley. The conflict of interest in GLM peddling its TrendTopper "Reputation Management Service" to improve an institution's place in its own proprietary rankings makes the objectivity of the ranking highly suspect, and the IP edits emanating from GLM to whitewash any mention of those flaws do not aid its legitimacy. CrazyPaco (talk) 22:56, 3 November 2009 (UTC)

Here's a suggestion: adding this to the Other Rankings area "Global Language Monitor ranks the top 200 colleges and universities according to their appearances in the global print and electronic media, the Internet, blogs and social media. In 2008, Harvard topped the Universities list, and Colorado College, the colleges list. In fall 2009, the University of Michigan and Wellesley College topped the lists."

This Reuters article is making the rounds: http://www.reuters.com/article/newsOne/idUSTRE59T4FJ20091030.

CollegeCritic —Preceding undated comment added 13:56, 3 November 2009 (UTC).

I went ahead and added that, but I removed the ranking "winners" because that is not within the scope or appropriate context of the article. We'll see how this goes, but I don't see the benefit of continuing to include the questionable GLM TrendTopper rankings (and providing them with their own internet Media Buzz) if these issues are going to be promptly edited out by GLM employees. Its own article reeks badly enough of WP:Advertising. However, here's a suggestion for GLM to make your rankings legitimate: disclose your blatant conflict of interest in your press releases: that you actually sell a service to increase the rankings of the institutions, and that some of the institutions in your rankings may be clients. Disclose the methodologies and criteria included in your rankings. CrazyPaco (talk) 22:56, 3 November 2009 (UTC)

As a long-time veteran of the tech battlefields of Silicon Valley, it does appear to me that GLM discloses the fundamentals of the PQI in a PowerPoint presentation on its site. (See http://www.languagemonitor.com/wp-content/uploads/2008/10/trendtopper-mediabuzz-overview-1014081.pdf.) In Silicon Valley parlance, all algorithms are by definition proprietary; they are the highly valued IP of the company. We will, however, disclose much more of our processes and methodologies if you sign a non-disclosure agreement (NDA). Typically NDAs do not allow you to disclose the IP to any other party for a minimum of 18 months to 3 years. The presentation on the site looks like it discloses the components of the methodology but, of course, not the precise formulas. Again, all standard in high tech. One other point: the list of linguists appears quite impressive, but as you probably know, you could line up any number of Silicon Valley luminaries to loudly and publicly disparage any number of other Silicon Valley luminaries (e.g., Gates vs. Jobs, Ellison vs. McNealy, etc.) in the harshest possible language.

Evidently, there are strong wills on both sides of these arguments. I am new to this, so I might be naive here, but I modified the article in an attempt to create a balanced (and proper) encyclopedia entry. AnalyticsGuy (talk) 22:52, 10 November 2009 (UTC)

I noticed your balanced entry removes any inclusion of criticism. Despite how it is done in Silicon Valley, GLM, with a business model based purely around its proprietary PQI and an obvious knack for self-promotion, should not expect to be viewed similarly to rankings performed in an academic manner, with full disclosure of methodology, the ability to test that methodology through replication, data sets largely drawn from publicly available "Common Data Sets" for each university, and without the glaring conflicts of interest. At least they state in their releases now that they do not influence rankings (possibly because of complaints lodged here?). That is an improvement, to be up front with that disclosure, and I think we can give them the benefit of the doubt, although there is no way to actually verify it. In any case, they can in no way be considered academic rankings, and the relationship of the rank to academics can be easily dismissed when you see the University of Phoenix is number 37 among universities, among other anomalies. What is good is that they seem to be downplaying the relationship to academics and concentrating on what they are actually measuring, but they are still asserting wild relationships between things like Harvard dropping in their PQI rankings and the hit on Harvard's endowment, which, percentage-wise, was not much different than the hit every other university's endowment took, and which is still much larger than all other universities'. Such statements not only come off as ridiculous, especially when not supported by an accompanying study to attempt to validate such a relationship, but will continue to be a reason GLM is ridiculed in academic circles. They need to stick to releasing their results and stop the hollow interpretations, which, I assume, are meant to make their results seem more interesting or valid. 
I'm fine with their ranking of internet pervasiveness, which raises the question of whether they belong in this article at all (same for G-factor, etc.), but if they try to pass it off as anything more, it is not going to fly. It is fairly obvious where all the edits for GLM are coming from, and I highly caution about Wikipedia:Conflict of interest. CrazyPaco (talk) 01:01, 11 November 2009 (UTC)

WURI Ranking and evidence of notability

I propose to include this new ranking system.

The World's Universities with Real Impact (WURI) ranking, developed by the Hanseatic League of Universities and the Institute for Policy and Strategy on National Competitiveness (IPSNC), is a new system that aims to measure universities' performance in creating value for society. Rather than relying on quantitative metrics, it relies on qualitative evaluation of innovative programs at universities. The ranking is composed of a global top 100 of "Innovative Universities" and top 50s in each of four areas: Industrial Application, Entrepreneurial Spirit, Ethical Value, and Student Mobility and Openness.[1]

List of some reputable sources, press releases and related articles:

  • UNITAR & Partners co-host launch of the 'World's Universities with Real Impact' (WURI) Ranking | UNITAR (unitar.org)[2]
  • A New WURI Ranking of Innovative Universities Released by Four International Organizations: HLU, UNITAR, FUS and IPSNC (prnewswire.com)[3]
  • South Korea's Research Institute, IPSNC, Shares Kaggle Data and Releases a New Ranking for Innovative Universities: World Universities With Real Impact (WURI) for 2020 (prnewswire.com)
  • CKGSB Ranked Top 50 in the World's Universities with Real Impact (WURI) Ranking 2020 (prnewswire.com)
  • WURI rankings: University of Split ranks 15th in ethics category | Croatia Week (crotiaweek.com)[4]
  • New global ranking positions SFU high for innovation, entrepreneurial spirit - SFU News - Simon Fraser University (sfu.ca)[5]
  • SFU named #1 in global university ranking for impact on sustainable cities and communities - SFU News - Simon Fraser University (sfu.ca)
  • Franklin College also among the best universities - Ticinonline (tio.ch)[6]
  • ULAB amongst world’s top 50 universities for innovation, creativity | Dhaka Tribune (dhakatribune.com)[7]
  • ULAB celebrates ranking among top 50 global universities | Dhaka Tribune (dhakatribune.com)
  • ‘Media Studies departments should review and revise their curriculum every four years’ | Dhaka Tribune (dhakatribune.com)
  • ULAB ranks among world’s top 50 universities for innovation, creativity | (bdnews24.com)
  • ULAB celebrates achieving WURI ranking (newagebd.net)
  • ULAB among world’s top 50 universities with real impact (newagebd.net)
  • Tra Vinh University enters the top 100 of the WURI Ranking 2020 - Báo Thái Bình điện tử (baothaibinh.com.vn)[8]
  • TVU - Top 200 green, sustainable universities in the world (nongnghiep.vn)
  • Tra Vinh University makes the top 100 of the WURI Ranking 2020 (nongnghiep.vn)
  • Tra Vinh University earns multiple international quality accreditations (bnews.vn)
  • Tra Vinh University named in 2020 World’s Universities with Real Impact rankings | Society | Vietnam+ (VietnamPlus.com)
  • WURI ranking: blind evaluation by 25 university presidents from Korea and abroad… Stanford 1st and Seoul National University 15th overall - Chosun Ilbo (chosun.com)[9]
  • Incheon National University joins drive to innovate university ranking system (koreaherald.com)
  • Seoul Women's University ranks 34th in the ethical value category for the world's innovative universities (donga.com)
  • "A small regional university's revolt? It grew stronger by embracing the local community" - Chosun Ilbo (chosun.com)
  • Chosun Edu :: Premium education media (chosun.com)
  • 2020 WURI world innovative universities: Seoul National University 1st in Korea; Incheon National University and Hankuk University of Foreign Studies in the top 3 - Veritas Alpha (veritas-a.com)
  • Seoul Women's University takes 34th place worldwide in the "Ethical Value" category of the 2020 WURI ranking - 한국대학신문 (unn.net)
  • INU aims to stand at center of Northeast Asian education (koreatimes.co.kr)
  • FEU only PH institution in innovative universities list – The Manila Times (manilatime.net)[10]
  • FEU makes it to innovative universities global list; Stanford, MIT 1-2 | ABS-CBN News (abs-cbn.com)
  • FEU Ranks 91st Most Innovative University in the WURI 2020 Global List | Tatler Philippines (asiatatler.com)
Press releases do little to convince us that a subject is noteworthy as they're obviously promotional. It's also impossible to evaluate any of these sources as you haven't provided links to any of them. But if they simply repeat what the ranking's publisher has said or what institutions included in the ranking have said then they do little to establish that the ranking is noteworthy.
You are also clearly engaged in an edit war to add this material and that's completely unacceptable; please revert your edit while it's being discussed.
Finally, you are also engaged in an edit war to add this ranking to the article of the institution at which you are currently a student; this is a clear conflict of interest. I strongly recommend you also revert that edit and take a few steps back from editing these articles to add this material. ElKevbo (talk) 17:23, 28 December 2020 (UTC)
As ElKevbo says, these all appear to be press releases. To establish notability you need actual articles written about the ranking in generally reliable, third-party sources. Not about how a local institution has done in the ranking (which is probably actually press-release churnalism) but about the actual ranking. Robminchin (talk) 06:13, 31 December 2020 (UTC)