
Survey of the Academic Ranking of

"Beni-Suef University"

Compared to First 20 Egyptian Universities

By

Prof. Sayed Kaseb

Faculty of Engineering, Cairo University

Submitted to:

Prof. Tarif Shawky Mohamed Farag

Vice President of Beni-Suef University

for Graduate Studies and Research

July 2015


Table of Contents

1. Beni-Suef University
   1.1 Mission and Vision
   1.2 Strategic Goals
2. Ranking Overview
3. Ranking Methodology and Criteria
   3.1. Webometrics Ranking
   3.2. Journals Consortium Ranking
   3.3. The U.S. News Ranking, Arab Region
4. Rank of Beni-Suef University
   4.1. Rank according to Webometrics Ranking
   4.2. Rank according to Journals Consortium Ranking
   4.3. Rank according to The U.S. News Ranking, Arab Region
Appendices
   I: Berlin Principles
   II: List of International, Regional and National Universities Rankers


1. Beni-Suef University

Current Website: http://www.bsu.edu.eg/EnglishUniversityWebsite/

1.1 Mission and Vision

1. Vision

Through self-evaluation of its educational programs, Beni-Suef University seeks to develop its educational process continuously, promote the quality of performance, and attain an outstanding position among local, regional, and international universities.

2. Mission

1. Preparing qualified graduates fully equipped with scientific knowledge and experience in different fields of specialization, enabling them to offer, create, and compete in the labor market, whether locally, regionally, or internationally.

2. Providing new generations of remarkable scientists, intellectuals, and men of letters who can participate effectively in society by promoting its culture and environment.

3. Reinforcing the production sectors for global competition through distinguished academic programs and standards.

1.2 Strategic Goals

The university seeks to achieve its mission through the following:

1. Continuous development of educational and postgraduate programs to cope with the requirements of the present time and to produce well-qualified graduates.

2. Effective use of the available and renewable university facilities to promote the scientific and practical levels of graduates and to sustain the development of societal services.

3. Upgrading the university administrative system into an electronic one that connects not only the university's faculties with one another but also with other national, regional, and international educational associations.

4. Providing new, prominent educational programs and institutes to enable the university to achieve self-reliance.


5. Establishing specialized scientific centers for effective training and societal service that strengthen cultural progress.

6. Creating an atmosphere of excellence and innovation and deepening the noble values that maintain the fabric of society, national loyalty, ideal examples, and good ethics.

7. Promoting the community on scientific grounds and resolving, through directed research projects, the problems that impede its success.

2. Ranking Overview

In 2006 the International Ranking Expert Group (IREG) adopted a set of principles to assure quality and good practice in the ranking of higher education institutes – the so-called Berlin Principles on Ranking of Higher Education Institutions [IREG].

The most important guidelines proclaim that rankings should:

1. Recognize the diversity of institutions and take the different missions and goals of institutions into account
2. Specify the linguistic, cultural, economic, and historical contexts of the educational systems being ranked
3. Be transparent regarding the methodology used for creating the rankings
4. Measure outcomes in preference to inputs whenever possible
5. Use audited and verifiable data whenever possible
6. Apply measures of quality assurance to the ranking processes themselves
7. Provide consumers with a clear understanding of all of the factors used to develop a ranking, and offer them a choice in how rankings are displayed

These guidelines are quite ambitious: to date there is hardly any implementation that fully complies with them. They are nevertheless important, as they are quite promising for bringing order into the heterogeneous network of universities worldwide.

University rankings have recently become quite popular all over the world. They are genuinely useful, as they offer the customer an easy comparison of different institutions, stimulate competition between the candidates, and give investors hints about their sponsoring targets.


The core of ranking is establishing comparability among higher education institutions. This comparability is based on methods and techniques intended to identify the best institutions in their overall performance and prospective fields, using objective and transparent data. Competition is also the rationale underlying ranking efforts; in fact, competition has been a characteristic element of the sciences and humanities for centuries, both for those directly involved in research and for their institutions.

Ranking has taken new forms in the last decades and, in this process, has increasingly been used as a new dimension of measuring quality in the higher education sector. Since research does not stop at national boundaries, internationality is an integral element of research.

As one can think of very different priorities, it is not easy to compare universities. In order to make the process of ranking transparent and credible, it is essential that the ranking is based on objective data, such as statistics.

The most common rankings are based on scientific performance, normally described by bibliometric indicators, which show the quality of research. From these analyses one can see where innovations might evolve and where the best researchers might be found, which is of course very important. But one can also think about rankings based on professors' supervising quality, which might show where the quality of education is good. This is perhaps even more interesting from the students', and perhaps also the industrial, perspective. These two rankings usually don't fit together, and additionally there are many different subjects. So these two points are important to remember:

1. It is not easy to generate a general ranking.
2. A general ranking might not be relevant for a special application.

The following sections present an intensive survey of the "Academic Ranking of Beni-Suef University" in comparison with the "First 20 Egyptian Universities". It should be stated that there are many other "International Universities Rankers", but they are not included in this survey because they publish short lists that do not contain "Beni-Suef University".


3. Ranking Methodology and Criteria

There are many rankers in the world; however, this report covers only those whose ranking tables include "Beni-Suef University". The information reported in this section was extracted from the website of each ranker in July 2015.

3.1. Webometrics Ranking

Website: http://www.webometrics.info/en

(Published twice a year, in January and July)

Methodology: http://www.webometrics.info/en/Methodology

The Ranking Web, or Webometrics, is the largest academic ranking of higher education institutions. Since 2004, every six months, an independent, objective, free, open scientific exercise has been performed by the Cybermetrics Lab (Spanish National Research Council, CSIC) to provide reliable, multidimensional, updated and useful information about the performance of universities from all over the world, based on their web presence and impact.

1. History

The Cybermetrics Lab has been developing quantitative studies on the academic web since the mid-nineties. A first indicator was presented during the EASST/4S conference in Bielefeld (1996), and the collection of web data from European universities started in 1999, supported by the EU-funded project EICSTES. These efforts are a follow-up of our scientometric research started in 1994, which has been presented at the conferences of the International Society for Scientometrics and Informetrics (ISSI, 1995-2011) and the International Conferences on Science and Technology Indicators (STI-ENID, 1996-2012) and published in high-impact journals (Journal of Informetrics, Journal of the American Society for Information Science and Technology, Scientometrics, Journal of Information Science, Information Processing & Management, Research Evaluation and others). In 1997 we started the edition of an all-electronic open access peer-reviewed journal, Cybermetrics, devoted to the publication of webometrics-related papers.

In 2003, after the publication of the Shanghai Jiao Tong University breakthrough ranking, the Academic Ranking of World Universities (ARWU), we decided to adopt the main innovations proposed by Liu and his team: the ranking would be built from publicly available web data, combining the variables into a composite indicator, and with true global coverage. The first edition was published in 2004; it has appeared twice per year since 2006, and since 2008 the portal also includes webometrics rankings for research centers, hospitals, repositories and business schools.

2. Objectives and motivation

The original aim of the Ranking is to promote academic web presence, supporting the Open Access initiatives to increase significantly the transfer of scientific and cultural knowledge generated by the universities to society as a whole. In order to achieve this objective, the publication of rankings is one of the most powerful and successful tools for starting and consolidating processes of change in academia, increasing the scholars' commitment and setting up badly needed long-term strategies.

The objective is not to evaluate websites, their design or usability, or the popularity of their contents according to the number of visits or visitors. Web indicators are considered proxies in the correct, comprehensive, deep evaluation of the university's global performance, taking into account its activities and outputs and their relevance and impact.

In the end, a reliable rank is only possible if the web presence is a trustworthy mirror of the university. In the second decade of the 21st century the Web is key to the future of all the university missions, as it is already the most important scholarly communication tool, the future channel for off-campus distance learning, the open forum for community engagement and the universal showcase for attracting talent, funding and resources.

3. Philosophy and justification

Webometrics publishes only a single Ranking of Universities in each edition. The combination of indicators is the result of careful investigation and is not open to individual choice by users without enough knowledge or expertise in this field. Other publishers provide series of very different rankings using exactly the same data in different fashions, which is completely useless and very confusing.

Webometrics is a ranking of all the universities of the world, not only a few hundred institutions from the developed world. Of course, "world-class" universities usually are not small or very specialized institutions. Webometrics is continuously doing research to improve the ranking, changing or evolving the indicators and the weighting model to provide a better classification. It is a shame that a few rankings maintain stability between editions without correcting errors or tuning up indicators.

Rankings backed by a for-profit company exploiting rank-related business, or with strong political links reflected in individual ranks, should be checked with care. Research-only (bibliometrics-based) rankings are biased against technologies, computer science, social sciences and humanities, disciplines that usually account for more than half of the scholars and students in a standard comprehensive university. Webometrics also measures, in an indirect way, other missions like teaching or the so-called third mission, considering not only the scientific impact of the university's activities but also the economic relevance of technology transfer to industry, community engagement (social, cultural, environmental roles) and even political influence.

Webometrics uses link analysis for quality evaluation, as it is a far more powerful tool than citation analysis or global surveys. Bibliometrics only counts formal recognition between peers, while links include not only bibliographic citations but also third parties' involvement with university activities. Surveys are not a suitable tool for world rankings, as there is not even a single individual with a deep (several semesters per institution), multi-institutional (several dozen), multidisciplinary (hard sciences, biomedicine, social sciences, technologies) experience of a representative sample (different continents) of universities worldwide.

Research output is also a key topic for Webometrics, including not only formal publications (e-journals, repositories) but also informal scholarly communication. Web publication is cheaper while maintaining the high quality standards of peer-review processes. It can also reach much larger potential audiences, offering access to scientific knowledge to researchers and institutions located in developing countries, and also to third parties (economic, industrial, political or cultural stakeholders) in their local community.

We intend to motivate both institutions and scholars to have a web presence that accurately reflects their activities. If the web performance of an institution is below the expected position according to its academic excellence, university authorities should reconsider their web policy, promoting substantial increases in the volume and quality of their electronic publications.

Candidate students should use additional criteria when trying to choose a university. The Webometrics ranking correlates well with the quality of education provided and with academic prestige, but other non-academic variables need to be taken into account.

4. Composite indicators and Web Impact Factor

Probably one of the major contributions of the Shanghai Ranking was to introduce a composite indicator, combining a series of indicators with a weighting system. Traditional bibliometric indexes are built on ratios, like Garfield's Journal Impact Factor, which, being based on variables that follow power-law distributions, is useless for describing large and complex scenarios. The Ingwersen proposal in 1997 of a similarly designed Web Impact Factor (WIF), using a links/webpages (L/W) ratio, is equally doomed by the mathematical artifacts it generates.

Following the Shanghai model, we developed an indicator transforming the ratio L/W into the formula aL + bW, where L and W should be normalized in advance and a and b are weights adding up to 100%. We strongly discourage the use of the WIF due to its severe shortcomings. The composite indicator can be designed with different sets of variables and weightings according to the developer's needs and models.


5. Design and Weighting of Indicators

Webometrics uses an "a-priori" scientific model for building the composite indicator. Other rankings choose arbitrary weights for strongly dependent variables and even combine raw values with ratios. None of them follow a logical ratio between activity-related and impact-related variables, i.e. each group representing 50% of the total weighting. As for the individual variables, some of them have values larger than zero for only a few universities, and others segregate universities according to differences so small that they are even lower than their error rates. Prior to combination the values should be normalized, but the practice of using percentages is mostly incorrect due to the power-law distribution of the data.

Webometrics log-normalizes the variables before combining them according to a 1:1 ratio between the activity/presence and visibility/impact groups of indicators.
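
As a rough illustration, a minimal sketch of this kind of log-normalized composite indicator follows. It is not the actual Webometrics code: the 50/50 weights and the two indicator groups come from the description above, while the function names, the 1 + v offset and the toy data are assumptions.

    import math

    def log_normalize(values):
        """Log-transform raw counts, then scale to [0, 1] by the maximum.

        The log compresses power-law data (links, webpages) so that a few
        giant universities do not dominate the scale.
        """
        logged = [math.log(1 + v) for v in values]  # 1 + v guards against zero
        top = max(logged)
        return [x / top for x in logged]

    def composite_scores(visibility_raw, activity_raw, a=0.5, b=0.5):
        """Combine two log-normalized indicator groups as a*L + b*W, a + b = 1."""
        L = log_normalize(visibility_raw)  # visibility/impact group (50%)
        W = log_normalize(activity_raw)    # activity/presence group (50%)
        return [a * l + b * w for l, w in zip(L, W)]

    # Toy data for three hypothetical universities
    print(composite_scores([120000, 5000, 800], [900000, 40000, 10000]))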


The current composite indicator is built as follows:

1. Visibility (50%)

IMPACT. The quality of the contents is evaluated through a "virtual referendum", counting all the external inlinks that the university webdomain receives from third parties. Those links recognize the institutional prestige, the academic performance, the value of the information, and the usefulness of the services introduced in the webpages, according to the criteria of millions of web editors from all over the world. The link visibility data is collected from the two most important providers of this information: Majestic SEO and ahrefs. Both use their own crawlers, generating different databases that should be used jointly for filling gaps or correcting mistakes. The indicator is the product of the square root of the number of backlinks and the number of domains originating those backlinks, so not only link popularity is important, but even more so link diversity. The maximum of the normalized results is the impact indicator.
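
Read literally, the per-provider score could be sketched as below. This is a hedged reading, not published Webometrics code: interpreting "the product of the square root of the number of backlinks and the number of domains" as sqrt(backlinks × domains) is an assumption, as are the function names and toy numbers.

    import math

    def impact_indicator(majestic, ahrefs):
        """Per-provider score sqrt(backlinks * referring_domains), normalized
        per provider; the maximum over the two providers is the impact value."""
        def scores(pairs):
            s = [math.sqrt(b * d) for b, d in pairs]
            top = max(s)
            return [x / top for x in s]
        return [max(m, a) for m, a in zip(scores(majestic), scores(ahrefs))]

    # (backlinks, referring_domains) per university, toy numbers
    print(impact_indicator([(2000000, 15000), (50000, 900)],
                           [(1800000, 14000), (60000, 1100)]))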

2. Activity (50%)

PRESENCE (1/3). The total number of webpages hosted on the main webdomain (including all subdomains and directories) of the university, as indexed by the largest commercial search engine (Google). It counts every webpage, including all the formats recognized individually by Google, both static and dynamic pages and other rich files. It is not possible to have a strong presence without the contribution of everybody in the organization, as the top contenders are already able to publish millions of webpages. Having additional domains, or alternative central ones for foreign languages or marketing purposes, is penalized in this indicator and is also very confusing for external users.

OPENNESS (1/3). The global effort to set up institutional research repositories is explicitly recognized in this indicator, which takes into account the number of rich files (pdf, doc, docx, ppt) published on dedicated websites according to the academic search engine Google Scholar. Both the total records and those with correctly formed file names are considered (for example, Adobe Acrobat files should end with the suffix .pdf). The objective is to consider recent publications, currently those published between 2008 and 2012 (new period).

EXCELLENCE (1/3). Academic papers published in high-impact international journals play a very important role in the ranking of universities. Using simply the total number of papers can be misleading, so we restrict the indicator to only the excellent publications, i.e. the university's scientific output that is part of the 10% most cited papers in their respective scientific fields. Although this is a measure of the high-quality output of research institutions, the data provider, the Scimago group, supplied non-zero values for more than 5200 universities (period 2003-2010). In future editions it is intended to match the counting periods between the Scholar and Scimago sources.


3.2. Journals Consortium Ranking

Website: http://journalsconsortium.org/

(Published once a year)

Methodology: http://ranking.journalsconsortium.org/about_the_ranking#methodology

Journals Consortium's mission is to evaluate universities (and other higher institutions) and research journals based on their publications and citations, as well as to bring research articles within everyone's reach.

The Total Influence Factor (TIF) is the sum of the Research Publication and Citation score plus the Internet/Web Presence score. The Research Publication and Citation (RSC) score is directly proportional to the number of publications and citations of a university or higher institution over the previous 5 years. Journals Consortium utilizes the publication and citation scores available on Google Scholar. The Internet/Web Presence (IWP) score is likewise directly proportional to the number of times the university or higher institution appears on the Internet.

RSC ∝ number of publications and citations of the university over the previous 5 years
IWP ∝ number of times the university or higher institution appears on the Internet
TIF = RSC + IWP

The Internet/Web Presence score carries less weight than the Research Publication and Citation score. The scores are normalized by a logarithm transformation for portability.
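
A minimal sketch of this formula, under stated assumptions: Journals Consortium does not publish its exact weights, so the 0.7/0.3 split below only encodes "IWP carries less weight than RSC", and the log(1 + x) normalization, function names and toy numbers are likewise assumptions.

    import math

    def total_influence_factor(pubs_and_cites, web_mentions,
                               w_rsc=0.7, w_iwp=0.3):
        """TIF = RSC + IWP, with log-normalized, weighted components."""
        rsc = w_rsc * math.log(1 + pubs_and_cites)  # Google Scholar counts, last 5 years
        iwp = w_iwp * math.log(1 + web_mentions)    # appearances on the Internet
        return rsc + iwp

    print(total_influence_factor(pubs_and_cites=25000, web_mentions=400000))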


3.3. The U.S. News Ranking

Website: http://www.usnews.com/education

(Published once a year)

Methodology (Nov. 25, 2014): http://www.4icu.org/about/index.htm#ranking

The inaugural U.S. News Best Global Universities rankings were produced to provide insight into how universities compare globally. As an increasing number of students plan to enroll in universities outside of their own country, the Best Global Universities rankings – which focus specifically on schools' academic research and reputation overall, not on their separate undergraduate or graduate programs – can help those students accurately compare institutions around the world.

The Best Global Universities rankings also provide insight into how U.S. universities – which U.S. News has been ranking separately for the last 30 years – stand globally. All universities can now benchmark themselves against schools in their own country and region, become more visible on the world stage and find top schools in other countries to consider collaborating with.

The overall Best Global Universities rankings encompass the top 500 institutions spread across 49 countries. The first step in producing these rankings, which are powered by Thomson Reuters InCites™ research analytics solutions, involved creating a pool of 750 universities that was used to rank the top 500 schools.

To be included in the 750, an institution first had to be among the top 200 universities in the results of Thomson Reuters' global reputation survey, described further below. Next, an institution had to be among those that had published the largest number of articles during the most recent five years, de-duplicated with the top 200 from the reputation survey.

As a result of these criteria, many stand-alone graduate schools, including the Rockefeller Institute of New York and the University of California—San Francisco, were eligible to be ranked and were included in the ranking universe.

The second step was to calculate the rankings using the 10 indicators and weights that U.S. News chose to measure global research performance. Each school's profile page on usnews.com lists numerical ranks, out of 750, for the 10 indicators, allowing students to compare each school's standing in each indicator.

The indicators and their weights in the ranking formula are listed in the table below, with related indicators grouped together; an explanation of each follows.

Ranking indicator                                                            Weight
Global research reputation                                                    12.5%
Regional research reputation                                                  12.5%
Publications                                                                  12.5%
Normalized citation impact                                                      10%
Total citations                                                                 10%
Number of publications that are among the 10 percent most cited               12.5%
Percentage of total publications that are among the 10 percent most cited       10%
International collaboration                                                     10%
Number of Ph.D.s awarded                                                          5%
Number of Ph.D.s awarded per academic staff member                                5%

Reputation Indicators

Results from Thomson Reuters' Academic Reputation Survey were used to create the two reputation indicators used in our ranking analysis. The survey, which aimed to create a comprehensive snapshot of academics' opinions about world universities, had respondents give their views of the disciplinary programs with which they were familiar. This method allowed respondents to rate universities at the field and department level, rather than at the institution level, creating a more specific and accurate measurement of a university's reputation as a whole.

In order to appropriately represent all regions, Thomson Reuters took steps to overcome language bias, differing response rates and the geographic distribution of researchers. These steps included:

1. Sending an invitation-only survey to academics selected from Thomson Reuters' databases of published research, based on the estimated geographic proportions of academics and researchers across the globe
2. Providing accessibility in 10 languages
3. Rebalancing the survey's final results based on the geographic distribution of researchers in order to overcome differing response rates

The results of the survey were used in two separate ranking indicators as follows.

1. Global research reputation (12.5 percent): This indicator reflects the aggregation of the most recent five years of results of the Academic Reputation Survey for the best universities globally for research.

2. Regional research reputation (12.5 percent): This indicator reflects the aggregation of the most recent five years of results of the Academic Reputation Survey for the best universities for research in the region; regions were determined based on the United Nations definition. This indicator had the effect of significantly increasing the international diversity of the rankings, since it focused on measuring academics' opinions of other universities within their region. This is the first time this indicator has been used in any global ranking.

Bibliometric Indicators

The bibliometric indicators used in our ranking analysis are based on data from the Web of Science™ for the five-year period from 2008 to 2012. The Web of Science™ is a web-based research platform that covers more than 12,000 of the most influential and authoritative scholarly journals worldwide in the sciences, social sciences, and arts and humanities.

3. Publications (12.5 percent): This is a measure of the overall research productivity of a university, based on the total number of scholarly papers (reviews, articles and notes) that contain affiliations to the university and are published in high-quality, impactful journals. This indicator is closely linked to the size of the university. It is also influenced by the discipline focus of the university, as some disciplines, particularly medicine, publish more than others.

4. Normalized citation impact (10 percent): The total number of citations per paper represents the overall impact of the research of the university and is independent of the size or age of the university; the value is normalized to overcome differences in research area, the publication year of the paper and the publication type. NCI is considered one of the core measures of research performance and is used by various research evaluation bodies globally. The subject fields used in the analysis came from Thomson Reuters' InCites™ product, which helps institutions evaluate research output, performance and trends; understand the scope of an organization's scholarly contributions; and articulate outcomes to inform research priorities. InCites utilizes the content and citation indicators found in the Web of Science™.

5. Total citations (10 percent): This indicator measures how influential the university has been on the global research community. It is determined by multiplying the publications ranking factor by the normalized citation impact factor. Total citations have been normalized to overcome differences in research area, publication year of the paper and publication type.

6. Number of publications that are among the 10 percent most cited (12.5 percent): This indicator reflects the number of papers that have been assigned as being in the top 10 percent of the most highly cited papers in the world for their respective fields. Each paper is given a percentile score that represents where it falls, in terms of citation rank, compared with similar papers (same publication year, subject and document type). Although the number of highly cited papers is dependent on the size of the university, the indicator can be considered a robust indication of how much excellent research the university produces.

7. Percentage of total publications that are among the 10 percent most cited (10 percent): This indicator is the percentage of a university's total papers that are in the top 10 percent of the most highly cited papers in the world (per field and publication year). It is a measure of the amount of excellent research produced by the university and is independent of the university's size.

8. International collaboration (10 percent): This indicator is the proportion of the institution's total papers that contain international co-authors, divided by the proportion of internationally co-authored papers for the country that the university is in. It shows how international the research papers are compared with the country in which the institution is based. International collaborative papers are considered an indicator of quality, as only the best research is able to attract international collaborators.
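
Several of these bibliometric indicators reduce to simple ratios or products. A minimal sketch under the definitions above (the function names, argument names and toy numbers are assumptions, not U.S. News code):

    def normalized_citation_impact(citations, papers, expected_cites_per_paper):
        """Citations per paper, normalized by the field/year/type expectation."""
        return (citations / papers) / expected_cites_per_paper

    def total_citations_factor(publications_factor, nci_factor):
        """Total citations indicator: publications factor times NCI factor."""
        return publications_factor * nci_factor

    def international_collaboration(intl_papers, papers,
                                    country_intl_papers, country_papers):
        """Institution's share of internationally co-authored papers,
        relative to the same share for its country."""
        return (intl_papers / papers) / (country_intl_papers / country_papers)

    # Toy numbers: 40% of the school's papers have international co-authors
    # versus 25% nationally, giving an indicator of 1.6.
    print(international_collaboration(400, 1000, 25000, 100000))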


School-Level Indicators

Publicly available data sources were used to create the school-level indicators.

9. Number of Ph.D.s awarded (5 percent): This indicator reflects the total number of doctoral degrees awarded in 2012. The number of doctorates awarded can be considered an alternative indicator of research output and is linked to volume.

10. Number of Ph.D.s awarded per academic staff member (5 percent): This is the number of Ph.D.s awarded per the number of academic faculty members for the same year. It is a size-independent measure of the education environment at the university.

How the Overall Global Scores and Numerical Rankings Were Calculated

To arrive at a school's rank, the overall global scores were calculated using a combination of the weights and z-scores for each of the 10 indicators used in the rankings. In statistics, a z-score is a standardized score that indicates how many standard deviations a data point is from the mean of that variable. This transformation of the data is essential when combining diverse information into a single ranking, because it allows for fair comparisons between the different types of data.

Several of the indicators were highly skewed, so the logs of the original values were used. The indicators that used logs were:

1. Publications
2. Total citations
3. Number of publications that are among the 10 percent most cited
4. Number of Ph.D.s awarded
5. Global research reputation
6. Regional research reputation

This log transformation rescaled the data and allowed for a more normalized and uniform spread across each of the indicators. After these six indicators were normalized, the z-scores for each indicator were calculated in order to standardize the different types of data to a common scale.

In order to calculate a school's overall global score, the calculated z-scores for each of the 10 indicators were then weighted using the assigned weights described earlier. U.S. News determined the weights based on our judgment of the relative importance of the ranking factors and in consultation with bibliometric experts.


The overall global score for each school was calculated by summing the school's weighted values for each indicator. The minimum score from the pool of 750 schools was then subtracted from each of the scores in order to make zero the lowest possible score.

The scores were then rescaled by multiplying the ratio between the overall performance of each university and the highest-performing university by 100. This forced the scores to fall on a 0-100 scale, with the highest-performing school earning an overall global score of 100. The top 500 universities out of the 750 were then numerically ranked in descending order from 1 to 500 based on their weighted, rescaled overall global score. Each school's overall global score was rounded to one decimal place in order to increase variance between scores and to minimize the occurrence of ties.
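
The whole pipeline (log-transform the skewed indicators, z-score everything, weight, shift so the minimum is zero, rescale to 0-100) can be sketched as follows. This is a hedged reconstruction from the description above, not U.S. News code; the weights and data shown are toy assumptions.

    import math
    import statistics

    def zscores(values):
        """Standardize values: how many standard deviations from the mean."""
        mean = statistics.mean(values)
        sd = statistics.pstdev(values)
        return [(v - mean) / sd for v in values]

    def overall_scores(indicators, weights, logged=()):
        """indicators: {name: [value per school]}; weights: {name: weight}."""
        n = len(next(iter(indicators.values())))
        totals = [0.0] * n
        for name, values in indicators.items():
            if name in logged:                 # highly skewed -> use logs
                values = [math.log(v) for v in values]
            for i, z in enumerate(zscores(values)):
                totals[i] += weights[name] * z
        low = min(totals)                      # make zero the lowest score
        shifted = [t - low for t in totals]
        top = max(shifted)                     # highest performer gets 100
        return [round(100 * s / top, 1) for s in shifted]

    data = {"publications": [9000, 4000, 1500],
            "total_citations": [80000, 20000, 5000]}
    w = {"publications": 0.125, "total_citations": 0.10}
    print(overall_scores(data, w, logged=("publications", "total_citations")))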

In addition, the 750 universities received a numerical rank for each of the 10 ranking indicators, such as publications, total citations and global research reputation, based on their z-score for that indicator. The highest-scoring university for each of the 10 indicators received a rank of 1 and the lowest-scoring university received a rank of 750. Ties were allowed.

As noted earlier, the numerical ranks for each of the 10 indicators are published on usnews.com for each school ranked in the top 500. This means that some schools in the top 500 rankings have ranking indicators with numerical ranks in the 501 to 750 range. The numerical ranks published for each ranking indicator are to be used to determine the relative position of each school in that indicator. The numerical indicator ranks were not used to calculate the overall global score.

Data Collection and Missing Data

The data and metrics used in the ranking were provided by Thomson Reuters InCites™ research analytics solutions. The bibliometric data were based upon the Web of Science™. Publications are limited to those published between 2008 and 2012; however, the citations to those papers come from all publications up to the most recent data available. For the 2015 edition of the U.S. News Best Global Universities, published in 2014, this cutoff was around April 2014. It is necessary to use a slightly older publication window to allow citations to accumulate and provide statistically relevant results.

The subject fields used in the analysis came from the Thomson Reuters InCites™ schema and did not include arts and humanities journals, which are therefore excluded from the citation-based indicators; articles from arts and humanities journals were, however, included in the paper count used in the publications indicator. Arts and humanities journals accumulate few citations and their citation analysis is less robust; therefore, the deliberate exclusion of arts and humanities improves the robustness of the results.

When data were not available, such as Ph.D.s awarded, a z-score of zero was used so as to neither reward nor penalize the university (i.e., it is treated as an average of all the other universities).

When a value is zero it is not possible to calculate the log value; therefore, a substitute is used. The substitute is one-tenth of the minimum value of all other institutions. There were no missing data in the bibliometric or reputation indicators.
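
That zero-substitution rule translates directly into code; a minimal sketch (the function name and numbers are assumptions):

    import math

    def safe_log(value, all_other_values):
        """Log transform with the substitution described above: a zero is
        replaced by one-tenth of the minimum value of the other
        institutions, since log(0) is undefined."""
        if value == 0:
            value = 0.1 * min(v for v in all_other_values if v > 0)
        return math.log(value)

    print(safe_log(0, [4, 25, 90]))  # substitutes 0.4 before taking the log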

University Rankings by Region

After the overall top 500 rankings were calculated, U.S. News then produced additional rankings. The U.S. News Best Global Universities rankings by region show the top institutions in four regions with a large number of globally ranked schools. Those regions are Asia, Australia/New Zealand, Europe and Latin America. To determine which countries are in which region, we used the United Nations definition of geographical regions.

The methodology for the region rankings is based entirely on how a school ranked in the overall Best Global Universities rankings covering the top 500 schools worldwide. Universities are numerically ranked in their region based on their position in the overall Best Global Universities rankings.

For example, in Europe, the highest-ranked university in the overall top 500 rankings is the United Kingdom's University of Oxford, at No. 5 globally, which also makes the school No. 1 in Europe. The second highest-ranked university in Europe is the U.K.'s University of Cambridge, which is ranked No. 6 globally, making it No. 2 in Europe.

University Rankings by Country

The U.S. News Best Global Universities rankings by country show the top institutions in 11 countries with a large number of globally ranked schools. Those countries are Canada, China, France, Germany, Italy, Japan, the Netherlands, South Korea, Spain, Sweden and the United Kingdom.

The methodology for the country rankings is based entirely on how a school ranked in the overall Best Global Universities rankings covering the top 500 schools worldwide. Universities are numerically ranked in their country based on their position in the overall Best Global Universities rankings.


For example, in Canada, the highest-ranked university in the overall top 500 rankings is the University of Toronto, at No. 14 globally. That means it is also ranked No. 1 in the Best Global Universities in Canada rankings. The second highest-ranked university in Canada in the overall rankings is the University of British Columbia, ranked at No. 30 globally, which means it is ranked No. 2 in Canada.

U.S. News: The Best Arab Region Universities Rankings

Methodology (Nov. 3, 2014): http://www.usnews.com/education/arab-region-universities/articles/methodology

The inaugural U.S. News Best Arab Region Universities rankings are the first in-depth assessment of schools in the region. This 1.0 version of the rankings is the beginning of a long-term project to develop surveys and rankings for the region.

U.S. News believes that the 2015 Best Arab Region Universities rankings will allow prospective students, parents, policymakers and employers in the region to accurately compare institutions – something that had not been possible in the past due to a lack of standardized educational data. Arab region universities will also be able to use these rankings as a way to benchmark themselves against schools in their own country and region and to discover top schools from other countries to collaborate with.

The rankings – which are based on bibliometric data and research metrics provided by Scopus, part of the Elsevier Research Intelligence portfolio – focus specifically on institutions' academic research output and performance, not on their separate undergraduate or graduate programs.

Scopus is Elsevier's abstract and citation database of peer-reviewed literature, covering 55 million documents published in more than 21,900 journals, book series and conference proceedings by more than 5,000 publishers. For the Best Arab Region Universities rankings, the Scopus database was aggregated for universities in the Arab region by school and subject.

The first step in producing the overall rankings was to determine which of the 800-plus Arab region universities would be eligible to be included in the analysis. U.S. News worked with bibliometric experts at Elsevier to set the analytical time period for the rankings: papers published in the five-year period from 2009 through 2013. This time period was chosen because many Arab region universities have only recently begun emphasizing the importance of their faculty publishing in journals and engaging in research.


Since various publication metrics were the sole basis of the overall Best Arab Region Universities rankings, U.S. News decided that, to be included and ranked, an Arab region university had to have 400 or more total publications tracked by Scopus, meaning an average of 80 papers per year over the five-year period.

This publication threshold is well below the one used to determine eligibility for the U.S. News Best Global Universities rankings of the top research institutions worldwide; however, it was considered high enough to be the basis for a sophisticated, comparative analysis of publications and citations in the Arab region. As a result of setting the threshold at 400 or more total papers, 91 schools were included in the overall rankings.

Papers published by Arab region institutions in the subject area of physics and astronomy were excluded based on input from Elsevier's bibliometric experts, who determined that their citation characteristics would distort the results of the overall rankings. There is, however, a separate subject ranking for physics and astronomy that is based on papers published exclusively in those fields. Branch campuses in the Arab region that are operated by a parent university in another country were not considered for these rankings.

The second step was to calculate the rankings for the 91 universities using the nine ranking indicators and weights that U.S. News chose to measure research output and performance; all indicators were based on the 2009-2013 period. The weights emphasize, in nearly equal proportions, the importance of getting published in peer-reviewed journals; getting those publications cited by other researchers in their work; and having a paper be highly cited in its field.

Each school's profile page on usnews.com lists numerical ranks, out of 91, for the nine indicators, allowing students to compare each school's standing in each indicator. The indicators and their weights in the ranking formula are listed in the table below, with related indicators grouped together; an explanation of each follows.

Ranking indicator                                          Weight
Publications                                                  30%
Cited publications                                             5%
Percent of publications cited                                  5%
Citations                                                     20%
Field-weighted citation impact                                10%
Number of highly cited publications in top 10 percent          5%
Percentage of total publications in top 10 percent             5%
Number of highly cited publications in top 25 percent         10%
Percentage of total publications in top 25 percent            10%

Output Metric

1. Publications (30 percent): This is a measure of the overall research productivity of a university, based on the total number of publications that have at least one author affiliated with that institution. A publication co-authored by authors from different institutions thus counts as a full publication (+1) toward the publication output of each university. This indicator is closely linked to the size of the university. It is also influenced by the discipline focus of the university, as some disciplines, particularly engineering, publish more than others.

Citation and Impact Metrics

2. Cited publications (5 percent): This indicator represents the total number of publications for that school that have been cited at least once. It shows the extent to which other researchers in the scientific community utilize the research output produced by an entity. It is dependent on the size of the university.

3. Percent of publications cited (5 percent): This indicator provides a breakdown of what percentage of an institution's total publications in a given year and subject area have thus far been cited at least once. It shows the extent to which other researchers in the scientific community utilize the research output produced by an entity.

4. Citations (20 percent): This indicator represents the total number of citations to earlier publications made in other, newer journal articles or publications since the original articles were published. The total number of times research by a university is cited is a measure of how impactful and influential the research has been. The act of one scholar citing another is recognition that the scholar was influenced by the earlier work or utilized it.

5. Field-weighted citation impact (10 percent): This indicator is a metric used as a proxy to measure the quality of a paper. It compares the actual number of citations received by a publication with the expected number of citations for publications of the same document type (article, review or conference proceeding paper), publication year and subject. This enables the comparison of citation impact across subject areas with different publication velocities or publication type norms. It is one of the most sophisticated indicators in the modern bibliometric toolkit.
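
In formula form, field-weighted citation impact is simply the ratio of actual to expected citations; a minimal sketch (the function name is an assumption):

    def field_weighted_citation_impact(actual_citations, expected_citations):
        """Actual citations of a publication divided by the citations expected
        for the same document type, publication year and subject.
        A value above 1.0 means the paper outperforms its field average."""
        return actual_citations / expected_citations

    print(field_weighted_citation_impact(24, 15))  # 1.6x the field average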

Research Excellence Metrics

6. Number of highly cited publications in top 10 percent (5 percent): This indicator reflects the number of papers that have been assigned as being in the top 10 percent of the most highly cited papers in the world in their respective fields. Each paper is given a percentile score that represents where it falls, in terms of citation rank, compared with similar papers (same publication year, subject and document type).

7. Percentage of total publications in top 10 percent (5 percent): This indicator is the percentage of a university's total papers that are in the top 10 percent of the most highly cited papers in the world (per field and publication year). It is a measure of the amount of excellent research produced by the university and is independent of the university's size.

8. Number of highly cited publications in top 25 percent (10 percent): This indicator reflects the number of papers that have been assigned as being in the top 25 percent of the most highly cited papers in the world in their respective fields. Each paper is given a percentile score that represents where it falls, in terms of citation rank, compared with similar papers (same publication year, subject and document type). As the number of papers is dependent on the size of the university, it can be considered a robust indication of how much excellent research the university produces.

9. Percentage of total publications in top 25 percent (10 percent): This indicator is the percentage of a university's total papers that are in the top 25 percent of the most highly cited papers in the world (per field and publication year). It is a measure of the amount of excellent research produced by the university and is independent of the university's size.

How the Overall Scores and Numerical Rankings Were Calculated

To arrive at a school's rank, the overall scores were calculated using a combination of the weights and z-scores for each of the nine indicators used in the rankings. In statistics, a z-score is a standardized score that indicates how many standard deviations a data point is from the mean of that variable. This transformation of the data is essential when combining diverse information into a single ranking, because it allows for fair comparisons between the different types of data.

Several of the indicators were highly skewed, so the logs of the original values were used. These indicators were:

1. Publications
2. Cited publications
3. Citations
4. Number of highly cited publications in top 10 percent
5. Number of highly cited publications in top 25 percent

This log transformation rescaled the data and allowed for a more normalized and uniform spread across each of the indicators. For the two indicators related to the number of highly cited publications, a school sometimes had a zero for the indicator. Because it is not possible to calculate the log value of zero, a substitute was used: one-tenth of the minimum value of all other institutions.

After the indicators were normalized, the z-scores for each indicator were calculated in order to standardize the different types of data to a common scale.

In order to calculate a school's overall score, the calculated z-scores for each of the nine indicators were then weighted using the weights described earlier. U.S. News determined the weights based on our judgment of the relative importance of the ranking factors and in consultation with bibliometric experts. The overall score for each school was calculated by summing the school's weighted values for each indicator.

The minimum score from the pool of 91 schools was then subtracted from each of the scores in order to make zero the lowest possible score. The scores were then rescaled by multiplying the ratio between the overall performance of each university and the highest-performing university by 100. This forced the scores to fall on a 0-100 scale, with the highest-performing school earning an overall score of 100.

The universities were then numerically ranked in descending order from 1 to 91 based on their weighted, rescaled overall score. Each school's overall score was rounded to one decimal place in order to increase variance between scores and to minimize the occurrence of ties.

In addition, the 91 universities received a numerical rank for each of the nine ranking indicators, such as publications, citations and number of highly cited publications in top 10 percent, based on their z-score for that indicator. The highest-scoring university for each of the nine indicators received a rank of 1, and the lowest-scoring university received a rank of 91. Ties were allowed.

As noted earlier, the numerical ranks for each of the nine indicators are published on usnews.com for each of the ranked schools. These numerical indicator ranks are to be used to determine the relative position of each school in that indicator. The nine numerical indicator ranks were not used to calculate the overall score.

4. Rank of Beni-Suef University

The following section presents an intensive survey of the "Academic Ranking of Beni-Suef University" in comparison with the "First 20 Egyptian Universities". It should be stated that there are many other "International Universities Rankers", but they are not included in this survey because they publish short lists that do not contain "Beni-Suef University".

4.1. Beni-Suef University Rank according to Webometrics Ranking, Jan. 2015

University Name                                                Egypt  Arab  Middle East  Africa  World
Cairo University                                                   1     2            9       3    474
American University in Cairo                                       2     5           33       9   1050
Mansoura University                                                3     8           44      11   1167
Benha University                                                   4    12           58      16   1419
Alexandria University                                              5    13           61      17   1448
Zagazig University                                                 6    19           86      28   1922
Assiut University                                                  7    22           97      29   2026
Kafrelsheikh University                                            8    28          114      31   2292
Minia University                                                   9    35          125      42   2565
Suez Canal University                                             10    43          148      48   2886
Helwan University                                                 11    44          152      50   2973
Ain-Shams University                                              12    45          153      51   2984
University of Tanta                                               13    46          159      52   3052
Minufiya University                                               14    54          171      59   3289
Arab Academy for Science & Technology and Maritime Transport      15    59          175      62   3378
Fayoum University                                                 16    60          178      63   3416
South Valley University                                           17    70          206      76   3966
German University in Cairo                                        18    74          218      83   4199
British University in Egypt                                       19    94          249     101   4832
Al Azhar University                                               20    97          256     105   5020
Beni-Suef University                                              22   107          283     114   5488

N.B. Beni-Suef University occupied the sixteenth position nationally and the 2771st position worldwide in the 2015 Webometrics ranking: http://www.webometrics.info/en/aw/Egypt

4.2. Beni-Suef University Rank according to Journals Consortium Ranking

University Name                       Egyptian  Africa  Total Influence Factor
Cairo University                             1       2                   43.43
Ain Shams University                         2      10                   40.54
Alexandria University                        3      17                   35.62
Al-Azhar University                          4      21                   34.38
Mansoura University                          5      23                   33.88
The American University in Cairo             6      25                   33.64
Zagazig University                           7      30                   31.56
Assiut University                            8      32                   30.74
Suez Canal University                        9      41                   26.69
Tanta University                            10      45                   26.19
Helwan University                           11      51                   24.28
Minia University                            12      63                   22.06
Benha University                            13      67                   21.83
South Valley University                     14      95                   17.41
Menoufia University                         15      96                   17.36
Fayoum University                           16      98                   16.83
Sohag University                            17     100                   16.64
Beni-Suef University                        18     108                   16.04
German University in Cairo                  19     133                   13.85
British University in Egypt                 20     155                   11.58

4.3. Beni-Suef University Rank according to The U.S. News Ranking, Arab Region

University Name                   Egyptian  Arab  Overall Score
Cairo University                         1     4           81.5
Mansoura University                      2     6           68.8
Ain Shams University                     3     7           67.9
Alexandria University                    4     9           65.8
Assiut University                        5    13           60.7
Zagazig University                       6    15           55.3
Suez Canal University                    7    19           52.1
Al Azhar University                      8    20           52.0
University of Tanta                      9    21           50.4
Minia University                        10    22           49.4
Menoufia University                     11    25           45.1
Beni-Suef University                    12    28           43.4
Helwan University                       13    35           38.5
Sohag University                        14    42           32.3
South Valley University                 15    43           32.0
Benha University                        16    44           31.8
Kafrelsheikh University                 17    46           31.4
American University in Cairo            18    58           24.7
Fayoum University                       19    72           16.6
German University in Cairo              20    77           13.6


Appendix I: Berlin Principles on Ranking of Higher Education Institutions

Rankings and league tables of higher education institutions (HEIs) and programs are a global phenomenon. They serve many purposes: they respond to demands from consumers for easily interpretable information on the standing of higher education institutions; they stimulate competition among them; they provide some of the rationale for the allocation of funds; and they help differentiate among different types of institutions and different programs and disciplines. In addition, when correctly understood and interpreted, they contribute to the definition of "quality" of higher education institutions within a particular country, complementing the rigorous work conducted in the context of quality assessment and review performed by public and independent accrediting agencies. This is why rankings of HEIs have become part of the framework of national accountability and quality assurance processes, and why more nations are likely to see the development of rankings in the future. Given this trend, it is important that those producing rankings and league tables hold themselves accountable for quality in their own data collection, methodology, and dissemination.

In view of the above, the International Ranking Expert Group (IREG) was founded in 2004 by the UNESCO European Centre for Higher Education (UNESCO-CEPES) in Bucharest and the Institute for Higher Education Policy in Washington, DC. It is upon this initiative that IREG's second meeting (Berlin, 18 to 20 May 2006) was convened to consider a set of principles of quality and good practice in HEI rankings – the Berlin Principles on Ranking of Higher Education Institutions.

It is expected that this initiative has set a framework for the elaboration and dissemination of rankings – whether they are national, regional, or global in scope – that ultimately will lead to a system of continuous improvement and refinement of the methodologies used to conduct these rankings. Given the heterogeneity of ranking methodologies, these principles for good ranking practice will be useful for the improvement and evaluation of rankings.

Rankings and league tables should:

A) Purposes and Goals of Rankings


1. Be one of a number of diverse approaches to the assessment of higher education inputs, processes, and outputs. Rankings can provide comparative information and improved understanding of higher education, but should not be the main method for assessing what higher education is and does. Rankings provide a market-based perspective that can complement the work of government, accrediting authorities, and independent review agencies.

2. Be clear about their purpose and their target groups. Rankings have to be designed with due regard to their purpose. Indicators designed to meet a particular objective or to inform one target group may not be adequate for different purposes or target groups.

3. Recognize the diversity of institutions and take the different missions and goals of institutions into account. Quality measures for research-oriented institutions, for example, are quite different from those that are appropriate for institutions that provide broad access to underserved communities. Institutions that are being ranked and the experts that inform the ranking process should be consulted often.

4. Provide clarity about the range of information sources for rankings and the messages each source generates. The relevance of ranking results depends on the audiences receiving the information and the sources of that information (such as databases, students, professors, employers). Good practice would be to combine the different perspectives provided by those sources in order to get a more complete view of each higher education institution included in the ranking.

5. Specify the linguistic, cultural, economic, and historical contexts of the educational systems being ranked. International rankings in particular should be aware of possible biases and be precise about their objective. Not all nations or systems share the same values and beliefs about what constitutes “quality” in tertiary institutions, and ranking systems should not be devised to force such comparisons.

B) Design and Weighting of Indicators

6. Be transparent regarding the methodology used for creating the rankings. The choice of methods used to prepare rankings should be clear and unambiguous. This transparency should include the calculation of indicators as well as the origin of data.

7. Choose indicators according to their relevance and validity. The choice of data should be grounded in recognition of the ability of each measure to represent quality and academic and institutional strengths, and not availability of data. Be clear about why measures were included and what they are meant to represent.

8. Measure outcomes in preference to inputs whenever possible. Data on inputs are relevant as they reflect the general condition of a given establishment and are more frequently available. Measures of outcomes provide a more accurate assessment of the standing and/or quality of a given institution or program, and compilers of rankings should ensure that an appropriate balance is achieved.

9. Make the weights assigned to different indicators (if used) prominent and limit changes to them. Changes in weights make it difficult for consumers to discern whether an institution’s or program’s status changed in the rankings due to an inherent difference or due to a methodological change.
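
To make Principle 9 concrete, here is a minimal numerical sketch in Python (all figures invented for illustration) showing that, with the underlying data held fixed, merely swapping the weights of two indicators can reverse the order of two institutions:

```python
# All figures invented for illustration; not from any real ranking.

data = {
    "University A": {"outcomes": 80, "inputs": 60},
    "University B": {"outcomes": 65, "inputs": 90},
}

def rank_under(weights):
    """Return institutions ordered by weighted composite score, best first."""
    score = lambda u: sum(weights[k] * v for k, v in data[u].items())
    return sorted(data, key=score, reverse=True)

# Year 1: outcomes weighted heavily -> University A leads (74.0 vs 72.5).
print(rank_under({"outcomes": 0.7, "inputs": 0.3}))

# Year 2: identical data, weights swapped -> University B now leads
# (82.5 vs 66.0), although nothing about either institution changed.
print(rank_under({"outcomes": 0.3, "inputs": 0.7}))
```

This is exactly the ambiguity the principle warns about: a reader of the second table cannot tell whether University B improved or the methodology moved.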

C) Collection and Processing of Data

10. Pay due attention to ethical standards and the good practice recommendations articulated in these Principles. In order to assure the credibility of each ranking, those responsible for collecting and using data and undertaking on-site visits should be as objective and impartial as possible.

11. Use audited and verifiable data whenever possible. Such data have several advantages, including the fact that they have been accepted by institutions and that they are comparable and compatible across institutions.

12. Include data that are collected with proper procedures for scientific data collection. Data collected from an unrepresentative or skewed subset of students, faculty, or other parties may not accurately represent an institution or program and should be excluded (a minimal screening sketch follows this list).

13. Apply measures of quality assurance to ranking processes themselves. These processes should take note of the expertise that is being applied to evaluate institutions and use this knowledge to evaluate the ranking itself. Rankings should be learning systems continuously utilizing this expertise to develop methodology.

14. Apply organizational measures that enhance the credibility of rankings. These measures could include advisory or even supervisory bodies, preferably with some international participation.
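
As a minimal sketch of one way Principle 12's screening could be operationalized, the Python snippet below flags a survey sample whose composition drifts too far from the institution's known population; the strata, shares, and tolerance are assumptions for illustration only, not a prescribed procedure:

```python
# Minimal sketch of a representativeness screen in the spirit of
# Principle 12. The strata, proportions, and tolerance are assumed
# values for illustration, not part of any official methodology.

# Known composition of the institution's faculty (population shares).
population = {"science": 0.45, "humanities": 0.35, "medicine": 0.20}

# Composition of the survey respondents actually collected.
sample = {"science": 0.70, "humanities": 0.20, "medicine": 0.10}

MAX_DRIFT = 0.10  # assumed per-stratum tolerance

def is_representative(sample_shares, population_shares, tol=MAX_DRIFT):
    """True if every stratum's sample share is within tol of its population share."""
    return all(abs(sample_shares[s] - population_shares[s]) <= tol
               for s in population_shares)

if not is_representative(sample, population):
    print("Skewed sample: exclude it or re-weight it before it feeds a ranking.")
```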


D) Presentation of Ranking Results

15. Provide consumers with a clear understanding of all of the factors used to develop a ranking, and offer them a choice in how rankings are displayed. This way, the users of rankings would have a better understanding of the indicators that are used to rank institutions or programs. In addition, they should have some opportunity to make their own decisions about how these indicators should be weighted.

16. Be compiled in a way that eliminates or reduces errors in original data, and be organized and published in a way that errors and faults can be corrected. Institutions and the public should be informed about errors that have occurred.

Berlin, 20 May 2006

Appendix II: List of International, Regional and National Universities Rankers

International Rankings

1. Academic Ranking of World Universities (Shanghai Ranking Consultancy; ARWU)

2. World University Rankings (QS)

3. World University Rankings (THE)

4. SCImago Institutions Rankings (SIR)

5. University Ranking by Academic Performance (URAP)

6. Webometrics Ranking of World Universities

7. Asia's Best Universities (Asiaweek)

8. CHE Excellence Ranking (CHE)

9. Global University City Index

10. Newsweek (weekly magazine)

11. Performance Ranking of Scientific Papers for World Universities (Higher Education Evaluation and Accreditation Council of Taiwan)

12. The Top 100 Full-time Global Programmes (Financial Times)


Regional and National Rankings

Argentina

1. Consejo Nacional de Evaluación y Acreditación de las Universidades

Australia

1. Good Universities Guides (Hobsons Australia)

2. International Standing of Australian Universities (Melbourne Institute of Applied Economic and Social Research, Melbourne University)

Brazil

1. Provão, an annual standardized examination ranking university programmes on a five-grade scale from A to E (National Institute for Educational Studies and Research)

Canada

1. Canadian Psychological Association Graduate Guide

2. Maclean's university ranking (Maclean's)

Chile

1. Consejo Nacional de Acreditación (National Accreditation Agency; grants accreditation for different lengths of time, from three to seven years)

2. Ranking de las mejores universidades del país / Ranking universidades El Mercurio (El Mercurio)

3. Ranking de universidades Qué Pasa (Qué Pasa)

China

1. Academic Ranking of World Universities (ShanghaiRanking Consultancy)

2. Academic Reputation Ranking in Taiwan (Education Evaluation Section, Center for Learning and Teaching, Tamkang University)

3. China Academic Degrees and Graduate Education Development Center

4. Performance Ranking of Scientific Papers for World Universities (Higher Education Evaluation and Accreditation Council of Taiwan)

5. Rankings by the Research Centre for China Science Evaluation, Wuhan University

6. Ranking of Universities in Hong Kong (Education 18.com)

7. The Chinese Universities Alumni Association Ranking

8. The Guangdong Institute of Management Science Ranking (Guangdong Institute of Management Science)

9. The NETBIG Ranking (Netbig)

Germany

1. CHE University Ranking (Center for Higher Education Development, in partnership with Die Zeit)

2. Ranking of Germany's National Innovative Capacity: The Innovative Indicator for Germany (German Institute for Economic Research, DIW Berlin)

3. The best universities in Germany (Karriere)

India


1. India Today Ranking (India Today)

2. JAM College Rankings

3. National Assessment and Accreditation Council

Italy

1. University rankings published by La Repubblica (Italian newspaper)

Japan

1. Asahi Shimbun Newspaper ranking

2. Japan University Accreditation Association

3. Kawaijuku Rankings

4. World Education News and Reviews (WES) Japan

Kazakhstan

1. Ranking of Higher Education Institutions in Kazakhstan

Korea

1. Korean Council for University Education Ranking

Malaysia

1. The Rating of Higher Education Institutions

Netherlands

1. The Leiden Ranking (Leiden University)

New Zealand

1. PBRF Rankings of New Zealand Tertiary Education Institutions (Tertiary Education Commission)

Nigeria

1. Ranking of Nigerian Universities

Pakistan

1. Ranking of Universities, Pakistan (Pakistan Higher Education Commission)

Poland

1. Perspektywy

Portugal

1. Jornal Público

Romania

1. Academic Ranking and Rating

2. Ad-Astra ranking

3. Ranking of Universities (The National Council of Research in Higher Education)

Slovakia

1. Academic Ranking and Rating (The Independent Slovak Academic Ranking and Rating Agency)


Spain

1. Generador de Rankings RI3 para clasificar Instituciones Iberoamericanas de Investigación

2. National Graduation Rate Ranking (GRS Research Group)

Sweden

1. Ranking of Universities and Colleges (Moderna Tider)

Switzerland

1. Champions League (The Swiss Federal Government’s Zentrum für Wissenschafts- und Technologiestudien)

2. Switzerland University Ranking

Thailand

1. Ministry of Higher Education Ranking

Tunisia

1. Comité National d'Évaluation

Ukraine

1. Compass: Ranking of Ukrainian Universities

2. UNESCO Chair, Kyiv Polytechnic Institute, to be published by Zerkalo Nedeli (weekly magazine)

United Kingdom

1. Business School Rankings (Financial Times)

2. Sunday Times Ranking (The Sunday Times)

3. The Good University Guide (The Times, London)

4. The Guardian University Guide (The Guardian)

5. The Daily Telegraph (daily newspaper)

6. World University Rankings (THES & QS)

United States

1. America's Best Colleges (US News and World Report)

2. NRC Ranking of U.S. Psychology Ph.D. Programs (Social Psychology Network)

3. The Washington Monthly College Rankings (Washington Monthly)

4. The Top American Research Universities (The Center for Measuring the Performance of American Universities)

5. The Top 100 Global Universities (Newsweek Inc.)

6. The Princeton Review - 2008 Best 366 Colleges Rankings

7. UTD Top 100 Business School Research Rankings (The UT Dallas School of Management)