Monday, September 12, 2016

Cited researchers

A few weeks back, Shanghai released its list of cited researchers, calling it "Most Cited Researchers". Around 20 Indians, including one from IISc, find a place on that list. An important factor in the Shanghai Rankings is the MCR (Most Cited Researcher) list, which accounts for 10% of the ARWU subject rankings. The MCR comprises the top 150 researchers across the world in each field of engineering.

Thomson Reuters has also released its list, called "Highly Cited Researchers". Roughly 1500 researchers from the USA, 87 from China, and 2 Indians find a place on this list.

37 comments:

Anonymous said...

Citations are not important. Unless one has a B.Tech from IIT, no good work can be done. This is clear from your other thread on prospective faculty.

In any case, quality of mind and pedigree are much more important than citations. These cannot be measured by Web of Science.

Anonymous said...

I looked at the list of chemical engineers in the Shanghai MCR. There are three chemical engineers, including the blog owner. They have 8000, 12000 and 14000 citations. But none of them is reputed in the community because they lack quality of mind. None of them has won awards such as the Bhatnagar prize or has a B.Tech from IIT.

Anonymous said...

I am so surprised that mediocre work done by these scientists has received so many citations, and now they are in the top 150 chemical engineers in the world!

Anonymous said...

There are several from India:

Madras, Giridhar | Chemical Eng | Indian Institute of Science
Mohan, Dinesh | Chemical Eng | Jawaharlal Nehru University
Yadav, Ganapati D | Chemical Eng | University of Mumbai
Stamenković, Vojislav R | Chemical Eng | University of Mumbai
Pandit, Aniruddha | Chemical Eng | University of Mumbai
Choudhary, Vasant R | Chemical Eng | National Chemical Laboratory India
Venkata Mohan, S | Chemical Eng | Indian Institute of Chemical Technology
Choudary, Boyapati | Chemical Eng | Indian National Science Academy
Ahmad, Absar | Chemical Eng | National Chemical Laboratory India
Aminabhavi, Tejraj M | Chemical Eng | Soniya College of Pharmacy
Parida, Kulamani M | Chemical Eng | Siksha O Anusandhan University
Anirudhan, Sreenivasan | Chemical Eng | University of Kerala

We should appreciate them for getting citations. Whether they are good or bad does not matter at this moment. Why criticise?

Only one person from Harvard, two from Stanford, five from MIT, and none from Caltech.

Anonymous said...

Ironically, citations account for 10% of the ranking, but IISc's overall ranking fell from 201-300 in 2003 to 301-400 in 2016! So we need to spread this mantra of increasing citations around IISc ;)

Anonymous said...

"Only one person from Harvard, two from Stanford, five from MIT, and none from Caltech."

That is only in chemical engineering. In all of engineering, Caltech has many scientists on the list. Please look at the list carefully.

Your comment is trying to say, "See, there is no one from Caltech, so the list is wrong." Great scientists from Caltech are missing but people from unknown places in India are there, so the rankings must be wrong.

In all of engineering, Caltech has many scientists on the list; IISc has only one. As unfortunate as it is, one can call this one scientist all sorts of names and ask him to leave IISc.

Anonymous said...

"overall ranking fell for IISc from 201-300 in 2003 to 301-400 in 2016! "

In 2003, four scientists from IISc were on a similar list. Now there is only one. This is not the chemical engineering list but all of engineering together.

A similar list exists for the sciences. In 2003, six people from IISc were on that list. Now there is only one.

Anyway, if we read the comments carefully, it is clear that we do not want this one person to appear on this list. Let us criticize him so that he will leave IISc.

Anonymous said...

No point in just copying names:

The following people are given wrong affiliations or do not exist:

Stamenković, Vojislav R Chemical Eng University of Mumbai

Vojislav R. Stamenkovic
Argonne National Laboratory
https://scholar.google.co.in/citations?user=sF3BwMAAAAAJ&hl=en


Choudary, Boyapati Chemical Eng Indian National Science Academy

He does not exist. How can someone publish from INSA? It is not an institution.

Ahmad, Absar Chemical Eng National Chemical Laboratory India
Anirudhan, Sreenivasan Chemical Eng University of Kerala

Both of the above people belong to chemistry, not chemical engineering.

It is unfortunate that Scopus and Shanghai make mistakes like these. That's why one should not believe this crap.

Of course, no IIT chemical engineer figures in the list! That shows the greatness of IIT.

Anonymous said...

"None of them are winners of awards such as the Bhatnagar prize "

Wrong. Many on the list have won the Bhatnagar prize or are fellows of academies such as INSA or INAE.

Anonymous said...

Anonymous read, cut, and pasted selective sentences. Reminds me of an old joke about an Indian politician who convinced himself he was the best by placing the period (.) according to his preference.

My points were:

1. Let us appreciate and not criticize our Indian scientists. This was clearly written, and you omitted it. One person from IISc is genuinely working hard and getting appreciation. He is not relying on me or you to rise up. I never suggested, and will never suggest, that he should leave IISc.

2. The example of Harvard and so on was provided just to let people know that these lists are irrelevant. In fact, someone pointed out several mistakes, supporting my argument.

3. The rankings of IISc and other Indian institutes are falling. This is the truth, and no one person is responsible. Again, as a matter of fact, I asked the people appearing on these lists to spread the mantra of success. Not sure how you read these sentences; maybe I am missing periods (.), and I apologize for that.

Anonymous said...

You again say that these lists are irrelevant. But they are relevant for rankings.

You give the example of Caltech, saying no Caltech scientist is on the list, so the list is useless. But many Caltech scientists are on the list, just not in chemical engineering.

By saying that these lists are irrelevant, you are saying the people listed on them are useless.

Anonymous said...

"Again for the matter of fact, I asked people in appearing on these lists to spread the mantra of success."

Where did you say it? That is a quote from some other anonymous person.

" One person from IISc is genuinely working hard and getting appreciation. He is not relying on me or you for rising up."

Why should he spread the mantra of success when people like you constantly criticize and say these lists are irrelevant and, therefore, that appearing on them is also irrelevant?

Anonymous said...

I think we are going in circles. The points are simple:

1. People did not volunteer to be on the list. They were added by software analysis that took into account the number of citations. There are many errors, and several brilliant young scientists will miss the bus.

2. Prof. Madras is on the list due to his exemplary citations. For every single manuscript he submits, he has to face criticism (peer review). Being on the list will not grant him an easy peer review. Does he care about us? No, he never did and never should! Is he responsible for making sure that peer-review criticism is answered? Yes, and that is the only way he can publish. So we all, including Obama, Modi and Putin, face criticism on a daily basis. Does that mean they should run away from their countries, or from IISc in this case? Certainly not. I hope my analogy is clear.

3. The mantra of success: how does one do science so that others care about it (citations)? In fact, at JH, Solomon H. Snyder was asked to give a lecture on how to do good science, as he is one of the top 5 cited scientists in the world. And he did deliver several lectures to his peers. Even today, his introduction starts with citations! Believe me, even after winning the Lasker award, he humbly states that he wishes his science to be improved, not measured by some random numbers. He openly states that these numbers are useless. Does that mean he is saying he is useless? No; he means that these numbers should not be believed, as several factors contribute to them.

4. The number of citations Tu Youyou received is far less than that of many of the scientists listed. Just another example that science matters, not numbers.

Anonymous said...

"here are many errors and several brilliant young scientists will miss the bus. "

There are no errors; the list is made by Shanghai and verified several times. The cutoff to appear on the list is 4000 citations (from journals in chemical engineering) in the last five years. You people do not understand all this and constantly criticize. The citations are based on papers published in chemical engineering journals. It does not matter what department the author is from. He/she can be in chemistry or even biology; what counts is citations of papers published in journals classified under chemical engineering. Of course, you don't take the time to understand all this.

Numbers may not matter, but scientists who get very high numbers are mostly good scientists because their work is being recognized. Of course, many scientists who work in obscure fields may not get cited despite being good. That does not mean they are bad. It is only that most highly cited scientists are good.

But do we acknowledge that no one from Caltech in chemical engineering has received 7000 citations in the last five years, while a lone professor of chemical engineering at IISc has, despite not going abroad or anywhere else to publicize the work? No, we say the list is useless, citations are useless, numbers do not matter. Do we have the courage to say that maybe, just maybe, these scientists sitting in a corner of the world, without any money to go abroad, are able to publish reasonable science that gets cited more than 7000 times in five years?

Anonymous said...

Yes, sir, we have the courage, and we did say so in the very first post.

"We should appreciate them for getting citations. Whether they are good or bad does not matter at this moment. Why criticise?"

I believe we have touched a nerve.

The bottom line is that good science will get recognized, and there is no direct measure of good science. Some people argue for impact factor, others support the h-index, and many will go by the number of citations. Some Indian scientists are brilliant and doing excellent work. We all support and appreciate them. On the campus, Prof. Madras has more respect than anyone else.

The Origin of Species has been cited 32,886 times in the last 145 years (~226 per year, or only ~1,134 times in five years).

Anonymous said...

"In the campus, Prof. Madras has more respect than anyone else. "

ROTFL. You have a great sense of humor. I do hope you are not on the IISc campus, because then you cannot appreciate this joke.

Anonymous said...

I was wondering what the criteria and metrics for this list are. Is it citations received in 2016 alone? And how come Robert Langer did not make it here? Curious! :)

Anonymous said...

Also, mistakes abound in this list. Prashant Jain (UIUC) is in chemistry, not chemical engineering. Many highly cited and accomplished chemical engineers (e.g., Langer) do not make the list, while many unknown names do, which is "interesting"!

Anonymous said...

If the cutoff to appear on the list (as mentioned by someone above) is 4000 citations in the last 5 years, then clearly this list is full of errors! Many people are missing: Samir Mitragotri, Robert Langer, Ravi Kane, Rachel Segalman, Patrick Doyle... Pick any ChemE professor's name from a top school and chances are they'll have at least 4000 citations in the last 5 years.
As for publishing in chemical engineering journals: that's a pretty loose (read: useless) criterion, because most chemical engineering researchers are doing highly interdisciplinary work these days, and publishing in the appropriate forums is the key, not publishing in some journal tagged as chemical engineering. Chemical engineers can't be boxed in by the criterion of publishing in chemical engineering journals. This leads one to conclude that the Shanghai list is even more meaningless.
If additional criteria went into making this list, then please clarify! Will be happy to be enlightened further! :)

DC said...


"But do we acknowledge that there is no one from Caltech in chemical engineering who has received 7000 citations in the last five years but a lonely professor in chemical engineering in IISc who have received 7000 citations in the last five years despite not going abroad or any place to publicize the work ? "
Dear Anonymous from (September 16, 2016 at 8:54 PM),

Aren't you surprised that no one from Caltech Chem Engg made it to the list? Did it not make you question the list? Here, please go to Google Scholar and look up Frances Arnold, James Heath, Mark Davis... or better still, here's the full faculty list: http://www.che.caltech.edu/

Please fact-check. Isn't this even more evidence of how inaccurate, hastily prepared, and misleading the list is?

Thanks

Anonymous said...

Please read the methodology of making the list. The tag of chemical engineering journals is for a very large number of journals. Anyway, this is based on normalized citations.

http://hcr.stateofinnovation.thomsonreuters.com/page/methodology

In Essential Science Indicators, all papers, including Highly Cited Papers, are assigned to one of 22 broad fields (the 22nd is Multidisciplinary, on which see below). Each journal in Essential Science Indicators is assigned to only one field and papers appearing in that title are similarly assigned. In the case of multidisciplinary journals such as Science, Nature, Proceedings of the National Academy of Sciences of the USA, and others, however, a special analysis is undertaken.

To determine how many researchers to select for inclusion in the new list, we considered the size of each ESI field in terms of number of authors (as a proxy for population) represented on the Highly Cited Papers for the field. The ESI fields are of very different sizes, the result of the definition used for the field which includes the number of journals assigned to that field. Clinical Medicine, for example, makes up some 18.2% of the content of ESI while Economics and Business, Immunology, Microbiology, and Space Science (Astronomy and Astrophysics) account for 1.8%, 1.8%, 1.4%, and 1.1%, respectively.

For example, the people you mention, like "Robert Langer" and a few others from Caltech, appear in the lists for Chemistry and Biology even though they are "chemical engineering". Langer, for example, is under "Biology and Biochemistry".

Read the following

I have been named a Highly Cited author in Engineering but my field and departmental affiliation is actually Mathematics. Would you change my designation to Mathematics?

We understand that you identify yourself as a mathematician, but we found your greatest impact, according to our analysis, to be in Engineering as it is defined in Essential Science Indicators. There is no universally agreed field classification scheme, and the use of journals to define fields is approximate at best. The practical advantage of our method is that we can fairly compare individuals against one another in the same consistently defined sphere.

Anonymous said...


Robert Langer is under "Biology and Biochemistry"
Mark Davis is under Pharmacology & Toxicology
Grubbs and Goddard are under "Chemistry"

Read the following

http://hcr.stateofinnovation.thomsonreuters.com/page/frequently-asked-questions

I have been named a Highly Cited author in Engineering but my field and departmental affiliation is actually Mathematics. Would you change my designation to Mathematics?

We understand that you identify yourself as a mathematician, but we found your greatest impact, according to our analysis, to be in Engineering as it is defined in Essential Science Indicators. There is no universally agreed field classification scheme, and the use of journals to define fields is approximate at best. The practical advantage of our method is that we can fairly compare individuals against one another in the same consistently defined sphere.

Anonymous said...

Can you please provide the links for "Biology and Biochemistry", Pharmacology & Toxicology and Chemistry and other rankings ? Thanks!

Anonymous said...

I am trying to post the criteria/methodology for the Shanghai MCR, but it is not appearing. Maybe it is too long.

For the HCR list,

http://hcr.stateofinnovation.thomsonreuters.com/sites/default/files/content/hcr/archive/2015_HCR_as_of_December_1_2015.xlsx

So briefly, Shanghai is based on Scopus (instead of Web of Science for HCR) and covers five years (ten years for HCR). The criteria are more relaxed for MCR, i.e., more people appear in MCR than in HCR. HCR is strictly the top 1% of the population, while MCR is much broader. Thus only 2 Indians appear in HCR but many appear in MCR.

Under both MCR and HCR, people like Langer and Mitragotri, and many others like Davis and Grubbs, who are actually chemical engineers, appear under different specializations.

If you want to go by citations alone and by departmental affiliation, a journal recently published a list of all faculty from chemical engineering departments who have more than 7500 citations in Web of Science. The list had around 200 faculty, including five from India: Pandit, Joshi, Madras, Sharma .. Here is the link to that article (please note that it is behind a paywall)

Anonymous said...

Got it! Thanks.
Funny that Langer is listed in Biology, Mat Sci and Pharmacology but not Chem Engg! OK, so how come some very highly cited researchers still do not make it into any of the categories? :) including our very own Ashutosh Sharma from IITK.
Now, just because Thomson Reuters says "There is no universally agreed field classification scheme, and the use of journals to define fields is approximate at best. The practical advantage of our method is that we can fairly compare individuals against one another in the same consistently defined sphere." does not mean there isn't a better methodology! And it doesn't mean we as scientists should agree with what Thomson Reuters has to say!
Fine, there came a list: loosely prepared and inaccurate on many counts (when objectively assessed). Someone made it, we saw it, we discussed it, and now it can be ignored! :) No reason to celebrate and go gung-ho over who's from India, who's from China, who's from Harvard. Finding people with more than XX citations over YY years is not a hard mathematical problem! Wonder how Thomson Reuters or Shanghai couldn't get it right!

Anonymous said...

Ashutosh Sharma has fewer citations than Madras or Pandit. Look it up in Web of Science or Google Scholar.

You have to understand what normalization of citations is. Search on Google or talk to Prof. Giridhar Madras; he has written an article on the normalization of citations. Of course, you may think he is stupid, so don't talk to him, but the gist is this:

For example, if you publish in astronomy, you will be cited far less than if you publish in biology. That's why citations are normalized.

If you just apply "more than XX citations over YY years" irrespective of field, you will get mostly biologists or people publishing in biology journals.


This is from Madras's article:

the impact of publications from two scientific fields cannot be compared directly. The average impact of the journals from each of the approximately 120 scientific subfields was expressed by a "coefficient of the scientific subfield" [Scientometrics, 16, 478]. This parameter for Biochemistry & Molecular Biology was, for example, 2.000, which means that the average rate of citation for the journals covering this subfield (6.22 citations per paper) was twice the average citation rate for all the journals referenced by the Science Citation Index (3.11 citations per paper) during the same period. Therefore, if we wish to attribute a "weight" to the impact factor of a journal from a particular subfield, this should be the inverse of the "coefficient of the scientific subfield". These weighted or "standardized" impact factors allow the comparison of output in different scientific subfields.

This is for impact factors, not citations; normalized citations are explained on the Web of Science website.
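As a rough sketch of the idea (this is not Web of Science's actual algorithm; apart from the two figures quoted from the article above, the baselines below are invented), field normalization simply divides a paper's citation count by the average for its field:

```python
# Illustrative sketch of field-normalized citation impact.
# NOT Web of Science's actual algorithm; the chemical_engineering and
# astronomy baselines below are invented for illustration.

# Average citations per paper, by field (the "expected" citation rate)
field_baseline = {
    "biochemistry": 6.22,         # figure quoted in the article excerpt above
    "all_fields": 3.11,           # overall SCI average quoted above
    "chemical_engineering": 4.0,  # hypothetical
    "astronomy": 2.5,             # hypothetical
}

def normalized_impact(citations, field):
    """Citations divided by the field average: 1.0 means cited at the field's average rate."""
    return citations / field_baseline[field]

# The same raw count means very different things in different fields:
print(round(normalized_impact(12, "biochemistry"), 2))  # 1.93
print(round(normalized_impact(12, "astronomy"), 2))     # 4.8
```

This is why a raw "citations over YY years" cutoff, applied across all fields, would mostly select people publishing in high-citation fields such as biology.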

If you don't want to normalize citations, you can read the other link, which just lists chemical engineers cited more than 7500 times; but that also has only five Indian faculty.

But this does not mean the worth of a faculty member is based on citations; it is just that the methodologies followed are clearly stated. One may not agree with a methodology, but one has to understand it before criticizing it.



Prof. Madras, what is your comment on all these discussions?

Anonymous said...

The main purpose of the post is to show that the number of MCR/HCR researchers directly influences the rankings of the institute. What matters is not where a cited researcher appears but how many cited researchers the institute has.

You can call this methodology useless and of no value. But the rankings of institutions are determined by it. Therefore, it makes sense to understand the methodology and see whether we can do something about it.

If you are already a top institute such as Caltech or MIT, you don't really need to care, but when you are languishing below 300, you need to introspect and ask whether we can do something. At the end of the day, you may say that these things don't matter, but that should be a judgment based on understanding, not emotion.

Thanks for reading

a young faculty member in IISc

DC said...

I do not need to talk to Prof. Madras to know that he is stupid. I can get the same information from his posts and publications.

Anonymous said...

Dear DC (post above),
Please refrain from personal attacks and maintain the objectivity and scientific rationale of the discussion. Please put forth your topic-related point objectively, and kindly do not troll the discussion with useless comments. Thanks.

Anonymous said...

Btw, I did not find any article on normalized citations by Prof. Madras or through the Web of Science website. Can you please point to it, or to a general article explaining the concept/logic behind normalized citations and its use for listing highly cited individual researchers (not institutions) in a field? Thanks

Anonymous said...

Now you have resorted to name-calling, so I will call it quits.

The HCR is based on the top 1% of researchers in each field. Therefore, the numbers in each field will vary. Thus the citation cutoff for chemical engineering will be lower than for biology but higher than for astrophysics.

Please read the following (and its sublinks) to understand the thresholds

http://ipscience-help.thomsonreuters.com/incitesLiveESI/ESIGroup/fieldBaselines.html
http://ipscience-help.thomsonreuters.com/incitesLiveESI/ESIGroup/citationThresholds.html
http://ipscience-help.thomsonreuters.com/incitesLiveESI/ESIGroup/citationThresholds/thresholdHighlyCited.html
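As a toy illustration of the threshold idea (the field populations below are invented; the real thresholds come from the ESI baselines at the links above), the cutoff is simply the citation count at the 1% rank within each field, so larger or more citation-dense fields get higher cutoffs:

```python
# Toy sketch of a per-field "top 1%" citation cutoff.
# Field populations are invented; real ESI thresholds are at the links above.

def top_one_percent_cutoff(citation_counts):
    """Smallest citation count that still lands in the field's top 1%."""
    ranked = sorted(citation_counts, reverse=True)
    k = max(1, len(ranked) // 100)   # number of researchers in the top 1%
    return ranked[k - 1]

# A large field and a small one, with made-up citation counts:
biology = list(range(1, 1001))        # 1000 researchers, citations 1..1000
astrophysics = list(range(1, 301))    # 300 researchers, citations 1..300

print(top_one_percent_cutoff(biology))       # 991
print(top_one_percent_cutoff(astrophysics))  # 298
```

The cutoff differs per field simply because the distributions differ, which is why a single global citation bar would flood the list with researchers from high-citation fields.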

Before I quit, here are some links:

http://ipscience-help.thomsonreuters.com/incitesLive/institutionalProfilesGroup/dataCollectionGroup/biblioAnalysis/normalizedCitationImpactScoreCalc.html

This is for an individual, institution or country:

http://ipscience-help.thomsonreuters.com/inCites2Live/indicatorsGroup/aboutHandbook/usingCitationIndicatorsWisely/normalizedCitationImpact.html

An article on field normalized citations:
http://www.sciencedirect.com/science/article/pii/S1751157715300456



http://hcr.stateofinnovation.thomsonreuters.com/page/frequently-asked-questions

Since you believe you know better, go ahead with the following entries from the FAQ:
I believe I have a method that produces a result more consistent with the scientific community’s perception of top researchers in a field. Would you take into account my feedback?
I want to talk to someone at IP and Science in detail about the methods used to generate this new list. How may I do so?


The final new list contains about 3,000 Highly Cited Researchers in 21 fields of the sciences and social sciences. Only two are Indians, and there are no other Indians who have been cited more than them anyway!

DC said...

to Anon September 18, 2016 at 9:24 PM

Does the truth bother you?

Anonymous said...

How come Prof. CNR's name is missing from the list?

Anonymous said...

Friends, as I read through the comments, I only saw views at the opposite ends of the spectrum. It appears that some folks consider citations the only important metric, while others seem to think citations are useless. We can all agree that citations are one important metric for judging scientific output, although not the only one. It goes without saying that any ranking is admittedly subjective. If the same ranking had not included any Indian scientists, then we would be arguing that the standard of Indian science has dropped dramatically. We should be proud that, despite limited facilities, we have so many Indian scientists on this list. Let us be positive, recognize our best scientists, and move our science forward!
- H

Anonymous said...

I used to lurk regularly and post advice on this blog a few years back. I just dropped by after a long gap to check out what's new. Looks like this public-service blog is now infested by malicious trolls with verbal diarrhoea! Prof. Madras is indeed a man of boundless patience and tolerance. Similarly, credit should also go to iitsriram for his public service on this blog. If it were me, I would have shut this down long ago or called in pest control.

DC said...

Anonymous Anonymous said... September 20, 2016 at 11:03 PM
How is Prof. CNR's name missing from the list?

Because you cannot read! CNR's name is on the HCR list. The MCR list posted by the blog is only for engineering.

Let us be positive, recognize our best scientists, and move our science forward! - H September 21, 2016 at 4:49 AM

The question is how you can call Prof. Giridhar a scientist, forget best scientist!

"public service blog"

Did not know that Prof. Giridhar runs public services. Does he run a bus for the public also?