Every year, around the time when the new academic year begins, the ministry of education releases its annual rankings of government and private higher education institutions in the country. The rankings, which have been published since 2016, are calculated based on the government’s National Institutional Ranking Framework.
The rankings are published across different categories, including engineering, management, medicine and law. In each category, the ministry lists the 200 institutions that in its assessment are the best in the country that year.
The NIRF rankings are a crucial resource for millions of students who must decide which institutions and programmes to seek admission to each year.
But are the rankings based on a rigorous methodology and consistent data?
This question came to the fore last September when an employee of the Indian Institute of Management Mumbai alleged that the institute had inflated numbers pertaining to income, expenditure and faculty strength in the data it submitted to the NIRF. Scroll cross-checked these claims by sourcing data that the institute had submitted to the NIRF, as well as the data published in its annual reports, and found that there were indeed significant discrepancies between the two. Neither the institute nor the ministry of education responded to Scroll’s requests for comment.
However, the case highlighted one of the fault lines of the rankings system: it relies on self-reported data. Ranks are assigned on the basis of data that institutions themselves submit, which raises questions about the reliability of the process.
The IIM Mumbai employee’s allegations raised concerns over whether similar discrepancies could be found in data from other leading institutions.
Scroll examined the data submitted to NIRF by all government engineering and management institutes ranked in the top ten. We picked data from 2022-’23, the same year for which IIM Mumbai’s data came into question.
That year, seven Indian Institutes of Management were among the top 10 institutes in the management category. The engineering category featured nine Indian Institutes of Technology.
We compared the data submitted by the institutes to NIRF with the data published in their annual reports. We specifically considered the data in these documents pertaining to income and expenditure.
Our analysis showed significant discrepancies between data submitted for NIRF and data published in the annual reports of institutions.
Most institutes did not respond to our questions about these discrepancies. But professors with administrative experience in public institutes, speaking on the condition of anonymity, admitted there was a widespread lack of clarity about how data should be categorised. One IIT professor said he had heard that “institutions often exaggerate numbers to get higher rankings, even when it comes to their number of published papers”.
Confusion over data
Experts who have analysed the NIRF system have raised concern in the past over the lack of clarity and transparency when it comes to the data submitted for the rankings.
In June 2024, V Ramgopal Rao, a former director of IIT Delhi, and Abhishek Singh from the Birla Institute of Technology and Science published a paper on the rankings and found several flaws and inconsistencies in the system.
“The reliance on self-reported data raises pertinent questions regarding the consistency and accuracy of the information presented,” they wrote.
This was not only because it would be in each institution’s interest to report favourable data. “Institutions varying in size, structure and resources may interpret and report data differently, potentially leading to disparities in the ranking outcomes,” they wrote. “The absence of stringent mechanisms for verifying the accuracy and uniformity of the submitted data introduces an element of uncertainty into the rankings.”
This, they noted, has serious implications for the reliability of the rankings system. “Without standardised reporting practices, the rankings may inadvertently favour institutions adept at presenting data in a favourable light rather than those genuinely excelling in academic parameters,” the paper stated.
The paper also criticised the NIRF system for the lack of transparency about its methodology, specifically about the assessment of financial data.
“To mitigate ambiguity and potential misinterpretations, it is imperative to establish unambiguous and explicit definitions of metrics, especially those which capture financial data,” the paper noted. “Formulating clear and well-defined rules and criteria is essential to ensure a standardised and equitable assessment.”
Comparing data for the IIMs
In the NIRF documents of the IIMs, expenditure is broken down into two heads: operational expenditure and capital expenditure.
In contrast, the annual reports of the IIMs break down expenditure into several heads, including staff payments and benefits, academic expenses, and administrative and general expenses.
The IIT professor mentioned above, who has been involved in submitting data for NIRF, and an IIM professor, who has analysed such data closely, told Scroll that the categories of expenditure mentioned in the annual reports correspond to the “operational expenditure” category of the NIRF documents.
Thus, we compared these two sets of figures: the expenditure listed under these heads in the annual reports and the operational expenditure declared in the NIRF documents.
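As a rough illustration of this comparison, here is a minimal sketch in Python. The head names and figures below are hypothetical placeholders, not taken from any institute’s filings, and the summing of heads is a simplification of how annual-report expenditure maps to the NIRF’s “operational expenditure” category.

```python
# Illustrative sketch of the expenditure comparison described above.
# All figures (in Rs crore) and head names are hypothetical placeholders.

nirf_operational_expenditure = 250.00  # figure reported to NIRF

annual_report_heads = {
    "staff payments and benefits": 120.00,
    "academic expenses": 60.00,
    "administrative and general expenses": 45.00,
}

annual_report_total = sum(annual_report_heads.values())
difference = nirf_operational_expenditure - annual_report_total

print(f"NIRF figure: Rs {nirf_operational_expenditure:.2f} crore")
print(f"Annual report total: Rs {annual_report_total:.2f} crore")
print(f"Difference: Rs {difference:.2f} crore")
```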
We found that in only one instance, that of IIM Lucknow, did the two figures match exactly. IIM Bangalore reported an operational expenditure of around Rs 287.87 crore to NIRF, which was around Rs 70 crore higher than the figure it declared in its annual report. In the remaining four IIMs, the expenditure figures declared to NIRF were lower than those declared in the annual reports.
Thus, the data did not suggest that a majority of these institutes had exaggerated expenditure figures in their NIRF submissions. This was despite the fact that the framework’s methodology rewards institutions for high operational expenditure, as part of a calculation termed “financial resources and their utilisation”. But the data did show widespread discrepancies between these figures and those in the annual reports, indicating a lack of clarity about which figures fall under these heads.

When it comes to income, the IIM NIRF document lists income from three categories: sponsored research, consultancy projects and “executive development programs/management development programs”.
However, the annual reports of the IIMs list income under several heads, including academic receipts, “grants/subsidies”, income from investments, income from sponsored research and consultancy projects, and income from executive/management development programmes, as well as a category termed “other income”.
Where the NIRF documents and the income statements of annual reports contained identical categories, we compared figures directly. When the categories did not match, we examined other sections in the annual reports to locate income from sources mentioned in the NIRF documents, such as sponsored projects. Even in instances where money from sponsored research was listed under sections such as “current liabilities and provisions”, we included it in our calculations, despite three IIM professors maintaining that these entries were not technically categorised as income.
Thus, we compared the total income from sponsored projects and consultancy listed in the NIRF documents with the highest income or credit shown under these heads in any section of the institutions’ annual reports.
Separately, we compared the income from executive and management development programmes listed in the NIRF documents with the highest income or credit listed under this category in the annual reports.
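The income comparison can be sketched the same way. Again, the heads, sections and figures below are hypothetical placeholders; the “take the highest entry across sections” step reflects the approach described above.

```python
# Illustrative sketch of the income comparison described above.
# Figures (in Rs crore) and section names are hypothetical placeholders.

nirf_income = {
    "sponsored research and consultancy": 180.00,
    "executive/management development programmes": 95.00,
}

# Entries for the same head can appear in more than one section of an
# annual report (income statement, schedules, even "current liabilities
# and provisions"); we took the highest such figure for each head.
annual_report_entries = {
    "sponsored research and consultancy": {
        "income statement": 140.00,
        "current liabilities and provisions": 155.00,
    },
    "executive/management development programmes": {
        "income statement": 95.00,
    },
}

for head, nirf_figure in nirf_income.items():
    annual_report_figure = max(annual_report_entries[head].values())
    gap = nirf_figure - annual_report_figure
    print(f"{head}: NIRF {nirf_figure:.2f} vs annual report "
          f"{annual_report_figure:.2f} (difference {gap:.2f} crore)")
```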
We found that in 10 of the 12 such comparisons across the six institutes, the income reported to NIRF was higher than that declared in the annual report. The largest such difference was in the case of IIM Kozhikode, whose NIRF documents listed income of Rs 137.13 crore from executive development programmes, Rs 60.16 crore higher than the figure declared in its annual report.


“Higher income and expenditure allows institutions to project a bigger scale of operations,” a professor from a private university told Scroll.
Another professor from a public university said, “If institutions are able to garner higher earnings from their consultancy programmes and research, it secures them a higher ranking.”
Indeed, the NIRF methodology rewards institutions for high research funding and consultancy income, and in the case of management institutes, for income from executive and management development programmes.
Scroll emailed all the above institutions, seeking clarity on these apparent discrepancies. As of publishing, only IIM Bangalore had responded.
The institute did not reply to specific queries about figures in the two sets of documents. It noted that a “NIRF Methodology document spells out the ranking formula for Financial Resources Utilisation”. A guidelines document “has further instructions on what kind of expenditure should be included and what are those that are to be specifically excluded”, it noted. It added, “Our reporting is compiled on the basis of the Methodology and Guidelines documents.”
On the annual reports, the institute said, “institutes like the IIMs are governed by the uniform reporting format issued by the Ministry of Education in consultation with the CAG”. It stated, “The Income and Expenditure and Balance Sheet formats and significant accounting policies are spelt out in that document. Our annual accounts are also compiled on the basis of the mandated reporting format.”
Further, it said, “Given this background, when we report relevant data under Financial Resources utilisation for the NIRF evaluation, data is taken from the annual accounts and specific inclusions and exclusions as per the guidelines are made in order to arrive at the final number that is used in the methodology document to arrive at per student values.”
Data for the IITs
Similarly, Scroll also analysed the data submitted by all the IITs that figured in the top 10 ranks of the engineering list for the year 2022-’23. Nine of the ten ranks were occupied by IITs, but since data was unavailable for the Hyderabad and Roorkee institutes, we analysed data for a total of seven institutes.
With regard to the IITs too, Scroll compared the operational expenditure figures mentioned in the NIRF documents with the total expenditure mentioned in the institutions’ annual reports.
Out of the seven, three institutions stated higher expenditure figures in the NIRF documents. IIT Madras had the largest difference between the figure submitted to NIRF and the figure mentioned in its annual report. In its NIRF submissions, the institute stated that its expenditure for the year was around Rs 1,360 crore, but in its annual report, the figure mentioned was around Rs 727 crore, or Rs 633 crore less.
The other two institutions that stated higher expenditure figures in the NIRF documents were IIT Delhi and IIT BHU.

When it came to income, the NIRF documents listed income from sponsored research and consultancy projects. As with the IIMs, we looked through the annual reports and compared these figures with figures listed in any section of the annual report under the same category. If more than one section contained an entry under the same head, we used the higher number.
This analysis indicated that four IITs had reported higher income from sponsored research and consultancy projects to NIRF than the figures they had declared in their annual reports.
The largest such difference was seen in the case of IIT Bombay, which reported a total income of Rs 569.49 crore from these sources to NIRF. Of this, the consultancy income mentioned in the two sets of documents was identical: Rs 99.63 crore. But the income from sponsored research declared in the NIRF documents, Rs 469.86 crore, was Rs 196.52 crore higher than the figure in the annual report.

At the other end of the spectrum was IIT Madras, which in its annual report declared income of Rs 1,505.16 crore from consultancy and sponsored projects, Rs 656.74 crore higher than the figure declared in its NIRF documents.
Only IIT Kharagpur’s annual report contained a specific category in its income statement for income from “Sponsored/Research/Consultancy Projects”. While the institute listed income of Rs 227 crore from these sources in its NIRF documents, its annual report listed Rs 116 crore.
“I’ve also heard that institutions often exaggerate numbers to get higher rankings, even when it comes to their number of published papers,” the IIT professor said. “There is no proper verification that happens to ensure all the data institutions are sending in is true.”
Another professor from an IIT noted that “if NIRF made definitions clearer”, confusion over data categories could be avoided. “Some of these discrepancies happen because there is not enough clarity,” the professor said. “Sometimes when we have doubts, we ask counterparts in other IITs to find out what data to submit.” The professor suggested that NIRF conduct sessions or workshops to clearly explain all the categories and what they entail.
After the IIM Mumbai staffer flagged concerns about the institution’s data, the administration suspended two of its staff for six months: one was the whistleblower themself, and the other was a staffer whom the administration believed had been involved in drafting the letter. Staff at IIM Mumbai said that the author of the letter faced pressure from the institute and decided to give up on the matter. After their six-month suspension ended, they were absorbed back into the administration.
However, the other employee, a professor at the institution, continued to be targeted by the administration, other staffers told Scroll. His six-month suspension was extended to nine months, and in January, to his surprise, he was handed a termination letter. The reason given for both his suspension and his termination was that he had allegedly “maligned the name of the institution”.