An identifier in the map of Russian science. Scientists criticized the "Map of Russian Science" project

On May 21, 2012, Dmitry Livanov was appointed Minister of Education and Science of the Russian Federation. In his first public speech, he announced the intention of the Ministry of Education and Science (MES RF) to conduct a comprehensive audit of the research and development sector, including RAS institutes, state scientific organizations, and higher education institutions. This statement can be considered the birth of the "Map of Russian Science".

Unfortunately, due to the events around the reform of the Russian Academy of Sciences, this project somehow got lost and, in our opinion, did not receive proper attention from the IT community. We offer you a small retrospective: the path of the project from concept to implementation.

The aimless path turns blue before me
A long way, dug by streams,
And then - darkness; and hidden in this darkness,
Soars the Fatal Actor of destinies.

Alexander Blok, October 1899

Part 1: competition

The project "Map of Russian Science" (http://mapofscience.ru/) was officially announced in December 2012. On the eve of the Ministry of Education and Science of the Russian Federation, a competition was held for the implementation of research work on the topic "Formation of a system for assessing and monitoring the results of research activities of organizations and scientists for a regular assessment of the state of the field of science." The initial (maximum price) of the contract is 100 million rubles. Funding for the project was provided within the framework of the federal target program "Research and Development in Priority Areas of Development of the Scientific and Technological Complex of Russia for 2007-2013" (Competition for 2012, Measure 2.1, Stage 11, Lot 1).

The following organizations took part in the competition:

  1. Institute for Systems Analysis of the Russian Academy of Sciences;
  2. PricewaterhouseCoopers Russia B.V. (hereinafter - PwC);
  3. V.S. Semenikhin Scientific Research Institute of Automatic Equipment;
  4. Moscow State University of Instrument Engineering and Computer Science;
  5. Bauman Moscow State Technical University;
  6. Lomonosov Moscow State University;
  7. INEC-Information Technologies.
Baumanka (Bauman MSTU) was disqualified on a formality: an outdated extract from the EGRUL (Unified State Register of Legal Entities). Whether this was negligence in preparing the documents or some other factor at play, we will hardly ever find out.

The private consulting company PricewaterhouseCoopers Russia B.V. won the tender, offering a contract price of 90 million rubles and an implementation period of 90 days.

It should be noted that Moscow State University proposed to develop the "Map of Science" for roughly half the price, 50 million rubles, but received low scores from the competition committee for quality and qualifications and took second place. This looks strange, considering that the university has relevant experience in this area: it had recently launched the "Science-MSU" information and analytical system, which collects and analyzes the publication activity of its employees.

In the official press, the decision to choose PwC as the contractor was explained by the desire of the customer, represented by the RF Ministry of Education and Science, to conduct an “audit of Russian science” by an external organization that has nothing to do with the scientific environment.

PwC spent 40 million to buy data from Thomson Reuters' Web of Science database (hereinafter WoS) and 15 million to deploy the technical infrastructure. In addition, according to the contractors' estimates, the system requires 10-15 million rubles per year for support.

Unfortunately, we could not find the state contract with the winner of the tender, or the terms of reference, in the public domain. (Question: does this not contradict procurement legislation?) We would very much like to see the scope of work that was declared on paper. Although from a formal point of view this no longer matters much, since the project is formalized as R&D: its result can simply be a report, and implementing even a prototype is entirely optional.

Part 2: what did you want to do?

"Our goal is to identify by name those scientists and those small research teams (that is, laboratories, research groups) that are already working in Russia at a high international level. We will be doing this project with one simple goal: to understand where Russia remains competitive today, which fields of science are promising for us today, where we have a chance to make a breakthrough in the future. And, most importantly, to support, specifically, those people, those scientists, those laboratories that deserve this support," Dmitry Livanov said in his interview with the TV channel "Prosveshchenie".

In the document prepared by PwC, the project itself is described by the following thesis: "The Map of Russian Science should become the basis for making informed management decisions in the field of research activities of scientists and organizations"; it also sets specific goals:

  1. "Inventory" of the current state of Russian science;
  2. quick access on demand to current and correct indicators of Russian science;
  3. analytical tools for making informed management decisions;
  4. identification of the most authoritative experts and research teams for their targeted support;
  5. comparing the level of development of science in Russia with other countries and identifying growth points;
  6. ensuring the transparency of management decisions.
In the same document, three main pillars of the project are declared: data coverage, data quality, and functionality.

Data coverage was planned to be provided through:

  • international sources: publications, reports at international conferences, patents, publications;
  • Russian sources: publications, patents, grants, R&D, publications;
  • indicators of both fundamental and applied science.
As a result, there should have been a "unique database providing the most complete possible coverage of the results of research activities of Russian scientists."

The data quality meant:

  • clearing the source data to exclude misrepresentations of proper names;
  • the use of a mechanism for correcting data by scientists and organizations themselves;
  • use of unique identifiers of scientists and organizations.
As a result, it was expected to achieve "an unprecedented level of data accuracy that will help ensure the visibility of Russian science to the international community."

Finally, the functionality included:

  • tools for comparing and identifying growth points in science;
  • construction of reports on the specified parameters for scientists, organizations, scientific directions;
  • flexible search and filtering of data;
  • identification of informal groups of scientists.
It is "an advanced set of analytical tools and their flexibility to address management challenges."

The next part will probably not be of much interest to laymen, but since Habr is a technology blog, we considered it necessary to show the work plan and the architecture of the system being built. There are three slides in total.

Barrels and arrows


Work plan


Architecture!

Part 3: what happened?

First of all, we suggest that the readers of Habr themselves evaluate whether the declared functionality corresponds to what was implemented. “Map of Russian Science” is available at this link http://mapofscience.ru/. Can this be considered a role model? Is this project unique not only in Russia, but also in the world? Try to answer these questions yourself.

Not so long ago, the main page of the "Map" was updated. A red flashing block was added telling us: "Attention! The system is in trial operation." This was probably due to the large number of negative reviews of the project. If you look at the annex to the competition, you will notice that the deadline for this project is the end of 2013. Thus, it seems unlikely that anything will be drastically corrected in this "trial operation" version, and we can proceed to assessing the project as a whole.

The goal of the project (for those who do not remember) was to "identify by name those scientists and those small scientific teams (that is, laboratories, scientific groups) that are already working in Russia at a high international level." In our opinion, it is impossible to do this using the proposed tools.

Data coverage
It did not work out, more than it did. The two main databases, the Russian Science Citation Index (RSCI) and the Web of Science (WoS), are presented only for the range 2007-2012, and even that with a reservation regarding WoS. For the stated main goal, the data is simply outdated (the upper year bound) and incomplete (the lower year bound). And this despite the fact that access to the WoS database (the part relating to Russian scientists) cost the state 40 million rubles (practically without the right to transfer the data to the Ministry of Education and Science).

For the rest of the sources declared in the project, there is also, to put it mildly, some incompleteness. After a long search for leading Russian scientists, we could not find their books or monographs, nor any information on their participation in R&D projects and grants. It can be assumed that these data either are simply not provided in the science map or could not be prepared.

Data quality
In our opinion, this task was the key one in the implementation of the "Map of Russian Science"; it was exactly the technological part that posed the main difficulty and should have taken most of the effort and time. Simply put, the task of the whole "Map" was to consolidate, clean, and correctly link the data. Or, as follows from the transcript of the inception meeting of the expert groups on the implementation of the "Map of Russian Science" project, the key block of work was "cleaning and integrating data from various sources". And, unfortunately, this part did not work out at all. The data was not combined: we are offered either the RSCI or WoS. In fact, we are presented with just an interface to these two databases, with not very intelligible functionality. It so happened that most of the complaints from the scientific community concerned precisely the quality of the data. We tried to collect them (but we must have missed something, there are a lot of complaints); a sketch of the kind of record linkage this work required follows the list:
  1. the use of a classifier (rubricator) of scientific areas that is not applicable to the current areas in Russian science;
  2. arbitrary selection (grouping) of scientific institutions by headings;
  3. lack of control over the level of random coincidences;
  4. inconsistency of numerical indicators with real values (the number of scientific institutions, the number of publications in WoS and the RSCI, the number of patents, the citation index), and errors when operators transfer data from one database to another;
  5. incorrect selection of "leading" institutions or researchers (top 5), based on arbitrary features that have no connection with each other (either WoS data, or the RSCI, or alphabetical order, or the rubricator, etc.);
  6. incorrect spelling of a researcher's full name in both the Russian and English writing systems;
  7. incorrect affiliation of the researcher;
  8. lack of separation of namesakes and their correct correlation with the scientific direction and scientific institution;
  9. lack of information about the divisions of organizations (including the faculties of large universities, such as Moscow State University and St. Petersburg State University).
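
The items about name spelling, affiliations, and namesakes (6-8) all boil down to record linkage across WoS and the RSCI. Below is a minimal, purely illustrative sketch of that kind of matching; the sample records, the transliteration table, and the similarity threshold are our own assumptions, not the project's actual algorithm.

```python
# Hypothetical sketch of cross-database author matching (not the Map's real code):
# normalize names (transliterate Cyrillic, strip punctuation), then require both
# a high name similarity and a shared affiliation before merging two records.
import re
import unicodedata
from difflib import SequenceMatcher

# Simplified Cyrillic-to-Latin transliteration table (illustrative, not exhaustive)
CYR = dict(zip("абвгдежзийклмнопрстуфхцчшщьыэюя",
               ["a", "b", "v", "g", "d", "e", "zh", "z", "i", "y", "k", "l", "m",
                "n", "o", "p", "r", "s", "t", "u", "f", "kh", "ts", "ch", "sh",
                "shch", "", "y", "e", "yu", "ya"]))

def normalize(name: str) -> str:
    """Lowercase, transliterate Cyrillic, drop accents and stray punctuation."""
    name = "".join(CYR.get(c, c) for c in name.lower())
    name = unicodedata.normalize("NFKD", name)
    name = "".join(c for c in name if not unicodedata.combining(c))
    name = re.sub(r"[^\w\s.]", " ", name)
    return re.sub(r"\s+", " ", name).strip()

def same_author(a: dict, b: dict, threshold: float = 0.85) -> bool:
    """Heuristic: similar normalized name AND at least one shared affiliation."""
    name_sim = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
    shared = bool(set(map(normalize, a["affiliations"])) & set(map(normalize, b["affiliations"])))
    return name_sim >= threshold and shared

wos_record = {"name": "Ivanov, I. I.", "affiliations": ["Lebedev Physical Institute"]}
rsci_record = {"name": "Иванов И.И.", "affiliations": ["Lebedev Physical Institute"]}
namesake = {"name": "Ivanov, I. P.", "affiliations": ["Far Eastern Federal University"]}

print(same_author(wos_record, rsci_record))  # True: same person despite transliteration
print(same_author(wos_record, namesake))     # False: a namesake at another organization
```

Even this toy version shows why the job is hard: the moment affiliations are themselves incorrect or missing (complaint 7), the heuristic has nothing reliable to anchor on.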
Functionality
Not everything is good with the functionality either. For example, here is how the data correction mechanism is implemented: "Correction of technical errors noticed by users is carried out through the provision of a paper version of the comments, certified by the seal of the organization in which the user works." Meanwhile, the aforementioned transcript says: "The main principle of the project is to minimize effort on the part of scientists. It is assumed that most of the information in personal accounts will be filled in automatically."

So far, not a single person has been found who can clearly explain what information the so-called "heat map" conveys. The only interesting feature, in our opinion, is the "collapse the map" link at the bottom right; its functionality is at least unusual and contains the lion's share of self-irony.

We tried to register in the system to see how it looks from the inside. We were lucky enough to create an account about a month before writing this review, because at the moment registration of new users for some reason no longer works (it seems all the polymers have been lost).

To register, a scientist must provide his full name, year of birth and email, and then go through the "verification" procedure. This can be done in two ways: by mail or through the so-called SPIN code.

Verification by mail is a manual process. To avoid it, we decided to master the innovative SPIN code. Most likely, not every reader is familiar with this concept (there were such people among us too), so we will explain it.

The SPIN code is the author's personal identification code in SCIENCE INDEX, an information and analytical system built on the basis of data from the Russian Science Citation Index (RSCI).

We sent an application for a SPIN code by filling out a huge form on the RSCI website with several dozen fields and classifiers (in just some 20 minutes), and successfully received the code after two weeks of waiting. Rejoicing in our achievement, we entered the SPIN code in the scientist's profile, after which the "Map" informed us that this information requires verification (not again!). Two weeks have passed since then, and the account has still not been verified.

If you had enough patience, then you got to your personal account.

Personal Area


There is not much to edit in the personal account, since it contains only the data that you entered during registration. The authors of the system assume that the scientist will tell the rest about himself by filling in a considerable number of fields. Note that in Western systems (ResearchGate, Academia.edu, Google Scholar), after registration the user receives an almost ready-made profile that the system has prepared for him by automatically collecting data from various sources. He only needs to confirm it and, if necessary, supplement it.
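
Bootstrapping a profile from open sources is not exotic technology. As a hedged illustration (not how any of the systems above actually work internally), here is a minimal sketch that prefills a publication list from ORCID's public API; the ORCID iD is the sample one from ORCID's documentation, and error handling and paging are omitted.

```python
# Minimal sketch: prefill a researcher's publication list from the public ORCID API.
# Illustrative only; a production profile importer would add paging, retries, and
# merging with other sources (Scopus, WoS, RSCI).
import requests

ORCID_ID = "0000-0002-1825-0097"  # sample iD from ORCID's own documentation

resp = requests.get(
    f"https://pub.orcid.org/v3.0/{ORCID_ID}/works",
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()

profile = []
for group in resp.json().get("group", []):
    summary = group["work-summary"][0]  # one summary per deduplicated work
    title = summary["title"]["title"]["value"]
    date = summary.get("publication-date") or {}
    year = (date.get("year") or {}).get("value")
    profile.append({"title": title, "year": year})

print(f"{len(profile)} works prefilled automatically")
for work in profile[:5]:
    print(work)
```

A scientist then only confirms or corrects what the system found, instead of retyping everything by hand and waiting weeks for a paper-based verification.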

It is doubtful that scientists will voluntarily use a system in which registration alone takes more than four weeks. One thing is obvious: "minimizing effort on the part of scientists" did not work.

The official assessment of the project, likewise unsatisfactory, is in line with our conclusions. "This is a model, this is not even a pilot project," noted Deputy Minister of Education and Science of the Russian Federation Lyudmila Ogorodova (a model for 90 million).

Part 4: the reaction of the scientific community

This will be the most concise part of our story. The reaction from the scientific community was sharply negative.

Part 5: reasons for failure

As follows from the official position of the Ministry of Education and Science of the Russian Federation and numerous reviews from the expert scientific community, the "Map of Science" turned out to be unsatisfactory. We will not argue about whether it meets the goals of the fulfilled state contract, due to the lack of information about it. Something else is important: how could such a situation have been avoided? In our opinion, the key point in this story is that none of the data on which this public information system was built is open.

And here we would like to touch upon a very pressing problem: open data in science. It simply does not exist. Had the data been open, perhaps there would have been no need for such a state order at all. Any professional developer interested in open data and science could have implemented the "Map of Science". Moreover, there would have been several such "maps", with corresponding demand from the state and the scientific community.

Let's take a look at the list of the declared Russian sources for the "Map of Science":

  1. articles in Russian and foreign journals (NEB);
  2. Russian and foreign patents (FIPS);
  3. grants (FGBNU NII RINKTSE, RFBR, RGNF);
  4. R&D and experimental development reports (CITiS);
  5. dissertations and abstracts (CITiS);
  6. book publishing (Russian Book Chamber);
  7. information about scientific organizations and their departments (including universities and their departments).
The vast majority of the above sources were created at the expense of the state budget, and it is not clear why these data are not public.

Part 6: how to fix the situation?


Over the past two weeks, the Russian scientific community has been actively discussing the merits and demerits of the "Map of Russian Science". However, the merits were hardly mentioned. We asked the scientists who participated in testing the "Map" to share their opinion of this product.

Olga Moskaleva,
member of the working group formed by the Ministry of Education and Science to address the methodological issues of the project:

The information system "Map of Russian Science" (www.mapofscience.ru), put into trial operation on November 12, 2013, caused a large number of negative responses, mainly due to the lack of official information about the project from the Ministry of Education and Science (see, for example, -).

Last week, almost simultaneously, statements on this matter appeared from the ONR Council and from the Commission for Public Control over the Progress and Results of Reforms in the Sphere of Science. They begin similarly, listing what is bad in the "Map of Science": what is wrong there in terms of publications, the number of employees at institutes, their positions, and so on. The recommendations drawn from these comments, however, are directly opposite. The ONR urges "researchers to register on the 'Map of Russian Science' website in order to take an active part in testing the project and in developing proposals for its improvement." The Commission advises "to refrain from accessing the 'Map of Russian Science' system to make any changes there."

So what should researchers do in such a situation, faced with directly opposite instructions? As noted in the blogs, scientists are independent people and will decide for themselves what to do. Then why and for whom are these statements adopted? Where should the comma go with respect to the "Map of Science": "execute cannot be pardoned"?

The excitement that has arisen is largely provoked by statements from the Ministry of Education and Science that the "Map" is practically ready and is about to be used to evaluate scientific organizations. Meanwhile, at the very beginning of this project, the "Map" was positioned not as an assessment tool at all, but rather as a tool to help scientists themselves, reducing the time spent filling out grant applications and subsequent reports on them. It was assumed that thanks to this system it would be easier to find and select experts in various fields of knowledge, including for holding grant competitions, forming dissertation councils, and so on.

What is the history of the issue and what is actually in the "Map of Russian Science" now?

The creation of this project was first announced in the summer of 2012, shortly after the appointment of Dmitry Livanov as minister. At first nothing but a general idea was voiced, and it could be assumed that the ministry was going to buy ready-made analytical systems from Elsevier or Thomson Reuters. This is, first of all, the SciVal line of tools, created by Elsevier and widely used around the world to assess the scientific productivity of organizations, countries, and regions; it carries out a rather complex analysis of the publications indexed in the Scopus database. A different approach (using normalized citation rates to compare different areas of knowledge and publications from different years) is implemented in InCites, developed by Thomson Reuters, which uses the Web of Science (WoS) database.

A brief description of these tools and their capabilities can be found on the manufacturers' websites. SciVal has already built maps of science both for Russia as a whole and for a fairly large number of Russian universities and scientific institutions of the RAS and RAMS. InCites also offers many elaborate datasets for individual Russian organizations, and the data for Russia as a whole is regularly updated.

Of course, it is difficult to draw conclusions about the state of science in Russia based on WoS or Scopus data, since most scientific publications by Russian scientists, especially in the social sciences and humanities, are not indexed in these databases. The Russian Science Citation Index, on the other hand, where the humanities are fairly well represented, is less suitable for evaluating the publications of natural scientists.

The idea of bringing Russian publications and publications in international journals together in one database and analyzing them jointly suggests itself, but this is a very difficult task. In addition, information about publications in scientific journals alone is clearly insufficient for evaluating scientific activity: there are also books, patents, conference presentations, grants, and so on. It is precisely this ambitious goal that the Ministry of Education and Science set itself: to collect all these data in one information system and analyze them jointly, while also building services for submitting applications and reports and for selecting experts.

After consultations between the Ministry of Education and Science and representatives of scientific organizations and universities that have information systems for supporting research, the terms of reference for the creation of the "Map of Russian Science" were formulated, and a tender was announced for a contract worth 90 million rubles, which was won by the consulting company PricewaterhouseCoopers.

In March 2013, the main details of the concept, as well as some results of the discussion on a specially created discussion platform, were presented at a meeting of the working group. At the meeting, PricewaterhouseCoopers presented the project, after which the members of the working group and members of expert groups in the fields of science were given access to a prototype of the working interface of the "Map of Science" to make comments and suggestions. In June, the second stage of testing of the "Map of Science" began, now with partially loaded data from WoS and the RSCI.

At present, trial operation of the industrial version of the "Map of Russian Science" has begun, in which each scientist can obtain individual registration data and check and correct data on his publications, grants, and patents.

And here, against the background of tension associated with the reorganization of the academies of sciences, all the events described at the very beginning began.

What exactly is the reason for such a sharply negative attitude of the majority to the still unfinished product? The main complaints are as follows:

  • Inconsistency of information about the number of publications of a scientist in WoS or RSCI with what is seen in the "Map of Science", as well as incorrect "linking" of publications to scientists
  • Incorrect information about organizations as a whole - the number of employees, the number of academics, doctors, etc.
  • The emergence of non-core scientific areas in the information about the institutes.

The comparison of the "Map of Science" with WoS and Scopus, often made in blogs, is incorrect in principle, since this project is not a citation index; and Google Scholar does not offer search by organization.

All these mistakes are completely unavoidable and easily explainable. The "Map of Science" is loaded with information from WoS selected on the basis of affiliation with Russia and Russian organizations, and not by the names of scientists. Therefore, to begin with, each scientist who compares his data in WoS and in the "Map of Science" needs to remember or check which affiliation is indicated in the articles that are missing from the "Map of Science".

Most likely, it will turn out that a foreign organization is indicated or that there is no affiliation at all. Such publications could not, in principle, get into the uploaded array of publications. If publications from WoS are attributed to another scientist, or a scientist has "multiplied" into several different profiles, then it would be good to ask oneself the question: "What did I personally do to ensure that all my publications in WoS were collected in one profile and not confused with the publications of namesakes?" Indeed, the ResearcherID author registration system was created long ago precisely for this purpose and greatly helps to avoid such errors in the database.

Nevertheless, as of today only 11,472 authors from Russia are registered in ResearcherID, with a total number of scientists of over 300,000 (the "Map" lists even more than 600,000, but this includes students, postgraduate students, and foreign co-authors). For comparison: 11,245 scientists are registered in Italy, 11,733 in Germany, more than 20 thousand in Great Britain, and more than 36 thousand in China.

As for the incorrect assignment of articles to organizations, an attempt to find an organization's publications in WoS often turns out to be an almost insoluble task, both because of an overly creative approach to indicating affiliation and because of the abundance of variants of the organization's name in English. This is compounded by the constant reorganizations, mergers, and acquisitions, which make it difficult to determine to which organization a given publication should belong.

Now a fairly large number of unified organization profiles (Organization-Enhanced) have been created in WoS, merging all the names of an organization into one record. FIAN alone, offended by a single article on vegetable growing, has 72 different names in its combined profile, and there is no guarantee that this list is complete. The Kurchatov Institute does not have such a unified profile at all, and the Pavlov First St. Petersburg State Medical University, which has no subscription to the Web of Science, turned out to be included in the profile of the Pavlov Institute of Physiology of the RAS...

The situation with the data in the RSCI is completely similar: the author profile in the Science Index system is intended to play the same role for the RSCI as ResearcherID does for the Web of Science. 180 thousand scientists are currently registered in it, but most of them began actively checking their publications and citations only in 2013. 380 organizations are registered in the analogous system for organizations, and the automatic linking of publications to scientists and organizations, even without the aggravating factor of incorrect translation or transliteration, is also not always possible and requires verification by the organizations themselves.

Regarding the discrepancy between the data of the "Map of Science" and the number of publications in the RSCI, there is another factor that practically no one takes into account: the RSCI also loads data from Scopus, whereas only publications in Russian scientific journals were submitted to the "Map of Science".

Data on positions, titles, and affiliation with a particular organization in the "Map of Science" are taken exclusively from the profiles of authors registered in the Science Index system. Registration in this system is carried out on the basis of an existing RSCI user registration. So if a scientist started using the RSCI back in 2003, indicated his position at that time, and did not update the data on changed degrees, titles, and positions when filling in the additional information upon receiving a SPIN code, then, according to the information available to the "Map of Science", today's corresponding member may well be listed as an associate professor and candidate of sciences.

A selective acquaintance with the data of the "Map of Science" shows that for organizations that have reconciled the composition of their Organization-Enhanced profile in the Web of Science and are registered in the Science Index (organization) system, the information presented coincides fairly well, while for organizations that have no Organization-Enhanced profile in the Web of Science there are quite a few errors. In principle, these errors cannot be completely corrected either by the developers or by the ministry: this requires the participation of the scientists themselves and of representatives of the organizations.

None of the critics paid attention to much more significant points: the lack of joint analysis of the RSCI and Web of Science data (for the sake of which all this work was started), as well as the complete lack of clarity about the further fate of this project. It is not clear how, on what basis, and with what frequency the data will be updated. Will regular intervention by scientists be needed to verify the data? Or could it, ideally, be organized by synchronizing information in the authors' profiles in Science Index and ResearcherID or ORCID, fields for which are provided when a user registers in the "Map of Science"?

What has been done works technically: graphs are built, the available data is analyzed within the capabilities implemented to date, and even in this state the services and a unified data warehouse for publications, patents, grants, and so on will be very useful. Provided that the data is reconciled and regularly updated, the system can become a universal supplier of data for various reports, but for analytical functions its capabilities are still clearly insufficient.

So what should be done with the "Map of Science" now: correct it or ignore it? Whose recommendations should be followed, the ONR's or the Commission's? The existing data errors are inevitable; they need to be corrected, and this cannot be done without the participation of scientists and organizations. However, all this makes sense only if there is a clear answer to the questions about further data updates and further refinement of the analytical capabilities of the "Map of Science".

Andrey Tsyganov, member of the ONR:

In my opinion, the list of main complaints about the project "Map of Russian Science" speaks precisely of the success of this project at the present stage. Indeed, in this project you can find:

  • different information about the number of publications and citations of a scientist in WoS, RSCI and "Map of Science";
  • registration of the Russian and translated versions of articles in the personal card of the scientist;
  • incorrect information about organizations in general, for example, the number and composition of employees;
  • a list of all, including non-core, scientific areas in information about institutes, etc., etc.

All this is a consequence of the fact that the developers set themselves the goal of automatically collecting information and placing all the collected data in one place. Not being experts, they do not take responsibility for correcting, processing, and interpreting information officially obtained from three databases: the RSCI, the Web of Science and, as planned, Scopus. The fact that the developers are so careful with every grain of information and simply collect everything in one place is the main achievement of this project at the present time.

The main question is what exactly the community of scientists trusts, relatively speaking, the "officials" to do, and what it does not trust them with, that is, what scientists should do only themselves, trusting no one else.

If we say that "officials" must independently correct information about our publications and our citations, this means that "officials" have the right to do so without relying on the opinion of the scientists themselves, or of the institutions, or of experts. Should "officials" be given such a right?

For example, if we require "officials" to remove duplicates, that is, we give the "officials" the right to decide for themselves which version of an article, Russian or English, should be on the site, then we trust the "official" to decide for himself which publication is important and which is not, without taking into account the opinion of either the author of the publication or the representatives of the institute where the work was performed.

If we demand that "extra" authors be removed from the list of authors of an institute's publications, this means that we give "officials", and not the official representatives of institutes, the right to decide for themselves who worked there, who was invited to work temporarily, and who did not work at the institute at all. But then do not be surprised if the "official" instantly extends this right into the future and decides for himself who will work in this institute and who will not.

Likewise, if we do not want "officials" to have the right to tell institutes what scientific areas to pursue, then we cannot give them the right to decide what scientific areas the institute has been engaged in over the past five years. Let them automatically enter all the areas indicated in the publications of authors from this institute, in alphabetical order and nothing more! Even the right to prioritize these areas, rather than list them alphabetically, should belong only to representatives of the institute, not to "officials" or anonymous experts.

So the main question for the "Map of Science" is not about the "curves" of the data, but about who will process, update and interpret this data. In my opinion, “officials” cannot be given unnecessary rights. Processing, analyzing and interpreting open data from publications is too dangerous a tool to voluntarily pass it into the hands of non-specialists.

And the point here is not the "Map of Science" itself; without it, they will "evaluate" us using the Web of Science, the RSCI, Scopus, or all the professional databases combined. The point is not the data, but the rules for processing and evaluating information on publications and, most importantly, the justification of the legitimacy of applying particular evaluations in various fields of knowledge.

Therefore, the community of scientists, without being distracted by trifles, should first of all develop strict, detailed recommendations on the methodology for processing and interpreting publication data in various fields of knowledge, recommendations based not only on references to the great past and understandable to society as a whole, before others do it for them. There is no more time to fight windmills; the time has come to answer the same eternal question: who has the right to what, and what will have to be paid for these rights?

Alexey Ivanov, member of the ONR:

My main complaint is not about the technical shortcomings of the "Map of Science", of which there are a lot at the moment, but about a more fundamental thing: the question of who should have the right and the ability to correct the initial data. It is quite rightly noted that this can only be done by the institutions themselves and the scientists themselves. In fact, the term "Map of Science" first appeared in one of the appeals of the Society of Scientists (ONR) to the then newly appointed Minister Livanov. In that appeal, the ONR noted that such a map can only be built from below, and in order to encourage scientists to fill in data about themselves, it proposed announcing a competition for personal scholarships, awarded automatically when a certain threshold is exceeded; the threshold was not known a priori but was estimated at 5-7 publications over three years, with the expectation that about 10 thousand people would be rewarded (http://onr-russia.ru/content/grants-stipends-3 072 012). In this case, a "win-win" situation would arise. Scientists would be offered a carrot, and in return the Ministry of Education and Science would receive a verified, albeit not entirely complete, database of publications with exact links to individuals. Unfortunately, the ministry took a different path. It began to build the "Map of Science" from above. As a result, we came back to where we started: the "Map of Science" can only be filled in from below. However, no carrot is foreseen for scientists, while the prospect of being beaten with a whip is believed immediately and irrevocably.

In this situation, I personally find myself in a state of distinct cognitive dissonance. On the one hand, I have always said that reforms should begin with an assessment of the real state of affairs, and the "Map of Science" is a step in this direction. Accordingly, as a member of the ONR Council, I share the view that the data in the "Map of Science" must be brought into line with reality, and no one except the scientists themselves can do this. On the other hand, the reform is already under way, and life experience shows that the authorities do not care about the real state of affairs. The main thing for them is to observe a certain formality for the outside public. Is there a map? There is. Did the experts discuss it? They did. Did the scientists make corrections? They did. And the quality of the "Map of Science" itself is a tenth-order matter. Therefore, I perfectly understand the recommendation of the Public Oversight Commission not to have anything to do with the "Map of Science" until it is clear how it is planned to be used.

She promised that the "Map" "will be the main, systemic tool for making decisions in different areas."

A few days ago, the "Map of Russian Science" started working on the Internet, and scientists began to look for themselves in this "guide" and find amazing things.

"This 'Map' is an utter disgrace. The 'Map' gives out information that is completely untrue. Do you know where, according to the 'Map', our best astronomers work? At the Novosibirsk Institute of Nuclear Physics, which has never done any astronomy in its life.

And at the Landau Institute for Theoretical Physics they do everything in the world, up to and including biology."

The version of the system that the academics criticized in this way is a test version; this is written in red and white on the website of the "Map of Russian Science" itself. Currently, the official developer of the system is PricewaterhouseCoopers, which, under its contract, purchases data for the "Map" mainly from two sources: Thomson Reuters (the Web of Science database of scientific articles) and the Russian Science Citation Index. These two private companies supply data on Russian scientists under formal contracts.

What has been done so far is the initial automatic processing of the data, an attempt to facilitate their further manual verification.

The claims made against the system stem from technical complexities. "Many claims are made by people who have not looked into it and believe that the system should list all their publications for their entire careers. This is not the case at all; for now we are talking about 2007-2012. In addition, the accuracy of the data for 2007, due to technical limitations of WoS, often does not allow authors to be correctly matched with the organizations in which they work," a source close to the Ministry of Education and Science told Gazeta.Ru.

In addition, the data from these databases themselves are based on information received from scientific journals, including Russian ones, many of which do not always format publications correctly.

The source also explained where FIAN's "vegetable growing" comes from.

The fact is that the Web of Science and the RSCI assign subject areas to journals, not to individual publications. Therefore, each publication receives the subject area assigned to its journal. In the "Map", subject areas are assigned to organizations according to the subject areas of their publications; therefore, a single paper by a scientist published, say, in a journal with an agricultural focus is enough for the whole of FIAN to be assigned the corresponding research area.
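
To make the mechanism concrete, here is a toy sketch of that journal-level assignment; the journal names, rubrics, and records are invented for illustration and are not the Map's actual data.

```python
# Toy illustration of journal-level subject assignment: every paper inherits its
# journal's rubric, and an organization is tagged with every rubric that appears
# among its papers, so a single off-topic publication tags the whole institute.
from collections import defaultdict

journal_rubrics = {                      # hypothetical journal-to-rubric table
    "Physical Review Letters": "Physics",
    "Astronomy Reports": "Astronomy",
    "Vegetable Growing Herald": "Agriculture",
}

papers = [                               # invented publication records
    {"org": "FIAN", "journal": "Physical Review Letters"},
    {"org": "FIAN", "journal": "Astronomy Reports"},
    {"org": "FIAN", "journal": "Vegetable Growing Herald"},  # the single outlier
]

org_rubrics = defaultdict(set)
for paper in papers:
    org_rubrics[paper["org"]].add(journal_rubrics[paper["journal"]])

print(sorted(org_rubrics["FIAN"]))  # ['Agriculture', 'Astronomy', 'Physics']
```

Assigning rubrics per publication, or weighting them by their share of an organization's output, would avoid tagging a physics institute as an agricultural one, but that is not how the upstream databases classify content.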

Most of the allocated 90 million rubles goes to purchasing data from WoS and the RSCI; in addition, data is provided by the Russian Book Chamber and FIPS. Those who fear that the "Map" data will soon be used in evaluating scientific organizations are reassured by the system's developers: "When organizations report, they will need to submit not the data of the 'Map of Science', which is not ready yet, but data directly from the Web of Science and Scopus."

According to the source, the correct operation of the "Map" will also depend on the scientists themselves, for whom an option is provided that allows them to correct their own profiles in the system and report any discrepancies or gaps they notice. "The success of this process will depend on what incentives scientists have to use the 'Map'. For example, the profiles of scientists in the system could be used when applying for grants to simplify the preparation of applications," said the source.

True, such explanations do not really satisfy the offended scientists. "You shouldn't treat people like that. Why do we have to rake through their... I won't say what. Do you see how it works? It works because it was created by outstanding people solving a most difficult task. This is a task for mathematicians, applied specialists, those who analyze databases, and not for those who got together and decided to somehow do everything," Parshin said.