Microsoft Academic 2.0: Is It Any Better?

Microsoft released Microsoft Academic 2.0 in 2016 as its new academic search engine. According to its website, Microsoft aims to streamline and refine the search process for researchers by taking better advantage of semantic search. It provides researchers with points of reference that include authors, publications (including conference presentations), and subject matter. The tool is built on Microsoft’s own Bing search engine and touts the ability to “discover and index” new information as the amount of available data grows (currently, more than 210 million entries). The application also uses the Microsoft Academic Graph (MAG) to show citation relationships among publications and authors. Through an additional user-friendly interface called the Academic Knowledge API, the user can combine the indexing power of Bing with MAG to retrieve histograms of related publications, journal articles, presentations, and authors.
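For illustration, the histogram retrieval described above can be scripted against the API’s REST interface. The snippet below is a minimal sketch in Python, not a definitive implementation: the endpoint URL, the calchistogram method, the attribute codes (Y for publication year, CC for citation count), and the response shape are assumptions based on the API documentation of the time, and the subscription key is a placeholder.

```python
# Minimal sketch of a histogram query against the Academic Knowledge API.
# Endpoint, parameters, and response fields are assumptions from the public
# documentation; replace YOUR_SUBSCRIPTION_KEY with a real key.
import requests

API_BASE = "https://api.labs.cognitive.microsoft.com/academic/v1.0"  # assumed base URL
HEADERS = {"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"}     # placeholder key

def year_histogram(expression: str, count: int = 10) -> dict:
    """Ask the calchistogram method for frequency distributions over
    publication year (Y) and citation count (CC) for matching papers."""
    response = requests.get(
        f"{API_BASE}/calchistogram",
        params={"expr": expression, "attributes": "Y,CC", "count": count},
        headers=HEADERS,
    )
    response.raise_for_status()
    return response.json()

# Example: publications tagged with a field of study, published after 2010.
result = year_histogram("And(Composite(F.FN=='bibliometrics'), Y>2010)")
for hist in result.get("histograms", []):
    print(hist["attribute"], [(e["value"], e["count"]) for e in hist["histogram"]])
```

The query expression syntax (Composite, And, and the attribute codes) is MAG’s structured query language; the exact grammar is covered in the API documentation.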

What Others Say

According to Sven E. Hug and Martin P. Brändle in their recent blog post, “Microsoft Academic 2.0 might outperform other commonly used search engines such as Google Scholar, Scopus, and Web of Science.” Given its ability to perform semantic searches and analyze citations, the authors describe the new application as a potential “bibliometric superpower.” They suggest that the semantic search engine yields more accurate results, and that as users become more familiar with semantics-based queries, they can refine and filter their searches to streamline the results. As mentioned, the user must subscribe to the Academic Knowledge API ($0.25 per 1,000 queries) to take full advantage of the application’s capabilities. The API provides citation counts and frequency distributions drawn from the database’s well-structured metadata.

In short, the authors suggest that Microsoft Academic 2.0 combines several important, user-friendly, and relatively inexpensive features from various other academic research search engines such as

  • broad coverage, as with Google Scholar;
  • structured and rich metadata, as with Scopus and Web of Science; and
  • a social network for academics, as with ResearchGate.

Comparisons to Other Research Applications

In her November 2016 blog post entitled “Microsoft Academic: Is the Phoenix Getting Wings?”, Professor Anne-Wil Harzing compared Microsoft Academic 2.0 with three other research search engines. Harzing is a professor of international management at Middlesex University, and one of her research focuses is the quality of academic research. She sampled 145 academics with careers in one of five disciplines (life science, general science, engineering, social science, and humanities) and compared the new application’s query results and coverage with those of Google Scholar, Scopus, and Web of Science, looking at citations, coverage, and h-index information.

Microsoft Academic’s citation counts were similar to those of Scopus (97 percent) and Web of Science (108 percent) but well below those retrieved through Google Scholar (only 59 percent). Its coverage is broader than that of both Scopus and Web of Science, though it does not match Google Scholar’s. For the h-index, a measure of an author’s impact based on their most-cited papers, Microsoft Academic 2.0 was comparable to Scopus, reached only 77 percent of Google Scholar’s value, and again outranked Web of Science at 113 percent. When comparing citation numbers, Professor Harzing noted that Microsoft Academic includes only citations that can be validated and drops those deemed not credible. Even so, its validated citation counts were higher than those of both Scopus and Web of Science and similar to those of Google Scholar across all five disciplines. In fact, Microsoft Academic’s citation counts were 12 percent higher than Google Scholar’s in the life sciences, but much lower (by 69 percent) in the humanities. The new application thus appears to perform strongly in the biological sciences but to lag in the humanities, which is worth considering when you perform your searches.
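As a side note, the h-index mentioned above is straightforward to compute from a list of citation counts: an author has index h when h of their papers each have at least h citations. A minimal, self-contained sketch (the function name is mine, chosen for illustration):

```python
def h_index(citations: list[int]) -> int:
    """Return the h-index: the largest h such that the author has
    at least h papers with h or more citations each."""
    # Sort citation counts in descending order, then take the last
    # position where the count is still >= its 1-based rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers cited [10, 8, 5, 4, 3] times give an h-index of 4,
# because four papers have at least four citations each.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

Because the databases index different sets of papers and citations, the same author can have a noticeably different h-index in each service, which is exactly the variation Harzing measured.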

Limitations and Resolutions

According to Hug and Brändle, these findings are somewhat disappointing, because more would be expected from a database that draws the majority of its data from web pages. They presume that the Microsoft database was compiled from feeds supplied by the largest publishers, after which the metadata was indexed; with continued revision and growth, however, the application should offer broader coverage. As is common, new programs and applications are rarely free of initial glitches, and Microsoft Academic 2.0 is no different. At launch, the application provided no tutorials or instructions, and many first-time users were confused by the small icons and symbols because they were nowhere defined. Many were also unaware that a string of words or a phrase could be used instead of a one- or two-word query.

With regard to data quality, most journal articles were fairly consistent in listing the number of authors, but only about 89.5 percent carried the correct publication year. In addition, conducting semantic queries takes some practice. With more than 50,000 fields of study in the database, building queries around traditional research fields instead of learning the conventions of semantic queries may prove time-consuming rather than helpful; a sketch of how such structured queries can be generated appears below.
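To give a flavor of what a semantic query looks like, the API exposes an interpret method that maps free text onto structured query expressions over those fields of study. As with the earlier sketch, the endpoint, parameters, and response fields here are assumptions based on the documentation of the day, and the key is a placeholder.

```python
# Sketch of the "interpret" method, which converts a natural-language query
# into candidate structured query expressions. Endpoint and response fields
# are assumptions; replace YOUR_SUBSCRIPTION_KEY with a real key.
import requests

API_BASE = "https://api.labs.cognitive.microsoft.com/academic/v1.0"  # assumed
HEADERS = {"Ocp-Apim-Subscription-Key": "YOUR_SUBSCRIPTION_KEY"}     # placeholder

def interpret(natural_query: str, count: int = 3) -> list[str]:
    """Return candidate structured expressions for a free-text query."""
    response = requests.get(
        f"{API_BASE}/interpret",
        params={"query": natural_query, "count": count},
        headers=HEADERS,
    )
    response.raise_for_status()
    return [
        rule["output"]["value"]
        for interp in response.json().get("interpretations", [])
        for rule in interp.get("rules", [])
    ]

# "papers about bibliometrics after 2015" might come back as something like:
#   And(Composite(F.FN=='bibliometrics'), Y>2015)
print(interpret("papers about bibliometrics after 2015"))
```

Feeding the returned expression into the evaluate or calchistogram methods then retrieves the matching papers or their distributions, which is the workflow the blog authors describe.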

To help researchers use this new application, Microsoft is integrating a social network for academics so that they can find helpful information and guidance from colleagues and other users. Be sure to keep checking the Microsoft Academic 2.0 website for updates and instructions.
