Emerging AI Tools for Literature Review: Comparison of GenAI Tools

This guide consolidates the teaching materials for the library workshop Emerging AI Tools for Literature Review.

Comparison of GenAI Tools

(updated 15 Dec 2024)

Update log:

  • 13 Dec 2024 - Perplexity released a new feature, Spaces, which lets users specify which sources (files, websites) to search within
  • Nov 2024 - Primo Research Assistant upgraded its model to GPT-4o-mini
  • Sep 2024 - Elicit added a feature for screening papers for systematic reviews; Primo and Web of Science both released their AI Research Assistants
  • 28 Aug 2024 - Scite upgraded its model to GPT-4o
  • 26 Jul 2024 - Elicit removed the 5,000-credit limit; the Basic Plan is now unlimited for everyone
  • 3 Jul 2024 - Elicit added Advanced Search (search by journal, author, DOI, OA status, citation count, etc.)
  • 26 Jun 2024 - GPTs (e.g. Dimensions GPT, Consensus GPT, SciSpace GPT) are now free to all ChatGPT users

Comparing Scite, Elicit, Consensus, Scopus AI in more detail

The comparison below covers the four tools' capabilities in generating literature reviews. Learn more from this post: Trust in AI: Evaluating Scite, Elicit, Consensus, and Scopus AI for Generating Literature Reviews (20 Mar 2024)

Access & Fees
  • Scite (Assistant): HKUST access (subscribed by the Library)
  • Elicit: https://elicit.com (5,000 free credits, lifetime)
  • Consensus (Copilot): https://consensus.app/ (20 credits per month for advanced AI features)
  • Scopus AI: currently on trial (until end of Mar 2024)

Role in assisting literature review
  • Scite (Assistant): AI chatbot that generates a literature review with real citations in context
  • Elicit: AI research assistant that generates a literature matrix and a review with real citations
  • Consensus (Copilot): AI academic search engine that generates a literature review with real citations
  • Scopus AI: AI feature add-on in Scopus that generates a literature review with real citations from Scopus

Data sources for training the models
  • Scite (Assistant): full text from OA articles; citation statements from non-OA articles (under publisher agreements); CrossRef
  • Elicit and Consensus (Copilot): abstracts from Semantic Scholar (200M papers)
  • Scopus AI: abstracts from Scopus (94M works)

Unique features
  • Scite (Assistant): provides context for citations (supporting, contrasting, or mentioning); evidence from non-OA articles; extracts original text from papers for source verification
  • Elicit: literature matrix with self-defined info columns (e.g. sample size, population, study design)
  • Consensus (Copilot): "search first, AI second"; labels articles with study type, journal reputation, etc.; extracts original text from papers to answer the question; "Consensus Meter" for Yes/No questions showing the distribution of supporting and contrasting evidence; "Study Snapshot" that extracts sample size, population, etc.
  • Scopus AI: topic experts (most productive authors in the field); concept map; foundational papers (most cited relevant papers)

No. of references included in the summary
  • Scite (Assistant): ~10
  • Elicit: 4 (8 for Plus users); the literature matrix can include more articles
  • Consensus (Copilot): 10
  • Scopus AI: 5-8

Search filters
  • Scite (Assistant): publication year; journal; no. of papers to consult
  • Elicit: publication year; study type; contains keywords in abstract; has PDF (OA)
  • Consensus (Copilot): publication year; study type; OA; journal SJR quartile; domain
  • Scopus AI: not available

Export references
  • Scite (Assistant): export to CSV, RIS, BibTeX
  • Elicit: export to CSV, RIS, BibTeX (Plus users only)
  • Consensus (Copilot): export as CSV; cite as APA, etc.; BibTeX
  • Scopus AI: export to CSV, RIS, BibTeX, plain text

Save chat history
  • Scite (Assistant): auto save
  • Elicit: auto save
  • Consensus (Copilot): need to save the search and retrieve it later
  • Scopus AI: not available

© HKUST Library, The Hong Kong University of Science and Technology. All Rights Reserved.