AI tools for research

In this guide, you will find a list of artificial intelligence (AI) tools based on large language models (LLMs). The guide summarizes selected tools, ranging from general chatbots to specialized tools for literature search, mapping, and text analysis, as well as tools for data analysis. The aim is not to provide an exhaustive list for each category, but to offer a curated selection based on each tool’s apparent quality and accessibility. Keep in mind that LLMs can sometimes hallucinate and that they generally work best in English. It is therefore advisable to approach the tools and their outputs critically. If you are interested in using AI for academic writing, visit our Tools to support writing guide.

Need help with something? Have a recommendation for an AI tool? Contact us by email or schedule a consultation.

Artificial intelligence is transforming the way we approach scientific knowledge. It opens up new possibilities for data analysis, resource discovery, and personalized support. At the same time, however, it brings with it challenges related to data protection, transparency, reproducibility of outputs, issues of authorship, and ethical and academic guidelines.

Ethical guidelines for working with AI

As a general rule, it is important to use artificial intelligence tools transparently and to acknowledge the use of such tools or, where appropriate, cite them properly. This is especially true when working with tools that have an integrated chatbot. The European Commission’s “Living guidelines on the responsible use of generative AI in research” set out basic recommendations for the use of AI:

  • The user (student, researcher) is ultimately responsible for the AI outputs used.
  • When working with AI tools, it is important to be aware of their limitations (inaccuracy, bias, hallucinations, sycophancy).
  • AI should be used transparently, especially in cases where AI has significantly influenced your output.
  • Data security must be ensured and sensitive data safeguarded.

How do I acknowledge AI use?

Acknowledging AI use may be required in cases where you have relied on an AI tool for language corrections, stylistic editing, or brainstorming. However, there is no across-the-board rule regarding this. Always check your university or publisher’s AI policy or guidelines. If your institution does not provide a template for acknowledgement statements, you could use the following template provided by the University of Newcastle instead. This is just one example of what AI acknowledgement may look like:

"I acknowledge the use of ChatGPT (https://chat.openai.com/) to provide a background summary of the essay topic that I used to inform my basic level of understanding. I also generated a list of synonyms to help me expand my search and suggest some key articles on the topic, which were searched for in Library Search."

When and how do I cite AI outputs?

If you include any AI outputs (images, graphs, code, text excerpts) in your text, you must clearly acknowledge the AI tool that generated the output by means of citation.

The citation should (at a minimum) include the following basic information:

  • Name of the AI tool
  • Model/version
  • Prompt
  • Date of tool use
  • Link to tool/chat

Below you will find citation templates for commonly used citation styles.

“How to cite AI?” prompt. ChatGPT, GPT-4o, OpenAI, 31. 7. 2025, https://chat.openai.com/chat.
OpenAI. (2025). ChatGPT (March 14 version) [Large language model]. https://chat.openai.com/chat.

For the following footnote-style example, it is recommended to include only a footnote without a bibliographic entry.

OpenAI. Response to “Tell me how to fix a flat bicycle tire.” ChatGPT-4, September 30, 2024. https://chatgpt.com/share/66fb0ff3-7280-8009-93a9-d956f412390b.
OpenAI. ChatGPT. Online, generative AI. Version GPT-4o mini. 15. 5. 2025. Available from: https://chatgpt.com/ [cit. 2025-05-16].

Selected institutional policies

Before using AI for academic work, you should always check your university/faculty or academic journal publisher's policies first to ensure that you are maintaining academic integrity and not violating any applicable rules.

AI chatbots

AI chatbots generate texts based on your input according to internal parameters learned during training on large amounts of data and organized according to statistical probability. In addition to information search, which is still far from perfect, they can help with writing, language editing and text analysis, programming, and other tasks.

We do not recommend relying entirely on the accuracy of information generated by any chatbot. Ideally, check the original information sources cited and draw your own conclusions directly from them. In addition to occasional hallucinations, these tools may also have difficulty understanding more complex queries, have limited knowledge of recent events, or provide irrelevant sources. Because they are trained on whatever data is available, chatbots may produce outputs of varying quality across fields.

If you decide to use text generated by a chatbot, make sure that you are not violating the rules of your institution, especially with regard to plagiarism and proper citation.

For more information, please feel free to arrange a consultation or contact us by email.

When to use generative AI?

  • Preparing to search for academic resources: creating keywords and search queries using logical operators (e.g., ("machine learning" OR "deep learning") AND diagnostics)
  • Text summarization: quick summarization of extensive scholarly articles or other documents
    • This is not a complete replacement for critical reading. For more information on using artificial intelligence for text summarization, see our Reading for writing guide.
  • Language support: grammar correction, stylistic suggestions, or text translation
  • Structuring and ideas: suggestions for outlines of articles, presentations, or grant proposals; brainstorming
  • Individual support: quiz creation, pre-trained agents/educational features (Tutor Me, Study & Learn), study plan creation
  • Coding and programming: editing and structuring code; explaining chunks of code

When should I not use generative AI?

  • If university, faculty, or publisher rules prohibit it.
  • For activities that require high accuracy and verifiability (generating bibliographies and citations, gathering statistical data).
  • Whenever the process itself is the purpose of the work and cannot be replaced by chatbot output (e.g., learning to write your own code, completing a school assignment on your own, formulating your own research goals).
  • If you do not have sufficient knowledge to verify the correctness of the output. Keep in mind that you are the person ultimately responsible for all AI-generated outputs.

What to watch out for

  • Paywall and access to a model: Many chatbots are only freely available to a limited extent and may have daily limits. Make sure that the quality of the output from the version you're using is adequate and that the daily limit is sufficient for your needs.
  • Differences between versions of models: Generative AI chatbots (such as ChatGPT) offer different model versions that excel at different tasks (writing code, logical reasoning, writing coherent text). We recommend checking which version best fits your particular task. Chatbots from different companies (e.g., Gemini, ChatGPT, Claude) may also vary in output quality. Comparisons of tools are provided by so-called "language model benchmarks" that gauge performance, such as Humanity’s Last Exam or Epoch.ai.
  • Bias and hallucinations: Since generative AI tools are trained on huge amounts of data, poorer quality or biased data containing social stereotypes about race, gender, or other sensitive topics may have been used for training. Because such tools use predictive algorithms, they can hallucinate (that is, provide misleading or inaccurate information). Tools may also have censorship filters that avoid certain topics or respond in a “desirable” manner, however incorrect or misleading. In such cases, the responses may not reflect reality, but rather the values of the company operating the model.

What to consider when choosing a tool

  • Specific use: When choosing a chatbot, you need to know why you want to use it. Currently, commonly used chatbots offer similar features but output quality may vary. Therefore, always compare chatbots and verify the quality of outputs for your specific task.
  • Functionality, integration with other tools, and ability to work with uploaded documents: Test whether a chatbot works correctly in your language and whether its output quality is good enough. Test and compare individual chatbot features, such as: access to the internet, ability to write code in a desired language (Python, R, Java), ability to work with uploaded documents, and integration with cloud storage (e.g., OneDrive, Google Drive).

The lmarena.ai website can help you decide which chatbot to use by comparing the responses of any two models to a selected prompt. It also provides user rankings of models across various categories, such as text generation, web development, image creation, and more.

AI detectors

AI detectors are tools that attempt to determine whether a text has been generated by artificial intelligence, particularly by chatbots based on large language models (LLMs). It is important to emphasize that any judgement about a text is indicative only, since detection is generally more reliable in English than in other languages and its accuracy can vary significantly. The quality of detection by individual tools can also change over time, depending not just on how the detectors evolve, but also on the evolution of the AI models themselves.

We recommend using AI text detectors as supporting tools but not as proof of AI use.

For more information, please feel free to arrange a consultation or contact us by email.

How AI detectors work

AI detectors combine several methods to analyze a text and determine the likelihood of AI (as opposed to human) generation:

Statistical analysis focuses on words used and their frequency of occurrence as well as sentence structure and other textual patterns. These variables are compared to texts written by humans. Natural language processing is also used to measure values such as perplexity (predictability of text) and burstiness (irregularity and alternation of short and long sentences). Lower values of both are often found in AI-generated texts.

Machine learning detection compares human-written and AI-generated texts using learned patterns and matching probabilities, and may also search for watermarks or traces left behind by AI (such as hidden metadata or non-printable characters).
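
To illustrate the kind of surface statistics described above, here is a minimal Python sketch of a burstiness-style measure (variation in sentence length). It is only an illustration of the principle, not a real detector: actual tools also compute perplexity with a trained language model, which is not shown here, and the function name and sample text are our own.

    import re
    import statistics

    def burstiness_proxy(text: str) -> float:
        # Crude sentence splitting; real detectors use proper NLP tokenizers.
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        lengths = [len(s.split()) for s in sentences]
        if len(lengths) < 2:
            return 0.0
        # Ratio of standard deviation to mean sentence length:
        # low values (uniform sentences) are more typical of AI-generated text.
        return statistics.pstdev(lengths) / statistics.mean(lengths)

    sample = ("Short sentence. Then a much longer, meandering sentence follows it, "
              "with several clauses. Tiny one.")
    print(round(burstiness_proxy(sample), 2))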

AI detector shortcomings

AI detection tools still struggle to achieve accurate and reliable results. The accuracy of the results is affected both by the detector selected and by the language of the analyzed text. Problems include:

  • False positives: situations in which an AI detector incorrectly identifies human-written text as generated by artificial intelligence.
  • False positives often have worse consequences than false negatives, especially for individuals who may face accusations of fraud and/or damage to their reputation.
  • At present, false positives tend to be more common among neurodivergent individuals and non-native English speakers. This is because they often use sentence structures different from texts used to train the detector.
  • Texts created before the advent of artificial intelligence are also often mislabeled.

Some texts produced by artificial intelligence may be easily spotted by humans. Given the way AI generates texts, it is possible to identify words and other linguistic patterns that an artificial intelligence tool significantly overuses. However, moderate use of such expressions does not necessarily mean that artificial intelligence has been involved.

AI tools for literature search

Tools in this category allow you to ask questions in natural language and get answers directly from the content of academic articles. Their goal is to quickly identify relevant publications and provide answers to your questions. Unlike literature mapping tools, they focus on searching for information from full texts and abstracts and providing specific answers or summaries. The tools draw mainly from freely accessible citation and bibliographic databases (e.g., OpenAlex, Semantic Scholar).
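
As an illustration of the kind of open databases these tools draw on, the short Python sketch below queries the public OpenAlex API directly. It is a minimal example, not part of any tool described in this section; the endpoint and field names reflect the OpenAlex works API as we understand it.

    import requests

    def search_openalex(query: str, per_page: int = 5) -> None:
        # Free OpenAlex works endpoint; no API key required.
        response = requests.get(
            "https://api.openalex.org/works",
            params={"search": query, "per-page": per_page},
            timeout=30,
        )
        response.raise_for_status()
        for work in response.json().get("results", []):
            print(work.get("publication_year"), "-", work.get("display_name"), "-", work.get("doi"))

    search_openalex("large language models systematic review")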

We recommend that you always verify any AI-generated results against the original sources and supplement AI searches with traditional library search tools (e.g., library cross-database discovery tools, Google Scholar, Web of Science, Scopus, and the like) to obtain a reliable and complete overview of relevant academic literature.

For more information, please feel free to arrange a consultation or contact us by email.

Scite_

Scite_ primarily focuses on how a given source has been cited. For a selected publication, it classifies citations according to whether they represent a neutral mention, support, or questioning of any part of it, or whether they are self-citations. It also alerts you to corrections or retractions issued after the article’s publication. In addition, based on a query, it searches for relevant literature to answer it. For greater transparency, it displays the specific paraphrased section from the source used.

Registered NTK patrons can access Scite_.
You are not required to create an account or log in to search Scite_. However, an account is needed to access many Scite_ features, such as notifications, assistant history, and dashboards. The first registration (or first login to an existing account) with any email address must be done from the NTK network (NTK-simple): create an account via Sign Up and log in. After that, the account will also work for remote login via Log In at Scite_.

  • Access: Registration recommended to access chat history (log in with ORCID)
  • Pricing: Freemium – access to the paid version with registration in the NTK network
  • LLM version(s) used: GPT 4o-mini, GPT o3-mini, Claude 3.5 Haiku
  • Content searched: Scholarly journals (based on publisher agreements), PubMed, open access articles
  • Key features: User customization of search strategy using filters, citation analysis of articles, AI assistant, table with extracted data (methods, results, etc.), browser and Zotero extension
  • Limitations: Varying article quality, limited access to articles

Elicit

Elicit searches for the most relevant sources and creates a structured report with a detailed description of the search strategy it used to find them. Furthermore, similar to SciSpace, it summarizes what it considers to be the most relevant articles based on your query and provides an extended list of sources through its "Find papers" function.

  • Access: Registration required
  • Pricing: Freemium
  • LLM version(s) used: Not specified
  • Content searched: Semantic Scholar, OpenAlex
  • Key features: Search and summary of relevant articles based on a query, report creation, query quality assessment
  • Limitations: Varying quality of articles found, record export only in the paid version, monthly/weekly limits in the free version

Perplexity

Finds and summarizes relevant scientific articles by combining results from the internet with content from academic databases. Provides citations directly in its response to a query. Can provide a quick overview of a topic or create a basic report.

  • Access: Limited access without registration, registration recommended for chat history
  • Pricing: Freemium
  • LLM version(s) used: Sonar (proprietary model), Claude, ChatGPT, Gemini
  • Content searched: Internet and academic databases (not specified)
  • Key features: Wide selection of models, citation sources, detailed search settings
  • Limitations: Daily quotas for free use of advanced features, content sources are not transparent, output accuracy is difficult to verify

SciSpace - Literature Review

Based on a natural language question or a keywords-and-operators query, SciSpace searches for a list of sources and generates a summary for each source retrieved. It compiles a short text on the query topic based on what it considers to be the most relevant sources. It also provides an extended overview of the list of sources with basic information (sample, methods, results).

Do you have any recommendations for other tools? Send us an e-mail.

AI tools for literature mapping

Mapping tools use academic literature metadata (i.e., citations, author[s], keywords, references, abstracts) to create citation graphs and visualize relationships between publications. Unlike literature search tools, they do not primarily integrate full texts.
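
As a minimal sketch of the underlying idea, the following Python snippet builds a tiny citation graph from hypothetical paper names using the networkx library and ranks the papers by how often they are cited; the actual tools build far larger graphs from database metadata (e.g., OpenAlex) and add visualization on top.

    import networkx as nx

    # Hypothetical citation data: an edge (A, B) means "paper A cites paper B".
    citations = [
        ("Smith 2021", "Jones 2018"),
        ("Smith 2021", "Lee 2019"),
        ("Brown 2022", "Jones 2018"),
        ("Brown 2022", "Smith 2021"),
    ]
    graph = nx.DiGraph(citations)

    # Rank papers by how often they are cited within this small collection.
    for paper, times_cited in sorted(graph.in_degree(), key=lambda item: -item[1]):
        print(f"{paper}: cited {times_cited} times in the collection")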

Due to limited integration with key academic literature databases, these tools cannot yet replace traditional literature search via library databases. We therefore recommend that you first create your own collection of literature via searches conducted using e.g., library discovery tools, Google Scholar, Web of Science, or Scopus and only then use mapping tools to help you navigate the literature and discover additional sources.

For more information, please feel free to arrange a consultation or contact us by email.

Inciteful

Inciteful is free and does not require registration. It maps connections between selected articles and also allows you to create a collection, suggesting similar sources or linking two selected publications based on their citations. Unlike other tools, Inciteful provides detailed information about the created literature collection, such as the most represented authors, journals, or institutions.

  • Access: No registration required
  • Pricing: Completely free
  • Methods used: Citation network analysis (machine learning), bibliometrics
  • Content searched: OpenAlex, Open Citations, CrossRef, Semantic Scholar
  • Key features: Paper Discovery, Literature Connector, record export (.ris, .bib)
  • Limitations: Functionality and depth of analysis limited when compared to other tools

Litmaps

Litmaps uses databases and freely accessible metadata to create citation maps. Unlike other tools, Litmaps visualizations can be annotated, edited, exported, and shared. Based on your collection of literature, it offers additional recommended sources. The features that distinguish this tool from similar tools are only available in the subscription version.

Open Knowledge Maps

A free tool for exploring and mapping literature based on shared citations, references, or keywords. Unlike other tools, it also filters literature into thematic clusters.

  • Access: No registration required
  • Pricing: Completely free
  • Methods used: Citation network analysis (machine learning), bibliometrics
  • Content searched: PubMed, Bielefeld Academic Search Engine (BASE)
  • Key features: Categorization of articles into topics
  • Limitations: Limited depth of analysis, recommended articles limited to 100 most relevant, does not recommend further reading

Research Rabbit

A completely free tool that suggests items based on articles you have added to your collection. Links between items are based on relationships (authors, citations and references, keywords), and the tool recommends other similar research in the field. Similar in function to the free version of Litmaps.

Do you have recommendations for other tools? Send us an e-mail.

AI tools for text analysis

Tools in this category help process and interpret text data using artificial intelligence. They allow you to quickly identify key topics, summarize large documents, compare different texts, explain terms, or reveal relationships between them. Some also search for additional sources.

Using text analyzers can save time, but the content must always be checked for accuracy. For more information on the usefulness of AI in text analysis, see our Reading for writing guide.

We also recommend checking whether you are allowed to upload a document to a tool. In this regard, you must comply with Copyright law (§ 39c, § 39d) as well as your institution's rules.

For more information, please feel free to schedule a consultation or contact us by email.

ChatPDF

ChatPDF provides a summary of any text that you upload as a PDF file or of a freely accessible article located via its URL.

  • Access: Registration recommended, limited access without registration
  • Pricing: Freemium
  • LLM version(s) used: GPT-4o
  • Knowledge base: Content of the uploaded document
  • Key features: PDF summarization, answers to questions about content, creation of PowerPoint presentations or flashcards based on a document
  • Limitations: Daily limit for free version – 20 messages per day

NotebookLM

AI research assistant from Google. Through an integrated chatbot, it allows you to summarize and analyze uploaded documents. Uploaded documents create a knowledge database for the model, significantly reducing the risk of hallucinations. In addition to text analysis, the tool can be used as a knowledge database for creating and organizing notes.

  • Access: Google account required
  • Pricing: Freemium (can be used free of charge, subscription version increases limits)
  • LLM version(s) used: Gemini 2.5 Flash
  • Knowledge base: User-uploaded documents
  • Key features: Summaries, notes, mind maps, interactive podcasts
  • Limitations: Document reading not that user-friendly (i.e., PDFs are converted to plain text), personal note export is complicated

SciSpace - Chat with PDF

SciSpace, in addition to literature search, offers summaries of entire articles or selected sections, for which it can also find additional sources or paraphrase them in various ways. It draws on freely available metadata and openly accessible publications on the internet. SciSpace also analyzes files that you upload yourself.

  • Access: Limited access without registration, registration recommended to preserve chat and search history
  • Pricing: Freemium
  • LLM version(s) used: GPT 3.5 (free version), GPT-4o (subscription version)
  • Knowledge base: Open access articles and metadata (OpenAlex, Semantic Scholar, Google Scholar), user-uploaded articles
  • Key features: Explanation or summary of selected sections of text, tables, images, recommended reading for selected sections of text, conversion of articles into podcasts, conversation with a collection of articles
  • Limitations: Quality of articles can vary, restrictions in the free version, advanced features only available with a subscription

Do you have any recommendations for other tools? Send us an e-mail.

AI tools for data analysis

Unlike general chatbots, the tools in this category are developed specifically for data analysis. They can help process large amounts of data efficiently, find connections within datasets, and facilitate dataset interpretation. You do not need knowledge of a programming language (Python, R) or statistical software to use them (that said, you do need such knowledge to verify their outputs).
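
If you do have basic programming skills, a few lines of Python are often enough to spot-check what such a tool reports. The sketch below is a minimal, hypothetical example (the file name and the column names "dose" and "response" are assumptions): it recomputes descriptive statistics and a simple linear fit independently of the tool.

    import numpy as np
    import pandas as pd

    # Hypothetical dataset: replace with the file you uploaded to the analysis tool.
    df = pd.read_csv("measurements.csv")

    # Recompute the descriptive statistics the tool reported.
    print(df.describe())

    # Recompute a simple linear fit between two columns the tool claims are related
    # (the column names "dose" and "response" are assumptions).
    slope, intercept = np.polyfit(df["dose"], df["response"], deg=1)
    print(f"response ~ {slope:.3f} * dose + {intercept:.3f}")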

When using data analysis tools, you must understand how the tool processes data, how the data is secured, and what happens to the data. Be especially careful if you intend to upload original or sensitive data to this kind of tool.

For more information, please feel free to arrange a consultation or contact us by email.

Julius.ai

This tool allows you to perform data analysis and statistical evaluation using natural language (without the need for programming). Users upload their own data files and generate descriptive statistics, visualizations, and regression models using text queries. A distinct advantage of this tool is the option to create and save steps for recurring analytical activities. According to its developers, data uploaded to the tool by users is not used to train the model (more information on data security here).

  • Access: Registration required
  • Pricing: Freemium
  • LLM version(s) used: OpenAI and Anthropic models
  • Knowledge base: User-uploaded documents (.xls, .csv, .json, SPSS file, .sql)
  • Key features: Descriptive statistics, visualization and regression models, automation using analytical blocks, option to connect to cloud storage
  • Limitations: Advanced features behind paywall, data security concerns

Powerdrill

PowerDrill.ai allows you to upload data files (CSV, Excel, TSV, PDF, Word) or connect to an SQL database and ask questions in natural language. The tool automatically converts your natural language query into analytical operations (e.g., filtering, comparison, trends). Unlike Julius.ai, it can generate reports and presentations based on data uploaded to it. The tool protects user data in accordance with GDPR and ISO 27001.

  • Access: Registration required
  • Pricing: Freemium
  • LLM version(s) used: Not specified
  • Knowledge base: User-uploaded documents (.xls, .csv, .tsv, .sql)
  • Key features: Data cleaning and analysis, creation of reports or presentations based on data, prompt templates
  • Limitations: Advanced features behind paywall (daily and monthly quotas), data security concerns

Do you have any recommendations for other tools? Send us an e-mail.

Further educational resources

Books at NTK

University guides

  • Tilburg.ai: A portal managed by Tilburg University. Offers an overview of recommended tools and a large number of guides on how to use artificial intelligence for academic work. It also includes examples of AI use in research and teaching, and recommendations for the ethical and effective use of generative AI.
  • Umělá inteligence UPOL: Palacký University Olomouc portal provides a comprehensive overview of artificial intelligence, its use in education and research, ethical recommendations, and potential risks.
  • Gen AI Guidebook: University of Geneva guide containing detailed information on generative AI, its ethical and security risks, principles for responsible use, and guides for creating effective prompts.
  • Reaktor AI: Jan Evangelista Purkyně University in Ústí nad Labem portal provides an overview of news, guides, and trends in the field of AI.
  • AI CUNI: Charles University in Prague guide offers up-to-date information on AI, including a list of training courses, webinars, and educational activities.

Blogs and video tutorials

  • Andy Stapleton: YouTube channel mapping new AI tools for academic work.
  • One Useful Thing (Ethan Mollick): Blog with commentary on the progress and limitations of artificial intelligence.
  • The Effortless Academic (Ilya Shabanov): A blog focusing on searching and processing literature using AI tools.

Online courses

  • Become a prompt engineer: A series of articles on effective prompting and various prompting techniques.
  • Ethics of AI: A course focusing on the ethics associated with the development and use of AI.
  • Elements of AI: An introductory course providing basic information about AI, its practical applications, and social impacts. The course is available in many languages.
  • Learn Prompting: A comprehensive guide to prompting, from basic to advanced techniques. It also offers an introduction to generative AI and its possible uses.

Do you have any recommendations for further educational materials? Send us an e-mail.


Editor: Adam Urban | Last modified: 23. 9. 2025, 19:09