The physical space of the MSK Library is permanently closed to visitors as of Friday, May 17, 2024.
The chatbot ChatGPT is one of a growing number of generative AI tools that use natural language processing, a domain of AI, to respond to user queries. As these tools have gained attention, so have concerns about their use in academic, scientific, and other settings.
The information provided by ChatGPT is not always accurate.
Using ChatGPT can also lead to AI-plagiarized writing, and tools that claim to detect AI-written content have been found to be unreliable.
On a larger scale, ChatGPT and other AI tools use unsustainable amounts of energy and fresh water.
Generative AI tools like ChatGPT promise to cut down the time it takes to complete a review, and they can! But you have to know the strengths and limits of the tool you are using.
AI can be helpful for generating ideas that you then vet and adapt. For example, ChatGPT might help you shape your research question, brainstorm inclusion and exclusion criteria, and identify terms for your search strategy. What it cannot do is build a systematic search strategy. It may also suggest citations to articles that do not exist, a failure known as a hallucination.
AI can also be built into software. For example, Covidence, a systematic review tool used at the MSK Library, uses machine learning to sort records by relevance and to classify randomized controlled trials. Covidence is also beta testing a feature that uses team-entered inclusion and exclusion criteria to auto-remove records. PubMed uses AI to sort results by best match and to index records.
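To make "sort records by relevance" concrete, here is a minimal sketch in Python. This is not Covidence's or PubMed's actual algorithm (those are proprietary machine-learning systems); it is a toy keyword-overlap score, invented for illustration, that ranks records by how many of a team's criteria terms they mention.

```python
# Toy illustration of relevance-ranking records against review criteria.
# NOT the actual Covidence or PubMed algorithm; a hypothetical
# keyword-overlap score to show the general idea of machine-assisted sorting.

def relevance_score(record: str, keywords: set[str]) -> float:
    """Fraction of the criteria keywords that appear in the record text."""
    words = set(record.lower().split())
    return len(words & keywords) / len(keywords)

def sort_by_relevance(records: list[str], keywords: set[str]) -> list[str]:
    """Return records ordered from most to least relevant."""
    return sorted(records, key=lambda r: relevance_score(r, keywords),
                  reverse=True)

# Hypothetical screening criteria and candidate titles:
criteria = {"randomized", "controlled", "trial", "adults"}
records = [
    "A cohort study of dietary habits in children",
    "A randomized controlled trial of exercise in adults",
    "Randomized trial of a new drug in adults",
]
ranked = sort_by_relevance(records, criteria)
```

Real systems learn relevance from reviewer decisions rather than fixed keyword lists, which is exactly why a human must still check the output instead of trusting the ranking blindly.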
Before using something, explore its parameters, and be skeptical about anything that seems too good to be true or removes all human intervention from a task.
Below is a selection of journal and publisher guidelines and policies on ChatGPT and similar AI tools. This is not a comprehensive list; always read the guidelines of your journals of interest.