A Critical Look at NotebookLM

NotebookLM, Google’s AI note-taking app, is freely available for students to use, but should they?
by LSA Learning & Teaching Technology Consultants

Given all of the hype and discourse surrounding generative AI in recent years, it is imperative that we as educators grasp not just the opportunities and problems AI creates, but also the specific tools students have access to. Examining these tools critically and understanding how they work is vital to navigating this new educational landscape. One such tool is NotebookLM, made freely available by the university. NotebookLM uses Google’s Gemini models to create a custom note-taking assistant grounded in user-supplied sources and links. It ingests documents, slides, and multimedia uploaded by users, then creates summaries with citations, answers questions, and produces multimedia materials about the subject. It is intended to be a study assistant that helps make connections between source documents, organize thoughts, and generate new ideas. Our initial research into using NotebookLM has been both intriguing and discomforting. Before we get into the research and its implications, however, it is important to know what the tool is capable of.


To test the model’s capabilities, we created a few different notebooks within NotebookLM to see what it would produce. In one notebook, we uploaded fifty documents covering the readings for a single course. In another, we placed two textbooks from the same subject to make connections between them. And in the last, we input links to pages and websites related to LSA Technology Services. Each notebook’s dashboard presents three panes: the source materials, the chat, and the studio. The source panel contains the ingested materials as well as an option to “research” new items by writing a short prompt and allowing Gemini to find new materials to add to the notebook. For example, in our textbook notebook, we asked it to generate a deep research report on the overall discipline; it came back with a lengthy report containing over thirty citations, along with twenty new sources it suggested adding to the notebook. The chat panel works similarly to other LLM chat interfaces, but its responses are grounded in the materials ingested into the notebook. At the bottom of the panel, it suggests prompts for drawing connections between the source materials. Responses contain links to specific citations in the source documents and can be saved as notes for future reference. Finally, the studio panel contains generative options that create guides and multimedia based on the source documents. There we were able to generate AI-voiced podcasts, video overviews, flashcards, quizzes, comparative analysis reports, short-answer practice exams, and more. Each item could be either generalized across all of the sources or specifically targeted by a user-entered prompt. One of the most powerful items the studio can generate is a mind map: NotebookLM plots the topics and categories from the source documents and links them together in a map that can navigate to the specific sections of the sources related to each topic or category. New items are being added to the tool regularly, such as the recently added infographic and slide deck options.


Evaluating NotebookLM’s output revealed intriguing and discomforting results in equal measure. Since the model draws its information from the source documents and provides direct citations, the hallucination rate feels much lower than that of other LLMs, though that impression would require research to prove. The podcasts, blog posts, and videos produced by the studio panel were surprisingly engaging, but required careful prompting to be anything more than superficial summaries of the materials. We encountered the most errors in the notebook built from websites, since the model filled in information and policy gaps not present on the site with generic-sounding but nonetheless incorrect information. When investigating the responses in the chat panel, we asked a handful of instructors to evaluate the answers that NotebookLM generated to short-answer questions commonly asked on take-home exams. The consensus was that the answers contained all of the relevant connections needed for a passing grade, but lacked the deep analysis or novel ideas that would have been necessary for full marks. More research would need to be done in other disciplines to see whether this holds more broadly; if you are interested in lending your voice, feel free to reach out to us to conduct further research. Additionally, since the models’ writing depends so heavily on the source materials, many of the tell-tale signs of AI writing were not present in the answers. The writing is more formal and factual, containing almost none of the promotional flourishes and overemphasis we see in other models. One distinct place where NotebookLM does not meet expectations is in its generated infographics and slide decks. While these features are still in beta, they require precise prompting to achieve mediocre and inauthentic results that are often filled with typos and incorrect information.


The implications of using NotebookLM are more difficult to quantify than those of other AI tools, simply because it does more than other AI tools and works solely from the documents and sites ingested into it. On one hand, it appears to be an even more efficient way of producing dishonest work that is harder to detect, thanks to the direct citations and the absence of tell-tale AI language. And currently, the multimedia it produces offers no valuable insights and is prone to errors. On the other hand, as we have tested NotebookLM, it has shown itself to be a powerful tool for grasping a subject. The mind map and quiz functions are helpful ways of organizing and reinforcing course materials, and the answers the chat window gives are detailed and link directly to the source material for verification. In the wrong context, this could be an easy way to dishonestly finish a take-home exam or research paper. In the right context, it can be an invaluable resource for students just starting in a new discipline or those looking to find links across a variety of sources.


If you would like to speak with an instructional consultant about using NotebookLM, or about AI usage in the classroom, you can request a consultation.


Release Date: 03/12/2026
Category: Learning & Teaching Consulting; Teaching Tips
Tags: Technology Services

TECHNOLOGY SERVICES

G155 Angell Hall, 435 South State St, Ann Arbor, MI 48109–1003
734.615.0100
LSATechnologyServices@umich.edu 

Technology Services Contact Center Chat