
Hallucination Check: Verifying Sources in the Age of AI

This webinar will equip attendees with the knowledge and tools to navigate the challenges of AI-generated text, focusing on the risk of hallucinated citations: cases where an AI model confidently cites sources that do not exist or are fabricated. Such citations pose a significant threat to research integrity, as they can lead to flawed conclusions and undermine the credibility of academic work.

The webinar will explore the implications of citation chaining, where inaccuracies in one source can propagate through subsequent citations, compounding the impact of AI-generated errors. We will discuss practical strategies for identifying and mitigating these risks, including techniques for verifying source credibility, detecting patterns of AI-generated inaccuracies, and ensuring the accuracy of citations in academic research. This session is invaluable for researchers, students, and anyone seeking to enhance their critical evaluation skills in the digital age.
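For attendees who want a head start on one verification technique mentioned above, here is a minimal sketch of checking whether a cited DOI actually resolves, using the public Crossref REST API. The function name and the placeholder DOI are illustrative only and are not part of the webinar materials; the sketch assumes the third-party Python requests library is installed.

    import requests

    def doi_resolves(doi: str) -> bool:
        """Return True if Crossref has a record for this DOI.
        A quick sanity check only; it does not confirm that the
        cited title and authors match the record."""
        resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
        return resp.status_code == 200

    # Replace the placeholder with a DOI taken from an AI-generated citation.
    print(doi_resolves("10.1234/placeholder-doi"))

A DOI that resolves is only the first step; the webinar will cover how to compare the retrieved record against the claimed title, authors, and venue.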

Register below to get a reminder 1 day before the live session. Or just click this link to attend: https://nu.zoom.us/j/92684856654

Related LibGuide: Artificial Intelligence: OpenAI, ChatGPT, LLMs, and More by Daniel Johnston

Date:
Tuesday, February 25, 2025
Time:
3:00pm - 3:50pm
Time Zone:
Pacific Time - US & Canada
Online:
This is an online event. Event URL will be sent via registration email.
Audience:
Doctoral Students, Faculty & Staff, Graduate Students, Undergraduate Students
Categories:
Advanced Library Research, Library Research

Registration is required. There are 99 seats available.

Event Organizer

Stephanie Johnson