This tutorial covers the hallucination phenomenon observed in generative AI models, such as autoregressive and sequence-to-sequence models. It also provides a bird's-eye view of approaches to mitigating hallucinations.