Official implementation for the paper "DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models"
Leaderboard Comparing LLM Performance at Producing Hallucinations when Summarizing Short Documents
A curated list of trustworthy deep learning papers, updated daily.
✨✨Woodpecker: Hallucination Correction for Multimodal Large Language Models. The first work to correct hallucinations in MLLMs.
[ACL 2024] Benchmarking the Hallucination of Chinese Large Language Models via Unconstrained Generation
Attacks to induce hallucinations in LLMs.
Official repo for SAC3: Reliable Hallucination Detection in Black-Box Language Models via Semantic-aware Cross-check Consistency
Initiative to evaluate and rank the most popular LLMs across common task types based on their propensity to hallucinate.
Code for ACL 2024 paper "TruthX: Alleviating Hallucinations by Editing Large Language Models in Truthful Space"
Codes related to the paper "On hallucinations in tomographic imaging"
An Easy-to-use Hallucination Detection Framework for LLMs.
Official implementation for the paper "Detecting and Mitigating Contextual Hallucinations in Large Language Models Using Only Attention Maps"
DCR-Consistency: Divide-Conquer-Reasoning for Consistency Evaluation and Improvement of Large Language Models
mPLUG-HalOwl: Multimodal Hallucination Evaluation and Mitigating
Code for Controlling Hallucinations at Word Level in Data-to-Text Generation (C. Rebuffel, M. Roberti, L. Soulier, G. Scoutheeten, R. Cancelliere, P. Gallinari)
Code for PARENTing via Model-Agnostic Reinforcement Learning to Correct Pathological Behaviors in Data-to-Text Generation (Rebuffel, Soulier, Scoutheeten, Gallinari; INLG 2020)
[ICML 2024] Official implementation for "HALC: Object Hallucination Reduction via Adaptive Focal-Contrast Decoding"