Epistemic diversity across language models mitigates knowledge collapse

By: Damian Hodel, Jevin D. West

Published: 2025-12-17

#cs.AI

Abstract

This research explores how maintaining epistemic diversity across multiple language models can prevent "knowledge collapse," the progressive narrowing of accessible knowledge toward a few dominant ideas. Preserving such diversity is important for building robust, reliable, and unbiased AI ecosystems, particularly in applications that depend on varied perspectives and continuous learning.
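
To make the notion of epistemic diversity concrete, the sketch below uses Shannon entropy over the distinct answers produced by a pool of models as a diversity proxy. This is an illustrative assumption, not the paper's actual methodology; the function name and example data are hypothetical.

```python
# Illustrative sketch (not the paper's method): quantify epistemic
# diversity of a model pool as the Shannon entropy of its answers.
import math
from collections import Counter

def answer_entropy(answers: list[str]) -> float:
    """Shannon entropy (in bits) of the empirical answer distribution.

    Higher entropy means the models disagree more, i.e. the answer pool
    is more epistemically diverse; 0.0 means every model gave the same
    answer, a fully collapsed state.
    """
    counts = Counter(a.strip().lower() for a in answers)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Example: the same question posed to one model three times vs. a
# heterogeneous pool of three different models (answers are made up).
single_model = ["paris", "paris", "paris"]
model_pool = ["paris", "paris", "lutetia"]

print(answer_entropy(single_model))  # 0.0   -> collapsed
print(answer_entropy(model_pool))    # ~0.92 -> some diversity retained
```

Under this reading, knowledge collapse corresponds to the pool's answer entropy trending toward zero, while a diverse ecosystem keeps it bounded away from zero.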
