• jol@discuss.tchncs.de · 2 days ago

    I know, and accept that. You can’t just tell an LLM not to hallucinate. I also wouldn’t trust that trust score at all. If there’s anything LLMs are worse at than accuracy, it’s maths.