I wish we had a good metaphor for what LLMs do that doesn't involve pretending they have some intent
like a lot of the harsher criticism is along the lines of "it makes things up", which implies malice or at least intent, and... there isn't any
there's "it draws from a distribution fitted on its input data" which is true but also not really illustrative in any way
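(as a toy illustration of "draws from a distribution": a model assigns scores to possible next tokens, softmaxes them into probabilities, and samples — no truth-checking anywhere in the loop. The tokens and numbers below are made up for the sketch, not from any real model)

```python
import math, random

# Hypothetical next-token scores (logits); purely illustrative values.
logits = {"cat": 2.0, "dog": 1.0, "truth": -1.0}

# Softmax: turn scores into a probability distribution.
z = sum(math.exp(v) for v in logits.values())
probs = {tok: math.exp(v) / z for tok, v in logits.items()}

# Sampling: pick a token proportionally to its probability.
# Nothing here knows or cares whether the output is true.
token = random.choices(list(probs), weights=list(probs.values()))[0]
print(token)
```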
@halcy i tend to conceptualize it like “it outputs what a random webpage on the topic would say”, and that seems to elicit the right level of source criticism
@halcy I liked the "it has no idea what truth even /is/".