It occurs to me that, much like the web, what's absent from the next wave of AI tools is any real concept of transclusion. Transclusion would have the sources of data travel along the same pipes as the data itself, making attribution actually possible. Can you imagine if LLMs were actually accountable for providing the root of each source?
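To make the idea concrete, here is a minimal sketch of what source-carrying content might look like: every value travels together with the chain of sources it was derived from. All the names here (`Source`, `Transclusion`, `derive`) are hypothetical illustrations, not any existing API.

```typescript
// A piece of provenance: where content originally lived.
interface Source {
  url: string;        // root location of the content
  author?: string;    // attribution, if known
  retrievedAt: Date;  // when it entered the pipe
}

// Content that never travels without its sources.
interface Transclusion<T> {
  content: T;
  provenance: Source[]; // full chain back to the roots
}

// Wrap raw content with its root source.
function fromSource<T>(content: T, source: Source): Transclusion<T> {
  return { content, provenance: [source] };
}

// Derive new content from existing transclusions; the provenance
// of every input travels along with the result.
function derive<T, U>(
  inputs: Transclusion<T>[],
  fn: (contents: T[]) => U,
  step: Source
): Transclusion<U> {
  return {
    content: fn(inputs.map((i) => i.content)),
    provenance: [...inputs.flatMap((i) => i.provenance), step],
  };
}

// Usage: a derived summary still knows the root of every source it drew on.
const quote = fromSource("To be, or not to be", {
  url: "https://example.org/hamlet",
  retrievedAt: new Date(),
});
const summary = derive([quote], (texts) => texts.join(" ").toUpperCase(), {
  url: "https://example.org/summarizer",
  retrievedAt: new Date(),
});
console.log(summary.provenance.map((s) => s.url));
```

That propagation step is exactly what today's pipelines drop: once content is folded into training data or a generated response, the chain back to the root is gone.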
Ted Nelson never quite cracked that technological nut, and we are so far past it now that nobody even thinks about it anymore.