I've heard people who have encountered LLMs through ChatGPT and elsewhere say that they feel LLMs work well with Scrapbox. I think the same, but I've never been able to verbalize why.
relevance
- Scrapbox and the use of knowledge representation and permalinks in books
 - > Nishio's Scrapbox has a small granularity of entries, like a fragmented scrapbook of notes, but it's easy to absorb knowledge from.
- gives rise to:
@nishio: Creating knowledge packages is actually easier than creating documents for human readers. Because humans have limited short-term memory, we need to present knowledge in a "good order," which is very hard for the author. But when giving it to an LLM, the order is irrelevant, so just write whatever comes to mind!
I thought, "That's Scrapbox."
This page is auto-translated from /nishio/ScrapboxとLLMの相性の良さ using DeepL. If you find something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.