2024, November 16.
How do I build an LLM? How do I start?
To answer that question, I opened ChatGPT and asked: what is an LLM, and how do I start building one?
The highlight of ChatGPT's answer was something called the transformer, and to learn about it, I should read the paper titled "Attention Is All You Need". I searched for that title on Google and found it. Here is the paper.
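As a preview of what that paper is about: its central equation is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. Here is a minimal NumPy sketch of that formula; the shapes and random inputs below are my own toy example, not anything from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted average of the value rows

# Toy example: 2 queries, 3 keys/values, dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

This is just the bare equation; the paper builds multi-head attention and the full encoder-decoder stack on top of it.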
What does that paper contain? The first thing I saw was the list of authors. Most of them are from Google. Interesting, very interesting.
Back to the topic: what does it contain? Ahhhh... a lot. I was confused; so many new things, and old things too. I think I must stay with this paper and learn it before I write any code.
Bye, see you in my review of "Attention Is All You Need". That doesn't read like a paper title; it's more like a book title. A very cool title.