100 Days of Generative AI — Day 2 — LLM Tokens vs Parameters?

Estimated read time: 1 min

We often hear or read statements like this: Llama 3.1 comes in 8-billion, 70-billion, and 405-billion-parameter versions, trained on about 15 trillion tokens…
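The two numbers measure different things: tokens count the units of training data, while parameters count the learned weights inside the model. A minimal sketch below makes the distinction concrete; the whitespace tokenizer and the layer shapes are purely illustrative assumptions, not Llama's actual tokenizer or architecture.

```python
def count_tokens(text):
    # Tokens are units of *training data*. Real LLMs use subword
    # tokenizers (e.g. BPE); whitespace splitting is a stand-in here.
    return len(text.split())

def count_parameters(layer_shapes):
    # Parameters are the *learned weights* of the model itself.
    # Each (rows, cols) weight matrix contributes rows * cols values.
    return sum(rows * cols for rows, cols in layer_shapes)

corpus = "Llama 3.1 was trained on roughly 15 trillion tokens of text"
# Hypothetical layer sizes, chosen only for illustration:
toy_model = [(4096, 4096), (4096, 14336), (14336, 4096)]

print(count_tokens(corpus))         # size of the data, measured in tokens
print(count_parameters(toy_model))  # size of the model, measured in weights
```

So "405 billion parameters" describes how big the model is, and "15 trillion tokens" describes how much text it saw during training; the two can be scaled independently.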


#AI
