LLaMA-2

An LLM from Meta, trained on 2 trillion tokens with double Llama 1's context length (4,096 tokens vs. 2,048), available in 7B, 13B, and 70B parameter sizes.
