Falcon 180B is a large language model with 180 billion parameters, trained on 3.5 trillion tokens. It tops the Hugging Face Open LLM Leaderboard among pre-trained open large language models and is available for both research and commercial use.
Remarkably, despite being half the size of Google’s PaLM 2 Large, it performs at a comparable level. Falcon 180B excels across a range of benchmarks, including reasoning, coding, language proficiency, and knowledge tests, even surpassing Meta’s LLaMA 2. Compared with closed-source models, it ranks just behind OpenAI’s GPT-4.
Falcon 180B is accessible to developers under a royalty-free license based on Apache 2.0. The license restricts illegal or harmful use and requires additional consent from the Technology Innovation Institute (TII) for anyone intending to offer hosted access to the model. The model is free to download, use, and integrate into applications and end-user products. For more details, visit the official Falcon 180B page.