
Open source gets a new player. UAE’s TII unveils Falcon 180B, surpasses Meta’s Llama 2

The largest open-source language model boasts a staggering 180 billion parameters.

[Source photo: Anvita Gupta/Fast Company Middle East]


Boosting generative artificial intelligence capabilities in the Middle East and adding to the list of open-source large language models (LLMs), the Abu Dhabi-based Technology Innovation Institute (TII) has released Falcon 180B, a heavily scaled-up version of Falcon 40B. It is the largest open-source language model to date, boasting a staggering 180 billion parameters.

It was trained on 3.5 trillion tokens from TII’s RefinedWeb dataset, representing the longest single-epoch pretraining run for an openly accessible model.

Topping the Hugging Face leaderboard, Falcon 180B is 2.5 times larger than Meta’s Llama 2 and was trained with four times the computing power. It outperforms Llama 2 70B and OpenAI’s GPT-3.5 on the MMLU benchmark, and it uses multi-query attention, which reduces memory requirements during inference.
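Because the weights are openly released, developers can download and run the model themselves through the Hugging Face Hub. The snippet below is a minimal sketch of that workflow using the transformers library; the repository id tiiuae/falcon-180B, the prompt, and the generation settings are illustrative assumptions, and in practice the 180-billion-parameter checkpoint requires hundreds of gigabytes of accelerator memory, so it is typically sharded across several GPUs.

# Minimal sketch: loading Falcon 180B with the Hugging Face transformers library.
# The repo id "tiiuae/falcon-180B", the prompt, and the generation settings are
# illustrative assumptions; the full checkpoint needs multi-GPU hardware in practice.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-180B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread the weights across available accelerators
    torch_dtype="auto",  # use the precision stored in the checkpoint
)

inputs = tokenizer("The Technology Innovation Institute is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))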

TII claims its new model outperforms Meta’s Llama 2 across a range of benchmarks, including reasoning, coding, proficiency, and knowledge tests.

The release follows the success of Falcon 40B, launched in May of this year as one of the first open-source models available to both researchers and commercial users.

Google and OpenAI, two pioneers in the large language model field, have kept their foundational models under wraps, citing concerns that they could be used to spread misinformation or other harmful content.

“We are committed to democratizing access to advanced AI, as our privacy and the potential impact of AI on humanity should not be controlled by a select few,” said Faisal Al Bannai, Secretary General of the Advanced Technology Research Council.

“While we may not have all the answers, our resolve remains unwavering: to collaborate and contribute to the open source community, ensuring that the benefits of AI are shared by all,” he added.

Falcon 180B trails OpenAI’s GPT-4 only slightly in performance and is on par with Google’s PaLM 2 Large, despite being half that model’s size.

More than 12 million developers adopted the first release of Falcon. The upgraded model is expected to serve as a key model across domains ranging from chatbots to code generation.

“Falcon 180B heralds a new era of generative AI, where the potential of scientific advancement is made available through open access to fuel the innovations of tomorrow,” said Dr. Ebtesam Almazrouei, Executive Director and Acting Chief Researcher of the AI Cross-Center Unit at TII.

“As we delve into the frontiers of science and technology, our vision extends far beyond innovation; it’s about nurturing a profound connection to address global challenges through collaborative breakthroughs,” she added.

Falcon 180B supports major languages, including English, German, Spanish, and French, but has limited capabilities in Italian, Portuguese, Polish, Dutch, Romanian, Czech, and Swedish.
