On October 12, KLCII announced that its Aquila large language model series has been fully upgraded to Aquila2, adding the 34-billion-parameter Aquila2-34B. The new models are reported to perform well in reasoning and generalization, and to have produced strong results in scenarios such as agents, code generation, and literature retrieval. Alongside the models, KLCII also released an open-source suite that includes the Aquila2 model series, a new version of the BGE semantic vector model, the FlagScale efficient parallel training framework, and the FlagAttention set of high-performance attention operators. These open-source projects are intended to foster collaborative innovation in large model research.
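For readers who want to experiment with the open-sourced checkpoints, below is a minimal sketch of loading an Aquila2 chat model with Hugging Face transformers. The repository ID and generation settings are illustrative assumptions, not details taken from the announcement; consult the official release for the exact model names.

```python
# Minimal sketch: load an Aquila2 checkpoint via Hugging Face transformers.
# The repo ID "BAAI/AquilaChat2-7B" is an assumed example; verify against
# the official open-source release before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BAAI/AquilaChat2-7B"  # hypothetical repo ID for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short completion for a sample prompt.
inputs = tokenizer("What is a semantic vector model?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```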
