According to Babbitt News, Shanghai Artificial Intelligence Laboratory (Shanghai AI Laboratory) recently released XTuner, an open-source toolbox for large-model training, further lowering the barrier to entry for training large models. XTuner focuses on the fine-tuning stage, providing a lightweight fine-tuning framework for a wide range of open-source large models and reinforcing the practical-tooling character of the lab's full-chain open-source ecosystem. XTuner supports hardware at multiple tiers: developers need as little as 8 GB of consumer-grade GPU memory to train an "exclusive large model" tailored to a specific demand scenario.
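To give a sense of why fine-tuning on roughly 8 GB of consumer GPU memory is feasible, the sketch below illustrates the general QLoRA-style recipe (4-bit quantized base weights plus small trainable LoRA adapters). Note that this is not XTuner's own API: it uses Hugging Face transformers/peft/datasets as an illustrative stand-in, and the model name, dataset, and LoRA target modules are placeholder assumptions.

```python
# Illustrative QLoRA-style fine-tuning sketch (NOT XTuner's API).
# Model name, dataset, and target_modules are placeholder assumptions.
import torch
from transformers import (AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training
from datasets import load_dataset

base = "internlm/internlm-7b"  # placeholder base model

# Load the frozen base weights in 4-bit so they fit in consumer VRAM.
bnb_cfg = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)
model = AutoModelForCausalLM.from_pretrained(
    base, quantization_config=bnb_cfg, device_map="auto", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(base, trust_remote_code=True)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# Attach small trainable LoRA adapters; only these parameters are updated.
model = prepare_model_for_kbit_training(model)
lora_cfg = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                      target_modules=["q_proj", "v_proj"],  # assumed module names
                      task_type="CAUSAL_LM")
model = get_peft_model(model, lora_cfg)

# Tokenize a small instruction dataset (placeholder choice).
dataset = load_dataset("timdettmers/openassistant-guanaco", split="train")
def tokenize(example):
    return tokenizer(example["text"], truncation=True, max_length=512)
dataset = dataset.map(tokenize, remove_columns=dataset.column_names)

# Small batch with gradient accumulation keeps peak memory low.
args = TrainingArguments(output_dir="out", per_device_train_batch_size=1,
                         gradient_accumulation_steps=16, num_train_epochs=1,
                         fp16=True, logging_steps=10)
trainer = Trainer(model=model, args=args, train_dataset=dataset,
                  data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False))
trainer.train()
```

In practice XTuner ships its own configs and command-line workflow for this kind of low-memory fine-tuning; the snippet above only sketches the underlying technique under the stated assumptions.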
