According to IT Home on December 21, Google recently released TpuGraphs, a training dataset for "learned cost models," aimed at optimizing compilers and improving deep learning performance. Google notes that today's deep learning systems are typically trained with frameworks such as TensorFlow, JAX, and PyTorch, which rely on the underlying compiler's heuristics to optimize models; a learned cost model inside that compiler can improve the compiler itself and, in turn, the performance of the final output model. Compared with existing industry training sets, Google says TpuGraphs offers a 770-times-larger average graph size and 25 times more graphs. Google claims that applying the TpuGraphs dataset to the compiler effectively addresses issues of scalability, efficiency, and the quality of the final output model.
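The role of a learned cost model is easier to see in miniature: the compiler scores candidate configurations of a computation graph with a trained predictor and keeps the cheapest one, rather than benchmarking every candidate on hardware. The sketch below is an illustration only; the feature layout, the linear scorer, and the weights are hypothetical placeholders and do not reflect the TpuGraphs schema or Google's actual model, which would be a graph neural network trained on the dataset.

```python
# Minimal sketch of a learned cost model used for compiler tuning.
# All names, features, and weights here are hypothetical placeholders.
from dataclasses import dataclass
from typing import List


@dataclass
class GraphConfig:
    """One candidate compiler configuration for a computation graph."""
    name: str
    features: List[float]  # e.g. node count, fusion depth, tile size (illustrative)


def predict_cost(features: List[float], weights: List[float]) -> float:
    """Stand-in cost model: a linear scorer mapping configuration features
    to an estimated runtime. A real model would be a trained GNN."""
    return sum(f * w for f, w in zip(features, weights))


def pick_best(configs: List[GraphConfig], weights: List[float]) -> GraphConfig:
    """Rank candidate configurations by predicted cost and keep the cheapest,
    instead of measuring each one on hardware."""
    return min(configs, key=lambda c: predict_cost(c.features, weights))


if __name__ == "__main__":
    candidates = [
        GraphConfig("layout_a", [120.0, 3.0, 8.0]),
        GraphConfig("layout_b", [120.0, 5.0, 4.0]),
    ]
    trained_weights = [0.01, 0.5, 0.2]  # placeholder for learned parameters
    print("chosen:", pick_best(candidates, trained_weights).name)
```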
