The Beijing Academy of Artificial Intelligence (BAAI) unveiled a record-breaking artificial intelligence (AI) model at its 2021 conference in Beijing on June 1.
With 1.75 trillion parameters, WuDao 2.0 broke the previous record of 1.6 trillion set by Google's Switch Transformer; both are known in the AI industry as "super-large-scale pretraining models."
"WuDao 2.0 is the first trillion-scale model in China and the largest in the world," said Tang Jie, BAAI vice academics director.
WuDao 2.0 achieved excellent results on nine benchmark tasks in the pretrained-model field, Tang said, and came close to passing the Turing test in poetry and couplet composition, text summarization, question answering, and painting.
"WuDao 2.0 aims to enable machines to think like humans and achieve cognitive abilities beyond the Turing test," Tang said.
At present, large-scale models internationally are developed predominantly for English. BAAI, supported by the Beijing Municipal Science and Technology Commission and the Haidian district government, began research and development of large-scale intelligent models built on a Chinese-language environment.
Organizers said this year's hybrid online-and-offline conference attracted more than 200 experts and more than 30,000 AI professionals.
BAAI released WuDao 1.0, China's first homegrown super-scale intelligent model system, on March 20, 2021.