2024-07-18
Machine Heart Report
Synced Editorial Department
Today, market sources reported that Zhou Chang (nickname: Zhong Huang), a technical backbone of Alibaba's Tongyi Qianwen team, has resigned to start his own business. According to Synced, Zhou Chang will indeed leave, but his departure date has not yet been finalized.
Zhou Chang is an important figure behind the Tongyi Qianwen large model. From 2020 to 2021, he led the team that designed and implemented the ultra-large-scale multimodal pre-training model M6, which achieved breakthroughs in parameter count and low-carbon training. This year, Tongyi Qianwen's open-source model Qwen1.5-72B ranked first among open models on HuggingFace's model performance leaderboard (Chatbot Arena), and the Qwen series is one of the model families most commonly used by the open-source community.
Earlier, during his time at DAMO Academy, Zhou Chang played an important role in many projects, such as the ultra-large-scale product graph representation algorithm APP, the user representation framework ATRank, and the CLRec series of vector recall algorithms based on self-supervised contrastive learning.
Zhou Chang graduated with a bachelor's degree in computer science and technology from Fudan University in 2012 and a doctorate in computer software and theory from Peking University in 2017. He joined Alibaba through campus recruitment and has worked at Alibaba for more than eight years.
He specializes in deep learning, graph mining, and distributed computing. He has published more than 30 papers at top international conferences in machine learning, data mining, and databases, and has served as a reviewer for academic conferences such as NeurIPS, ICML, KDD, and WWW. His team's research has won awards and honors including the First Prize for Scientific and Technological Progress of the China Electronics Society and recognition as a Leading Innovation and Entrepreneurship Team of Hangzhou.
According to Career Bonus, an Alibaba insider commented on Zhou Chang: "He really wants to create a larger model that is more versatile, rather than being vertical to a single field or scenario."
At present, Zhou Chang himself has not commented on his next plans.
About Tongyi Qianwen
Tongyi Qianwen is a large-scale language model independently developed by Alibaba Tongyi Lab, a natural language processing laboratory under Alibaba Group. Tongyi Lab is committed to researching and developing general natural language processing technologies and providing intelligent language processing services to various business departments under Alibaba Group. The current head of Alibaba Tongyi Lab is Zhou Jingren.
On April 7, 2023, Tongyi Qianwen began invitation testing.
On April 11, 2023, Alibaba announced that all of its products would be connected to the Tongyi Qianwen model for a comprehensive upgrade.
In August 2023, Tongyi Qianwen open-sourced the Qwen-7B model.
In September 2023, Tongyi Qianwen was officially opened to the public; on the 25th of the same month, Alibaba Cloud announced the open source of Tongyi Qianwen's 14 billion parameter model Qwen-14B and its dialogue model Qwen-14B-Chat, which are free for commercial use.
On October 31, 2023, Tongyi Qianwen 2.0 was officially upgraded and released, and the Tongyi Qianwen App was also released.
On December 1, 2023, Alibaba Cloud open-sourced Tongyi Qianwen's 72 billion parameter model Qwen-72B, 1.8 billion parameter model Qwen-1.8B, and audio model Qwen-Audio. With this, Tongyi Qianwen achieved "full-size, full-modality" open source: four large language models with 1.8 billion, 7 billion, 14 billion, and 72 billion parameters, as well as two multimodal large models for visual understanding and audio understanding.
On January 4, 2024, the "Tongyi Dance King" function was launched on the Tongyi Qianwen App.
On January 26, 2024, Tongyi Qianwen's visual understanding model Qwen-VL launched the Max version.
On March 22, 2024, the 10-million-character long-document processing feature was opened to all users for free.
On March 29, 2024, the first MoE model Qwen1.5-MoE-A2.7B was open sourced.
On April 3, 2024, Tongyi Lingma was officially launched on the Tongyi Qianwen App and opened to the public for free.
On April 7, 2024, the 32 billion parameter model Qwen1.5-32B was open sourced.
On April 28, 2024, the 100-billion-parameter-class model Qwen1.5-110B was launched.
On May 9, 2024, the Tongyi Qianwen 2.5 large model was released, and the Tongyi Qianwen App was renamed Tongyi.
As of May 2024, Tongyi Qianwen provides eight industry-specific models covering programming, reading, audio and video processing, character creation, finance, customer service, health, and law. These can be applied to scenarios such as writing, reading, debugging, and optimizing code; summarizing long documents; processing audio and video content; creating personalized characters; interpreting financial reports and research reports; and analyzing financial-industry events.