Google ‘mT5’ Pretrained Text-to-Text Transformer Achieves SOTA Performance on Multilingual Benchmarks
Google recently introduced mT5, a multilingual variant of its “Text-to-Text Transfer Transformer” (T5), pretrained on a new Common Crawl-based dataset covering 101 languages.
Source: Synced | AI Technology & Industry Review