Google on Tuesday said its new MUM (Multitask Unified Model) has helped improve vaccine-related searches, enabling people to find timely, high-quality information about COVID-19 vaccines globally. Google recently unveiled MUM, which offers expert-like answers to complex questions with fewer queries. “AstraZeneca, CoronaVac, Moderna, Pfizer, Sputnik and other broadly distributed vaccines all have many different names all over the world — over 800, based on our analysis. People searching for information about the vaccines may look for ‘Coronavaccin Pfizer’, ‘mRNA-1273’, ‘CoVaccine’ — the list goes on.
“Our ability to correctly identify all these names is critical to bringing people the latest trustworthy information about the vaccine,” said Pandu Nayak, Google Fellow and Vice President of Search. He added that identifying the different ways people refer to the vaccines around the world is hugely time-intensive, taking hundreds of human hours. “With MUM, we were able to identify over 800 variations of vaccine names in more than 50 languages in a matter of seconds. After validating MUM’s findings, we applied them to Google Search so that people could find timely, high-quality information about COVID-19 vaccines worldwide,” he said.
He explained that as people everywhere searched for information, Google had to learn to identify all the different phrases people used to refer to the novel coronavirus, so that timely, high-quality information from trusted health authorities like the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC) would surface. Nayak said this first application of MUM helped get critical information to users around the world in a timely manner, and that Google is looking forward to the many ways MUM can make Search more useful in the future. “Our early testing indicates that not only will MUM be able to improve many aspects of our existing systems, but will also help us create completely new ways to search and explore information,” he added.
MUM can learn from, and transfer knowledge across, the more than 75 languages it is trained on. With this ability, MUM does not have to learn a new capability or skill separately in every language; it can transfer what it learns across them, allowing improvements to scale quickly even when there is little training data to work with. This means MUM requires far fewer data inputs than previous models to accomplish the same task.
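To give a rough sense of how cross-lingual transfer can surface the same vaccine under many names, here is a minimal sketch. It does not use MUM, which is not publicly available; it assumes a generic multilingual sentence encoder from the open-source sentence-transformers library (the model name and the sample name variants below are illustrative assumptions only), showing how multilingual embeddings can match variants written in different languages and scripts.

```python
# Illustrative sketch only: MUM is not public, so a generic multilingual
# sentence encoder stands in to show cross-lingual matching of name variants.
from sentence_transformers import SentenceTransformer, util

# Hypothetical sample of vaccine-name variants in different languages/scripts.
variants = [
    "Coronavaccin Pfizer",   # Dutch
    "mRNA-1273",             # Moderna's technical designation
    "CoVaccine",
    "Спутник V",             # Russian
    "Sputnik V",
]

# Assumed publicly available multilingual model (not Google's MUM).
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")
embeddings = model.encode(variants, convert_to_tensor=True)

# Pairwise cosine similarity: higher scores suggest two strings refer to the
# same vaccine, even when they are written in different languages.
scores = util.cos_sim(embeddings, embeddings)
for i, name in enumerate(variants):
    for j in range(i + 1, len(variants)):
        print(f"{name!r} vs {variants[j]!r}: {float(scores[i][j]):.2f}")
```

In this toy setup, variants of the same vaccine name tend to score more similarly to each other than to unrelated names, which is the kind of cross-lingual signal the article describes MUM exploiting at far larger scale.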