Will Google’s MUM put an end to SEO?

At the annual Google I/O conference, Google announced updates to existing products such as Google Maps and Google Photos, alongside some remarkable new technologies: LaMDA, a skilled conversational AI that could revolutionize chatbot tech, and MUM.

Today I’ll review MUM, which stands for Multitask Unified Model. In short, this technology is aimed at making search engines more powerful, just as BERT did a couple of years ago. I’ll explain how MUM could impact SEO in the future, potentially to the point of rendering it obsolete.

MUM — The brain of the search engine

MUM is an improvement upon Google’s search engine. Like other popular state-of-the-art models such as GPT-3 and LaMDA, MUM is built on the transformer architecture. BERT, MUM’s predecessor, is similar in this regard; the main difference is that MUM is 1,000 times more powerful.
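MUM itself isn’t public, so as a rough illustration of the transformer family it belongs to, here’s a minimal sketch that loads BERT (its predecessor) through the Hugging Face transformers library. The query is made up; the point is the output: one context-dependent vector per token, which is what lets these models interpret words in context.

```python
# Minimal sketch: MUM is not public, so we load BERT, its predecessor,
# to show the transformer-encoder family MUM builds on.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer(
    "What should I do differently to prepare for Mt. Fuji?",
    return_tensors="pt",
)
outputs = model(**inputs)

# One contextual embedding per token: [batch, tokens, hidden_size].
# This per-token context is what "interpreting a query" looks like
# numerically inside a transformer.
print(outputs.last_hidden_state.shape)
```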

The main issue MUM intends to resolve is having to type out many queries and perform many searches to get the answer you need. MUM shines on queries that don’t have an easy answer, helping Google’s search engine tackle complex tasks.

In the demo, the power of MUM is illustrated with an example query: “You’ve hiked Mt. Adams. Now you want to hike Mt. Fuji next fall, and you want to know what to do differently to prepare.” This is a question we could imagine asking a hiking expert, but not a search engine.

With today’s technology, we’d need to search for each variable we can think of (e.g., time of year, weather, terrain, elevation, gear, rock type…) and then compare the results ourselves to piece together a useful answer.

Instead, MUM could solve the task directly with its toolkit. In the demo, MUM responded: “Mt. Fuji is roughly the same elevation as Mt. Adams, but fall is the rainy season for Mt. Fuji so you might need a waterproof jacket.”

MUM’s skillset — Multilingual and multitasking

MUM has been trained across 75 languages, which allows the system to break language barriers. With today’s tech, we’d have to search for info in Japanese to find satisfactory answers for our hiking trip to Mt. Fuji. MUM could simply look up the info directly and then translate it for you.
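We can get a taste of that cross-lingual matching with public tools. The sketch below uses a multilingual embedding model from the sentence-transformers library (my choice for illustration, not anything Google has confirmed MUM uses) to place an English query and a Japanese passage in one shared vector space, so relevance can be scored across languages.

```python
# Hedged sketch: MUM's internals are proprietary. A public multilingual
# embedding model illustrates matching a query against content written
# in another language.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

query_en = "What gear do I need to hike Mt. Fuji in the fall?"
# Japanese passage: "Fall is the rainy season on Mt. Fuji, so a
# waterproof jacket is essential."
passage_ja = "秋の富士山は雨季なので、防水ジャケットが必須です。"

q = model.encode(query_en, convert_to_tensor=True)
p = model.encode(passage_ja, convert_to_tensor=True)

# Both texts live in the same vector space, so cosine similarity can
# score relevance across the language barrier.
print(util.cos_sim(q, p))
```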

MUM has also been trained on multiple tasks, achieving “a more comprehensive understanding of information and world knowledge than previous models.” The possibilities here are wild. MUM could understand that Mt. Adams and Mt. Fuji are mountains. It could understand that you’ll need to know the differences in geology but won’t care about fauna and flora. It could understand that preparing might not refer to physical training (you’ve already hiked Mt. Adams); instead, it could mean things like “finding the right gear.”

MUM is able not just to “understand” language, but to “generate” it (in this sense it could be compared to GPT-3). Yet I’d qualify the word “understanding” here. I don’t think MUM (or GPT-3) can understand language. I’d say these systems are real-world instances of Searle’s Chinese room argument. To understand, we need meaning. To get meaning, we need to link the form of language symbols with their external, experiential representations in the world. A language model inside a computer — MUM, GPT-3, or LaMDA — can access the form of the symbols, but it can’t experience the world, which keeps meaning out of reach.

Lastly, it’s worth mentioning that Google says responsibility is the main priority for its AI systems. For each AI it creates, it tries to reduce bias and the carbon footprint. And MUM is no different.

MUM’s edge — Multimodality

But by far the most important breakthrough MUM brings us is its capability to handle multimodal information (something beyond GPT-3’s and LaMDA’s abilities). MUM can combine info from images and text (in the future, Google will include audio and video).

“Eventually, you might be able to take a photo of your hiking boots and ask, “can I use these to hike Mt. Fuji?” MUM would understand the image and connect it with your question to let you know your boots would work just fine.”
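MUM’s multimodal stack is under wraps, but CLIP, a public image-and-text model, shows the basic mechanics behind that boots example: embed a photo and some candidate descriptions in one shared space and score how well they match. The file name and captions below are made up for illustration.

```python
# Hedged sketch: CLIP (public) stands in for MUM's private multimodal
# stack. It scores an image against text in one shared space.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("hiking_boots.jpg")  # hypothetical photo of your boots
texts = [
    "sturdy waterproof hiking boots",
    "lightweight summer sandals",
]

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# logits_per_image holds one score per (image, caption) pair; softmax
# turns them into "which description fits this photo best".
print(outputs.logits_per_image.softmax(dim=-1))
```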

Multimodality at this level is new in the AI world, and it’s a key step towards artificial general intelligence. The importance of this milestone is better illustrated by comparing AI systems with the human brain. One of the features that makes our brain magnificent at navigating the world is its multisensory nature:

  • The world is multimodal. This means that most events/objects around us produce information of different kinds (electromagnetic, mechanical, pressure, chemical, etc.). Think of any object, an apple for instance. It has color, form, texture, taste, smell…
  • Our brain is multisensory. We are endowed with a set of sensory systems that permit us to perceive the multimodal nature of the world. The brain interprets and integrates all this info into a single representation of reality.

Keeping the comparison modest, MUM could be the first of a generation of AIs capable of combining multimodal info similarly to the way we do.

MUM could make SEO obsolete

Google has been trying to make the search engine feel more natural since its conception. When Google launched BERT in 2019, Pandu Nayak (Google’s VP of Search) wrote in his blog post that the search engine’s failure to properly understand queries was “one of the reasons why people often use ‘keyword-ese.’” A Google search is its own style of communication, nowhere near how we’d ask a question to another person.

BERT laid the first stone on the path from matching keywords to interpreting the context of words in a sentence. Even before MUM, BERT changed the game for SEO. Google acknowledged there’s no way to optimize for BERT other than optimizing for users. BERT reduced the impact of keywords on web page rankings; the system now mostly cares about whether a page answers the user’s query, even if the exact keywords are missing.
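As a toy illustration of meaning-over-keywords matching (using a public embedding model as a stand-in, not Google’s actual ranking stack), a passage can score as relevant to a query even when the two share almost no words:

```python
# Hedged sketch of meaning-based matching: the relevant passage shares
# almost no keywords with the query, yet an embedding model can still
# score it as related. The model choice is illustrative.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

query = "can you get medicine for someone at the pharmacy"
passages = [
    "Picking up a prescription on behalf of a relative is permitted.",
    "Cheap flights to Tokyo, book your vacation today!",
]

scores = util.cos_sim(model.encode(query), model.encode(passages))
print(scores)  # the prescription passage should score far higher
```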

MUM could revolutionize the search engine to such an extent that the concept of SEO becomes obsolete. People won’t look for pages directly; they’ll simply ask MUM a query, and MUM will do the job as if it were a human personal assistant. Keywords would matter very little. Why would we force ourselves to write queries in “keyword-ese” when MUM can understand natural language?

Of course, keywords will still matter in the sense that queries will contain them, but they’ll no longer help a web page rank better. The idea of optimizing content for MUM won’t exist; there won’t be a direct way to game the search engine. People will stop writing articles for the algorithm for good and write them for people instead.

Final thoughts

MUM’s power lies in its multilingual, multitasking, and above all multimodal nature. It will revolutionize the search engine, probably multiplying the impact BERT had in 2019. As Edwin Toonen puts it, Google’s search engine will stop being a search engine and evolve into a “knowledge presentation machine.”

SEO experts have to adapt every time Google updates the algorithm. When BERT came out, they didn’t see much impact on their performance. However, if the time comes when we don’t need to search for anything and can simply write our queries in natural language for a super-powerful AI to find the answers for us, we can only wait and see what happens to SEO.
