As Google marks 25 years, the global tech company is reflecting on how it has continued to innovate and make Search better at connecting people to the information they are looking for.
“This includes everything from creating new ways to search, to helping businesses connect with customers through search listings and ads, to even unveiling our conversational AI chatbot, Bard,” Google said in a statement.
To celebrate the company’s birthday, here are some of the milestones that made Google more helpful and shaped it into what it is today.
Google Images - 2001
When American actress, dancer and singer Jennifer Lopez attended the Grammy Awards in 2000, her Versace dress, according to Google, became an instant fashion legend and the most popular search query at the time.
This inspired the creation of Google Images: back then, Google noted, search results were just a list of blue links, so many people couldn’t easily find the picture they were looking for.
“Did you mean?” - 2001
“'Did you mean', with suggested spelling corrections, was one of our first applications of machine learning,” Google said.
“Previously, if your search had a misspelling like ‘floorescent’, we would help you find other pages that had the same misspelling, which usually aren’t the best pages on the topic.”
Over the years, the tech company has developed new AI-powered techniques to ensure that even if a user’s finger slips on the keyboard, they will still find what they need.
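To illustrate the basic idea of spelling suggestions (a toy sketch, not Google’s actual ML-based system), one could propose the closest known term for a misspelled query using simple string similarity over a hypothetical vocabulary:

```python
import difflib

# Hypothetical vocabulary of correctly spelled query terms; a real system
# would learn corrections from vast query data rather than a fixed list.
VOCABULARY = ["fluorescent", "florescence", "flour", "fluoride"]

def did_you_mean(query: str) -> str | None:
    # Return the most similar known term, or None if nothing is close enough.
    matches = difflib.get_close_matches(query.lower(), VOCABULARY, n=1, cutoff=0.6)
    return matches[0] if matches else None

print(did_you_mean("floorescent"))  # -> fluorescent
```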
Google News - 2002
During the tragic events of September 11, 2001, popularly known as 9/11, Google said people struggled to find timely information in Search.
To meet the need for real-time news, Google News was launched in 2002 with links to a diverse set of sources for any given story.
Autocomplete - 2004
A year after developing Easter eggs, Google launched its first autocomplete feature in 2004 as “Google Suggest”, aimed at automatically predicting queries in the search bar as someone starts typing.
Today, based on Google’s analysis, autocomplete reduces typing by 25 per cent on average and saves an estimated 200 years of typing time per day.
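As a rough, hypothetical illustration of query prediction (Google’s system is far more sophisticated and learns from real search activity), a minimal sketch could rank stored queries matching the typed prefix by popularity:

```python
# Hypothetical query log with popularity counts.
QUERY_COUNTS = {
    "weather today": 980,
    "weather tomorrow": 450,
    "web browser": 300,
    "weekend events": 120,
}

def autocomplete(prefix: str, limit: int = 3) -> list[str]:
    # Suggest the most popular known queries that start with the typed prefix.
    candidates = [q for q in QUERY_COUNTS if q.startswith(prefix.lower())]
    return sorted(candidates, key=QUERY_COUNTS.get, reverse=True)[:limit]

print(autocomplete("we"))  # -> ['weather today', 'weather tomorrow', 'web browser']
```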
Local information - 2004
People’s reliance on traditional phone books for business information paved the way for local discovery.
In 2004, Google Local added relevant information to business listings like maps, directions, and reviews.
In 2011, the tech giant added click-to-call on mobile, making it easy to get in touch with businesses while on the go.
Based on Google’s data, local results in Search drive more than 6.5 billion connections for businesses every month on average, including phone calls, directions, food orders, and reservations.
Google Translate and Trends - 2006
Google said that its researchers started developing machine translation technology in 2002 to tackle language barriers online.
Four years later, it launched Google Translate with text translations between Arabic and English. Today, the platform supports more than 100 languages, 24 of which were added in 2022.
Google Trends was built to help the company understand trends on Search using aggregated data, which led to the creation of its annual Year in Search.
Today, Google Trends offers the largest free dataset of its kind, enabling journalists, researchers, scholars, and brands to learn how search interest changes over time.
Universal Search - 2007
“Helpful search results should include relevant information across formats, like links, images, videos, and local results,” Google said.
“We redesigned our systems to search all of the content types at once, decide when and where results should blend in, and deliver results in a clear and intuitive way. The result, Universal Search, was our most radical change to Search at the time.”
Google Mobile App and Voice Search - 2008
With the arrival of Apple’s App Store, Google launched its first mobile app on the iPhone.
It introduced features such as Autocomplete and ‘My Location’, aimed at making Search easier with fewer key presses, which was especially helpful on smaller screens.
Today, users can do so much with the Google app, available on both Android and iOS, from getting help translating documents with Lens to accessing other visual translation tools in just a tap.
The same year, Google introduced search by voice on the mobile app, expanding the feature to the desktop in 2011.
Search by Image - 2011
“Sometimes, what you’re searching for can be hard to describe with words,” Google said.
This led to the introduction of Search by Image, which enables users to upload a photo or image URL to find out what it shows and where else it appears on the web. This update paved the way for Lens.
Knowledge Graph - 2012
“We introduced the Knowledge Graph, a vast collection of people, places, and things in the world and how they’re related to one another, to make it easier to get quick answers,” Google added.
Knowledge Panels, the first feature powered by the Knowledge Graph, gave users a quick snapshot of information about topics like celebrities, cities, and sports teams.
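Conceptually, a knowledge graph stores facts as entity-relationship-entity triples. The toy Python sketch below, built on a few hypothetical facts, shows how a Knowledge Panel-style snapshot could be assembled from such triples; it illustrates the data structure, not Google’s implementation:

```python
# Hypothetical triples: (subject entity, relationship, object entity/value).
TRIPLES = [
    ("Nairobi", "capital_of", "Kenya"),
    ("Nairobi", "population", "about 4.4 million"),
    ("Kenya", "continent", "Africa"),
]

def knowledge_panel(entity: str) -> dict[str, str]:
    # Collect every fact whose subject is the queried entity.
    return {rel: obj for subj, rel, obj in TRIPLES if subj == entity}

print(knowledge_panel("Nairobi"))
# -> {'capital_of': 'Kenya', 'population': 'about 4.4 million'}
```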
Popular Times - 2015
This feature was unveiled in Search and Maps to help users see the busiest times of the day when they search for places like restaurants, stores, and museums.
Discover - 2016
By launching a personalized feed, now called Discover, Google aimed to help users explore content tailored to their interests right in their mobile app, without having to search.
Lens - 2017
Google Lens turns one's camera into a search query by looking at objects in a picture, comparing them to other images, and ranking those other images based on their similarity and relevance to the original picture.
Today, Lens is built into the Google app and, according to the company’s data, handles more than 12 billion visual searches per month.
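At its core, ranking images by visual similarity can be sketched as comparing feature vectors. The example below uses cosine similarity over made-up four-dimensional vectors, which stand in for the deep image embeddings a real system like Lens would compute:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Cosine of the angle between two vectors: 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical embeddings for the query photo and two candidate images.
query_vec = [0.9, 0.1, 0.3, 0.7]
candidates = {
    "oak_coffee_table.jpg": [0.8, 0.2, 0.3, 0.6],
    "red_sofa.jpg": [0.1, 0.9, 0.8, 0.2],
}

# Rank candidates by how visually similar they are to the query image.
ranked = sorted(candidates, key=lambda n: cosine_similarity(query_vec, candidates[n]), reverse=True)
print(ranked)  # most similar candidate first
```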
Flood forecasting - 2018
To help people better prepare for impending floods, Google created AI forecasting models that predict when and where devastating floods will occur. Today, Google has expanded flood warnings to 80 countries.
BERT - 2019
A big part of what makes Search helpful is the ability to understand language.
In 2018, the tech company introduced and open-sourced a neural network-based technique to train its language understanding models called BERT (Bidirectional Encoder Representations from Transformers).
According to Google, BERT makes Search more helpful by better understanding language: it considers the full context of a word, looking at the words that come before and after it.
“After rigorous testing in 2019, we applied BERT to more than 70 languages,” they said.
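Since BERT was open-sourced, its bidirectional context can be seen first-hand. The sketch below uses the Hugging Face transformers library, a tooling choice of this example rather than something the article mentions:

```python
from transformers import pipeline

# Load the open-sourced BERT base model for masked-word prediction.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT reads the words on BOTH sides of [MASK] before predicting it.
for prediction in fill_mask("She deposited the cheque at the [MASK] on Monday.")[:3]:
    print(prediction["token_str"], round(prediction["score"], 3))
```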
Shopping Graph and Hum to Search - 2020
“Online shopping became a whole lot easier and more comprehensive when we made it free for any retailer or brand to show their products on Google,” the company said.
They also introduced the Shopping Graph, a constantly updating, AI-powered dataset of products, sellers, brands, reviews, and local inventory that today comprises 35 billion product listings.
Google also launched Hum to Search in its app, so users are no longer left frustrated when they can’t name the tune stuck in their heads.
The machine learning feature identifies potential song matches after you hum, whistle, or sing a melody. One can also explore information on the song and artist.
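A toy way to picture melody matching (Google’s feature uses machine-learning models over audio; this hypothetical sketch compares symbolic pitch sequences only) is to reduce each melody to its up/down contour, a shape that survives humming off-key:

```python
def contour(pitches: list[int]) -> list[int]:
    # +1 if the melody steps up, -1 if down, 0 if flat, between adjacent notes.
    return [(a < b) - (a > b) for a, b in zip(pitches, pitches[1:])]

# Hypothetical reference melodies as MIDI note numbers.
SONGS = {
    "Twinkle Twinkle": [60, 60, 67, 67, 69, 69, 67],
    "Happy Birthday": [60, 60, 62, 60, 65, 64],
}

def best_match(hummed: list[int]) -> str:
    # Pick the song whose contour agrees most with the hummed contour.
    target = contour(hummed)
    return max(SONGS, key=lambda s: sum(x == y for x, y in zip(contour(SONGS[s]), target)))

# A transposed (off-key) hum of the same tune still matches by contour.
print(best_match([55, 55, 62, 62, 64, 64, 62]))  # -> Twinkle Twinkle
```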
About this result - 2021
To help people make more informed decisions about which results will be most useful and reliable for them, Google added the “About this result” feature next to most search results.
Google said that it explains why a result is being shown to you and gives more context about the content and its source, based on best practices from information literacy experts. ‘About this result’ is now available in all languages where Search is available.
Multisearch - 2022
Multisearch was created to let users search with text and images at the same time, helping them uncover the information they are looking for.
“Now you can snap a photo of your dining set and add the query ‘coffee table’ to find a matching table,” Google noted.
First launched in the U.S., Multisearch is now available globally on mobile, in all languages and countries where Lens is available.
Search Labs and Search Generative Experience (SGE) - 2023
“Every year in Search, we do hundreds of thousands of experiments to figure out how to make Google more helpful for our users,” Google added.
With Search Labs, users can test early-stage experiments and share feedback directly with the teams working on them.
“The first experiment, SGE, brings the power of generative AI directly into Search. You can get the gist of a topic with AI-powered overviews, pointers to explore more, and natural ways to ask follow-ups.”
Since SGE launched in the U.S., Google has rapidly added new capabilities, with more to come.