Growth of Visual Searching

For years, people have used text, and more recently speech, to search for items and information on the internet from their computers and mobile devices. As the ability to search the web continues to evolve, so do the methods people use to perform those searches.

Visual search has now moved to the forefront of the user experience. This approach lets individuals use their phone’s camera to take pictures of items, locations, or anything they find interesting, and easily find information about the snapshot’s content.

Companies, brands, and ecommerce websites are now embracing images as a query instead of text. The emphasis is shifting toward pushing images and how a product is shown to a consumer, rather than having the consumer try to remember a product and do the searching themselves.

For many, this is easier than text-based searching because humans are built to process visual information; we are visual beings. We are more inclined to use a photograph or image when explaining a location we visited or something we saw to another person. Snapping a quick picture and having it at your fingertips for reference saves you from having to remember specific details.

The Future of Searching

Though text-based searching is still, and will continue to be, one of the main methods people turn to, visual search is quickly gaining adoption.

The ability to snap a picture of a piece of clothing or another item you are interested in removes the barrier of trying to remember its name or specifics. A quick turnaround in getting information about something you like is more appealing, and people are accustomed to the instant gratification of receiving information in the moment.

In 2017, Jumpshot and Moz reported that image-based searches accounted for roughly 27% of all searches across the 10 major search properties on the web, a sign that visual search is gaining traction and that users see the benefit of searching based on the images they snap.

Pinterest has already benefited from adopting this capability early, letting its users snap pictures of items, find similar items, and then pin them to their boards.

Google, Microsoft, and other corporations are jumping on the bandwagon to gear their services toward the needs of users. Product and service companies are also capitalizing on the way everyday consumers search for images and products.

Brands Already Taking Advantage

As people adopt this new way of searching, companies are taking advantage of the method to advance their brands. Such names include Pinterest, Google, eBay, Bing, and Target.

Pinterest Lens

Pinterest was one of the first adopters, incorporating visual search into its already popular product. Its mobile application includes a feature called “Lens” that allows users to take pictures of items they see or like.

From there, they highlight what they like in the picture and find similar “pins” that other users have posted, or items related to the one they selected. These range from clothing styles and home décor to food items and recipes.

Pinterest Lens

As a Pinterest user, mainly of the web version, I was interested in how Pinterest Lens would work. After opening the application, you simply tap the camera icon located in the search bar and take a picture of something around you. Pinterest processes the image and loads various pins from other users that contain similar items.

One touch I find intuitive from a user experience standpoint: where the camera icon was previously located, the app shows the picture you just took along with tags for what your image resembles.

According to Pinterest, Lens has taken off, with users doing more than 600 million visual searches a month, a number likely to increase as the feature continues to gain traction. Presently, Pinterest only scans for items within its own application, drawing on what other users have pinned, unlike Bing and Google, which scan the entire web for content similar to your snap.

Google Lens

As the go-to option for searching the internet, Google has of course taken advantage of visual search to improve its already intuitive and accessible engine.

Google released its own picture-based search service, “Google Lens,” which is native to Android devices, making it accessible to users without having to download a separate application.

Using Google Lens, you open Google Photos and the app scans through the photos stored on your device. Following the scan, you can select a photo and tap the "recognition" button. Lens then scans the selected image for faces, locations, and anything else it can recognize to produce a proper search result.

Google Lens

Trying out this functionality myself, I used a picture I took at Graffiti Park in Austin two weeks ago. Google Lens easily recognized the location from the skyline and various aspects of the image, providing me with the name of the place and reviews, and asking if I wanted to search further about it on Google.
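Google doesn’t offer Lens itself as something developers can call, but for readers curious how that kind of landmark recognition might be reproduced in code, here is a minimal sketch using Google’s Cloud Vision API, a separate developer-facing service that includes landmark detection. The API key, file name, and the choice of that particular feature are assumptions on my part, for illustration only.

```python
# Hypothetical sketch: approximating Lens-style landmark recognition with
# Google's Cloud Vision API (a developer service, not Lens itself).
import base64
import requests

API_KEY = "YOUR_API_KEY"  # assumption: a Cloud Vision API key is available
ENDPOINT = f"https://vision.googleapis.com/v1/images:annotate?key={API_KEY}"

# Read the photo and encode it as base64, as the annotate endpoint expects.
with open("graffiti_park.jpg", "rb") as f:  # hypothetical file name
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {
    "requests": [{
        "image": {"content": image_b64},
        "features": [{"type": "LANDMARK_DETECTION", "maxResults": 3}],
    }]
}

response = requests.post(ENDPOINT, json=payload)
response.raise_for_status()

# Each landmark annotation carries a description (the place name) and a score.
for annotation in response.json()["responses"][0].get("landmarkAnnotations", []):
    print(annotation["description"], annotation["score"])
```

The response lists any recognized landmarks with a name and confidence score, which roughly mirrors what Lens surfaced for my Graffiti Park photo.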

Following in the footsteps of Pinterest and Bing, Google Lens will allow users to search for styles, enabling brands to feature their clothing and other hot items for people to purchase. Online shopping should get a major boost from the ease of finding styles and items. Along with providing additional context to images, users can take pictures of food or restaurants to receive links to recipes or even the hours of a potential lunch hotspot.

Already the king of the search engine mountain, Google will be exciting to watch as it continues to evolve this service and adapts to people searching visually. Google is already taking advantage of artificial intelligence and continuous learning to improve the way its app delivers information to end users.

Bing Visual Search

Recently Microsoft joined in, announcing that their popular search engine, Bing, would also start incorporating visual searching.

As part of Code Authority, a Microsoft Gold Certified Partner, I was able to sit in on a presentation given by Microsoft developers about the inclusion and capabilities of their Bing Visual Search functionality. The session covered the internal workings of their visual search engine and the networks it pulls its data from.

Microsoft has taken the approach of building upon pre-existing networks through continuous learning, enabling its users to scan the entire web for images within images, not just the content of a single application.

Bing Visual Search

Along with adding it to their own application, Microsoft is putting the power of this visual search engine in the hands of other developers to incorporate into their services. Giving the service to developers lets brands improve the way consumers search for their products and find recommendations for related items.

Microsoft continues to enhance the service, allowing people to highlight specific items within a picture, such as an article of clothing, a lamp on a table, or a person in a photo they think might be a celebrity.

From a developer standpoint, it seems that Microsoft wants to put the power of continuous learning in the hands of other developers to see how they can use it to their advantage.
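To give a rough sense of what that developer access looks like, below is a minimal sketch of calling the Bing Visual Search API with a snapped photo. The endpoint, header, and response shape follow Microsoft’s published API as I understand it, while the subscription key, file name, and the decision to print only the visually similar matches are my own assumptions.

```python
# Hypothetical sketch of sending an image to the Bing Visual Search API.
import requests

SUBSCRIPTION_KEY = "YOUR_BING_KEY"  # assumption: an Azure Cognitive Services key
ENDPOINT = "https://api.cognitive.microsoft.com/bing/v7.0/images/visualsearch"

headers = {"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY}

# The image is uploaded as multipart form data; Bing responds with "tags",
# each holding actions such as visually similar images or related products.
with open("lamp_photo.jpg", "rb") as f:  # hypothetical file name
    files = {"image": ("lamp_photo.jpg", f)}
    response = requests.post(ENDPOINT, headers=headers, files=files)

response.raise_for_status()

# Print the top visually similar results for each tag Bing identified.
for tag in response.json().get("tags", []):
    for action in tag.get("actions", []):
        if action.get("actionType") == "VisualSearch":
            for match in action["data"]["value"][:5]:
                print(match["name"], match["hostPageUrl"])
```

A brand could swap the print loop for its own product matching or recommendation logic, which is exactly the kind of incorporation the presentation described.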

How Can Visual Searching Continue to Evolve?

As a user-centric designer (interface and experience), I always look toward how new technologies and ideologies can be improved and evolved. Visual search offers a unique opportunity to build on an already beneficial concept.

From a commerce standpoint, you can already find items similar to the one in your photo.

Concert/Music

For concertgoers and music lovers, this could be beneficial and take a unique spin. At the many concerts I’ve attended, I’ve noticed that people love to take pictures and videos, myself included. When it comes to opening bands, you might not know much about the artist or their popular songs.

By taking a picture or video of the band, you could get details about the artist, similar artists, their discography, songs you might enjoy, concerts in the area with a similar sound, and more. Surfacing the latest news and collaborations would help a user keep up to date with what is coming out. If you are already a fan of the band, the option to purchase merchandise or preview items about to go on sale would help the artist gain traction with their fan base.

While Google Photos does allow some of this, Google unfortunately directs you to open Chrome or Safari to view further information about these artists, and from a user experience standpoint, having to open a browser for that information is a bit tedious. Having it all within the Photos app would be a fantastic way to get started on viewing content, leading you down a rabbit hole later on.

These visual search capabilities could also tie into other applications like Spotify, YouTube, Instagram, and Snapchat. If a user takes a picture of an artist, they could have the option to start streaming that artist’s most popular songs, view YouTube music videos, see the latest images or videos taken by people at the band’s concerts, or browse the most recent Snapchat stories. This would let a viewer capture the essence and energy of being at the performance and potentially gain interest in going to the next available show.
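As a sketch of how the Spotify tie-in might work once visual search has identified an artist, the snippet below hands a recognized artist name to the Spotify Web API and pulls back their top tracks. The access token, the helper function, and the assumption that recognition has already produced a clean artist name are all hypothetical.

```python
# Hypothetical sketch: from a recognized artist name to their most popular songs.
import requests

ACCESS_TOKEN = "YOUR_SPOTIFY_TOKEN"  # assumption: an OAuth token is already available
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}


def top_tracks_for(artist_name: str, market: str = "US"):
    # Look the artist up by name first...
    search = requests.get(
        "https://api.spotify.com/v1/search",
        headers=HEADERS,
        params={"q": artist_name, "type": "artist", "limit": 1},
    )
    search.raise_for_status()
    artists = search.json()["artists"]["items"]
    if not artists:
        return []

    # ...then pull their top tracks for the listener's market.
    artist_id = artists[0]["id"]
    tracks = requests.get(
        f"https://api.spotify.com/v1/artists/{artist_id}/top-tracks",
        headers=HEADERS,
        params={"market": market},
    )
    tracks.raise_for_status()
    return [t["name"] for t in tracks.json()["tracks"]]


# Hypothetical usage: the name would come from the visual search step.
print(top_tracks_for("Name of the opening band"))
```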

Gamers

As a former professional gamer, and still a hobby gamer from time to time, I can see these capabilities tying into the field of video gaming as well.

Professional gamers use different rigs along with mice, keyboards, and headsets, which at times can give them an advantage over the competition. That advantage entices other players to chase the same style of gaming and “play like the pros.”

Google Lens Gaming

Marketers and retail sites could benefit from a kid taking a picture of their favorite player or rig and getting information on the spot: the type of computer, including the hardware, mouse, keyboard, and headset used to play their favorite games. Diving deeper, a gamer could take a picture of a mouse during an in-game moment and get the exact sensitivity settings that player is using.

As my mind continues to race, another concept comes to mind: an individual taking a picture of a game in action. Visual search results could identify the monitor or TV the game is being played on, along with the optimal settings for it, where to buy that monitor, where to buy the game, more information on the game and the console it runs on, and possibly even a bundled purchase of the game and setup at the best price.

Home Décor

People are already using visual search to improve their home décor and shopping. As it stands, users can take a picture of an item in a room and view similar items or see where it is currently sold.

To improve upon this functionality, a user could take a picture of a room, select a few items, and find room layouts featuring similar pieces, along with suggestions for which items work well together for feng shui and which additional objects would pair well with those selected.

Diving further into the possibilities, developers could allow users to select items within videos and view them from all angles in an interactive augmented reality 3D display. Users could even contact a seller and purchase the item.

Hair Care/Styles

Within the cosmetic industry, people already bring images to their hair stylists to give them an idea of the style they would like.

Hairstyle Search

With visual search, someone could take a picture of a hairstyle to find salons and stylists who do that type of style. This reduces the friction of manually looking for salons or researching multiple locations in your area just to find the right one.

Fitness Industry

As a fitness enthusiast and weightlifter of over 12 years, I see incorporating visual search into the fitness industry and tying it into workouts as a major plus. Before joining Code Authority, I was a developer and UI/UX designer on a startup fitness application called JEFIT, which helped users with their fitness journeys, so I am very intrigued by how this functionality could aid them.

The purchase and use of supplements plays a major role in marketing and in encouraging people to start a fitness regimen. Advertising companies push products that will “change someone’s life,” help them lose weight, and turn them into fitness gods. A marketing company or supplement store could take advantage of users photographing its products, giving them insight into where to buy them. Information on athletes who take these supplements, video reviews, nutritional facts, the best prices nearby, and the effects the supplements could provide would all flow from visual search.

Exercise Search

Additionally, taking a video or picture of a workout to understand various exercises would be a major benefit. Conceptually, someone would take a picture or video of another individual doing an exercise and, through visual search, receive information about the movement: how to perform it correctly, the muscles it works, similar exercises, and the long-term effects of performing it. For those intimidated by entering a gym or afraid of doing exercises incorrectly, this would help them get started.

Medical Assessment

People are already turning to medical sites and applications that use images to determine whether they have a medical condition or issue. Through visual search, people could take pictures of a potential rash, bruise, or other condition they may be experiencing and view images similar to their symptoms.

From there, tabbed options or a continuous scroll could appear, allowing users to browse symptoms, doctors who treat these issues, over-the-counter remedies, referrals to seek immediate medical advice, and more. This would also reduce the need for people to rush out for medical examinations over basic illnesses or ailments.

Moving Forward

The possibilities for visual search are endless, and as the feature continues to evolve, so will the ways user experience designers adapt and evolve with it. We already enjoy the benefits of voice search, mainly by asking Alexa, Siri, or Google Assistant to look up information for us.

Artificial intelligence and user learning will also greatly improve this functionality, as continuous learning anticipates how users approach it and builds on the ways they put it to their benefit.