How do search engines compare a submitted photo to all the others in their database? This is an interesting question, and not an easy one to answer.
Reverse image search is, of course, built on image processing and classification algorithms.
There are many different approaches, and honestly, I cannot tell you which one is currently used by which search engine. I can only list the approaches that have most probably been used in reverse image search technology:
- SIFT (scale-invariant feature transform) features are well suited to recognizing similar pictures: they are invariant to scale and location, and largely to rotation as well.
- SURF (speeded-up robust features) essentially speeds up SIFT by approximating its Gaussian filters with box filters computed on integral images.
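Whatever descriptors an engine extracts, the core idea is the same: reduce each image to a compact fingerprint that can be compared cheaply against millions of stored fingerprints. As a minimal sketch of that idea, here is an average hash (aHash) in plain Python. To be clear, this is an illustration I am adding, not what any search engine is confirmed to use, and it is far less robust than SIFT or SURF descriptors; it only shows the fingerprint-and-compare pattern.

```python
# Sketch of the "fingerprint and compare" idea behind reverse image
# search. Average hash (aHash) is a toy technique, far weaker than
# SIFT/SURF, but it shows how an image becomes a 64-bit code that can
# be compared against a huge database via Hamming distance.

def average_hash(pixels, size=8):
    """Reduce a grayscale image (2-D list of 0-255 values) to a 64-bit
    fingerprint: downscale to size x size cells, then set one bit per
    cell depending on whether it is brighter than the overall mean."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            # Naive box downsampling: average one block of the source.
            r0, r1 = r * h // size, (r + 1) * h // size
            c0, c1 = c * w // size, (c + 1) * w // size
            block = [pixels[i][j] for i in range(r0, r1) for j in range(c0, c1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    bits = 0
    for v in cells:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Two synthetic 16x16 "images": a bright square on a dark background,
# and the same square shifted by one pixel -- near-duplicates.
img1 = [[255 if 4 <= r < 12 and 4 <= c < 12 else 0 for c in range(16)]
        for r in range(16)]
img2 = [[255 if 5 <= r < 13 and 5 <= c < 13 else 0 for c in range(16)]
        for r in range(16)]

print(hamming(average_hash(img1), average_hash(img2)))  # small distance
```

A real engine would index such fingerprints so that near matches can be found without scanning every entry; the robustness of SIFT-style descriptors comes precisely from replacing this crude global hash with many local, scale-invariant features.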
What search engines reveal
Now and then there is some insight into what certain reverse image search engines are really doing. I will collect such links here:
To find out which technology the giant search engine Google really uses, one should keep an eye on its patent filings; Google files new patents very actively. Here is a list of the most important recently granted ones:
- Retrieval of similar images to a query image
Publication date: May 21, 2015
- Mobile image-based information retrieval System
Publication date: June 14, 2011
Please contribute if you find more details. Send me a note!