The other day, during a spam panel at SMX, Barry Schwartz asked Matt Cutts and a Bing representative about the use of whitelists and exceptions in Google's and Bing's algorithms.
After some beating around the bush, the answer was that Google does use whitelists (they call them "exception lists"), but these whitelists are tied to individual algorithms. That is, there is no global whitelist that makes certain websites immune to every change or improvement in the search results; instead, whitelists are created for specific algorithms, since no algorithm is 100% perfect.
In short, these whitelists, or emergency lists, exist so that certain "innocent" sites do not get caught by filters or major algorithm changes. The example given was example.edu, which contains the term "sex" but is really the name of a university, and therefore should not be removed from the results by the SafeSearch filter.
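Neither Google nor Bing has published how their exception lists actually work, but the idea described above can be sketched in a few lines: each algorithm (here, a hypothetical SafeSearch-style filter) carries its own whitelist, rather than there being one global whitelist that exempts a site from every algorithm. All names, terms, and domains below are illustrative assumptions.

```python
# Hypothetical per-algorithm exception list: only this filter consults it.
SAFESEARCH_EXCEPTIONS = {"example.edu"}
# Hypothetical trigger terms for the filter.
TRIGGER_TERMS = {"sex", "porn"}

def safesearch_filter(results):
    """Drop results whose text contains a trigger term, unless the domain
    is on this algorithm's own exception list (its 'whitelist')."""
    kept = []
    for domain, text in results:
        flagged = any(term in text.lower() for term in TRIGGER_TERMS)
        if flagged and domain not in SAFESEARCH_EXCEPTIONS:
            continue  # filtered out by this algorithm
        kept.append(domain)
    return kept

results = [
    ("example.edu", "University of Essex admissions"),  # 'sex' inside 'Essex'
    ("spam-site.com", "cheap sex pills"),
    ("news-site.com", "daily headlines"),
]
print(safesearch_filter(results))
```

Here example.edu survives the filter because it is on this one algorithm's exception list, while spam-site.com is removed; a different algorithm would have its own, separate list.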
Microsoft, for its part, said it also uses exception lists, and that these are reviewed after "big" algorithm changes so that the whitelisted sites are not affected.
All of this implies manual tweaking of the search results, which was already assumed to happen, especially for "tricky" topics such as copyright, sex, etc.; but it also means manual adjustments that benefit certain websites. It would be interesting to know which sites are on these "whitelists".