While the Internet grows exponentially and serves as a repository for an enormous amount of information, content quality is being compromised. Content relevancy is continually degraded by a growing number of unethical website designers who implement various black hat techniques to trick search engine algorithms into awarding higher rankings. This research aimed to establish how and when search engine algorithms attempt to curb spamdexing. Nevertheless, cloaking, one form of spamdexing, was identified on the search engine result pages (SERPs) of Google, even though the search engine has denounced the practice. Five similar websites with varying keyword densities were designed and submitted to Google, Yahoo! and Bing. Sixteen days into the first experiment, three of these websites were cloaked by an Iranian site; the cloaking lasted 10 days for the first website, 11 days for the second and 39 days for the third. A phase 2 experiment was conducted, and the third website was not indexed by Google in either phase, despite having a high keyword density, a factor that scholars associate with better ranking. The Iranian webmasters may also have opted to scrape the site, presumably in expectation of attracting visitors searching for laptops rather than the information the site actually contained. None of the websites submitted to Yahoo! and Bing were cloaked. This research provides evidence that search engine algorithms still fail to fully address these practices and that some developers implement cloaking without being identified by the search engines. The research also established that such practices can prolong the waiting time for indexing, and this may result in some websites not being indexed at all.
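Keyword density, the variable manipulated across the five test websites, is conventionally computed as the number of occurrences of a keyword divided by the total word count of a page. A minimal sketch of that calculation follows; the function name and sample text are illustrative and are not taken from the study's experimental websites:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences as a fraction of total words on the page."""
    # Tokenize on simple word characters; real crawlers are more elaborate.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# Illustrative page copy, echoing the laptop theme mentioned in the abstract.
page = "Cheap laptops for sale. Our laptops ship fast. Buy laptops today."
print(f"{keyword_density(page, 'laptops') * 100:.1f}%")  # → 27.3%
```

A page stuffed with the same keyword would push this ratio far above typical editorial prose, which is one signal spamdexing exploits and search engines attempt to penalize.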