The practice of repeating a target keyword as many times as possible to earn higher search rankings has been challenged in recent years, and keyword-centred on-page SEO is rapidly becoming obsolete. Beyond keyword count, other metrics such as LDA scores are now part of the picture; today more than 200 ranking factors are involved in the process. Close studies have also pointed to a departure from the usual trend: for certain terms, such as "car insurance" and "television", keyword density is inversely related to rankings. This suggests that search engines now use advanced algorithms that can even penalise content for using a keyword too many times. For a word like "dress", however, the search results still follow the usual trend, with the highest rankings going to pages that mention the keyword most often. This again shows that the importance of keywords varies from query to query.

As a case in point, for the search "Liverpool FC fan blog", a text containing words like "excitement", "gutted", "anticipation" and "petrol money" is more relevant than a text with words like "Anfield", "Champions" or "Kenny Dalglish". Latent Dirichlet Allocation, or LDA, is a very effective method for measuring this kind of relevance: it compares two texts and rates them by looking at related keywords rather than exact keyword matches.

The Search Engine Optimization community should understand these nuances of writing better content. Instead of focusing on improving a single metric, optimisers should take all the metrics into account and strive for improvement in every aspect. The job is not to trick the search engine into ranking your content highly, but to make readers actually want to read your content. Search engine optimisers should be open to incorporating this change in their viewpoint. Finally, it is the balancing act of understanding search engine algorithms while bringing overall improvement to your content that will ensure a top-ranking site.
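To make the LDA idea concrete, here is a minimal sketch of comparing texts by their inferred topic distributions rather than by exact keyword overlap, using scikit-learn's `LatentDirichletAllocation`. The toy corpus, the choice of two topics, and the helper names are illustrative assumptions, not a real ranking pipeline; production systems train on far larger corpora.

```python
# Illustrative sketch: score text relevance via LDA topic distributions,
# not raw keyword counts. Corpus and topic count are toy assumptions.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# A tiny hypothetical corpus mixing fan-blog language with commercial queries.
corpus = [
    "liverpool fans share excitement and anticipation before the match",
    "gutted fans discuss petrol money spent travelling to away games",
    "car insurance quotes compare premiums and coverage online",
    "cheap television deals and screen size comparison guide",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(corpus)

# Fit a small LDA model; each document becomes a distribution over topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

def topic_vector(text: str) -> np.ndarray:
    """Infer the topic distribution for a new piece of text."""
    return lda.transform(vectorizer.transform([text]))[0]

def topic_similarity(a: str, b: str) -> float:
    """Cosine similarity between the topic distributions of two texts."""
    va, vb = topic_vector(a), topic_vector(b)
    return float(np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb)))
```

Under this scheme a text full of "excitement" and "petrol money" can score close to a fan-blog query even if it never mentions "Liverpool", because similarity is computed in topic space rather than keyword space.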