Content Authority

A common problem across many websites is the lack of unique content. Most websites have unique navigation menus and search bars, but not enough of the key element: fresh, unique content.

The suggestions are to:
Categorize each section of your website into mini-niches and provide unique content for each niche. These content pages MUST bring further value to your visitors. Elaborate as much as possible on each topic in each section. Your goal is to turn your website into an encyclopaedia populated with original, targeted, high-quality unique content.

When working on each page, try to make it keyword-rich for one or more selected keywords. This means maintaining a keyword density of 3% to 7% for the target term. Remember to bold and sometimes even underline the target term. This signals to search engine spiders (combined with other SEO factors) which keywords you are weighting in that particular section.
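Keyword density here is simply occurrences of the target term over total word count, as a percentage. A minimal sketch of that calculation (the `keyword_density` helper and the sample text are illustrations, not part of any SEO tool):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` accounted for by `keyword`.

    Multi-word keywords are matched as whole phrases; the density is
    (phrase hits x words per phrase) / total words x 100.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = re.findall(r"[a-z0-9']+", keyword.lower())
    if not words or not kw_words:
        return 0.0
    n = len(kw_words)
    # Slide a window of n words over the page and count exact phrase matches.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == kw_words
    )
    return 100.0 * hits * n / len(words)

sample = ("Our guide to boutique hotels in Barcelona covers the Gothic "
          "Quarter, the beach districts, and family-friendly stays near "
          "the city centre.")
print(round(keyword_density(sample, "Barcelona"), 1))  # → 4.5
```

A score inside the 3%–7% band suggested above would pass; anything far outside it flags the page for rewriting.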

Organize your website so that each step down the hierarchy leads to more specific content. If multiple sections of your website all redirect to the same pages, you are not providing the navigational flow that Google expects from a quality website. Instead, dig deeper into each section when providing links, using long-tail or very specific keywords for this purpose. By covering all relevant aspects and variations, you lead Google to index your website as an authoritative content source in the niche, establishing it as a top authority in the field.

It is extremely important to avoid duplicate content issues. Each section should produce its own unique content. If we repeat the same content, replacing only the city names, we may be vulnerable to duplicate-content penalties. This actually happened to one of the most important travel and booking websites on the Internet: its Latin American branch was reusing the same rough content across several sections, changing only the target keyword. Google quickly noticed the trick and applied severe penalties to the website.
The same idea applies if we simply pull the information for each city from an API or a public resource: that is NOT unique content. So if that is the only option (instead of producing original, in-house content), I suggest mixing the formula by injecting some of our own hand-made content among the scraped public data.
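One simple way to audit a site for the city-swapped duplication described above is to compare pages by word shingles and Jaccard similarity. A minimal sketch, where the helper names, shingle size, and sample pages are assumptions for illustration:

```python
import re

def shingles(text: str, k: int = 5) -> set:
    """Set of overlapping k-word windows ('shingles') from `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str, k: int = 5) -> float:
    """Jaccard similarity of two pages' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical city pages that differ only in the city name:
madrid = ("Find the best hotels with our curated guide, compare prices "
          "across hundreds of booking sites, read verified reviews, and "
          "discover family friendly neighbourhoods in Madrid before you travel")
bogota = madrid.replace("Madrid", "Bogota")
print(round(jaccard(madrid, bogota), 2))
```

Template pages that only swap the city name score close to 1.0, while genuinely rewritten pages score near 0.0; any threshold for "too similar" is a judgment call, but pairs scoring above roughly 0.5 are good candidates for a rewrite.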