As information technologies develop and search engines grow more sophisticated, improving your online presence requires search engine optimization (SEO). Google's search algorithms change more frequently than they used to and keep getting smarter; the technology behind them is steadily improving, and the old "cat and mouse" game is losing its point. It is now more profitable to play by the house rules than to try to break them, and with the right strategy from the start, doing so is cost-effective. The top ranking factors used by search engines include: user experience (UX), page-loading speed, relevant content, mobile-friendliness, a strong query profile, traffic, and page layout.
Google search is now focused on delivering high-quality content for mobile devices. This did not happen out of the blue: it was announced in 2016 on the official webmasters blog, https://webmasters.googleblog.com/2016/11/mobile-first-indexing.html. Adapting your website for phones, including its navigation, the features it offers, and how content is accessed, is among the top priorities. Pages are ranked according to User Experience (UX) principles. In addition, experts advise keeping content consistent between the desktop and mobile versions: carefully optimized texts will be of little or no use if they are cropped or shortened to save space on a small screen.
Another technology worth integrating is AMP. Accelerated Mobile Pages (AMP) is a project led by Google intended to improve the performance of web content and advertisements. News publishers such as CNBC and Gizmodo report that AMP pages load three times faster than non-AMP pages. The format is used across content-heavy industries: search engines, social platforms, content publishing, e-commerce, and financial services.
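In a typical AMP setup, the regular page and its AMP counterpart point at each other via `<link>` tags in the `<head>`, which is how crawlers discover the pairing (the URLs below are placeholders):

```html
<!-- On the regular (canonical) page: -->
<link rel="amphtml" href="https://example.com/article.amp.html">

<!-- On the AMP page: -->
<link rel="canonical" href="https://example.com/article.html">
```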
Great query profile
Keywords are fundamental to the online marketing process. Search engines store web pages in an index that is largely keyword-based. Over the past few years, people worldwide have been using mobile devices extensively, treating them almost like personal assistants. It is therefore extremely important to make your pages findable through clear, detailed queries. About 40% of users now use Google Voice Input on their mobile phones and tablets. Speech recognition has reached a high level of accuracy; it saves time and is gradually becoming a full-fledged alternative to the keyboard. Research suggests that within the next three years the number of voice queries will catch up with text ones.
User’s session duration
Another important parameter is session duration. The longer unique users remain on the website, the more relevant its information presumably is to them, and the higher the website scores in the SERPs. According to SearchMetrics reports, 3 minutes 10 seconds is the average session duration for pages in the top ten results. It is perhaps not surprising that position in search results also correlates with article length, with the best results achieved by pages of about 2,000-3,000 words. Content planning can make your website rise to the top or sink out of sight; that is how important it is, and it usually requires the most time. Don't panic if session duration on your website is less than average: it also depends on the website's niche.
To start with, let's take a look at the website's layout and design. No one would deny that a good-looking site with a readable style, concise text, and easy-to-find information is preferable. The first step is to structure the website's categories according to the query profile explained earlier. Implement breadcrumbs for easier navigation. Make sure the robots.txt file is in place and properly configured: it lives in the site's root directory and contains instructions telling search robots which folders they may or may not crawl. Indexing is what makes pages reachable through search queries on Google and other search engines.
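A minimal robots.txt might look like this (the disallowed paths are hypothetical examples; adjust them to your own site):

```
User-agent: *
Disallow: /admin/
Disallow: /cart/
Sitemap: https://example.com/sitemap.xml
```

The `Sitemap:` line is optional but helps crawlers find the sitemap described below.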
The next step is creating a sitemap.xml file. This is an analogue of a site map designed specifically for search robots. Ensure that all pages with relevant content are listed there, as those are the pages that will be indexed. Pages that have the same content but can be reached through different URLs are considered duplicates and should be avoided, as they have a negative impact on ranking. Each article or product should be accessible through exactly one URL.
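A minimal sitemap.xml, following the sitemaps.org protocol, could be sketched like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/category/some-product-1</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/some-article</loc>
  </url>
</urlset>
```

Only `<loc>` is required per entry; `<lastmod>` and similar tags are optional hints.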
Homepage URL variants (for example, with and without www, or over http and https) are considered different pages by search engines, even though they all lead to the same homepage. Use site mirroring: keep one version of your website, for example the one with www, and redirect all the others to it. One way to do this is a 301 redirect, a permanent redirect that passes link weight to the target page.
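On an Apache server with mod_rewrite enabled, redirecting the non-www variant to the www one can be sketched in .htaccess like this (replace example.com with your domain):

```apache
RewriteEngine On
# Permanently redirect example.com to www.example.com
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```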
Search engines like Google put a premium on security. One way to meet this expectation is to install an SSL certificate and serve all traffic in encrypted form. Otherwise the site is treated as unsafe, users will be discouraged from accessing it, and both its ranking score and its reputation can be harmed considerably.
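On Apache, forcing all traffic onto HTTPS can be sketched with another .htaccess rule (again assuming mod_rewrite is enabled and a valid certificate is installed):

```apache
RewriteEngine On
# Redirect any plain-HTTP request to its HTTPS equivalent
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```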
Ensure proper error handling when someone tries to access a non-existent or deleted page. An HTTP 404 response should make the CMS show visitors a friendly "The requested page is deleted or not available" page. On Apache this can be configured in the .htaccess file.
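A single .htaccess directive is enough, assuming a custom error page exists at /404.html (the path is a placeholder):

```apache
ErrorDocument 404 /404.html
```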
Divide URLs into groups: category-page URLs end with a "/", while product pages and articles do not. URLs should be legible, easy to remember, and not too long; this makes links more intuitive and understandable for visitors. In addition, apply the "3 clicks" strategy when planning your content hierarchy: any page should be reachable in 3 clicks or fewer.
Another important tool is Google Search Console (formerly Google Webmaster Tools). It provides web developers and marketers with powerful and robust tools for both development and optimisation at every stage. There you need to choose the main mirror of your website, for example with or without www; neglecting this will create duplicates in the search index.
Meta title, meta description, headers
Title, meta description, and headers are powerful tools for SEO. It is worth mentioning that even though headers are often misused by web developers purely for text styling, they are source information for search engine crawlers and should not be taken lightly.
Follow these recommendations for meta titles and meta descriptions:
- Ensure that all pages have unique titles, descriptions and h1 tags.
- Meta titles should be about 35 to 60 characters long. The title text appears in search results as the clickable link to your website, and titles longer than the recommended length may be cropped at the most interesting part.
- Titles must contain relevant keywords, and no keyword should be repeated more than twice.
- Meta descriptions should be about 120-156 characters. Search engines use their contents as the snippet in search results: a brief summary of what the page is about. Keep in mind that descriptions are written primarily for people, not for search engines, and anything longer than about 200 characters will be cropped.
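Putting both together, the `<head>` of a product-category page might contain something like this (the shop name and copy are invented examples):

```html
<head>
  <title>Handmade Leather Bags | Example Shop</title>
  <meta name="description"
        content="Browse handmade leather bags in classic and modern
                 styles. Free shipping on orders over $50.">
</head>
```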
The following recommendations apply to heading tags:
- Heading tags H1-H6 are used in a nested hierarchy: H1 -> H2 -> H3 and so on.
- The H1 header is used only once on each page.
- Headers should contain 1 to 7 words.
- H1 and the title should not be identical; paraphrase one of them.
- Avoid using tags such as <span>, <em>, or <a> inside header tags.
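A correctly nested heading structure might look like this (the topic is illustrative; the indentation is only for readability):

```html
<h1>Caring for Leather Goods</h1>
  <h2>Cleaning</h2>
    <h3>Removing stains</h3>
  <h2>Conditioning and storage</h2>
```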
The following recommendations apply to images. The <img> tag should carry alt and title attributes. The title pops up when a user hovers the mouse over the picture, and it also appears in image search results. The alt attribute is alternative text for the picture, shown when the image cannot be loaded; it is also used by search engines when assessing the relevance of information. Make sure the alt attribute contains relevant keywords, and for higher rankings make sure the alt and title attributes are not exactly the same.
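For example (the file name and both texts are placeholders; note that alt and title are related but not identical):

```html
<img src="/images/red-leather-bag.jpg"
     alt="Red handmade leather tote bag"
     title="Red leather tote, handmade in Italy">
```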
Make sure all your pages are reachable from the home page: broken interlinking prevents search engines from crawling and indexing pages. Microdata affects how your website appears in the SERPs and helps it stand out from competitors. Marked-up data about reviews, events, organizations, people, and article authors can be shown to users directly in search results.
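As a sketch, a product with an aggregate rating can be marked up with schema.org microdata like this (the product name and numbers are invented):

```html
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Red Leather Tote</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="https://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.7</span>/5
    based on <span itemprop="reviewCount">113</span> reviews
  </div>
</div>
```

With markup like this, search engines may render the rating as star snippets next to the result.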
Information that builds trust
Your goal is to improve the overall user experience, create a platform that ensures a high conversion rate, and, in the long run, turn visitors into customers. But first you need to build trust with them. A feedback or comments form, contact information, a vacancies page, and media content such as videos all help. Registering with Google My Business and filling in information about your business, with photos, is a good sign of trust. Genuine customer reviews are an additional bonus; for instance, trustpilot.com offers such a service. It is a website that publishes reviews of online businesses and helps shoppers decide what and where to buy; Trustpilot also offers advertising features for online businesses. For this to work, install a review plugin in your CMS or add external links to review websites.
Search engines and humans see pages differently. For one, information that we might find interesting and desirable when it is embedded in a media file such as Flash, video, audio, or a Java applet is hidden from crawlers. To make search optimisation consistent, there must be some means of conveying this data to the search engines.
Content with keywords is invisible to crawlers if it sits behind forms that require submission. It also helps for each page to carry at least 150 distinct words of coherent text and 1-2 accompanying pictures; too many pictures or videos may overwhelm users.
Blank pages don't convey any relevant information and thus harm ranking. They should either be deleted, filled with relevant content, or excluded from indexing via Google Search Console, the robots.txt file, or a <meta name="robots" content="noindex, nofollow"> tag in the HTML code of the pages concerned.
If you wish uncrawlable data to be indexed, provide transcripts and descriptions for your podcasts, videos, and graphs in the HTML code.
SEO strategies are necessary because online marketing is highly competitive, with millions of new websites appearing in search results every week. By building your website with search optimisation in mind, you ensure less work needs to be done in the future.
Checklist for programmers
- Requirements for CMS
1.1. Administrative panel: give insight into data, manage user accounts, custom plugins, 301 Redirects, custom design templates, HTML editor, logs, email sender, website layout and page hierarchy control;
1.2. Page-loading speed less than 3 seconds. Test from different locations;
1.3. Generate sitemap on demand;
1.4. mobile-friendly and desktop versions of the website: dynamic adaptation to different screen sizes (desktop, smartphone, tablet), mobile version is optimized for slower Internet connection and less powerful devices.
1.5. Possibility to edit on each page: Title Tag, Meta-Description, URL, Semantic Headlines, Alt-Text of Images;
1.6. Robots.txt Control;
1.7. Customer Reviews plugin, rating in Google Search Results.
1.8. Automatic check for duplicated pages.
1.9. SSL certificate; internal/external links should be accessible through https.
1.10. Integration with Google Tag Manager, Google Analytics, Google Search Console.
- Correctly structured website
2.1. General requirements:
2.1.1. Content is no less than 150 words, including keywords.
2.1.2. Correctly structured Menu according to product categories and needs.
2.1.3. Optimised background color, font and text style.
2.1.4. Company’s logo.
2.2 Requirements for every page:
2.2.1. Implement correct hierarchy of headers H1 -> H2 -> H3.
2.2.2. H1 tag is used once. Contains 1 to 7 words.
2.2.3. Meta title length is 35-60, includes keywords
2.2.4. Meta description length is 120-156, includes keywords
2.2.5. H1 and title are not the same
2.2.6. Content is no less than 150 words, should be related to the title and headers.
2.2.7. Social buttons are in place, at the bottom of the page.
2.2.8. Table of contents for long articles.
2.3 Requirements for URL:
2.3.1. Human-friendly, word-based URLs: https://example.com/human/friendly/links
2.3.2 Category URL: https://example.com/category/
Product URL: https://example.com/category/some-product-1
2.3.3. All internal links are written explicitly.
2.4. Breadcrumbs: correspond to the URL and to the tree structure
2.5 Categories: Home -> Contact us, Blog, Vacancies, Services, Feedback, Comment Sections
2.6. Carousel (slider) for services/products on the main page.
2.7. Microdata, schema.org implementation;
2.8. Sub-section depth less or equal to 3.