A user-friendly website engages users, but usability alone cannot guarantee strong web traffic. Every website needs to establish its online presence, especially a new one.
Search Engine Optimisation (SEO) is a vital factor in establishing your website’s online presence. It helps users find your website easily, and as a result more traffic is driven your way. (Source: B2C)
There are 3 basic things that can make a new website SEO-friendly:
1. Sitemap
A sitemap is a list of web pages that contains a website’s content. (Source: Google Search Console Help) It includes all the important pages that you want users and search engines to find or crawl on your site. (Source: TechTarget) Each page listed in the sitemap links directly to the corresponding section of the site.
The two most common formats are the HTML sitemap and the XML sitemap.
a) HTML Sitemap
A sitemap in HTML format is designed to help users navigate your website’s content and understand how it is arranged.
b) XML Sitemap
A sitemap in XML format is intended for search engine crawlers. It is an organised format that tells the search engine which web pages need to be crawled. (Source: Digital Talks) It uses XML tags, including tags for the URL of the page, the frequency of page updates and the date the file was last modified. (Source: sitemaps.org)
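To make the structure concrete, here is a minimal sketch of an XML sitemap built with Python’s standard library. The URL and dates are placeholders, not real pages:

```python
import xml.etree.ElementTree as ET

# Root <urlset> element with the standard sitemap namespace (sitemaps.org).
urlset = ET.Element(
    "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
)

# One <url> entry per page you want crawled.
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "https://www.example.com/"  # page URL
ET.SubElement(url, "lastmod").text = "2024-01-15"            # last modification
ET.SubElement(url, "changefreq").text = "weekly"             # update frequency

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

Real sitemaps are usually generated automatically by a CMS or plugin, but the output follows this same `<urlset>`/`<url>` structure.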
2. Robots.txt with Sitemap
A robots.txt file is a text file that tells robots (search engine crawlers) which parts of a website they may crawl. It does this through directives, the valid instructions that can be given to crawlers in a robots.txt file. Two directives are especially important:
The “allow” directive identifies the specific directories that crawlers may access. If you set it to “full allow”, all content will be crawled.
The “disallow” directive identifies which directories crawlers should not access. It can express “full disallow” (no content will be crawled) or “conditional allow” (only certain content will be crawled).
(Source: Google Developers)
A robots.txt file should also link to the sitemap so that robots can easily find and access its location. Many robots.txt files contain only disallow directives, because robots assume that any directory on your website without a disallow directive may be crawled. (Source: Advanced HTML)
Note: Directories covered by a disallow directive should not be added to the sitemap. This reduces the robots’ crawling time.
3. Optimised Metadata
The term metadata refers to the key information that search engines need to index your website. By optimising the metadata, you can control how your website’s content appears in search engine results. This can also improve your search engine rankings.
Metadata comprises three main elements:
· Meta Title
The Meta title contains the title of the page. It is an important factor in SEO because it shows up on the Search Engine Results Pages (SERPs).
· Meta Description
The Meta description is the preview snippet of a page. It is composed of one or two sentences summarising the page’s content. Providing an optimised Meta description helps users understand what they will find on the page.
· Meta Keywords
Unlike the Meta title and the Meta description, Meta keywords do not show up on the SERPs. Intended for search engine crawlers, they appear in the HTML code of a certain web page. The Meta keywords should be in line with the optimised keywords within your web page content.
Optimised metadata should be unique, concise and accurate to produce maximum SEO impact.
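The three elements live in a page’s HTML `<head>`. Here is a minimal sketch that assembles them; `build_meta_tags` is a hypothetical helper and the title, description and keywords are placeholder values:

```python
from html import escape

def build_meta_tags(title, description, keywords):
    """Return an HTML <head> metadata snippet (hypothetical helper)."""
    return (
        f"<title>{escape(title)}</title>\n"
        f'<meta name="description" content="{escape(description)}">\n'
        f'<meta name="keywords" content="{escape(keywords)}">'
    )

snippet = build_meta_tags(
    "SEO Basics for New Websites",                              # shown on SERPs
    "Three simple steps to make a new website SEO-friendly.",   # preview snippet
    "seo, sitemap, robots.txt, metadata",                       # crawler-only
)
print(snippet)
```

Escaping the values keeps the tags valid even when a title or description contains characters such as quotes or ampersands.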
Together, these three elements help raise your website’s visibility. More traffic to your website means more opportunities to convert traffic into leads and leads into sales.
It’s a bit like making a cake really… If you mix together the finest ingredients and take care in your preparation, you will be rewarded with very pleasing results!