SEO analysis

An SEO audit is the first step to successful website optimization

On-page SEO analysis

Check your website

Do you have your own website, but not as many visitors as you would like? Something is probably wrong with it and needs to be corrected as soon as possible.


This is what SEO analysis, which more and more people are using, is for. SEO analysis, in other words an SEO audit, is the first and most important step in website optimization. With it, you can understand not only your own site, but also your competition and the industry in which you operate.

A thorough SEO analysis is a guarantee that not only your site will succeed, but above all you will. If you want to welcome more visitors to your site, you should make sure its content fits what search engines need.


Many people think their site is fine and does not need any analysis. This is a mistake, however, because without an audit you cannot identify and correct its errors.


Sources with 4xx status code

4xx errors often point to a problem on a website. For example, if you have a broken link on your page and visitors click on it, they will see a 4xx error. It is important to monitor and correct these errors regularly, as they can have a negative impact and reduce the authority of your site in the eyes of users.
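As a rough illustration, here is a minimal Python sketch of how a link audit might separate broken links from healthy ones by status code class. The `crawl_results` map and its URLs are hypothetical stand-ins for real crawl output.

```python
def status_class(code: int) -> str:
    """Classify an HTTP status code for a simple link audit."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if 400 <= code < 500:
        return "client error (4xx) - broken link"
    if 500 <= code < 600:
        return "server error (5xx)"
    return "unknown"

# Hypothetical crawl output: URL -> status code returned by the server
crawl_results = {"/about": 200, "/old-page": 404, "/shop": 301, "/contact": 410}

# Flag every URL whose status falls in the 4xx range
broken = [url for url, code in crawl_results.items() if 400 <= code < 500]
print(broken)  # ['/old-page', '/contact']
```

A real audit would issue HTTP requests to collect the status codes; the classification step stays the same.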


Blocked resource indexing

Indexing of resources can be restricted in several ways: in the robots.txt file, by the noindex X-Robots-Tag HTTP header, or by the noindex meta tag. So make sure all your unique and useful content is available for indexing.


robots.txt

The robots.txt file is crawled automatically when Googlebot arrives at your site. This file should contain directives for robots, such as which pages should or should not be crawled. To disable indexing of some content (such as pages with private or duplicate content), use the appropriate rule in your robots.txt file. For more information about this protocol, please visit http://www.robotstxt.org/robotstxt.html.

Note that directives placed in a robots.txt file are suggestions rather than absolute rules that robots must follow. There is no guarantee that a robot will not crawl content that you have disallowed.
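You can verify how such rules are interpreted with Python's standard library. A minimal sketch, assuming a hypothetical robots.txt that blocks a private directory on an example domain:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block a private area, allow everything else
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether a well-behaved robot may fetch given URLs
print(parser.can_fetch("Googlebot", "http://www.example.com/private/data.html"))  # False
print(parser.can_fetch("Googlebot", "http://www.example.com/index.html"))         # True
```

In production you would point `RobotFileParser` at your live robots.txt URL instead of an inline string.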


Sources with 5xx status code

5xx error messages are sent when the server has a problem or an error. It is important to monitor these errors regularly and investigate their causes, as they can have a negative impact and reduce the authority of your site in the eyes of search engines.


The 404 page is set up correctly

A custom 404 error page will help keep users on your site. Ideally, it should let users know that the page they are looking for does not exist, and it should include elements such as your HTML sitemap, navigation bar, and search box. Most importantly, the 404 page should return a 404 response code.


.xml sitemap

The XML sitemap should contain all the pages you want indexed and should be placed in the site's root directory (e.g. http://www.yoursite.com/sitemap.xml). In general, it serves to support indexation. You should update it each time you add new pages to your site. In addition, the sitemap should follow the prescribed syntax.

The sitemap (sitemap.xml) allows you to set a priority for each page and tell search engines which pages you update more often (so that they can crawl them more often). Learn how to create an .xml sitemap at http://www.sitemaps.org/.
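A minimal sitemap following the sitemaps.org syntax can be generated with Python's standard library. This is only a sketch; the URLs and priority values below are hypothetical examples.

```python
import xml.etree.ElementTree as ET

# Namespace prescribed by the sitemaps.org protocol
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap.xml from (location, priority) pairs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, priority in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages; priority hints at relative importance (0.0 to 1.0)
xml_out = build_sitemap([
    ("http://www.yoursite.com/", "1.0"),
    ("http://www.yoursite.com/blog/", "0.8"),
])
print(xml_out)
```

The resulting string would be saved as sitemap.xml in the site root; a full implementation would also add `lastmod` and `changefreq` elements.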

Redirects


Pages with and without www are set up correctly

Websites are usually available both with and without the "www" prefix. Merging the two URLs will help prevent search engines from indexing two versions of your pages.

Although indexing both versions carries no penalty, setting one of them as the priority is best practice, partly because it consolidates the SEO value of links into one common version. You can find or change your current primary version in the .htaccess file. It is also recommended that you set your preferred domain in Google Search Console.


Pages with 302 redirects

A 302 redirect is temporary, so no link juice passes through it. If you use 302s instead of 301s, search engines may continue to index the old URLs and treat the new ones as duplicates. Or they may split link popularity between the two versions, hurting your search rankings. For this reason, it is not recommended to use 302 redirects when you are permanently moving a page or site. Instead, stick to 301 redirects to preserve link juice and avoid duplicate content.


Pages with long redirect chains

In some cases, either because the .htaccess file is set up incorrectly or intentionally, a page may end up behind two or more redirects. It is strongly recommended to avoid redirect chains longer than 2 redirects, as they can cause multiple problems.
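A sketch of how chain length could be measured, assuming the redirects have already been collected into a hypothetical `{source: target}` map (a real audit would issue HTTP requests and read the Location headers):

```python
def redirect_chain(start, redirects, limit=10):
    """Follow a redirect map from `start` and return the chain of URLs visited.

    `limit` guards against redirect loops.
    """
    chain = [start]
    while chain[-1] in redirects and len(chain) <= limit:
        chain.append(redirects[chain[-1]])
    return chain

# Hypothetical redirect map: /old -> /interim -> /new
redirects = {"/old": "/interim", "/interim": "/new"}
chain = redirect_chain("/old", redirects)
print(chain)           # ['/old', '/interim', '/new']
print(len(chain) - 1)  # 2 hops: already at the recommended limit
```

Any chain reporting more than 2 hops would be a candidate for collapsing into a single 301.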


Pages with meta refresh tag

Meta refresh is, in principle, a violation of Google's quality guidelines and is therefore not recommended for SEO. As one Google representative notes, meta refresh redirects are generally discouraged because they may confuse users (and search engine crawlers, who might mistake them for an attempted redirect); even when they cause no problems in terms of crawling, indexing, or ranking, it is still a good idea to remove the meta refresh.


Problems with HTTP/HTTPS versions

The use of secure encryption is strongly recommended for many websites (such as those that process transactions and collect sensitive user information). However, in many cases webmasters face technical problems when installing SSL certificates and setting up the HTTP/HTTPS versions of a site.

If you use an invalid SSL certificate (such as an untrusted or expired certificate), most web browsers will prevent users from visiting your site by displaying an "insecure connection" warning.

If the HTTP and HTTPS versions of your site are not set up correctly, both can be indexed by search engines, causing duplicate content issues that may weaken your site's ranking.


Pages with 301 redirects

301 redirects are permanent and are typically used to resolve duplicate content issues or to redirect URLs that are no longer needed. Using 301 redirects is completely legitimate and good for SEO, because a 301 redirect passes link power from the old page to the new one. Just remember to redirect old URLs to the most relevant pages.


Pages with rel="canonical" tag

In most cases, duplicate URLs are handled through 301 redirects. Sometimes, however, for example when the same product appears in two categories under two different URLs and both must stay live, you can use rel="canonical" to determine which page should be considered the priority. It should be properly implemented in the <head> of the page and point to the main page version you want included in search engines. If you can configure the server, you can also specify the canonical URL with the rel="canonical" HTTP header.
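The in-page variant can be checked programmatically. A minimal sketch using Python's standard html.parser; the page markup and URL below are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Find the rel="canonical" URL declared in a page's <head>."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

page = """<html><head>
<link rel="canonical" href="http://www.yoursite.com/product" />
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # http://www.yoursite.com/product
```

An audit would run this over every duplicate URL and confirm each one points at the same preferred version.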

Internal and external links


Broken links

Broken links send a bad signal to search engines and users. If a page contains many broken links, they will conclude that it has not been updated for some time. As a result, your site's ranking may be downgraded.

Although Google will not penalize you for one or two broken links, try to check your website regularly, fix any broken links, and make sure their number does not grow. In addition, users will enjoy your site more if they do not run into broken links pointing to nonexistent pages.
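The first step of such a check is simply collecting every link on a page. A minimal sketch with Python's standard html.parser, using a hypothetical page fragment; each collected href would then be requested and flagged if it returns a 4xx code:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href on a page so each can later be checked for 4xx."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page = '<p><a href="/about">About</a> <a href="/deleted-page">Old</a></p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/about', '/deleted-page']
```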


Dofollow external links

Simply put, dofollow links are links that lack the rel="nofollow" attribute. Such links are followed by search engines and pass PageRank (note that following can also be restricted in bulk via the nofollow robots meta tag).

While there is nothing wrong with linking to other sites through dofollow links, if you link heavily to irrelevant or low-quality sites, search engines may conclude that your site sells links or participates in other link schemes, and it may be penalized.


Pages with excessive links

According to Matt Cutts (former head of Google's web spam team), "… there is still a good reason to keep it to under 100 links: a better user experience. If you see more than 100 links on a page, this can be confusing for your users and reduce the user experience. The site may look good to you, until you put on a 'user's hat' and see what it looks like to a new visitor." While Google frames this in terms of user experience, too many links on a page can also harm your ranking. The rule is simple: the fewer links on the page, the fewer problems with its ranking. Try to follow best practice and keep the number of outbound links (internal and external) under 100.

Images and ALT texts


Broken images

While broken images on a page do not directly affect search engine rankings, they certainly deserve to be fixed, for two reasons.

First of all, broken images are a decisive factor in the user experience and can cause visitors to leave the site without meeting their goals.

Secondly, missing images can hinder the crawling and indexing of a page, making it harder for robots to process important page content.


Empty ALT texts

Although search engines cannot read text from images, alt attributes (also known as "alternative text") help them understand what your images display.

Best practice is to create alternative text for each image, using your keywords in it where possible, so that search engines can better understand the content of your site and, ideally, rank it better in the search results.
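Finding images without alt text is easy to automate. A minimal sketch with Python's standard html.parser; the image file names below are hypothetical:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """List image sources whose alt text is empty or missing entirely."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):  # catches both alt="" and no alt at all
                self.missing_alt.append(a.get("src"))

page = ('<img src="logo.png" alt="Company logo">'
        '<img src="banner.jpg" alt="">'
        '<img src="photo.jpg">')

audit = AltAudit()
audit.feed(page)
print(audit.missing_alt)  # ['banner.jpg', 'photo.jpg']
```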

On-page metadata


Empty TITLE tags

If a page has no title or its title tag is blank (it looks like <title></title> in the code), Google and other search engines decide for themselves which text to display as your page title in SERP snippets. This means you have no control over what people see on Google when they find your page.

So, whenever you create a new page, make sure to add an optimized title that is attractive to users.


Too long TITLE tags

Each page should have a unique, keyword-rich title. At the same time, you should try to keep title tags short. Titles longer than 70 characters are truncated by search engines and look unattractive in search results. Even if your pages appear on the first page of search results, titles that are shortened or incomplete will not get as many clicks as they otherwise would.
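Both the empty-title and overlong-title checks can be expressed in a few lines. A minimal Python sketch; the 70-character limit follows the guideline above, and the exact truncation point in real SERPs varies:

```python
TITLE_LIMIT = 70  # characters before search engines typically truncate

def audit_title(title: str) -> str:
    """Flag empty or overlong title tags."""
    if not title.strip():
        return "empty: search engines will pick their own snippet title"
    if len(title) > TITLE_LIMIT:
        return f"too long ({len(title)} chars): will be truncated in SERPs"
    return "ok"

print(audit_title("SEO audit: the first step to website optimization"))  # ok
print(audit_title("x" * 85))  # too long (85 chars): will be truncated in SERPs
print(audit_title("   "))     # empty: search engines will pick their own snippet title
```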


Duplicate DESCRIPTION tags

According to Matt Cutts, it is better to have unique meta descriptions, and even better to leave meta descriptions blank, than to duplicate them across pages. So make sure your most important pages have unique, optimized descriptions.


Duplicate TITLE tags

The page title is considered the most important on-page element. It is a strong relevance signal for search engines because it tells them what the page is about. Of course, the title should contain your most relevant keyword. In addition, each page should have a unique title, so that search engines have no trouble determining which page is relevant to a query. Pages with duplicate titles have a lower chance of ranking highly, and duplicate titles may even affect the ranking of your other pages.
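Duplicate titles (and, by the same logic, duplicate descriptions) are easy to detect once you have a crawl. A minimal Python sketch over a hypothetical URL-to-title map:

```python
from collections import Counter

# Hypothetical crawl output: URL -> <title> text
titles = {
    "/red-shoes": "Buy shoes online | Example Shop",
    "/blue-shoes": "Buy shoes online | Example Shop",
    "/contact": "Contact us | Example Shop",
}

# Any title used by more than one URL is a duplicate to rewrite
counts = Counter(titles.values())
duplicates = [title for title, n in counts.items() if n > 1]
print(duplicates)  # ['Buy shoes online | Example Shop']
```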


Empty DESCRIPTION tags

Although meta descriptions do not directly affect ranking, they are still important because they form the snippet of text that people see in search results. The meta description should "sell" the page to users and encourage them to visit the website.

If the meta description is empty, the search engines will decide what to include in the snippet.


Too long DESCRIPTION tags

Although meta descriptions do not directly affect ranking, they are still important because they form the snippet that people see in search results. Descriptions should therefore "sell" the page to searchers and encourage them to click. If a meta description is too long, search engines will shorten it, and the result may look unattractive to users.

Why SEO audit is important

There are several general reasons why this analysis is important for every site. One of them is that search engines constantly update their algorithms to improve search, so it is important to keep track of these changes and make the necessary edits to your site.

SEO analysis will also help you discover broken links. If you do not know about them, you cannot fix them, and people cannot reach your pages. With this analysis you can recover lost traffic, which can literally save you.

Keywords that people most often search for are also very important to a website. Analyzing these words is a key process for gaining insight into exactly what your target audience is searching for on the Internet. If you target your keywords precisely at the group of people you need, success will come sooner than you might think.

Content is another important factor in web traffic. It must never be outdated, or you will lose even the visitors who do find their way to your site. So remember to update your site regularly so that it stays fresh and meets its visitors' requirements.

Backlinks are also an important part and can help you a great deal. You should avoid backlinks from inappropriate or irrelevant sources, as they may spoil your ranking in search results.

If you are thinking about creating your own site, or have already created one, remember that SEO analysis is an essential step toward success. As it is a technical process, it should be done by someone with experience, who will do it professionally and properly. Without this analysis, you cannot expect your site to be as visited and successful as you wish.
