
Common mistakes on websites - Felipe Bazón

Posted: Mon Dec 23, 2024 9:03 am
by Irfanabdulla1111
We can also see that there are many errors with AMP.

When Google flags AMP issues, it can seem like a one-click fix, yet implementing the instructions correctly often gets treated as neither urgent nor important.

This is a topic I have worked on, and I can say that AMP is no marketing panacea. Even when it generates traffic, AMP URLs are rarely the ones where conversions happen. On top of that, AMP duplicates a domain's URLs, so if basics as simple as internal linking and canonicals are not in perfect shape, we can destroy more than we create.


Common mistakes on websites - Alex Navarro
@vivirdelared

Broken outbound links
Broken outbound links are a problem that usually goes unnoticed.

This happens for the simple reason that these links tend to break on their own over time.

If you have linked to another website from your page and it ceases to exist after a while, your link will remain pointing to a site that no longer exists.

This in itself is not a serious problem, but there are cases in which it can become one.

Imagine for a moment that someone registers an expired domain that has a link from your website.

If the person who bought that domain decides to set up a pornographic website or one with a similar theme, your page will suddenly be linked to a domain of that type.

This can negatively affect you in terms of SEO.

Another possibility is that one of your competitors registers that expired domain and sets up a 301 redirect to their own page, earning them a link from their main competitor: you.

The key is to check your site's outbound links several times a year to avoid these problems.

There are all kinds of free tools that can do this for you, as well as Chrome extensions that check links as you browse your own website.
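For illustration, here is a minimal Python sketch of such a check, assuming the requests and beautifulsoup4 packages are installed; the PAGE URL is a placeholder you would replace with a page of your own site.

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urlparse

    PAGE = "https://example.com/"  # placeholder: the page whose outbound links you want to audit

    html = requests.get(PAGE, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    own_host = urlparse(PAGE).netloc

    for a in soup.find_all("a", href=True):
        href = a["href"]
        host = urlparse(href).netloc
        if not host or host == own_host:
            continue  # skip relative and internal links; we only audit outbound ones
        try:
            # HEAD keeps the check lightweight; some servers reject HEAD,
            # so a production script might fall back to GET on errors
            status = requests.head(href, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = "unreachable"
        if status == "unreachable" or status >= 400:
            print(status, href)

A real audit would walk every page of the site rather than a single URL, which is exactly what the dedicated tools and extensions mentioned above do for you.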



@felipe09

I think the biggest problems for e-commerce sites in Brazil are related to crawling issues.

The vast majority of SaaS or open-source platforms generate many irrelevant pages that, 99% of the time, end up crawled and indexed by search engine bots.

The concept that says "the more pages that are indexed the better" is one of the most common misconceptions in SEO today.

Therefore, we need to look very carefully at the pages generated by the e-commerce platform and define what should and should not be crawled and indexed.

A good starting point is the SEMrush crawl report, Site Audit, or the new Log File Analyzer tool.

I recommend blocking crawling and indexing of faceted navigation (filters) and all internal searches to avoid indexing of irrelevant pages and keyword cannibalization.
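As a hedged illustration of that recommendation, a robots.txt fragment along these lines keeps crawlers out of internal search results and filter pages; the paths and parameter names here (search, q, color, price) are hypothetical, and your platform's URL patterns will differ.

    User-agent: *
    # internal search results (hypothetical /search/ path and q parameter)
    Disallow: /search/
    Disallow: /*?q=
    # faceted navigation filters (hypothetical parameter names)
    Disallow: /*?color=
    Disallow: /*?price=

Note that robots.txt only blocks crawling: a URL that is already linked can still be indexed without being crawled, so pages you want fully out of the index may also need a meta robots noindex tag, which in turn requires the page to remain crawlable so the tag can be seen.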