Quickly optimize your performance by requesting an SEO audit or by learning to do it yourself in the tutorial below.
Google's rankings have changed and so have SEO audits. A solid technical foundation, a clear page tree structure and content that hooks your customers are the basis of a good website in 2020. In highly competitive sectors, a good site is inseparable from power and authority: we are talking about popularity, domain name authority and backlinks. The principle is simple: the more authority you have, the more positions you take in Google's top 3.
We can then offer coaching-style follow-up for your webmasters, to support your communication strategy with paid advertising (Google or Facebook Ads), affiliations and partnerships, blogging, newsletters, netlinking, a promotional calendar...
Why do an SEO Audit?
Do you have an e-commerce website but generate no sales, quote requests or contacts? Do you not appear in Google's results? Are you found on your brand name but on no generic keyword?
It's good to have a site; it's even better to have qualified visitors!
Beyond a few basic tips that are not obvious to everyone (there is no point in stuffing your title tag with keywords; no, empty pages do not rank; yes, you need quality content; no, repeating the same keyword dozens of times in your text is useless), only an in-depth technical and semantic analysis of your site can highlight the factors blocking your organic rankings or, conversely, allow you to deploy good content so that your offer of products or services meets demand.
223 clicks from Google in 6 months: that's a real problem to deal with!
Here are the metrics of a one-year-old startup positioned in sports travel, which generates nearly €200K.
How to properly analyze your website?
An SEO audit is a clear, synthetic and precise exercise whose objective is to set up an effective SEO strategy for the short, medium and long term. We do not practice the "audit by the kilo", delivering 200 recommendations; on the contrary, we will point out the 5 or 10 key points that will allow rapid results.
SEO and Google audit developments
In more than 20 years, Google has evolved a lot, with many updates to understand content and absorb an exponentially growing number of websites. In 2013, Google turned a corner in identifying "quality content": it began trying to answer the intentions of Internet users, to develop voice and conversational search, and to favor concise but highly qualitative content. The post-2013 era is marked by machine learning, with the update called RankBrain which, as its name suggests, learns from the behavior of Internet users and runs its own tests, just as we SEO consultants test it.
Before 2013, analyses were simple
The audit focuses on highlighting the elements that drive traffic acquisition for your website. In the past, audits were limited to these three points:
Keywords, content and the semantic cocoon, covering keywords, page architecture, meta title tags, meta descriptions and markup (H1, H2...). Here we look at the content of the pages: texts, images, videos...
Popularity, or netlinking, where we analyze the links pointing to your site. If you are launching your site, or are a startup, we will help you integrate into an ecosystem that will boost your performance. If, on the contrary, you already have many links, we can correct and optimize them.
Large e-commerce sites require a great deal of SEO expertise and a more in-depth audit
After 2013, analyses became more complex
SXO, the importance of the customer journey
UX (User eXperience) + SEO = SXO, which means that the customer journey is an important criterion in how a site is built and in Google's results.
Google SEO competition analysis
The information gathered through a competitive study, a competitive benchmark, helps to better understand the market and to map the gaps and opportunities in SEO. The final benefit is understanding the level of the competition (technical performance, quantity and quality of content, quantity and quality of backlinks) so you can plan your strategy intelligently.
We reproduce the audit of your site on a sample of sites positioned in the top 10 for the targeted query. Then we usually use SEObserver's keyword competition checker to compare the authority of the domain names present in the SERP with yours.
Audit Performance E-commerce
We can analyze your sales performance, your customer profiles and your acquisition levers. From the initial audit to the precise tracking of each channel, we can follow you over the long term on aspects specific to e-commerce, thanks to a first generalist pass and the implementation of recommendations by our network of specialists.
1. Technical SEO audit
As essential as it is mandatory, analyzing the technical side of a site will already help you better understand its ecosystem, starting with the indexing of your page tree and the web performance of your site in general.
What is the objective of the SEO audit?
It's quite simple: we want both the Internet user and Google to be satisfied. How? The objective of the SEO audit is to facilitate the work of Google's crawler, Googlebot, and to improve the UX, the user experience. The behavior of Internet users in the SERP, Google's results pages, is undoubtedly taken into account by Google's algorithm. In theory it's very simple; in practice, you have to be methodical and thorough.
Googlebot is like an Internet user: binary! If it's not easy, it's complicated. And if it's complicated, it doesn't want it...
General information on how the internet and websites work
What is a website?
The anatomy of a website is similar to that of a human: a skeleton, muscles and organs, a face and skin. A website is the same:
the skeleton is the tree structure of pages: domain name and URLs
the muscles and organs are the container and the functional perimeter, usually .php files and a database that generate...
the face and the skin are the interface, in .html
How does a website work?
A website is a set of multimedia content (text, images, video, PDF...) formatted by computer code written in very different languages and frameworks (PHP, Ruby, Python...); we call it the container. It is displayed according to page templates and offers a set of functions called the functional perimeter. It is identified on the World Wide Web thanks to an IP address, which takes the "human friendly" form of a domain name, from which derives a set of addresses: URLs, for Uniform Resource Locator. We appreciate them when they are "human and SEO friendly" too. All the content is stored on a server, which is no more and no less than a computer.
You can host a site at home on your own computer, or on a larger computer, a server, itself housed in a secure "clean room" in a data center. Why prefer a data center? Because data centers are often located near backbones, the trunk lines of digital telecommunications networks, while your computer may still be on a 56Kb connection in your country house, and server administration is a job in itself.
How does it all communicate?
You've got the anatomy; now you can understand the dynamics, the "nervous system". Communication protocols are the rules that define the format for transmitting information from one computer to another. The Hypertext Transfer Protocol (HTTP) is a client-server communication protocol developed for the World Wide Web. HTTPS (with S for "secured") is the variant of HTTP secured by the SSL or TLS protocols.
As for the link between the domain name and the server, this happens in the DNS. The Domain Name System, commonly abbreviated DNS, is the distributed computer service used to translate Internet domain names into IP addresses or other records. By providing a distributed name resolution service from the early years of the Internet, around 1985, DNS has been an essential component in the development of the network.
What is the particularity of an e-commerce site?
An e-commerce site is simply a website with a payment terminal. Simplistic, but so true. Yet all the particularities and complexity of managing the sales funnel, from the product sheet to the "thank you for your order" page, make the e-commerce site a category in its own right. Not counting its multiple typologies, its specific issues and the high stakes it represents.
However, the sometimes important difference between a real-estate catalog e-business site that generates 5 million euros in annual turnover and an e-commerce site selling sugared almonds and wedding accessories that also does 5 million euros in turnover is the technology.
Both can have been developed "from scratch", starting from a blank page or using frameworks (Python, Symfony...), or with a CMS whose "core" we never touch: WooCommerce, PrestaShop, Magento...
The analysis steps to follow:
1. The domain name
We check its age: the older it is, the better, as Google will give it more authority. A domain operated for 20 years will carry more weight than a newly launched domain name. To check this, go to Whois.
2. Variants of the domain name
With HTTP, HTTPS, WWW and without WWW, you are likely to have a domain name duplicated 4 times, which divides the power of the site by 4! This is a very common problem and the solution is simple: ask your webmaster to set up the redirects in the .htaccess file. They must be 301 redirects.
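As an illustrative sketch (assuming an Apache server with mod_rewrite enabled; "example.com" is a placeholder for your own domain), the 301 rules in the .htaccess could look like this:

```apache
# Hypothetical .htaccess sketch: force every request onto the single
# canonical https://www.example.com origin with permanent (301) redirects,
# so the HTTP/HTTPS and www/non-www variants stop splitting your authority.
RewriteEngine On

# HTTP -> HTTPS
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

# non-www host -> www host
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Once rules like these are deployed, every variant should answer with a single 301 hop to the canonical URL; chains of redirects are best avoided.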
You can verify this with a tool that checks HTTP headers. These headers often give other hints for optimizing performance, such as switching to HTTP/2, or for fixing redirects to other domain names botched during risky migrations...
3. The server: hosting a website
Hosting is a profession in itself: a small WordPress site with a dozen visitors a day will not need the same resources as a Magento e-commerce site that sees peaks of 300,000 visitors a day... Server settings are paramount. Your SEO consultant audits all the criteria that will impact your SEO. The two main ones:
- geographic location
If your site is French and targets France, it makes sense for your server to be in France. It can possibly be physically abroad, have a French IP and benefit from a CDN, remote hosting... many scenarios are possible, especially internationally. For example, in Polynesia it is common for sites to be hosted in the USA; if this is the standard, it is recommended to respect it. Find your IP here.
- neighboring domains
There are two types of servers: shared (you share the server) and dedicated (you are alone at home). The ideal is a dedicated server, but it costs about 10 times more on average. With a shared server, you share your resources, and in particular an IP address, with other sites, without knowing their activity: spammers, adult sites... You can use yougetsignal to identify the sites that share your IP.
Having a cheap server is good. Sharing an IP with 999 other unsavory sites like 01sex.tv is less so. The risk is significant if you generate income. One solution: switch to a professional offer!
4. Page load time
The faster a site, the better; if your site is too slow, Internet users and Google may downgrade it. We generally identify the main factors, starting with:
- the weight of the images, which are often too heavy: in general we build a site for mobile and standard screens on a template 1100 pixels wide. Since screens display at 72dpi, it is unnecessary to use images at 300dpi and 5000 x 2000 pixels. See GTMetrix, which even provides recompressed versions of your images along with clear, explicit recommendations.
5. .htaccess, robots.txt and sitemap.xml
The XML sitemap, not to be confused with an HTML sitemap, is the table of contents of the book "Your Website": it gives crawlers and search engines an immediate view of the entire site. This file must be updated automatically, because its value lies in signaling to Google a frequency of updates so that it comes back to crawl new content.
Its role is simply to facilitate the crawling of new pages. On a ten-page site whose content never changes, it will not give you better SEO; you may not even need one.
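For reference, a minimal sitemap.xml following the sitemaps.org protocol looks like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch: one <url> entry per indexable page. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/category/new-product-page</loc>
    <lastmod>2020-01-10</lastmod>
  </url>
</urlset>
```

On a CMS, a plugin or the platform itself usually regenerates this file automatically when content is published.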
The robots.txt authorizes or blocks the passage of certain robots relative to others. Until 2019 it could also block the indexing of certain pages; now Google has taken all the rights... to keep pages out of the index, you have to obfuscate them and use a "noindex" directive in the HTTP header of the page.
You can block all robots in the robots.txt file, but this is not recommended; blocking only some of them optimizes performance.
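A hypothetical robots.txt illustrating this (the paths and sitemap URL are placeholders):

```txt
# robots.txt sketch: let well-behaved crawlers in everywhere except
# low-value paths, and point them at the sitemap. Note: since 2019,
# Disallow limits crawling, not indexing; use a noindex header or tag
# to keep a page out of Google.
User-agent: *
Disallow: /cart/
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```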
The .htaccess is a cornerstone of your SEO. It gives directives to the server and in particular enables the rewriting of "SEO friendly" URLs, the declaration of canonical URLs to avoid duplicate content, redirects such as 301 (permanent redirection) and 302 (temporary redirection), and HTTP response codes such as 403 (forbidden, for example when you get spammed by URL injection), 503 (service unavailable, when you don't want to serve the content) or 504 (the server no longer responds)... Know what you are doing before touching this file on a production site!
6. Basic and multilingual architecture
It is not a Legal Notice page or a 404 page that will make your rankings, but we will ask you to respect the standards.
It's so common to see real sites less well done than fake ones!
If you can type anything after the root of your domain name, such as mysite.fr/sqkfjnhsdnf, and nothing happens or you are not redirected, you do not have a proper 404 page. On custom-developed multilingual sites, it is not uncommon to find no correspondence between the pages in language 1 and in language 2.
The use of hreflang tags, which declare the correspondence between pages and allow targeting of a language and a country, is strongly recommended (cf. Google Support: hreflang).
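As a sketch (with placeholder URLs), the hreflang annotations go in the head of every language version of the page, each page listing itself and all its alternates:

```html
<!-- hreflang sketch: a French page targeting France, an English page
     targeting the UK, and a default for everyone else. -->
<link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr/page.html" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en/page.html" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page.html" />
```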
7. Source code errors and 404 errors
We will check that your site is not riddled with errors. To do this, go to the W3C Validator. Then we will run the site through different crawlers such as Xenu (free) or Screaming Frog (free up to a paid limit).
Few errors: nothing serious. If you have many errors, contact your developer and ask them to correct them.
Note: 404s should be fixed within the pages themselves, by correcting the broken links. Too often we see them merely patched with redirects.
8. Several dozen more parameters to auscultate ...
... to find out whether it itches or merely tickles... you've understood: the audit makes it possible to find indexing problems and to facilitate the collection of your data by Googlebot, Google's crawler.
Once the data has been retrieved, Google must interpret it, and here we arrive at a sexier subject, one that is constantly renewed both in substance and in form: the content of your pages!
2. Content audit
Just as important as the technical part, the content part focuses on the different media of your site and the page tree. The technique is the foundation and the walls of your house; the content is the style, the furniture, the decor and everything else...
If you want to please Google, you will first have to please Internet users. This is Google's new machine-learning paradigm: after successfully understanding content, it now tries to understand users and their intentions!
1. Indexing your pages
Has your site been indexed by Google? In other words, has it listed all your pages? This is easy to check with the site: command of the famous search engine. Just type site:mysite.fr and you will immediately get a list of all the indexed pages. If you have a 900-page site and it returns 10 results, you have a content or technical issue.
To know the precise number of pages on your site, you can consult Google Search Console, but you should also crawl it, that is, use a small piece of software that will list and display all your pages. Let's mention the two best known: Screaming Frog (beware, limited to 500 URLs; if you have more, upgrade to the paid version) and Xenu (no frills on the design, but free and super efficient).
This will allow you to verify that you don't have too many 404 errors (meaning "page unavailable"; have you considered 301-redirecting them to a similar page?) and that your URLs have been rewritten with keywords rather than left in raw form, like monsite.fr/dslkfjdsfjdkgh-01214564.php.
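As a rough illustration, here is a small heuristic (entirely our own, not a Google rule) for spotting raw, non-rewritten URLs like the example above:

```python
import re

def is_seo_friendly(url: str) -> bool:
    """Heuristic sketch: the last path segment should be lowercase,
    hyphen-separated words, without long digit runs typical of raw ids."""
    slug = url.rstrip("/").rsplit("/", 1)[-1]
    slug = slug.split(".")[0]  # drop a .php / .html extension
    if not re.fullmatch(r"[a-z0-9]+(?:-[a-z0-9]+)*", slug):
        return False           # uppercase, underscores, odd characters...
    if re.search(r"\d{4,}", slug):
        return False           # looks like a raw database id
    return True

print(is_seo_friendly("https://monsite.fr/dslkfjdsfjdkgh-01214564.php"))  # False
print(is_seo_friendly("https://monsite.fr/surf-lessons-biarritz.html"))   # True
```

A heuristic like this is only a first filter for large crawls; human review remains the judge of whether a slug actually carries keywords.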
2. Meta Titles Tags
A Title tag is the title of your HTML page: the one that appears in search results, and the text that appears in your browser tab.
Writing an impactful meta title is a delicate and often difficult exercise, which we can also A/B test!
In principle, it should not exceed 70 characters (though Google still takes longer titles into account), contain relevant keywords (the most important first) and, at the same time, make people want to click. If you want to preview your meta titles in search results, use Google SERP Snippet Optimizer.
3. Meta description tag
The meta description tag is the one that appears below your title: the description of the search result. Some say it doesn't matter, but if it is not filled in, Google will pull the keywords typed by the Internet user from the content of the page, which can look shapeless because it is "poorly served" by Google. We therefore recommend a unique meta description with keywords and eye-catching text stating your concept. It used to be limited to 156 characters; today it has grown to around 360.
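A quick length check for these two tags can be scripted. The limits below are the character counts discussed in this section (around 70 for titles, 156 for the historical description limit) and are indicative only, since Google actually truncates snippets by pixel width:

```python
# Indicative limits only: Google truncates snippets by pixel width,
# so character counts are a rough proxy.
TITLE_LIMIT = 70
DESCRIPTION_LIMIT = 156

def check_snippet(title: str, description: str) -> list:
    """Return human-readable warnings for a page's title/description pair."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"Title is {len(title)} chars (> {TITLE_LIMIT}): may be truncated")
    if not description:
        warnings.append("Missing meta description: Google will improvise one")
    elif len(description) > DESCRIPTION_LIMIT:
        warnings.append(f"Description is {len(description)} chars (> {DESCRIPTION_LIMIT}): may be cut")
    return warnings

print(check_snippet("Surf Lessons in Biarritz | MySurfSite", ""))
```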
4. Meta keyword tag
It is no longer useful. In the past, 20 years ago when there were few sites, SEOs filled it in to move up the rankings easily, but it no longer works. If you see meta keywords filled in on your site, don't waste time on them.
5. Semantic markup
Semantic markup designates the way texts are structured using h1, h2, h3... tags, hence "Hn markup". Ideally, each page should contain text (250-300 words) tagged with relevant keyword phrases (the ones you want to rank on, for example!).
If your page is about "surfing" and your h1 (the most important tag) is "It's so good", that's zero for your SEO. The title and the slogan are often confused.
As a reminder: only one H1, then several H2s and several H3s. Having more than one H1 on your page won't penalize you, but it dilutes the strength of the keywords you put there. Likewise, an H2 on "My basket" has nothing to do there: it conveys no semantic value. To quickly see the markup of a page, you can install the Web Developer extension on Chrome or Firefox.
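Beyond a browser extension, the Hn structure of a page can be audited in a few lines; this sketch uses only Python's standard library, and the sample page is invented:

```python
from collections import Counter
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Counts h1..h6 opening tags: a quick audit of a page's Hn markup."""
    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts[tag] += 1

page = """
<h1>Surf lessons in Biarritz</h1>
<h2>Beginner courses</h2>
<h2>Advanced coaching</h2>
<h3>Prices</h3>
<h1>My basket</h1>  <!-- a second H1: dilutes your keywords -->
"""
counter = HeadingCounter()
counter.feed(page)
print(dict(counter.counts))  # e.g. {'h1': 2, 'h2': 2, 'h3': 1}
if counter.counts["h1"] != 1:
    print("Warning: a page should have exactly one H1")
```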
6. Text formatting
Is the text formatted and highlighted on the site, with bold, quotes, italics...? Bolding key expressions allows your texts to be read diagonally and therefore understood globally... and search engines see them too. As for empty pages, that is, pages without text, they are to be avoided!
7. Text hidden in white on a white background
If text is hidden from the Internet user, by any means whatsoever, it is cloaking. A penalty is possible.
8. Duplicate content
Is text content duplicated? To find out, take a short paragraph, copy it, and Google it. If the same text is on one or more other sites, congratulations, your site is duplicated! Duplication is penalizing: contact these sites quickly to have the content removed. As a reminder, each text must be unique, original and personalized. And even if you take a sentence and reformulate it by changing a few bits, that's called near duplicate, and it gets spotted too!
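Near duplicates can also be spotted programmatically. As a rough sketch, Python's standard difflib gives a similarity ratio between two texts (the sentences below are invented examples):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough character-level similarity between two texts (1.0 = identical)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

original  = "Our surf school offers lessons for all levels on the Basque coast."
reworded  = "Our surf school offers courses for every level on the Basque coast."
unrelated = "Delicious sugared almonds and wedding accessories, shipped in 48h."

print(round(similarity(original, reworded), 2))   # high ratio: near duplicate
print(round(similarity(original, unrelated), 2))  # low ratio: original content
```

A simple ratio like this is only a first filter; dedicated plagiarism tools use more robust shingling techniques.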
9. Identification of images
The alt tag is as unimportant as the meta keywords. Don't waste time on it. The context of the photo will determine its nature. This is how we end up with a photo of Adriana Karembeu in the results for the query "Pizza with anchovies".
Image tags no longer influence Google Image results, only context matters!
10. Tree structure and URL structure
In this part, we check that the site does benefit from a tree structure (for example, mysite.fr/categorie1/sous-categorie1/page1.html). It often happens that we take over freshly delivered sites in their entirety in order to rework the structure with clean, orderly URLs. In the past, we then checked the number of clicks needed to access information, with a maximum of 3. Today we have more elaborate strategies and more complex structures, such as semantic cocoons built on 4 levels of content of varying quality, plus deep, important pages kept outside the main navigation.
11. Internal linking
It is good to check the internal links of your site. Are they correctly written (with your keywords)? All unimportant links (to legal notices, credits, shopping cart, social networks...) can be obfuscated, that is to say you prevent your site from transmitting authority to them. Facebook really doesn't need it, don't worry.
12. Position 0, PAA (People Also Ask) and Knowledge Graph
These pre-2013 basics having been covered, we must now keep up with Google's latest machine-learning updates, RankBrain included. To win competitive queries, you have to meet today's standards:
It has long been known that Internet users like long queries and asking questions. This is what we now call voice and conversational search: "Hello Google, what is the weather like today?". The rise of mobile usage and "mobile first" indexing have revolutionized SEO.
SEO is simple: either you profit from it, or someone else will!
13. Microdata
Your web pages have an underlying meaning that people understand when they read them, but search engines have a limited understanding of what is discussed on these pages. By adding additional tags to the HTML of your web pages (tags that say, "Hey search engine, this information describes this specific movie, or place, or person, or video"), you can help search engines and other applications better understand your content and display it in a useful and relevant way. Microdata is a set of tags, introduced with HTML5, that allows you to do this. Learn more at schema.org. Consult us to help you be more clickable in the Google SERPs.
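For example, a local business could be marked up with microdata like this (a sketch using the schema.org vocabulary; all names and values are placeholders):

```html
<!-- Microdata sketch (schema.org LocalBusiness); all values are placeholders. -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <h1 itemprop="name">Surf School Biarritz</h1>
  <span itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="addressLocality">Biarritz</span>,
    <span itemprop="addressCountry">FR</span>
  </span>
  <span itemprop="telephone">+33 5 00 00 00 00</span>
</div>
```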
14. Semantic cocoon and E-A-T (Expertise, Authoritativeness, Trust)
This complete and sustainable SEO technique revolves around quality content covering the triptych expertise, authority, trust, and is the only complete technique combining technical, marketing and sales SEO approaches. Its fields of application range from a simple site of a few pages to e-commerce sites of a few hundred pages or real estate portals of several tens of thousands of pages. We can help you build a semantic cocoon.
3. Popularity and netlinking audit
Compared with the two previous parts, this linking section will be quite concise, because it is the subject of a service in its own right. The technical part is the supporting structure of your house; the semantic part, the style, the furniture and the decor. Linking is the power, the size of the house. It is by its linking that we measure the authority of a domain name.
The Internet is made up of aggregates. If your site is part of authority sites, you will be positioned as such!
Diagram taken from the website l'Atelier cartographie, by Franck Ghitalla
Hyperlinks of all kinds allow you to climb out of the deep web and hoist yourself into an aggregate of authority sites. Many rules must be respected, and it is not an amateur's job, because Google wants quality and quality is very subjective. You can make mistakes that make you disappear from Google.
La Structure du web by Franck Ghitalla, for an article on the dark web
You can find more info on netlinking on this page.
FAQ: All the questions you ask yourself
Does social media have an impact on my SEO?
Yes, a buzz on Facebook, or pins shared regularly on Pinterest, will boost your positions in Google. Social networks are clearly part of the ecosystem linked to netlinking.
Why does the customer journey influence Google positions?
Content written for everyone will always be less relevant than content written for a specific target. To enter a generalist market, focus on a niche and become the specialist. The verticality of the specialist makes it possible to compete with very powerful but generalist sites.
E.g. a hotel in Biarritz can rank ahead of Booking or Airbnb if it has worked on its specialty.
It's the same in generic markets: this is the niche taken by platforms specializing in sports travel and themed stays.
EMD, or exact match domains
These are domain names whose keywords exactly match the targeted query, such as referencementparis.fr. Google says it doesn't work, but it works! Go to this page if you need to protect your brand.
Our methods have proven themselves in multiple sectors of activity, from local gastronomy to video games, including bank comparators, construction and renovation services and new real estate. We also work for liberal professions (osteopaths, lawyers...) and software publishers such as ZENDOC, a contract automation software. We take pleasure in discovering your business and helping you activate effective levers.
As specialists in growth hacking and e-commerce, we will put our experience to work to develop your activity on the Internet in record time.
"Behind the surfer's calm hides a professional seasoned in advanced webmarketing techniques that will allow you to outclass your... competitors. Nicolas is a professional I recommend!"