Crawling, Indexing and SEO - Slides by Paolo Ramazzotti
What we do:
Consulting & training:
• Inbound Marketing: SEO, Conversion Rate Optimization, Web Analytics, Social Media Marketing
• Outbound Marketing: SEM / PPC / Multichannel Adv
• Email Marketing
• Software / Website Engineering
Question time!
What is the most important requirement for being found by Google?
How your data is organized
The search process
Site A · Site B · Site C · Site D · Site E
Google Index
Query
SERP
Stairway to heaven
CRAWLING
INDEXING
RANKING
What does "CRAWLING" mean?
“Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.
We use a huge set of computers to fetch (or “crawl”) billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider). Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.
Google’s crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites it detects links on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.”
What???
Can we "communicate" with the robots?
ROBOTS.txt
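A robots.txt file at the site root is how we "talk" to crawlers. A minimal sketch, assuming a made-up disallowed path and sitemap URL:

```
# Minimal robots.txt sketch; the path and sitemap URL are placeholders
User-agent: *
Disallow: /admin/
Sitemap: http://example.com/sitemap.xml
```

Here every crawler may fetch everything except URLs under /admin/, and is pointed at the XML sitemap.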
Is your site "crawlable"?
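One quick way to answer that question programmatically, sketched with Python's standard-library robots.txt parser (the rules and URLs below are made up):

```python
# Quick crawlability check using Python's standard-library robots.txt parser.
# The rules and the example.com URLs are made-up illustrations.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Googlebot may fetch the products page, but not anything under /private/
print(rp.can_fetch("Googlebot", "https://example.com/products/"))          # True
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
```

In production you would point the parser at the live file with `set_url(...)` and `read()` instead of feeding it lines directly.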
Google wants to feel at home
Robots.txt
Sitemap XML (pages, images)
Sitemap HTML
Breadcrumbs
Layout
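The XML sitemap from the checklist above follows the sitemaps.org protocol. A minimal sketch, with a placeholder URL and date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal XML sitemap sketch; the URL and date are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/services/logistics/</loc>
    <lastmod>2014-01-01</lastmod>
  </url>
</urlset>
```

One `<url>` entry per canonical page; image sitemaps use the same structure with an extra image namespace.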
Open a Google Webmaster Tools account NOW!
URL structure
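As an illustration (both URLs are made up), a descriptive, readable path is easier for crawlers and users than a parameter soup:

```
Hard to crawl:  http://example.com/index.php?id=742&cat=12&sess=a91f
Clean:          http://example.com/services/international-transport/
```

The clean version carries the page's keywords in the path and stays stable across sessions.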
UX & Block Level Analysis
What else?
Navigation!
About us
• Mission • Vision • Where we are • Contacts
Products
• International Transport
• Domestic Shipping
Services
• Logistics • Warehouse
Navigation!
Home · Company XY (• Mission • Vision)
International Transport · Domestic Shipping · Logistics · Warehouse · Contacts
INDEXING
“When a user enters a query, our machines search the index for matching pages and return the results we believe are the most relevant to the user. Relevancy is determined by over 200 factors, one of which is the PageRank for a given page. PageRank is the measure of the importance of a page based on the incoming links from other pages. In simple terms, each link to a page on your site from another site adds to your site’s PageRank. Not all links are equal: Google works hard to improve the user experience by identifying spam links and other practices that negatively impact search results. The best types of links are those that are given based on the quality of your content.”
What???
Not all links are EQUAL!
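The PageRank idea quoted above can be sketched as a toy power iteration. This is only an illustration of the concept, not Google's actual algorithm; the three-page graph and the damping factor are made up:

```python
# Toy PageRank by power iteration: a page's score grows with links from
# other (important) pages. Illustrative only; not Google's real ranking.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            for q in outs:                       # p shares its score among
                new[q] += damping * rank[p] / len(outs)  # the pages it links to
        rank = new
    return rank

# Made-up graph: A links to B and C, B links to C, C links back to A
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# C collects links from both A and B, so it ends up with the highest score
```

Note how C outranks B even though both are "one page": C has two incoming links, B only one. That is the "each link adds to your PageRank" idea in miniature.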
Google+ Brand Machine
Watch out for these two! Barracuda Check Tool
http://www.barracuda-digital.co.uk/panguin-tool/
Google wants to know your profile
Schema.org
Rel=author and rel=publisher
Google Plus
Link Profile
Be a Brand!
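The rel=author and rel=publisher annotations from the list above were declared as link elements pointing at Google+ profiles. A minimal sketch of how they looked; the profile IDs are placeholders:

```html
<!-- Authorship links to Google+ profiles; the numeric IDs are placeholders -->
<link rel="author" href="https://plus.google.com/111111111111111111111"/>
<link rel="publisher" href="https://plus.google.com/222222222222222222222"/>
```

rel=author tied a page to a person's profile, rel=publisher tied the whole site to a brand page.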
Microdata – how many and which ones? http://schema.org/docs/full.html
Microdata – debug http://www.google.com/webmasters/tools/richsnippets
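Schema.org microdata is layered onto the HTML you already have via `itemscope`, `itemtype` and `itemprop` attributes. A minimal sketch marking up a person; the URL is a placeholder:

```html
<!-- Minimal schema.org Person markup; the href is a placeholder -->
<div itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Paolo Ramazzotti</span>
  <a itemprop="url" href="http://example.com/">Home page</a>
</div>
```

Paste a marked-up page into the rich snippets testing tool linked above to verify that Google extracts the properties you intended.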
Usage & user intent
Did the user find what they were looking for?
BR (Bounce Rate) · CTR (Click-Through Rate)
UI and UX must be "sticky", not "bouncy"
Paolo Ramazzotti
Thank You!