We get lots of questions every day - in our Webmaster office hours, at conferences, in the webmaster forum and on Twitter. One of the more frequent themes among these questions is links, especially links generated through JavaScript.
In our Webmaster Conference Lightning Talks video series, we recently addressed the most frequently asked questions on Links and JavaScript:
Note: This video also has subtitles available in many languages.
During the live premiere, we also held a Q&A session with a few additional questions from the community, and we decided to publish those questions and our answers, along with some other frequently asked questions around links and JavaScript.
Googlebot parses the HTML of a page, looking for links to discover the URLs of related pages to crawl. To discover these pages, you need to make your links actual HTML links, as described in the webmaster guidelines on links.
Googlebot extracts the URLs from the href attribute of your links and then enqueues them for crawling. This means the URL needs to be resolvable; simply put, the URL should work when entered into a browser's address bar. See the webmaster guidelines on links for more information.
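To illustrate, here's a minimal sketch in Python of href-based link extraction. This isn't Googlebot's actual implementation, and the URLs are hypothetical, but it shows why a crawler can only discover links that carry a real href attribute:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute URLs from the href attribute of <a> tags,
    mimicking (in a very simplified way) how a crawler discovers links."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:  # only links with a real href yield a crawlable URL
                self.links.append(urljoin(self.base_url, href))

# Hypothetical page: one real HTML link, one JavaScript-only "link".
html = '<a href="/products">Products</a> <a onclick="goTo(\'offers\')">Offers</a>'
parser = LinkExtractor("https://example.com/")
parser.feed(html)
print(parser.links)  # ['https://example.com/products']
```

The onclick-only element yields no URL at all; only the element with an href is discoverable.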
As long as these links fulfill the criteria outlined above and in our webmaster guidelines, yes.
When Googlebot renders a page, it executes JavaScript and then discovers the links generated from JavaScript, too. It's worth mentioning that link discovery can happen twice, before and after JavaScript executes, so having your links in the initial server response allows Googlebot to discover them a bit faster.
Fragment URLs, also known as "hash URLs", are technically fine, but might not work the way you expect with Googlebot.
Fragments are supposed to be used to address a piece of content within the page and when used for this purpose, fragments are absolutely fine.
Sometimes developers decide to use fragments with JavaScript to load different content than what is on the page without the fragment. That is not what fragments are meant for and won't work with Googlebot. See the JavaScript SEO guide on how the History API can be used instead.
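As a sketch of why fragment routing is invisible to crawlers: browsers never send the fragment to the server, so URLs that differ only by fragment fetch the same resource. The snippet below (with hypothetical URLs) strips the fragment the way a fetch effectively does; the History API avoids this problem by producing real, distinct URLs.

```python
from urllib.parse import urlsplit, urlunsplit

def fetched_url(url):
    """Strip the fragment: it is resolved client-side and never sent to
    the server, so two URLs differing only by fragment fetch the same page."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, parts.query, ""))

# A fragment addressing a section within the page - fine:
print(fetched_url("https://example.com/guide#installation"))
# Fragment-based "routing" - both resolve to the same fetched URL:
print(fetched_url("https://example.com/#/products"))
print(fetched_url("https://example.com/#/about"))
```

The last two calls both return `https://example.com/`, which is why content loaded only behind a fragment looks like a single page to a crawler.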
The AJAX crawling scheme has long been deprecated. Do not rely on it for your pages.
We recommend using the History API and migrating your web apps to URLs that don't rely on fragments to load different content.
This post was inspired by the first installment of the Webmaster Conference Lightning Talks. Make sure to subscribe to our YouTube channel for more videos to come! We definitely recommend joining the premieres on YouTube to participate in the live chat and Q&A session for each episode!
If you are interested in seeing more Webmaster Conference Lightning Talks, check out the Google Monetized Policies video and subscribe to our channel to stay tuned for the next one!
Join the webmaster community in the upcoming video premieres and in the YouTube comments!
In this time of a global pandemic, webmasters across the world—from government officials to health organizations—are frequently updating their websites with the latest information meant to help fight the spread of COVID-19 and provide access to resources. However, they often lack the time or funding to translate this content into multiple languages, which can prevent valuable information from reaching a diverse set of readers. Additionally, some content may only be available via a file, e.g. a .pdf or .doc, which requires additional steps to translate.
To help these webmasters reach more users, we’re reopening access to the Google Translate Website Translator—a widget that translates web page content into 100+ different languages. It uses our latest machine translation technology, is easy to integrate, and is free of charge. To start using the Website Translator widget, sign up here.
Please note that usage will be restricted to government, non-profit, and/or non-commercial websites (e.g. academic institutions) that focus on COVID-19 response. For all other websites, we recommend using the Google Cloud Translation API.
Google Translate also offers both webmasters and their readers a way to translate documents hosted on a website. For example, if you need to translate this PDF file into Spanish, go to translate.google.com and enter the file’s URL into the left-hand textbox, then choose “Spanish” as the target language on the right. The link shown in the right-hand textbox will take you to the translated version of the PDF file. The following file formats are supported: .doc, .docx, .odf, .pdf, .ppt, .pptx, .ps, .rtf, .txt, .xls, and .xlsx.
Finally, it’s very important to note that while we continuously look for ways to improve the quality of our translations, they may not be perfect, so please use your best judgement when reading any content translated via Google Translate.
Google Webmaster Help forums are a great place for website owners to help each other, engage in friendly discussion, and get input from awesome Product Experts. We currently have forums operating in 12 languages.
We’re happy to announce the re-opening of the Polish and Turkish webmaster communities with support from a global team of Community Specialists dedicated to helping support Product Experts. If you speak Turkish or Polish, we'd love to have you drop by the new forums yourself, perhaps there's a question or a challenge you can help with as well!
Current Webmaster Product Experts are welcome to join & keep their status in the new communities as well. If you have previously contributed and would like to start again, feel free to resume posting, and ask others in the community if you have any questions.
We look forward to seeing you there!
Posted by Aaseesh Marina, Webmaster Product Support Manager
The Google Webmaster Help forums are a place where site owners can help each other, join discussions, and get tips from Product Experts. Our forums currently operate in 12 languages.
We're happy that, with the help of a global team of Community Specialists dedicated to supporting Product Experts, we can bring back the Polish and Turkish webmaster communities. If you speak either of these languages, drop by our new forums - perhaps you'll find a question or a challenge there that you can help with!
We invite current Webmaster Product Experts to join the new communities while keeping their existing status. If you'd like, you can return to posting; if you have questions, ask other members of the community.
See you there!
Posted by Aaseesh Marina, Webmaster Product Support Manager
As a part of the Webmaster Conference series, last fall we held a Product Summit at Google's headquarters in Mountain View, California. It was slightly different from our previous events, with a number of product managers and engineers from Google Search taking part. We recorded the talks held there, and are happy to be able to make these available to all of you now.
In the playlist you'll find:
We hope you find these videos insightful, useful, and a bit entertaining! And if you are not subscribed to the Webmasters YouTube channel, here’s your chance!
Posted by John Mueller, Search Advocate, Google Switzerland
As the effects of the coronavirus grow, we've seen businesses around the world looking for ways to pause their activities online. With a view to coming back and being there for your customers, here's an overview of our recommendations on how to pause your business online and minimize the impact on Google Search. These recommendations are applicable to any business with an online presence, but particularly to those who have paused the selling of their products or services online. For more detailed information, also check our developer documentation.
If your situation is temporary and you plan to reopen your online business, we recommend keeping your site online and limiting its functionality. For example, you might mark items as out of stock, or restrict the cart and checkout process. This is the recommended approach since it minimizes any negative effects on your site's presence in Search. People can still find your products, read reviews, or add items to wishlists so they can purchase at a later time.
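One common way to signal availability on a product page that stays live is schema.org Offer markup. Below is a minimal sketch that builds such JSON-LD in Python; the product details are made up, but the @type and availability values follow the schema.org vocabulary:

```python
import json

# Minimal schema.org Product markup with the offer marked out of stock.
# Keeping the page live with this markup lets people still find the
# product, even though it can't be purchased right now.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",  # hypothetical product
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/OutOfStock",
    },
}
json_ld = json.dumps(product, indent=2)
print(json_ld)  # paste into a <script type="application/ld+json"> tag
```

When the item becomes purchasable again, switching the availability value back to InStock restores the original state.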
It's also a good practice to:
For more information, check our developers documentation.
As a last resort, you may decide to disable the whole website. This is an extreme measure that should only be taken for a very short period of time (a few days at most), as it will otherwise have significant effects on the website in Search, even when implemented properly. That’s why it’s highly recommended to only limit your site's functionality instead. Keep in mind that your customers may also want to find information about your products, your services, and your company, even if you're not selling anything right now.
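If you do take the site offline briefly, a commonly recommended pattern (see the developer documentation for the authoritative guidance) is to answer requests with HTTP 503 plus a Retry-After header, which signals a temporary outage rather than removed content. Here's a self-contained sketch using Python's standard library; the retry interval and message are illustrative:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 Service Unavailable plus Retry-After,
    signalling a temporary outage rather than a vanished site."""
    def do_GET(self):
        body = b"<h1>We're temporarily closed - back soon.</h1>"
        self.send_response(503)
        self.send_header("Retry-After", "86400")  # suggest retrying in ~1 day
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging for this demo
        pass

# Demo: run the handler locally and fetch one page to see the response.
server = HTTPServer(("127.0.0.1", 0), MaintenanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
try:
    urllib.request.urlopen("http://127.0.0.1:%d/" % server.server_address[1])
except urllib.error.HTTPError as err:
    status, retry_after = err.code, err.headers["Retry-After"]
server.shutdown()
print(status, retry_after)  # 503 86400
```

A 503 tells crawlers the outage is temporary; serving 404s or an empty 200 page instead would suggest the content is gone for good.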
If you decide that you need to do this (again, which we don't recommend), here are some options:
Proceed with caution: To elaborate why we don't recommend disabling the whole website, here are some of the side effects:
Beyond the operation of your website, there are other actions you might want to take to pause your online business in Google Search:
Also be sure to keep up with the latest by following updates on Twitter from Google Webmasters at @GoogleWMC and Google My Business at @GoogleMyBiz.
What if I only close the site for a few weeks?
Completely closing a site even for just a few weeks can have negative consequences on Google's indexing of your site. We recommend limiting the site functionality instead. Keep in mind that users may also want to find information about your products, your services, and your company, even if you're currently not selling anything.
What if I want to exclude all non-essential products?
That's fine. Make sure that people can't buy the non-essential products by limiting the site functionality.
Can I ask Google to crawl less during this time?
Yes, you can limit crawling with Search Console, though it's not recommended for most cases. This may have some impact on the freshness of your results in Search. For example, it may take longer for Search to reflect that all of your products are currently not available. On the other hand, if Googlebot's crawling causes critical server resource issues, this is a valid approach. We recommend setting a reminder for yourself to reset the crawl rate once you plan to reopen for business.
How do I get a page indexed or updated quickly?
To ask Google to recrawl a limited number of pages (for example, the homepage), use Search Console. For a larger number of pages (for example, all of your product pages), use sitemaps.
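A sitemap for this purpose is just an XML file listing your URLs, where an updated lastmod date hints that the pages have changed and may need recrawling. Here's a minimal sketch using Python's standard library, with hypothetical URLs:

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages, lastmod):
    """Build a minimal XML sitemap; an updated <lastmod> hints to search
    engines that the listed pages have changed since the last crawl."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical product pages that were just marked out of stock:
sitemap = build_sitemap(
    ["https://example.com/products/a", "https://example.com/products/b"],
    "2020-04-01",
)
print(sitemap)
```

The resulting file would be saved on the server and referenced from robots.txt or submitted in Search Console.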
What if I block a specific region from accessing my site?
Google generally crawls from the US, so if you block the US, Google Search generally won't be able to access your site at all. We don't recommend temporarily blocking an entire region from accessing your site; instead, we recommend limiting your site's functionality for that region.
Should I use the Removals Tool to remove out-of-stock products?
No. People won't be able to find first-hand information about your products on Search, and there might still be third-party information for the product that is incorrect or incomplete. It's better to keep the page up and mark the product as out of stock. That way people can still understand what's going on, even if they can't purchase the item. If you remove the product from Search, people won't know why it's not there.
--------
We realize that any business closure is a big and stressful step, and not everyone will know what to do. If you notice afterwards that you could have done something differently, all is not lost: we try to make our systems robust so that your site will be back in Search as quickly as possible. Like you, we're hoping that this crisis comes to an end as soon as possible. We hope that with this information, you're able to have your online business up & running quickly when that time comes. Should you run into any problems or questions along the way, please don't hesitate to use our public channels to get help.
It's been a few years since Google started working on mobile-first indexing - Google's crawling of the web using a smartphone Googlebot. From our analysis, most sites shown in search results are good to go for mobile-first indexing, and 70% of those shown in our search results have already shifted over. To simplify, we'll be switching to mobile-first indexing for all websites starting September 2020. In the meantime, we'll continue moving sites to mobile-first indexing when our systems recognize that they're ready.
When we switch a domain to mobile-first indexing, you'll see an increase in Googlebot's crawling as we update our index to your site's mobile version. Depending on the domain, this change can take some time. Afterwards, we'll still occasionally crawl with the traditional desktop Googlebot, but most crawling for Search will be done with our mobile smartphone user-agent. The exact user-agent name used will match the Chromium version used for rendering.
In Search Console, there are multiple ways to check for mobile-first indexing. The status is shown on the settings page, as well as in the URL Inspection Tool, when checking a specific URL with regards to its most recent crawling.
Our guidance on making all websites work well for mobile-first indexing continues to be relevant, for new and existing sites. In particular, we recommend making sure that the content shown is the same (including text, images, videos, links), and that meta data (titles and descriptions, robots meta tags) and all structured data is the same. It's good to double-check these when a website is launched or significantly redesigned. In the URL Testing Tools you can easily check both desktop and mobile versions directly. If you use other tools to analyze your website, such as crawlers or monitoring tools, use a mobile user-agent if you want to match what Google Search sees.
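For example, with Python's standard library you can prepare requests that use a mobile user-agent. The UA string below is an illustrative smartphone string, not Googlebot's exact current token (which changes with the Chromium version used for rendering):

```python
import urllib.request

# Illustrative smartphone user-agent string (hypothetical device; not
# Googlebot's exact, current token).
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 10; Pixel 3) AppleWebKit/537.36 "
             "(KHTML, like Gecko) Chrome/80.0.3987.92 Mobile Safari/537.36")

def mobile_request(url):
    """Prepare a request that fetches the page the way a phone would,
    so what your tools analyze matches what mobile-first indexing sees."""
    return urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})

req = mobile_request("https://example.com/")
print(req.get_header("User-agent"))
```

Passing such a request to `urllib.request.urlopen` would then fetch the mobile version of any site that varies its response by user-agent.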
While we continue to support various ways of making mobile websites, we recommend responsive web design for new websites. We suggest not using separate mobile URLs (often called "m-dot") because of issues and confusion we've seen over the years, both from search engines and users.
Mobile-first indexing has come a long way. It's great to see how the web has evolved from desktop to mobile, and how webmasters have helped to allow crawling & indexing to match how users interact with the web! We appreciate all your work over the years, which has helped to make this transition fairly smooth. We’ll continue to monitor and evaluate these changes carefully. If you have any questions, please drop by our Webmaster forums or our public events.