Some posts and pages should not show up in search results. To keep them out, you need to tell search engines to exclude them, which you do with a meta robots noindex tag. Setting a page to noindex makes sure search engines never show it in their results. Here, we’ll explain how easy it …Read: "Noindex a post or page in WordPress, the easy way!"
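In its simplest form, the noindex directive that excerpt describes is a single meta tag in the page’s `<head>` (in WordPress, Yoast SEO can set this for you from a post’s advanced settings, so you rarely write it by hand); a minimal sketch:

```html
<!-- Tells search engines not to show this page in their results,
     while still allowing them to follow the links on it. -->
<meta name="robots" content="noindex, follow" />
```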
Crawl directives archives
Recent Crawl directives articles
Your site needs to be up and running if you want to be found in search engines. If you aren’t blocking anything — deliberately or accidentally — search engine spiders can crawl and index it. You probably know that Yoast SEO has lots of options to determine what does and doesn’t need to be indexed, but …Read: "Yoast SEO & Ryte: Checking your site’s indexability"
Google doesn’t always spider every page on a site instantly. In fact, sometimes, it can take weeks. This might get in the way of your SEO efforts. Your newly optimized landing page might not get indexed. At that point, it’s time to optimize your crawl budget. We’ll discuss what a ‘crawl budget’ is and what you …Read: "How to optimize your crawl budget"
It can happen to anyone: You’re working on your site, fiddling with some posts here and there, and hit update when you’re done. After a while, you check back on how a post is doing and, to your dismay, it has disappeared completely from the search engines! It turns out you’ve accidentally set a post or …Read: "Help, I’ve accidentally noindexed a post. What to do?"
Your robots.txt file is a powerful tool when you’re working on a website’s SEO – but it should be handled with care. It allows you to deny search engines access to different files and folders, but often that’s not the best way to optimize your site. Here, we’ll explain how we think webmasters should use their …Read: "WordPress robots.txt: Best-practice example for SEO"
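To make the idea concrete, here is a minimal robots.txt sketch of what "denying access to files and folders" looks like. The folder name is a made-up example, not the best-practice file from the linked article, which argues for keeping robots.txt as lean as possible:

```text
# Applies to all crawlers; keeps them out of one example folder
# while leaving the rest of the site open to crawling.
User-agent: *
Disallow: /example-private-folder/
```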
Sometimes Google makes announcements about new features and we go “huh, why did they do that?” This week we had one of those. Google introduced a new set of robots meta controls that allow sites to limit the display of their snippets in the search results. There is a reason for that, but they buried …Read: "Robots meta changes for Google"
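The new controls are extra directives in the familiar robots meta tag. A sketch with example values (the numbers here are illustrative, not a recommendation):

```html
<!-- Limits text snippets to ~150 characters, allows large image previews,
     and places no limit on video preview length. -->
<meta name="robots" content="max-snippet:150, max-image-preview:large, max-video-preview:-1" />
```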
If you want to keep your page out of the search results, there are a number of things you can do. Most options aren’t hard, and you can implement them without a ton of technical knowledge. If it’s a matter of checking a box, your content management system will probably have an option for that. Or allows …Read: "How to keep your page out of the search results"
Why should you block your internal search result pages for Google? Well, how would you feel if you were in need of an answer to your search query and ended up on the internal search pages of a certain website? That’s one crappy experience. Google thinks so too. And prefers you not to have these internal search …Read: "Block your site’s search result pages"
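As a sketch of one common approach: in a default WordPress setup, internal search results live at URLs like `/?s=query`, so a robots.txt rule can keep crawlers out of them. The `/search/` line assumes a site that rewrites search URLs to that path, which not every install does; a noindex on the search template is an alternative:

```text
# Keep crawlers out of internal search result pages.
User-agent: *
Disallow: /?s=
Disallow: /search/
```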
In our major Yoast SEO 7.0 update, there was a bug concerning attachment URLs. We quickly resolved the bug, but some people suffered anyway (because they updated before our patch). This post serves both as a warning and an apology. We want to ask all of you to check whether your settings for the …Read: "Media / attachment URL: what to do with them?"
Crawl errors occur when a search engine tries to reach a page on your website but fails. Let’s shed some more light on crawling first. Crawling is the process by which a search engine tries to visit every page of your website via a bot. A search engine bot finds a link to your website and starts …Read: "What are crawl errors?"