Drupal + SEO: Optimizing Your Site for Search Engines

Dmitrii Susloparov

 

 

A few months ago, when I wrote my first article about Drupal, some people noticed that listing essential security modules requires solid development experience. That's true: since I started working at Vardot, I have had to interview many developers and site builders before writing every single blog post. But the topic I've chosen this time is my favorite. It is something (besides the basics of the Russian language) that I teach my colleagues rather than learn from them, and it is called search engine optimization.

 

 

Vardot boosted its organic traffic by 202% over the last six months, and in this article I'll share our best practices with you - use them to improve the search engine rankings of your Drupal sites, too. So how do you make the most of search engine optimization when it comes to Drupal? Here are our answers.

 

1. Pay attention to UX

UX design covers every touch point a user can have with your organization. A better UX design leads to happier customers, while a bad one works as a customer repellent. If your product is marketed through word of mouth, more and more people start searching for it; Google recognizes this as a very positive trend and increases your organic visibility. However, Google's latest search algorithms analyze not only the popularity of your brand but also the behavior of your site visitors. Search engines use the amount of time people spend on a page, the number of pages viewed per session, and the bounce rate as indicators of customer satisfaction. Below you'll find the UX factors that are critically important for developers.

 

  • Design

If a website attracts a good amount of traffic but users can't find what they are looking for, Google lowers the website's position on search engine result pages. An unattractive design or a slow website makes users close it, which hurts your SEO. That is why UX has become one of the important ranking factors.

 

  • Site structure

Website structure is an organic process that starts with an initial design and undergoes a number of tweaks and redesigns as the site takes shape. In relation to SEO, the structure of a website plays a crucial role. There are three main aspects to site structuring: site navigation, internal linking, and URL structuring. When it comes to navigation, avoid anything that makes it difficult for search engines to crawl your website's pages. Second, create internal links, which give you an opportunity to use keyword-rich anchor texts. Finally, URL structuring gives you the added, very important benefit of integrating your company's target keywords into a vital navigation area.


 

  • Proper code

There's a concept in programming known as the time-space tradeoff: making code smaller or more memory-efficient can increase its processing time, and vice versa. The key is to strike the right balance.

Also, it's equally important to keep testing your code for bugs. Bugs raise the bounce rate of a website, because nobody likes to read a page that has loading or other issues. Remember, it's not always about crawlers - it's much more about customer satisfaction.

 

  • Improve Site Speed, Performance and Raise Application Performance Index (Apdex)

Apdex measures the ratio of satisfactory to unsatisfactory response times against a set threshold. Recent studies show that websites with a load delay of just a few seconds have a 7% higher bounce rate than faster ones. Make a checklist of everything that can help you make your site faster.
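As a back-of-the-envelope illustration, the Apdex score can be computed from a list of measured response times and a chosen threshold T (the threshold and sample times below are made up for the example):

```python
def apdex(response_times, threshold):
    """Apdex = (satisfied + tolerating / 2) / total samples.

    satisfied:  response time <= T
    tolerating: T < response time <= 4 * T
    frustrated: anything slower
    """
    satisfied = sum(1 for t in response_times if t <= threshold)
    tolerating = sum(1 for t in response_times if threshold < t <= 4 * threshold)
    return (satisfied + tolerating / 2) / len(response_times)

# Example with T = 0.5 s: two satisfied, two tolerating, one frustrated sample.
times = [0.2, 0.4, 0.6, 1.3, 2.5]  # seconds
print(round(apdex(times, 0.5), 2))  # (2 + 2/2) / 5 = 0.6
```

A score of 1.0 would mean every response was satisfactory; anything drifting toward 0.5 and below is a sign your pages are frustrating visitors.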

 

  • Prevent Duplicate Content

Nothing kills a website faster than duplicate content. You need original, unique texts that convey your message effortlessly. If you do need to republish an article from elsewhere as is, make sure you add a canonical URL to let Google know that you are crediting the original source, not plagiarizing it.

 

  • Remove 404 (content-not-found)

A 404 (content-not-found) error occurs when a web page is deleted or moved to another location and links still point to the old URL. A large number of broken links on your site can hurt its PageRank and decrease its visibility in search engines. Make sure to crawl your website from time to time with tools like SEMrush and remove links pointing to inactive or non-existent pages.
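To find such links, you first have to collect them from your pages. Here is a minimal sketch using only Python's standard library (the sample HTML and URLs are hypothetical); in a real audit you would then request each collected URL and flag any that return a 404:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Returns all links on the page, resolved to absolute URLs."""
    parser = LinkCollector()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

page = '<p><a href="/about">About</a> <a href="https://example.com/gone">Old</a></p>'
print(extract_links(page, "https://example.com"))
# ['https://example.com/about', 'https://example.com/gone']
```

Each resulting URL could then be checked with a HEAD request (for example via urllib.request), and anything answering with status 404 is a candidate for removal or a redirect.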

 

2. Tell search engines what your site is about

 

  • Title tags

 

Title tags are among the most important aspects of efficient SEO for any web page, and it is essential to include your main keywords in the title of the page. Google generally repeats your title tag in the search engine result page (SERP), so it also affects your click-through rate. A well-crafted tag is easy to read and gets more clicks. The rule of thumb is to write like a copywriter - write it in a way that it could also appear on an advertising brochure. Make sure that you have only one title tag per page. Keep things short and crisp: a title tag is more like a punchline, and if you need to say more, change it. The title tag should be about 60 characters long and should include the main keyword. For more advice, check this amazing article.
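A quick sanity check for title tags is easy to automate. The sketch below applies the 60-character rule of thumb mentioned above (the limit and messages are purely illustrative):

```python
def audit_title(title, limit=60):
    """Returns a list of problems found with a proposed <title> text."""
    problems = []
    if not title.strip():
        problems.append("empty title")
    if len(title) > limit:
        problems.append(f"too long: {len(title)} chars (limit {limit})")
    return problems

# A concise title passes with no complaints.
print(audit_title("Drupal + SEO: Optimizing Your Site for Search Engines"))  # []
```

Running such a check over all node titles before publishing catches the worst offenders early, long before Google truncates them in the SERP.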

 

  • Meta tags

Correct use of meta tags increases your website's search rankings. They can be found in the head of every page, i.e. between the <head> tags of the HTML. Statistically, 90% of users look only at the top 20 to 30 search results, so your goal should be to land somewhere in that range or higher. The top search engines today use meta tags to index your pages, so it goes without saying how important they are for getting your site indexed and raising its rankings. There is a whole range of meta tags that Google understands - use them all.

 

  • Meta Description

 

 

The meta description is a brief description of the page - a piece of code situated in the header where you can include extra details. To make it worthwhile for your SEO efforts, follow a few rules. Keep your description between roughly 135 and 160 characters and don't forget to include keywords in it. Keep the tone active and actionable, which simply means using sentences that drive the reader to click on your link. Make your description unique, structured, and a reflection of what your content is about, while still containing the focus keywords. Search Engine Land offers more advice on the topic.

 

  • Open Graph

Open Graph gives a social identity to every page of your website. One of its key roles is adding the snippet that appears automatically when an article is shared; the amount of traffic and clicks you attract is proportional to how engaging that snippet is. Facebook created the protocol to improve the presentation of pages on its social network - by using it, your web page becomes part of Facebook's social graph. The protocol supplies structured, accurate information to the services that read it, and this can also help your SEO, because search engines like Google are able to detect Open Graph meta tags. Facebook additionally provides a debugging tool you can use to check the information that will be presented when a page is shared.
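For illustration, Open Graph tags are ordinary <meta> elements whose property names carry an og: prefix. The small helper below (a hypothetical sketch; in practice a Drupal module would typically generate these for you) renders them from a dict:

```python
from html import escape

def og_tags(properties):
    """Renders a dict of Open Graph properties as <meta> tags."""
    return "\n".join(
        f'<meta property="og:{name}" content="{escape(value, quote=True)}" />'
        for name, value in properties.items()
    )

snippet = og_tags({
    "title": "Drupal + SEO",
    "type": "article",
    "url": "https://example.com/blog/drupal-seo",
})
print(snippet)
```

The escaping step matters: titles frequently contain ampersands or quotes, and an unescaped value silently breaks the tag for every crawler that reads it.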

 

  • ALT Tags for Images

 

 

It is more difficult for search engines to recognize images than text, and alt tags are used to tell crawlers what an image shows. If you follow best practices when writing your alt tags, your images will perform far better in image search. Moreover, for images that are linked, alt tags work the same way anchor texts do for text links. Adding alt tags to all your images may take a little more time than you originally planned, but it does contribute its share to your rankings. Above all, make sure to include keywords in your image descriptions.
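Auditing a page for images without alt text is easy to script. A minimal sketch using Python's standard html.parser (the file names are made up):

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Records the src of every <img> that has no (or an empty) alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    # handle_starttag also receives self-closing <img .../> tags,
    # because HTMLParser routes them through handle_startendtag.
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr = dict(attrs)
            if not attr.get("alt"):
                self.missing_alt.append(attr.get("src", "?"))

audit = AltAudit()
audit.feed('<img src="logo.png" alt="Vardot logo"><img src="hero.jpg">')
print(audit.missing_alt)  # ['hero.jpg']
```

Run this over your rendered pages and you get a to-do list of exactly which images still need descriptions.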

 

3. Create friendly environment for Crawlers

 

  • Search Engine Friendly URLs

Let me illustrate this with an example. Here are two sample URLs: “http://example.com/index.php?page=gallery” and “http://example.com/gallery”. Which one do you find easier to read and understand? The same goes for SEO. Friendly URLs describe the path to a search engine in a way that is easy to understand and call. In the example above, it is the latter that a search engine bot will handle easily.
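Friendly URL segments are usually produced from page titles by a “slugify” step. Drupal can generate such aliases for you (for example via the Pathauto module), but a bare-bones sketch of the idea looks like this (the sample title is invented):

```python
import re

def slugify(title):
    """Turns an arbitrary page title into a clean URL path segment."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse any run of non-alphanumerics
    return slug.strip("-")                   # drop leading/trailing separators

print(slugify("Photo Gallery: Summer 2018!"))  # photo-gallery-summer-2018
```

The result is readable for humans, keyword-bearing for crawlers, and free of the query-string noise in the first sample URL above.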

 

  • XML Site-map

An XML sitemap is a document that lets a website's webmaster inform Google and other search engines about the site and helps them identify and understand every URL, which results in better crawling. An XML sitemap contains information such as when a page was last updated, how often it changes, and the relevance and importance of a given page in relation to other pages. It also lets you exclude some pages from the bot's crawl.
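On a Drupal site a sitemap module typically produces this file for you, but as a sketch of what it contains, here is how one can be generated with Python's standard library (the URL and date are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """pages: list of (url, lastmod, changefreq) tuples."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([("https://example.com/", "2018-09-01", "weekly")]))
```

Excluding a page from the crawl is then simply a matter of leaving it out of the list you feed in.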

 

  • Add Site Theming

Imagine a web crawler trying to go through a page built from “divs” everywhere. It would be hard to tell the part of the document meant for navigation from the main article. Crawlers can analyze your document's composition using hints - for example, a “ul” list of internal links can mean page navigation - but if a “nav” element is used instead, the crawler understands right away that this is navigation. The H1 heading contains the category name or the product name. H2 is used for subheadings and to break content into distinct blocks that are easier to scan. H3 headings carry less weight and are mostly used for minor headings further down the page.

 

4. Help people find you

 

  • Site Verifications

Site verification on Google, Bing, Yandex, and other major search engines lets them know that you are the actual owner of the given website. Once your ownership is verified, Google gives you access to private data about your site, which helps you improve how its spider bot crawls your website.

 

  • Resource Description Framework (RDF)

Make sure your site's markup follows the globally accepted Resource Description Framework (RDF) standard, which enriches metadata description for the web.

  • Google News Site-map

A Google News sitemap has a number of benefits compared to merely listing the URLs of every page of your website. With proper titles and publication dates tagged in the sitemap, crawlers can categorize your content more accurately. Moreover, you get the ability to annotate your content with keywords, stock tickers, and other metadata.

  • Multilingual content

If your site supports multiple languages, you can surely reach more potential clients on the internet (check the statistics here). To increase your site's traffic, translate it into different languages and make it valuable to more people.

  • Social Media integration

Social media like Facebook, Twitter, LinkedIn, etc. generate more traffic and give you additional mentions on the web. Although Google doesn't weight shares as heavily as normal backlinks, crawlers still measure the visibility of your site. If people keep talking about it on social networks, your rank in search engines will grow. Produce more valuable content that people will share!

 

5. Analyze your results

 

  • Google Analytics Integration

Usually, people cannot predict everything; that's why testing is as important as the actual development. The best instrument for understanding what your site visitors like and dislike is Google Analytics. Use it to monitor and analyze the traffic and performance of your website.

 

Conclusion

It's true that Drupal is one of the best CMSs in terms of SEO: it helps you generate clean code, prevents duplicate content and spam, easily integrates with third-party tools, and is very user-friendly and configurable. However, I've seen many Drupal sites that are not optimized at all, and as a result they don't get much organic traffic from search engines.

It is very important to remember that our responsibility as Drupal community members is not only to create an SEO-friendly framework for our customers but also to educate them on how to get the most out of it. I hope this article will help you build websites that rank high in search engines and prove that Drupal is one of the best available CMSs for SEO.