Search Engine Optimization (SEO) on JavaScript

JavaScript is one of the world’s most popular programming languages. It is an implementation of the ECMAScript standard. Its first version appeared in 1995, and the language has been continuously improved ever since to reach its current form.

This language is most often used in the development of applications and websites. A common mistake is confusing Java and JavaScript: despite the similar names, they are two different languages.

A notable feature of the language is that it has borrowed ideas from other programming languages (Java, Scheme, and others) to make it as comfortable as possible to work with. This also makes it relatively easy to pick up: even users with little programming experience can get started with it.

The main functions of JavaScript:

  • Modifying pages in the browser;
  • Adding or removing tags;
  • Changing page styles;
  • Reacting to the user’s actions on the page;
  • Accessing an arbitrary part of a page’s source code;
  • Modifying that code;
  • Reading and writing cookies.
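
As a quick illustration of the items above, here is a minimal browser-side sketch (the element, text, and cookie name are invented for the example):

```javascript
// Add a new tag to the page
const banner = document.createElement('div');
banner.textContent = 'Welcome back!';
document.body.appendChild(banner);

// Change its style
banner.style.backgroundColor = '#f5f5f5';

// React to a user action on the page
banner.addEventListener('click', () => banner.remove());

// Read and write cookies
document.cookie = 'visited=true; max-age=86400';
console.log('Returning visitor:', document.cookie.includes('visited=true'));
```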

The scope of this language is surprisingly broad. It is used to develop text editors, applications (for computers, mobile devices, and even servers), and other application software.

Most typically, though, this programming language is used to create websites. In this article, we’ll describe the rules of search engine optimization for sites built with this language. You will learn about JavaScript crawling, the rules for using scripts, and many other aspects of working with JS.

JavaScript and SEO

There are several reasons why it is important to pay attention to good JavaScript SEO when working with such sites:

  • Incorrect implementation of JavaScript can result in some page content simply not being indexed. It is worth noting right away that Google handles JS processing very well: the search engine announced back in 2013 that it would start using CSS and JS when crawling resources, and it has made great strides in this direction since then. Google can process and index JavaScript in most cases, as long as there are no errors in the code. If the code is not written quite correctly, however, even a small error can prevent the search robot from indexing the content that becomes available after JS processing. This can negatively affect the site’s ranking if that content is important.
  • Page segments that become available after JS processing often contain internal links to other pages on the site. This means that if the search robots cannot render JavaScript, they will not be able to follow these links. As a result, pages that have no links from other pages on the site and are not listed in the sitemap will remain hidden from indexing.
  • JavaScript files can be quite heavy, and adding them to a page can significantly slow down its loading speed. This often results in a higher bounce rate and lower visibility.

All these problems can be avoided if you understand how search engines analyze JS code and how to optimize it. That’s why further in this article we’ll talk about technical optimization. The vast majority of websites use JS to improve the user experience: to collect statistics, add interactivity, load content, and power menus, buttons, and other elements. The goal of SEO specialists is to make content crawling easier and, where possible, avoid the issues that typically arise when processing pages with JavaScript.

Search engines are trying to get the same content that users see in the browser. At Google, the Web Rendering Service (WRS, part of the Caffeine indexing system) is responsible for the rendering and crawling process.

Note that Google receives the final HTML for processing, but it also fetches and caches almost all the additional resources needed to render the page (JS files, CSS, synchronous and asynchronous XHR requests, API endpoints, and so on). Googlebot may ignore some resources it does not consider important enough for displaying content.

How Does Google Index JS?

Googlebot sends a GET request to the server and receives HTTP headers and the page content in response. If crawling is not prohibited in the headers or the robots meta tag, the URL is queued for rendering.

It is essential to keep in mind that with mobile-first indexing, in most cases the request comes from Google’s mobile user agent. You can check which robot is crawling your site in Search Console (the URL Inspection section).

Keep in mind that in the page’s HTTP headers you can configure rules for individual user agents, for example, disabling indexing for them or showing them different content than other visitors.
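
For illustration only, assuming a Node.js/Express backend (which the article does not prescribe), a per-user-agent rule could look like the sketch below, using the X-Robots-Tag response header:

```javascript
const express = require('express');
const app = express();

app.get('/internal-report', (req, res) => {
  const userAgent = req.get('User-Agent') || '';

  // Hypothetical rule: keep this page out of the index for crawlers
  if (/bot|crawler|spider/i.test(userAgent)) {
    res.set('X-Robots-Tag', 'noindex, nofollow');
  }

  res.send('<h1>Internal report</h1>');
});

app.listen(3000);
```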

It is important to remember that links loaded with JavaScript will not be detected until the rendering process is finished. This significantly slows down the crawling of the site, because Google has to keep adjusting the structure and relative importance of pages as they are rendered and new JS-injected links are discovered.
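
For example, internal links injected by client-side code, as in the sketch below (the /api/related-pages endpoint is hypothetical), are not present in the initial HTML, so Googlebot only discovers them after rendering:

```javascript
// These links exist only after rendering; a crawler that does not
// execute JavaScript never sees them in the raw HTML.
fetch('/api/related-pages') // hypothetical endpoint
  .then((response) => response.json())
  .then((pages) => {
    const container = document.querySelector('#related');
    pages.forEach((page) => {
      const link = document.createElement('a');
      link.href = page.url;        // a plain <a href> that Googlebot can follow once rendered
      link.textContent = page.title;
      container.appendChild(link);
    });
  });
```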

General Recommendations for JavaScript SEO

Google’s algorithms keep improving and are increasingly accurate at assessing how useful content is for the user. Search engines even track social and behavioral signals to evaluate how visitors interact with a site. This will remain true in 2022.

How do you get a JS site indexed well? Here are a few rules:

  • Large volume. Long articles will work better than short and medium-length articles.
  • Structure. Large articles especially need a well-thought-out structure to make them easy and fun to read.
  • Usefulness and relevance. Write about what your customers care about. Answer frequently asked questions; create guides on choosing, configuring, and using your products, as well as collections of useful tips and life hacks. Collect a pool of informational queries on your topic, select the most frequent ones, and write articles based on them.
  • Video. Supplement text content with the video format. This will attract users who are not very fond of reading, and at the same time improve behavioral factors (time on page, interaction with content).
  • Detailed answer to the user’s question. Expand the topic of the article in as much detail as possible, so that the reader leaves satisfied, having received a clear and detailed answer. The article will be big, and that’s good.
  • Updating old articles. If you have articles that drive traffic but are several years old, update them. Add new information, refresh the content, and change the publication date. Otherwise, you risk losing rankings and traffic. Search engines prefer to show fresh content at the top.

Users are increasingly making requests to Google through the voice assistants Siri and Google Assistant. It’s faster than typing manually, and convenient if your hands are busy. A 2019 survey found that 48% of users use voice assistants to search the internet. We think that now this figure is even higher.

Voice queries are different from text queries: users phrase them differently. Written queries are concise and usually contain up to five words, while voice queries are longer and use different word forms, extra words, and question phrases. SEO for JavaScript sites must take such queries into account.

When optimizing a site for voice search, you need to select and analyze long-tail queries and then include them in your texts, headings, and subheadings.

Please note that voice searches are often questions such as “Where?”, “How?”, or “How much?”. You can include such user questions in your content and provide answers to them.

Semantically Related Words Grow in Importance

Google’s algorithms are sophisticated artificial intelligence systems. They can analyze the context of a query, understand the user’s search purpose, and give answers to fairly complex questions. They no longer focus only on exact or synonymous matches between words in a query and on a website page. Search engine AI works in a much more complex way and is getting closer to understanding a query as a human rather than as a machine.

Therefore, the more semantically related (LSI) and topical words there are on a page, the more queries it will be relevant for, especially low-frequency ones. Even pages that contain no exact occurrences of the words from a query, but do contain LSI and thematic phrases, may be considered relevant. This way, you can rank for a larger number of queries without stuffing keywords.

One way to naturally saturate the text with LSI-phrases and thematic words is to write clear texts, diving deep into the topic and answering questions of your target audience in detail.

9 Tips to Speed up Your Site with JavaScript

The site can run faster if the following recommendations for JS code optimization are implemented. Site speed is essential for good search engine optimization: back in 2010, Google announced that loading speed was a ranking factor.

In 2017, things got even more serious: Google introduced its “mobile-first” approach, which checks whether a site has a version adapted for mobile devices, and yes, page load speed is also on the list of parameters it evaluates. SOASTA’s 2017 research found that as page load time grows from 1 to 10 seconds, the probability of a visitor bouncing increases by 127%.

Other research, conducted by Unbounce, found that 70% of respondents would hesitate to purchase from an online store if the site takes too long to load.

This is why it is so important to achieve high loading speeds and optimize scripts when doing SEO for JavaScript sites. We have collected 9 tips on working with scripts on JS sites.

  1. If some script libraries are not needed on a page, disable them. During development, scripts are often included that are never used later, for example, debugging scripts or scripts you planned to use but then abandoned. You should not force the user to load them.
  2. Non-priority JS scripts should be loaded after the page is rendered (place the code at the end of the body section for this purpose). Some scripts create effects that are not triggered immediately after the page loads, so delaying them does not affect the first impression of the site. An automatic scrolling slider or a visit-statistics script can easily be initialized after the page has rendered in the browser. It is better to let the user see the page quickly than to make them wait while non-essential scripts load (see the loading sketch after this list).
  3. Download scripts from subdomains, other domains, or a CDN. Even the most advanced browsers limit the number of simultaneous connections to a single domain. If a page contains many images, styles, and scripts, a download queue forms. Since the connection limit applies per domain, JS files loaded from another domain (or subdomain) will load faster thanks to parallelism.
  4. Blocking scripts should not be loaded from unknown external domains. If an external domain stops responding, or starts responding with a long delay, it will slow down your entire site. If you use external domains for scripts, make sure those domains are stable enough.
  5. JS should be compressed, minified, and optimized. Removing spaces and line breaks, shortening variable names, and other optimizations greatly reduce the size of script files and speed up loading. When using third-party libraries, include the minified version in the production environment. For your own scripts, use tools that compress and optimize JavaScript, such as UglifyJS, JSMin, Closure Compiler, or YUI Compressor. When applying aggressive optimizations, be sure to read the recommendations for the compressor you use and test the resulting files.
  6. Scripts used together should be merged into one file. Downloading one 50 KB file is faster than downloading ten 5 KB files: it puts less load on the server, and compression works more efficiently on larger files.
  7. Use GZIP to compress data. Modern browsers support compressed responses. The optimal approach is to pre-compress the scripts at the maximum compression level and have the web server serve the pre-compressed files (see the caching and compression sketch after this list).
  8. Cache scripts on the client side. Do not force users to download the same script multiple times. Use appropriate headers in the web server response (Expires or Cache-Control: max-age, Last-Modified, or ETag).
  9. Do not use big libraries unnecessarily. For example, there is no need to include jQuery on every page just to add a few elementary effects to the homepage. If 30 lines of simple JS code are enough to implement all of the site’s interactivity, pulling in a large library is simply irrational.
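
A minimal sketch for tip 2: loading a non-critical script (the stats.js file name is hypothetical) only after the page has finished rendering. An alternative is simply adding the defer attribute to the script tag in the HTML.

```javascript
// Load a non-critical script (e.g. a visit-statistics collector)
// only after the page itself has finished loading.
window.addEventListener('load', () => {
  const script = document.createElement('script');
  script.src = '/js/stats.js'; // hypothetical analytics script
  document.body.appendChild(script);
});
```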
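
And a sketch for tips 7 and 8, assuming a Node.js/Express server (the article does not name a server stack): here the compression middleware gzips responses on the fly rather than serving pre-compressed files, and express.static adds ETag and Last-Modified headers by default.

```javascript
const express = require('express');
const compression = require('compression'); // gzip middleware
const app = express();

// Compress responses on the fly (pre-compressing files at build time
// and serving them directly is the more aggressive option).
app.use(compression());

// Serve scripts with long-lived client-side caching.
app.use('/js', express.static('public/js', {
  maxAge: '30d',   // Cache-Control: public, max-age=2592000
  immutable: true, // tells the browser the file never changes at this URL
}));

app.listen(3000);
```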

Conclusion

We have described the main principles of how JavaScript and SEO interact. In this article, you learned about the technical aspects of how Google crawls JavaScript, code optimization, the use of scripts, and much more. Don’t forget that beyond this, it’s important to follow the other rules of search engine optimization: for example, you should fill JS sites with unique content that contains keywords, which requires building a relevant semantic core. Follow these recommendations, and your JS site will be well indexed by Google and other search engines.