These points were brought up during the latest SEO Mythbusting video, which focuses on web performance.
Joined by Ada Rose Cannon of Samsung, Splitt discussed a number of topics about web performance as it relates to SEO.
Here are some highlights from the discussion.
As a result of Google’s two-pass indexing process, fresh content on a JS-heavy site may not be indexed in search results for up to a week after it has been published.
When crawling a JS-heavy web page, Googlebot will first process the non-JS content, such as the server-rendered HTML and CSS.
The page is then put into a queue, and Googlebot renders and indexes the remaining JS-generated content when more resources become available.
Use dynamic rendering to avoid a delay in indexing
One way to get around the problem of indexing lag, other than using hybrid rendering or server-side rendering, is to utilize dynamic rendering.
Dynamic rendering serves Googlebot a statically rendered version of a page, which helps it get indexed faster.
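In practice, dynamic rendering setups branch on the request's user agent: known crawlers receive a pre-rendered static HTML snapshot, while regular visitors get the normal client-rendered app. Here is a minimal sketch in Python; the bot list and file names are illustrative assumptions, not details from the video, and real deployments typically use a dedicated rendering service and verify crawler identity rather than trusting the user-agent string alone:

```python
import re

# Illustrative crawler tokens; production systems match against a
# maintained list and often verify crawlers via reverse DNS lookup.
BOT_PATTERN = re.compile(r"googlebot|bingbot|baiduspider", re.IGNORECASE)

def is_crawler(user_agent: str) -> bool:
    """Return True if the request appears to come from a search crawler."""
    return bool(BOT_PATTERN.search(user_agent or ""))

def choose_response(user_agent: str) -> str:
    """Pick which version of the page to serve.

    Crawlers get a static HTML snapshot (pre-rendered ahead of time,
    e.g. with a headless browser); regular users get the normal
    JS-driven page. File names here are hypothetical.
    """
    if is_crawler(user_agent):
        return "static-snapshot.html"
    return "app-shell.html"

print(choose_response("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # static-snapshot.html
print(choose_response("Mozilla/5.0 (Windows NT 10.0) Chrome/90"))  # app-shell.html
```

Because the crawler sees fully rendered HTML on the first pass, the content does not have to wait in the render queue before it can be indexed.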
Rely mostly on HTML and CSS, if possible
When it comes to crawling, indexing, and overall user experience, it's best to rely primarily on HTML and CSS.
For further information, see the full video below: