S3E3 Mercy Janaki, SEO Challenges with PWAs & JS

Keira Davidson (00:21):

Hi, and welcome to the TechSEO Podcast. I’m Keira Davidson, your host and today I’m joined with Mercy. Would you mind giving a little bit of background information on yourself, please?

Mercy Janaki (00:32):

Thanks, Keira. Thanks for having me. It’s a pleasure talking to you here. And hello everyone. I’m Mercy, I have more than 10 years’ experience, and I enjoy what I do in all [inaudible 00:00:47]. I’m currently working with a growth marketing agency, Position Squared. It is headquartered in Santa Clara, California, and I am based out of Bangalore, India.

Keira Davidson (00:59):

Oh, that sounds really exciting. So today the plan is to talk about progressive web apps, JavaScript, and how all that ties into SEO. Progressive web apps are a relatively new concept to me. How would you go about learning information on this?

Mercy Janaki (01:23):

Yeah, so as you said, more websites are moving to progressive web apps, and there are a lot of advantages, though probably certain disadvantages in terms of SEO. But SEO is not the only part of it.

There are also the users: how quickly you want the content to load, how interactive you want your website to be. For that, [inaudible 00:01:55] is a great solution for anyone.

So based on those factors, I feel that many websites have recently been taking the approach of building the website using a JavaScript framework, something like React JS or Angular JS, and also as a progressive web app. Especially when you are on the agency side, you always get an opportunity to learn more, because you tend to get a lot of new clients and no two clients are the same. Even among eCommerce clients, each website will have a different set of issues, so you need to look for a different set of solutions. That is the advantage for people who work in an agency, I believe. And that is when I started seeing certain websites coming in for optimization that were built on a JavaScript framework, and that is how I started learning about it. And I should say that it is very challenging and very interesting to know.

Keira Davidson (03:12):

Yeah, from what I’ve read around, it’s quite complex, but it definitely sounds super interesting and something I should spend more time learning about. So my understanding is that PWAs and JavaScript can affect the rendering of the page, and obviously it’s really important that there are no issues around that. How would you typically go about checking this, or making sure that it’s okay and working correctly?

Mercy Janaki (03:47):

So as you rightly said, rendering is the first thing that anyone has to look at when they want to optimize a JavaScript or PWA website. And people should be aware that indexing is different and rendering is different; they are not the same concept. Google Search Console is the perfect tool to go and identify whether Googlebot is rendering all the information that you are serving via JavaScript. Sometimes you might not be able to get Google Search Console access for multiple reasons: the client might not have Google Search Console configured, or for some reason getting access is delayed or is a problem. In that case, I would recommend that you use Google’s Mobile-Friendly Test tool. There also, you can take a URL and see whether the JavaScript-powered content or features are rendered properly. So these are the two specific tools that I would recommend.
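The underlying check Mercy describes can also be approximated by hand: compare the raw HTML the server returns against phrases you expect in the fully rendered page. A minimal, hypothetical sketch (the function name, markup, and phrases are illustrative, not from any tool mentioned here):

```python
# Rough proxy for "will a non-rendering bot see this content?":
# any expected phrase missing from the server-delivered HTML is likely
# injected client-side by JavaScript.

def missing_from_raw_html(raw_html: str, expected_phrases: list[str]) -> list[str]:
    """Return the phrases not present in the raw (pre-JavaScript) HTML."""
    lowered = raw_html.lower()
    return [p for p in expected_phrases if p.lower() not in lowered]

# An empty app shell, typical of client-side rendered pages.
raw = "<html><head><title>Shop</title></head><body><div id='app'></div></body></html>"
print(missing_from_raw_html(raw, ["Shop", "Add to cart"]))  # ['Add to cart']
```

This is only a first-pass signal; the URL Inspection report in Search Console or the Mobile-Friendly Test shows what Googlebot actually renders.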

Keira Davidson (05:04):

That sounds great. Just out of curiosity, can you also see it in dev tools as well, in the Performance tab?

Mercy Janaki (05:16):

So no, the performance cannot be seen through the Mobile-Friendly Test. Google Search Console is the only option where you can go and identify the search queries and the clicks that you are receiving. But the other crawling tools available in the market are also becoming sophisticated enough to solve these problems. I use Screaming Frog a lot to crawl high-volume websites, and you have an option there to specifically set JavaScript crawling and rendering.

That also gives you the right information. So there are tools available; I don’t have to specifically call out tools, as none of the SEO tools is so comprehensive that it covers each and every aspect. But my choice would be Screaming Frog and also DeepCrawl.

Keira Davidson (06:23):

Yes, that sounds super interesting. And are there any ultimate dos and don’ts when it comes to implementing this? Or are there any key considerations that really need to be thought about before making the move?

Mercy Janaki (06:45):

Definitely. That’s a great question, because most of the time, the problem that I see is that when the website is developed using a JS framework, it also comes with client-side rendering. Client-side rendering and JavaScript rendering go hand in hand. From a user perspective, you can deliver a very interactive website, you can deliver a very fast-loading website. All of these things can be taken care of for users, but when it comes to the SEO world, this is definitely going to be a problem, because only Googlebot currently has the resources to crawl JavaScript, not the other bots. Maybe they’re not sophisticated enough yet to do as much as Googlebot does.

So this is what I would say: instead of rendering the content client-side, the websites, or the developers, can think of a scenario of doing it with server-side rendering, so that the content and the basic SEO elements, like page titles, descriptions, canonical tags, meta robots tags and so on, can at least be rendered server-side. The other things, like video embeds, can happen through client-side rendering. That is what I would recommend.
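The idea above can be sketched in a few lines: build the SEO-critical head elements on the server so they exist in the initial HTML, whether or not the client ever runs JavaScript. This is an illustrative sketch, not a real framework API; all names and values are made up:

```python
# Hypothetical server-side helper: emit the SEO elements Mercy lists
# (title, description, canonical, meta robots) in the initial HTML response.

def render_seo_head(title: str, description: str, canonical: str,
                    robots: str = "index, follow") -> str:
    """Return the SEO-critical <head> fragment as server-rendered HTML."""
    return (
        f"<title>{title}</title>\n"
        f'<meta name="description" content="{description}">\n'
        f'<link rel="canonical" href="{canonical}">\n'
        f'<meta name="robots" content="{robots}">'
    )

head = render_seo_head("Blue Widgets", "Hand-made blue widgets.",
                       "https://example.com/blue-widgets")
print(head)
```

In a real setup this would come from the framework’s SSR layer (e.g. Next.js or Angular Universal) rather than string concatenation; the point is only that these tags arrive with the first byte of HTML.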

Recently Google has also been promoting dynamic rendering a lot, where you use both server-side as well as client-side rendering. To be specific, you identify what is actually crawling the website. If it is a bot coming to your website, then you serve the details server-side, so that the bot doesn’t have any issues crawling your content and your SEO-specific details like meta tags, meta robots and so on. When a user browses your website from a browser, then you render the information client-side, so that the interactivity and the fast-loading experience can happen seamlessly for the user. So these are the dos and don’ts I would recommend on a large scale.
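The dynamic-rendering routing Mercy describes boils down to a User-Agent check. A minimal sketch, assuming a hand-picked bot list (real setups typically use a prerender service and a maintained bot list, not these few tokens):

```python
# Dynamic rendering in miniature: bots get the pre-rendered (server-side)
# page, browsers get the regular client-side app. The token list below is
# illustrative and far from exhaustive.

KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "yandex", "duckduckbot", "baiduspider")

def is_bot(user_agent: str) -> bool:
    """Crude User-Agent sniff for known crawler tokens."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def choose_rendering(user_agent: str) -> str:
    # Serve server-rendered HTML to bots, the client-side bundle to people.
    return "server-side" if is_bot(user_agent) else "client-side"

print(choose_rendering("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # server-side
print(choose_rendering("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))  # client-side
```

One caveat worth knowing: the bot and the user must receive equivalent content, otherwise this crosses from dynamic rendering into cloaking.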

Keira Davidson (09:28):

That sounds great. Thank you. Are there any courses out there where you can learn about this, or is it a case of just using the typical sites for information? Like, how would a beginner go about gaining more knowledge on this?

Mercy Janaki (09:49):

Sure. I actually went into a situation where I had to learn about it, and then I learned about it. A website came in for optimization, and I was not aware of JavaScript [inaudible 00:10:05]. Then I decided, okay, this is where I’m headed and I have to do it. There are a lot of resources available online, and I have to say this: the SEO community is one community where everyone is so happy to share their experience and their learning free of cost. I’ve never seen such sharing happen in other communities.

So you name a problem, you have a solution online. There are DIY guides from tools like Screaming Frog, and even individual experts have done that. But the first recommendation I would make is Google’s video series on JavaScript SEO. Those are lightning talks; it is not so complicated, and they make the language very easy for a layman to understand. That is what I recommend to start with. But once you get the hang of the basics, you can start reading more so that you can expand your knowledge.

Keira Davidson (11:18):

That sounds great. Yeah, I’ve been watching a couple of those videos that Google’s been putting out and they’ve been so beneficial. So I think if I keep sticking at it, then I can move on to the reading side of it and it’ll all make a bit more sense.

Mercy Janaki (11:37):

Yeah, correct.

Keira Davidson (11:40):

So, do you have like a checklist on how to implement this? Like you said, there can be things that are overlooked, or key things that need to happen to make sure it runs smoothly. Do you typically follow a checklist for this? Or how do you go about that?

Mercy Janaki (12:01):

Yeah, the first thing I would say is that before getting into SEO optimization, the first thing to check is whether your content is being rendered by the bots. If there is a problem with that, then however much optimization you do, however much content you put on the website, whether blog posts, product pages, or category pages, it is not going to help, because the bot is not going to read it. The website is the launchpad for your organic growth, and the rendering is the key. So have a check and identify whether it is all good to go. If not, all your concentration should revolve around fixing that. Once the rendering is happening, then you can gradually get into all the SEO optimization. It is not a totally different game, like doing optimization one way for an HTML website and another way for a JavaScript website. It is the same, but you have to make sure that the SEO optimization you do is visible to the bot; that is where the problem is. So the checklist, I would say, remains the same as for the other websites that you optimize, but the additional checkpoint should be: is your content rendered by any bot, not just Google, but any bot.

Keira Davidson (13:35):

That sounds great. Yeah. That’s really important to make sure that it’s being rendered at all and that bots can access it no matter whether it’s Google or Yandex, whoever it is, it’s really important.

Mercy Janaki (13:49):

Yeah, you’re right.

Keira Davidson (13:52):

So my understanding is that you’ve been working on some sites that have taken this approach. Obviously, as of May, Google is going to be rolling out their page experience update, and it’s obviously not going to take instant effect; it’s going to grow over time. Do you find there are many issues around these sites, where there are page experience issues?

Mercy Janaki (14:23):

Actually, I would say that these JavaScript and PWA sites have an edge over other websites in terms of page experience, because the main objective of using these frameworks is to improve page speed and provide the right interactivity for the users and the business. So in that way, I always see an edge for these websites in the Google rollout that is going to happen in the coming days. But CLS is probably what we should really care about: is there going to be a large shift happening? Because you are going to call different JS modules in different areas, and if you don’t do that right, the CLS gets affected. You might need to have a look around that.

And also, what happens mistakenly is that developers tend to block certain JavaScript resources for Googlebot as well (I mean, generally for bots, not just Googlebot). So we should be careful about that too, because otherwise not every element of your page might be visible to the bot. It is not related to the page experience metrics, but while talking about it, it just came to mind.
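The accidental-blocking problem is easy to check with Python’s standard-library robots.txt parser: feed it the file and ask whether a bot may fetch a given script. The robots.txt content and URLs below are made up for illustration:

```python
# Check whether a JavaScript bundle is blocked for crawlers, using the
# standard-library robots.txt parser (no network access needed).

from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /static/js/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A blocked bundle means bots may never see the content it renders.
print(parser.can_fetch("Googlebot", "https://example.com/static/js/app.js"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/products"))  # True
```

A crawler such as Screaming Frog will surface the same thing as "Blocked Resource" warnings, but this kind of spot check is handy during development.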

Keira Davidson (16:02):

That’s great. Thank you for providing that insight. Yeah, it’s really important to be aware of what developers are doing, so that no accidents occur.

Keira Davidson (16:19):

I’ve really enjoyed speaking with you today. Thank you so much for joining the TechSEO Podcast. It’s been great, and it’s highlighted to me personally how much I’ve got to learn around PWAs especially, because they’re not familiar at all, and there’s so much information out there that I really need to tap into. So I really appreciate that. Thank you.

Mercy Janaki (16:47):

Thanks, Keira, it was a pleasure talking to you, and we all learn from each other and grow together.

Keira Davidson (16:53):

Exactly. Thanks so much.

Mercy Janaki (16:55):

Thanks Keira. Appreciate it. Bye.
