Low-code and no-code are revolutionizing website creation. However, projects developed with these tools are encountering serious Google indexing problems. Here are our field observations and recommendations.
In recent months, we have observed a worrying trend at Astrak: more and more sites built with vibe coding tools struggle to get indexed correctly on Google. Whether the projects were created with Lovable, Claude Code, Bolt, or similar platforms, the feedback from the field is unambiguous.
Indexing takes weeks, sometimes months. Some sites simply never get indexed. And when they do manage to get indexed, they are often de-indexed very quickly.
We have identified two major problems that explain this situation, and we want to share our findings with the community so that others can avoid the same pitfalls.
Problem #1: Lovable
JavaScript frameworks prevent Google from properly rendering and crawling pages. The result: weeks of waiting for partial indexing, or no indexing at all.
Problem #2: Claude Code
The ability to generate thousands of pages in a few hours triggers spam signals for Google. Indexing is refused, even when forced through Search Console.
Lovable and the client-side JavaScript trap
Lovable has become one of the most popular vibe coding tools for quickly creating websites and applications. The problem? Almost all projects generated by Lovable are built on heavyweight JavaScript frameworks (React, Next.js in SPA mode, etc.).
Yet Google has always had difficulty with client-side JavaScript rendering. Even though Googlebot technically knows how to execute JavaScript, the process is slow, resource-intensive, and above all not a priority for the search engine.
Many Lovable website creators report the same problem: indexing that takes weeks, pages that remain invisible on Google, and content that the engine simply cannot read correctly. This is a finding shared by a large portion of the community.
Why is Google struggling with JavaScript?
When a site is built in "full JS," Google must go through an additional step called rendering. Instead of simply reading the raw HTML of the page (which is almost instantaneous), Googlebot must:
1. Download the initial HTML (often empty). With JS frameworks, the initial HTML contains almost no textual content: it's just a shell that loads the scripts.
2. Queue the page for rendering. Google places the page in a dedicated queue for JavaScript rendering. This queue can take days or even weeks to process.
3. Execute the JavaScript and read the content. Googlebot finally executes the JS to get the actual page content. But if anything blocks it (an external API, slow loading, a JS error), the content is lost.

The result: delayed or non-existent indexing. Many pages never make it through this stage successfully, especially on new sites without authority.
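You can see the first step for yourself by fetching a page the way Googlebot initially does, without executing any JavaScript, and measuring how much readable text the raw HTML actually contains. Here is a minimal sketch in TypeScript (Node 18+ for the built-in fetch; the URL is a placeholder):

```typescript
// Fetch the raw HTML of a page (no JavaScript execution, like Googlebot's
// first pass) and measure how much visible text it contains.
async function checkRawHtml(url: string): Promise<void> {
  const res = await fetch(url, {
    headers: { "User-Agent": "Mozilla/5.0 (compatible; crawl-check/1.0)" },
  });
  const html = await res.text();

  // Strip scripts, styles, and tags to approximate the text a crawler
  // can read without rendering the page.
  const visibleText = html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();

  console.log(`Raw HTML size: ${html.length} chars`);
  console.log(`Visible text without JS: ${visibleText.length} chars`);
  if (visibleText.length < 500) {
    console.warn("Warning: this page looks like an empty SPA shell to crawlers.");
  }
}

checkRawHtml("https://example.com/");
```

If a page returns only a few dozen characters of visible text here, Google has no choice but to queue it for rendering before it can index anything.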
Vibe coding with tools like Lovable amplifies this problem because the developers (often non-technical) are unaware of these technical constraints. They create visually stunning sites that are technically invisible to Google.
Claude Code and the risk of large-scale spam
The second problem we observe is different, but equally critical. It primarily concerns projects developed with Claude Code (and other AI-assisted coding tools) for programmatic SEO.
The power of these tools makes it possible to generate and publish hundreds, if not thousands of pages in record time. And that's precisely where it goes wrong.
We supported two programmatic projects with very large page volumes. Neither of them ever managed to stay indexed. We tried forcing indexing via Search Console and via third-party tools (like Rapid Indexer). Result: the pages were indexed briefly, then de-indexed very quickly. Google clearly identified these projects as spam.
Why does Google detect this as spam?
Google has sophisticated systems for detecting sites that publish at an unnatural rate. When a new domain suddenly publishes hundreds of pages, several warning signals are triggered.
There's a huge difference between redoing your homepage with Claude Code and publishing 5,000 pages on day one of a brand-new site. It's truly comparing apples and oranges. The publication cadence needs to stay natural relative to the team's size and the site's history.
The problem isn't the tool itself. Claude Code is a fantastic tool that we ourselves use daily at Astrak. The problem is unregulated use of this tool to mass-publish without human supervision, without a natural cadence, and without the quality signals that Google expects.
Why traditional CMSs remain indispensable
An often underestimated aspect in the vibe coding community: Google knows and trusts traditional CMSs. WordPress, Shopify, Webflow... these platforms are recognized by the search engine, which knows exactly how to crawl and interpret them.
When Google detects that a site runs on WordPress, it knows the site is less likely to be automatically generated spam. It's not an absolute guarantee, but it's a trust signal that pure HTML/JS sites without a recognized CMS don't have.
If you're using Claude Code for web development, consider pairing it with a recognized CMS like WordPress. You get the power of AI for development while retaining the trust signals Google expects from a legitimate site.
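In practice, this pairing can stay simple: let the AI generate content, but push it into WordPress as a draft through the standard REST API so a human reviews it before anything goes live. A minimal sketch in TypeScript, assuming a WordPress Application Password has been set up (the site URL and credentials are placeholders):

```typescript
// Push an AI-generated article into WordPress as a *draft* via the
// standard REST API, so a human reviews and publishes it manually.
const WP_URL = "https://example.com"; // placeholder site
const AUTH = Buffer.from("editor:application-password").toString("base64");

async function createDraft(title: string, content: string): Promise<void> {
  const res = await fetch(`${WP_URL}/wp-json/wp/v2/posts`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Basic ${AUTH}`,
    },
    // status: "draft" is the key part: nothing goes live without a human.
    body: JSON.stringify({ title, content, status: "draft" }),
  });
  if (!res.ok) throw new Error(`WordPress API error: ${res.status}`);
  const post = (await res.json()) as { id: number };
  console.log(`Draft #${post.id} created, awaiting human review.`);
}

createDraft("My AI-assisted article", "<p>Draft content to be reviewed…</p>");
```

This way you keep the CMS core that Google recognizes, while the AI stays a production tool rather than a publishing pipeline.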
Human-centered: our approach at Astrak
At Astrak, let's be honest: we use Claude extensively. For content production, design, interactive widgets, images, almost everything. AI is an incredible productivity lever that we integrate at every stage of our work.
But the fundamental difference between our approach and the "full auto" approach many adopt is that every piece of content is managed, proofread, rewritten, and published by a human. And it's this human layer that makes all the difference in SEO.
Why humans remain indispensable
| Full automation | Human-in-the-loop |
|---|---|
| ✗ Massive publication without control | ✓ Natural publishing cadence |
| ✗ Generic and repetitive content | ✓ Custom and unique content |
| ✗ No SERP analysis | ✓ SERP analysis before publication |
| ✗ Spam signals triggered | ✓ Enhanced quality signals |
| ✗ No brand customization | ✓ Brand identity preserved |
| ✗ Risk of de-indexing | ✓ Stable and sustainable indexing |
Mix marketing and SEO: don't be 100% "SEO first"
All the problems we've just described mainly concern sites that are 100% SEO-oriented, with no other acquisition channels. Google detects this better and better, and tends to penalize sites whose sole reason for existing is to rank in its results.
The good news is that these indexing problems are considerably reduced when the site benefits from other signals of legitimacy.
If you have a YouTube channel that drives traffic to your articles, active social media, regular returning users, a recognized brand in your industry… all of these signals help Google understand that your site is legitimate, even if it was built with vibe coding tools.
Google likes "SEO first" sites less and less, and tends to neglect them, penalize them, and make their indexing and ranking more difficult. If your only source of traffic is SEO, be extra vigilant about the quality and frequency of your publications.
Our concrete recommendations for vibe coding SEO
These problems are not insurmountable. Even on a Lovable or Claude Code project, there are solutions to significantly improve your indexing. Here are our recommendations based on our field experience at Astrak:
Prioritize a well-known CMS as your foundation
Use WordPress, Shopify, or Webflow as the foundation for your site. You can then use Claude Code to develop widgets, custom pages, or integrated tools, while maintaining the Google-recognized CMS core.
Maintain a natural posting cadence
Never publish hundreds of pages on the same day. Adopt a pace consistent with your team's size: a few articles per week for a small team, not 500 pages at once.
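If your pipeline already has a backlog of ready pages, the cadence can even be enforced programmatically rather than left to discipline. A toy sketch in TypeScript, assuming a list of page slugs waiting to be published (the volumes and dates are purely illustrative):

```typescript
// Spread a backlog of ready pages over several weeks instead of
// publishing everything at once. Numbers are illustrative.
function buildSchedule(slugs: string[], perWeek: number, start: Date): Map<string, Date> {
  const schedule = new Map<string, Date>();
  slugs.forEach((slug, i) => {
    const week = Math.floor(i / perWeek);
    const dayInWeek = i % perWeek;
    const date = new Date(start);
    // Space posts a few days apart within each week.
    date.setDate(date.getDate() + week * 7 + dayInWeek * Math.floor(7 / perWeek));
    schedule.set(slug, date);
  });
  return schedule;
}

// Example: 24 pages at 3 per week take 8 weeks, not one afternoon.
const schedule = buildSchedule(
  Array.from({ length: 24 }, (_, i) => `page-${i + 1}`),
  3,
  new Date(),
);
for (const [slug, date] of schedule) {
  console.log(`${slug} -> ${date.toDateString()}`);
}
```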
Have each piece of content proofread and rewritten by a human
Even if AI produces quality content, human proofreading provides the personalization, SERP analysis, and fine-tuning that Google values. This is the basis of our method at Astrak.
Check the technical fundamentals
Ensure your sitemap is correct, your robots.txt is properly configured, your site is crawlable, and there are no blocking JavaScript issues. Even on a Lovable or Claude Code project, these points can be improved.
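Two of these fundamentals are easy to sanity-check yourself. A minimal sketch in TypeScript (Node 18+; the origin is a placeholder, and it assumes your sitemap lives at the conventional /sitemap.xml path):

```typescript
// Quick sanity check of two crawl fundamentals: robots.txt and sitemap.xml.
async function checkCrawlBasics(origin: string): Promise<void> {
  const robots = await fetch(`${origin}/robots.txt`);
  console.log(`robots.txt -> HTTP ${robots.status}`);
  if (robots.ok) {
    const text = await robots.text();
    // A bare "Disallow: /" rule blocks the entire site from crawling.
    if (/^Disallow:\s*\/\s*$/m.test(text)) {
      console.warn("Warning: robots.txt seems to block the entire site.");
    }
  }

  const sitemap = await fetch(`${origin}/sitemap.xml`);
  console.log(`sitemap.xml -> HTTP ${sitemap.status}`);
  if (sitemap.ok) {
    const xml = await sitemap.text();
    const urls = xml.match(/<loc>/g)?.length ?? 0;
    console.log(`Sitemap references ${urls} URL(s).`);
  }
}

checkCrawlBasics("https://example.com");
```

A 404 on either file, or a site-wide Disallow rule, is exactly the kind of basic blocker that keeps otherwise fine vibe-coded sites out of the index.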
Diversify your traffic sources
Don't bet everything on SEO. Develop your brand, be present on social media, create video content, collect customer reviews. These signals strengthen your site's legitimacy in Google's eyes.
Have your site audited by an SEO expert
If you're already in a dead-end situation, an in-depth technical audit can precisely identify what's preventing indexing and propose solutions tailored to your specific case.
Your vibe-coded site isn't indexing?
Schedule an appointment with our team for a free diagnosis. We'll identify your roadblocks and propose a concrete action plan to unblock your indexing.
Schedule a free appointment
