# Robots.txt for Galific Solutions
# Purpose: control crawler access and optimize SEO

# Apply rules to all crawlers
User-agent: *

# Block sections that are not critical for SEO (examples)
Disallow: /admin/
Disallow: /login/
Disallow: /register/
Disallow: /update-blog-editor/
Disallow: /editor/
Disallow: /cart/
Disallow: /checkout/
Disallow: /search?

# Allow assets needed to render pages
Allow: /*.js$
Allow: /*.css$
Allow: /assets/
Allow: /images/

# Sitemap location (update if different)
Sitemap: https://www.galific.com/sitemap.xml
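
# Pattern notes: "Disallow: /search?" matches any URL beginning with /search?
# (internal search result pages with a query string); "/*.js$" and "/*.css$"
# use the "*" and "$" wildcards to match URLs ending in .js or .css, keeping
# scripts and stylesheets crawlable for page rendering. Wildcard matching is
# honored by major crawlers (e.g. Googlebot, Bingbot) but may be ignored by
# simpler bots, which treat these lines as plain path prefixes.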