What Does a Robots.txt File Do in SEO?
What Is a Robots.txt File?
Robots.txt is a plain text file stored in a website's root directory (for example, at https://example.com/robots.txt). It tells search engine crawlers (bots) which pages or sections of the website they may crawl and which they may not.
Why Is Robots.txt Important for SEO?
- Crawling Control – You can guide search engine bots toward the pages that should be crawled and indexed and away from those that should not.
- Avoiding Duplicate Content – If a website has duplicate content, robots.txt can be used to keep crawlers away from the redundant copies.
- Reducing Server Load – You can block pages that do not need to be crawled, which reduces the load on your server.
- Hiding Sensitive Pages – You can keep confidential or private pages from showing up in search engines (note that robots.txt is publicly readable, so it should not be treated as a security mechanism).
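Well-behaved crawlers implement exactly this check before fetching a URL. As a minimal sketch of how the rules are evaluated, Python's built-in urllib.robotparser can test a set of rules against a URL (the example.com URLs and the /admin/ rule below are placeholders, not taken from any real site):

```python
from urllib.robotparser import RobotFileParser

# Parse hypothetical rules from a list of lines instead of
# fetching a live robots.txt file.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# URLs under the blocked folder are rejected; everything else is allowed.
print(rp.can_fetch("*", "https://example.com/admin/login.html"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post.html"))    # True
```

This is the same logic a search engine bot applies: it downloads your robots.txt once, then checks each URL against the matching User-agent group before crawling it.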
Practical Robots.txt Examples
1. Allowing All Crawlers to Crawl the Entire Website
Explanation:
- User-agent: * → applies to all search engine bots.
- Disallow: → nothing is blocked, so the entire website is accessible.
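Putting the two directives above together, the file for this case is:

```
User-agent: *
Disallow:
```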
2. Blocking All Crawlers from the Entire Website
Explanation:
- Disallow: / → the slash (/) blocks the entire website.
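The rule described above looks like this:

```
User-agent: *
Disallow: /
```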
3. Blocking a Particular Page
Explanation:
- Disallow: /private-page.html → only this one page is blocked, so search engines will not crawl or index it.
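For the page named above, the file would be:

```
User-agent: *
Disallow: /private-page.html
```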
4. Blocking a Folder
Explanation:
- Disallow: /admin/ → the /admin/ folder and every file inside it will not be crawled.
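For the folder named above, the file would be:

```
User-agent: *
Disallow: /admin/
```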
5. Blocking Only Googlebot
Explanation:
- This blocks only Google's bot (Googlebot) from accessing the website; other crawlers are unaffected.
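Because the User-agent line names Googlebot specifically instead of *, the rule applies only to it:

```
User-agent: Googlebot
Disallow: /
```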
6. Declaring the Sitemap URL (an SEO Best Practice)
Explanation:
- The Sitemap: directive tells search engines where the website's XML sitemap is located, which helps with indexing.
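A typical directive looks like this (example.com is a placeholder; use your own site's full sitemap URL):

```
Sitemap: https://example.com/sitemap.xml
```

The Sitemap line is independent of any User-agent group, so it can appear anywhere in the file.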