What is a robots.txt file, and how do you add it to your blog?


What is a robots.txt file, and how do you add it to your blog? If you are a blogger, you won't want one small mistake to hurt your blog's search engine visibility. The robots.txt file is what tells search engines which parts of your blog to crawl and which parts to leave hidden, and it is very important for any website or blog. So friends, today we are going to give you full information about the robots.txt file, and you will learn:

  • What is a robots.txt file?
  • Why is it important for any website or blog?
  • How do you add a robots.txt file to your blog?

So friends, today's article will be very helpful for you, because most bloggers don't know about robots.txt at all, and those who do often avoid using it on their blog out of confusion. Today we are going to clear up all of that confusion, and by the end you will fully understand what this robots.txt actually is.


What is a robots.txt file?

You probably already know that a robot is a machine that follows the commands we give it and executes exactly what we ask. In the same way, robots.txt is simply a text file that carries a set of commands inside it. These commands tell search engines which parts of our blog to index and which parts to skip.

When you submit your blog's sitemap to search engines, the search engine's crawler crawls your sitemap, i.e., it scans the URLs listed in it and indexes them so they can be shown in its search results.

Before crawling your blog's sitemap, the search engine first checks whether your blog has a robots.txt file. If a robots.txt file is available on your blog, it reads that file first and indexes your blog posts accordingly.

People know search engine crawlers by many names, such as robot, bot, and spider, so don't let the terminology confuse you.

As we said earlier, a robots.txt file is just a text file in which we write a few directives. Search engines treat these directives as commands and understand from them what to index on your blog and what not to index.

Suppose you don't want your blog's category and tag pages indexed by search engines. You would write that in the robots.txt file, so that whenever a search engine like Google, Yahoo, or Bing crawls your blog, it works according to the rules you wrote in your robots.txt file.

So friends, by now you should have a clear idea of what a robots.txt file is. Now let's look at the benefits of adding one.

Benefits of adding a robots.txt file

It should be clear by now that only search engine crawlers read the robots.txt file, and they index your blog's content according to it. Adding a robots.txt file has many benefits, such as:

  • If you want to hide some posts or pages of your blog from search engines, you can do it through the robots.txt file. You might wonder which posts or pages to hide: every blog has some pages that exist only for users, i.e. visitors, such as the privacy policy, contact us, and disclaimer pages. If you want to hide all such pages from search engines, you can easily do it through the robots.txt file.
  • If you want to hide your blog posts' tag pages from search engines, you can do it through robots.txt. Doing so keeps your blog free of duplicate content issues and makes it more SEO friendly. Duplicate content means the same blog post appearing in several places. As you all know, when we use a tag in a blog post, a link is created for that tag; when someone opens that link, they see parts of the post content as excerpts, which is what causes the duplicate content problem.
  • If you want to hide some sections of your blog, such as images, categories, tags, or search results, you can easily do it through robots.txt.
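You can check rules like these without waiting for a crawler: Python's standard library ships `urllib.robotparser`, which reads a robots.txt file the same way a well-behaved crawler does. Here is a minimal sketch; the rules and the example.com URLs are hypothetical, not taken from any real blog:

```python
from urllib import robotparser

# Hypothetical rules that hide tag, category, and search pages
# while leaving normal posts crawlable.
rules = """\
User-agent: *
Disallow: /tag
Disallow: /category
Disallow: /search
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Tag and search URLs are blocked for every crawler ...
print(parser.can_fetch("*", "https://example.com/tag/seo"))        # False
print(parser.can_fetch("*", "https://example.com/search?q=blog"))  # False
# ... but an ordinary post stays crawlable.
print(parser.can_fetch("*", "https://example.com/my-first-post"))  # True
```

This is handy for testing a draft robots.txt before you publish it to your blog.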

So friends, it is best to let search engines index only those pages that benefit your blog's SEO. You must have searched for your blog on Google at some point; when you do, you may notice that the search results show page lists, images, category, and tag pages instead of your actual post content. We can easily deindex those with the help of robots.txt.

Before implementing robots.txt on your blog, it is also important to understand how it is written, so let's go through that.

Robots.txt basics

Before using robots.txt on your blog, you should know the basics of robots.txt so that you can understand what the rules you are about to add actually mean. You'll find many articles on the internet that simply hand you a robots.txt file, but once you understand what is written inside one, you can create a robots.txt file for your own blog yourself.

Robots.txt file example –

# Robot.txt file by s2b
User-agent: Mediapartners-Google
Allow: /
User-agent: *
Disallow: /2018
Disallow: /tag
Disallow: /search
Allow: /
Sitemap: https://www.smart2blogging.com/sitemap.xml

The code above is an example of a robots.txt file. You can see that User-agent, Allow, and Disallow appear in it several times. Every robots.txt file contains these three directives; seeing these three keywords is enough to tell you that a file is a robots.txt file. Now let's go through the example line by line so that you have no confusion of any kind.

# Robot.txt file by s2b – If a line in a robots.txt file starts with #, you should understand that it is only a comment, used for information, and search engine crawlers ignore it. In other words, if you put # at the start of any line in your robots.txt file, the crawler will not read that line.

User-agent: As I told you earlier, everything written in a robots.txt file is a command to search engine crawlers telling them which sections of your blog to index and which not to. The User-agent line is how you specify which crawler your commands are addressed to.

Even if we talk only about Google, it has many crawlers, each of which crawls a different section of your blog. You can see them all on Google's crawler list page.

User-agent: Mediapartners-Google – As I said, User-agent is used to address a command to a specific crawler, and Mediapartners-Google is the AdSense crawler. If I write User-agent: Mediapartners-Google in my robots.txt file, I am addressing the AdSense crawler and telling it whether or not to crawl my blog for AdSense ads; the actual command is then given through Allow or Disallow.

Allow: – If you want some content on your blog to be crawled, you permit it with Allow. As you can see in the example above:

User-agent: Mediapartners-Google
Allow: /

This means I am telling the AdSense crawler to crawl my blog for AdSense ads.
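You can see this crawler-specific matching in action with Python's `urllib.robotparser`. This is only a sketch built from the article's example; the example.com URL is hypothetical:

```python
from urllib import robotparser

# The pattern from the example above: the AdSense crawler
# (Mediapartners-Google) may fetch everything, while all other
# crawlers must skip /tag URLs.
rules = """\
User-agent: Mediapartners-Google
Allow: /

User-agent: *
Disallow: /tag
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# The AdSense crawler matches its own entry, so /tag is allowed for it.
print(parser.can_fetch("Mediapartners-Google", "https://example.com/tag/seo"))  # True
# Any other crawler falls back to the * entry, which blocks /tag.
print(parser.can_fetch("Googlebot", "https://example.com/tag/seo"))             # False
```

In other words, a crawler obeys the most specific User-agent block that names it, and only falls back to the * block when no named block applies.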

Disallow: – If you want to hide some sections of your blog from search engine crawlers, or you don't want some sections of your blog indexed in search engines, you use the Disallow command.

User-agent: * – As I told you, User-agent is used to address a command to a particular crawler, but if you want to address the crawlers of all search engines, such as Google, Yahoo, and Bing, at once, you use *; that is, you write User-agent: *

Sitemap – Finally, we add our blog's sitemap URL to the robots.txt file.

NOTE – Friends, by now you should fully understand how commands are given to crawlers through a robots.txt file. From the example above, one thing should be clear: we give commands using only Allow and Disallow. To understand this even better, try working through the robots.txt below.

User-agent: *
Disallow: /2017
Disallow: /tag
Disallow: /search
Allow: /

From the robots.txt above, you can clearly tell that these commands are for all search engine crawlers, because we have used User-agent: *. Through Allow and Disallow we are telling every search engine crawler which sections of the blog to index and which to skip.

Disallow: /2017 – If you built your blog on Blogger, you'll know that every blog post you published in 2017 has the year in its post URL. If you write Disallow: /2017 in your robots.txt file, search engine crawlers will not index any blog post whose URL contains /2017.

Disallow: /tag – If your blog runs on WordPress, you surely use tags in your blog posts. But have you ever noticed that every tag you add to a post creates its own link? If you want to keep all those tag URLs from being indexed in search engines, write Disallow: /tag in your robots.txt file.

Disallow: /search – If your blog has a search box that visitors use to search for anything on your blog, then every time a visitor types something and searches through it, a search-results URL is created. To keep those URLs from being indexed in search engines, we use Disallow: /search.

Allow: / – After disallowing those sections of your blog, Allow: / tells search engines to index every section of the blog apart from what you disallowed.
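To confirm that the file above really behaves as described, you can feed it to Python's `urllib.robotparser` and probe a few URLs. The paths below are hypothetical examples, not real blog URLs:

```python
from urllib import robotparser

# The full example file explained above.
rules = """\
User-agent: *
Disallow: /2017
Disallow: /tag
Disallow: /search
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# 2017 archive posts, tag pages, and search results are blocked;
# everything else falls through to Allow: /.
for path in ("/2017/05/old-post.html", "/tag/blogging",
             "/search?q=seo", "/my-new-post"):
    print(path, parser.can_fetch("*", path))
# /2017/05/old-post.html False
# /tag/blogging False
# /search?q=seo False
# /my-new-post True
```

Running a quick check like this before saving your robots.txt is a cheap way to avoid accidentally blocking your whole blog.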

So friends, now you fully understand how a robots.txt file is written and how to read any robots.txt file. Now let's see how to add a robots.txt file to your blog.

How to add a robots.txt file

Note – If you haven't added a robots.txt file to your blog, there is no need to worry, because Blogger automatically serves a suitable default robots.txt file for your blog.

But if you still want to hide some of your blog's content from search engine crawlers, you can add a robots.txt file to your blog manually.

First, log in to your Blogger blog and go to Settings >> Search preferences.

1. Enable Custom robots.txt by selecting Yes.

2. Write your robots.txt rules.

3. Click Save changes.

Conclusion

Friends, today you learned about the robots.txt file, and you also saw that it simply hides selected parts of your blog from search engines. Overall, a robots.txt file controls what from a blog or website gets indexed in search engines, and if your blog has content you don't want indexed, you can take the help of robots.txt.

So friends, if you liked today's article, or if you want to ask us anything, do leave a comment so we can solve all your problems. HAPPY BLOGGING

If you found this article helpful, do share it on your social networks 🤓 THANKS
