In this article, we will see how to automatically add a robots.txt file to a Nuxt.js application when it is built or generated, before deployment.
Before going through the steps to add robots.txt to a Nuxt app, let us first look at what a robots.txt file is and why it matters for a server-side rendered or statically generated site.
What is a Robots.txt file?
Robots.txt, also known as the robots exclusion standard (or robots exclusion protocol), is a way for a website to communicate with web crawlers and other web robots. It tells crawlers which areas or pages of the site may be crawled, and which pages should not be scanned or crawled for indexing.
How does Robots.txt work?
Search engines send out small programs called robots or spiders to crawl your website and fetch your pages and their information for indexing. The robots.txt file tells these crawler bots which pages to scan and index and which to skip.
You specify this with the Allow and Disallow directives in your website's robots.txt file. For example:
User-agent: *
Disallow: /admin
Allow: /
As you can see in the code above, we have told web crawlers not to index our admin page and allowed the rest of the pages to be indexed by the search engine.
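The same file can also hold separate sections for individual crawlers and point to a sitemap. Here is a sketch using the standard robots.txt syntax (the bot name, paths, and sitemap URL are placeholders for illustration):

```
# Rules that apply only to Google's crawler
User-agent: Googlebot
Disallow: /drafts

# Rules for every other crawler
User-agent: *
Disallow: /admin
Allow: /

# Optional: tell crawlers where your sitemap lives
Sitemap: https://example.com/sitemap.xml
```

Crawlers pick the most specific User-agent section that matches them and ignore the rest.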
Add Robots.txt to your Nuxt.js application automatically
To add the robots.txt file to our Nuxt.js application, we will use a Node module that adds the file automatically when the website is built.
The npm package is @nuxtjs/robots. This package injects a middleware that generates the robots.txt file.
Follow the steps below to install the package and add a robots.txt file to your Nuxt project.
Step 1: Install @nuxtjs/robots npm package
Add the package to your project by running this command:
yarn add @nuxtjs/robots # or npm install @nuxtjs/robots
This will add the robots module to your Nuxt project.
Step 2: Setup the module in nuxt config file
Go to your nuxt.config.js file and add the following lines to your modules section:
export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    /* module options */
  }
}
Step 3: Add module options in the config file
In the module options section, you can set the UserAgent, Disallow, and Allow options for your robots.txt file. For example:
export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    UserAgent: '*',
    Disallow: '/admin',
    Allow: '/'
  }
}
This configuration allows crawlers to index all pages of the website except the admin pages.
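If you need different rules for different crawlers, the @nuxtjs/robots module also accepts an array of option objects, one per rule block. A minimal sketch, assuming the array form supported by the module (the Googlebot section and the /drafts path are illustrative, not part of the original example):

```js
export default {
  modules: ['@nuxtjs/robots'],
  robots: [
    {
      UserAgent: 'Googlebot',
      Disallow: '/drafts' // hypothetical path for illustration
    },
    {
      UserAgent: '*',
      Disallow: '/admin',
      Allow: '/'
    }
  ]
}
```

Each object in the array becomes its own User-agent block in the generated file.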
Once done, restart your project. When you generate or build the project, the file will be created automatically, and you can find it in your dist folder.
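With the module options shown above, the generated dist/robots.txt should contain something along these lines (the exact ordering may vary by module version):

```
User-agent: *
Disallow: /admin
Allow: /
```

You can verify it locally by opening /robots.txt in the browser while the dev server is running.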
Related Topics:
Add custom static 404 error page in Nuxt