How to serve robots.txt for a Vue app


Solution 1

If I am assuming correctly, you are building your app with the npm run build command from the webpack template, which creates a /dist folder that you then deploy to Firebase. If that is the case, you can simply add a robots.txt file to that dist folder next to index.html. That should work.
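
For reference, a minimal robots.txt that lets every crawler index the whole site looks like this (the Sitemap line is optional and the URL is only a placeholder):

User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml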

However, if better SEO is your aim, consider prerendering the page or using Server-Side Rendering, depending on the complexity of your application.

Solution 2

The Vue CLI (v3 and later) build command copies anything in public/ to your final dist/, so use the public/ folder for any additional files that you want in your final distribution.
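
For example, with the standard Vue CLI project layout (the tree below is only illustrative), a file placed at public/robots.txt is copied as-is to dist/robots.txt by npm run build and is therefore served from the root of your site:

my-app/
  public/
    index.html
    robots.txt      <- place the file here
  dist/             <- generated by npm run build
    index.html
    robots.txt      <- copied verbatim, served at /robots.txt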

Solution 3

MEVN case

MEVN stands for MongoDB, Express, Vue.js and Node.js.

If you have Vue.js as the frontend and Node.js as the backend and RESTful API server, you can put your robots.txt in the Vue.js /static/ folder (in the webpack template it sits at the project root, alongside src/ and config/, and is copied verbatim into dist/static/ on every build).

Then you can simply configure Express routing as follows to serve the robots.txt file:

app.use('/robots.txt', express.static(path.join(__dirname, 'dist/static/robots.txt')));

(Note: the dist folder is regenerated every time you build your Vue.js project with npm run build, so this approach frees you from having to add the robots.txt file to the dist folder after every build.)
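
For readers wondering where app comes from in the snippet above, here is a minimal sketch of the surrounding Express server; the file name server.js, the port, and the dist/ paths are assumptions based on the webpack-template layout discussed above:

// server.js -- minimal sketch of the Express setup around the snippet above
const express = require('express');
const path = require('path');

const app = express();

// Serve robots.txt from the build output (the webpack template copies the
// source static/ folder into dist/static/ on every build)
app.use('/robots.txt', express.static(path.join(__dirname, 'dist/static/robots.txt')));

// Serve the rest of the built Vue app
app.use(express.static(path.join(__dirname, 'dist')));

app.listen(3000, () => console.log('Listening on http://localhost:3000'));

With this in place, GET /robots.txt returns the file straight from the build output, without it ever being copied into dist/ by hand.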


Comments

  • Overdrivr
    Overdrivr almost 2 years

    I use vue-loader to build my Vue app using webpack. My application is served by Firebase.

    For SEO purposes, I need to serve the file robots.txt at the root of my application (GET /robots.txt).

    How can I configure webpack/vue-loader to serve this file?

    This is my current webpack base config, ./config/index.js:

    // see http://vuejs-templates.github.io/webpack for documentation.
    var path = require('path')
    
    module.exports = {
      build: {
        env: require('./prod.env'),
        index: path.resolve(__dirname, '../dist/index.html'),
        assetsRoot: path.resolve(__dirname, '../dist'),
        assetsSubDirectory: 'static',
        assetsPublicPath: '/',
        productionSourceMap: true,
        // Gzip off by default as many popular static hosts such as
        // Surge or Netlify already gzip all static assets for you.
        // Before setting to `true`, make sure to:
        // npm install --save-dev compression-webpack-plugin
        productionGzip: false,
        productionGzipExtensions: ['js', 'css'],
        // Run the build command with an extra argument to
        // View the bundle analyzer report after build finishes:
        // `npm run build --report`
        // Set to `true` or `false` to always turn it on or off
        bundleAnalyzerReport: process.env.npm_config_report
      },
      dev: {
        env: require('./dev.env'),
        port: 8080,
        autoOpenBrowser: true,
        assetsSubDirectory: 'static',
        assetsPublicPath: '/',
        proxyTable: {},
        // CSS Sourcemaps off by default because relative paths are "buggy"
        // with this option, according to the CSS-Loader README
        // (https://github.com/webpack/css-loader#sourcemaps)
        // In our experience, they generally work as expected,
        // just be aware of this issue when enabling this option.
        cssSourceMap: false
      }
    }
    
  • Overdrivr
    Overdrivr about 6 years
    Well, that could work, but I am keeping this folder out of version control for now. That's a good point, though: I could also modify my deploy script to copy robots.txt to the root. I'm leaving this question open for a few days out of curiosity; if nothing comes up, I'll accept your answer.
  • Zakir Jaafar
    Zakir Jaafar almost 6 years
    Using firebase deploy, you can use a predeploy script to move your robots.txt into the dist dir, so when you deploy you'll have your robots.txt in place. I added mine at the same level as the firebase.json file. Just add a predeploy line to your firebase.json that looks like this: "predeploy": "./addrobots.sh", and then add a shell script (addrobots.sh) that does "cp robots.txt dist". Now you can keep your robots.txt under version control. (A sketch of the full firebase.json and script is shown after these comments.)
  • James Watkins
    James Watkins over 5 years
    What exactly is the variable "app" in this case?
  • Yuci
    Yuci over 5 years
    const express = require('express'); const app = express(); (see the Hello World example: expressjs.com/en/starter/hello-world.html)
  • Bram
    Bram about 5 years
    VueJS v3 build command copies anything in /public to your final dist/. So the actual answer is: use the public/ folder.
  • Mandeep Singh
    Mandeep Singh over 4 years
    I don't understand why this answer has been downvoted. I believe this is the simple and correct answer
  • Tony O'Hagan
    Tony O'Hagan over 4 years
    Except that they are hosting on Firebase, not Express, so this answer is incorrect.
  • mhjb
    mhjb over 2 years
    With Netlify I ended up altering my package.json thusly: "build:prod": "vue-cli-service build && mv dist/robots-production.txt dist/robots.txt", "build:staging": "vue-cli-service build --mode staging && mv dist/robots-staging.txt dist/robots.txt"
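
For the predeploy approach described in Zakir Jaafar's comment above, the two pieces might look like this. This is only a sketch: the "ignore" entries are the Firebase CLI defaults, and the dist path assumes the webpack-template build output.

firebase.json:

{
  "hosting": {
    "public": "dist",
    "predeploy": "./addrobots.sh",
    "ignore": ["firebase.json", "**/.*", "**/node_modules/**"]
  }
}

addrobots.sh:

#!/bin/sh
# Copy the version-controlled robots.txt into the build output before each deploy
cp robots.txt dist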