Tags: ruby-on-rails, ruby, frameworks, development-environment, production-environment

How does rails detect which environment it is in?


Out of curiosity, how does Rails detect which environment it is in when running on a server, i.e. whether it is in Production or Development?

When I run a Rails app locally on my Mac/Linux machine, it knows it's in a development environment, but when it's deployed to a remote Linux machine, it knows it's in production.

How is this so? What are the major implicit differences as far as how the app runs, which resources it uses, etc.?

Also, is the Production/Development dichotomy fixed as part of the framework, or is it possible to establish something like a "staging" environment that's for all intents and purposes the same as production but meant for testing?


Solution

  • Let me quote from The Rails 4 Way book:

    The current environment can be specified via the environment variable RAILS_ENV, which names the desired mode of operation and corresponds to an environment definition file in the config/environments folder. You can also set the environment variable RACK_ENV, or as a last resort you may rely on the default being development.
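    The lookup order the book describes can be sketched in plain Ruby. This is a minimal illustration of the precedence (RAILS_ENV first, then RACK_ENV, then the "development" default), not Rails' exact internals; `detect_env` is a hypothetical helper name.

    ```ruby
    # Sketch of the environment lookup order described above:
    # RAILS_ENV wins, then RACK_ENV, then "development" as the default.
    def detect_env(env = ENV)
      env["RAILS_ENV"] || env["RACK_ENV"] || "development"
    end

    puts detect_env({})                                                   # => development
    puts detect_env("RACK_ENV" => "production")                           # => production
    puts detect_env("RAILS_ENV" => "staging", "RACK_ENV" => "production") # => staging
    ```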

    I believe you can add your own environments by creating a new file inside the config/environments folder, then select one by setting RAILS_ENV when running the server:

    RAILS_ENV=staging rails s (or rails s -e staging), for staging or any other environment you want.
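    For the new environment file itself, a common pattern (an assumption on my part, not something the book quote specifies) is to base staging on the production settings and override only what differs:

    ```ruby
    # config/environments/staging.rb
    # Reuse the production configuration so staging mirrors it closely.
    require_relative "production"

    Rails.application.configure do
      # Example override: more verbose logging while testing against staging.
      config.log_level = :debug
    end
    ```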