I have a Django application hosted on AWS. Recently, I turned it into a multi-tenant application using django-tenant-schemas. Locally, it runs normally: I can create my tenants and access them on my local Django development server. However, I can't get it to work on AWS.
My .conf file for NGINX looks like this:
upstream django {
    server unix:///home/ubuntu/folder_my_projct/mysite.sock; # for a file socket
}

# configuration of the server
server {
    listen 80;
    server_name ssh *.example.com.br;
    charset utf-8;

    # max upload size
    client_max_body_size 75M; # adjust to taste

    location /media {
        alias /home/ubuntu/folder_my_projct/media; # your Django project's media files
    }

    location /static {
        alias /home/ubuntu/folder_my_projct/static; # your Django project's static files
    }

    # Finally, send all non-media requests to the Django server.
    location / {
        uwsgi_pass django;
        include /home/ubuntu/folder_my_projct/uwsgi_params; # the uwsgi_params file you installed
    }
}
I pulled the project on the AWS instance and everything worked on the virtual machine (makemigrations and migrate_schemas). However, when I try to access a subdomain, it doesn't work. My only change to support the subdomains was in the file above, adding the * before the dot: *.example.com.br. I also tried using a regex (server_name ~^(?<subdomain>.+)\.example\.com\.br$;), but that didn't work either. I would really appreciate it if someone could tell me what I'm doing wrong, or whether I need to do anything else.
Change this line:

server_name ssh *.example.com.br;

to either

server_name ssh.example.com.br *.example.com.br;

or

server_name *.example.com.br;
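For example, keeping the upstream, socket, and paths from your question unchanged, a minimal sketch of the corrected server block could look like this (adjust the paths to your project):

server {
    listen 80;
    # one wildcard entry covers every tenant subdomain
    server_name *.example.com.br;
    charset utf-8;
    client_max_body_size 75M;

    location /media {
        alias /home/ubuntu/folder_my_projct/media;
    }

    location /static {
        alias /home/ubuntu/folder_my_projct/static;
    }

    location / {
        uwsgi_pass django;
        include /home/ubuntu/folder_my_projct/uwsgi_params;
    }
}

With a single wildcard entry, every tenant subdomain is routed to the same uwsgi upstream, and django-tenant-schemas resolves the tenant from the request's Host header.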
For each subdomain you'll need to create a configuration, depending on what you want to serve:
server {
    server_name ssh.example.com.br;
    ...
}

server {
    server_name blabla.example.com.br;
    ...
}

server {
    server_name *.example.com.br;
    ...
}
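Note that nginx matches an exact server_name before a wildcard, so the ssh.example.com.br and blabla.example.com.br blocks take precedence, and the *.example.com.br block acts as a catch-all for the remaining tenant subdomains. After editing the .conf, test the configuration with nginx -t and reload nginx so the new server names take effect.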