A complex situation here!
The situation now: We have a main server doing only its own work, and the data on it changes every second. We need a web widget (HTML data) to share with other websites. The widget must be refreshed every minute, its underlying data changes every second, and every visitor of those other websites must see that information. We can't handle such high traffic directly: the main server needs to stay online 24/7, and we don't want all those visitors connecting to it every minute. I'm talking about a million impressions per month.
The solution we're working on: Get several hosting plans. Each hosting account stores the HTML that will be shown to visitors, and runs a cron job every minute that fetches the HTML from our main server and stores it until the next run. That's how we move the traffic off our main server. Then comes the part where the websites' visitors load the HTML stored on our hostings. The code below connects to the first hosting server; if it doesn't answer within a timeout, it connects to the second one, and loops until one of them returns the HTML data. Of course, if they ever reach 100% load, we'll simply add another hosting.
<script>
// Hosting mirrors that cache the widget HTML.
var server_1 = 'http://hostingserver_one.com/';
var server_2 = 'http://hostingserver_two.com/';

var wait_for_response = 5000; // ms to wait before treating a server as down
var one_minute = 60 * 1000;   // normal refresh interval
var right_away = 1;           // retry almost immediately after a failover

var current_refresh_server = server_1;

function ajaxRequestInfo() {
    $.ajax({
        type: 'GET',
        // Note: cross-domain requests like this need CORS (or JSONP)
        // enabled on the hosting servers.
        url: current_refresh_server,
        timeout: wait_for_response,
        success: function(data) {
            $('.data_for_refresh').html(data);
            window.setTimeout(ajaxRequestInfo, one_minute);
        },
        error: function() {
            // Failed or timed out: switch mirrors and retry immediately.
            // Scheduling the next request here and in `complete` (as the
            // original did) would double the polling loop on every error,
            // because jQuery fires `complete` after both success and error.
            changeRefreshServer();
            window.setTimeout(ajaxRequestInfo, right_away);
        }
    });
}

// Toggle between the two defined mirrors. (The original if/else chain
// referenced an undefined server_3, which would throw a ReferenceError.)
function changeRefreshServer() {
    current_refresh_server =
        (current_refresh_server === server_1) ? server_2 : server_1;
}

$(document).ready(function() {
    ajaxRequestInfo();
});
</script>
The question is: Is this the best way to do it?! If not, what's better? I'm sure many of you have been through this situation already, but it's my first time :)
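One small refactor worth considering for the failover code above: keeping the mirrors in an array instead of a hard-coded if/else chain, so adding a new hosting account is a one-line change. A minimal sketch (the server URLs are the ones from the question):

```javascript
// Array-based failover rotation: adding a mirror is just another
// entry in this list.
var servers = [
    'http://hostingserver_one.com/',
    'http://hostingserver_two.com/'
];
var currentIndex = 0;

function currentServer() {
    return servers[currentIndex];
}

function changeRefreshServer() {
    // Move to the next mirror, wrapping around at the end of the list.
    currentIndex = (currentIndex + 1) % servers.length;
}
```

The rest of the polling code stays the same; it just calls `currentServer()` instead of reading a global URL variable.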
Talking about one million impressions served from plain HTML files looks strange to me; of course you can handle it more precisely.
Two things you need to consider are load balancing and needing more memory than one web server can provide. If your account is under heavy load, you should move to another server instead of putting up with the slowness of a single one. On the other hand, if you want to host many applications on a single server, you may need more memory than one server can provide.
Basically, I installed a standard WordPress on the server that does the ProxyPass. Then I configured the site and installed the extensions and templates. I set up a SQL database on this server (in our case it is also the proxy), but it would be ideal to isolate it on its own server or use an external database service like Xeround. In my WordPress, Apache, MySQL and memcached configurations, I always specify the internal private-network IPs, since all my servers at iWeb are Smart Servers. This eliminates traffic on the public network and makes the setup much safer.
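The ProxyPass setup described above might look roughly like the following Apache fragment. This is a sketch under assumptions, not the answerer's actual config: it assumes Apache with mod_proxy/mod_proxy_http enabled, and the hostname and the private-network IP of the backend WordPress box are placeholders.

```apache
# Assumed vhost fragment on the proxy server.
# 10.0.0.5 is a placeholder private-network IP for the backend box.
<VirtualHost *:80>
    ServerName example.com
    ProxyPreserveHost On
    ProxyPass        / http://10.0.0.5/
    ProxyPassReverse / http://10.0.0.5/
</VirtualHost>
```

Keeping the backend address on the private network is what avoids the public-network traffic the answer mentions.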
I read about it in an article; you can find more ideas there.
ultraking is the software you should look into.
Now, in your case you have multiple HTML files, and the client goes server by server to find the user's particular file. Searching across all the servers like that is not a good idea. Instead, build one JSON object holding the information about which server contains which file. The scenario then becomes: the user hits your website (which holds that index object), the user requests a file, you filter the JSON object, and you hit the one server that has it. This will reduce the traffic to your servers much more. Thanks.
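The index idea above can be sketched as a plain lookup object. The file names here are made up for illustration; the server URLs are the ones from the question:

```javascript
// One JSON object mapping each HTML file to the server that caches it,
// so the client hits the right server directly instead of probing
// servers one by one.
var fileIndex = {
    'widget.html': 'http://hostingserver_one.com/',
    'stats.html':  'http://hostingserver_two.com/'
};

function serverFor(file) {
    // Returns the caching server for a file, or null if it is unknown.
    return fileIndex.hasOwnProperty(file) ? fileIndex[file] : null;
}
```

With an index like this, a miss (`null`) is the only case where you'd fall back to probing servers in order.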
[edit]
It all depends on the planning strategy you use for handling users. You can't handle even a single user properly if your strategy and choice of tools don't stand up to a benchmark.