I have a set of URLs. I need to know how much time it takes for each URL to load completely (by noting the start time and the end time; I am able to capture the start time).
The problem is that the URLs are completely different from each other. There is no common object that appears on every page once it has loaded completely, so I cannot use Object Cloning on a single shared object. As a result, I ended up writing different logic for each URL. I want to write a single bot that checks the load time for each and every URL.
Each URL, or more specifically the website it points to, has to be treated differently.
Due to today's fancy pre-initialization of websites (loading dummy content until the actual content arrives) and extensive JavaScript (inserting content or changing the layout), it's nigh impossible to create a script/bot that treats all URLs the same way. Think of one-page websites that load content as you scroll, for example.
That being said, have a look at this thread on how to properly wait for content to load. In short: pick an element on each website that is loaded last (or late) and use Object Cloning to 'wait' for it to appear.
This should give you a good approximation. Since website loading times also depend on your internet speed, which is variable as well, an approximation should be good enough in my opinion.
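If you ever want the same "wait for the last-loaded element" idea outside of Object Cloning, here is a minimal sketch in Python with Selenium. It is only an illustration, not the Automation Anywhere approach itself, and the URLs and selectors are hypothetical placeholders you would replace with a per-site element that appears last:

```python
import time
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Each URL gets its own "loads last" element, since there is no common object.
# These URLs and locators are hypothetical examples.
URLS = {
    "https://example.com": (By.CSS_SELECTOR, "footer"),
    "https://example.org": (By.ID, "main-content"),
}

driver = webdriver.Chrome()
for url, locator in URLS.items():
    start = time.time()
    driver.get(url)
    # Wait up to 60 seconds for the element known to appear last on this page.
    WebDriverWait(driver, 60).until(EC.visibility_of_element_located(locator))
    elapsed = time.time() - start
    print(f"{url} loaded in {elapsed:.1f} s")
driver.quit()
```

The per-URL dictionary mirrors the same compromise as the Object Cloning approach: one bot, but a site-specific "last element" configured for each URL.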
Unfortunately this doesn't directly answer your question, but I posted it anyway because I strongly doubt you will get the answer you are hoping for.