In ActionScript, the Unix timestamp in milliseconds is obtainable like this:
public static function getTimeStamp():uint
{
    var now:Date = new Date();
    return now.getTime();
}
The doc clearly states the following:
getTime():Number
Returns the number of milliseconds since midnight January 1, 1970, universal time, for a Date object.
When I trace it, it returns the following:
824655597
So, 824655597 / 1000 / 60 / 60 / 24 / 365 ≈ 0.026 years. This is obviously not correct, as it should be around 39 years.
Question #1: What's wrong here?
Now, on to the PHP part: I'm trying to get the timestamp in milliseconds there as well. The microtime() function returns either a string ("0.29207800 1246365903") or a float (1246365134.01), depending on whether you pass true as its argument. Because I thought timestamps were easy, I was going to do this myself. But now that I've tried and noticed this float, and combined that with my problems in ActionScript, I really have no clue.
Question #2: how do I make it return the number of milliseconds in a Unix timestamp?
Timestamps should be so easy; I'm probably missing something. Sorry about that, and thanks in advance.
EDIT1: Answered the first question myself. See below.
EDIT2: Answered the second question myself as well. See below. I can't accept my own answers within 48 hours.
For ActionScript 3, new Date().getTime() does work. The problem in my function above is its return type: getTime() returns a Number, but I declared the function as returning uint. A millisecond timestamp (about 1.2 × 10^12 in mid-2009) is far larger than uint's maximum of 4,294,967,295, so the value is truncated modulo 2^32. That is exactly where 824655597 came from: 824655597 + 290 × 2^32 ≈ 1,246,365,171,437 ms, a sensible value for late June 2009. Declaring the return type as Number fixes it.
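For reference, here is a corrected version of the function from the question; the only change is the return type (the comment is mine):

public static function getTimeStamp():Number
{
    // getTime() returns a Number; declaring the return type as uint
    // would truncate the millisecond timestamp modulo 2^32.
    var now:Date = new Date();
    return now.getTime();
}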
In PHP you can simply call time() to get the time passed since January 1 1970 00:00:00 GMT in seconds. If you want milliseconds, just do time() * 1000, but note that this only has whole-second precision.
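For example (a minimal sketch; the variable name is mine):

// seconds since the epoch, expressed as milliseconds
$ms = time() * 1000;
echo $ms; // e.g. 1246365903000 (always a multiple of 1000)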
If you use microtime(), the second part of the string holds the whole seconds: multiply it by 1000 to get milliseconds. The first part holds the fractional second: multiply it by 1000 as well and round that. Then add the two numbers together. Voilà.
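Put together, that looks like this (a sketch; the function name getTimestampMs is my own, not a built-in):

function getTimestampMs()
{
    // microtime() returns the string "msec sec",
    // e.g. "0.29207800 1246365903"
    list($usec, $sec) = explode(' ', microtime());
    // whole seconds in ms + rounded fractional part in ms
    return $sec * 1000 + round($usec * 1000);
}

echo getTimestampMs(); // e.g. 1246365903292

As a one-liner, round(microtime(true) * 1000) gives the same result, since passing true makes microtime() return the float form directly.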