On Facebook and Twitter, each event or tweet carries a relative timestamp like "23 seconds ago" or "1 hour ago". If we leave the page open for a while, the timestamp updates accordingly.
Since the user's machine may not have the same system time as the server, how can we make this dynamic timestamp accurate?
My idea is to always base it on server time. When the browser requests the page, a timestamp T1 (seconds since 1970-01-01) is rendered into an inline JavaScript variable, and the displayed label ("23 seconds ago") is calculated from T1 instead of from the local clock.
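The idea above could be sketched roughly as follows. The variable names and the label thresholds are hypothetical, just to illustrate rendering server epoch values into inline JavaScript:

```javascript
// Hypothetical values the server would render inline into the page:
const serverNow = 1295016042; // T1: server time at render, seconds since 1970-01-01
const postedAt  = 1295016026; // when the item was posted (server clock)

// Build the relative label purely from server-provided values, so the
// client's clock never affects the initial display.
function relativeLabel(diffSeconds) {
  if (diffSeconds < 60) return diffSeconds + " seconds ago";
  if (diffSeconds < 3600) return Math.floor(diffSeconds / 60) + " minutes ago";
  return Math.floor(diffSeconds / 3600) + " hours ago";
}

console.log(relativeLabel(serverNow - postedAt)); // "16 seconds ago"
```

Note this only fixes the label at page load; keeping it ticking afterwards still needs some client-side clock input, which is what the answers below address.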
I'm not sure whether this is how Facebook/Twitter do it. Is there any better idea?
This is what happens with Facebook: when someone makes a post, Facebook records the timestamp from the server's clock. When the information is displayed to the user, this timestamp is sent to them; however, the live "X seconds/minutes/hours ago" label is based on the client's clock. You can test this by opening Facebook and looking at some recent news feed items, then changing your system's clock -- you will see that the labels will soon update to match the difference based on your system clock.
However, if you refresh the page, the correct time difference will once again be displayed, even though your system clock is still wrong, thus it seems to be based on a time difference sent by the server as well.
Reasoning from that, I would say that they send the time difference from the server, then, the live labels are updated using that value along with a difference calculated using the client's clock.
Here is a detailed example of what I mean:
- An item is posted at 14 Jan 2011 14:40:26 (server time)
- A user loads the news feed at 14 Jan 2011 14:40:42 (server time)
- Facebook sends the current time difference on the server. (Difference between 14 Jan 2011 14:40:26 and 14 Jan 2011 14:40:42 is 16 seconds)
- The page loads in the client's browser. According to the client's computer, the time is 14 Jan 2011 14:47:03, and is recorded using JavaScript. (Server time is still 14 Jan 2011 14:40:42)
- The initial label is populated with the value from the server and now says "16 seconds ago"
- Time passes. The client's clock now says 14 Jan 2011 14:47:25. (Server time is now 14 Jan 2011 14:41:04)
- The client-side time difference is calculated: difference between 14 Jan 2011 14:47:03 and 14 Jan 2011 14:47:25 is 22 seconds
- The live label is updated using the server time difference, plus the client time difference (16 seconds + 22 seconds) and the label now says "38 seconds ago"
The actual difference between the initial timestamp from the server (14 Jan 2011 14:40:26) and the current server time (14 Jan 2011 14:41:04) is 38 seconds, so you can see that even though the client's clock was wrong, it was still possible to calculate an accurate time difference using this method.
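The walkthrough above can be sketched in a few lines. This is a hypothetical reconstruction of the inferred approach, not Facebook's actual code; `serverDiffSeconds` stands for the age the server sent at render time:

```javascript
// Age of the item at the moment the page was served (sent by the server).
const serverDiffSeconds = 16;
// Client clock at page load, in milliseconds. Its absolute value may be wrong,
// but that doesn't matter: we only ever use differences between two readings
// of the same client clock, so any constant offset cancels out.
const loadedAt = Date.now();

function currentAgeSeconds(nowMs) {
  const clientElapsed = Math.floor((nowMs - loadedAt) / 1000);
  return serverDiffSeconds + clientElapsed;
}

// 22 seconds after load, the label should read "38 seconds ago",
// matching the worked example (16 + 22).
console.log(currentAgeSeconds(loadedAt + 22000) + " seconds ago");
```

A timer (e.g. `setInterval`) would call `currentAgeSeconds(Date.now())` periodically to refresh the label.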
The simple answer is that all times are kept on the server: when you fetch the list of tweets, you see the difference between when each tweet was created and when it was served to you. You don't need anything on the client side.
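A minimal server-side sketch of this answer, in Node-style JavaScript (the function name and label thresholds are my own assumptions): the label is computed once at serve time and sent as plain text.

```javascript
// Compute a relative-time label entirely on the server, at serve time.
function serveTimestamp(createdAtMs, nowMs) {
  const diff = Math.floor((nowMs - createdAtMs) / 1000);
  if (diff < 60) return diff + " seconds ago";
  if (diff < 3600) return "about " + Math.floor(diff / 60) + " minutes ago";
  return "about " + Math.floor(diff / 3600) + " hours ago";
}

const createdAt = Date.parse("2011-01-14T14:40:26-08:00");
// Serving the page a bit over two hours later:
console.log(serveTimestamp(createdAt, createdAt + 7300 * 1000)); // "about 2 hours ago"
```

The trade-off is that the label is frozen at serve time and only changes on the next page load, which matches the behavior described below.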
Facebook and Twitter calculate the time on the server side, and output the timestamp as normal text. There is no client-side calculation of the times on these sites, at least not that I have observed.
If you view the source of http://twitter.com/ at the moment you'll see HTML like this:
<div class="hc-tweet">
<div class="hc-label">Recently tweeted:</div>
<div class="hc-tweet-text">You don't know pain until you've worked in a French bakery.</div>
<div class="hc-meta">about 1 hour ago</div>
</div>
If you view the source of a typical Facebook page you'll see HTML like this:
<abbr title=\"Saturday, 15 January 2011 at 09:40\" data-date=\"Fri, 14 Jan 2011 14:40:26 -0800\" class=\"timestamp\">6 hours ago<\/abbr>
Note that this Facebook HTML is actually inside a JavaScript string (that's why there is a backslash before each quote) — so even though Facebook was using JavaScript to insert the timestamp into the page, the label itself had been calculated on the server side.