Rickey Tom
This has to be a very common question, but my search did not come up with an
answer.
I need to set an expiration time for a cookie. In .NET, it seems that
server-side code is used to set the expiration date of the cookie. More
specifically, I've seen several examples (from the Microsoft site) where the
cookie information, including the expiration date, is set on the server. I
wonder how this can work if the client and server do not have the same
clock. For example, if the expiration time is 20 minutes and the client's
clock is 12 hours ahead, then as soon as the response reaches the client,
the cookie has already expired and is no longer honored.
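For reference, this is roughly the pattern I've seen in the examples (the
cookie name and value here are just made up for illustration):

    // Inside an ASP.NET page or handler (System.Web):
    // the expiry is computed from the *server's* clock.
    HttpCookie cookie = new HttpCookie("myCookie", "abc123");
    cookie.Expires = DateTime.UtcNow.AddMinutes(20);   // 20 minutes from server time
    Response.Cookies.Add(cookie);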
It seems to me that the server would need to take the client's time into
account when the expiry time is set.
I noticed that in some documentation the expires attribute is supposed to
use GMT. But if the user plays with the date on their machine, the cookie
may be prevented from expiring.
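As far as I can tell, what actually goes over the wire is a header along
these lines (the date is made up):

    Set-Cookie: myCookie=abc123; expires=Tue, 19-Jan-2038 03:14:07 GMT; path=/

so the absolute GMT timestamp gets compared against whatever the browser
thinks the current time is.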
What is wrong with my logic?
Thanks
Rick