On Wed, Oct 20, 2010 at 02:46:25PM +0100, Thomas Marsh wrote:
> >
> >
> > Enter the Dec coordinates (d m s)
> >
> > +12 54 47.2
> >
> > 0.22537630556475219
> >
> > Corrected JD, Lighttime (Sec)
> >
> > 2455336.6290276516 424.48937938702886
> >
> > Difference in correction -> 0.01257 seconds, which I presume would be
> > down to UTC->TT etc. effects.
> >
>
> More likely this is the effect of observatory position, I think. The WHT
> would be a little closer to the target in this case than the centre of the
> Earth, which would add a little to the overall correction to get to the
> Sun. I don't think there is any reason for UTC vs TT vs TDB etc. to
> change the correction by much, as opposed to the absolute time. Remember
> it takes light about 0.02 secs to travel a length equal to the radius of
> the Earth.
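That last figure is easy to verify with standard constants (a
two-line Python check of my own, not from the thread):

  R_EARTH = 6.371e6    # mean Earth radius, metres
  C = 2.99792458e8     # speed of light, metres per second
  print(R_EARTH / C)   # -> about 0.0213 s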
Since we are looking at sub-second precision, it is perhaps worth
noting that the UT family of time standards (UT, UT1, UTC, BST,
etc.) is not suitable: your wall clock in the observatory probably
shows UTC. If you're not busy watching the fireworks on Hogmanay,
you will notice that UTC stops for a second every couple of years.
This matters if there are leap seconds between your observations
(and if an error of a few seconds over a time span of a few years
really bothers you).
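To see a leap second in the arithmetic, here is a minimal sketch
using the third-party astropy package (my own illustration, not
something from the thread); the chosen day ends with the
2008-12-31 leap second:

  from astropy.time import Time

  t1 = Time('2008-12-31 00:00:00', scale='utc')
  t2 = Time('2009-01-01 00:00:00', scale='utc')

  # astropy does the subtraction on a continuous scale, so the
  # inserted leap second shows up in the elapsed time:
  print((t2 - t1).sec)   # 86401.0 rather than 86400.0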
The TT family of time standards (TT, ET, TDB, etc.) does not have
these leap seconds; they run continuously and at a constant rate,
as does atomic time (TAI).
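For illustration, here is the same instant written on three of
these scales, again sketched with astropy (TAI - UTC was 34 s in
late 2010, and TT - TAI is 32.184 s by definition):

  from astropy.time import Time

  t = Time('2010-10-20 12:00:00', scale='utc')
  print(t.utc.isot)   # 2010-10-20T12:00:00.000
  print(t.tai.isot)   # 2010-10-20T12:00:34.000
  print(t.tt.isot)    # 2010-10-20T12:01:06.184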
This does not affect your programme to calculate the BJD-JD
correction. But you may want to express the timings of your
photometry not in UTC but in TAI or TDB to take out the effect
of the leap seconds.
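Should you go that way, here is a sketch of the conversion, again
with astropy. The WHT coordinates are approximate, and the RA and
the time stamp are made-up placeholders; only the Dec comes from
the output quoted above:

  from astropy.coordinates import EarthLocation, SkyCoord
  from astropy.time import Time
  import astropy.units as u

  # Approximate WHT site on La Palma; substitute the surveyed values.
  wht = EarthLocation.from_geodetic(lon=-17.8816 * u.deg,
                                    lat=28.7606 * u.deg,
                                    height=2332.0 * u.m)

  # Dec from the quoted output; the RA is a placeholder.
  target = SkyCoord('13h40m00s', '+12d54m47.2s')

  # A made-up mid-exposure time stamp from the UTC clock:
  t = Time('2010-05-20 03:05:45', scale='utc', location=wht)

  # Leap-second-free time stamps for the photometry:
  print(t.tai.isot, t.tdb.isot)

  # And the barycentric correction itself, giving BJD(TDB):
  ltt = t.light_travel_time(target, kind='barycentric')
  print((t.tdb + ltt).jd)

The light travel time computed there is the same kind of correction
as the "Lighttime" in the program output quoted at the top.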
--
Horst Meyerdierks Royal Observatory Edinburgh
Linux/Network Manager [log in to unmask]
http://www.roe.ac.uk/~hme/ +44-131-6688-309