This chat of zenith cams, etc. is interesting.
How well could you do with something like the camera in the iPhone4
facing up. The front camera is VGA resolution.
Say you're on another planet?
(we transfer time to Mars Rovers using radio, but techniques that are
independent of radio are always useful)
In message 4F1DBCC9.9040900@earthlink.net, Jim Lux writes:
How well could you do with something like the camera in the iPhone4
facing up. The front camera is VGA resolution.
Very badly.
The major trouble is actually not getting the light from the star,
but making sure your camera/telescope/transit-circle has a known
and stable geometric relationship to the planet Earth.
--
Poul-Henning Kamp | UNIX since Zilog Zeus 3.20
phk@FreeBSD.ORG | TCP/IP since RFC 956
FreeBSD committer | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.
On Mon, Jan 23, 2012 at 12:02 PM, Jim Lux jimlux@earthlink.net wrote:
This chat of zenith cams, etc. is interesting.
How well could you do with something like the camera in the iPhone4 facing
up. The front camera is VGA resolution.
Say you're on another planet?
You can use a stick pounded into the ground and wait until the shadow
has minimum length. But I assume we need better accuracy?
If you use a camera, accuracy will be limited by your knowledge of
where you are aiming the camera. If you are off by one degree then
the error is about 1/360 times the length of the day on your planet.
So finding the time is really about discovering where you have aimed
the camera. This is best figured out at night when you can see
stars. You can actually aim the camera at random, so long as you
measure the aim point and don't let it move.
That said, I think if you were to leave a cell phone in a fixed
position, unmoved all night, you can likely get to 1/10th of a pixel
angular resolution. So take the angle subtended by one pixel on your
phone, divide that by 10, take that as a fraction of a full circle,
and multiply by one day. A total guess is "about 1 ms" if you use a
full night's data. Just be warned that reducing the data is not simple;
there are many steps involved, and just one of them is matching your
data to a good star catalog, which implies having a good catalog.
You really can get to 0.1 pixel. You fit a function to the "fuzzy
blob" image of each star and then maybe 100 pixels contribute to a
solution.
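In rough numbers (pixel scale, star count, and frame count are all assumed round figures, not measurements), a quick Python sketch of that estimate:

    # Rough time-transfer estimate from 0.1-pixel astrometry (illustrative numbers only).
    pixel_deg = 0.1                  # assumed angular size of one pixel, degrees
    centroid_px = 0.1                # assumed centroiding precision, pixels
    sidereal_day_s = 86164.1         # seconds per rotation

    # One star in one frame: angular error as a fraction of a full turn, times a day.
    per_measurement_s = (pixel_deg * centroid_px / 360.0) * sidereal_day_s   # ~2.4 s

    # A full night of frames, many stars per frame, averaging as sqrt(N) if errors are independent.
    n_frames = 3600                  # e.g. one frame every 10 s for 10 hours
    n_stars = 50                     # assumed usable stars per frame
    averaged_s = per_measurement_s / (n_frames * n_stars) ** 0.5
    print(per_measurement_s, averaged_s)   # ~2.4 s single, ~6 ms averaged -- same order as the guess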
Chris Albertson
Redondo Beach, California
On 1/23/12 12:05 PM, Poul-Henning Kamp wrote:
In message 4F1DBCC9.9040900@earthlink.net, Jim Lux writes:
How well could you do with something like the camera in the iPhone4
facing up. The front camera is VGA resolution.
Very badly.
The major trouble is actually not getting the light from the star,
but making sure your camera/telescope/transit-circle has a known
and stable geometric relationship to the planet Earth.
Say you had it in some sort of "fixture" to allow it to be placed
repeatably with reference to your local earth position.
I can think of two general scenarios here.
One is where you "lay the iPhone on the table" in a fixed position. One
could use the internal accelerometers to determine "level", but I don't
think you could tell orientation, unless, perhaps, you can see
circumpolar stars? That is, by watching the movement of the
stars/planets through the field of view over some hours, could you
figure it out? Or is there some fundamental ambiguity?
(obviously, you can trivially see the moon/sun)
The other scenario is where you get an inexpensive camera (webcam, or
perhaps some slightly better point-and-shoot) and build a precision
mount (so you DO have accurate knowledge of sensor orientation and
position). Could you, perhaps over time, do an in-situ calibration?
I suppose any of these techniques is going to have issues with the
uncertainty in when the image is actually captured (e.g. there's
probably 10-100 ms you're not going to get away from).
On 1/23/12 12:29 PM, Chris Albertson wrote:
On Mon, Jan 23, 2012 at 12:02 PM, Jim Lux jimlux@earthlink.net wrote:
This chat of zenith cams, etc. is interesting.
How well could you do with something like the camera in the iPhone4 facing
up. The front camera is VGA resolution.
Say you're on another planet?
You can use a stick pounded into the ground and wait until the shadow
has minimum length. But I assume we need better accuracy?
An interesting approach, because it could conceivably get
"magnification" without using lenses or mirrors. Imagine the shadow tip
of a 2 meter long stick, and I have the camera positioned so that I only
see about 20 cm x 20 cm. (Of course, the shadow isn't that well defined,
because the angular extent of the sun is huge.)
A similar scheme: use a pinhole to project an image of the sun, and
image that instead.
If you use a camera, accuracy will be limited by your knowledge of
where you are aiming the camera. If you are off by one degree then
the error is about 1/360 times the length of the day on your planet.
So finding the time is really about discovering where you have aimed
the camera. This is best figured out at night when you can see
stars. You can actually aim the camera at random, so long as you
measure the aim point and don't let it move.
That said, I think if you were to leave a cell phone in a fixed
position, unmoved all night, you can likely get to 1/10th of a pixel
angular resolution. So take the angle subtended by one pixel on your
phone, divide that by 10, take that as a fraction of a full circle,
and multiply by one day. A total guess is "about 1 ms" if you use a
full night's data. Just be warned that reducing the data is not simple;
there are many steps involved, and just one of them is matching your
data to a good star catalog, which implies having a good catalog.
iPhone cameras (and most webcams, etc.) seem to have a FOV of about 45
degrees, so one pixel is around 0.1 degree. At 4 minutes of time per
degree, that's about 24 seconds per pixel.
(It's not a monochrome sensor, either, so although it's NxM pixels, that
doesn't mean that you could actually resolve a planet to that scale,
depending on color, and how the image is processed)
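The same arithmetic, written out with those assumed numbers (640 pixels across a 45 degree field):

    # Seconds of rotation per pixel for an assumed 45-degree-wide VGA frame.
    fov_deg = 45.0
    width_px = 640
    pixel_deg = fov_deg / width_px        # ~0.07 degree per pixel
    s_per_deg = 86400.0 / 360.0           # 240 s of time per degree of rotation (4 min/deg)
    print(pixel_deg * s_per_deg)          # ~17 s per pixel, ~24 s if you round the pixel up to 0.1 degree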
You really can get to 0.1 pixel. You fit a function to the "fuzzy
blob" image of each star and then maybe 100 pixels contribute to a
solution.
Tricky on an iPhone-type camera, since star images are one pixel at best.
On the cameras I've seen that were designed to do this, they have a
cleverly designed optical system that blurs the image. (And another
scheme uses a camera with a multi-pinhole mask in front, to render the
image in multiple places across the sensor.)
I can think of two general scenarios here.
If your planet has air you will need to know how it refracts starlight.
One is where you "lay the iphone on the table" in a fixed position. One
could use the internal accelerometers to determine "level", but I don't
think you could tell orientation, unless, perhaps, you can see circumpolar
stars? That is, by watching the movement of the stars/planets through the
field of view over some hours, could you figure it out? Or is there some
fundamental ambiguity.
No, you can point to any location and you can (in theory) figure out
where it is pointing, given that you have a large enough field of view
to see many stars at the same time. You can make a fixture easily
enough, just some epoxy and a large boulder. I used lag bolts
into my garage roof and it worked more than well enough.
If you can choose, straight up is the best aim point. Refraction is
not much of an issue and there is less air to look through. But
looking at the celestial equator means there is less field rotation and
the data is easier to reduce. We looked at the equator because we did
not want to deal with image rotation. Motion blur is minimized down
there too.
But if you want to know "absolute time" then you need more. Looking
at any random but fixed location will get you the period of the
planet's rotation to about a millisecond with cheap equipment, but to get
absolute time you need to measure the aim point relative to the local
meridian. That is not as easy. Start with a protractor and a plumb
bob; that is what I used. But to refine that you need a good
source of time, and for the purpose of this exercise we don't have
that. Only the plumb bob, which means "a few seconds of error". Maybe
a precision level can do 10X better?
(obviously, you can trivially see the moon/sun)
The other scenario is where you get an inexpensive camera (webcam, or
perhaps some slightly better point-and-shoot) and build a precision mount
(so you DO have accurate knowledge of sensor orientation and position). Could
you, perhaps over time, do an in-situ calibration?
I suppose any of these techniques is going to have issues with the
uncertainty in when the image is actually captured (e.g. there's probably
10-100 ms you're not going to get away from).
--
Chris Albertson
Redondo Beach, California
On 23/01/2012 21:43, Jim Lux wrote:
One is where you "lay the iphone on the table" in a fixed position.
One could use the internal accelerometers to determine "level", but I
don't think you could tell orientation, unless, perhaps, you can see
circumpolar stars? That is, by watching the movement of the
stars/planets through the field of view over some hours, could you
figure it out? Or is there some fundamental ambiguity.
I don't know about the iPhone, but I've seen an HTC with a funny
application that, when you point it anywhere in the sky, shows you the
constellations that are there. Even if you point it at the ground, it shows
you the constellations in the other hemisphere :) I don't remember if
the application is this http://www.google.com/mobile/skymap/ or
something similar, but in any case, the phone knows its orientation
quite well (well... it also depends on the phone having the right time, of
course... :) )
Regards,
Javier
A mercury mirror is better than a plumb bob.
Doug
On 1/23/12 1:18 PM, Javier Herrero wrote:
On 23/01/2012 21:43, Jim Lux wrote:
One is where you "lay the iphone on the table" in a fixed position.
One could use the internal accelerometers to determine "level", but I
don't think you could tell orientation, unless, perhaps, you can see
circumpolar stars? That is, by watching the movement of the
stars/planets through the field of view over some hours, could you
figure it out? Or is there some fundamental ambiguity.
I don't know about the iPhone, but I've seen an HTC with a funny
application that, when you point it anywhere in the sky, shows you the
constellations that are there. Even if you point it at the ground, it shows
you the constellations in the other hemisphere :) I don't remember if
the application is this http://www.google.com/mobile/skymap/ or
something similar, but in any case, the phone knows its orientation
quite well (well... it also depends on the phone having the right time, of
course... :) )
Yes, Pocket Universe (pUniverse) does this quite nicely (esp. on the iPad).
But it uses the magnetic compass (and GPS) as well as orientation.
On Mon, Jan 23, 2012 at 1:08 PM, Jim Lux jimlux@earthlink.net wrote:
On 1/23/12 12:29 PM, Chris Albertson wrote:
On Mon, Jan 23, 2012 at 12:02 PM, Jim Lux jimlux@earthlink.net wrote:
This chat of zenith cams, etc. is interesting.
How well could you do with something like the camera in the iPhone4
facing
up. The front camera is VGA resolution.
Say you're on another planet?
You can use a stick pounded into the ground and wait until the shadow
has minimum length. But I assume we need better accuracy?
An interesting approach, because it could conceivably get "magnification"
without using lenses or mirrors. Imagine the shadow tip of a 2 meter long
stick, and I have the camera positioned so that I only see about 20 cm x 20 cm.
(Of course, the shadow isn't that well defined, because the angular extent
of the sun is huge.)
A similar scheme: use a pinhole to project an image of the sun, and
image that instead.
This is why I suggested using the sun. It is easy. I know first hand
that using a camera pointed upward requires months and years of effort,
and it is unlikely you will find one person who knows enough to pull it
off as a solo effort.
But a wire, or better a slit, that sweeps an image across a photodiode
is far simpler.
Yes, the sun has a huge angular extent, but you measure the entire light
curve and fit a function to the curve to find the center of the fuzzy
shadow. Also, you can collect data every clear day for years and over
time see how close you can get. I bet "pretty good".
You don't want a pinhole or you'd be adjusting the aim every day.
To get better data you can have multiple slits, so you get three or
five light curves, say 15 minutes apart, every day.
The hard part will be the "simple" things, like designing the
instrument so dirt and bird poop do not block the photocell or slit
and rain does not get into the electronics. And building it sturdy
enough that it can last outdoors in the sun and rain for many years
with zero maintenance, and not cost much.
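A minimal sketch of that curve-fitting step, assuming photodiode readings sampled against the local clock; the symmetric pulse model and every number below are illustrative, not a tested design:

    import numpy as np
    from scipy.optimize import curve_fit

    def transit_pulse(t, t0, amp, width, baseline):
        # Symmetric brightening as the Sun's image crosses the slit; t0 is the transit time.
        return baseline + amp * np.exp(-0.5 * ((t - t0) / width) ** 2)

    # t: sample times from the local clock (s); v: photodiode readings (fabricated here to show the fit).
    t = np.arange(0.0, 600.0, 1.0)
    v = transit_pulse(t, 301.7, 0.8, 40.0, 0.1) + np.random.normal(0.0, 0.01, t.size)

    p0 = [t[np.argmax(v)], 0.5, 30.0, 0.1]                     # rough starting guess
    (t0, amp, width, baseline), _ = curve_fit(transit_pulse, t, v, p0=p0)
    print(t0)   # fitted center of the light curve; repeat every clear day and average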
If you use a camera, accuracy will be limited by your knowledge of
where you are aiming the camera. If you are off by one degree then
the error is about 1/360 times the length of the day on your planet.
So finding the time is really about discovering where you have aimed
the camera. This is best figured out at night when you can see
stars. You can actually aim the camera at random, so long as you
measure the aim point and don't let it move.
That said, I think if you were to leave a cell phone in a fixed
position, unmoved all night, you can likely get to 1/10th of a pixel
angular resolution. So take the angle subtended by one pixel on your
phone, divide that by 10, take that as a fraction of a full circle,
and multiply by one day. A total guess is "about 1 ms" if you use a
full night's data. Just be warned that reducing the data is not simple;
there are many steps involved, and just one of them is matching your
data to a good star catalog, which implies having a good catalog.
iPhone cameras (and most webcams, etc.) seem to have a FOV about 45 degrees,
so one pixel is around 0.1 degree. At 4 minutes time per degree, that's
about 24 seconds per pixel.
(It's not a monochrome sensor, either, so although it's NxM pixels, that
doesn't mean that you could actually resolve a planet to that scale,
depending on color, and how the image is processed)
You really can get to 0.1 pixel. You fit a function to the "fuzzy
blob" image of each star and then maybe 100 pixels contribute to a
solution.
Tricky on an iPhone-type camera, since star images are one pixel at best. On
the cameras I've seen that were designed to do this, they have a cleverly
designed optical system that blurs the image. (And another scheme uses a
camera with a multi-pinhole mask in front, to render the image in multiple
places across the sensor.)
--
Chris Albertson
Redondo Beach, California
The atmospheric issue is more differential refraction than refraction per
se. A zenith-pointing camera is likely the best choice. The zenith is
also the direction of the least atmospheric depth.
-John
===========
On 1/23/12 1:20 PM, Doug Millar wrote:
A mercury mirror is better than a plumb bob.
Doug
Or Gallium?
But what sort of precision are we looking for here?
1 second of earth rotation is 1/240th degree (15 arc seconds), about
0.07 milliradian.
So on a plumb bob a meter long, you're looking for a displacement of
0.07 mm... Seems a bit challenging.
Even with an optical scheme looking for the reflection coming back from
your mirror a meter away, that's just 70 microns. Well, at least it's
not a few wavelengths of light.
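Worked through, with a 1 m plumb line assumed:

    import math
    # Displacement at the end of a 1 m plumb line for one second of rotation (15 arcseconds).
    angle_rad = math.radians(15.0 / 3600.0)   # ~7.3e-5 rad
    length_m = 1.0
    print(angle_rad * length_m * 1e6)         # ~73 microns per second of time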
But I can see a lot of practical problems at that level of precision:
Vibration isolation?
Local gravitational anomalies (I seem to recall tens of arcseconds for
this).
Difference between the local gravity vector and the normal of the ellipsoid or
geoid due to the non-spherical Earth, etc. (this one is calculable).
In message 4F1DD9D3.6050003@earthlink.net, Jim Lux writes:
On 1/23/12 1:20 PM, Doug Millar wrote:
A mercury mirror is better than a plumb bob.
But not much.
Both of them are subject to aberrations of the local gravity vector
(any mountains or valleys nearby?) and, in the case of a rotating
liquid metal mirror, to a lesser degree to interaction with the Earth's
magnetic field.
--
Poul-Henning Kamp | UNIX since Zilog Zeus 3.20
phk@FreeBSD.ORG | TCP/IP since RFC 956
FreeBSD committer | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.
Hi Chris:
I would say you want an optimum hole diameter for imaging the Sun.
Sort of like the f/100 school of photography.
For a few years I drove brass tacks into a hardwood floor at exactly noon, with the tack placed at the center of the
Sun's image, using 3x5 cards with nested ellipses of different sizes and a small hole in the centers. I chose the
"pinhole" diameter that was slightly larger than the hole size needed for good overall focus. If the hole is smaller than
needed for good focus you get a much dimmer image, and if much larger the image gets fuzzy. For this
application maybe use a hole somewhat larger that still has the same peak intensity as the in-focus hole.
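For what it's worth, the usual rule of thumb for the sharpest pinhole (the 1.9 constant varies a little between sources, and the 2 m throw is just an example) gives numbers in the same spirit:

    import math
    # Rule-of-thumb optimum pinhole diameter: d ~ 1.9 * sqrt(wavelength * throw).
    wavelength_m = 550e-9                        # green light
    throw_m = 2.0                                # assumed pinhole-to-floor distance
    d_m = 1.9 * math.sqrt(wavelength_m * throw_m)
    sun_image_m = throw_m * math.radians(0.53)   # the Sun's ~0.53 degree disk projected at that distance
    print(d_m * 1000.0, sun_image_m * 1000.0)    # ~2 mm hole, ~18.5 mm solar image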
Another idea would be to use a photo sensor to read the spots from a Dipleidoscope.
http://www.prc68.com/I/Dent.shtml
Have Fun,
Brooke Clarke
http://www.PRC68.com
http://www.end2partygovernment.com/Brooke4Congress.html
Chris Albertson wrote:
. . .
You don't want a pin hole or you'd be adjusting the aim every day
You may run into diffraction problems before achieving the sought accuracy?
How about measuring the motion of a tracker against a clock?
Don
--
"Neither the voice of authority nor the weight of reason and argument
are as significant as experiment, for thence comes quiet to the mind."
R. Bacon
"If you don't know what it is, don't poke it."
Ghost in the Shell
Dr. Don Latham AJ7LL
Six Mile Systems LLP
17850 Six Mile Road
POB 134
Huson, MT, 59846
VOX 406-626-4304
www.lightningforensics.com
www.sixmilesystems.com
On 1/23/2012 3:02 PM, Jim Lux wrote:
How well could you do with something like the camera in the iPhone4
facing up. The front camera is VGA resolution
A lower bound can be estimated.
A cell phone sensor (iPhone 4 rear camera) has a resolution of what,
~2600 pixels wide? With a 45 degree field of view, that's ~60 arc
seconds per pixel, which is about 4 seconds of time. The Dawes limit is
about 1 second (17 arc-seconds) for a perfect 0.25" lens. Obviously worse
with a VGA-resolution camera.
Can such a camera even "see" stars?
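Those figures check out roughly as follows (the quarter-inch aperture and 2600-pixel width are the assumptions above):

    # Dawes limit versus pixel scale for the assumed numbers.
    aperture_mm = 6.35                        # 0.25 inch
    dawes_arcsec = 116.0 / aperture_mm        # ~18 arcseconds resolving limit
    pixel_arcsec = 45.0 * 3600.0 / 2600.0     # ~62 arcseconds per pixel (rear camera)
    print(dawes_arcsec / 15.0, pixel_arcsec / 15.0)   # ~1.2 s vs ~4 s of time: the sensor, not the optics, is the limit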
I think you'd want a slit, not a pinhole. The pinhole would be
better but it would only work one day a year. And it could be plugged
up.
I'm thinking the best way to build this might be to paint a sheet of
glass after masking out a very thin strip with vinyl tape. Face the
uncoated side to the sun. The glass would keep dirt and water out.
Aim it at the ecliptic and surround the glass with bird spikes. Maybe
use a filter to reduce skylight but let IR in, to make the blue sky
look more black.
I think the optimum width of the slit, or pinhole diameter, is to make it
match the width of the photodetector. Making it wider does not put
more light on the detector. The geometry then gives you a nice rise
and fall. You could place a full column of photodiodes in back of
the slit.
--
Chris Albertson
Redondo Beach, California
On Mon, Jan 23, 2012 at 3:51 PM, Mike S mikes@flatsurface.com wrote:
On 1/23/2012 3:02 PM, Jim Lux wrote:
How well could you do with something like the camera in the iPhone4
facing up. The front camera is VGA resolution
A lower bound can be estimated.
A cell phone sensor (iPhone 4 rear camera) has a resolution of what,
~2600 pixels wide? With a 45 degree field of view that's ~60 arc seconds
per pixel, which is about 4 seconds of time. The Dawes limit is about 1
second (17 arc-seconds) for a perfect 0.25" lens. Obviously worse with a
VGA-resolution camera.
The goal is not to create an image. A blur is actually better, and
I've read of people intentionally using de-focus. What you do is
compute a best fit of the system point spread function (PSF). Or, with
many blobs in the field, you do a convolution of the image with the
system PSF.
The end product is not an image but a table of X,Y coordinates of each
detected star. You don't need to detect every star. Then you
search a star catalog and find the best-fit transformation matrix that
takes you from X,Y to the catalog. The matrix is your real
"product".
Typically you should expect about 1/10 of a pixel resolution at the
end. And then you take hundreds of images every night and average
them and you continue maybe for years.
If you were designing a camera for this purpose you would make it so
that a typical star covers maybe five pixels across, so that the 5 by 5
pixel subimage would look like a Gaussian function. The centroid of
the function is your X,Y for the star. So you see that even with 5-pixel
blurs you can likely find X,Y to much better than one pixel
width. This helps with noise too; noise would be a poor fit to a
2D Gaussian function (and also there would be no catalog star for a
noise hit).
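A minimal sketch of that blob-fitting step (the circular Gaussian PSF, the 7x7 stamp, and the helper names are assumptions; the catalog match and the transformation fit that follow are not shown):

    import numpy as np
    from scipy.optimize import curve_fit

    def gauss2d(xy, x0, y0, amp, sigma, sky):
        # Circular 2D Gaussian, standing in for the real system PSF.
        x, y = xy
        return sky + amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2.0 * sigma ** 2))

    def star_centroid(stamp):
        # Fit the Gaussian to a small cutout around one detected blob; return sub-pixel x, y.
        ny, nx = stamp.shape
        y, x = np.mgrid[0:ny, 0:nx]
        p0 = [nx / 2.0, ny / 2.0, stamp.max() - np.median(stamp), 1.5, np.median(stamp)]
        popt, _ = curve_fit(gauss2d, (x.ravel(), y.ravel()), stamp.ravel(), p0=p0)
        return popt[0], popt[1]

    # Synthetic 7x7 blob centered at (3.3, 2.8), to show the sub-pixel recovery:
    yy, xx = np.mgrid[0:7, 0:7]
    fake = 10.0 + 200.0 * np.exp(-((xx - 3.3) ** 2 + (yy - 2.8) ** 2) / (2.0 * 1.4 ** 2))
    print(star_centroid(fake))   # ~(3.3, 2.8), i.e. well under a pixel of error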
Chris Albertson
Redondo Beach, California
I think you'd want a slit, not a pin hole. The pin hole would be
better but it would only work one day a year.
Actually two days per year, unless it was adjusted for the summer or
winter solstice, then it'd be one.
-John
===========
On Mon, Jan 23, 2012 at 6:07 PM, J. Forster jfor@quikus.com wrote:
I think you'd want a slit, not a pin hole. The pin hole would be
better but it would only work one day a year.
Actually two days per year, unless it was adjusted for the summer or
winter solstice, then it'd be one.
I still think it is "one", because there is not an integer number of
days per year, so you don't get an exact repeat in 6 months. Maybe a
pinhole would only work once ever? I don't know. To "work" the
pinhole has to exactly line up with the detector at the exact same
time of day.
But I'm not liking slits either, because I can't see how to adjust them
to be exactly vertical.
I'm back to the first thing I thought of: a wire with a large weight.
Then you measure the light curve as the shadow of the wire sweeps over
the detector.
Chris Albertson
Redondo Beach, California