Focused vs. Unfocused Differential Photometry: Observations of Exoplanets WASP-59b and WASP-33b

Defocusing is a technique used by researchers to capture photometry data with a higher signal-to-noise ratio, and therefore a smaller error in the data. Since the light is spread over a much larger area of pixels on the CCD chip, the exposure time can also be increased, which helps to “smear out” the images and gather more photons. My research group wanted to try to quantify this, as well as do some observations of exoplanet transits. The theory is that when the light falls on only a few pixels, variance and distortion in any of those pixels has a large effect on the data; with defocusing, the light is spread over a much larger pixel area, so if a few pixels vary in any given image, the data set as a whole isn’t affected much. Exoplanets are great for this type of observation since we are looking purely at the amount of light, and no visual resolution is needed.
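
As a back-of-the-envelope illustration (with made-up photon rates, sky levels, and read noise, not our actual instrument values), the standard CCD signal-to-noise equation shows why spreading the light and lengthening the exposure can pay off:

```python
import math

def snr(star_rate, exposure, n_pix, sky_rate=50.0, read_noise=10.0):
    """Approximate aperture-photometry SNR from the CCD equation.

    star_rate  -- star photons/s landing inside the aperture
    exposure   -- exposure time in seconds
    n_pix      -- number of pixels inside the aperture
    sky_rate   -- sky photons/s/pixel (assumed value)
    read_noise -- read noise in electrons RMS per pixel (assumed value)
    """
    signal = star_rate * exposure
    noise = math.sqrt(signal + n_pix * (sky_rate * exposure + read_noise ** 2))
    return signal / noise

# Focused: light on ~20 pixels, saturation limits us to a short exposure
focused = snr(star_rate=1e5, exposure=2.5, n_pix=20)

# Defocused: the same star spread over ~500 pixels, with a 15 second exposure
defocused = snr(star_rate=1e5, exposure=15.0, n_pix=500)
```

Even though the defocused aperture takes on more sky and read noise, the extra photons from the longer exposure win out.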

WASP-33b focused point spread function.

WASP-33b defocused point spread function. This created a very nice “donut”.

Our research was done at the Cal Poly State University observatory, using a 14-inch Meade LX-600 ACF telescope and an SBIG CCD camera. We were able to get three data sets: one for WASP-59b (13th magnitude host star) that was in focus, and two for WASP-33b (8th magnitude host star), one in focus and one out of focus. PyRAF was used for the data reduction and differential photometry process, but some custom tools were written to deal with the data sets. A tool called “daofind” creates lists of stars that meet certain criteria in each image of the data set so that you can use these stars to calculate photometric values such as magnitude and SNR. Unfortunately, it is not very good at keeping “found stars” consistent between images. Star #1 in image #1 is not necessarily star #1 in image #2, and so on. This is a real problem when there are hundreds or thousands of images. The tool I wrote compares all of the stars found in a given image against the stars from the previous images, and if the x, y, and magnitude all change by less than a certain threshold, it considers them to be the same star. This aligns the data that gets dumped and makes it easy to complete the photometry.
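
The matching logic can be sketched roughly like this (the thresholds, the dict-of-stars data layout, and the function name are my own illustration here, not the actual tool’s code):

```python
def match_stars(previous, current, max_shift=3.0, max_dmag=0.5):
    """Match stars in `current` to the stars from `previous` images.

    `previous` maps a star ID to a dict with 'x', 'y', 'mag'; `current`
    is a list of such dicts from the newest image. A star is considered
    the same star when x, y, and magnitude all change by less than the
    given thresholds. Unmatched stars get fresh IDs.
    """
    matched = {}
    used = set()
    for star_id, prev in previous.items():
        for i, cur in enumerate(current):
            if i in used:
                continue
            if (abs(cur['x'] - prev['x']) < max_shift and
                    abs(cur['y'] - prev['y']) < max_shift and
                    abs(cur['mag'] - prev['mag']) < max_dmag):
                matched[star_id] = cur
                used.add(i)
                break
    next_id = max(previous, default=-1) + 1
    for i, cur in enumerate(current):
        if i not in used:
            matched[next_id] = cur
            next_id += 1
    return matched
```

With consistent IDs across the whole run, the photometry for any star can be pulled out as a single column.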

These are some of our resulting light curves along with the binned curve that was used to estimate the transit start/stop times (marked with the yellow lines) and depth of the magnitude delta.

We obtained some great measurements of the transiting exoplanets. The defocused data was much easier to reduce, and we did not throw out a single data point. With the two focused runs, many data points were missing for comparison stars or the primary target and had to be omitted. Keep in mind, these graphs look different because the focused WASP-33b data set contained about 2,500 images, since they had to be taken at 2.5-second exposure times. Focusing all of the star’s light onto a few pixels means that saturation is reached quickly, so this was the maximum exposure time that we could use. Defocusing allowed us to increase the exposure time to 15 seconds, so the data is naturally less noisy and there are far fewer data points in the graph. Defocusing also increased the SNR by a factor of 12.5 to 20.

While we could not draw a direct conclusion that the defocusing technique was the factor that made the data so much better, the defocused data set showed much less variance over the period of the transit than either of the two focused data sets and had a much higher SNR. We need more telescope time in order to draw a better and more quantitative result, but so far this looks very promising.

April Targets

Here is my list of targets for the month. The meridian times listed are in PDT for me at 120.5 degrees West in California, so your times may vary.

Meridian Target Type Mag Constellation Distance
04:19:31 PM M1 – Crab Nebula Diffuse Nebula 8.4 Taurus 6.5K ly
05:46:09 PM Jupiter Planet -2.1 Gemini
06:46:41 PM C/2014 E2 Jacques Comet 12.9 Puppis
07:25:00 PM M44 – Beehive Open Cluster 3.7 Cancer 590 ly
07:36:00 PM M67 Open Cluster 6.1 Cancer 2.6K ly
08:40:33 PM M81 – Bode’s Galaxy Galaxy 6.9 Ursa Major 11.7M ly
08:40:52 PM M82 – Cigar Galaxy Galaxy 8.4 Ursa Major 11.5M ly
08:52:02 PM NGC 3132 – Eight-Burst Nebula Planetary Nebula 10.0 Vela 2.6K ly
09:09:46 PM NGC 3242 – Ghost of Jupiter Planetary Nebula 11.0 Hydra 2K ly
11:04:00 PM M106 Galaxy 9.1 Canes Venatici 23.7M ly
11:14:47 PM M49 Galaxy 9.4 Virgo 55.9M ly
11:15:49 PM M87 Galaxy 9.6 Virgo 53.5M ly
11:21:20 PM NGC 4565 – Needle Galaxy Galaxy 10.4 Coma Berenices 40M ly
11:24:59 PM M104 – Sombrero Galaxy Galaxy 9.0 Virgo 29.3M ly
11:34:10 PM Mars Planet -1.4 Virgo
11:35:53 PM M94 Galaxy 9.0 Canes Venatici 15M ly
11:41:43 PM M64 – Black-eye Galaxy Galaxy 9.4 Coma Berenices 24M ly
11:57:55 PM M53 Globular Cluster 8.3 Coma Berenices 58K ly
12:00:49 AM M63 – Sunflower Galaxy Galaxy 9.3 Canes Venatici 37M ly
12:10:27 AM NGC 5128 – Centaur’s Shield Galaxy 6.8 Centaurus 13M ly
12:14:52 AM M51 – Whirlpool Galaxy Galaxy 8.4 Canes Venatici 25M ly
12:22:00 AM M83 Galaxy 7.5 Hydra 14.7M ly
12:27:12 AM M3 Globular Cluster 6.2 Canes Venatici 33.9K ly
12:48:12 AM M101 – Pinwheel Galaxy Galaxy 9.1 Ursa Major 21M ly
02:01:35 AM Saturn Planet 0.8 Libra
02:03:33 AM M5 Globular Cluster 6.7 Serpens 25K ly
03:08:35 AM M4 Globular Cluster 5.9 Scorpius 7.2K ly
03:26:41 AM M13 – Great Cluster in Hercules Globular Cluster 5.8 Hercules 22.2K ly
03:32:14 AM M12 Globular Cluster 7.7 Ophiuchus 15.7K ly
03:42:08 AM M10 Globular Cluster 6.4 Ophiuchus 14.3K ly
04:02:07 AM M92 Globular Cluster 6.3 Hercules 26.7K ly

Double Star Data Reduction

An example of the output data plots. The first two show a slice of the original data before processing. The last four plots show the data after being processed.

I have posted the first version of my Python code on GitHub, which can process large groups of FITS data cubes for reduction. The main features are breaking up FITS files into smaller pieces, writing the header data into a CSV file for an easy “log” of your observation run, and aligning, stacking, and processing double star images down to a single, pristine image. This was only my first attempt at the processing, and I was able to get it working fairly easily using some of the scientific Python modules such as NumPy, SciPy, Matplotlib, and PyFITS.

I chose to work with FITS data cubes since that seems to be the most widely accepted format for astronomical data, and it is also an open standard that can be manipulated by anyone. The basic workflow for alignment goes something like this:

  1. Select the first slice of data from the FITS cube as the “reference” slice to which all other slices will be aligned.
  2. Normalize it based on a range set by the user.
  3. Select the data from that slice which is over a certain threshold set by the user. This allows a variable amount of background to be filtered out, depending on the sensitivity and noise in the data set.
  4. Using this “good” data, call “scipy.ndimage.measurements.center_of_mass” to find the center of mass of this first reference slice. These coordinates are used later as the center to which all of the other slices are shifted.
  5. Now that the center coordinates have been determined, start processing the rest of the slices. The first thing is to run “scipy.ndimage.filters.gaussian_laplace” over the slice which helps bring out the bright parts and eliminate the less bright parts.
  6. Find the center of mass of this individual slice.
  7. Normalize the data based on the range set by the user.
  8. Shift the data for this slice by the difference between the reference slice center of mass and this individual slice’s center of mass.
  9. Add the processed data for this slice into a NumPy ndarray of processed data.
  10. Repeat steps 5-9 for the rest of the slices in the data cube.
  11. Once completed and there is an array of processed data, use NumPy to take the mean of this processed data stack. This is the final stacked image.
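
The steps above can be sketched as follows (modern SciPy exposes these functions directly under scipy.ndimage; the normalization range, threshold, and sigma values are user-set, and this is only a rough illustration of the workflow, not my exact code):

```python
import numpy as np
from scipy import ndimage

def normalize(img, lo, hi):
    # Steps 2 and 7: clip to the user-set range and scale to 0..1
    return (np.clip(img, lo, hi) - lo) / float(hi - lo)

def align_and_stack(cube, lo, hi, threshold, sigma=2.0):
    """Center-of-mass alignment of a data cube, following steps 1-11."""
    ref = normalize(cube[0].astype(float), lo, hi)
    good = np.where(ref > threshold, ref, 0.0)              # step 3
    ref_com = ndimage.center_of_mass(good)                  # step 4
    processed = [ref]
    for sl in cube[1:]:
        # Step 5: Laplacian-of-Gaussian to emphasize the bright parts
        filt = -ndimage.gaussian_laplace(sl.astype(float), sigma=sigma)
        com = ndimage.center_of_mass(np.clip(filt, 0.0, None))  # step 6
        norm = normalize(sl.astype(float), lo, hi)              # step 7
        delta = (ref_com[0] - com[0], ref_com[1] - com[1])      # step 8
        processed.append(ndimage.shift(norm, delta))            # steps 9-10
    return np.mean(processed, axis=0)                           # step 11
```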

An example of a double star slice before processing.

Once the final stacked image comes out the end, there are many things that can be done. For now, I’ve implemented a simple hack to guess the position of the primary and secondary stars by finding the coordinates of the “brightest spot”, i.e. maximum value on the stacked image (this is the primary star, in theory), then temporarily making a second copy of the stacked image, zeroing out all of the values for a 10 x 10 pixel box around the bright spot, and then repeating the process to find the next “brightest spot” (maximum value) which should be the secondary star. While this method is admittedly imprecise, it seems to have no problem finding both stars for most of the hundreds of data sets I have tested with. Of course, this assumes that the proper threshold and sigma values for the Laplacian filter have been used.
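
In code, the brightest-spot hack amounts to only a few lines (a sketch of the idea, with the box size as a parameter):

```python
import numpy as np

def find_double(stacked, box=10):
    """Guess the primary and secondary star positions.

    Take the global maximum as the primary, zero out a box around it in
    a temporary copy, then take the maximum again as the secondary.
    """
    img = np.array(stacked, dtype=float, copy=True)
    py, px = np.unravel_index(np.argmax(img), img.shape)
    half = box // 2
    img[max(py - half, 0):py + half + 1, max(px - half, 0):px + half + 1] = 0.0
    sy, sx = np.unravel_index(np.argmax(img), img.shape)
    return (py, px), (sy, sx)
```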

This is the processed image, with a clear double star and ready for easy measurement.

There are plenty of improvements to be made, and this was just a proof-of-concept which had great results. I want to work more on integrating the calibration values so that final results can be obtained automatically. There might be a better way to figure out the positions of the primary and secondary stars using K-means clustering; I’m not sure yet, and it requires more research. There are also other ways to reduce the data which may prove better than the Laplace filter, but that was the path of least resistance and provided excellent results for a first trial.


Using Python for Telescope/CCD Camera Automation and Data Processing

After having gone on my first real astronomy data run at the Pinto Valley Observatory, I saw quite a few things that could be done to improve the workflow for next time around. The two things that I am focusing on at the moment are:

  • Telescope and CCD camera integration for data acquisition
  • Data processing

The goal is to eventually be able to feed a list to a script, and go to sleep while it completes the data acquisition for the night.

Telescope and Camera Control

For the telescope and camera integration, I would like to write some code that can take a list of targets, integration times, and perhaps some other settings, and figure out what order to aim at the targets and take data, based on when each is closest to the zenith, for the least amount of atmospheric noise. It was very cumbersome to take notes on the data set and target on one laptop, control the telescope from a second laptop, and take the data and control the camera from a third laptop. All of this could be simplified with some scripting that uses the provided list of targets, moves the telescope (with ASCOM, INDI, SiTech, etc.), corrects for tracking errors and centers the target, and then starts the data acquisition and stores it away somewhere.
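
A first pass at the ordering problem could be as simple as sorting the night’s list by meridian-crossing time (the target-list format here is hypothetical; real scheduling would also need to weigh slew distance, exposure length, and horizon limits):

```python
from datetime import datetime

# Hypothetical target list: (name, meridian crossing, exposure seconds)
targets = [
    ("M51",  "2014-04-15 00:14", 30),
    ("M3",   "2014-04-15 00:27", 10),
    ("M104", "2014-04-14 23:24", 20),
]

def plan(target_list):
    """Order targets by meridian crossing, so each is observed when it
    sits highest in the sky and atmospheric noise is at its lowest."""
    return sorted(target_list,
                  key=lambda t: datetime.strptime(t[1], "%Y-%m-%d %H:%M"))

schedule = plan(targets)
```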

The main problem with all of this seems to be the complete lack of standardization in the telescope and camera world. Almost every one operates with its own set of standards, which defeats the purpose. The camera we were using on the PVO run was an Andor Luca-S, which requires the proprietary SDK in order to control it. SBIG and a few others can be controlled using freely available libraries and software. On the telescope side, ASCOM was a well-intentioned idea, but it’s limited to Windows-based machines, which I do not have. I use Ubuntu Linux, so I’m looking for alternatives that can work on any platform. INDI can do this, but it has limited device support. Fortunately, it works with the Celestron NexStar 6 that I have to play with.

So, in short, there seem to be three pieces to getting somewhere with this:

  • Figuring out how to handle the target list. Do I need to have a catalog, or am I going to rely on the user to provide RA/Dec for the targets?
  • Telescope control and tracking adjustment. Dan Gray wrote a simple spiral search script for the SiTech controller at PVO which worked well. It required manual intervention to hit “stop” once the target was in the FOV, but that could be automated easily if the camera is integrated: watch the overall delta in brightness of the images, and once it changes by a certain threshold, you know that the target is in the FOV. You could then center using the same mechanism: move slowly in the x direction until the delta drops, back up to get the target in the FOV again, then move in the negative x direction until the delta drops again. Divide that distance by two and the target is centered in the x direction; rinse and repeat to center in the y.
  • The camera automation seems easy enough with the Andor SDK, but it’s proprietary, which means that if I wanted to work with a different kind of camera, it would take more code. There probably isn’t a way around this other than writing functions to detect supported camera types and having code that can deal with each.
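
The delta-brightness centering idea can be simulated with a toy mount model (the mount interface and the brightness fall-off profile are invented for illustration; a real implementation would command the telescope controller and sum camera frames):

```python
class FakeMount:
    """Toy stand-in for the telescope + camera: total frame brightness
    falls off linearly as the pointing moves away from the target."""
    def __init__(self, target=3.0, fov=10.0):
        self.x = 0.0          # current pointing along one axis
        self.target = target  # where the star actually is
        self.fov = fov

    def move(self, dx):
        self.x += dx

    def brightness(self):
        return max(0.0, 1.0 - abs(self.x - self.target) / self.fov)

def center_axis(mount, step=0.25, drop=0.05):
    """Find both edges where the star's light vanishes from the frame,
    then split the difference, as described above for the x direction."""
    x = 0.0
    while mount.brightness() > drop:      # walk +x until the light drops
        mount.move(step); x += step
    x_plus = x
    while mount.brightness() <= drop:     # back up into the FOV
        mount.move(-step); x -= step
    while mount.brightness() > drop:      # walk -x out the other side
        mount.move(-step); x -= step
    x_minus = x
    mount.move((x_plus + x_minus) / 2.0 - x)  # jump to the midpoint
```

Running the same routine on the y axis would finish the centering.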

Data Processing

I’ve already started writing a bit of Python code for the data processing side of things. For the PVO run, we took data and recorded it in the proprietary Andor format called SIF. It is an undocumented format and there is nothing in the SDK for working with these files, so you are stuck using their SOLIS software to convert them to different formats. I’m never taking data like that again. :) The Andor camera supports the FITS file format, which is used by NASA and lots of astronomy folks out there, and has quite a bit of support, including the Python module called PyFITS. I’ve been messing with that a bit today. Since all of our data from PVO is in SIF, we have to manually convert it to FITS using SOLIS before we can manipulate it by any other means. This is a tedious process which can be avoided in the future by recording data directly in the FITS format.

FITS files can be multidimensional “cubes” of data, so that our entire 2,000 image set of a single target is in a single, convenient file. This is very easy to manipulate and run analysis on, and requires no extra conversion. This is what we will use for future runs.

Dr. Genet and I also use a wonderful piece of software called REDUC, written by Florent Losse, to compile our data and do various things like lucky imaging or speckle interferometry. The only issue is that REDUC only supports single-dimensional FITS files, so we have to break up the FITS file from SOLIS into individual FITS files for a set of data. I’ve already been talking via email with Florent to get support in his software for data cubes, since this is a widely used and accepted standard format for astronomical data. In the meantime, I’ve written a script that can slice up a multidimensional FITS file into individual FITS files, and it also writes out a log file (in CSV format for Excel or LibreOffice, at the moment) which contains the header information for each data set, such as the exposure time, EM gain, number of photos, temperature of the camera, etc. If Florent can get the data cube support built in, I won’t need to slice up the FITS files anymore, and no data conversions will need to take place, which is ideal. I will still want to be able to write out a log file, since this comes in really handy once you’ve got a few nights’ worth of data.
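
The heart of such a script looks something like this (shown here with astropy.io.fits, the successor to the standalone PyFITS module; the file naming scheme and the two-column CSV layout are illustrative choices, not my script’s exact output):

```python
import csv
import os
from astropy.io import fits  # successor to the standalone PyFITS module

def slice_cube(cube_path, out_dir, log_path):
    """Split a multidimensional FITS cube into single-image FITS files
    and dump the cube's header keywords into a CSV log."""
    os.makedirs(out_dir, exist_ok=True)
    with fits.open(cube_path) as hdul:
        header = hdul[0].header
        data = hdul[0].data          # shape: (n_slices, ny, nx)
        with open(log_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["keyword", "value"])
            for key in header:
                writer.writerow([key, header[key]])
        base = os.path.splitext(os.path.basename(cube_path))[0]
        for i, sl in enumerate(data):
            out = os.path.join(out_dir, "%s_%04d.fits" % (base, i))
            fits.PrimaryHDU(sl).writeto(out, overwrite=True)
```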

The other piece which I just started playing with is actually analyzing the data. Python has some neat tools, such as the very powerful matplotlib, which might help in doing this and I’ve only begun to scratch the surface, doing some simple histograms, heat maps, and contours for some of the data that I have. This is a tantalizing possibility and I’m going to be very curious to see what I can come up with.
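
As a taste, a quick-look function for any single frame might be as simple as this (a trivial sketch; the bin count and layout are arbitrary choices):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render to a file; no display needed at the scope
import matplotlib.pyplot as plt

def quick_look(image, out_png):
    """Save a histogram and a heat map of one image side by side."""
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
    ax1.hist(image.ravel(), bins=64)
    ax1.set_xlabel("pixel value")
    ax1.set_ylabel("count")
    im = ax2.imshow(image, origin="lower")
    fig.colorbar(im, ax=ax2)
    fig.savefig(out_png, bbox_inches="tight")
    plt.close(fig)
```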

So there are plenty of pieces to these puzzles, and I’m going to attempt to figure something out. The data processing seems like the easiest part to tackle for the moment, and I will brainstorm on the other pieces as I go along. I’ll be sure to post something once I figure out anything substantial, since there seems to be bits and pieces all over the web, but no one with solid solutions. It looks like some of the big observatories have it figured out, but they have a lot of resources, and completely custom written software for their specific equipment. I would like to come up with something that helps out us amateurs so that we can make our own contributions to science without all of the headache.



The Roweville Run: Three Days at the Pinto Valley Observatory

Sunset at the Pinto Valley Observatory.

We made it to Roweville, deep in the heart of the Mojave Desert, just as the sun was setting. Dr. Russ Genet and I had made the 8 hour journey and were greeted by our host, Dave Rowe, and two others, Dan Gray from Sidereal Technology and Jonah Hare from PlaneWave Instruments. We quickly got settled in and ate dinner, and by the time we were done, it was already very dark since we were completely isolated from any light pollution. It was time to get to work.

The Pinto Valley Observatory, or as we call it, “Roweville”, consists of a 20 inch PlaneWave Corrected Dall-Kirkham (CDK) telescope, a marvelous instrument with superb optics that was designed by Dave, controlled with equipment and software produced by SiTech (Dan’s company). The day before Russ and I arrived, the crew had outfitted the scope with Renishaw encoders, which gave us very high precision for tracking and finding sky objects. This was very important since we were using a 4x and a 2x Barlow (for a total of 8x) to do our research, with an Andor Luca-S EMCCD (electron multiplying CCD) camera to gather our data. Our field of view was about 49×32 arc-seconds, so having the ability to precisely slew to objects was crucial, since we didn’t have a lot of wiggle room for inaccuracy.

The 20″ PlaneWave CDK Telescope

Our first night of observations was a success. We used only the 4x Barlow on this night, with no filter. We struggled with a few minor problems, like camera driver issues in Windows and getting the camera to focus, since the Barlow was shifting the focal plane very far back (we realized that we had made a mistake and needed to put the spacers in front of the Barlow, not between the camera and the Barlow). After we sorted these minor issues out, I was given a brief lesson on the software by Dan and Jonah, and they let me loose on the controls for the telescope and the camera/data collection. I was very new to all of this equipment, but after we hit a couple of targets, I settled into my workflow and managed to be highly productive at gathering our data. We did a few meridian flips with the scope, and the tracking seemed to drift off at this point. It was determined that this was not the fault of the scope at all, but of a minor shift in the camera instrumentation hanging off of the end of the scope. Our instrumentation extended about 1.5 feet from the back of the scope, creating a perfect lever arm at the mercy of gravity, the slightest adjustment of which would throw off our FOV at such high magnification. We were still able to get a lot of usable data this first night, with 33 sets and 49,600 images collected. We focused mostly on calibration and drift targets, mainly to get a feel for the equipment and to make sure that the data reflected accurate results. We also collected numerous targets at varying altitudes in the sky in order to compile data on the variance of atmospheric scintillation.

Looking for rocks! From left to right: Jonah, Russ, Dave, Dan

We went for a nice hike the next morning to go rock hunting. Dave knew where to find a nice accumulation of agate, which, he explained, had formed from a nearby volcanic explosion that rained a thick layer of white ash down on the area; as rain water percolated down through the soil, it deposited silica in successive layers, forming the agate. There were so many beautiful colors in the rocks, and a wide variety of shapes and sizes. It provided a good hike, and we saw many interesting features of the surrounding area such as Wild Horse Mesa, the Stone House, and many other neat geological features. All of the roads were unpaved, and some required a high clearance vehicle to get across. There were also a couple of campgrounds hidden away in the hills.

A mountain of agate.

The second night, we were able to use our experience from the first night to be much more successful. Dan had made a couple of changes to the tracking software that Russ and I had suggested after using it a bit, and Dan had also written us a simple spiral search script, which allowed us to quickly locate a target that didn’t show up in the FOV right away. This was a huge time saver. We also operated the equipment remotely from the warmth of the main cabin.

We definitely had our system down for this second night, recording 16 sets of data per target, with four different exposure times on four different filters (none, R-band, V-band, and I-band). This allowed us to compare the results and see what integration times and filters worked best for our speckle interferometry. We were able to split some sub arc-second separation double stars using these techniques, which was a huge success for us. We also collected data for a few calibration and drift stars. We got a total of 66 sets and 128,570 images this second night.

Wild Horse Mesa

We woke up and had a nice breakfast, and decided to go for a hike. Dan brought his quadcopter along, and we went up into the New York Mountains, into Caruthers Canyon. When we got to the top, we found some abandoned copper mines, which Jonah and I ventured into a bit. They seemed pretty sturdy, and we found old remnants of the mining days in the 1920s, such as big chutes to load up the mine cars, and parts of the track that they ran on. Dan flew his quadcopter up high enough that we almost lost sight of it, then returned it safely to the ground.

Our third night was the most productive, from a scientific standpoint. Dan refined the spiral search script for us, and we knew it was our last night to get data, so we started early, around 7pm. Our focus for this last night was almost entirely production stars, in other words, measuring doubles of various separations, most of which were below 2 arc-seconds. About two hours into the night, the battery voltage for the ranch was down to 24.0v, so we needed to shut everything down for the night or risk damaging the batteries (Roweville runs off of solar power). We made the decision to unplug all of our laptops and turn off everything on the ranch but the telescope, which got us back up to 24.3v and allowed us to continue the data collection.

Russ looking for the next targets.

Now that Russ and I were under the gun, we became highly productive. We had to move out to the cold “warm room” of the observatory since we couldn’t connect remotely to the observatory computers anymore. Russ and I quickly got situated, and since time was running out, he started listing off targets to me, usually around five at a time. I would then slew the telescope to each of these and record the data in quick succession, and he would have a new list of the next few targets by the time I was done. Unfortunately, power ran out around 11pm and we had to shut everything down, but we had managed to collect 58 sets of data and 99,040 images this last night.

Relaxing and enjoying the view.

I can’t describe how much I learned during this three night stay at Roweville. Dave was a complete gentleman and an excellent host. Everyone there during those few days contributed their piece to our research, and provided good company. Even though I was completely inexperienced when I showed up, everyone was very open to showing me how to do things and sharing their vast knowledge and experience with me. As a result, Russ and I will probably have five or six scientific papers coming from the data of this run, although it will take us a few months to work through the 275,000 images, over 25GB of data. It’s a beautiful spot to enjoy dark skies and some great hiking. This was truly a unique experience for me, and made me want to continue on my path towards uncovering the mysteries of the universe even more.

Roweville or Bust!

I’m headed out to the Mojave Desert with Dr. Russ Genet to use the 20-inch PlaneWave CDK telescope at Pinto Valley Observatory at Roweville, owned and run by David Rowe, and I believe Dan Gray (from Sidereal Technology) is also joining us. We have a list of many different targets for gathering data, with our primary goal being speckle interferometry. We are also going to attempt to record data from a lunar occultation of a double star. I can’t wait to leave; this will be an exciting trip! Hopefully the weather cooperates…

Happy Birthday, Carl Sagan!

Carl Sagan (1934 – 1996)

Today marks the 78th trip around the sun since Carl Sagan was brought into this world. He inspired everyone to think deeply about our cosmic ocean and the vast universe of hidden treasures that await us. He devoted his life to the pursuit of knowledge and critical thought, letting the facts lead the way, and was still able to explain it all with a sense of calm thoughtfulness that made everyone stop and realize the inherent beauty of nature, science, and human thought.

Even today, Carl’s scientific efforts are bearing fruit. Just within the last couple of months, the Voyager 1 spacecraft has provided us with groundbreaking scientific data, over 35 years after its launch. It is the most distant man-made object in space, currently exiting the solar system, and quite literally on its way to the stars. Because of Carl, we took the “Pale Blue Dot” photo, which provided for the first time a unique perspective on just how vast the cosmos is, and how tiny a planet we live on.

“All of human history has happened on that tiny pixel, which is our only home.” -Carl Sagan, speech at Cornell University, October 13, 1994. Earth is the tiny dot circled in blue.

Perhaps Carl’s most well known legacy is Cosmos: A Personal Voyage, the 13-part series for PBS, which is still to this day the most watched PBS series in the world. Carl’s way of stating very complex and mind-boggling ideas in simple, clear ways that anyone could understand, while still providing deep insight into the nature of the cosmos, was at the heart of why this series was so special. He presented the largest pillars of knowledge that humans have uncovered with a poetic candor that mesmerized those who were watching.

I know that Carl played a large part in pushing me down the path that I have taken, and he has left in his wake a new generation of role models to make the case for scientific thought and reason, rationality, and human dignity. I hope that we can live up to the high-water mark that he set. For now, he is back where he came from: star stuff, someday forming something new.



Farewell, Comet 168P/Hergenrother!

This image shows the distinct nuclei splitting off from the comet. Credit: NASA/JPL-Caltech/NOAO/Gemini

If you want to catch a glimpse of Comet 168P/Hergenrother, you’d better do it quickly. The comet has broken into at least four distinct pieces, according to NASA. This was discovered after it made its 7-year periodic appearance and suddenly brightened from the expected 15th or so magnitude all the way up to 9th magnitude. Over the last couple of months, it has brightened and dimmed a couple of times, probably due to this fracturing process. Its perihelion (closest approach to the sun) was on October 1st, which may have also contributed to this breakdown and brightening process.

I made my second attempt to spot this dim comet tonight and was met with success! It was exciting to see something that I knew I would only get to see once. For my first attempt (last night), I had tried to catch it in my 10″ Dobsonian from my front porch, but the “small city” amount of light pollution made this impossible. I may have caught the ghost of a smudge of it, but I will never be sure.

Tonight, I decided to trek out to an area with very dark skies, to make my chances as good as possible to see this lonely, dying comet, since I knew that I probably wouldn’t get another opportunity. I had almost immediate success. It was easy to spot, using the star finder map from Comet Chasers, in addition to making sure I knew the star hops pretty well in Stellarium before I had gone out for the night.

It was definitely a faint comet to spot, but it was decently visible. I used my 25mm eyepiece; anything with higher magnification made it too dim. This provided a nice, full view of the comet. I was able to see the nebulous coma stretching out, and the nucleus of the comet. With my eyes and telescope, however, I was not able to resolve separate pieces or nuclei, only a single one at the front. Averted vision worked well to enhance the coma. I waited around half an hour after my first look for the sky to darken a bit more; around 8:00pm seemed to be a good time.

It’s always exciting to witness events like this, since usually things take eons to unfold in the cosmos. When we can see something happening within our own lifetime, let alone within a few months or weeks, it’s always a unique memory.

My condolences to Carl Hergenrother (who blogs over at The Transient Sky) for losing a friend. :) While I’m sure it was thrilling to discover it 14 years ago, I’m sure it’s equally exciting for it to be getting some attention again. This was only my second comet to view in a telescope, and my first to find on my own, so it was an interesting learning experience that happened to turn into a bit of excitement.

If you have a decent telescope and some fairly dark skies, grab a finder map and give this one a shot. You won’t get another chance.

The comet is currently just above the constellation Pegasus. A zoomed view of the square portion is in the next image.

A zoomed view of the previous image. The star hop was easy for me by making a triangle from the top two stars of Pegasus to the top star in this image, and then walking down to where the comet was, somewhere within the circle on the map.


Observing the Double Star Lambda Arietis

The constellation Aries, with Lambda Arietis circled.

I’m currently enrolled in an astronomical research seminar at the local community college supervised by Dr. Russ Genet. Tonight, I made my first observations toward published results in the Journal of Double Star Observations, which will ultimately end up in the Washington Double Star Catalog. I have five targets lined up, one of which is the double star Lambda Arietis (WDS 01579+2336, SAO 75051). This is a fairly easy double to spot, visible with the naked eye. It consists of two stars: the primary is magnitude 4.9 and the secondary is 7.4, with around 37.6 arc-seconds of separation, making them an ideal candidate for my first attempt at measuring doubles.

I used my manual alt-az 10-inch Orion XT10 Dobsonian from my front porch. Dr. Genet let me borrow a Celestron Micro Guide illuminated eyepiece. Based on my telescope’s focal length, I calculated the theoretical z value (the distance between scale divisions on the linear scale) using the formula from the manual for the Micro Guide:

z = 20626 / f
z = 20626 / 1250
z = 16.5008 arc-seconds

where f is the focal length of the telescope in millimeters. My focal length is 1250mm, so my z values was theoretically 16.5 arc-seconds per tick on the linear scale. Due to manufacturing imperfections, it’s recommended that you measure the time it takes a known bright star to travel parallel down the linear scale and calculate the z value yourself, since the focal lengths of the telescope and Micro Guide might be slightly different than what they are advertised as. To do this, I positioned my scope on the star Aldebaran and measured 68.32 seconds for it to travel from one end of the linear scale to the other. The declination of Aldebaran is +16° 30′ 33.49 which works out to 16.5055816667°. Also, there are 60 tick marks along the linear scale on the Micro Guide. With these three pieces of information, I could calculate the actual z value:

z = 15.0411 * time * cos(declination) / ticks

z = 15.0411 * 68.32 * cos(16.5093°) / 60
z = 16.4207 arc-seconds
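Both calibrations can be sketched in a few lines of Python. This is a hypothetical check rather than part of my actual workflow; the constants are the ones quoted above, and note that +16° 30′ 33.49″ converts to roughly 16.5093 decimal degrees.

```python
import math

# Theoretical z from the Micro Guide manual's formula (f = focal length in mm).
f = 1250.0
z_theoretical = 20626.0 / f               # arc-seconds per linear-scale division

# Drift calibration: a star of known declination takes t seconds to cross
# all 60 divisions; 15.0411 is the sidereal drift rate in arcsec per second.
t = 68.32                                 # measured drift time for Aldebaran (s)
dec = 16.5093                             # +16 deg 30' 33.49" in decimal degrees
ticks = 60                                # divisions on the Micro Guide linear scale
z_measured = 15.0411 * t * math.cos(math.radians(dec)) / ticks

print(round(z_theoretical, 4))            # 16.5008
print(round(z_measured, 4))               # 16.4207
```

The two values agree to within about half a percent, which is why the drift calibration is worth the few minutes it takes.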

The Celestron Micro Guide reticle, for making astronomical measurements.

My measured value was close to the theoretical, but different enough that I’m glad I decided to measure it. This is the value that I used for the rest of my calculations.

Since I was using a manual scope, I quickly discovered that it was a bit tricky to measure the separation of my target double star. The technique I ended up using successfully was putting the double star somewhere above the linear scale and letting it slowly drift across the scale as the Earth rotates. This way, I could focus on reading the number of tick marks between the primary and secondary stars. I measured the separation four times; after the first two measurements, I rotated the Micro Guide 180° so that any alignment error I had introduced in the first two data points would show up in the second two. In the end, it appeared that my data was accurate to the best of my ability. I took my recorded tick-mark distances and multiplied them by my z value from above to get separation distances.

| Observation | Ticks | Separation (arc-seconds) |
|-------------|-------|--------------------------|
| 1           | 2.26  | 37.11″                   |
| 2           | 2.20  | 36.13″                   |
| 3           | 2.30  | 37.77″                   |
| 4           | 2.40  | 39.41″                   |
| Std. Dev.   | 0.08  | 1.38″                    |
| Average     | 2.29  | 37.60″                   |
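The tick-to-arcsecond conversion and the summary statistics are easy to reproduce; here is a small sketch, assuming the drift-calibrated z value from above:

```python
import statistics

z = 16.4207                               # calibrated arc-seconds per division
ticks = [2.26, 2.20, 2.30, 2.40]          # measured divisions, one per observation

separations = [t * z for t in ticks]      # separations in arc-seconds
mean_sep = statistics.mean(separations)
sd_sep = statistics.stdev(separations)    # sample standard deviation

print(round(mean_sep, 2))                 # 37.6
print(round(sd_sep, 2))                   # 1.38
```

Since the conversion is a simple scaling, the standard deviation of the separations is just z times the standard deviation of the tick readings.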

The second measurement that I took was the position angle. I did this by positioning the stars above the linear axis and waiting for them to drift through it. If either the primary or the secondary drifted past aligned with the center mark on the linear scale, I would let the drift complete and record where on the protractor that same star passed. If they drifted past the linear scale and neither was aligned with the central mark, I would reposition and try again. I repeated this measurement four times using the same method as above, rotating the Micro Guide 180° after the first two measurements to reduce error. The position angle is measured from north (90° on a normal protractor), so I subtracted my recorded value from 90° to obtain the PA.

| Observation | Position Angle (degrees) |
|-------------|--------------------------|
| 1           | 47.50°                   |
| 2           | 46.20°                   |
| 3           | 47.50°                   |
| 4           | 47.70°                   |
| Std. Dev.   | 0.69°                    |
| Average     | 47.23°                   |
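As a sanity check on this reduction, here is a hypothetical sketch of the 90°-offset conversion and the summary statistics. The raw protractor readings shown are not my recorded values; they are simply the readings implied by the position angles in the table.

```python
import statistics

# Hypothetical raw protractor readings in degrees (implied by the PAs below).
readings = [42.50, 43.80, 42.50, 42.30]

# The 90-degree mark on the Micro Guide protractor points north, so the
# position angle is 90 minus the recorded protractor reading.
pas = [round(90.0 - r, 2) for r in readings]

print(pas)                                # [47.5, 46.2, 47.5, 47.7]
print(round(statistics.stdev(pas), 2))    # 0.69
```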

To prepare for my measurements of my five target double stars throughout the duration of this research seminar, I contacted Dr. Brian Mason at the United States Naval Observatory. He maintains the Washington Double Star Catalog and provided me with all of the past observations and data for my five target double stars. Using this data for Lambda Arietis, I was able to confirm that my results were indeed accurate and in line with previous measurements.

|                | My data | Last WDS | Last WDS Diff | Last 10 WDS Avg | Last 10 WDS Avg Diff |
|----------------|---------|----------|---------------|-----------------|----------------------|
| Position Angle | 47.23°  | 48.00°   | −0.77°        | 46.88°          | +0.35°               |
| Separation     | 37.60″  | 38.10″   | −0.50″        | 37.84″          | −0.24″               |
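The comparison itself is just simple differences; a hypothetical check using the WDS numbers from Dr. Mason’s data looks like this:

```python
my_pa, my_sep = 47.23, 37.60          # my averaged measurements
last_pa, last_sep = 48.00, 38.10      # most recent WDS observation
avg_pa, avg_sep = 46.88, 37.84        # average of the last 10 WDS observations

print(round(my_pa - last_pa, 2), round(my_sep - last_sep, 2))   # -0.77 -0.5
print(round(my_pa - avg_pa, 2), round(my_sep - avg_sep, 2))     # 0.35 -0.24
```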

Past WDS data over time. My data points are in green at the right end of the graph.

I included an average of the last 10 WDS observations because I was unsure of the accuracy of the most recent one; quite a few recent observations seemed to jump around a bit. The table above also shows the difference between my numbers and the past WDS observations.

My numbers seemed right in line with what I would extrapolate from the past WDS observational data, so I feel pretty good about my accuracy, especially since this was my first observation session. I tried to be as meticulous as possible and was pleasantly surprised that I was able to obtain accurate results with a manual telescope. It takes a little bit of practice, but it’s accurate and fairly easy to accomplish. I look forward to tackling the rest of my target double stars in the near future!