Focused vs. Unfocused Differential Photometry: Observations of Exoplanets WASP-59b and WASP-33b

Defocusing is a technique used by researchers to capture photometric data with a higher signal-to-noise ratio, and therefore a smaller error in the data. Since the light is spread over a much larger area of pixels on the CCD chip, the exposure time can also be increased, which helps to “smear out” the images and gather more photons. My research group wanted to try to quantify this, as well as do some observations of exoplanet transits. The theory is that when the light falls on only a few pixels, variance and distortion on any of those pixels has a large effect on the data; with defocusing, the light is spread over a much larger pixel area, so if a few pixels vary in any given image, the data set as a whole isn’t affected much. Exoplanets are great targets for this type of observation since we are looking purely at the amount of light and no spatial resolution is needed.
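
As a back-of-the-envelope illustration (all of the numbers below are made up, and scintillation, dark current, and flat-fielding are ignored), the standard CCD signal-to-noise equation shows how the longer exposures that defocusing allows can more than make up for spreading the light over many more pixels:

    # Toy illustration of SNR = N* / sqrt(N* + n_pix * (N_sky + RN^2)).
    # All rates and noise values below are made up for demonstration.
    import math

    def snr(star_rate, exposure, n_pix, sky_rate=50.0, read_noise=10.0):
        """star_rate in e-/s from the star, sky_rate in e-/s per pixel."""
        n_star = star_rate * exposure
        n_sky = sky_rate * exposure
        return n_star / math.sqrt(n_star + n_pix * (n_sky + read_noise ** 2))

    star_rate = 2.0e5  # electrons per second from the star (made up)
    print(snr(star_rate, exposure=2.5, n_pix=9))     # focused: short exposure, few pixels
    print(snr(star_rate, exposure=15.0, n_pix=400))  # defocused: long exposure, many pixels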

WASP-33b focused point spread function.

WASP-33b defocused point spread function. This created a very nice “donut”.

Our research was done at the Cal Poly State University observatory with a 14 inch Meade LX-600 ACF telescope and an SBIG CCD camera. We were able to get three data sets: one for WASP-59b (13th magnitude host star) that was in focus, and two for WASP-33b (8th magnitude host star), one in focus and one out of focus. PyRAF was used for the data reduction and differential photometry, but some custom tools were written to deal with the data sets. A tool called “daofind” creates a list of stars that meet certain criteria in each image of the data set, so that those stars can be used to calculate photometric values such as magnitude and SNR. Unfortunately, it is not very good at keeping “found stars” consistent between images: star #1 in image #1 is not necessarily star #1 in image #2, and so on. That is a real problem when there are hundreds or thousands of images. The tool I wrote compares all of the stars found in a given image against the stars from the previous images, and if the x position, y position, and magnitude all change by less than a certain threshold, it considers them to be the same star. This aligns the output data and makes it easy to complete the photometry.
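
A stripped-down sketch of that matching logic (the data structures and threshold values here are placeholders, not the actual ones from my tool) looks something like this:

    # Match stars between images: if x, y, and magnitude all change by less
    # than a threshold, treat them as the same star. Thresholds are placeholders.
    X_TOL, Y_TOL, MAG_TOL = 3.0, 3.0, 0.5

    def match_star(star, known_stars):
        """Return the ID of a previously seen star, or None if it looks new."""
        for star_id, (x, y, mag) in known_stars.items():
            if (abs(star["x"] - x) < X_TOL and
                    abs(star["y"] - y) < Y_TOL and
                    abs(star["mag"] - mag) < MAG_TOL):
                return star_id
        return None

    def align_catalogs(images):
        """images: list of per-image star lists, each star a dict with x, y, mag."""
        known = {}       # star_id -> (x, y, mag), updated as images are processed
        next_id = 0
        aligned = []     # one dict per image mapping star_id -> star record
        for stars in images:
            frame = {}
            for star in stars:
                star_id = match_star(star, known)
                if star_id is None:
                    star_id, next_id = next_id, next_id + 1
                known[star_id] = (star["x"], star["y"], star["mag"])
                frame[star_id] = star
            aligned.append(frame)
        return aligned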

These are some of our resulting light curves, along with the binned curve that was used to estimate the transit start/stop times (marked with the yellow lines) and the transit depth in magnitudes.
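
For anyone curious, a minimal time-binning sketch with NumPy (the bin width and the use of the median are arbitrary choices here, not necessarily what we used) looks something like this:

    # Bin a light curve into fixed-width time bins and take the median of each bin.
    import numpy as np

    def bin_light_curve(times, mags, bin_width=5.0):
        """times and mags are NumPy arrays; bin_width is in the same units as times."""
        edges = np.arange(times.min(), times.max() + bin_width, bin_width)
        which = np.digitize(times, edges)
        binned_t, binned_m = [], []
        for i in np.unique(which):
            mask = which == i
            binned_t.append(times[mask].mean())
            binned_m.append(np.median(mags[mask]))
        return np.array(binned_t), np.array(binned_m)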

We obtained some great measurements of the transiting exoplanets. The defocused data was much easier to reduce, and we did not throw out a single data point. With the other two focused runs, many data points were missing for comparison stars or the primary target and had to be omitted. Keep in mind that these graphs look different because the focused WASP-33b data set contained about 2500 images, since they had to be taken with 2.5-second exposures. Focusing all of the star’s light onto a few pixels means that saturation is reached quickly, so this was the maximum exposure time we could use. Defocusing allowed us to increase the exposure time to 15 seconds, so the data is naturally less noisy and there are far fewer data points in the graph. Defocusing also increased the SNR by a factor of 12.5 to 20.

While we could not draw a direct conclusion that the defocusing technique was the factor that made the data so much better, the defocused data set showed much less scatter over the period of the transit than either of the two focused data sets and had a much higher SNR. We need more telescope time to draw a more quantitative conclusion, but so far this looks very promising.

Using Python for Telescope/CCD Camera Automation and Data Processing

After having gone on my first real astronomy data run at the Pinto Valley Observatory, I saw quite a few things that could be done to improve the workflow for next time around. The two things that I am focusing on at the moment are:

  • Telescope and CCD camera integration for data acquisition
  • Data processing

The goal is to eventually be able to feed a list to a script, and go to sleep while it completes the data acquisition for the night.

Telescope and Camera Control

For the telescope and camera integration, I would like to write some code that can take a list of targets, integration times, and perhaps some other settings, and figure out in what order to aim at the targets and take data, based on when each target is closest to the zenith for the least amount of atmospheric noise. It was very cumbersome to take notes on the data set and target on one laptop, control the telescope from a second laptop, and take the data and control the camera from a third laptop. All of this could be simplified with some scripting that uses the provided list of targets, moves the telescope (with ASCOM, INDI, SiTech, etc.), corrects for tracking errors and centers the target, and then starts the data acquisition and stores it away somewhere.
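
As a first stab at the scheduling piece, something like the following could rank targets by how close they currently are to the zenith (this sketch assumes astropy is available; the site coordinates and targets are placeholders, not real values):

    # Rank targets by current altitude so the one nearest the zenith goes first.
    # Site location and target coordinates are placeholders.
    import astropy.units as u
    from astropy.coordinates import SkyCoord, EarthLocation, AltAz
    from astropy.time import Time

    site = EarthLocation(lat=35.3 * u.deg, lon=-120.7 * u.deg, height=100 * u.m)
    targets = {
        "target-1": SkyCoord(ra=37.0 * u.deg, dec=24.0 * u.deg),
        "target-2": SkyCoord(ra=310.0 * u.deg, dec=-5.0 * u.deg),
    }

    frame = AltAz(obstime=Time.now(), location=site)
    ranked = sorted(targets.items(),
                    key=lambda item: item[1].transform_to(frame).alt.deg,
                    reverse=True)
    for name, coord in ranked:
        print("%s: altitude %.1f deg" % (name, coord.transform_to(frame).alt.deg))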

The main problem with all of this seems to be the complete lack of standardization in the telescope and camera world. Almost every vendor operates with its own set of standards, which defeats the purpose. The camera we were using on the PVO run was an Andor Luca-S, which requires the proprietary SDK in order to control it. SBIG and a few others can be controlled using free and available libraries and software. On the telescope side, ASCOM was a well-intentioned idea, but it’s limited to Windows-based machines, which I do not have. I use Ubuntu Linux, so I’m looking for alternatives that can work on any platform. INDI can do this, but it has limited device support. Fortunately, it works with the Celestron NexStar 6 that I have to play with.

So, in short, there seem to be three pieces to getting somewhere with this:

  • Figuring out how to handle the target list. Do I need to have a catalog, or am I going to rely on the user to provide RA/Dec for the targets?
  • Telescope control and tracking adjustment. Dan Gray wrote a simple spiral search script for the SiTech controller at PVO which worked well. It required manual intervention to hit “stop” once the target was in the FOV, but that could be automated easily if the camera is integrated: watch the overall delta in brightness of the images, and once it changes by a certain threshold, you know the target is in the FOV. The same mechanism could center the target: move slowly in the x direction until the delta drops, back up to get the target in the FOV again, then move in the negative x direction until the delta drops again. Split the difference and the target is centered in x; rinse and repeat to center in y (see the sketch after this list).
  • The camera automation seems easy enough with the Andor SDK, but it’s proprietary, which means that if I wanted to work with a different kind of camera, it would take more code. There probably isn’t a way around this other than writing functions to detect supported camera types and having code that can deal with each one.
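
As for the centering idea from the second bullet, here is a rough, hypothetical sketch; move_telescope() and take_image() are stand-ins for whatever the mount and camera APIs turn out to be:

    # Hypothetical centering routine based on the brightness-delta idea: the
    # target is already somewhere in the FOV, so step along one axis until the
    # frame's mean brightness drops (target left the FOV), do the same in the
    # other direction, and put the telescope halfway between the two edges.
    # move_telescope() and take_image() are placeholders, not real API calls.
    import numpy as np

    def find_edge(axis, direction, step, threshold):
        """Return how far we moved before the target dropped out of the FOV."""
        reference = np.mean(take_image())   # brightness with the target in the FOV
        travelled = 0.0
        while np.mean(take_image()) > reference - threshold:
            move_telescope(axis, direction * step)
            travelled += step
        return travelled

    def center_axis(axis, step=0.01, threshold=500.0):
        right = find_edge(axis, +1, step, threshold)   # edge in the + direction
        move_telescope(axis, -right)                   # go back to the start point
        left = find_edge(axis, -1, step, threshold)    # edge in the - direction
        # We are now at the - edge; the midpoint is (left + right) / 2 away.
        move_telescope(axis, (left + right) / 2.0)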

Data Processing

I’ve already started writing a bit of Python code for the data processing side of things. For the PVO run, we took data and recorded it in the proprietary Andor format called SIF. It is an undocumented format, and there is nothing in the SDK for working with these files, so you are stuck using their SOLIS software to convert them to different formats. I’m never taking data like that again. :) The Andor camera supports the FITS file format, which is used by NASA and lots of astronomy folks out there and has quite a bit of support, including the Python module PyFITS. I’ve been messing with that a bit today. Since all of our data from PVO is in SIF, we have to manually convert it to FITS using SOLIS before we can manipulate it by any other means. This is a tedious process which can be avoided in the future by recording data directly in the FITS format.
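
PyFITS makes poking at these files pretty painless. For example (the filename and header keyword below are examples only; the actual keywords depend on what the camera writes out):

    # Open a FITS file and peek at the data and header with PyFITS.
    import pyfits

    hdulist = pyfits.open("example_cube.fits")
    print(hdulist[0].data.shape)              # e.g. (2000, 512, 512) for a data cube
    print(hdulist[0].header.get("EXPTIME"))   # exposure time, if the camera recorded it
    hdulist.close()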

FITS files can be multidimensional “cubes” of data, so our entire 2000-image set for a single target fits in one convenient file. This is very easy to manipulate and run analysis on, and requires no extra conversion. This is what we will use for future runs.

Dr. Genet and I also use a wonderful piece of software called REDUC, written by Florent Losse, to compile our data and do various things like lucky imaging or speckle interferometry. The only issue is that REDUC only supports single-image FITS files, so we have to break up the FITS file from SOLIS into individual FITS files for a data set. I’ve already been talking via email with Florent to get support in his software for data cubes, since this is a widely used and accepted standard format for astronomical data. In the meantime, I’ve written a script that can slice up a multidimensional FITS file into individual FITS files; it also writes out a log file (in CSV format for Excel or LibreOffice, at the moment) which contains the header information for each data set, such as the exposure time, EM gain, number of photos, and temperature of the camera. If Florent can get the data cube support built in, I won’t need to slice up the FITS files anymore and no data conversions will need to take place, which is ideal. I will still want to be able to write out a log file, since this comes in really handy once you’ve got a few nights’ worth of data.
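
A bare-bones sketch of that kind of slicing script (filenames and header keywords are examples only; the real keywords depend on what SOLIS puts in the header) might look like:

    # Slice a multidimensional FITS cube into single-image FITS files and write
    # a CSV log of a few header values. Filenames/keywords are examples only.
    import csv
    import pyfits

    cube = pyfits.open("example_cube.fits")
    frames = cube[0].data        # shape: (n_frames, ny, nx)
    header = cube[0].header

    with open("run_log.csv", "w") as log:
        writer = csv.writer(log)
        writer.writerow(["frame", "filename", "exptime", "gain", "temperature"])
        for i, frame in enumerate(frames):
            filename = "frame_%04d.fits" % i
            pyfits.writeto(filename, frame, header)
            writer.writerow([i, filename,
                             header.get("EXPTIME", ""),
                             header.get("GAIN", ""),
                             header.get("CCD-TEMP", "")])
    cube.close()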

The other piece, which I just started playing with, is actually analyzing the data. Python has some neat tools, such as the very powerful matplotlib, which might help with this. I’ve only begun to scratch the surface, doing some simple histograms, heat maps, and contours for some of the data that I have. This is a tantalizing possibility and I’m very curious to see what I can come up with.
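
Those quick-look plots are only a few lines each. For instance, a histogram and heat map of a single frame, with random numbers standing in for real pixel data:

    # Histogram and heat map of a single frame; random numbers stand in for
    # real pixel values.
    import numpy as np
    import matplotlib.pyplot as plt

    frame = np.random.normal(1000.0, 50.0, size=(512, 512))

    plt.figure()
    plt.hist(frame.ravel(), bins=100)
    plt.xlabel("Pixel value (ADU)")
    plt.title("Pixel value histogram")

    plt.figure()
    plt.imshow(frame, cmap="hot", origin="lower")
    plt.colorbar(label="ADU")
    plt.title("Frame heat map")

    plt.show()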

So there are plenty of pieces to these puzzles, and I’m going to attempt to figure something out. The data processing seems like the easiest part to tackle for the moment, and I will brainstorm on the other pieces as I go along. I’ll be sure to post something once I figure out anything substantial, since there seem to be bits and pieces all over the web, but no solid, complete solutions. It looks like some of the big observatories have it figured out, but they have a lot of resources and completely custom-written software for their specific equipment. I would like to come up with something that helps out us amateurs so that we can make our own contributions to science without all of the headache.

Observing the Double Star Lambda Arietis

The constellation Aries, with Lambda Arietis circled.

I’m currently enrolled in an astronomical research seminar at the local community college supervised by Dr. Russ Genet. Tonight, I made my first observations toward published results in the Journal of Double Star Observations, which will ultimately end up in the Washington Double Star Catalog. I have five targets lined up, one of which is the double star Lambda Arietis (WDS 01579+2336, SAO 75051). This is a fairly easy double to spot, visible with the naked eye. It consists of two stars: the primary is magnitude 4.9 and the secondary 7.4, with around 37.6 arc-seconds of separation, making the pair an ideal candidate for my first attempt at measuring doubles.

I used my manual alt-az 10-inch Orion XT10 Dobsonian from my front porch. Dr. Genet let me borrow a Celestron Micro Guide illuminated eyepiece. Using my telescope’s focal length, I calculated the theoretical z value (the angular distance between divisions on the linear scale) with the formula from the Micro Guide manual:

z = 20626 / f
z = 16.5008 arc-seconds

where f is the focal length of the telescope in millimeters. My focal length is 1250mm, so my z value was theoretically 16.5 arc-seconds per tick on the linear scale. Due to manufacturing imperfections, it’s recommended that you measure the time it takes a known bright star to drift along the length of the linear scale and calculate the z value yourself, since the focal lengths of the telescope and Micro Guide might be slightly different from their advertised values. To do this, I positioned my scope on the star Aldebaran and measured 68.32 seconds for it to travel from one end of the linear scale to the other. The declination of Aldebaran is +16° 30′ 33.49″, which works out to about 16.5093°. Also, there are 60 tick marks along the linear scale on the Micro Guide. With these three pieces of information, I could calculate the actual z value:

z = 15.0411 * (time) * cos(declination)  / ticks

so,

z = 15.0411 * 68.32 * cos(16.5093°) / 60
z = 16.4207 arc-seconds
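
The same calibration works out to a couple of lines of Python, for anyone who wants to reuse it:

    # Drift calibration for the Micro Guide linear scale, using the numbers above.
    import math

    drift_time = 68.32     # seconds for Aldebaran to cross the linear scale
    declination = 16.5093  # degrees
    divisions = 60         # tick marks on the linear scale

    z = 15.0411 * drift_time * math.cos(math.radians(declination)) / divisions
    print("z = %.4f arc-seconds per division" % z)   # ~16.42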

The Celestron Micro Guide reticle, for making astronomical measurements.

My measured value was close to the theoretical, but different enough that I’m glad I decided to measure it. This is the value that I used for the rest of my calculations.

Since I was using a manual scope, I quickly discovered that it was a bit tricky to measure the separation of my target double star. The technique I ended up using successfully was putting the double star somewhere above the linear scale and letting it slowly drift across the scale as the Earth rotates. This way, I could focus on reading the number of tick marks between the primary and secondary stars. I measured the separation four times; after the first two measurements, I rotated the Micro Guide 180° so that if I had introduced any error in my alignment for the first two data points, the second two would reflect it. In the end, it appeared that my data was accurate to the best of my ability. I took my recorded “tick mark” distances and multiplied them by my z value from above to get the separation.

Observation    Ticks    Separation (arc-seconds)
1              2.26     37.11″
2              2.20     36.13″
3              2.30     37.77″
4              2.40     39.41″
Std. Dev.      0.08      1.38″
Average        2.29     37.60″
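
The statistics in that table can be reproduced from the raw tick readings with a few lines of NumPy (using the sample standard deviation, ddof=1):

    # Convert tick readings to separations and compute the table statistics.
    import numpy as np

    z = 16.42  # arc-seconds per division, from the drift calibration above
    ticks = np.array([2.26, 2.20, 2.30, 2.40])
    separation = ticks * z

    print("ticks: mean %.2f, std %.2f" % (ticks.mean(), ticks.std(ddof=1)))
    print("separation: mean %.2f\", std %.2f\"" % (separation.mean(),
                                                   separation.std(ddof=1)))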

The second measurement that I took was the position angle. I did this by positioning the stars above the linear scale and waiting for them to drift through it. If either the primary or the secondary drifted past and was aligned with the center mark on the linear scale, I would let the drift complete and record where on the protractor that same star passed. If they drifted past the linear scale and neither was aligned with the central mark, I would reposition and try again. I repeated this measurement four times using the same method as above, rotating the Micro Guide 180° after the first two measurements to reduce error. The position angle is measured from the north position (90° on a normal protractor), so I subtracted my recorded value from 90° to obtain the PA.

Observation    Position Angle (degrees)
1              47.50°
2              46.20°
3              47.50°
4              47.70°
Std. Dev.       0.69°
Average        47.23°

To prepare for my measurements of my five target double stars throughout the duration of this research seminar, I contacted Dr. Brian Mason at the United States Naval Observatory. He maintains the Washington Double Star Catalog and provided me with all of the past observations and data for my five target double stars. Using this data for Lambda Arietis, I was able to confirm that my results were indeed accurate and in line with previous measurements.

                  My data    Last WDS    Diff. vs. last    Last 10 WDS avg.    Diff. vs. avg.
Position Angle    47.23°     48.00°      -0.77°            46.88°              +0.35°
Separation        37.60″     38.10″      -0.50″            37.84″              -0.24″

Past WDS data over time. My data points are in green at the right end of the graph.

I included an average of the last 10 WDS observations because I was unsure of the accuracy of the most recent entries, since quite a few of them seemed to jump around a bit. The table above also shows the difference between my numbers and the past WDS observations.

My numbers seemed right in line with what I would extrapolate from the past WDS observational data, so I feel pretty good about my accuracy, especially since this was my first observation session. I tried to be as meticulous as possible and was pleasantly surprised that I was able to obtain accurate results with a manual telescope. It takes a little bit of practice, but it’s accurate and fairly easy to accomplish. I look forward to tackling the rest of my target double stars in the near future!