Back to Basics in Analyzing Radio Astronomy Data
Paul Barrett (The George Washington University)
As the volume of radio data increases, it is important to reevaluate our methods of analyzing radio interferometry data to get the best results and highest performance possible from the data reduction. Several modules or applications currently exist for fitting visibility data. However, each application has its limitations, e.g., a limited number of shapes, a limited number of sources, a single Stokes parameter per calculation, or no spectral index calculation. Because of these limitations, a new package called Visfit is being developed using the Julia programming language. Some major benefits of fitting the visibility data directly are: (1) a significant reduction in the time needed to generate light curves for photometry, from days to minutes; (2) ~20% greater sensitivity than the FFT/CLEAN method, meaning a marginal 4 sigma detection becomes a solid 5 sigma detection; (3) a reduction in the number of pixels in an image through adaptive gridding; (4) the use of proper image projections, such as conformal and equal-area projections, instead of only a tangent plane projection for wide-field and mosaicked images; and (5) increased accuracy and efficiency in filtering unwanted radio signals in data calibration pipelines. Although I have been working on this project for over a year, the benefits to the analysis of my radio data have been well worth the effort.
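To illustrate what fitting visibility data directly means, here is a minimal sketch in Python (not Visfit's actual API, which is in Julia): a single point source with flux S at direction cosines (l, m) has model visibilities V(u, v) = S exp(-2πi(ul + vm)), and its parameters can be recovered from noisy visibilities by nonlinear least squares without ever forming an image. The (u, v) coverage, noise level, and parameter values below are invented for the example.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(42)

# Synthetic (u, v) coverage in wavelengths, and true source parameters
n = 200
u = rng.uniform(-1e3, 1e3, n)
v = rng.uniform(-1e3, 1e3, n)
S_true, l_true, m_true = 2.5, 1e-4, -5e-5  # flux (Jy), direction cosines (rad)

def model(params, u, v):
    """Point-source visibility: V(u, v) = S * exp(-2*pi*i*(u*l + v*m))."""
    S, l, m = params
    return S * np.exp(-2j * np.pi * (u * l + v * m))

# Simulated observed visibilities: model plus complex Gaussian noise
vis = model((S_true, l_true, m_true), u, v)
vis += 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))

def residuals(params):
    # least_squares requires real residuals, so stack real and imaginary parts
    r = model(params, u, v) - vis
    return np.concatenate([r.real, r.imag])

# x_scale reflects the very different magnitudes of flux vs. direction cosines
fit = least_squares(residuals, x0=[1.0, 0.0, 0.0], x_scale=[1.0, 1e-4, 1e-4])
S_fit, l_fit, m_fit = fit.x
```

The same residual function extends naturally to multiple sources, other shapes, or additional Stokes parameters by enlarging the parameter vector, which is the direction a package like Visfit generalizes.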