Matt Bolcar’s Research
As future science missions call for larger and larger aperture space telescopes, the technology is moving toward one of two approaches: segmented apertures and multiple-aperture telescopes. In either case, the light from each segment or sub-aperture is combined to form an image or fringes in the detector plane. The resulting resolution is comparable to that of a monolithic aperture whose size equals the full extent of the segments or sub-apertures.
For the combination to occur, the segments or sub-apertures must be “phased” – that is, the optical path lengths for rays striking each segment, or passing through each sub-aperture, must be equal. The sheer size of these systems and the remoteness of their orbits prohibit the use of interferometers to perform the phasing. Therefore, image-based wavefront sensing techniques such as phase retrieval and phase diversity have been identified as enabling technologies for future space telescope missions.
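To make “image-based” concrete, here is a minimal sketch of the kind of forward model these techniques rely on: a pupil-plane phase map is propagated to a focal-plane intensity with a Fourier transform, and a simple error-reduction (Gerchberg-Saxton style) loop recovers an estimate of the phase from a single image. The grid size, aperture, and aberration below are illustrative choices, not parameters from any particular instrument.

    import numpy as np

    # Minimal Fourier-optics forward model: pupil-plane phase -> focal-plane PSF.
    N = 256                                    # grid size (illustrative)
    y, x = np.mgrid[-N//2:N//2, -N//2:N//2]
    r = np.hypot(x, y) / (N // 4)              # normalized radius; pupil radius = N/4 pixels
    pupil = (r <= 1.0).astype(float)           # circular, unobscured aperture

    def psf(phase):
        """Focal-plane intensity for a given pupil-plane phase map (radians)."""
        field = pupil * np.exp(1j * phase)
        return np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field))))**2

    # "Truth" wavefront to be recovered: an astigmatism-like error (illustrative).
    true_phase = 0.5 * pupil * (x**2 - y**2) / (N // 4)**2
    measured = psf(true_phase)

    # Error-reduction phase retrieval from one image: alternate between enforcing
    # the measured focal-plane amplitude and the known pupil support.
    amplitude = np.sqrt(measured)
    estimate = pupil.astype(complex)                       # start from a flat wavefront
    for _ in range(200):
        focal = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(estimate)))
        focal = amplitude * np.exp(1j * np.angle(focal))   # impose measured amplitude
        estimate = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(focal)))
        estimate = pupil * np.exp(1j * np.angle(estimate)) # impose pupil support

    recovered_phase = pupil * np.angle(estimate)

A single in-focus image leaves sign and piston ambiguities in the recovered phase; focus-diverse phase retrieval and phase diversity resolve them by using several images with known differences between them, which is the setting described below.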
My research tests and extends the capabilities of phase retrieval and phase diversity techniques for such systems. For example, conventional phase diversity is performed by capturing multiple images with the system, each differing by a known defocus error. One technique I considered uses the optical hardware already present in segmented and multi-aperture systems to create the phase diversity, by introducing known path-length errors in a subset of the segments or sub-apertures. We found that this technique works approximately as well as conventional focus diversity, but without the extra hardware or detector planes needed to create the diversity.
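The sketch below illustrates the forward model behind that comparison, assuming an idealized pupil built from three circular sub-apertures: one set of diversity images is generated with whole-pupil defocus, the other by pistoning a single sub-aperture by a known amount. The segment layout, piston values, and defocus amounts are illustrative, not the configurations used in the actual study.

    import numpy as np

    N = 256
    y, x = np.mgrid[-N//2:N//2, -N//2:N//2]

    # Illustrative multi-aperture pupil: three circular sub-apertures.
    centers = [(-60, 0), (60, 0), (0, 70)]
    radius = 35
    subaps = [(np.hypot(x - cx, y - cy) <= radius) for cx, cy in centers]
    pupil = np.any(subaps, axis=0).astype(float)

    def image(phase):
        """Noise-free focal-plane intensity for a pupil-plane phase map (radians)."""
        field = pupil * np.exp(1j * phase)
        return np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field))))**2

    wavefront = np.zeros((N, N))   # the unknown aberration the sensing would recover

    # Conventional diversity: known amounts of defocus applied across the full pupil.
    r2 = (x**2 + y**2) / (N // 2)**2
    defocus_images = [image(wavefront + d * pupil * r2) for d in (0.0, 2.0, -2.0)]

    # Sub-aperture diversity: known piston (path-length) offsets applied to one
    # sub-aperture, using actuators the segmented system already has.
    piston_images = [image(wavefront + p * subaps[0].astype(float))
                     for p in (0.0, 1.0, -1.0)]          # radians, illustrative

    # Either set of diversity images can be handed to the same phase-diversity
    # estimator; the known diversity term is what disambiguates the unknown wavefront.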
Another project has been to explore methods of multi-field wavefront sensing for large systems. Telescopes such as the Hubble Space Telescope (HST) and the James Webb Space Telescope (JWST) typically have multiple science instruments in the image plane, each occupying a portion of the field of view. Traditional wavefront sensing methods use only a single, on-axis point source as a beacon, leaving many field-dependent aberrations unsensed and unaccounted for in wavefront correction and image processing.
My project compared two methods of performing multi-field wavefront sensing: one in which the field-dependent aberrations are estimated directly during the phase retrieval process, and one in which they are calculated indirectly from the phase retrieval results afterward. We found that, in practice, the direct method holds an advantage in estimating the aberrations of the system, and that this advantage becomes more pronounced as the signal-to-noise ratio (SNR) decreases.
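As a rough illustration of the distinction, the sketch below implements only the indirect route: per-field-point Zernike estimates (simulated here) are fit after the fact with a low-order polynomial in the field coordinates. The direct route would instead fold those polynomial coefficients into the phase retrieval optimization and estimate them jointly from all field points. The field sampling, number of Zernike terms, and quadratic field-dependence model are all assumptions made for the example.

    import numpy as np

    # Toy sketch of the "indirect" approach: phase retrieval has already produced
    # Zernike-coefficient estimates at several field points (faked here with a
    # quadratic field dependence plus noise), and a low-order polynomial in the
    # field coordinates is then fit to each coefficient after the fact.
    rng = np.random.default_rng(0)

    grid = np.linspace(-1.0, 1.0, 3)
    field_pts = [(fx, fy) for fx in grid for fy in grid]    # 3 x 3 grid of field points
    n_zern = 4                                              # number of Zernike terms (illustrative)

    def design(fx, fy):
        """Quadratic field-dependence model evaluated at a single field point."""
        return np.array([1.0, fx, fy, fx * fy, fx**2, fy**2])

    A = np.array([design(fx, fy) for fx, fy in field_pts])  # (9, 6) design matrix

    # Simulated phase retrieval output: coeffs[k, j] is the j-th Zernike coefficient
    # estimated at field point k (waves, illustrative).
    true_model = rng.normal(size=(6, n_zern))
    coeffs = A @ true_model + 0.01 * rng.normal(size=(len(field_pts), n_zern))

    # Indirect step: least-squares fit of the field dependence of every Zernike term.
    fitted_model, *_ = np.linalg.lstsq(A, coeffs, rcond=None)

    # The fitted model predicts the wavefront at any field position, e.g. the center
    # of a particular science instrument's field of view.
    predicted_coeffs = design(0.5, -0.25) @ fitted_model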
One of the tools used to evaluate these different methods and techniques is the Cramér-Rao bound. Cramér-Rao bounds (CRBs) are information-theoretic quantities that place a lower bound on how well parameters can be estimated from noisy data: specifically, a lower bound on the variance of any unbiased estimator. An advantage of the CRBs is that they do not depend on the estimation algorithm at all, only on the statistics of the noise and on the system itself. The CRBs therefore provide a “lowest common denominator” against which various algorithms and techniques can be compared. The bounds are computed from the Fisher information of the measurement.
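For a parameterized imaging model with a known noise distribution, the bound on each parameter is the corresponding diagonal element of the inverse Fisher information matrix. The sketch below computes it numerically for a two-parameter aberration model with additive Gaussian detector noise; the pupil, aberration basis, operating point, and noise level are all illustrative assumptions.

    import numpy as np

    # Cramer-Rao bound sketch for a simple imaging model with additive Gaussian
    # noise: data = model(theta) + noise, noise ~ N(0, sigma^2 I).  The Fisher
    # information matrix is F = J^T J / sigma^2, where J is the Jacobian of the
    # model with respect to the parameters, and the CRB on each parameter is the
    # corresponding diagonal element of the inverse of F.
    N = 128
    y, x = np.mgrid[-N//2:N//2, -N//2:N//2]
    r = np.hypot(x, y) / (N // 4)
    pupil = (r <= 1.0).astype(float)

    # Two-term aberration basis (defocus-like and astigmatism-like maps).
    basis = np.stack([pupil * (2 * r**2 - 1),
                      pupil * (x**2 - y**2) / (N // 4)**2])

    def model(theta):
        """Flattened focal-plane intensity as a function of aberration coefficients."""
        phase = np.tensordot(theta, basis, axes=1)
        field = pupil * np.exp(1j * phase)
        return np.abs(np.fft.fft2(field)).ravel()**2

    theta0 = np.array([0.1, -0.2])   # operating point (radians, illustrative)
    sigma = 1.0                      # detector noise standard deviation (illustrative)

    # Jacobian of the model by central finite differences.
    eps = 1e-5
    J = np.stack([(model(theta0 + eps * e) - model(theta0 - eps * e)) / (2 * eps)
                  for e in np.eye(len(theta0))], axis=1)

    fisher = J.T @ J / sigma**2
    crb = np.diag(np.linalg.inv(fisher))   # lower bound on the variance of any
                                           # unbiased estimate of each coefficient

For photon-noise-limited (Poisson) data the Fisher information takes a different form, so the Gaussian noise model here is one of the stated assumptions rather than a feature of the method.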
In addition to these projects, I support the Wavefront Sensing & Control group at NASA Goddard Space Flight Center under the direction of Bruce Dean as part of my Graduate Student Research Program (GSRP) Fellowship. The support consists of algorithm development and data processing for various projects related to the James Webb Space Telescope.