During the “Day of Our Science” event, Dr. Auxiliadora Padrón, a researcher at the LIOM Project with a PhD in Photonics, presented results on the use of a short multimode fiber as a focal-plane wavefront sensor, a technology with potential application in the SELF/ELF telescope system.

The approach involves placing a multimode fiber at the focal plane of the telescope, so that light from a celestial object couples directly into the fiber. A camera then captures the pattern at the fiber’s output, which is produced by interference between the fiber’s guided modes and therefore depends on both the amplitude and the phase of the incoming light field.
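To make this sensing principle concrete, here is a minimal numerical sketch, a toy model rather than the setup from the presentation: it assumes Hermite-Gaussian profiles as stand-ins for the fiber’s true guided modes, and a fixed set of random per-mode phases to represent propagation through a short fiber.

```python
import numpy as np
from numpy.polynomial.hermite import hermval

rng = np.random.default_rng(0)

N = 64                              # pixels per side for both planes
x = np.linspace(-3, 3, N)
X, Y = np.meshgrid(x, x)

def hg_mode(n, m):
    """Hermite-Gaussian profile, used here as a stand-in for a fiber mode."""
    cn = np.zeros(n + 1); cn[n] = 1.0
    cm = np.zeros(m + 1); cm[m] = 1.0
    mode = hermval(X, cn) * hermval(Y, cm) * np.exp(-(X**2 + Y**2) / 2)
    return mode / np.sqrt(np.sum(mode**2))

# A small basis of 16 modes; light not coupled into these is simply discarded.
modes = np.array([hg_mode(n, m) for n in range(4) for m in range(4)])

# Fixed random per-mode phases: a crude model of propagation through the fiber.
beta = rng.uniform(0.0, 2.0 * np.pi, len(modes))

def fiber_output_intensity(pupil_phase):
    """Toy chain: pupil phase -> focal-plane field -> modal coupling
    -> propagation -> interference pattern at the fiber output."""
    pupil = np.exp(1j * pupil_phase)                 # unit-amplitude pupil
    focal = np.fft.fftshift(np.fft.fft2(pupil))      # field at the focal plane
    focal /= np.sqrt(np.sum(np.abs(focal) ** 2))
    coeffs = np.tensordot(modes, focal, axes=([1, 2], [0, 1]))  # complex mode amplitudes
    out = np.tensordot(coeffs * np.exp(1j * beta), modes, axes=(0, 0))
    return np.abs(out) ** 2                          # the camera records intensity only

# Example: a pure tilt in the pupil already reshapes the output pattern.
speckle = fiber_output_intensity(0.5 * X)
```

The point the toy model captures is that the complex coupling coefficients, and hence the interference pattern on the camera, change when only the input phase changes, even though the camera records intensity alone.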

By training a convolutional neural network on simulated data, it is possible to learn the relationship between the wavefront phase at the pupil plane and the observed intensity patterns. Once trained, the network can reconstruct the pupil phase from the output images, with the best accuracy for low-order aberrations.
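As a rough illustration of this learning step, the sketch below uses PyTorch to regress a small vector of low-order aberration coefficients from a simulated output image; the name SpeckleToPhaseNet, the layer sizes, and the choice of ten coefficients are illustrative assumptions, not details from the presented work.

```python
import torch
import torch.nn as nn

class SpeckleToPhaseNet(nn.Module):
    """Illustrative CNN: maps one fiber-output intensity image to a small
    vector of low-order aberration coefficients (e.g., Zernike terms).
    The architecture is an assumption, not the one from the presented work."""

    def __init__(self, n_coeffs: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, n_coeffs),        # regress the phase coefficients
        )

    def forward(self, x):
        return self.head(self.features(x))

# One training step on placeholder data standing in for simulated pairs of
# (output intensity image, pupil-phase coefficients).
model = SpeckleToPhaseNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

images = torch.randn(32, 1, 64, 64)      # placeholder batch of output images
targets = torch.randn(32, 10)            # placeholder aberration coefficients

loss = loss_fn(model(images), targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Here the placeholder tensors stand in for simulated training pairs; regressing a handful of coefficients, rather than a full phase map, is one common way to frame such reconstructions and matches the observation that low-order terms are recovered most reliably.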

Although this research was initially developed for conventional telescopes, and not yet specifically for distributed apertures like SELF, it represents a key advance in the compensation of atmospheric aberrations. Placing the sensor at the focal plane avoids non-common-path errors, so all the distortions present in the science optical path can be sensed.

This development addresses one of the current limitations of extreme adaptive optics: compensating for atmospheric turbulence with the high precision required to take direct images of exoplanets and, ultimately, to search for signs of life beyond Earth.