Developing cameras to stream live footage from the ISS
Rutherford Appleton Laboratory (RAL) is part of the Science and Technology Facilities Council (STFC) and is focused on space research and technology development. The laboratory has been involved in more than 200 space missions and has a reputation for designing, building and testing satellite instrumentation.
By Mike Salter, STFC RAL Space.
Learn how they are developing and testing two cameras that will stream unprecedented images and video footage of planet Earth from space. The cameras are bolted to a prepared rig on the ISS, which orbits the Earth at an altitude of 400 km. The cameras will capture video and imagery below the Station's orbit, where approximately 90% of the world's population lives. The objective is to give everyone the chance to see, in near-real time, an astronaut's view of our planet by broadcasting the footage via a commercial website. This unique view of Earth may aid applications as diverse as disaster relief and monitoring deforestation.
We used LabVIEW and the NI PXI platform to develop and test the cameras. FlexRIO FPGA technology provided the means to retrieve and reconstruct image data in real time from the sensors inside the cameras.
Technical requirements associated with camera technology
The High Resolution Camera (HRC) electronics output the raw video data over a bespoke parallel data interface, with a total data rate of more than 800 Mb/s. The Electrical Ground Support Equipment (EGSE) system needed to be capable of capturing and recording this raw data stream so that engineers could later analyse the images.
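To put that data rate in context, a quick back-of-envelope calculation shows the sustained storage throughput the EGSE must handle. The 800 Mb/s figure comes from the text; the 10-minute recording duration is purely an illustrative assumption.

```python
# Sustained throughput implied by the raw video stream.
RAW_RATE_MBPS = 800                       # megabits per second, from the camera spec

bytes_per_second = RAW_RATE_MBPS * 1_000_000 // 8
print(bytes_per_second)                   # 100000000, i.e. 100 MB/s

# A hypothetical 10-minute raw capture at that rate would need:
capacity_gb = bytes_per_second * 600 / 1e9
print(capacity_gb)                        # 60.0 GB
```

A sustained 100 MB/s is comfortably beyond a typical spinning disk of the era, which is one reason a solid-state storage module makes sense here.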
The data from the camera is presented as a set of parallel differential pairs containing the pixel data and additional flags to indicate the beginning and end of lines within the image.
Figure 1 - high-resolution video camera
A PXIe-7966R FPGA module, combined with an NI 6585 adapter module, captures data. We used the LabVIEW FPGA Module to process some of the incoming data stream in real time. The FPGA code looked at flags within the data that delimit where image frames start and end, and was able to split the continuous stream into individual images and discard the meaningless data in between. All valid pixel values are placed into memory [a DMA First-In First-Out (FIFO) memory buffer] and transferred over the backplane of a PXIe-1082 chassis to the host Virtual Instrument (VI) running on a PXIe-8133 embedded controller. A second FIFO reports to the host VI the number of valid pixels contained in each image.
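The frame-splitting logic described above can be sketched in software. This is a minimal illustration, not the actual LabVIEW FPGA code: it scans a word stream for start- and end-of-frame flags, keeps the pixels between them, and discards the filler in between. The `SOF`/`EOF` marker values are hypothetical placeholders for the real interface flags.

```python
# Hypothetical frame delimiters standing in for the camera's real flags.
SOF, EOF = "SOF", "EOF"

def split_frames(stream):
    """Yield lists of pixel values found between SOF and EOF markers."""
    frame = None
    for word in stream:
        if word == SOF:
            frame = []                 # start collecting a new image
        elif word == EOF:
            if frame is not None:
                yield frame            # emit the completed image
            frame = None
        elif frame is not None:
            frame.append(word)         # valid pixel inside a frame
        # words outside SOF/EOF are meaningless filler and are dropped

stream = [7, SOF, 1, 2, 3, EOF, 9, 9, SOF, 4, 5, EOF]
print(list(split_frames(stream)))      # [[1, 2, 3], [4, 5]]
```

On the real system this filtering runs on the FPGA so that only valid pixels ever cross the chassis backplane, with a second FIFO carrying the per-image pixel counts.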
The host VI retrieves each set of image data from the DMA FIFO and saves it to an NI 8260 high-speed data storage module (solid-state drive). The host VI also displays a graphical user interface so that the user can specify file paths and select whether to save the complete raw data stream or to split valid images into individual files. This proved beneficial during testing, as engineers could simply command the EGSE system to capture a single frame, rather than recording a section of the data stream and manually extracting the individual image later. The core functionality, such as file path selection, was implemented very quickly in LabVIEW so that we could focus our efforts on the application-specific challenges.
We used the LabVIEW Vision Development Module to implement features such as debayering and displaying the image data in real time, which gave a very quick visual confirmation that the hardware was operating correctly.
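To illustrate what debayering involves, here is a deliberately simple 2x2-binning demosaic for an RGGB Bayer mosaic: each 2x2 block (R, G on the top row; G, B below) collapses to one RGB pixel, averaging the two green samples. This is a much cruder scheme than the interpolation the Vision Development Module performs, and the RGGB layout is an assumption for illustration only.

```python
def debayer_rggb(mosaic):
    """mosaic: 2D list with even dimensions; returns a half-size RGB image."""
    out = []
    for r in range(0, len(mosaic), 2):
        row = []
        for c in range(0, len(mosaic[0]), 2):
            red   = mosaic[r][c]                          # top-left of block
            green = (mosaic[r][c + 1] + mosaic[r + 1][c]) / 2   # two G samples
            blue  = mosaic[r + 1][c + 1]                  # bottom-right
            row.append((red, green, blue))
        out.append(row)
    return out

mosaic = [[255, 10],
          [20, 128]]
print(debayer_rggb(mosaic))   # [[(255, 15.0, 128)]]
```

The value of doing this live during testing is exactly as the text says: a recognisable colour image on screen is an immediate sanity check that the sensor, interface and capture chain are all working.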
Having invested in other PXI modular instruments, we quickly found further problems we could solve with them. For example, the camera included a PCB that crossed over signals between multiple connectors. We had to qualify this design by putting the unit through dozens of thermal cycles while monitoring signal connectivity on the PCB. Using LabVIEW with a PXI digital multimeter and a PXI switch module, we quickly automated this process so that we could test connectivity throughout the qualification run, rather than manually at the end.
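The structure of that automated check can be sketched as a simple loop: for each net on the PCB, command the switch to route it to the DMM, take a resistance reading, and compare against a pass/fail limit. The functions `close_route` and `measure_resistance_ohms` below are hypothetical stand-ins for the real PXI switch and DMM driver calls, and the threshold and net names are assumptions.

```python
OPEN_THRESHOLD_OHMS = 10.0    # assumed pass/fail limit for a closed connection

def close_route(net):
    """Stub: would command the PXI switch module to route `net` to the DMM."""
    pass

def measure_resistance_ohms(net):
    """Stub: would query the PXI DMM; here it simulates a healthy connection."""
    return 0.5

def check_connectivity(nets):
    """Return {net: passed} for every net under test."""
    results = {}
    for net in nets:
        close_route(net)
        ohms = measure_resistance_ohms(net)
        results[net] = ohms < OPEN_THRESHOLD_OHMS
    return results

print(check_connectivity(["J1.1-J2.1", "J1.2-J2.2"]))
# {'J1.1-J2.1': True, 'J1.2-J2.2': True}
```

Running a loop like this at every thermal cycle is what turns an end-of-test manual check into continuous monitoring, so an intermittent open caused by thermal stress is caught at the cycle where it first appears.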
The combination of LabVIEW software with FlexRIO FPGA technology gave the test hardware the throughput needed to capture the image envelopes at the required rates, and let us decide in real time what to keep and what to discard. By adopting the NI platform, we were able to undertake technology test development work more rapidly than if we had built bespoke validation and verification test equipment. We were also able to adapt quickly to changing requirements and accommodate more than we initially set out to test.
Figure 2 - high-resolution camera engineering team
It was important that we kept to the scheduled launch date for the space mission and ensured that the camera technology was available to our commercial partners on time. We were pleased to have completed camera development for this programme in only two years. The NI platform helped the engineering team reduce the time and effort needed to deliver these results.
Figure 3 - installation of HRC on the ISS
Learn more about the NI PXI platform, our Vision software and NI space programmes.