Showing papers by "G. Maggio" published in 2009


Journal ArticleDOI
TL;DR: In this article, the authors quantify the level of distortion introduced by the on-board processing as a function of its tuning parameters and propose a method of tuning the on-board processing chain to cope with the limited bandwidth while keeping the signal distortion to a minimum.
Abstract: To assess stability against 1/f noise, the Low Frequency Instrument (LFI) on-board the Planck mission will acquire data at a rate much higher than the data rate allowed by the science telemetry bandwidth of 35.5 kbps. The data are processed by an on-board pipeline, followed on the ground by a decoding and reconstruction step, to reduce the volume of data to a level compatible with the bandwidth while minimizing the loss of information. This paper illustrates the on-board processing of the scientific data used by Planck/LFI to fit the allowed data rate, an intrinsically lossy process which distorts the signal in a manner which depends on a set of five free parameters (including Naver, r1, r2, and q) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the on-board processing as a function of these parameters. It describes the method of tuning the on-board processing chain to cope with the limited bandwidth while keeping the signal distortion to a minimum. Tuning is sensitive to the statistics of the signal and has to be constantly adapted during flight. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, pre-launch tests, or data taken in flight from LFI operating in a special diagnostic acquisition mode. All the needed optimization steps are performed by an automated tool, OCA2, which simulates the on-board processing, explores the space of possible combinations of parameters, and produces a set of statistical indicators, among them the compression rate Cr and the processing noise Q. For Planck/LFI it is required that Cr = 2.4 while, as for other systematics, Q has to be less than 10% of the rms of the instrumental white noise. An analytical model is developed that is able to extract most of the relevant information on the processing errors and the compression rate as a function of the signal statistics and the processing parameters to be tuned. This model will be of interest for the instrument data analysis to assess the level of signal distortion introduced in the data by the on-board processing. The method was applied during ground tests when the instrument was operating in conditions representative of flight. Optimized parameters were obtained and inserted in the on-board processor, and the performance has been verified against the requirements, with the result that the required data rate of 35.5 kbps has been achieved while keeping the processing error at a level of 3.8% of the instrumental white noise, well below the target 10% level.
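The abstract does not spell out the processing chain itself, so the following is only a minimal sketch of the kind of lossy averaging/mixing/requantization scheme that parameters such as Naver, r1, r2 and q suggest. The function names, the offset handling and the reconstruction step are illustrative assumptions, not the flight algorithm or the OCA2 tool; the sketch only shows how a processing-noise indicator analogous to Q can be measured against the white-noise rms.

```python
import numpy as np

def onboard_process(sky, ref, n_aver, r1, r2, q, offset=0.0):
    """Toy lossy chain: average n_aver samples, mix sky/reference with r1 and r2,
    requantize with step q. Returns two integer streams (lossless coding would follow)."""
    n = len(sky) // n_aver * n_aver
    sky = sky[:n].reshape(-1, n_aver).mean(axis=1)
    ref = ref[:n].reshape(-1, n_aver).mean(axis=1)
    p1, p2 = sky - r1 * ref, sky - r2 * ref            # mixing step (assumed form)
    return np.round((p1 - offset) / q), np.round((p2 - offset) / q)

def ground_reconstruct(q1, q2, r1, r2, q, offset=0.0):
    """Invert mixing and quantization on the ground (exact up to quantization error)."""
    p1, p2 = q1 * q + offset, q2 * q + offset
    ref = (p1 - p2) / (r2 - r1)
    return p1 + r1 * ref, ref                          # (sky, reference)

# Processing-noise indicator: rms of the reconstruction error expressed as a
# fraction of the white-noise rms of the averaged signal (fake data throughout).
rng = np.random.default_rng(0)
sky = 1.0 + 1e-3 * rng.standard_normal(40_000)
ref = 1.0 + 1e-3 * rng.standard_normal(40_000)
pars = dict(r1=1.0, r2=0.5, q=5e-5)
q1, q2 = onboard_process(sky, ref, n_aver=4, **pars)
sky_rec, _ = ground_reconstruct(q1, q2, **pars)
sky_avg = sky.reshape(-1, 4).mean(axis=1)
print(np.std(sky_rec - sky_avg) / np.std(sky_avg))
```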

19 citations


Journal ArticleDOI
TL;DR: The LIFE software suite has been successfully used during the RCA/RAA tests and the Planck Integrated System Tests, and passed the verification for its in-flight use during the System Operations Verification Tests, held in October 2008.
Abstract: The Planck Low Frequency Instrument (LFI) is an array of 22 pseudo-correlation radiometers on-board the Planck satellite to measure temperature and polarization anisotropies in the Cosmic Microwave Background (CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performance of the LFI, a software suite named LIFE has been developed. Its aim is to provide a common platform for analyzing the results of the tests performed on the individual components of the instrument (the Radiometric Chain Assemblies, RCAs) and on the integrated Radiometric Array Assembly (RAA). Moreover, its analysis tools are designed to be used during flight as well, to produce periodic reports on the status of the instrument. The LIFE suite has been developed using a multi-layered, cross-platform approach. It implements a number of analysis modules written in RSI IDL, each accessing the data through a portable and heavily optimized library of functions written in C and C++. One of the most important features of LIFE is its ability to run the same data analysis codes using either ground test data or real flight data as input. The LIFE software suite has been successfully used during the RCA/RAA tests and the Planck Integrated System Tests. Moreover, the software has also passed the verification for its in-flight use during the System Operations Verification Tests, held in October 2008.
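The key design point, that the same analysis code runs unchanged on ground-test and flight data, can be illustrated with a small sketch. The class and method names below are invented for the example; the real LIFE suite is built from IDL modules over a C/C++ data-access library, not Python, so this only shows the layering idea.

```python
from abc import ABC, abstractmethod
import numpy as np

class DataSource(ABC):
    """Common data-access interface: the analysis layer never knows the data origin."""
    @abstractmethod
    def load_timeline(self, channel: str) -> np.ndarray: ...

class GroundTestSource(DataSource):
    def __init__(self, path): self.path = path
    def load_timeline(self, channel):
        # format-specific code for reading RCA/RAA test archives lives here
        return np.loadtxt(f"{self.path}/{channel}.txt")

class FlightSource(DataSource):
    def __init__(self, archive): self.archive = archive
    def load_timeline(self, channel):
        # format-specific code for querying the decoded flight telemetry lives here
        return self.archive.query(channel)

def noise_report(source: DataSource, channel: str) -> dict:
    """An analysis module: identical whether run on ground tests or in flight."""
    x = source.load_timeline(channel)
    return {"channel": channel, "mean": float(x.mean()), "rms": float(x.std())}
```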

18 citations


Journal ArticleDOI
TL;DR: In this article, a real-time assessment system for the Planck Low Frequency Instrument (LFI) is presented, based on the ESA SCOS 2000 generic mission control system, with the main purpose of monitoring the housekeeping parameters of LFI and detecting possible anomalies.
Abstract: The Planck Low Frequency Instrument (LFI) will observe the Cosmic Microwave Background (CMB) by covering the frequency range 30-70 GHz in three bands. The primary source of instrument data is the temperature samples acquired by the 22 radiometers mounted on the Planck focal plane. Such samples represent the scientific data of LFI. In addition, the LFI instrument generates the so-called housekeeping data by regularly sampling the on-board sensors and registers. The housekeeping data provide information on the overall health status of the instrument and on the scientific data quality. The scientific and housekeeping data are collected on-board into telemetry packets compliant with the ESA Packet Telemetry standards. They represent the primary input to the first processing level of the LFI Data Processing Centre. In this work we describe the software systems which make up the LFI Level 1. A real-time assessment system, based on the ESA SCOS 2000 generic mission control system, has the main purpose of monitoring the housekeeping parameters of LFI and detecting possible anomalies. A telemetry handler system processes the housekeeping and scientific telemetry of LFI, generating timelines for each acquisition chain and each housekeeping parameter. Such timelines represent the main input to the subsequent processing levels of the LFI DPC. A telemetry quick-look system allows the real-time visualization of the LFI scientific and housekeeping data, while also computing quick statistical functions and fast Fourier transforms. The LFI Level 1 has been designed to support all the mission phases, from the instrument ground tests and calibration to the flight operations, and developed according to the ESA engineering standards.
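As a rough illustration of two of the Level 1 functions described above, checking housekeeping parameters against limits and computing quick-look statistics and spectra, here is a minimal sketch. The limit values, function names and data layout are invented for the example and do not reflect the actual SCOS 2000 configuration or the LFI quick-look tool.

```python
import numpy as np

def check_limits(timeline, soft=(0.0, 1.0), hard=(-0.5, 1.5)):
    """Flag housekeeping samples outside soft/hard limits (basic anomaly detection)."""
    flags = np.zeros(len(timeline), dtype="U4")
    flags[(timeline < soft[0]) | (timeline > soft[1])] = "SOFT"
    flags[(timeline < hard[0]) | (timeline > hard[1])] = "HARD"
    return flags

def quick_look(timeline, sampling_hz):
    """Quick statistics and an amplitude spectrum for a real-time display."""
    freqs = np.fft.rfftfreq(len(timeline), d=1.0 / sampling_hz)
    spectrum = np.abs(np.fft.rfft(timeline - timeline.mean()))
    stats = {"mean": timeline.mean(), "rms": timeline.std(),
             "min": timeline.min(), "max": timeline.max()}
    return stats, freqs, spectrum
```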

8 citations


Journal ArticleDOI
TL;DR: This article describes Rachel, a software application purposely developed and used during the RCA test campaign to carry out both near-real-time online data analysis and storage (in FITS format) of the raw output from the radiometric chains.
Abstract: Planck's Low Frequency Instrument is an array of 22 pseudo-correlation radiometers at 30, 44, and 70 GHz. Before integrating the overall array assembly, a first set of tests was performed on each radiometer chain assembly (RCA), consisting of two radiometers. In this paper, we describe Rachel, a software application which has been purposely developed and used during the RCA test campaign to carry out both near-real-time online data analysis and storage (in FITS format) of the raw output from the radiometric chains.
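To give an idea of what storing a raw radiometric timeline in FITS format involves, the snippet below writes a time/signal binary table with astropy. The column names, header keyword and file layout are placeholders, not the format actually used by Rachel, and the data are fake.

```python
import numpy as np
from astropy.io import fits

def save_raw_timeline(filename, time_s, volts, channel_name):
    """Store one raw radiometer output stream as a FITS binary table."""
    cols = fits.ColDefs([
        fits.Column(name="TIME", format="D", unit="s", array=time_s),
        fits.Column(name="SIGNAL", format="E", unit="V", array=volts),
    ])
    hdu = fits.BinTableHDU.from_columns(cols, name="RAWDATA")
    hdu.header["CHANNEL"] = channel_name      # which radiometric chain produced the data
    fits.HDUList([fits.PrimaryHDU(), hdu]).writeto(filename, overwrite=True)

# Example: a few seconds of fake data from one detector of an RCA
t = np.arange(0.0, 5.0, 1e-3)
save_raw_timeline("rca_output.fits", t, np.random.normal(size=t.size), "LFI28M-00")
```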

6 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission, where the on-board and ground processing are viewed as a single pipeline, and demonstrate that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
Abstract: The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which has to adhere strictly to the project schedule to be ready for launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the Verification and Validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce nominal conditions as closely as possible. For the housekeeping (HK) telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and perform a comparison with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we have demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
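The housekeeping validation strategy described above, injecting known parameter values and comparing the Level 1 output timelines against them, ultimately reduces to a check of the kind sketched below. The packet-injection machinery, the real parameter names and the actual calibration curves are omitted; the helper and its example calibration are assumptions made only to show the comparison logic.

```python
import numpy as np

def validate_hk_timeline(injected_values, decoded_timeline,
                         calibration=lambda raw: raw, tolerance=0.0):
    """Compare a Level 1 output timeline with the values injected into the HK packets.

    injected_values  -- raw values written into the housekeeping packets
    decoded_timeline -- the corresponding timeline produced by the processing chain
    calibration      -- expected raw -> engineering-unit conversion, if any
    """
    expected = np.asarray([calibration(v) for v in injected_values], dtype=float)
    got = np.asarray(decoded_timeline, dtype=float)
    if expected.shape != got.shape:
        return False, "sample count mismatch"
    errors = np.abs(got - expected)
    return bool(np.all(errors <= tolerance)), f"max deviation {errors.max():.3g}"

# e.g. a hypothetical parameter calibrated as temperature = 0.01 * raw + 250.0
ok, msg = validate_hk_timeline([100, 200, 300], [251.0, 252.0, 253.0],
                               calibration=lambda raw: 0.01 * raw + 250.0)
print(ok, msg)
```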

5 citations