Handling large data-sets: a JVLA study case

Main Author: D'Amato, Quirino
Format: Proceeding
Published: 2019
Online Access: https://zenodo.org/record/3484648
Table of Contents:
  • The z=6.31 SDSS J1030+0524 Quasar (QSO) field hosts the most promising candidate galaxy overdensity at z ∼ 6.3, likely associated with the QSO (Balmaverde+17). As such, it has been targeted by numerous campaigns over the past years, which have led to unprecedented multi-band (X-ray, optical, IR, sub-mm/mm, radio) coverage, making this field one of the best laboratories in which to investigate the high-redshift Universe (z>2-3). As part of the observing program we obtained new deep (rms 1.5 μJy/beam) VLA continuum (1.4 GHz) observations (PI: I. Prandoni), aiming to study the nature of high-z Radio Quiet (RQ) AGN emission and to investigate its relation to the X-ray emission. The large size of the dataset (~3.5 TB), split into 11 observations carried out over a period of a month, represents a challenge in both the reduction and imaging phases, due to the large storage and high computational power required. Moreover, the VLA L-band (1-2 GHz) is the band most affected by Radio Frequency Interference (RFI), which constitutes an additional complication in the reduction of the dataset. In this talk I present the issues related to the current handling of big data, highlighting the need for effective pipeline development and showing possible strategies to address flagging, calibration and imaging in a case such as this.
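Since the dataset is split into 11 independent observing epochs, one practical strategy for the reduction phase is to process each epoch's measurement set separately and in parallel, keeping per-job storage and memory needs to roughly 3.5 TB / 11 ≈ 0.3 TB. A minimal Python sketch of this orchestration is shown below; the `reduce_epoch` stub and the file names are hypothetical placeholders (in a real CASA-based pipeline each call would run the standard flagging, calibration and imaging tasks):

```python
from concurrent.futures import ThreadPoolExecutor

def reduce_epoch(ms_path: str) -> str:
    # Hypothetical placeholder for one epoch's reduction chain.
    # In a CASA pipeline this step would perform RFI flagging,
    # gain/bandpass calibration, and imaging of the measurement set.
    return f"{ms_path}.calibrated"

# 11 observations, as in the dataset described above (names are illustrative).
epochs = [f"obs_{i:02d}.ms" for i in range(1, 12)]

# Reduce the epochs concurrently; each job only touches ~0.3 TB of data,
# so the peak storage footprint stays tractable on a single node.
with ThreadPoolExecutor(max_workers=4) as pool:
    calibrated = list(pool.map(reduce_epoch, epochs))
```

The per-epoch products can then be combined in a final joint imaging step once all epochs are calibrated.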