Low-cost light-scattering particulate matter sensors are often advocated for dense monitoring networks, and recent literature has focused on evaluating their performance. Nonetheless, low-cost sensors are also considered unreliable and imprecise; consequently, techniques for anomaly detection, resilient calibration, and data quality improvement deserve more attention. In this study, we analyze a year-long acquisition campaign in which 56 low-cost light-scattering sensors were positioned near the inlet of an official particulate matter monitoring station. We use the collected measurements to design and test a data processing pipeline composed of several stages, including fault detection, filtering, outlier removal, and calibration. The pipeline is suited to large-scale deployment scenarios in which the volume of sensor data is too high to be analyzed manually. Our framework also exploits sensor redundancy to improve reliability and accuracy. Our results show that the proposed data processing framework produces more reliable measurements, reduces errors, and increases the correlation with the official reference.
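As a rough illustration of the staged structure described above, a pipeline of this kind might be organized as in the sketch below. This is not the authors' implementation: the fault criterion, the outlier rule, the aggregation choice, and all function names and thresholds (detect_faults, remove_outliers, calibrate, stuck_window, z_thresh) are hypothetical placeholders.

```python
import numpy as np

def detect_faults(readings, stuck_window=12):
    """Flag a sensor whose output is constant over a recent window (hypothetical fault criterion)."""
    readings = np.asarray(readings, dtype=float)
    return len(readings) >= stuck_window and np.ptp(readings[-stuck_window:]) == 0

def remove_outliers(values, z_thresh=3.0):
    """Discard values far from the median of the redundant sensor group (hypothetical robust-z rule)."""
    values = np.asarray(values, dtype=float)
    med = np.median(values)
    mad = np.median(np.abs(values - med)) or 1e-9  # avoid division by zero
    robust_z = 0.6745 * (values - med) / mad
    return values[np.abs(robust_z) < z_thresh]

def calibrate(value, slope=1.0, intercept=0.0):
    """Apply a linear correction fitted against the reference station (coefficients are placeholders)."""
    return slope * value + intercept

def process_window(sensor_histories, slope=1.0, intercept=0.0):
    """Run the staged pipeline on one time window of redundant sensor readings."""
    # 1. Fault detection: drop sensors that look stuck or dead.
    healthy = [h[-1] for h in sensor_histories if not detect_faults(h)]
    if not healthy:
        return None  # no trustworthy reading for this window
    # 2-3. Filtering / outlier removal across the redundant group.
    kept = remove_outliers(healthy)
    if kept.size == 0:
        return None
    # 4. Calibration of the aggregated (median) value against the reference.
    return calibrate(float(np.median(kept)), slope, intercept)
```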