This paper considers large-scale OneMax and RoyalRoad problems with up to 10^7 binary variables within a compact Estimation of Distribution Algorithm (EDA) framework. Building upon the compact Genetic Algorithm (cGA), the continuous-domain Population-Based Incremental Learning algorithm (PBILc), and the arithmetic-coding EDA, we define a novel method that compactly solves regular and noisy versions of these problems with minimal memory requirements, regardless of problem or population size. This feature allows the algorithm to be run on a conventional desktop machine. Issues regarding probability-model sampling, arbitrary precision of the arithmetic-coding decompression scheme, incremental fitness-function evaluation, and updating rules for compact learning are presented and discussed.
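As background, the baseline cGA that the proposed method builds on can be sketched as follows. This is a minimal illustration of the standard cGA update rule on OneMax, not the paper's compact method; the parameters `n` (problem size) and `pop_size` (virtual population size) are placeholders chosen for the example.

```python
import random

def onemax(x):
    # OneMax fitness: the number of ones in the bit string.
    return sum(x)

def cga(n=50, pop_size=100, max_iters=20000, seed=1):
    """Standard compact GA: stores only one probability per bit."""
    rng = random.Random(seed)
    # Probability model initialized to 0.5 for every bit.
    p = [0.5] * n
    for _ in range(max_iters):
        # Sample two candidate solutions from the probability model.
        a = [1 if rng.random() < pi else 0 for pi in p]
        b = [1 if rng.random() < pi else 0 for pi in p]
        # Tournament between the two samples.
        winner, loser = (a, b) if onemax(a) >= onemax(b) else (b, a)
        # Shift each probability toward the winner by 1/pop_size
        # wherever the two samples disagree.
        for i in range(n):
            if winner[i] != loser[i]:
                if winner[i] == 1:
                    p[i] = min(1.0, p[i] + 1.0 / pop_size)
                else:
                    p[i] = max(0.0, p[i] - 1.0 / pop_size)
        # Stop once the model has fully converged.
        if all(pi in (0.0, 1.0) for pi in p):
            break
    return p

p = cga()
```

Note that the memory footprint of this baseline is one probability per variable; the paper's contribution concerns keeping memory minimal even as the problem size grows far beyond this.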