Remote camera-based measurement of physiology has great potential for healthcare and affective computing. Recent advances in computer vision and signal processing have enabled photoplethysmography (PPG) measurement using commercially available cameras. However, challenges remain in recovering accurate non-contact PPG measurements in the presence of rigid head motion. When a subject is moving, their face may be turned away from one camera, be obscured by an object, or move out of the frame, resulting in missing observations. As the calculation of pulse rate variability (PRV) requires analysis over a time window of several minutes, the effect of missing observations on such features is deleterious. We present an approach for fusing partial color-channel signals from an array of cameras, enabling physiological measurements from moving subjects even when they leave the frame of one or more cameras, measurements that would not be possible with only a single camera. We systematically test our method on subjects (N=25) using six 5-minute tasks (each repeated twice) involving different levels of head motion, resulting in validation across 25 hours of measurement. We evaluate pulse rate and PRV parameter estimation, including statistical, geometric, and frequency-based measures. The median absolute error in pulse rate measurements was 0.57 beats-per-minute (BPM). In all but the two tasks with the greatest motion, the median error was within 0.4 BPM of that from a contact PPG device. PRV estimates were significantly improved using our proposed approach compared to an alternative not designed to handle missing values and multiple camera signals; errors were reduced by over 50%. Without our proposed method, errors in pulse rate would be very high, and estimation of PRV parameters would not be feasible due to significant data loss.
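To illustrate the core idea of fusing partial signals with missing observations, the following is a minimal sketch, not the paper's actual algorithm: per-camera signals are marked NaN wherever the face is not visible to that camera, and a NaN-aware mean recovers a complete signal whenever at least one camera observes the subject. The function name and data layout here are illustrative assumptions.

```python
import numpy as np

def fuse_camera_signals(signals):
    """Fuse per-camera pulse signal estimates into one signal.

    `signals` is a (num_cameras, num_samples) array in which a sample
    is NaN whenever the face was not visible to that camera.  A
    NaN-aware mean yields a valid estimate as long as at least one
    camera observes the subject at that sample.  (Illustrative sketch
    only; the paper fuses partial color-channel signals.)
    """
    signals = np.asarray(signals, dtype=float)
    fused = np.nanmean(signals, axis=0)
    # Samples missing from every camera remain NaN and would still
    # need handling (e.g., interpolation) before PRV analysis.
    return fused

# Two cameras, each with a dropout; fusion fills every sample.
cam_a = [1.0, np.nan, 3.0, 4.0]
cam_b = [1.0, 2.0, np.nan, 4.0]
print(fuse_camera_signals([cam_a, cam_b]))  # → [1. 2. 3. 4.]
```

With a single camera, the gap at each dropout would propagate into any multi-minute PRV window; fusion across cameras is what keeps such windows intact.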