Brain–machine interface (BMI) researchers have traditionally focused on modeling endpoint reaching tasks to enable control of neurally driven prosthetic arms. Most previous research has pursued endpoint control through a Cartesian-coordinate-centered approach; however, a joint-centered approach could potentially support intuitive control of a wider range of limb movements. We systematically investigated the feasibility of discriminating between flexion and extension of different upper limb joints using electrocorticography (ECoG) recordings from sensorimotor cortex. Four subjects implanted with macro-ECoG (10-mm spacing), high-density ECoG (5-mm spacing), and/or micro-ECoG arrays (0.9-mm spacing, 4 mm × 4 mm coverage) performed randomly cued flexions or extensions of the fingers, wrist, or elbow contralateral to the implanted hemisphere. We trained a linear model to classify the six movements using averaged high-gamma power (70–110 Hz) modulations at different latencies relative to movement onset, within a time interval restricted to flexion or extension at each joint. Offline decoding models for each subject classified these movements with accuracies of 62%–83%. Our results suggest that widespread ECoG coverage of sensorimotor cortex could allow a whole-limb BMI to sample native cortical representations and thereby control flexion and extension at multiple joints.
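To make the described pipeline concrete, the sketch below illustrates one plausible way to extract averaged high-gamma (70–110 Hz) power features and fit a linear classifier for six movement classes. It is a minimal illustration, not the authors' implementation: the sampling rate, channel count, window length, filter order, the use of a single window rather than multiple latencies, and the choice of linear discriminant analysis as the linear model are all assumptions, and the data are synthetic placeholders.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 1000  # sampling rate in Hz (assumed; not stated in the abstract)

def high_gamma_power(trials, fs=FS, band=(70.0, 110.0)):
    """Average high-gamma (70-110 Hz) power per channel for each trial.

    trials: array of shape (n_trials, n_channels, n_samples),
            ECoG segments aligned to movement onset.
    Returns features of shape (n_trials, n_channels).
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="bandpass")
    filtered = filtfilt(b, a, trials, axis=-1)  # zero-phase band-pass
    return (filtered ** 2).mean(axis=-1)        # mean power over the window

# Hypothetical data: 120 trials, 32 channels, 1-s windows around movement
# onset, six classes (flexion/extension of fingers, wrist, and elbow).
rng = np.random.default_rng(0)
trials = rng.standard_normal((120, 32, FS))
labels = np.repeat(np.arange(6), 20)

X = high_gamma_power(trials)
clf = LinearDiscriminantAnalysis()  # stand-in linear model (assumption)
scores = cross_val_score(clf, X, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

In the study, features were additionally computed at several latencies relative to movement onset; that would correspond here to concatenating the per-channel power from multiple shorter windows into one feature vector per trial before fitting the classifier.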