The development of electroencephalography (EEG)-based brain-computer interface (BCI) systems driven by spontaneous movement intentions is useful for rehabilitation and external device control. In this study, we analyzed the decoding of five different executed and imagined hand motions from EEG signals for robot hand control. Five healthy subjects participated in the experiment, executing and imagining five sustained hand motions. For this motor execution (ME) and motor imagery (MI) experiment, we proposed a subject-specific time interval selection method and used common spatial patterns (CSP) with regularized linear discriminant analysis (RLDA) for data analysis. As a result, we classified the five different hand motions offline and obtained average classification accuracies of 56.83% for ME and 51.01% for MI. Both results were higher than the accuracies obtained with a comparison method that used a standard fixed time interval. These results are encouraging, and the proposed method could potentially be used in future applications such as BCI-driven robot hand control.
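To illustrate the kind of pipeline the abstract describes, the sketch below shows a minimal two-class CSP feature extraction followed by a shrinkage-regularized LDA classifier, using synthetic data. This is not the authors' implementation: the helper names (`csp_filters`, `log_var_features`), the synthetic signals, and all parameter choices are illustrative assumptions; the study itself used five classes and a subject-specific time interval selection that is not reproduced here.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_filters(X1, X2, n_components=4):
    # Average spatial covariance per class; trials are (channels x samples).
    C1 = np.mean([np.cov(t) for t in X1], axis=0)
    C2 = np.mean([np.cov(t) for t in X2], axis=0)
    # Generalized eigendecomposition: eigenvectors maximize the variance
    # ratio between the two classes.
    vals, vecs = eigh(C1, C1 + C2)
    order = np.argsort(vals)
    # Keep filters from both ends of the eigenvalue spectrum.
    pick = np.r_[order[:n_components // 2], order[-(n_components // 2):]]
    return vecs[:, pick].T  # (n_components x channels)

def log_var_features(W, X):
    # Spatially filter each trial, then use normalized log-variance features.
    Z = np.einsum('fc,tcs->tfs', W, X)
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

# Synthetic EEG-like trials (illustrative, not real data):
# class 2 has stronger activity on channel 0.
rng = np.random.default_rng(0)
n_trials, n_ch, n_samp = 40, 8, 128
X1 = rng.standard_normal((n_trials, n_ch, n_samp))
X2 = rng.standard_normal((n_trials, n_ch, n_samp))
X2[:, 0] *= 3.0

W = csp_filters(X1, X2)
X = np.vstack([X1, X2])
y = np.r_[np.zeros(n_trials), np.ones(n_trials)]
F = log_var_features(W, X)

# Shrinkage LDA ('lsqr' solver with automatic Ledoit-Wolf shrinkage)
# is one standard form of regularized LDA.
clf = LinearDiscriminantAnalysis(solver='lsqr', shrinkage='auto')
clf.fit(F, y)
print(F.shape, clf.score(F, y))
```

A multi-class version, as in the study, would typically combine such filters in a one-versus-rest scheme before feeding the pooled features to the classifier.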