The development of noninvasive brain-machine interface (BMI) systems based on electroencephalography (EEG) and driven by spontaneous movement intentions is a useful tool for controlling external devices or supporting neuro-rehabilitation. In this study, we demonstrate the feasibility of a brain-controlled robot arm system based on arm trajectory decoding. To that end, we first constructed an experimental system that can acquire EEG data during both movement execution (ME) and movement imagery (MI) tasks. Five subjects participated in our experiments and performed four directional reaching tasks (left, right, forward, and backward) in 3D space. For robust arm trajectory decoding, we propose a subject-dependent deep neural network (DNN) architecture. The decoding model is based on a bi-directional long short-term memory (LSTM) network. As a result, we confirmed decoding performance (r-value > 0.8) along the X-, Y-, and Z-axes across all subjects in both the MI and ME tasks. These results show the feasibility of an EEG-based intuitive robot arm control system for high-level tasks (e.g., drinking water or moving objects). We also confirm that the proposed method shows little variation in decoding performance between the ME and MI tasks in offline analysis. Hence, we expect that the decoding model is capable of robust trajectory decoding even in a real-time environment.
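The r-value reported above is the Pearson correlation between the true and decoded trajectories, computed per axis. As an illustration of the evaluation metric only (the function name and the synthetic trajectories below are our own, not from the study), this is a minimal NumPy sketch:

```python
import numpy as np

def axis_r_values(true_traj, pred_traj):
    """Pearson correlation (r-value) between true and decoded
    trajectories, computed separately for each axis.

    true_traj, pred_traj: arrays of shape (n_samples, 3),
    one column each for the X-, Y-, and Z-axis coordinates.
    """
    true_traj = np.asarray(true_traj, dtype=float)
    pred_traj = np.asarray(pred_traj, dtype=float)
    r_values = []
    for axis in range(true_traj.shape[1]):
        t = true_traj[:, axis]
        p = pred_traj[:, axis]
        # np.corrcoef returns the 2x2 correlation matrix;
        # the off-diagonal entry is Pearson's r.
        r_values.append(np.corrcoef(t, p)[0, 1])
    return r_values

# Synthetic example: a decoded trajectory that closely follows the true one.
rng = np.random.default_rng(0)
true = rng.standard_normal((200, 3)).cumsum(axis=0)  # random 3D trajectory
pred = true + 0.1 * rng.standard_normal((200, 3))    # small decoding error
print(axis_r_values(true, pred))
```

A decoder meeting the paper's criterion would produce r > 0.8 on each of the three axes.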