The integration of brain–computer interfaces (BCIs) with robotics offers a promising pathway toward intuitive and natural human–robot interaction. This thesis explores the use of electroencephalography (EEG) signals for decoding human motion intention and predicting the corresponding movement trajectory in real time. EEG provides a non-invasive window into cortical activity, reflecting motor planning and motor imagery processes before actual muscle activation. By extracting discriminative features from EEG recordings, the system aims to recognize different motion intentions, such as grasping, lifting, or reaching, and to predict the spatial trajectory of the intended movement. The predicted trajectory is then used to control a robotic arm in simple interaction tasks. The study combines neural decoding with motion estimation to bridge human intention and robotic execution, demonstrating the feasibility of EEG-based control for assistive robotics and rehabilitation. The results are expected to contribute to the development of intuitive BCI systems that enable users to interact with robotic devices through mind-driven motion control.
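As a concrete illustration of the kind of pipeline described above, the minimal sketch below chains band-pass filtering, log-variance feature extraction, an intention classifier, and a trajectory regressor. It uses synthetic data and standard SciPy/scikit-learn components; the sampling rate, frequency band, feature choice, and models are illustrative assumptions, not the thesis implementation.

```python
# Minimal sketch of an EEG intention-decoding pipeline (illustrative only).
# Assumed shapes: epochs are (n_trials, n_channels, n_samples).
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import Ridge

FS = 250  # assumed sampling rate in Hz

def bandpass(epochs, lo=8.0, hi=30.0, fs=FS):
    """Band-pass each trial to the mu/beta band commonly tied to motor imagery."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, epochs, axis=-1)

def log_variance_features(epochs):
    """Log-variance per channel: a simple, widely used motor-imagery feature."""
    return np.log(np.var(epochs, axis=-1))

# Synthetic stand-in data: 60 trials, 32 channels, 2-second epochs.
rng = np.random.default_rng(0)
train_epochs = rng.standard_normal((60, 32, 2 * FS))
train_labels = rng.integers(0, 3, size=60)        # e.g., grasp / lift / reach
train_xyz = rng.standard_normal((60, 3))          # intended end-point (x, y, z)

X = log_variance_features(bandpass(train_epochs))  # (n_trials, n_channels)

# Intention recognition: classify which motion the user intends.
clf = LinearDiscriminantAnalysis().fit(X, train_labels)

# Trajectory prediction: regress a 3-D target position from the same features.
reg = Ridge(alpha=1.0).fit(X, train_xyz)

# At run time, a new epoch yields an intention class and a predicted target
# that could be forwarded to the robotic arm controller.
test_epoch = rng.standard_normal((32, 2 * FS))
x_new = log_variance_features(bandpass(test_epoch[None]))
intent = clf.predict(x_new)       # predicted motion intention
target_xyz = reg.predict(x_new)   # predicted spatial target, shape (1, 3)
```

In a full system, the regressor would predict a time series of positions rather than a single end-point, and the decoded outputs would be streamed to the robot arm in real time.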