Noninvasive brain-computer interfaces (BCIs) are widely used to recognize users' intentions. In particular, BCIs that decode tactile and haptic sensations could benefit many industrial fields, such as advanced touch displays, robotic device control, and more immersive virtual or augmented reality. In this paper, we introduce a haptic and sensory perception-based BCI system called neurohaptics, as a preliminary study toward a variety of scenarios using actual touch and touch imagery paradigms. We designed a novel experimental environment and a device that acquires brain signals while subjects touch designated materials, eliciting natural touch and texture sensations. In the experiment, we collected electroencephalogram (EEG) signals for four objects with different textures. Seven subjects were recruited, and we evaluated classification performance using machine learning and deep learning approaches. The results confirm the feasibility of decoding actual touch and touch imagery from EEG signals, supporting the development of practical neurohaptics.
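As an illustration of the kind of four-class texture decoding described above, the following is a minimal sketch of an EEG epoch classification pipeline. All details here are assumptions for demonstration only (the channel count, sampling rate, trial counts, synthetic data, log-variance features, and LDA classifier are not taken from the paper), intended only to show the general shape of a machine-learning evaluation on epoched EEG data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical setup (not the paper's actual configuration):
# 4 texture classes, 40 trials each, 32 EEG channels,
# 2-second epochs sampled at 250 Hz.
n_classes, n_trials, n_channels, n_samples = 4, 40, 32, 500

# Synthetic epochs: each class gets a distinct per-channel variance
# profile, loosely mimicking texture-specific band-power differences.
# In a real study this would be replaced by recorded EEG epochs.
X_epochs, y = [], []
for c in range(n_classes):
    scale = 1.0 + 0.3 * rng.random(n_channels) * (c + 1)
    for _ in range(n_trials):
        epoch = rng.standard_normal((n_channels, n_samples)) * scale[:, None]
        X_epochs.append(epoch)
        y.append(c)
X_epochs, y = np.stack(X_epochs), np.array(y)

# Log-variance per channel, a common simple proxy for band power.
features = np.log(X_epochs.var(axis=2))

# Linear discriminant analysis with 5-fold cross-validation.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f} (chance = {1 / n_classes:.2f})")
```

In practice, raw EEG would first be band-pass filtered and artifact-corrected, and stronger feature extractors (e.g., spatial filtering or deep networks) would replace the log-variance features used here for brevity.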