Remaining useful life (RUL) prediction methods based on deep learning (DL) show exceptional degradation-feature extraction capabilities. Most DL methods rely on large-scale labeled datasets to achieve effective supervised training. However, owing to the rigorous maintenance schedules of equipment, the collected data typically consist of a limited number of labeled samples and a substantial volume of unlabeled samples. During supervised training of DL models, the scarcity of labeled samples frequently leads to network overfitting, while the abundant unlabeled samples remain underutilized. This paper proposes a novel RUL prediction method based on unlabeled-sample enhancement and contrastive learning, which significantly improves the utilization of unlabeled samples. First, a Siamese network model, built by integrating a graph convolutional network (GCN) with a self-attention convolutional LSTM (ConvLSTM) network, is pretrained on the labeled data. Then, contrastive learning with unlabeled-sample enhancement is applied to learn degradation information from the unlabeled samples, boosting RUL prediction performance. The proposed framework is validated on turbofan engine datasets. Experimental results demonstrate that the performance of the proposed RUL prediction framework improves substantially with the inclusion of unlabeled samples.