The goal of semi-supervised learning is to exploit many unlabeled samples in situations where only a few labeled samples exist. Recently, research on semi-supervised learning has advanced alongside developments in deep learning, because deep models have powerful representations that can make use of abundant unlabeled samples. In this paper, we propose a novel semi-supervised learning method with uncertainty. It naturally extends the consistency loss to account for uncertainty and introduces suitable regularizations for the uncertainty. Through various experiments on two datasets, CIFAR-10 and SVHN, we empirically demonstrate that the proposed method achieves competitive or higher accuracy than semi-supervised learning with the conventional consistency loss, while also allowing a model to generalize much faster.
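The idea of weighting a consistency loss by a predicted uncertainty can be sketched as follows. This is a minimal illustrative form in the spirit of heteroscedastic uncertainty weighting (the exponent of a predicted log-variance down-weights uncertain samples, and an additive term regularizes the uncertainty); it is an assumption for illustration, not the paper's exact loss:

```python
import numpy as np

def uncertainty_consistency_loss(pred_a, pred_b, log_var):
    """Consistency loss between two predictions (e.g., two augmented
    views of the same unlabeled sample), modulated by uncertainty.

    log_var is a hypothetical per-sample predicted log-variance:
    high uncertainty shrinks the squared-difference term via
    exp(-log_var), while the additive log_var term keeps the model
    from declaring everything uncertain.
    """
    sq_diff = (pred_a - pred_b) ** 2
    return float(np.mean(np.exp(-log_var) * sq_diff + log_var))

# With zero uncertainty the loss reduces to a plain MSE consistency term.
pred_a = np.array([0.2, 0.8])
pred_b = np.array([0.0, 1.0])
zero_u = np.zeros(2)
plain = uncertainty_consistency_loss(pred_a, pred_b, zero_u)

# Raising the uncertainty shrinks the weighted difference term
# but pays the regularization penalty.
high_u = np.full(2, 2.0)
damped = uncertainty_consistency_loss(pred_a, pred_b, high_u)
```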