For LT codes with the robust Soliton distribution, the ripple size is relatively small at the beginning of the BP decoding process, so most decoding terminations occur at an early stage due to an empty ripple. In this study, we aim to reduce early decoding termination and thereby lower the symbol loss probability. First, given k input symbols, the degree-1 proportion is increased to enlarge the average ripple size over the range 0 ≤ n ≤ k/2, where n is the number of decoded input symbols. Second, we propose a Non-Repetitive (NR) encoding scheme that avoids generating repeated degree-1 encoding symbols: an NR encoder forces the first k degree-1 encoding symbols to connect to distinct input symbols. Simulation results show that NR encoding outperforms LT encoding in terms of symbol loss probability. Moreover, fewer encoding symbols are needed to achieve a high successful decoding probability when our scheme is applied. With k = 2000, NR encoding reaches a successful decoding probability of 99.6% at an overhead of 0.2, while LT encoding requires an overhead of 0.32 to reach the same probability.
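The NR rule described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `nr_lt_encode_symbol`, the `used_degree1` bookkeeping set, and the pluggable `degree_sampler` are all assumptions introduced here. It shows only the NR constraint, that each of the first k degree-1 encoding symbols must connect to a distinct, not-yet-covered input symbol, while higher-degree symbols are generated as in ordinary LT encoding.

```python
import random

def nr_lt_encode_symbol(input_symbols, used_degree1, degree_sampler):
    """Generate one encoding symbol with the Non-Repetitive (NR) rule.

    Hypothetical sketch: until every input symbol has appeared in some
    degree-1 encoding symbol, a newly generated degree-1 symbol must
    connect to an input symbol not yet covered (tracked in used_degree1).
    """
    k = len(input_symbols)
    d = degree_sampler()  # degree drawn from the chosen degree distribution
    if d == 1 and len(used_degree1) < k:
        # NR constraint: the first k degree-1 symbols cover distinct inputs.
        candidates = [i for i in range(k) if i not in used_degree1]
        idx = random.choice(candidates)
        used_degree1.add(idx)
        neighbors = [idx]
    else:
        # Ordinary LT behavior: choose d distinct neighbors uniformly.
        neighbors = random.sample(range(k), d)
    value = 0
    for i in neighbors:
        value ^= input_symbols[i]  # encoding symbol = XOR of its neighbors
    return value, neighbors
```

In a full encoder, `degree_sampler` would draw from the (modified) robust Soliton distribution with the increased degree-1 proportion; here it is left abstract so the NR constraint stands out.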