Automated essay scoring (AES) systems provide computer-based writing assessment comparable to that of expert raters. However, while existing systems detect grammatical errors relatively well, they are inadequate for assessing the writing fluency of non-native English-speaking students. Writing fluency is an important criterion in essay scoring, because most non-native English-speaking students have considerable difficulty expressing their thoughts in English. In this paper, we propose an automated essay scoring system that focuses on assessing writing fluency by considering quantitative factors such as vocabulary, sentence perplexity, diversity of sentence structures, and grammatical relations. Experimental results show that the proposed method improves automated essay scoring performance.
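One of the quantitative factors above, sentence perplexity, measures how surprising a sentence is under a reference language model; fluent sentences tend to score lower. The paper does not specify the language model used, so the sketch below is only a minimal illustration with a Laplace-smoothed unigram model over a toy corpus (all names and data here are hypothetical, not the authors' implementation):

```python
import math
from collections import Counter

def unigram_perplexity(sentence, counts, vocab_size, total):
    """Perplexity of a sentence under a Laplace-smoothed unigram model."""
    tokens = sentence.lower().split()
    log_prob = 0.0
    for tok in tokens:
        # Add-one smoothing so unseen words still get nonzero probability.
        p = (counts.get(tok, 0) + 1) / (total + vocab_size)
        log_prob += math.log(p)
    # Perplexity is the exponentiated negative average log-probability.
    return math.exp(-log_prob / len(tokens))

# Toy corpus standing in for a large reference corpus of fluent English.
corpus = "the cat sat on the mat the dog sat on the rug".split()
counts = Counter(corpus)
total = len(corpus)
vocab_size = len(counts)

fluent = unigram_perplexity("the cat sat on the mat", counts, vocab_size, total)
odd = unigram_perplexity("mat rug dog zebra quantum", counts, vocab_size, total)
assert fluent < odd  # the more fluent sentence gets the lower perplexity
```

In practice an AES system would use a far stronger model (e.g. an n-gram or neural language model trained on a large corpus), but the ranking principle is the same: lower perplexity suggests more fluent phrasing.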