Abstract For decades, dependency distance (or length) minimization (DDM) has been pursued as a universal force shaping human languages. In the Early Edition of PNAS, Futrell et al. suggest that dependency length minimization is a universal property of human languages, and hence that it supports explanations of linguistic variation in terms of general properties of human information processing. Their study may be the first to survey DDM on such a large scale, covering as many as 37 natural languages, and it has immediately drawn wide attention. However, questions remain, since dependency distance is sensitive to many factors. Along the same line, eight years earlier, Prof. Haitao Liu of Zhejiang University compared the dependency distances of 20 natural languages with those of two different random baselines, and pointed out that dependency distance minimization is probably universal in human languages. Together, these studies of DDM in human languages show the value of investigating linguistic universals cognitively through statistical analysis of large-scale language data, and suggest that, to obtain truly scientific discoveries, linguistic study may well need to integrate efforts from multiple disciplines: cross-language analysis, big-data mining, language universals, and cognitive science.
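For readers unfamiliar with the measure, the dependency distance of a word is the linear distance between that word and its syntactic head, and the mean dependency distance (MDD) of a sentence is the average of these distances with the root excluded, as in Liu's work. The sketch below illustrates this computation in Python; the example sentence and its parse are illustrative assumptions, not data from either study.

```python
def mean_dependency_distance(heads):
    """Compute mean dependency distance (MDD) of one sentence.

    heads[i] is the 1-based position of word (i+1)'s head;
    0 marks the root, which is excluded from the average.
    """
    distances = [abs(pos - head)
                 for pos, head in enumerate(heads, start=1)
                 if head != 0]
    return sum(distances) / len(distances)

# Illustrative parse of "The dog chased the cat":
# The->dog, dog->chased, chased=root, the->cat, cat->chased
heads = [2, 3, 0, 5, 3]
print(mean_dependency_distance(heads))  # (1 + 1 + 1 + 2) / 4 = 1.25
```

Comparing such per-sentence averages against those of randomly linearized baselines is, in outline, how both studies test whether observed word orders keep dependency distances shorter than chance.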