Deep learning approaches that apply probability distributions to natural language processing have achieved significant results. However, natural languages have inherent linguistic structures rather than purely probabilistic distributions. This paper presents a new graph-based representation of syntactic structures, called the syntactic knowledge graph, based on dependency relations. It investigates valency theory and the markedness principle of natural languages to derive an appropriate set of dependency relations for the syntactic knowledge graph, and proposes a new set of dependency relations derived from markers. The paper also demonstrates the representation of various linguistic structures to validate the feasibility of syntactic knowledge graphs.
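A dependency-based syntactic graph of the kind described above can be sketched as a set of head–relation–dependent triples. The class, method names, and relation labels below are illustrative assumptions for the sketch, not the paper's actual schema or label inventory.

```python
# Minimal sketch of a dependency-based syntactic graph.
# Each edge is a (head, relation, dependent) triple, one way a
# syntactic knowledge graph might encode dependency relations.

class SyntacticGraph:
    def __init__(self):
        self.edges = []  # list of (head, relation, dependent) triples

    def add_relation(self, head, relation, dependent):
        self.edges.append((head, relation, dependent))

    def dependents_of(self, head):
        # Return all (relation, dependent) pairs governed by `head`.
        return [(r, d) for h, r, d in self.edges if h == head]

# "The cat chased a mouse" with illustrative dependency labels.
g = SyntacticGraph()
g.add_relation("chased", "nsubj", "cat")    # subject of the verb
g.add_relation("chased", "obj", "mouse")    # object of the verb
g.add_relation("cat", "det", "The")         # determiner of the subject
g.add_relation("mouse", "det", "a")         # determiner of the object

print(g.dependents_of("chased"))
```

In this shape, the verb's valency shows up directly as the set of relations it governs, which is the kind of query a graph representation makes straightforward.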