Burghard B. Rieger:

Tree-like Dispositional Dependency Structures for non-propositional Semantic Inferencing.

On a SCIP approach to natural language understanding by machine

In: Bouchon-Meunier, B./Yager, R. (eds.): Proceedings of the 7th International Conference on Information Processing and Management of Uncertainty in Knowledge-based Systems (IPMU-98), Paris (Editions EDK) 1998, pp. 351-358


Abstract

Arguing for the semiotic modeling of natural language understanding by machine means following a procedural approach that focuses on processes of meaning constitution. These can be typified in pragmatic situations of performative language games which may be analyzed empirically, described formally, and simulated computationally. In doing so, graph-theoretical tools have been employed and new tree structures developed which allow both to restrict the relational manifold in high-dimensional vector space structures computed as fuzzy word meaning representations, and to visualize semantically motivated relevancies emerging from such restrictions as reflexive, non-symmetric, and (weakly) transitive dependency relations among them. As a basal, context-sensitive form of reorganizing distributionally represented fuzzy entities, the tree-like dispositional dependency structures (DDS) serve as a non-propositional format for conceptual associations and semantic inferencing by machine, as opposed to propositional reasoning based on truth-functional constraints. After a short introduction to semiotic cognitive information processing (SCIP) and the text-analyzing and meaning-representational formalisms employed, DDS tree generation will be discussed, and some examples will be given to illustrate the algorithms' semantic inferencing potential, as computed from and performed on a sample of German newspaper texts.
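The abstract's notion of restricting a high-dimensional relational manifold to a tree of dependency relations can be illustrated by a small sketch. The following Python is a hypothetical, Prim-style reconstruction under the assumption that each word is attached to the closest word already in the tree, as measured by semantic distance; the word list, the distance values, and the attachment criterion are illustrative and not the author's actual data or exact algorithm.

```python
# Hypothetical sketch of DDS-style tree growth from a chosen root word:
# repeatedly attach the unattached word that lies closest (in semantic
# distance) to some word already in the tree. Distances are toy values.

def dds_tree(dist, root):
    """Grow a dependency tree over the nodes of `dist`, starting at `root`.

    dist: dict mapping frozenset({a, b}) -> semantic distance (float)
    Returns a list of (parent, child) edges in attachment order.
    """
    nodes = {n for pair in dist for n in pair}
    attached = {root}
    edges = []
    while attached != nodes:
        # Choose the (attached, unattached) pair with minimal distance;
        # the attached member becomes the parent of the new node.
        parent, child = min(
            ((a, u) for a in attached for u in nodes - attached),
            key=lambda e: dist[frozenset(e)],
        )
        edges.append((parent, child))
        attached.add(child)
    return edges

# Toy distances between four words (illustrative values only)
d = {
    frozenset({"company", "market"}): 0.2,
    frozenset({"company", "price"}): 0.5,
    frozenset({"company", "trade"}): 0.7,
    frozenset({"market", "price"}): 0.3,
    frozenset({"market", "trade"}): 0.6,
    frozenset({"price", "trade"}): 0.4,
}
print(dds_tree(d, "company"))
```

Note that the resulting tree depends on the chosen root, which mirrors the context-sensitive, non-symmetric character of the dependency relations described above: growing the tree from a different word generally yields a different hierarchy over the same distance data.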



