Constant entropy rate and related hypotheses versus real language

Author
Ferrer-i-Cancho, R.; Debowski, L.
Type of activity
Presentation of work at congresses
Name of edition
35th Annual Meeting of the Cognitive Science Society
Date of publication
2013
Presentation date
2013-08-02
Conference proceedings
Proceedings of the 35th Annual Meeting of the Cognitive Science Society
First page
3933
Last page
3933
URL
http://mindmodeling.org/cogsci2013/
Abstract
Constant entropy rate (CER) and uniform information density (UID) are two hypotheses that have been put forward to explain a wide range of linguistic phenomena. However, the concrete definition of these hypotheses remains unclear for statistical research, and, to our knowledge, a direct and in-depth evaluation of these hypotheses from their definition is missing. Here we consider four operational definitions of UID: full UID (UID holding for any combination of elements making the utterances), strong UID ...
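
The following is a minimal illustrative sketch, not the method evaluated in the paper: it uses unigram surprisal as a stand-in for the information density of each element of an utterance and checks whether it stays roughly constant, which is the informal intuition behind UID/CER. The helper names, the toy corpus, and the tolerance threshold are all hypothetical choices for the example.

```python
import math
from collections import Counter

def surprisal_per_word(sentence, unigram_counts, total):
    """Per-word information content -log2 p(w) under a simple unigram model."""
    return [-math.log2(unigram_counts[w] / total) for w in sentence.split()]

def is_roughly_uniform(surprisals, tolerance=1.0):
    """Crude uniformity check: every value within `tolerance` bits of the mean."""
    mean = sum(surprisals) / len(surprisals)
    return max(abs(s - mean) for s in surprisals) <= tolerance

# Toy corpus; in practice counts would come from a large corpus and a
# richer (e.g., conditional) model of information density.
corpus = "the cat sat on the mat the dog sat on the rug"
counts = Counter(corpus.split())
total = sum(counts.values())

sentence = "the cat sat on the mat"
surprisals = surprisal_per_word(sentence, counts, total)
print([round(s, 2) for s in surprisals])
print("roughly uniform:", is_roughly_uniform(surprisals))
```

Stricter or weaker operational definitions (such as the full and strong UID variants named in the abstract) would differ in which combinations of elements the uniformity condition is required to hold for.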
Research group
LARCA - Laboratory of Relational Algorithmics, Complexity and Learnability

Participants