<mets:mets OBJID="eprint_3435" LABEL="Eprints Item" xsi:schemaLocation="http://www.loc.gov/METS/ http://www.loc.gov/standards/mets/mets.xsd http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-3.xsd" xmlns:mets="http://www.loc.gov/METS/" xmlns:mods="http://www.loc.gov/mods/v3" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><mets:metsHdr CREATEDATE="2018-01-17T15:03:07Z"><mets:agent ROLE="CUSTODIAN" TYPE="ORGANIZATION"><mets:name>Cogprints</mets:name></mets:agent></mets:metsHdr><mets:dmdSec ID="DMD_eprint_3435_mods"><mets:mdWrap MDTYPE="MODS"><mets:xmlData><mods:titleInfo><mods:title>From Analogue to Digital Vocalizations</mods:title></mods:titleInfo><mods:name type="personal"><mods:namePart type="given">Pierre-Yves</mods:namePart><mods:namePart type="family">Oudeyer</mods:namePart><mods:role><mods:roleTerm type="text">author</mods:roleTerm></mods:role></mods:name><mods:abstract>Sound is a medium used by humans to carry information. 
The existence of this kind of
medium is a prerequisite for language. It is organized
into a code, called speech, which
provides a repertoire of forms shared within each
language community. This code is necessary to support the linguistic
interactions that allow humans to communicate.
How, then, may a speech code have formed prior to the
existence of linguistic interactions?
 
Moreover, the human speech code is characterized by several
properties: speech is digital and compositional (vocalizations
are made of units systematically re-used in other syllables);
phoneme inventories exhibit precise regularities as well as
great diversity across human languages; all the speakers of a
language community categorize sounds in the same manner,
but each language has its own system of categorization,
possibly very different from that of any other.
How can a speech code with these properties form?
 
These are the questions we approach in this paper. We
study them using the method of the artificial: we
build a society of artificial agents and study which mechanisms
may provide answers. This does not directly prove which mechanisms
were at work in humans, but rather suggests what kind
of mechanism may have been used. It allows us to shape the
search space of possible answers, in particular by showing
what is sufficient and what is not necessary.

The mechanism we present is based on a low-level model of
sensory-motor interactions. We show that the integration of certain very
simple, non-language-specific neural devices
allows a population of agents to build a speech code that
has the properties mentioned above. Its originality is
that it presupposes neither a functional pressure for
communication nor the ability to engage in coordinated
social interactions (the agents do not play language or imitation
games). It relies on the self-organizing properties of a generic
coupling between perception and production within agents,
and on the interactions between agents.</mods:abstract><mods:classification authority="lcc">Neurolinguistics</mods:classification><mods:classification authority="lcc">Language</mods:classification><mods:classification authority="lcc">Dynamical Systems</mods:classification><mods:classification authority="lcc">Evolution</mods:classification><mods:classification authority="lcc">Phonology</mods:classification><mods:originInfo><mods:dateIssued encoding="iso8601">2003</mods:dateIssued></mods:originInfo><mods:originInfo><mods:publisher>Oxford University Press</mods:publisher></mods:originInfo><mods:genre>Book Chapter</mods:genre></mets:xmlData></mets:mdWrap></mets:dmdSec><mets:amdSec ID="TMD_eprint_3435"><mets:rightsMD ID="rights_eprint_3435_mods"><mets:mdWrap MDTYPE="MODS"><mets:xmlData><mods:useAndReproduction>
<p xmlns="http://www.w3.org/1999/xhtml"><strong>For work being deposited by its own author:</strong> 
In self-archiving this collection of files and associated bibliographic 
metadata, I grant Cogprints the right to store 
them and to make them permanently available publicly for free on-line. 
I declare that this material is my own intellectual property and I 
understand that Cogprints does not assume any 
responsibility if there is any breach of copyright in distributing these 
files or metadata. (All authors are urged to prominently assert their 
copyright on the title page of their work.)</p>

<p xmlns="http://www.w3.org/1999/xhtml"><strong>For work being deposited by someone other than its 
author:</strong> I hereby declare that the collection of files and 
associated bibliographic metadata that I am archiving at 
Cogprints is in the public domain. If this is
not the case, I accept full responsibility for any breach of copyright 
that distributing these files or metadata may entail.</p>

<p xmlns="http://www.w3.org/1999/xhtml">Clicking on the deposit button indicates your agreement to these 
terms.</p>
    </mods:useAndReproduction></mets:xmlData></mets:mdWrap></mets:rightsMD></mets:amdSec><mets:fileSec><mets:fileGrp USE="reference"><mets:file ID="eprint_3435_2233_1" SIZE="427010" OWNERID="http://cogprints.org/3435/1/journal.pdf" MIMETYPE="application/pdf"><mets:FLocat LOCTYPE="URL" xlink:type="simple" xlink:href="http://cogprints.org/3435/1/journal.pdf"></mets:FLocat></mets:file></mets:fileGrp></mets:fileSec><mets:structMap><mets:div DMDID="DMD_eprint_3435_mods" ADMID="TMD_eprint_3435"><mets:fptr FILEID="eprint_3435_2233_1"></mets:fptr></mets:div></mets:structMap></mets:mets>