Title: Dynamical recurrent neural networks towards prediction and modeling of dynamical systems
Author: Aussem, A.
Subjects: Computational Neuroscience; Artificial Intelligence; Dynamical Systems; Machine Learning; Neural Nets; Speech
Date: 1999
Type: Journal (Paginated); Peer Reviewed
Format: application/postscript
Identifier: http://cogprints.org/547/2/Neurocomp98_Chaos.ps
Relation: http://cogprints.org/547/

Abstract: This paper addresses the use of Dynamical Recurrent Neural Networks (DRNN) for time-series prediction and the modeling of small dynamical systems. Since the recurrent synapses are represented by Finite Impulse Response (FIR) filters, DRNN are state-based connectionist models in which all hidden units act as state variables of a dynamical system. The model is trained with Temporal Recurrent Backprop (TRBP), an efficient backward recurrent training procedure with minimal computational burden that benefits from the exponential decay of the gradient backward in time. The gradient decay is first illustrated through intensive experiments on the standard sunspot series. The ability of the model to internally encode useful information about the underlying process is then illustrated by several experiments on well-known chaotic processes. Parsimonious DRNN models are able to find an appropriate internal representation of various chaotic processes from the observation of a subset of the state variables.

Citation: Aussem, A. (1999) Dynamical recurrent neural networks towards prediction and modeling of dynamical systems. [Journal (Paginated)]
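The abstract's central idea, recurrent synapses realized as FIR filters so that each hidden unit's update depends on a short window of past hidden states, can be sketched as follows. This is a minimal illustrative NumPy sketch under assumed names and shapes (`FIRRecurrentLayer`, `W_in`, `W_rec`, tap order `K`), not the paper's actual DRNN implementation or its TRBP training procedure.

```python
import numpy as np

class FIRRecurrentLayer:
    """Hypothetical hidden layer whose recurrent synapses are FIR filters:
    each hidden-to-hidden connection holds K tap weights applied to the
    last K hidden-state vectors (a delay line), so every hidden unit acts
    as a state variable driven by a finite window of the past."""

    def __init__(self, n_in, n_hidden, order, seed=0):
        rng = np.random.default_rng(seed)
        self.K = order
        self.W_in = rng.normal(0.0, 0.1, (n_hidden, n_in))        # input weights
        # One K-tap FIR filter per (pre, post) hidden pair.
        self.W_rec = rng.normal(0.0, 0.1, (order, n_hidden, n_hidden))
        self.history = np.zeros((order, n_hidden))                # x(t-1)..x(t-K)

    def step(self, u):
        # FIR contribution: sum over taps k of W_rec[k] @ x(t-1-k)
        rec = sum(self.W_rec[k] @ self.history[k] for k in range(self.K))
        x = np.tanh(self.W_in @ u + rec)                          # new hidden state
        self.history = np.vstack([x, self.history[:-1]])          # shift delay line
        return x

# Driving the layer with a scalar time series, one sample per step:
layer = FIRRecurrentLayer(n_in=1, n_hidden=4, order=3)
series = np.sin(np.linspace(0.0, 2.0 * np.pi, 20))
states = [layer.step(np.array([s])) for s in series]
```

With FIR (rather than infinite-impulse-response) recurrence, unrolling the network in time is bounded by the tap order, which is consistent with the abstract's point that the gradient decays exponentially backward in time and can be truncated cheaply during training.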