--- abstract: "Understanding computation as “a process of the dynamic change of information” leads us to consider the different types of computation and information. Computation on information does not exist by itself; it is to be considered as part of a system that uses it for some given purpose. Information can be meaningless, like a thunderstorm noise; it can be meaningful, like an alert signal or the representation of a desired food. A thunderstorm noise contributes to the generation of meaningful information about coming rain. An alert signal has a meaning because it allows a safety constraint to be satisfied. The representation of a desired food contributes to the satisfaction of metabolic constraints for the organism. Computations on information and representations differ in nature and in complexity because the systems that link them have different constraints to satisfy. Animals have survival constraints to satisfy. Humans have many additional specific constraints. And computers compute what the designer and programmer ask for.\r\nWe propose to analyze the different relations between information, meaning and representation by taking an evolutionary approach to the systems that link them. Such a bottom-up approach allows us to start with simple organisms and avoids an implicit focus on humans, the most complex and difficult case. To provide a common background usable for the many different cases, we use a systemic tool that defines the generation of meaningful information by and for a system submitted to a constraint [Menant, 2003]. This systemic tool allows us to position information, meaning and representations for systems relative to environmental entities in an evolutionary perspective.\r\nWe begin by positioning the notions of information, meaning and representation, and recall the characteristics of the Meaning Generator System (MGS), which links a system submitted to a constraint to its environment. 
We then use the MGS for animals and highlight the network nature of the interrelated meanings about an entity of the environment. This brings us to define the representation of an item for an agent as the network of meanings relative to that item for the agent.\r\nSuch meaningful representations embed the agents in their environments and are far from the Good Old-Fashioned Artificial Intelligence (GOFAI) type ones.\r\nThe MGS approach is then used for humans, with a limitation resulting from the unknown nature of human consciousness.\r\nApplication of the MGS to artificial systems leads us to look for compatibilities with different levels of Artificial Intelligence (AI), such as embodied-situated AI, the Guidance Theory of Representation, and enactive AI. Concerns relative to different types of autonomy and to organic versus artificial constraints are highlighted. We finish by summarizing the points addressed and by proposing some continuations.\r\n" altloc: [] chapter: 10 commentary: ~ commref: ~ confdates: ~ conference: ~ confloc: ~ contact_email: ~ creators_id: - christophe.menant@hotmail.fr creators_name: - family: Menant given: Christophe honourific: Mr lineage: '' date: 2011 date_type: published datestamp: 2011-10-27 01:33:16 department: ~ dir: disk0/00/00/76/76 edit_lock_since: ~ edit_lock_until: 0 edit_lock_user: ~ editors_id: [] editors_name: - family: Dodig-Crnkovic given: Gordana honourific: Pr. lineage: ~ - family: Burgin given: Mark honourific: Pr. 
lineage: ~ eprint_status: archive eprintid: 7676 fileinfo: text/html;http://cogprints.org/7676/1/7637.html full_text_status: public importid: ~ institution: ~ isbn: ~ ispublished: pub issn: ~ item_issues_comment: [] item_issues_count: ~ item_issues_description: [] item_issues_id: [] item_issues_reported_by: [] item_issues_resolved_by: [] item_issues_status: [] item_issues_timestamp: [] item_issues_type: [] keywords: 'Information, meaning, constraint, representation, evolution, Peirce, enaction' lastmod: 2011-10-27 01:33:16 latitude: ~ longitude: ~ metadata_visibility: show note: "Paragraphs\r\n\r\nA.1 Information and meaning. Meaning generation.\r\nA.1.1 Information. Meaning of information and quantity of information.\r\nA.1.2 Meaningful information and constraint satisfaction. A systemic approach.\r\nA.2 Information, meaning and representations. An evolutionary approach. \r\nA.2.1 Stay alive constraint and meaning generation for organisms. \r\nA.2.2 The Meaning Generator System (MGS). A systemic and evolutionary approach.\r\nA.2.3 Meaning transmission. \r\nA.2.4 Individual and species constraints. Group life constraints. Networks of meanings. \r\nA.2.5 From meaningful information to meaningful representations. \r\nA.3 Meaningful information and representations in humans. \r\nA.4 Meaningful information and representations in artificial systems. \r\nA.4.1 Meaningful information and representations from traditional AI to Nouvelle AI. Embodied-situated AI.\r\nA.4.2 Meaningful representations versus the Guidance Theory of Representation.\r\nA.4.3 Meaningful information and representations versus the enactive approach.\r\nA.5 Conclusion and continuation.\r\nA.5.1 Conclusion.\r\nA.5.2 Continuation\r\n" number: ~ pagerange: 255-286 pubdom: FALSE publication: INFORMATION AND COMPUTATION. Essays on Scientific and Philosophical Understanding of Foundations of Information and Computation publisher: World Scientific Publishing Co. Pte.Ltd. 
refereed: TRUE referencetext: "Anderson, M. (2005). Representation, evolution and embodiment. Institute for Advanced Computer Studies, University of Maryland. http://cogprints.org/3947/\r\nAnderson, M. and Rosenberg, G. (2008). Content and Action: The Guidance Theory of Representation. The Journal of Mind and Behavior, 29(1-2), pp. 55-86. ISSN 0271-0137.\r\nBlock, N. (2002). Some Concepts of Consciousness. In: D. Chalmers (ed.), Philosophy of Mind: Classical and Contemporary Readings. Oxford University Press.\r\nBrooks, R. (1991a). Intelligence without representation. Artificial Intelligence, 47, pp. 139-159 (received in 1987).\r\nBrooks, R. (1991b). New Approaches to Robotics. Science, 253(5025), pp. 1227-1232, 13 September 1991.\r\nBrooks, R. (2001). The relationship between matter and life. Nature, 409, 18 January 2001.\r\nDepraz, N. (2007). Phenomenology and Enaction. Summer school: Cognitive Sciences and Enaction, Fréjus, 5-12 September 2007.\r\nDi Paolo, E. (2003). Organismically-inspired robotics: homeostatic adaptation and teleology beyond the closed sensori-motor loop. In: K. Murase and T. Asakura (eds.), Dynamical Systems Approach to Embodiment and Sociality. Adelaide, Australia: Advanced Knowledge International, pp. 19-42.\r\nDi Paolo, E. (2005). Autopoiesis, adaptivity, teleology, agency. Phenomenology and the Cognitive Sciences, 4(4), pp. 429-452.\r\nDi Paolo, E., Rohde, M. and De Jaegher, H. (2007). Horizons for the Enactive Mind: Values, Social Interaction, and Play. Cognitive Science Research Papers, CSRP 587, April 2007. ISSN 1350-3162.\r\nDreyfus, H. (2007). Why Heideggerian AI Failed and how Fixing it would Require making it more Heideggerian. Philosophical Psychology, 20(2), pp. 247-268.\r\nFloridi, L. (2003). From data to semantic information. Entropy, 5, pp. 125-145. http://www.mdpi.org/entropy/papers/e5020125.pdf\r\nFroese, T. 
(2007). On the role of AI in the ongoing paradigm shift within the cognitive sciences. In: M. Lungarella et al. (eds.), Proc. of the 50th Anniversary Summit of Artificial Intelligence. Berlin, Germany: Springer Verlag, in press.\r\nFroese, T., Virgo, N. and Izquierdo, E. (2007). Autonomy: a review and a reappraisal. University of Sussex research paper. ISSN 1350-3162.\r\nFroese, T. and Ziemke, T. (2009). Enactive artificial intelligence: Investigating the systemic organization of life and mind. Artificial Intelligence, 173(3-4), pp. 466-500.\r\nHarnad, S. (1990). The Symbol Grounding Problem. Physica D, pp. 335-346.\r\nHaugeland, J. (1989). Artificial Intelligence: The Very Idea, 7th ed. MIT Press, USA.\r\nLieberman, P. (2006). Toward an evolutionary biology of language. Harvard University Press. ISBN 0674021843.\r\nMenant, C. (2003). Information and meaning. Entropy, 5, pp. 193-204. http://www.mdpi.org/entropy/papers/e5020193.pdf\r\nMenant, C. (2005). Information and Meaning in Life, Humans and Robots. Foundations of Information Sciences, presentation, Paris 2005. http://www.mdpi.org/fis2005/F.45.paper.pdf\r\nMenant, C. (2006a). Evolution of Representations. From Basic Life to Self-representation and Self-consciousness. TSC 2006 poster. http://cogprints.org/4843/\r\nMenant, C. (2006b). Evolution of Representations and Intersubjectivity as sources of the Self. An Introduction to the Nature of Self-Consciousness. ASSC 10 poster. http://cogprints.org/4957/\r\nMenant, C. (2008). Evolution as connecting first-person and third-person perspectives of consciousness. ASSC 12 poster. http://cogprints.org/6120/\r\nNewell, A. and Simon, H. (1976). Computer Science as Empirical Inquiry: Symbols and Search. Communications of the ACM, 19.\r\nQueiroz, J. and El-Hani, C. (2006). Semiosis as an Emergent Process. Transactions of the Charles S. Peirce Society, 42(1).\r\nSearle, J. (1980). Minds, brains, and programs. 
Behavioral and Brain Sciences, 3(3), pp. 417-457.\r\nShannon, C. (1948). A mathematical theory of communication. Bell System Technical Journal, 27.\r\nSharov, A. (1998). What is Biosemiotics? http://home.comcast.net/~sharov/biosem/geninfo.html#summary\r\nTorrance, S. (2005). In search of the enactive: Introduction to special issue on Enactive Experience. Phenomenology and the Cognitive Sciences, 4(4), December 2005, pp. 357-368.\r\nVarela, F., Thompson, E. and Rosch, E. (1991). The Embodied Mind: Cognitive Science and Human Experience. Cambridge, MA: MIT Press.\r\nVernon, D. and Furlong, D. (2007). Philosophical Foundations of AI. In: M. Lungarella et al. (eds.), 50 Years of AI, Festschrift, LNAI 4850, pp. 53-62. Springer-Verlag, Berlin Heidelberg, 2007. http://www.robotcub.org/misc/papers/07_Vernon_Furlong_AI50.pdf\r\nZiemke, T. and Sharkey, N. (2001). A stroll through the worlds of robots and animals: Applying Jakob von Uexküll’s theory of meaning to adaptive robots and artificial life. Semiotica, 134(1-4), pp. 701-746.\r\n" relation_type: [] relation_uri: [] reportno: ~ rev_number: 12 series: ~ source: ~ status_changed: 2011-10-27 01:33:16 subjects: - bio-evo - comp-sci-art-intel - phil-mind succeeds: ~ suggestions: ~ sword_depositor: ~ sword_slug: ~ thesistype: ~ title: 'Computation on Information, Meaning and Representations. An Evolutionary Approach' type: bookchapter userid: 2546 volume: ~