<> "The repository administrator has not yet configured an RDF license."^^ . <> . . "Does the Mind Piggy-Back on Robotic and Symbolic Capacity?"^^ . " Cognitive science is a form of \"reverse engineering\" (as Dennett has dubbed it). We are\ntrying to explain the mind by building (or explaining the functional principles of) systems that have\nminds. A \"Turing\" hierarchy of empirical constraints can be applied to this task, from t1, toy models\nthat capture only an arbitrary fragment of our performance capacity, to T2, the standard \"pen-pal\"\nTuring Test (total symbolic capacity), to T3, the Total Turing Test (total symbolic plus robotic\ncapacity), to T4 (T3 plus internal [neuromolecular] indistinguishability). All scientific theories are\nunderdetermined by data. What is the right level of empirical constraint for cognitive theory? I will\nargue that T2 is underconstrained (because of the Symbol Grounding Problem and Searle's Chinese\nRoom Argument) and that T4 is overconstrained (because we don't know what neural data, if any, are\nrelevant). T3 is the level at which we solve the \"other minds\" problem in everyday life, the one at\nwhich evolution operates (the Blind Watchmaker is no mind-reader either) and the one at which\nsymbol systems can be grounded in the robotic capacity to name and manipulate the objects their\nsymbols are about. I will illustrate this with a toy model for an important component of T3 --\ncategorization -- using neural nets that learn category invariance by \"warping\" similarity space the way\nit is warped in human categorical perception: within-category similarities are amplified and\nbetween-category similarities are attenuated. This analog \"shape\" constraint is the grounding inherited\nby the arbitrarily shaped symbol that names the category and by all the symbol combinations it enters\ninto. No matter how tightly one constrains any such model, however, it will always be more\nunderdetermined than normal scientific and engineering theory. This will remain the ineliminable\nlegacy of the mind/body problem. "^^ . "1994" . . "XXII" . . "Addison Wesley"^^ . . . "The Mind, the Brain, and Complex Adaptive Systems"^^ . . . . . . . . . . . . . . "J."^^ . "Singer"^^ . "J. Singer"^^ . . "Stevan"^^ . "Harnad"^^ . "Stevan Harnad"^^ . . "H."^^ . "Morowitz"^^ . "H. Morowitz"^^ . . . . . . "Does the Mind Piggy-Back on Robotic and Symbolic Capacity? (HTML)"^^ . . . "harnad95.mind.robot.html"^^ . . . "Does the Mind Piggy-Back on Robotic and Symbolic Capacity? (Indexer Terms)"^^ . . . . . . "indexcodes.txt"^^ . . "HTML Summary of #1594 \n\nDoes the Mind Piggy-Back on Robotic and Symbolic Capacity?\n\n" . "text/html" . . . "Dynamical Systems" . . . "Perceptual Cognitive Psychology" . .