The Problem of Meaning in Embodied Cognition and Computational Theory of Cognition: Towards a Synthetic Model
Remya C (Sree Sankaracharya University of Sanskrit)

September 29, 2025, 11:00am - 12:00pm
Department of Philosophy, Sree Sankaracharya University of Sanskrit

Kāladi 683574
India

This will be an accessible event, including organized related activities

This event is available both online and in-person

Organisers:

Sree Sankaracharya University of Sanskrit

Details

Google Meet URL: https://meet.google.com/xqi-skby-zab

Synopsis

Cognitive Science has undergone significant paradigm shifts since it emerged as an independent discipline, chief among them the transition from a computational theory of cognition to an embodied framework. This transition is the result of a profound rethinking of how the mind is understood: whether as an independent entity solely responsible for our cognitive processes, or as something bound up with the body and even the environment, thereby overriding the mind-body dualism that has pervaded philosophy since Descartes.

The computational theory of mind was committed to three principles: first, that the information conveyed by a mental representation is autonomous and does not depend on the sensory-motor system; second, that knowledge is represented propositionally and meaning emerges from the relations among the constituent symbols; and third, that internal representations instruct the motor system, which is essentially separate from and independent of cognition. Together these principles suggest that cognitive processing is not significantly limited, constrained, or shaped by bodily functions (Foglia & Wilson, 2013). This model, whose major proponents include Putnam, Chomsky, and Fodor, conceptualized the mind as a computer in which cognitive processes operate through the symbolic manipulation of internal representations.

However, the dominance of this computational approach has come under scrutiny during the last few decades. Even though Chomsky laid the foundation for the computational model of cognition by introducing the concept of generative grammar, which focused on the abstract syntactic rules underlying language, his distinction between linguistic competence (knowledge of language) and performance (actual use of language) aligned with his contemporary David Marr's computational level. Marr, in his multi-layered approach to cognition, had emphasized that the conceptual analysis of problems should take priority over mental representation or the physical realization of sensations in the brain, a position that differed from the traditional reading of Chomsky. Additionally, Chomsky's proposal of deep structures in syntax hinted at connections between syntax and semantics, despite his primary focus on the innate syntactic structures of language. Moreover, while computational theorists argued that symbolic processing best captured the essence of cognition, they struggled to explain how such capabilities emerged in the cognitive system or where these symbolic representations resided within the brain. The "symbol grounding problem" further highlighted these shortcomings, pointing to the difficulty of explaining how symbols acquire meaning.

Against this backdrop, an alternative framework emerged, challenging the traditional dominance of the mind in explaining cognition and asserting the importance of the body. This perspective placed the body at the core of cognitive processes to address the limitations of classical approaches. Researchers and philosophers argued that cognition is fundamentally grounded in bodily interactions with the environment. This embodied cognition perspective contends that understanding is deeply rooted in sensory, motor, and emotional patterns, which shape how individuals engage with their surroundings. It emphasizes that cognition arises from the dynamic interactions between organisms and their physical and cultural environments, incorporating emotional responses and transformative actions as integral to our experience. For instance, research on sensorimotor systems demonstrates that perception and action are dynamically coupled, challenging the notion of cognition as merely abstract computation. Ground-breaking studies on gesture, for example, reveal how physical movements contribute to thought and language production, suggesting that cognition is enacted rather than abstractly represented. Similarly, bodily engagement in tasks such as finger counting or maintaining specific postures can simplify cognitive effort and improve problem-solving or memory retention. The body also distributes cognitive processes, integrating neural and non-neural components; perception, for instance, is shaped not solely by neural activity but by the actions performed in order to perceive.

Concrete evidence for embodied cognition includes the phenomenon of phantom limbs, where individuals feel sensations in amputated limbs, demonstrating the brain's reliance on bodily frameworks for sensory integration. Another example is Gibson's theory of affordances, which holds that environmental properties specify the actions they make possible. These findings underscore how cognition is scaled to bodily dimensions and environmental interactions. Cognitive linguistics, too, provides substantial evidence supporting the embodied nature of meaning, addressing the "symbol grounding problem" through conceptual metaphors and image schemas.

While the embodied cognition paradigm offers promising avenues for understanding mind and behaviour, it suffers from two main weaknesses, viz., insufficiency and incompleteness. The approaches appear insufficient because they do not provide a full explanation of the concepts to which they apply, and they appear incomplete because they do not seem to capture all that is contained in abstract concepts. This has paved the way for the emergence of synthetic models that incorporate elements of both computational and embodied approaches to the study of cognition and meaning. Such models include the neuro-philosophical approach of the Churchlands and the Conceptual Semantics of Ray Jackendoff.

Neurophilosophy, proposed by Patricia Churchland and upheld by Paul Churchland, attempts to study questions about the mind through the lens of neuroscience and assumes that meaning and mental content arise from the relational interplay of neuronal processes. It emphasizes the integration of neuroscience and philosophy in understanding the mind, advocating the reducibility of mental states to brain states. Patricia Churchland proposes a co-evolutionary model in which neuroscience and cognitive psychology inform and shape each other to explain cognitive phenomena holistically. Though this theory was ground-breaking when it emerged, it faces significant criticisms today, particularly for its overemphasis on neural mechanisms at the expense of other cognitive dimensions. Critics argue that by reducing mental processes to brain processes, it risks oversimplifying the complex interplay between brain, body, and environment that is central to embodied cognition. Churchland's neglect of environmental scaffolding, a critical component of cognition, undermines her claim of offering a comprehensive framework. Furthermore, her strong alignment with eliminative materialism and her reliance on neuroscience raise concerns about the dismissal of computational and representational models, which remain pivotal in understanding abstract cognitive phenomena. Her assertion that neurobiological models might entirely replace computational ones appears premature, given the unresolved gaps between neural processes and higher-level cognitive functions. Neurophilosophy thus struggles to provide a cohesive framework that integrates diverse approaches, leaving it vulnerable to critiques of reductionism and theoretical insufficiency.

Ray Jackendoff's theory of Conceptual Semantics aligns with the move toward an integrated view of cognition, in which language and thought are both computationally structured and embodied in interaction with the world. His Parallel Architecture model proposes that syntax, semantics, and phonology operate as distinct but interconnected components, linked by interface rules that map sound patterns to structures and meanings. Unlike generative grammar, which privileges syntax, Jackendoff treats these components as relatively autonomous, with words represented as lexical conceptual structures containing phonological, syntactic, and semantic information. However, this assumption has been criticized as circular, since the structures are said to derive from lexical entries that already encode them. The model's reliance on cognitive modularity is also questioned, as it fails to explain how modules evolved, became specialized for language, or dynamically interact with cognition over time. While evidence shows language and cognition co-shaping neural pathways, Jackendoff's framework leaves major gaps in accounting for these processes.
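For readers who find a concrete illustration helpful, the following is a minimal sketch, in Python, of the general idea of a lexical entry bundling phonological, syntactic, and semantic information, with an interface rule mapping sound to structure and meaning. It is an expository toy only, not Jackendoff's own formalism; all field names, sample entries, and the interface-rule function are assumptions introduced here for illustration.

    from dataclasses import dataclass
    from typing import Optional

    # Illustrative sketch only: a toy rendering of a lexical entry in the spirit of
    # the Parallel Architecture. The fields and entries below are invented examples.

    @dataclass
    class LexicalEntry:
        phonology: str   # sound pattern, e.g. "kat"
        syntax: str      # syntactic category, e.g. "N"
        semantics: dict  # conceptual structure, e.g. {"TYPE": "OBJECT"}

    LEXICON = [
        LexicalEntry("kat", "N", {"TYPE": "OBJECT", "CATEGORY": "CAT"}),
        LexicalEntry("ran", "V", {"TYPE": "EVENT", "ACTION": "RUN"}),
    ]

    def interface_rule(sound: str) -> Optional[dict]:
        """Toy interface rule: map a sound pattern onto the syntactic and
        semantic structures stored with it in the lexicon."""
        for entry in LEXICON:
            if entry.phonology == sound:
                return {"syntax": entry.syntax, "semantics": entry.semantics}
        return None

    print(interface_rule("kat"))
    # -> {'syntax': 'N', 'semantics': {'TYPE': 'OBJECT', 'CATEGORY': 'CAT'}}

The point of the sketch is simply that the three kinds of information sit side by side in one entry and are linked by a mapping rule, rather than being derived from syntax alone.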

The thesis offers an alternative model, largely founded upon Jackendoff's theory but with modifications that address the above-mentioned criticisms. This model, called the Conceptual Resonance Model, proposes an innate, universal framework in humans, called a Resonance Core, that can facilitate the emergence of language. On this model, the resonance core operates as a pre-structured cognitive space onto which linguistic and conceptual structures can be instantiated through interactions between the body and the environment. The resonance core is assumed to consist of universal mental archetypes with a predisposition for linguistic organisation across syntax, phonology, and semantics. The syntactic component of these archetypes can organise and structure perception, thought, and action; the phonological aspect allows sensory inputs to be interpreted through conceptual schemas such as path, action, and agent; and the semantic aspect of this mental faculty enables the agent to have an indivisible essence of meaning. On this model, when a stimulus (e.g., a spoken word) activates the phonological layer, it triggers parallel activations in the syntactic and semantic layers, enabling the conceptual structures encoded in the resonance core to resonate with the input and create a meaningful interpretation by matching the input to universal archetypes, all processed in parallel, as in Jackendoff's model. To articulate this novel model, the researcher has chosen to set aside analytic tools: the thesis adopts broader philosophical methods that allow her to advance Cognitive Science by borrowing insights from other philosophical traditions. Since a change of tools is acceptable when it is necessary for advancing one's theory, the researcher has undertaken such a shift in methodology.
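Purely as an expository aid, and not as part of the thesis itself, the parallel-activation idea can be pictured with the toy Python sketch below: a stimulus is matched against the phonological, syntactic, and semantic layers of a hypothetical resonance core at the same time, and an archetype "resonates" when every layer agrees. Every archetype, layer name, and matching rule here is an assumption invented for illustration.

    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical resonance core: each archetype carries a facet for each layer.
    RESONANCE_CORE = [
        {"archetype": "AGENT-ACTION", "phonology": "ran", "syntax": "V", "semantics": "ACTION"},
        {"archetype": "OBJECT", "phonology": "kat", "syntax": "N", "semantics": "THING"},
    ]

    def activate_layer(layer, stimulus):
        """Return the archetypes whose given layer matches the stimulus."""
        return {a["archetype"] for a in RESONANCE_CORE if a[layer] == stimulus.get(layer)}

    def interpret(stimulus):
        """Activate all three layers in parallel; keep archetypes that resonate on every layer."""
        layers = ["phonology", "syntax", "semantics"]
        with ThreadPoolExecutor() as pool:
            results = list(pool.map(lambda layer: activate_layer(layer, stimulus), layers))
        return set.intersection(*results) if results else set()

    # A spoken word activates the phonological layer; parallel activations follow.
    print(interpret({"phonology": "ran", "syntax": "V", "semantics": "ACTION"}))
    # -> {'AGENT-ACTION'}

The sketch is only meant to show the structural claim that interpretation arises from simultaneous matching across layers rather than from a serial pipeline.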

The thesis also incorporates certain nativist accounts that can conceptually support the proposed models of cognition and that, owing to their metaphysical backing, can to an extent withstand the criticisms levelled against their respective Western models. The exposition of the sentence as a partless whole and of meaning as a shared linguistic capacity between speaker and hearer is a major proposal of the Grammarian School. This notion that the essence of meaning is contained as an undifferentiated whole in language can be considered a framework supporting the Conceptual Resonance Model, where the inbuilt potential for meaning is what resonates with other features to produce the meaning of an expression. This intrinsic link between words and meaning offers an explanation for language acquisition, supported by experiments in Cognitive Science showing infants' innate ability to discriminate phonetic contrasts irrespective of their native language. However, as children grow, their ability to discriminate non-native phonetic units diminishes, shaped by language experience, which leads to language-specific perceptual mappings. This is why the transition from sphota to pratibha (meaning generation) can be understood differently in infants and adults, with experience potentially overpowering the innate linguistic capacity.

The other Indian school of thought the thesis takes up is the Buddhist school, particularly Svatantra Vijnanavada, to support the conceptual claims of embodied cognition in explaining the phenomenal world, which is transcended to reach the state of nirvana. The proponents of this school, particularly Dinnaga and Dharmakirti, emphasise immediate, experiential perception free from linguistic constructs, rejecting generality and class concepts, an orientation that resonates with embodied theories. The subsequent mental perception, or svasamvedana, again highlights the embodied nature of cognition, where the mind recognizes its own processes, grounding experience in the lived body. This embodied perspective fosters mindfulness, encouraging a person to stay present with momentary experiences rather than abstract conceptualizations. At the next stage, when it becomes a perception, universal features get attributed to it, taking it far from reality, as is the case for the embodied theorists as well. What is to be noted here is that while the embodied cognitivists consider the body as responding to cues from the environment, Dinnaga asserts a phenomenal existence wherein the world becomes meaningful in the experience of the Being. The body here becomes the knowing subject, thus dissolving the duality between mind and body. Further, the knowledge so attained is validated by its conduciveness in a pragmatic world (arthakriyakaritva), aligning with the embodiment theory that cognition is enactive. To ground meaning, Dharmakirti introduces imprints (vasana) to explain the regularity of cognition. While past imprints aid recognition, they alone cannot account for cognition; the phenomenal form produced in perception excludes other imprints that differ from the perceiver's goal-oriented expectation. These imprints resemble the subliminal alayavijnana of the early Yogacarins, where seeds accumulate in a storehouse and later shape experience. Dharmakirti distinguishes between experiential and innate imprints. Though ontologically distinct from phenomenal forms, they are identified even before language acquisition, as in infants, due to innate disposition. Since no two objects are identical, this disposition cannot be acquired but belongs to sentient beings by nature. Such an innate disposition may create a distortion by identifying two distinct entities as the same, leading to a dysfunctional engagement with the world, which Dharmakirti calls ignorance, whereas it is actually the causal capacity of the object and our cognitive function of expectation that produce such an identification. Thus, the Svatantra Vijnanavada school seems to offer a synthetic model of cognition, though it is primarily embodied. But considering the soteriological bent of the school, there cannot be an 'other'; the ultimate state of a being should be the state of nothingness, which the school portrays through its semiotic process explaining the three natures of the signified. The first is parikalpita, where there is the first entry into semiosis, as subject-object duality comes into play; before this stage, when there is no such duality and only immediate perception, no sign is involved. At the level of paratantra, there is a second semiosis, when the trichotomous division of subject, object, and perception is reduced to consciousness and representation. And at the third level, parinishpanna, there is a termination of signification, leading to signlessness, as there are no objects but only the enduring subject: a state of nirvana. Thus, though the ultimate aim of the school is not to explain empirical cognition, when it engages in this exercise it presents an embodied model with slight leanings towards a synthetic model, especially in its account of language acquisition and meaning.

Thus, it appears that almost all the synthetic models of cognition developed so far, though they have succeeded in redefining knowledge in significant ways, carry certain metaphysical pointers, especially when read alongside nativist accounts such as the Grammarian and Svatantra Vijnanavada schools. Whether a student of Cognitive Science should go to the extent of accepting the soteriological dimensions of these schools, or simply accept only those insights that are conducive to his or her purpose, remains an open question. At the same time, the Grammarian and Buddhist schools, stripped of their metaphysical principles, cannot be considered complete, since these philosophical systems coherently weave ontological, epistemic, and ethical concerns into a single strand, making them unique when compared to Western schools of thought.

Registration

No

Custom tags

#Cognition, #Meaning, #Computation, #Model, #Philosophy, #Indian Philosophy