Saturday, March 15, 2008

satisfaction and correctness

I think there is a useful distinction to be drawn between these two types of 'intentionality' – the first is a type of 'satisfaction' condition, the latter a type of 'correctness' condition.

Type A.

Desire - satisfied/dissatisfied
Goal - attained/unattained
Problem - solved/unsolved
Competition - won/lost
Question - answered/unanswered
Task - completed/incomplete
Difficulty - resolved/unresolved
Fact - established/not established
Theory - confirmed/disconfirmed
Understanding - attained/unattained

And so on.

Type B.

Belief/statement - true/false
Reasoning - valid/invalid
Behaviour - appropriate/inappropriate
Action - rational/irrational
Game playing - correct/incorrect
Object - valuable/worthless
Performance - good/bad
Action - moral/immoral
Institution - just/unjust
Situation - fair/unfair

Question: Do these regulative criteria/judgements develop together: is there a common developmental root?

Thursday, March 13, 2008

Cognitive competences related to theory of mind.

On the theory I've been developing over the past year, there is the following progression of abilities: association > hidden cause tracking (may involve basic simulation) > normative standard 'content'/intentionality > aspectual perspective taking > normative standard (deep) perspective taking.

It is only when we get to aspectual perspective taking, at around 6 years old, that we get real 'theory of mind' on this view.

Normative standard intentionality is not domain-specific to 'minds': it is externalist with respect to multiple domains – behavioural, mental, action, social practice, objects, tools, etc.

Sunday, March 9, 2008

mental simulation and particulars

Mental simulation/mental modeling - an aspect of causal reasoning - requires the representation of particulars, located in space and time as unique individuals.

Emergent consciousness

Gell-Mann in this talk suggests that the human mind, like life itself, could be emergent – a product of laws and accidents. In the case of the human mind, it could be an emergent property of the complexity of the neocortex. The cerebellum – which is not involved in conscious awareness – has a comparable neuron density, but is not as interconnected as the cortex: it is not as complex. I think the idea that consciousness is an emergent 'global' property is more plausible than the idea that it is a module, an exaptation, an assembly, and so on.

Gell-Mann's TED talk: http://www.ted.com/index.php/talks/view/id/194