Cody and Tony (on “Relations”)

   (The dialogue between Cody, a Codicalist, and Tony, a Platonist, continues.)

CODY: Are all relations physical, or are some relations nonphysical?

TONY: I’d say some are nonphysical. Being uncountably infinite is a property (a one-place relation), but it’s not physical. Being the sidekick of Sherlock Holmes is a relation, but it’s not physical. Do you count “linguistic” relations, e.g., as physical?

CODY: There is an uncountable infinity only if physical hypercomputing (of a sufficient Turing level) can be found to occur in nature or in a machine we make.
    – Hyperarithmetical sets and iterated Turing jumps: the hyperarithmetical hierarchy
    – Reverse mathematics, countable and uncountable: a computational approach
    – Effective Mathematics of the Uncountable
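Cody’s premise here, that ordinary (non-hyper) computation never gets past the countable, can be illustrated with a minimal sketch. The function name and the use of binary strings as stand-ins for machine descriptions are illustrative choices, not anything from the readings above:

```python
from itertools import count, product

def all_programs():
    """Enumerate every finite binary string (a stand-in for Turing
    machine descriptions), shortest first. The enumeration is total
    and injective, so the set of all descriptions is countable."""
    for length in count(0):
        for bits in product("01", repeat=length):
            yield "".join(bits)

# Each description gets a unique natural-number index (in effect a
# Goedel numbering), which is exactly what countability means.
progs = all_programs()
first_seven = [next(progs) for _ in range(7)]
print(first_seven)  # ['', '0', '1', '00', '01', '10', '11']
```

Since every ordinary machine is one item in such a list, whatever those machines compute stays countable; only a hypercomputer of some kind could break past this.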

“The sidekick of Sherlock Holmes” (or all “linguistic” things, for that matter) is physical: it’s print in books, magnetic states on disks, chemicals on films, neuronal states in brains, … .

TONY: Relations, per se, are neither physical nor nonphysical. Relations can be between physical things; or nonphysical things; or, even between physical and nonphysical things.

CODY: An ontology of physical + nonphysical is Platonism or Dualist Mentalism (which physicalism rejects).

The confusion between “linguistics” and “physics” — between (mathematical/formal) models/languages and (physical) reality/substrate — is what Victor Stenger defined as “platonism”.
    – scientificamerican.com/article/physicists-are-philosophers-too

 

Philip Thrift


2 thoughts on “Cody and Tony (on “Relations”)”

  1. Hi Philip,
    May I suggest that when I say “tree” you think of a tree, but it is not the same tree. I had communicated a symbol or code for a tree. There was no tree-ness in the symbol. The tree in my mind is different than the tree in yours. Symbolic information is not physical; only the text or sound is physical. Therefore, physicalism is a grossly incomplete description of our existence?


    1. See the note “Is biocomputing > computing?”: A purely “informational” computing model of consciousness would be insufficient. Biocomputing, incorporating a phenomenological semantics in addition to an informational semantics, is required. Also, BCPT defeats Searle’s Chinese Room Argument: Searle may be right in arguing that purely informational (“symbol-manipulation”) or linguistic computing is insufficient for full-blown AI (consciousness), but he does not consider substrative (biological) computing. (Cf. substrative vs. linguistic compiler.)

