This is John Roth's Typepad Profile.
John Roth
Recent Activity
What you're describing seems very similar to Charles Fillmore's Frame Semantics, which is one of the standard theories of semantics. The easiest way to see how it works is to use FrameNet, a database of frames that includes the one you use as an example. By the way, would you please fix the comment app? I get real itchy when it says it wants permission to maintain my Google+ profile just to post a comment here. ----- BLOGGER'S REPLY: Sorry about the Google+ business. I did not know it said that and will see what I can do.
Commented Mar 23, 2017 on Following the Experience at Babel's Dawn
I've always understood that locative prepositions have either a spatial or a temporal meaning, so I would treat "the game is over" as locative: it shifts orientation from the game itself to the game's location in time with respect to the speaker. Otherwise a very interesting take on it; I especially like your hypothesis that space is what exists between foreground percepts.
Commented Apr 17, 2015 on Space as a Matter of Attention at Babel's Dawn
Hi. I just noticed your new series. One item you mentioned earlier was Chomsky's reliance on some form of inherent meaning. I agree that, in the general sense, this is nonsense, but there is a more restricted sense in which it appears to be correct. This is Natural Semantic Metalanguage, a system created by several Australian linguists. While they don't deal with evolutionary issues, it has the advantage that the 65 basic concepts (primes, in their terminology) don't all have to appear in one shot; some of them could well have appeared before the emergence of compositional language. Natural Semantic Metalanguage is the Google search term. If you find Anna Wierzbicka (Polish, immigrated to Australia) or Cliff Goddard, you're in the right place. Sincerely, John Roth ------------------------ BLOGGER: Thanks for the Google term.
Commented Jul 1, 2014 on I’m Tired of Chomsky (Part III) at Babel's Dawn
Frankly, I think this is a tempest in a teapot. I agree with the previous commenter that the brain represents objects, actions, attributes and so forth in some kind of linked network. Linearizing such a network is a standard computer-science task, as is recreating the network from a linearized version. Thus it's no surprise that human language has the ability to embed; it's also no surprise that human beings tend to have difficulty if there's too much embedding, or embedding of the wrong type. It's also possible to dump such a network in a form that includes no recursion or embedding at all; that is, recursion is not a necessary feature of externalizing the network, merely a highly convenient one. To see this, consider the difference between a = b + c*d and x = c * d, a = b + x. One includes recursion, the other doesn't. The fact that Pirahã uses little to no embedding while all other known languages use a lot of embedding is interesting, and cries out for an explanation. I don't, unfortunately, expect that explanation until the appropriate fMRI and similar studies are done, and I don't expect those studies any time soon.
Commented Mar 28, 2012 on I, meaning me, say there. at Babel's Dawn
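The linearization point in the comment above can be sketched in code. This is a minimal illustration, with all names my own (hypothetical): the same linked network can be externalized with embedding, as a single nested expression, or without any embedding, as a flat list of named steps, mirroring the a = b + c*d versus x = c * d, a = b + x contrast.

```python
# Sketch: externalizing one linked network two ways, with and
# without embedding. All class/function names here are illustrative.

class Node:
    """One node in a linked network: an operator with two children,
    or a leaf holding a bare symbol."""
    def __init__(self, op, left=None, right=None, value=None):
        self.op, self.left, self.right, self.value = op, left, right, value

def leaf(symbol):
    return Node("leaf", value=symbol)

def nested(node):
    """Embedded form: the output's nesting mirrors the network."""
    if node.op == "leaf":
        return node.value
    return f"({nested(node.left)} {node.op} {nested(node.right)})"

def flatten(node, steps):
    """Flat form: every subexpression becomes its own named step,
    so the externalized output contains no embedding at all
    (even though this traversal is itself recursive)."""
    if node.op == "leaf":
        return node.value
    left = flatten(node.left, steps)
    right = flatten(node.right, steps)
    name = f"t{len(steps)}"
    steps.append(f"{name} = {left} {node.op} {right}")
    return name

# The network for a = b + c*d:
tree = Node("+", leaf("b"), Node("*", leaf("c"), leaf("d")))

print(nested(tree))         # embedded: (b + (c * d))
steps = []
result = flatten(tree, steps)
print(steps, "->", result)  # flat: ['t0 = c * d', 't1 = b + t0'] -> t1
```

Both outputs encode the same network, so either can be parsed back into it; that is the sense in which embedding is convenient for externalization but not necessary.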
I've got a couple of comments. First, there is now a much better and more believable explanation for the "great leap forward" that doesn't require the sudden invention of language: the "grandparent hypothesis." See for links. Second, Chomsky is getting really tiresome. What he's saying only makes sense if you ignore the rest of the system of which language is only a part. The end result of understanding something winds up in short-term memory, just as if the information had come in via direct sensory perception. Grammar, while important enough that the system wouldn't work without it, isn't central; short-term memory and the brain's "executive function" are. To the best of my knowledge, short-term memory doesn't have stopping points. Stopping points are strictly a function of grammar, and they're important because they allow refocusing of attention. As far as I'm concerned, the more important issue is disconnecting a vocabulary of sounds or gestures from a one-to-one (or at least one-to-small-number) mapping to meanings. You can see the rudiments of this in various animals that can learn other animals' vocabularies and interpret them appropriately as signaling kinds of danger, food, or whatnot.
Commented Aug 9, 2011 on How Old is Language? at Babel's Dawn
John Roth is now following The Typepad Team
Aug 9, 2011