I love this concept! Reading “together“ influential papers and academic texts. Fantastic!
I love this!
Thank you!!!!
Thank you so much for making this
This helped me a lot for my philosophy class, thank you
Has there ever been a rejoinder to Chomsky's review published by Skinner's friends?
I linked one such rejoinder in the video description (A re-appraisal of Chomsky's review, 50 years later (by a Skinner sympathizer): www.ncbi.nlm.nih.gov/pmc/articles/PMC2223153/)
But there's also a very recent response that invokes Large Language Models: lingbuzz.net/lingbuzz/007180
AI is an emulation of language; it's not language (a biological instinct that came about through evolution). A computer can emulate gravity, but it won't tell you what gravity is (a change in the fabric of space-time). It's the same with “language models.”
This is one definition you could have. By this definition, it would be impossible for any artificial intelligence to use language, no matter how well it emulated brains. The Star Trek computer doesn't use language, and neither does Lieutenant Commander Data of the Starship Enterprise. It would also mean that God and angels, if such entities exist, by definition do not use language. They could all do things that are a lot like language, but it wouldn't be language.
That is one definition, but I would prefer a more functional definition, one that includes anything that functions exactly like language. I think even with such a functional definition, there are still ways you could naturally define things so that what ChatGPT is doing isn't language - it doesn't initiate its own conversations, it doesn't have interests of its own that it tries to further by means of words, and so on. I think it's more useful to make this sort of argument that whatever ChatGPT is doing, it isn't the same thing as human language, rather than just defining language so that even Lt. Cmdr. Data by definition isn't using language, but is just doing something that sounds and acts exactly like language as far as any human is concerned.
@@keaswaran0 Functional definitions are pseudo-definitions at worst, teleology at best. Nature does not exist to serve any anthropocentric "function". This much was known to Plato, and Newton too seems to agree. This is also the distinction between "Science" and "Engineering".
@@keaswaran0 The topic of a definition that describes the essence of something vs a more functional definition is interesting. I would say that this problem can be solved in a more parsimonious way by assuming multiple forms of language. Perhaps we don’t have to settle for just one.
In any case, my point is that if what an AI does is a form of language, it doesn’t logically follow that it is the same language generated by humans and therefore explains it.
@@psicologiajoseh All of this presumes that one has a coherent definition of "Language". If you have context-specific definitions, then you don't have a definition at all. You merely have context-specific re-descriptions of the phenomena. Description is not explanation.
@@chomskyan4life Nature in general doesn't serve an anthropocentric function. But biological features do serve evolved functions. To the extent that a system is evolved (like human language) or designed (like artificial intelligence) it can be very useful to think of that system in terms of its functions. Biologists are very interested both in shared features across organisms that reflect preservation of shared ancestry with different functions (like human hands and the front legs of horses) and shared features across organisms that reflect similar function shaping ancestrally distinct body parts (like all the different families of desert plants that have evolved spines out of different ancestral plant parts).
To the extent that language is a biological feature that was the product of evolution, it is useful to think of it in functional terms.
Not now baby, I'm watching the cognitive revolution happen