Tuesday, October 22, 2019
Artificial Intelligence Essays (5026 words) - Cognitive Science
Artificial Intelligence

Artificial Intelligence is based on the view that the only way to prove you know the mind's causal properties is to build it. In its purest form, AI research seeks to create an automaton possessing human intellectual capabilities and, eventually, consciousness. There is no widely accepted theory of human consciousness, yet AI pioneers like Hans Moravec enthusiastically postulate that in the next century machines will either surpass human intelligence or human beings will become machines themselves (through a process of scanning the brain into a computer). Those such as Moravec, who see the eventual result as the universe extending into a single thinking entity as the post-biological human race expands to the stars, base their views on the idea that the key to human consciousness is contained entirely in the physical entity of the brain. While Moravec (who is head of Robotics at Carnegie Mellon University) often sounds like a New Age psychedelic guru professing the next stage of evolution, the stance of most AI research (and the one that concerns this paper) is expressed by Roger Schank: the question is not 'can machines think?' but rather, can people think well enough about how people think to be able to explain that process to machines?

This paper will explore the relation of linguistics, specifically the views of Noam Chomsky, to the study of Artificial Intelligence. It will begin by showing the general implications of Chomsky's linguistic breakthrough as they relate to machine understanding of natural language. Secondly, we will see that the theory of syntax based on Chomsky's own minimalist program, which takes semantics as a form of syntax, has potential implications for the field of AI. The goal, therefore, is to show the interconnectedness of language with any attempt to model the mind, and in the process to explain Chomsky's influence on the beginnings of the field and, lastly, his potential influence on current and future research.

Chomsky essentially founded modern linguistics in seeking out a systematic, testable theory of natural language. He hypothesized the existence of a language organ within the brain, wired with a deep-structured universal grammar that is transmitted genetically and underlies the superficial structures of all human languages. Chomsky asserted that underlying meaning is carried in the universal grammar of deep structures and transformed, by a series of operations he termed transformational rules, into the less abstract surface structures that are the spoken forms of the various natural languages. He also showed that mental activities in general can and should be investigated independently of behavior and cognitive underpinnings. This idealization of the linguistic capability of a native speaker brought Chomsky to his nativist, internalist, and constructivist philosophical views of language and mind. This concept of generative grammar can be seen as a 'machine', in the abstract Turing sense, that can be used to generate all the grammatical sentences in a given language. Chomsky was searching for a formal method of describing the possible grammatical sentences of a language, much as the Turing machine (more below) was used to specify what is possible in the language of mathematics.
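The 'machine' metaphor can be made concrete. The following minimal sketch, in Python, uses an invented toy grammar (not anything Chomsky proposed) to show the idea: a handful of rewrite rules is the grammar, and repeatedly expanding symbols until only words remain is the 'machine' that produces grammatical strings.

import random

# A toy context-free grammar written as rewrite rules.
# The symbols and vocabulary are illustrative inventions.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["linguist"], ["machine"], ["sentence"]],
    "V":   [["parses"], ["generates"]],
}

def generate(symbol="S"):
    # Terminal word: nothing left to expand.
    if symbol not in GRAMMAR:
        return [symbol]
    # Non-terminal: pick one rewrite rule and expand each part in turn.
    words = []
    for part in random.choice(GRAMMAR[symbol]):
        words.extend(generate(part))
    return words

for _ in range(3):
    print(" ".join(generate()))  # e.g. "the machine generates a sentence"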
Chomsky's transformational generative grammar (TGG) had the greatest influence on AI in that it was a specification for a machine that went beyond the syntax of a language to its semantics, or the ways that meanings are generated. An ambiguous sentence like 'I like her cooking' or 'flying planes can be dangerous' could have a single surface structure arising from multiple deep structures, just as semantically equivalent sentences, such as those related by a transformation from active to passive voice, could have different surface structures emerging from the same deep structure (see the sketch below). Computational linguists and AI researchers saw that these rules, once understood, could be applied, or mechanized, within a formal mathematical system. Here, natural languages were strings of symbols constructed according to different conventions, which needed to be converted into a universal human 'machine code.' From a computational viewpoint, language is an abstract system for manipulating symbols; the universal grammar could be purified in the mathematical sense, in other words, made independent of physical reality. Semantics in this view would simply be an application of the abstract syntax onto the real world. Chomskyan linguistics, as we shall see further on, does
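The active/passive case above can be illustrated with a deliberately crude sketch (the triple representation and the tiny participle table are assumptions for illustration, not Chomsky's formalism): a single 'deep structure' surfaces as two different sentences through mechanical rules of exactly the kind researchers hoped to formalize.

# A "deep structure" reduced to an (agent, verb, patient) triple -- a toy
# stand-in for Chomsky's trees, used only to show one underlying meaning
# surfacing as two different sentences.

PARTICIPLES = {"writes": "written", "parses": "parsed"}  # toy lookup table

def active_surface(deep):
    agent, verb, patient = deep
    return f"{agent} {verb} {patient}"

def passive_surface(deep):
    agent, verb, patient = deep
    return f"{patient} is {PARTICIPLES[verb]} by {agent}"

deep = ("the linguist", "writes", "the grammar")
print(active_surface(deep))   # the linguist writes the grammar
print(passive_surface(deep))  # the grammar is written by the linguist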