Natural Language Processing (NLP) began in the 1950s at the intersection of artificial intelligence and linguistics. This intersection produced many successful natural language processing systems, yet several theoretical and practical issues remained a matter of serious concern. As artificial intelligence and smart systems grew industrially, theoretical problems tended to be set aside while products were built and sold. The TINLAP (Theoretical Issues in Natural Language Processing) proceedings of 1975, 1978, and 1987 played a key role in working out solutions to these theoretical issues. Students and researchers from psychology, computational linguistics, and artificial intelligence were encouraged by people working on other aspects of understanding and interpreting natural language to develop models of the unresolved problems. The purpose of these workshops was not to discuss specific applications of natural language processing systems, but to concentrate on fundamental issues and to consider solutions and limitations across disciplinary boundaries by bringing together theoreticians from artificial intelligence, logic, psychology, philosophy, and linguistics.
SOME MAJOR THEORETICAL ISSUES IN NATURAL LANGUAGE PROCESSING INCLUDE
- Words and world representations
- Unification and new grammatism
- Memory part 1: Natural Language Input
- Metaphor
WORDS AND WORLD REPRESENTATIONS
Words play a major role in artificial intelligence in terms of understanding how a system views the world. This section of TINLAP 1987 focused on how a largely untapped wealth of information, relevant to many natural language processing tasks, can be extracted from the definitions of words. Several factors and phenomena shaped the renewed interest in words and lexical resources; one of the major factors was the emergence of many grammatical theories (e.g., phrase structure grammar, unification grammar, word grammar, and lexical-functional grammar).
The discussion centered on maintaining a clear distinction between word knowledge and world knowledge. It was recognized that knowing about an object or event in the world is not by itself enough to use the corresponding word properly: world knowledge must be supplemented and constrained by semantic knowledge to yield an adequate account of a word. The work of the Clarks (E. Clark 1973, H. Clark 1973) demonstrated several ties between perception and language, raising interesting questions about the interface between perceptual and linguistic knowledge and about the process by which a child may use perceptual knowledge as a bootstrapping device to connect words and world knowledge to an already existing innate semantic capacity. Recognizing the importance of a theory-neutral lexicon led to extensive discussion of the need for one, since the wall between word and world can only be lowered if all users share a more common and generalized vocabulary, language, or body of knowledge.
The concluding remarks on this topic squarely addressed the effort required to develop a theory-neutral lexicon that could be used by a variety of parsing and generation programs. Work toward a neutral lexicon would make the underlying data frameworks easier to understand and handle successfully, providing a more precise characterization of the points on which we agree or disagree.
UNIFICATION AND NEW GRAMMATISM
Unification grammar deals with denotational semantics: it classifies the objects of analysis (utterances and their parts) and associates them with large amounts of linguistic information (syntactic, semantic, lexical, and discourse) in a disciplined way. Rules describe a composite object in terms of descriptions of its parts, and the denotational semantics gives an exact specification of what a grammar does. From a computational point of view, however, we must also ask how a grammar works, not only what it does. The prototypical unification grammar consists of a context-free skeleton enriched with a set of feature specifications on the grammatical symbols in the rules and in the associated lexicon.
The unification formalism lends itself to a well-defined semantics because of its algebraic characterization (Pereira and Shieber, 1984). How to turn the formalism into efficient processing algorithms, however, is still work in progress. The unification operation builds new feature structures, while some string-combining operation (concatenation being the essential one) pairs the feature structures with strings (Shieber, 1986).
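The core operation these formalisms share can be illustrated with a minimal sketch. The code below unifies two feature structures represented as nested Python dictionaries; the feature names (`cat`, `subj`, `agr`) are illustrative assumptions, not drawn from any particular formalism, and the sketch omits the structure sharing (reentrancy) that full systems such as PATR-II support.

```python
def unify(fs1, fs2):
    """Unify two feature structures (nested dicts); return the merged
    structure, or None if some feature carries conflicting values."""
    if fs1 is None or fs2 is None:
        return None
    if not isinstance(fs1, dict) or not isinstance(fs2, dict):
        # Atomic values unify only if they are identical.
        return fs1 if fs1 == fs2 else None
    result = dict(fs1)
    for feat, val in fs2.items():
        if feat in result:
            merged = unify(result[feat], val)
            if merged is None:
                return None  # feature clash: unification fails
            result[feat] = merged
        else:
            result[feat] = val  # new feature: simply add it
    return result

# A verb requiring a third-person singular subject...
verb = {"cat": "V", "subj": {"agr": {"num": "sg", "per": "3"}}}
# ...unified with the subject noun phrase's contribution:
subj = {"subj": {"agr": {"num": "sg"}, "case": "nom"}}
print(unify(verb, subj))

# Conflicting values (sg vs. pl) make unification fail:
print(unify({"agr": {"num": "sg"}}, {"agr": {"num": "pl"}}))  # None
```

Note how unification is purely information-combining: compatible structures merge into a structure at least as specific as either input, and any clash makes the whole operation fail, which is what lets a grammar enforce agreement constraints declaratively.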
METAPHOR
Metaphor is a pervasive and significant phenomenon, both in literature and in everyday language, and it is also an enormously variable one. The term 'metaphor' is typically used to refer to nonliteral comparisons that are novel and vivid and that convey ideas which might otherwise be difficult to express (Ortony, 1975).
Theoretically, metaphors may serve three basic functions. First, a metaphor may enable one to express what is difficult or impossible to say if one is restricted to literal uses of language. Second, a metaphor may constitute a particularly compact means of communication. Third, a metaphor may help capture the vividness of phenomenal experience.
CONCLUSION
Natural language processing has shown tremendous growth in the industrial sector, producing products for artificial intelligence and expert systems, but in the process many theoretical issues were largely ignored. The TINLAP workshops therefore brought together theoreticians from artificial intelligence, logic, psychology, philosophy, and linguistics to trace out and work through these issues. Solutions to major theoretical problems such as unification and the new grammatism, words and world representations, and metaphor were pursued in earnest to sustain the progress of NLP and guard against its decline.
World knowledge was successfully separated from linguistic knowledge, and the words-and-world discussion focused on how words fit into understanding language and generating new ideas. Many grammatical formalisms were motivated and their degree of success assessed. In unification grammar, the denotational semantics gives an exact specification of what a grammar can do.