
基于人工神经网络的机器翻译 (Machine Translation Based on Artificial Neural Networks)

Price: ¥25.00

Author: 许罗迈
Publisher: 科学出版社 (Science Press)
Tags: Fundamentals of automation theory

ISBN: 9787030189813 · Published: 2007-06-01 · Binding: Paperback · Pages: 216

Synopsis

Following the corpus-based statistical approach to machine translation, which divides the task into two processes, a translation model and a language model, the author attempts to apply artificial neural network techniques to both, so that they cover the entire translation process; this is an original piece of work. Using a neuron self-learning method, the system starts from a small number of examples and, through self-learning, builds up a machine lexicon together with the corresponding translations. The experiments reported in this study show that, within a well-defined domain, the system can produce reasonably fluent target-language output. Training the translation model with a distributed neural network architecture largely overcomes the limited learning capacity of a single network, opens a new line of thinking for neural network language processing, and is of considerable significance.

For the language model, the author also proposes a new solution. Departing from the common practice of training neural networks on complex syntactic and semantic features, the approach trains on part-of-speech tags, with a self-devised inventory of word-movement symbols as the training target. This is a distinctive method; although the author notes that it did not achieve the expected results, if the distributed neural network architecture were also applied to training the language model, as the author suggests, whether this method can succeed remains an open question.
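The decomposition above follows the standard statistical MT formulation, in which the best translation maximizes the product of a translation-model score and a language-model score. As a rough illustration of the distributed-network idea, one small network per source word choosing a translation from its part-of-speech context, here is a minimal Python sketch; the tag inventory, class names, and toy data are hypothetical and are not taken from the book's actual NeuroTrans implementation.

import numpy as np

POS_TAGS = ["NN", "VB", "JJ", "DT", "PRP"]   # toy tag inventory (hypothetical)

def encode_context(left_tag, right_tag):
    # One-hot encode the POS tags flanking the source word.
    vec = np.zeros(2 * len(POS_TAGS))
    vec[POS_TAGS.index(left_tag)] = 1.0
    vec[len(POS_TAGS) + POS_TAGS.index(right_tag)] = 1.0
    return vec

class WordClassifier:
    # One small softmax classifier per source word. This per-word split is
    # the "distributed" part: growing the vocabulary adds networks instead
    # of enlarging a single monolithic model.
    def __init__(self, translations, lr=0.5):
        self.translations = translations
        self.W = np.zeros((len(translations), 2 * len(POS_TAGS)))
        self.lr = lr

    def _probs(self, x):
        z = self.W @ x
        e = np.exp(z - z.max())          # numerically stable softmax
        return e / e.sum()

    def train(self, x, target):
        # One gradient step toward the observed translation.
        p = self._probs(x)
        y = np.zeros(len(self.translations))
        y[self.translations.index(target)] = 1.0
        self.W += self.lr * np.outer(y - p, x)

    def translate(self, x):
        return self.translations[int(np.argmax(self._probs(x)))]

# Hypothetical lexicon: each ambiguous source word owns its own network.
lexicon = {"bank": WordClassifier(["bank_river", "bank_money"])}
samples = [("DT", "NN", "bank_money"), ("JJ", "VB", "bank_river")]
for _ in range(50):                       # a few self-learning passes
    for left, right, gold in samples:
        lexicon["bank"].train(encode_context(left, right), gold)
print(lexicon["bank"].translate(encode_context("DT", "NN")))  # -> bank_money

Because each word's network is trained independently, the limited capacity of any single network bounds only one word's ambiguity, which is the property the synopsis credits the distributed architecture with.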

About the Author

No author biography is available for this title.

Table of Contents

Preface
Acknowledgements
Chapter One Prologue
Chapter Two MT state of the art
 2.1 MT as symbolic systems
 2.2 Practical MT
 2.3 Alternative technique of MT
  2.3.1 Theoretical foundation
  2.3.2 Translation model
  2.3.3 Language model
 2.4 Discussion
Chapter Three Connectionist solutions
 3.1 NLP models
 3.2 Representation
 3.3 Phonological processing
 3.4 Learning verb past tense
 3.5 Part of speech tagging
 3.6 Chinese collocation learning
 3.7 Syntactic parsing
  3.7.1 Learning active/passive transformation
  3.7.2 Confluent preorder parsing
  3.7.3 Parsing with flat structures
  3.7.4 Parsing embedded clauses
  3.7.5 Parsing with deeper structures
 3.8 Discourse analysis
  3.8.1 Story gestalt and text understanding
  3.8.2 Processing stories with script knowledge
 3.9 Machine translation
 3.10 Conclusion
Chapter Four NeuroTrans design considerations
 4.1 Scalability and extensibility
 4.2 Transfer or interlingual
 4.3 Hybrid or fully connectionist
 4.4 The use of linguistic knowledge
 4.5 Translation as a two-stage process
 4.6 Selection of network models
 4.7 Connectionist implementation
 4.8 Connectionist representation issues
 4.9 Conclusion
Chapter Five A neural lexicon model
 5.1 Language data
 5.2 Knowledge representation
  5.2.1 Symbolic approach
  5.2.2 The statistical approach
  5.2.3 Connectionist approach
  5.2.4 NeuroTrans' input/output representation
  5.2.5 NeuroTrans' lexicon representation
 5.3 Implementing the neural lexicon
  5.3.1 Words in context
  5.3.2 Context with weights
  5.3.3 Details of algorithm
  5.3.4 The Neural Lexicon Builder
 5.4 Training
  5.4.1 Sample preparation
  5.4.2 Training results
  5.4.3 Generalization test
 5.5 Discussion
  5.5.1 Adequacy
  5.5.2 Scalability and extensibility
  5.5.3 Efficiency
  5.5.4 Weaknesses
Chapter Six Implementing the language model
 6.1 Overview
 6.2 Design
  6.2.1 Redefining the generation problem
  6.2.2 Defining jumble activity
  6.2.3 Language model structure
 6.3 Implementation
  6.3.1 Network structure, sampling, training and results
  6.3.2 Generalization test
 6.4 Discussion
  6.4.1 Insufficient data
  6.4.2 Information richness
  6.4.3 Insufficient contextual information
  6.4.4 Distributed language model
Chapter Seven Conclusion
Chapter Eight References
Index
