EETOP 创芯网论坛 (原名:电子顶级开发网)

Views: 6229 | Replies: 33

Book request: Mathematical Foundations of Information Theory

Posted 2009-5-28 20:44:42

Book request:
Mathematical Foundations of Information Theory
Year: 1957
Language: English
Pages: 142
Author: A. Ya. Khinchin
File format: PDF
Original filename: Mathematical Foundations of Information Theory

On the Fundamental Theorems of Information Theory
by
A I Khinchin

INTRODUCTION
Information theory is one of the youngest branches of applied probability theory; it is not yet ten years old. The date of its birth can, with certainty, be considered to be the appearance in 1947-1948 of the by now classical work of Claude Shannon. Rarely does it happen in mathematics that a new discipline achieves the character of a mature and developed scientific theory in the first investigation devoted to it. Such in its time was the case with the theory of integral equations, after the fundamental work of Fredholm; so it was with information theory after the work of Shannon.

From the very beginning, information theory presented mathematics with a whole new set of problems, including some very difficult ones. It is quite natural that Shannon and his first disciples, whose basic goal was to obtain practical results, were not able to pay enough attention to these mathematical difficulties at the beginning. Consequently, at many points of their investigations, they were compelled either to be satisfied with reasoning of an inconclusive nature or to limit artificially the set of objects studied (sources, channels, codes, etc.) in order to simplify the proofs. Thus, the whole mass of literature of the first years of information theory, of necessity, bears the imprint of mathematical incompleteness which, in particular, makes it extremely difficult for mathematicians to become acquainted with this new subject. The recently published general textbook on information theory by S. Goldman can serve as a typical example of the style prevalent in this literature.

Investigations with the aim of setting information theory on a solid mathematical basis have begun to appear only in recent years and, at the present time, are few in number. First of all, we must mention the work of McMillan, in which the fundamental concepts of the theory of discrete sources (source, channel, code, etc.) were first given precise mathematical definitions. The most important result of this work must be considered to be the proof of the remarkable theorem that any discrete ergodic source has the property which Shannon attributed to sources of Markov type and which underlies almost all the asymptotic calculations of information theory. This circumstance permits the whole theory of discrete information to be constructed without being limited, as was Shannon, to Markov type sources. In the rest of his paper McMillan tries to put Shannon's fundamental theorem on channels with noise on a rigorous basis. In doing so, it becomes apparent that the sketchy proof given by Shannon contains gaps which remain even in the case of Markov sources. The elimination of these gaps is begun in McMillan's paper, but is not completed.

Next, it is necessary to mention the work of Feinstein. Like McMillan, Feinstein considers the Shannon theorem on channels with noise to be the pinnacle of the general theory of discrete information and he undertakes to give a mathematically rigorous proof of this theorem. Accepting completely McMillan's mathematical apparatus, he avoids following Shannon's original path and constructs a proof, using the completely new and apparently very fruitful idea of a "distinguishable set of sequences", the principal features of which will be explained below. However, Feinstein carries out the proof in all details only for the simplest and least practical case, where the successive signals of the source are mutually independent and the channel memory is zero. In the more general case, he indicates only sketchily how the reader is to carry out the necessary reasoning independently. Unfortunately, there remains a whole series of significant difficulties.

As is well known, Shannon formulated his theorem on channels with noise in two different ways. One was in terms of a quantity called equivocation, and the other was in terms of the probability of error. McMillan's analysis leads to the conclusion that these two formulations are not equivalent, and that the second gives a more exact result than the first. Feinstein's more detailed investigation showed that although the first formulation is implied by the second, a rigorous derivation of this implication is not only non-trivial but fraught with considerable additional difficulties. Since both formulations are equally important in actual content, it is preferable to speak about two Shannon theorems rather than combine them under the same heading.

In this paper I attempt to give a complete, detailed proof of both of these Shannon theorems, assuming any ergodic source and any stationary channel with a finite memory. At the present time, apparently, these are the broadest hypotheses under which the Shannon theorems can be regarded as valid. On the whole, I follow the path indicated in the works of McMillan and Feinstein, deviating from them only in the comparatively few cases when I see a gap in their explanation, or when another explanation seems to me more complete and convincing (and sometimes, more simple).

The first chapter of the paper, which is of purely auxiliary character, requires special explanation. It is devoted to the derivation of a whole set of unrelated inequalities, each of which is a theorem of elementary probability theory (i.e., pertains only to finite spaces). The reader acquainted with my paper The entropy concept in probability theory (Russian, 1953) will be able to begin this paper with the second chapter, returning to the first chapter only when references to its results appear in the text. All the following chapters are constructed according to a specific plan, and cannot be skipped or read in a different order.

The reader will see that the path to the Shannon theorems is long and thorny, but apparently science, at this time, knows no shorter path if we do not want artificial restrictions on the material studied and if we are to avoid making statements which we cannot prove.
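As a quick numerical illustration of the "equivocation" mentioned in the introduction above (this is not from the book, just a reader's sketch): for a binary symmetric channel with crossover probability eps and a uniform input, the equivocation H(X|Y) equals the binary entropy H(eps), and the mutual information I(X;Y) = H(X) - H(X|Y) = 1 - H(eps) bits.

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Binary symmetric channel with crossover probability eps, uniform input.
eps = 0.1
H_X = 1.0                      # entropy of the uniform binary input, in bits
equivocation = h2(eps)         # H(X|Y) = H(eps) for a BSC with uniform input
mutual_info = H_X - equivocation  # I(X;Y) = 1 - H(eps), the BSC capacity

print(f"equivocation H(X|Y) = {equivocation:.4f} bits")
print(f"mutual information I(X;Y) = {mutual_info:.4f} bits")
```

The "probability of error" formulation, by contrast, concerns the residual error rate of the best decoder at rates below this mutual information; the two quantities are related but, as Khinchin notes, not trivially equivalent.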

I really need this book. Could anyone share a copy?
Many thanks!
Posted 2009-5-30 09:08:34



Mathematical Foundations of Information Theory
By A. Ya. Khinchin

  • Publisher:   Dover Publications
  • Number Of Pages:   120
  • Publication Date:   1957-06-01
  • ISBN-10 / ASIN:   0486604349
  • ISBN-13 / EAN:   9780486604343
  • Binding:   Paperback


Product Description:

Comprehensive, rigorous introduction to work of Shannon, McMillan, Feinstein and Khinchin. Translated by R. A. Silverman and M. D. Friedman.



Summary: Mathematical foundation of information theory
Rating: 4
The main advantage of the book is that it covers the information theory corresponding to different fields of science and art. This is rather interesting and surprising for EE students and scholars. The mathematics therein, however, is not rigorous.


Summary: A clear exposition of Shannon's results by a great mathemati
Rating: 5
A. Y. Khinchin was one of the great mathematicians of the first half of the twentieth century. His name is already well known to students of probability theory, along with A. N. Kolmogorov and others, from the host of important theorems, inequalities, and constants named after them. He was also famous as a teacher and communicator. The books he wrote on Mathematical Foundations of Information Theory, Statistical Mechanics and Quantum Statistics are still in print in English translations, published by Dover. Like William Feller and Richard Feynman, he combines a complete mastery of his subject with an ability to explain clearly without sacrificing mathematical rigour.
In his "Mathematical Foundations" books Khinchin develops a sound mathematical structure for the subject under discussion based on the modern theory of probability. His primary reason for doing this is the lack of mathematically rigorous presentation in many textbooks on these subjects.
This book contains two papers written by Khinchin on the concept of entropy in probability theory and on Shannon's first and second theorems in information theory, with detailed modern proofs. Like all Khinchin's books, this one is very readable. And unlike many recent books on this subject, the price is very low. Two minor complaints: it lacks an index, and the typesetting could be improved.


Summary: More rigorous version of Shannon 1948 paper
Rating: 5

Shannon's paper is great. Easy to read (though many people misunderstand many concepts - I may too) but lacks mathematical rigor. This book has redone several points that Shannon made but more accurately. It requires ergodic theory and measure theory to follow every detail, but some parts may be usable even without much background. I don't think the book is perfectly edited, but I know I paid too little for the knowledge I gained from this book.



Credit: hephep
PDF, 300 dpi, 120 pp., 16.5 MB
http://ifile.it/kpzijw5/khinchin_a.i_-_mathematical_foundations_of_information_theory.pdf
http://rapidshare.com/files/169691625/khinchin_a.i_-_mathematical_foundations_of_information_theory.pdf
Posted 2009-5-30 13:43:14
It's valuable.
OP | Posted 2009-6-7 21:31:51
Wonderful, thanks to the kind soul for sharing!
Posted 2009-6-8 14:59:38
Is it really that big a deal?
Posted 2009-6-17 07:56:12

How can I download it?
Posted 2009-6-18 04:00:09
look look!
Posted 2009-6-23 17:12:03
Great stuff, taking a look.
Posted 2009-6-24 19:45:44

very good
