The Entropy Concept in Probability Theory
1. Entropy of Finite Schemes
2. The Uniqueness Theorem
3. Entropy of Markov Chains
4. Fundamental Theorems
5. Application to Coding Theory

On the Fundamental Theorems of Information Theory
INTRODUCTION
CHAPTER I. Elementary Inequalities
1. Two generalizations of Shannon's inequality
2. Three inequalities of Feinstein
CHAPTER II. Ergodic Sources
3. Concept of a source. Stationarity. Entropy
4. Ergodic sources
5. The E property. McMillan's theorem
6. The martingale concept. Doob's theorem
7. Auxiliary propositions
8. Proof of McMillan's theorem
CHAPTER III. Channels and the Sources Driving Them
9. Concept of channel. Noise. Stationarity. Anticipation and memory
10. Connection of the channel to the source
11. The ergodic case
CHAPTER IV. Feinstein's Fundamental Lemma
12. Formulation of the problem
13. Proof of the lemma
CHAPTER V. Shannon's Theorems
14. Coding
15. The first Shannon theorem
16. The second Shannon theorem
CONCLUSION
REFERENCES
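The book's opening topic, the entropy of a finite scheme (a finite set of outcomes with probabilities summing to 1), can be illustrated with a short sketch. This is an illustrative example, not code from the book; the function name `entropy` is my own.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a finite scheme, in bits."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p = 0 contribute nothing, by the convention 0 * log 0 = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform scheme on 4 outcomes has maximal entropy log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
# A biased scheme is more predictable, so it carries less entropy.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # → 1.75
```

The second result previews the coding-theory application listed above: an optimal prefix code for the biased scheme needs 1.75 bits per symbol on average rather than 2.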