Book Introduction

Elements of Information Theory Second Edition
  • Author: Thomas M. Cover
  • Publisher: Wiley-Interscience
  • ISBN: 0471241954
  • Publication year: 2006
  • Listed page count: 748 pages
  • File size: 54 MB
  • File page count: 770 pages

PDF Download


  • Online PDF download of this book [recommended: cloud extraction, quick and convenient]: downloads the book directly in PDF format; works on both mobile and desktop.
  • Torrent download [fast over BT]. Note: please use the BT client FDM to download; see the software download page.
  • Direct link download [convenient but slow]
  • [Read this book online]
  • [Get the extraction code online]

Download Notes

Elements of Information Theory, Second Edition is available for download as a PDF ebook.

The downloaded file is a RAR archive; use an unarchiving tool to extract the PDF.

We recommend downloading with Free Download Manager (FDM), a free, ad-free, cross-platform BT client. All resources on this site are packaged as BT torrents, so a dedicated BitTorrent client such as BitComet, qBittorrent, or uTorrent is required. Xunlei (Thunder) is not recommended for now, because this title is not yet a popular resource on that network; once it becomes popular, Xunlei will also work.

(The file page count should be greater than the listed page count, except for multi-volume ebooks such as two- or three-volume sets.)

Note: every archive on this site has an extraction code. Click here to download the archive extraction tool.
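
If you would rather script the extraction step, here is a minimal Python sketch using the third-party rarfile library (pip install rarfile, plus an unrar backend on your PATH). The archive name and extraction code below are placeholders, not the actual values for this download:

    import rarfile  # third-party library; needs an unrar backend installed

    # Placeholder values: substitute the archive you downloaded and the
    # extraction code obtained from the link above.
    ARCHIVE = "elements_of_information_theory_2e.rar"
    EXTRACTION_CODE = "your-extraction-code"

    # Open the password-protected RAR archive and extract its contents
    # (the PDF) into the ./book directory.
    with rarfile.RarFile(ARCHIVE) as rf:
        rf.extractall(path="book", pwd=EXTRACTION_CODE)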

Table of Contents

1 Introduction and Preview
  1.1 Preview of the Book

2 Entropy, Relative Entropy, and Mutual Information
  2.1 Entropy
  2.2 Joint Entropy and Conditional Entropy
  2.3 Relative Entropy and Mutual Information
  2.4 Relationship Between Entropy and Mutual Information
  2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information
  2.6 Jensen’s Inequality and Its Consequences
  2.7 Log Sum Inequality and Its Applications
  2.8 Data-Processing Inequality
  2.9 Sufficient Statistics
  2.10 Fano’s Inequality
  Summary
  Problems
  Historical Notes

3 Asymptotic Equipartition Property
  3.1 Asymptotic Equipartition Property Theorem
  3.2 Consequences of the AEP: Data Compression
  3.3 High-Probability Sets and the Typical Set
  Summary
  Problems
  Historical Notes

4 Entropy Rates of a Stochastic Process
  4.1 Markov Chains
  4.2 Entropy Rate
  4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph
  4.4 Second Law of Thermodynamics
  4.5 Functions of Markov Chains
  Summary
  Problems
  Historical Notes

5 Data Compression
  5.1 Examples of Codes
  5.2 Kraft Inequality
  5.3 Optimal Codes
  5.4 Bounds on the Optimal Code Length
  5.5 Kraft Inequality for Uniquely Decodable Codes
  5.6 Huffman Codes
  5.7 Some Comments on Huffman Codes
  5.8 Optimality of Huffman Codes
  5.9 Shannon-Fano-Elias Coding
  5.10 Competitive Optimality of the Shannon Code
  5.11 Generation of Discrete Distributions from Fair Coins
  Summary
  Problems
  Historical Notes

6 Gambling and Data Compression
  6.1 The Horse Race
  6.2 Gambling and Side Information
  6.3 Dependent Horse Races and Entropy Rate
  6.4 The Entropy of English
  6.5 Data Compression and Gambling
  6.6 Gambling Estimate of the Entropy of English
  Summary
  Problems
  Historical Notes

7 Channel Capacity
  7.1 Examples of Channel Capacity
    7.1.1 Noiseless Binary Channel
    7.1.2 Noisy Channel with Nonoverlapping Outputs
    7.1.3 Noisy Typewriter
    7.1.4 Binary Symmetric Channel
    7.1.5 Binary Erasure Channel
  7.2 Symmetric Channels
  7.3 Properties of Channel Capacity
  7.4 Preview of the Channel Coding Theorem
  7.5 Definitions
  7.6 Jointly Typical Sequences
  7.7 Channel Coding Theorem
  7.8 Zero-Error Codes
  7.9 Fano’s Inequality and the Converse to the Coding Theorem
  7.10 Equality in the Converse to the Channel Coding Theorem
  7.11 Hamming Codes
  7.12 Feedback Capacity
  7.13 Source-Channel Separation Theorem
  Summary
  Problems
  Historical Notes

8 Differential Entropy
  8.1 Definitions
  8.2 AEP for Continuous Random Variables
  8.3 Relation of Differential Entropy to Discrete Entropy
  8.4 Joint and Conditional Differential Entropy
  8.5 Relative Entropy and Mutual Information
  8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information
  Summary
  Problems
  Historical Notes

9 Gaussian Channel
  9.1 Gaussian Channel: Definitions
  9.2 Converse to the Coding Theorem for Gaussian Channels
  9.3 Bandlimited Channels
  9.4 Parallel Gaussian Channels
  9.5 Channels with Colored Gaussian Noise
  9.6 Gaussian Channels with Feedback
  Summary
  Problems
  Historical Notes

10 Rate Distortion Theory
  10.1 Quantization
  10.2 Definitions
  10.3 Calculation of the Rate Distortion Function
    10.3.1 Binary Source
    10.3.2 Gaussian Source
    10.3.3 Simultaneous Description of Independent Gaussian Random Variables
  10.4 Converse to the Rate Distortion Theorem
  10.5 Achievability of the Rate Distortion Function
  10.6 Strongly Typical Sequences and Rate Distortion
  10.7 Characterization of the Rate Distortion Function
  10.8 Computation of Channel Capacity and the Rate Distortion Function
  Summary
  Problems
  Historical Notes

11 Information Theory and Statistics
  11.1 Method of Types
  11.2 Law of Large Numbers
  11.3 Universal Source Coding
  11.4 Large Deviation Theory
  11.5 Examples of Sanov’s Theorem
  11.6 Conditional Limit Theorem
  11.7 Hypothesis Testing
  11.8 Chernoff-Stein Lemma
  11.9 Chernoff Information
  11.10 Fisher Information and the Cramér-Rao Inequality
  Summary
  Problems
  Historical Notes

12 Maximum Entropy
  12.1 Maximum Entropy Distributions
  12.2 Examples
  12.3 Anomalous Maximum Entropy Problem
  12.4 Spectrum Estimation
  12.5 Entropy Rates of a Gaussian Process
  12.6 Burg’s Maximum Entropy Theorem
  Summary
  Problems
  Historical Notes

13 Universal Source Coding
  13.1 Universal Codes and Channel Capacity
  13.2 Universal Coding for Binary Sequences
  13.3 Arithmetic Coding
  13.4 Lempel-Ziv Coding
    13.4.1 Sliding Window Lempel-Ziv Algorithm
    13.4.2 Tree-Structured Lempel-Ziv Algorithms
  13.5 Optimality of Lempel-Ziv Algorithms
    13.5.1 Sliding Window Lempel-Ziv Algorithms
    13.5.2 Optimality of Tree-Structured Lempel-Ziv Compression
  Summary
  Problems
  Historical Notes

14 Kolmogorov Complexity
  14.1 Models of Computation
  14.2 Kolmogorov Complexity: Definitions and Examples
  14.3 Kolmogorov Complexity and Entropy
  14.4 Kolmogorov Complexity of Integers
  14.5 Algorithmically Random and Incompressible Sequences
  14.6 Universal Probability
  14.7 The Halting Problem and the Noncomputability of Kolmogorov Complexity
  14.8 Ω
  14.9 Universal Gambling
  14.10 Occam’s Razor
  14.11 Kolmogorov Complexity and Universal Probability
  14.12 Kolmogorov Sufficient Statistic
  14.13 Minimum Description Length Principle
  Summary
  Problems
  Historical Notes

15 Network Information Theory
  15.1 Gaussian Multiple-User Channels
    15.1.1 Single-User Gaussian Channel
    15.1.2 Gaussian Multiple-Access Channel with m Users
    15.1.3 Gaussian Broadcast Channel
    15.1.4 Gaussian Relay Channel
    15.1.5 Gaussian Interference Channel
    15.1.6 Gaussian Two-Way Channel
  15.2 Jointly Typical Sequences
  15.3 Multiple-Access Channel
    15.3.1 Achievability of the Capacity Region for the Multiple-Access Channel
    15.3.2 Comments on the Capacity Region for the Multiple-Access Channel
    15.3.3 Convexity of the Capacity Region of the Multiple-Access Channel
    15.3.4 Converse for the Multiple-Access Channel
    15.3.5 m-User Multiple-Access Channels
    15.3.6 Gaussian Multiple-Access Channels
  15.4 Encoding of Correlated Sources
    15.4.1 Achievability of the Slepian-Wolf Theorem
    15.4.2 Converse for the Slepian-Wolf Theorem
    15.4.3 Slepian-Wolf Theorem for Many Sources
    15.4.4 Interpretation of Slepian-Wolf Coding
  15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels
  15.6 Broadcast Channel
    15.6.1 Definitions for a Broadcast Channel
    15.6.2 Degraded Broadcast Channels
    15.6.3 Capacity Region for the Degraded Broadcast Channel
  15.7 Relay Channel
  15.8 Source Coding with Side Information
  15.9 Rate Distortion with Side Information
  15.10 General Multiterminal Networks
  Summary
  Problems
  Historical Notes

16 Information Theory and Portfolio Theory
  16.1 The Stock Market: Some Definitions
  16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio
  16.3 Asymptotic Optimality of the Log-Optimal Portfolio
  16.4 Side Information and the Growth Rate
  16.5 Investment in Stationary Markets
  16.6 Competitive Optimality of the Log-Optimal Portfolio
  16.7 Universal Portfolios
    16.7.1 Finite-Horizon Universal Portfolios
    16.7.2 Horizon-Free Universal Portfolios
  16.8 Shannon-McMillan-Breiman Theorem (General AEP)
  Summary
  Problems
  Historical Notes

17 Inequalities in Information Theory
  17.1 Basic Inequalities of Information Theory
  17.2 Differential Entropy
  17.3 Bounds on Entropy and Relative Entropy
  17.4 Inequalities for Types
  17.5 Combinatorial Bounds on Entropy
  17.6 Entropy Rates of Subsets
  17.7 Entropy and Fisher Information
  17.8 Entropy Power Inequality and Brunn-Minkowski Inequality
  17.9 Inequalities for Determinants
  17.10 Inequalities for Ratios of Determinants
  Summary
  Problems
  Historical Notes

Bibliography
List of Symbols
Index
