Network Information Theory

Lessons

  1. Network Information Theory (Lecture- 01)

  2. How to quantify information?

  3. Hartley's measure and Shannon's measure

  4. What is entropy and what is the IT inequality?

  5. Chain rules of entropy and mutual information

  6. Properties of entropy, conditional entropy, and relative entropy

  7. Network Information Theory (Lecture- 02)

  8. Jensen's Inequality

  9. Fano's Lemma

  10. Shannon-McMillan-Breiman theorem

  11. Coding a single random variable

  12. Prefix-free codes

  13. Kraft's inequality

  14. Log-sum inequality

  15. Data Processing Lemma

  16. Network Information Theory (Lecture- 03)

  17. Weak law of large numbers and strong law of large numbers

  18. Network Information Theory (Lecture- 04)

  19. Rooted tree with probabilities

  20. Shannon-Fano coding

  21. Network Information Theory (Lecture- 05)

  22. Huffman code

  23. Network Information Theory (Lecture- 06)

  24. Coding an information source

  25. Lempel-Ziv 1978

  26. Discussion of the equipartition property

  27. Lempel-Ziv 1977

  28. Network Information Theory (Lecture- 08)

  29. Coding of positive integers

  30. Discrete stationary source (DSS)

  31. Block-to-variable-length coding of a DSS

  32. Network Information Theory (Lecture- 07)

  33. Jointly typical sequences

  34. Network Information Theory (Lecture- 09)

  35. Noisy channel coding theorem

  36. Elias-Willems source coding

  37. Fano's inequality

  38. Network Information Theory (Lecture- 10)

  39. Weak converse

  40. Discrete memoryless channel

  41. Strongly symmetric channel

  42. Network Information Theory (Lecture- 12)

  43. Uniformly dispersive channel and uniformly focusing channel

  44. Question

  45. Network Information Theory (Lecture- 11)

  46. Recap and outline

  47. Modeling the medium

  48. Mutual information, differential entropy, and capacity

  49. Rate distortion function

  50. Network Information Theory (Lecture- 14)

  51. Achievability of the rate distortion function

  52. Converse of the rate distortion theorem

  53. Network Information Theory (Lecture- 15)

  54. Introduction

  55. Network Information Theory (Lecture- 13)

  56. Introduction to Network IT

  57. Classic 1-Hop Example

  58. Distributed Source Coding: Slepian-Wolf Problem

  59. Wyner-Ziv problem (Lecture- 17)

  60. Slepian-Wolf problem (Lecture- 16)

  61. Network Information Theory (Lecture- 18)

  62. Wyner-Ziv problem (continued)

  63. Generalizations

  64. Recap of Distributed Source Coding

  65. Network Information Theory (Lecture- 19)

  66. Broadcast Channels

  67. 2nd Moment Method

  68. Some Special Cases (Lecture- 20)

  69. Network Information Theory (Lecture- 21)

  70. Network Information Theory (Lecture- 24)

  71. Network Information Theory (Lecture- 25)

  72. Network Information Theory (Lecture- 23)

  73. Network Information Theory (Lecture- 27)

  74. Network Information Theory (Lecture- 26)

  75. Network Information Theory (Lecture- 22)

  76. Network Information Theory (Lecture- 28)

  77. Superposition Coding

  78. Network Information Theory (Lecture- 30)

  79. Interference Channels (Lecture- 31)

  80. Relay Channels (continued) (Lecture- 29)

  81. Degraded Broadcast Channels

  82. AWGN Broadcast Channel

  83. Binary Symmetric Broadcast Channel

  84. Multiple-Access Channels (MACs)

  85. Binary Adder Channel

  86. Broadcast Channel (Recap)

  87. (Scalar) AWGN Channel

  88. Multiple-Access Channels (MACs) (continued)

  89. Successive Interference Cancellation (SIC)

  90. Multiple-Access Channels (MACs) (continued)

  91. Rate Splitting

  92. Duality - Scalar Case

  93. Vector AWGN Channel

  94. Interference Channels

  95. Multiple-Access Channels (MACs) (continued)

  96. Coding Strategies

  97. Han-Kobayashi Region

  98. AWGN ICs

  99. Han-Kobayashi Region (continued)

  100. Relay Channels

  101. Physically Degraded RCs

  102. Relay Channels (continued)

  103. Symbol Relaying (Amplify-Forward)

  104. Partial Decode Forward (PDF)

  105. Superposition Coding

  106. Degraded Broadcast Channels
