Department of Computer Science
California State University, Stanislaus

CS4450-001: Coding and Information Theory

Spring 2024


 

Instructor: Dr. Xuejun Liang

Office: DBH 282

Office Hours: M 2:00–3:00 p.m. and WF 10:00–11:00 a.m. (Zoom Meeting ID: 4438930033).

Phone: (209) 667-3169. Email: xliang@csustan.edu

 

Class Information

Classroom: Bizzini 117 / Online

Class Date & Time: MWF 11:00 a.m.-11:50 a.m.

Class Website: https://www.cs.csustan.edu/~xliang/Courses2/CS4450-24S

 

Course Materials

Textbook:

·         Information and Coding Theory, by Gareth A. Jones and J. Mary Jones, Springer, 2000, ISBN 978-1-85233-622-6

 

Reference Book:

·         Information Theory, Coding and Cryptography, by Arijit Saha, Nilotpal Manna, and Mandal, Pearson India, 2013, Print ISBN-13: 978-81-317-9749-5, Web ISBN-13: 978-93-325-1785-1

Lecture Slides

Chapter 1 Source Coding (A, B, C)

Chapter 2 Optimal Codes (A, B)

Chapter 3 Entropy (A, B, C)

Chapter 4 Information Channels (A, B, C)

Chapter 5 Using an Unreliable Channel (A, B)

Chapter 6 Error-correcting Codes (A, B, C, D)

Chapter 7 Linear Codes (A, B, C, D, E)

 

Lecture Slides on Probability and Mathematical Fundamentals

Overview of Probability: Probability-A and Probability-B

Mathematical Fundamentals (Part-A, Part-B, Part-C)

 

Course Syllabus and Major Topics

 

Course Description

Topics to be selected from: error-detecting and error-correcting codes; encryption and decryption techniques; RSA and knapsack codes; algebraic coding theory; Hamming distance; sphere packing and its relation to optimal codes; Hamming, Huffman, and Gray codes; entropy; channel capacity and Shannon's theorem; bandwidth and the sampling theorem.
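
For orientation (this sketch is not part of the assigned materials), two of the quantities named above, the entropy of a discrete source and the capacity of a binary symmetric channel (BSC), follow directly from their definitions; all function and variable names below are illustrative.

```python
# Illustrative sketch (not course material): Shannon entropy of a source
# and the capacity of a binary symmetric channel (BSC).
from math import log2

def entropy(probs):
    """H = -sum p_i * log2(p_i), in bits per symbol (terms with p_i = 0 contribute 0)."""
    return -sum(p * log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a BSC with crossover probability p: C = 1 - H(p)."""
    return 1 - entropy([p, 1 - p])

print(entropy([0.5, 0.25, 0.25]))   # 1.5 bits per symbol
print(bsc_capacity(0.1))            # about 0.531 bits per channel use
```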

 

Course Outcomes:

Students who successfully complete the course should be able to:

1.      Determine whether a given code is uniquely decodable or instantaneous, construct instantaneous codes, including Huffman codes for a source or an extension of a source, and compute the average word length.

2.      Describe the concept of entropy and the meaning of Shannon's First Theorem, compute the entropy of a source, of an extension of a source, and of products of sources, and compute word lengths for Shannon-Fano codes.

3.      Describe the concepts and definitions of an information channel, the system entropies (input entropy, output entropy, equivocation, and joint entropy), mutual information, and channel capacity, and apply them to the binary symmetric channel (BSC) and the binary erasure channel (BEC).

4.      Describe Shannon's fundamental theorem, apply the ideal-observer rule, the maximum-likelihood rule, and nearest-neighbor decoding to the BSC and BEC, and compute the probability of a decoding error (PrE) and the probability of correct decoding (PrC).

5.      Construct the (extended) Hamming codes, compute Hamming's sphere-packing bound and the Gilbert-Varshamov bound, and construct a Hadamard matrix and its corresponding codes.

6.      Compute the generator matrix and parity-check matrix (in systematic form) of a linear code and its minimum distance, and carry out calculations with the Hamming codes, the Golay codes, and the standard array (see the sketch below).
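
As an informal illustration of outcomes 5 and 6 (not part of the course materials), the sketch below builds the binary Hamming (7,4) code from a systematic generator matrix, checks syndromes, and finds the minimum distance by enumeration. The particular parity submatrix A is an assumed, commonly used choice; other choices give equivalent codes.

```python
# Illustrative sketch (not course material): the binary Hamming (7,4) code
# built from a systematic generator matrix G = [I_4 | A], with parity-check
# matrix H = [A^T | I_3]; minimum distance found by brute-force enumeration.
import itertools

# One common (assumed) choice of the 4x3 parity submatrix A.
A = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]

I4 = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
I3 = [[1 if i == j else 0 for j in range(3)] for i in range(3)]
G = [I4[i] + A[i] for i in range(4)]                         # G = [I_4 | A], 4 x 7
H = [[A[j][i] for j in range(4)] + I3[i] for i in range(3)]  # H = [A^T | I_3], 3 x 7

def encode(msg):
    """Codeword = msg * G over GF(2), for a 4-bit message."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def syndrome(word):
    """Syndrome s = H * word^T over GF(2); zero iff word is a codeword."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

c = encode([1, 0, 1, 1])
print(syndrome(c))            # [0, 0, 0]: c is a codeword
r = c[:]
r[5] ^= 1                     # flip one bit
print(syndrome(r))            # [0, 1, 0] = column 5 (0-indexed) of H, locating the error

# Minimum distance of a linear code = minimum weight of a nonzero codeword.
d = min(sum(encode(list(m)))
        for m in itertools.product([0, 1], repeat=4) if any(m))
print(d)                      # 3, so the code corrects any single-bit error
```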

 

Homework Assignments

Homework assignments will be posted and submitted through Canvas.