This book is intended to introduce coding theory and information theory to undergraduate students of mathematics and computer science. It begins with a review of probability theory as applied to finite sample spaces and a general introduction to the nature and types of codes. The two subsequent chapters discuss information theory: the efficiency of codes, the entropy of information sources, and Shannon's Noiseless Coding Theorem. The remaining three chapters...