coding of data

Storing or transmitting information can use coded symbols {coding, computer} {encoding, computer} {decoding}.

types

Codes can use binary numbers, decimal numbers, words, images, and switches. Code size depends on the number of possible symbols and their probabilities.
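As an illustrative sketch (Python; the helper name positions_needed is assumed for illustration, not from the source), the number of binary positions needed to code N equally probable symbols is the base-2 logarithm of N, rounded up:

import math

# Number of binary positions needed to give each of N equally probable
# symbols its own codeword: ceil(log2(N)).
def positions_needed(symbol_count: int) -> int:
    return math.ceil(math.log2(symbol_count))

# 2 symbols need 1 position, 10 decimal digits need 4, 26 letters need 5.
for n in (2, 10, 26):
    print(n, "symbols ->", positions_needed(n), "binary positions")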

probability

Symbols can have equal or unequal probability. Binary codes use the symbols 0 and 1, with equal probability. For one position, each symbol has probability P = 1/2^1 = 1/2, and its information content is -log2(P) = 1 bit. Information {average unexpectedness} is greatest when all symbols are equally probable.
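As a minimal sketch (Python; the entropy helper is an assumption for illustration, not from the source), the average information per symbol is the sum of -P * log2(P) over the symbols, and it is largest when the probabilities are equal:

import math

# Average information (entropy) in bits per symbol: sum of -P * log2(P).
def entropy(probabilities):
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # equal probabilities: 1.0 bit, the maximum for two symbols
print(entropy([0.9, 0.1]))  # unequal probabilities: about 0.47 bits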

information

Information is measured in binary units (bits). For a binary code with two equal-probability symbols, one position carries 1 bit of information, since 2^1 = 2 possible values. Three positions carry 3 bits, since 2^3 = 8 possible series, such as 010.
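As an illustrative sketch (Python, assumed for illustration only), enumerating every three-position binary series confirms the count of 2^3 = 8:

from itertools import product

# All series of three binary positions: 2^3 = 8 of them, including "010".
series = ["".join(bits) for bits in product("01", repeat=3)]
print(len(series))  # 8
print(series)       # ['000', '001', '010', '011', '100', '101', '110', '111']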
