Storing or transmitting information can use coded symbols {coding, computer} {encoding, computer} {decoding}.
types
Codes can use binary numbers, decimal numbers, words, images, and switches. Code size depends on the number of possible symbols and their probabilities.
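As an illustration, a minimal Python sketch (the function name min_binary_positions and the example symbol counts are assumptions for illustration, not part of the outline) of how code size grows with the number of possible symbols:

```python
import math

def min_binary_positions(symbol_count: int) -> int:
    """Minimum number of binary positions (bits) needed to give every
    symbol in an equal-probability set its own code word."""
    return math.ceil(math.log2(symbol_count))

# Two symbols need 1 position, ten decimal digits need 4,
# twenty-six letters need 5.
for count in (2, 10, 26):
    print(count, "symbols ->", min_binary_positions(count), "positions")
```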
probability
Symbols can have equal or unequal probability. Binary codes use the symbols 0 and 1, typically with equal probability, so for one position each symbol has probability P = 2^-1 = 1/2, and its information is -log2(P) = 1 bit. Information {average unexpectedness} is greatest if all symbols are equally probable.
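A short Python sketch of these quantities (the probability lists below are illustrative assumptions): the information of one symbol is -log2(P), and the average unexpectedness over a symbol set is greatest when the probabilities are equal.

```python
import math

def information(p: float) -> float:
    """Unexpectedness of one symbol with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def average_unexpectedness(probabilities) -> float:
    """Average information per position over a set of symbol probabilities."""
    return sum(p * information(p) for p in probabilities)

print(information(0.5))                    # 1.0 bit for one binary position
print(average_unexpectedness([0.5, 0.5]))  # 1.0 bit: equal probabilities, the maximum
print(average_unexpectedness([0.9, 0.1]))  # about 0.47 bits: unequal probabilities carry less
```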
information
Information measurement uses binary units. For binary code, with two equal-probability symbols, information in one position is 1 bit, because 2^1 = 2 possible symbols. Information in three positions is 3 bits, because 2^3 = 8 possible series, such as 010.
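A small Python sketch enumerating those series (standard library only; the variable names are illustrative):

```python
from itertools import product

# All series of three binary positions: 2**3 = 8 possibilities.
series = ["".join(bits) for bits in product("01", repeat=3)]
print(len(series))  # 8
print(series)       # ['000', '001', '010', '011', '100', '101', '110', '111']
```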