An Introduction to Information Theory: Symbols, Signals, and Noise (original 1961; 1980 edition)

by John R. Pierce

Members: 632 · Reviews: 2 · Popularity: 36,894 · Average rating: 3.92 · Discussions: None
Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permitted the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future. Beginning with the origins of this burgeoning field, Dr. Pierce follows the brilliant formulations of Claude Shannon and describes such aspects of the subject as encoding and binary digits, entropy, language and meaning, efficient encoding, and the noisy channel. He then goes beyond the strict confines of the topic to explore the ways in which information theory relates to physics, cybernetics, psychology, and art. Mathematical formulas are introduced at the appropriate points for the benefit of serious students. J. R. Pierce worked for many years at the Bell Telephone Laboratories, where he became Director of Research in Communications Principles. An Introduction to Information Theory continues to be the most impressive nontechnical account available and a fascinating introduction to the subject for lay listeners.
Member: AndrewMcBurney
Title: An Introduction to Information Theory: Symbols, Signals, and Noise
Authors: John R. Pierce
Information: Dover Publications (1980), paperback
Collections: Your library
Rating:
Tags: Mathematics, Information Theory

Work Information

An Introduction to Information Theory: Symbols, Signals, and Noise by John R. Pierce (1961)


Showing 2 of 2
(Original Review, 1980-12-05)

Final answer to the question, "How many joules to send a bit?"

The unit of information is determined by the choice of the arbitrary scale factor K in Shannon's entropy formula:

S = -K SUM_i p_i ln(p_i)

If K is made equal to 1/ln(2), then S is said to be measured in "bits" of information. A common thermodynamic choice for K is kN, where N is the number of molecules in the system considered and k is 1.38e-23 joule per degree Kelvin, Boltzmann's constant. With that choice, the entropy of statistical mechanics is expressed in joules per degree. The simplest thermodynamic system to which we can apply Shannon's equation is a single molecule that has an equal probability of being in either of two states, for example, an elementary magnet. In this case, p=.5 for both states and thus S=+k ln(2). The removal of that much uncertainty corresponds to one bit of information. Therefore, a bit is equal to k ln(2), or approximately 1e-23 joule per degree K. This is an important figure, the smallest thermodynamic entropy change that can be associated with a measurement yielding one bit of information.
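The arithmetic above can be checked directly from Shannon's formula. A minimal sketch in Python (the probabilities and Boltzmann's constant are from the text; the function name is mine):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def shannon_entropy(probs, K=1.0):
    """Shannon's formula: S = -K * sum(p * ln(p)) over the outcome probabilities."""
    return -K * sum(p * math.log(p) for p in probs if p > 0)

# Elementary magnet: two states, each with probability 0.5, and K = k.
S = shannon_entropy([0.5, 0.5], K=k)
print(S)                # roughly 9.57e-24 J/K, i.e. about 1e-23
print(k * math.log(2))  # identical: S reduces to k ln(2) for this system
```

For the two equally likely states, the sum collapses to -k ln(0.5) = k ln(2), matching the figure quoted above.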

The amount of energy needed to transmit a bit of information when limited by thermal noise of temperature T is:

E = kT ln 2 (Joules/bit)

This is derived from Shannon's initial work (1) on the capacity of a communications channel in a lucid fashion by Pierce (2), although it is not obvious that he was the first to derive it. This limit is the same as the amount of energy needed to store or read a bit of information in a computer, which Landauer derived (3) from entropy considerations without the use of Shannon's theorems. Pierce's book is reasonably readable. On page 192 he derives the energy per bit formula (Eq. 10.6), and on page 200 he describes a Maxwell Demon engine generating kT ln 2 of energy from a single molecule and showing that the Demon had to use that amount of energy to "read" the position of the molecule. Then on page 177 Pierce points out that one way of approaching this ideal signalling rate is to concentrate the signal power in a single, short, powerful pulse, and send this pulse in one of many possible time positions, each of which represents a different symbol. This is essentially the concept behind the patent (4) which led me to ask the original question. My thanks to those who helped with their replies.
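Plugging numbers into E = kT ln 2 makes the scale concrete. A short sketch (the 300 K room-temperature value is my illustrative choice, not from the review):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def min_energy_per_bit(T):
    """Thermal-noise limit on signalling energy: E = k * T * ln(2), joules per bit."""
    return k * T * math.log(2)

# At room temperature, T = 300 K:
print(min_energy_per_bit(300.0))  # about 2.87e-21 joules per bit
```

The limit scales linearly with the noise temperature T, which is why cooled receivers can approach lower energies per bit.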

REFERENCES

1. C. E. Shannon, "A Mathematical Theory of Communication", Bell System Tech. J., Vol. 27, No. 3, pp. 379-423 and No. 4, pp. 623-656 (1948); reprinted in C. E. Shannon and W. Weaver, "The Mathematical Theory of Communication", University of Illinois Press, Urbana, Illinois (1949).
2. J. R. Pierce, "Symbols, Signals and Noise", Harper, New York (1961).
3. R. Landauer, "Irreversibility and Heat Generation in the Computing Process", IBM J. Res. & Dev., Vol. 5, p. 183 (1961).
4. R. L. Forward, "High Power Pulse Time Modulation Communication System with Explosive Power Amplifier Means", U.S. Patent 3,390,334 (25 June 1968).

[2018 EDIT: This review was written while I was running my own personal BBS server. Much of the language in this and other reviews written in 1980 reflects a very particular register: what I now call, in retrospect, a “BBS language”.]
  antao | Nov 6, 2018 |
Brilliant and inspiring book. Enjoyed it immensely. Much use of highlighter.
  jaygheiser | Jul 23, 2008 |
Showing 2 of 2
Other authors

John R. Pierce (primary author, all editions)
Dorland, Cees van (Translator; secondary author, some editions)
Newman, James R. (Editor; secondary author, some editions)

Dedication

To Claude and Betty Shannon

First words
In 1948 Claude E. Shannon published a paper called "A Mathematical Theory of Communication";[sic] it appeared in book form in 1949.



Rating

Average: 3.92 (32 ratings)
2 stars: 1 · 3 stars: 9 · 3.5 stars: 3 · 4 stars: 9 · 5 stars: 10
