I'm pretty sure the vast majority of you have never heard of Claude Shannon, and, frankly, there's no real reason why the vast majority of you should have.
Shannon was born in Petoskey, Michigan, on April 30, 1916, to Claude Shannon, Sr. and his wife, Mabel Wolf Shannon, and attended Gaylord High School in Gaylord, Michigan, where his mother was the principal. As a child, he gravitated towards mathematics and science, and would spend hours building radio-controlled model boats, and even constructed a wireless telegraph system that enabled him to talk to a friend who lived half a mile away.
As one would perhaps expect, Shannon's childhood idol was Thomas Edison, but, unlike most childhood idols, in an ironic twist, he was to find out later in life that Edison was, in fact, a distant cousin of his. This fills me with hope because George Clooney DEFINITELY has similar ears to me... just sayin'.
Anyway, after graduating from Gaylord High School, Shannon entered the University of Michigan and earned two simultaneous bachelor's degrees in—you guessed it—mathematics and electrical engineering. It was these studies that introduced him to the work of George Boole, the inventor of Boolean logic, which underpins the digital circuitry of modern-day computers. Boole's work would fascinate and inspire Shannon, putting him on a course to become one of the most important people you've never heard of.
After graduation, Shannon entered the famed Massachusetts Institute of Technology (MIT), and it was here that his work led to him publishing what has been called "the most important master's thesis of all time." High praise indeed.
The thesis in question was Shannon's "A Symbolic Analysis of Relay and Switching Circuits," and whilst I have been meaning to get around to reading it for, oh, about thirty years now, I am prepared to rely on the opinion of Professor Howard Gardner of Harvard University, who believed that Shannon's work was "possibly the most important, and also the most famous, master's thesis of the century."
Personally, I was delighted when the July 20, 2011 edition of Things That Make You Go Hmmm... was dubbed "One of Several Things I Read This Week" by none other than my father, but I realize that Shannon and I operate at differing intellectual altitudes.
During WWII, Shannon, then working for Bell Labs, became one of America's finest cryptographers, and his code-breaking skills became the stuff of legend. In his spare time, he indulged his love of, amongst other eclectic hobbies, juggling, unicycling, and chess, and invented scores of devices, including rocket-powered flying discs, a motorized pogo stick, and something Shannon deliciously called "the Ultimate Machine":
(Wikipedia): Otherwise featureless, the box possessed a single switch on its side. When the switch was flipped, the lid of the box opened and a mechanical hand reached out, flipped off the switch, then retracted back inside the box.
Shannon's wicked brilliance led him to develop "Information Theory" (earning him the title "Father of Information Theory"), which, in partnership with the renowned physicist John L. Kelly, Jr., he advanced into a series of methods that have broad application in both the gambling and investment worlds to this day. Those methods have been followed by such luminaries as Warren Buffett, Bill Gross, and Jim Simons, as well as by the MIT Blackjack Team, which famously brought down the house in Las Vegas (as chronicled in Ben Mezrich's book of the same name).
(Wikipedia): Shannon and his wife Betty also used to go on weekends to Las Vegas with MIT mathematician Ed Thorp, and made very successful forays in blackjack using game-theory-type methods co-developed with fellow Bell Labs associate, physicist John L. Kelly Jr. based on principles of information theory. They made a fortune, as detailed in the book Fortune's Formula by William Poundstone and corroborated by the writings of Elwyn Berlekamp, Kelly's research assistant in 1960 and 1962. Shannon and Thorp also applied the same theory, later known as the Kelly criterion, to the stock market with even better results. Over the decades, Kelly's scientific formula has become a part of mainstream investment theory and the most prominent users, well-known and successful billionaire investors Warren Buffett, Bill Gross, and Jim Simons use Kelly's methods.
The theory was also exploited by the famous MIT Blackjack Team, which was a group of students and ex-students from the Massachusetts Institute of Technology, Harvard Business School, Harvard University, and other leading colleges who used card-counting techniques and other sophisticated strategies to beat casinos at blackjack worldwide. The team and its successors operated successfully from 1979 through the beginning of the 21st century. Many other blackjack teams have been formed around the world with the goal of beating the casinos.
The card-counting techniques that grew out of Shannon and Thorp's work were explained in Bringing Down the House, Ben Mezrich's best-selling 2003 book about the MIT Blackjack Team.
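For the curious, the Kelly criterion itself is simple enough to sketch in a few lines of Python. This is a rough illustration only (not investment advice); the function name and the 55% weighted-coin example are mine, not Kelly's:

```python
def kelly_fraction(p, b):
    """Optimal fraction of bankroll to wager on a bet with win
    probability p that pays b-to-1: f* = (b*p - q) / b, where q = 1 - p."""
    q = 1.0 - p
    f = (b * p - q) / b
    return max(f, 0.0)  # a negative edge means: don't bet at all

# A coin weighted 55% in your favor, paying even money (b = 1):
# f* = (0.55 - 0.45) / 1 = 0.10, i.e., wager 10% of your bankroll.
print(round(kelly_fraction(0.55, 1.0), 2))  # 0.1
```

Betting more than the Kelly fraction increases the risk of ruin; betting less sacrifices growth—which is precisely the trade-off Thorp and Shannon were exploiting in Las Vegas.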
Part of information theory is something called the Shannon-Hartley Theorem, which Shannon built on earlier work by Ralph Hartley, a colleague at Bell Labs. The theorem gives the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise.
The key quantity in its formula is the ratio of signal power to noise power, known popularly as the Signal-to-Noise Ratio.
In layman's terms, it's a way of determining how much of what you are hearing is actually important and how much is just interference that could prevent you from understanding the true message.
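For readers who like to see the machinery, the theorem says C = B · log2(1 + S/N): capacity in bits per second from bandwidth and the signal-to-noise ratio. Here is a minimal Python sketch; the function name is mine, and the telephone-line figures are the standard textbook illustration, not anything from Shannon's paper:

```python
import math

def channel_capacity(bandwidth_hz, snr):
    """Shannon-Hartley: C = B * log2(1 + S/N), in bits per second.
    bandwidth_hz is the channel bandwidth B; snr is the *linear*
    signal-to-noise power ratio S/N (not decibels)."""
    return bandwidth_hz * math.log2(1.0 + snr)

# A plain telephone line: roughly 3 kHz of bandwidth and ~30 dB SNR.
snr_linear = 10 ** (30 / 10)  # 30 dB converts to a power ratio of 1000:1
print(round(channel_capacity(3000, snr_linear)))  # about 29,902 bits/second
```

Note what the formula implies: more noise (a smaller S/N) means less information gets through per second, no matter how loudly the signal is broadcast—which is exactly the analogy about to be drawn below.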
Take, for example, the economic "recovery" in the United States.
Every "signal" that gets released by the BLS (to pick one government bureau at random) is accompanied by a wealth of "noise" from both sides of the data. It begins before the figures are released with a thousand predictions of what the number will be. Those arbitrary estimates collectively become the benchmark for expectations, and then, once the world gets a look at the official statistic, the post-mortem begins and the noise level really ratchets up.
Case in point: the non-farm payrolls number.
This is one of the most-watched of all economic statistics and can be a genuine "market-mover." The most recent number, released on Friday—four days before what will no doubt be an incredibly closely fought US election—provides the perfect example of just how much "noise" can surround a given "signal," and, just to add a little more static crackle to the mix, we had to deal with the massive variable that was Hurricane Sandy.
A search on Google News for "BLS Jobs" demonstrates just how much this noise has ratcheted up since 2008 and how increasingly important it has become as we near the 2012 presidential election in the US:
It also demonstrates just how regional the world's interest really is in a number that "everybody" supposedly cares about so much.
The simple fact is, those of us involved in the financial sector have our own "Signal-to-Noise" issues to deal with: we focus obsessively on every number, every release, every tiny hint of what may make a difference, and we assume everybody else does the same thing.