Single molecules perform a variety of tasks in cells, from replicating, controlling and translating the genetic material to sensing the outside environment. These operations all require that specific actions take place; in a sense, each molecule must make tiny decisions. To make a decision, each "molecular machine" must dissipate an energy P_y in the presence of thermal noise N_y. The number of binary decisions that can be made by a machine which has d_space independently moving parts is the "machine capacity" C_y = d_space log2 [(P_y + N_y)/N_y]. This formula is closely related to Shannon's channel capacity for communications systems, C = W log2 [(P + N)/N]. This paper shows that the minimum amount of energy that a molecular machine must dissipate in order to gain one bit of information is ε_min = k_B T ln(2) joules per bit. This equation is derived in two distinct ways. The first derivation proceeds from the Second Law of Thermodynamics, showing that the existence of a minimum energy dissipation is a restatement of the Second Law. The second derivation proceeds from the machine capacity formula, showing that the machine capacity is likewise related to the Second Law. One of Shannon's theorems for communications channels is that as long as the channel capacity is not exceeded, the error rate may be made as small as desired by a sufficiently involved coding scheme. This result also applies to the dissipation formula for molecular machines, so there is a precise upper bound on the number of choices a molecular machine can make for a given amount of energy loss. This result will be important for the design and construction of molecular computers.
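The two formulas above can be evaluated numerically. The sketch below computes the machine capacity C_y and the minimum dissipation ε_min = k_B T ln(2); the temperature and the values chosen for d_space, P_y and N_y are illustrative assumptions, not figures from the paper.

```python
import math

# Boltzmann constant in J/K (CODATA exact value) and an assumed
# temperature of 300 K, roughly physiological.
K_B = 1.380649e-23
T = 300.0

def machine_capacity(d_space, Py, Ny):
    """Machine capacity in bits per operation:
    C_y = d_space * log2((P_y + N_y) / N_y)."""
    return d_space * math.log2((Py + Ny) / Ny)

def min_dissipation(temperature=T):
    """Minimum energy dissipated per bit of information gained:
    eps_min = k_B * T * ln(2) joules/bit."""
    return K_B * temperature * math.log(2)

# Illustrative example: two independently moving parts and a
# dissipated energy three times the thermal noise energy.
Cy = machine_capacity(d_space=2, Py=3.0, Ny=1.0)  # -> 4.0 bits
eps = min_dissipation()  # about 2.87e-21 J/bit at 300 K
print(Cy, eps)
```

Note that C_y grows only logarithmically with the signal-to-noise ratio (P_y + N_y)/N_y but linearly with the number of independent parts d_space, mirroring the role of bandwidth W in Shannon's channel capacity.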