We want to investigate how many steps a computer can execute in a reasonable amount of time. This will give us a rough, but interesting, upper bound on how much we can expect out of an algorithm.
The basic measure of a computational step for a CPU is derived from the frequency of the crystal which provides the clock for the CPU; e.g., a 100 MHz frequency allows 100 million basic steps per second. A simple CPU instruction can be executed in such a step. Commercial workstations at present operate at around 200 MHz, and supercomputers at less than 400 MHz.
With these numbers we can give an upper bound for the number of basic steps a computer can carry out on a particular project. (The particular numbers are not critical to our discussion; if you want a safer bound, multiply each of them by a thousand.)
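As a rough sanity check of such a bound, one can multiply the clock rate by the seconds in a year (a sketch; the 200 MHz figure is the workstation clock rate quoted above, and one basic step per clock cycle is assumed):

```python
# Rough upper bound on basic steps per year for a 200 MHz CPU,
# assuming one basic step per clock cycle.
steps_per_second = 200 * 10**6          # 200 MHz clock
seconds_per_year = 60 * 60 * 24 * 365   # roughly 3.15 * 10^7 seconds
steps_per_year = steps_per_second * seconds_per_year
print(f"steps per year ~ {steps_per_year:.1e}")  # prints: steps per year ~ 6.3e+15
```

So even a full year of uninterrupted computation yields fewer than 10^16 basic steps on such a machine.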
Now, from the simple fact that

    1 year ~ 3.15 * 10^7 seconds,

we have the following approximate conversion of years to nanoseconds:

    1 year ~ 3.15 * 10^16 ns,

and thus a project requiring 10^25 basic steps is hopeless (as 10^25 ns ~ 3 * 10^8 years). Actually one could probably claim that even 10^20 basic steps is hopeless for today's computers. In terms of powers of 2 we have

    10^25 ~ 2^83,

as 2^10 ~ 10^3. In particular this says that trying to compute a truth table with 85 variables is just impossible: such a table has 2^85 > 2^83 ~ 10^25 rows.
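A quick back-of-the-envelope check of that claim (a sketch, using the deliberately generous rate of one row per nanosecond, i.e. a step rate faster than the clock rates above):

```python
# Years needed just to enumerate the 2^85 rows of an 85-variable
# truth table, at an optimistic one row per nanosecond (10^9 rows/s).
rows = 2 ** 85
rows_per_second = 10 ** 9
seconds_per_year = 60 * 60 * 24 * 365
years = rows / (rows_per_second * seconds_per_year)
print(f"{years:.1e} years")  # on the order of 10^9 years
```

Even before doing any useful work per row, the enumeration alone takes on the order of a billion years.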