The success of theoretical computer science is measured both by its impact on practice and by its contribution to the understanding of computation and the limits thereof.
Theoretical computer science has made contributions to many areas. In the 1930's Church, Gödel, Kleene, Turing and others developed fundamental models of computation that continue to be relevant. In the 1940's Shannon, von Neumann and Wiener advanced computer science through the study of circuits and other models of computation. The 1950's and 60's saw the development of formal languages as well as models of computation such as the finite-state machine and the pushdown automaton. This work continues to be used today in programming and in the design and translation of languages.
Algorithms and data structures is an area that has seen an enormous amount of effort by theoreticians, and its impact on practice has been immense. Activity began in this area as early as the 1950's and has grown in importance to the present time. Among its notable achievements are fast and efficient algorithms for sorting, searching and data retrieval, computations on strings and graphs, computational geometry, scientific computation, and constraint and optimization problems.
During the 1960's the foundations of computational complexity were developed. Hierarchies of problems were identified and speed-up and gap theorems established. This area flourished in the 1970's and beyond, in particular, through the identification of the NP-complete, P-complete and PSPACE-complete languages. In the 1970's connections were established between Turing and circuit complexity, thereby spawning a new examination of this topic. Space-time tradeoff studies were also initiated. Research in all of these areas continues today.
The late 1970's saw the emergence of the VLSI model as well as methods of analysis for it. This led to a burst of activity on the part of theoreticians in the 1980's in this area and to the rebirth of computer architecture under the label of VLSI.
The early 1980's saw the introduction of the first models for memory hierarchies and the development of I/O complexity as an important area of research. Several different models capturing various aspects of memory hierarchies emerged, and research in this area is now leading to a redesign of I/O systems.
Public key cryptography emerged in the mid to late 1970's and has spawned a flood of interesting ideas having to do with secure communication and interactive and zero-knowledge proof systems. Today systems based on some of these ideas are used to provide security for purchases over the Internet.
Beginning in the late 1970's, theoretical research on parallelism has resulted in new parallel computer architectures and new paradigms for programming parallel machines that have been offered in products. We have learned that even the "simple" PRAM parallel machine model is very difficult to program efficiently. New algorithms have had to be invented for old problems because the old algorithms could not exploit parallelism. Through theoretical studies we have come to better understand why some problems are much more difficult to program in parallel than others. Since computers will become increasingly parallel as we approach the limits of performance of serial VLSI-based computers, the impact of research in this area will continue to grow.
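To make the point about reinventing old algorithms concrete, here is a minimal sketch (an illustration added here, not from the original text) of the classic Hillis-Steele parallel prefix sum. The sequential scan is a chain of n dependent additions, but in this formulation each of the O(log n) rounds consists of updates that are mutually independent, so on a PRAM every round could execute as a single parallel step:

```python
def parallel_prefix_sum(xs):
    """Hillis-Steele inclusive scan, written sequentially but organized
    in O(log n) rounds.  Within a round, every element's update reads
    only the previous round's values, so on a PRAM all updates in a
    round could run simultaneously."""
    a = list(xs)
    n = len(a)
    d = 1
    while d < n:
        # One "parallel" round: all positions update independently.
        a = [a[i] + (a[i - d] if i >= d else 0) for i in range(n)]
        d *= 2
    return a
```

For example, `parallel_prefix_sum([1, 2, 3, 4])` yields `[1, 3, 6, 10]` after two rounds instead of three dependent additions.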
The formal study of the semantics of programming languages, an area that became active in the 1960's, has led to a much better understanding of programming constructs, thereby guiding the design of successors to pioneering languages, such as LISP, Prolog, and Simula, that were invented to simplify the design and maintenance of complex applications. The study of programming language semantics also provides a fundamental tool for the development of static analysis and program transformations, two key steps in compiler optimization.
The formal study of program verification techniques also began in the late 1960's. It has had an important impact on programming language design. Recent results in model checking show high promise for significant industrial impact.
Relational database theory emerged in the early 1970's and, together with many advances in data structures, optimization methods and other contributions, has had a major impact on practice. Today relational databases are commonplace commercial products that are widely used throughout the world.
The often subtle behavior of concurrent and distributed systems, which became an active area of research in the late 1960's and early 1970's, has benefited immensely from formal modeling and analysis. Models of concurrent systems have been introduced and tools for understanding concurrency applied. Transaction processing, which involves issues of serializability, locking, logging, timestamping, replication and orphan handling, has also been subjected to modeling and analysis, thereby ensuring correct handling of these issues. The same is true of I/O, which depends on the correct and efficient implementation of the data structures used in this context. All of this work has a direct impact on practice.
Learning theory, which saw a burst of activity on the part of theoretical computer scientists in the 1980's, has led to new and deeper understanding of the problems for which active and passive learning is possible, thereby serving as a guide for the intelligent use of computers.
Computational biology and the human genome project have benefited from the algorithm design and complexity analysis that theoretical computer scientists have brought to them. This work will continue to be important in the future.
Approximation algorithms for hard computational problems have long been sought. The development of a theory of probabilistically checkable proofs provides a major improvement in our understanding of the degree to which NP-complete problems are approximable. This knowledge will serve as an important guide in the search for good approximation algorithms.

Source: http://www.cs.brown.edu/~jes/future.theory/node4.html#SECTION00040000000000000000
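As an illustration of what an approximation algorithm looks like (an example added here, not part of the original text), the classic maximal-matching heuristic for vertex cover runs in linear time and provably returns a cover at most twice the optimal size, even though finding an optimal cover is NP-complete:

```python
def vertex_cover_2approx(edges):
    """Greedy maximal-matching 2-approximation for vertex cover:
    repeatedly pick an edge with both endpoints uncovered and add
    both endpoints.  The chosen edges form a matching, and any
    optimal cover must contain at least one endpoint of each of
    them, so the result is at most twice optimal."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover
```

On the path 1-2-3-4 (edges (1,2), (2,3), (3,4)) it returns a cover of size 4, within the factor-of-two bound of the optimal cover {2, 3}.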