Dataflow programming research began in the 1970s, as the limits of von Neumann architectures, with their inherently control-driven programming model, became apparent. Dataflow models offer an alternative and have been studied across many areas of computer science: models of computation, programming languages, machine architecture, compilers, and parallelism. Many early dataflow researchers expected dataflow to become a new, general model of computing, able to exploit all the parallelism present in general-purpose programs and to support high-level languages in which programmers need not manage the details of mapping programs and data efficiently onto parallel machines. Most dataflow research and publications address the architectural and language aspects of dataflow models for fine-grained parallel program execution. Dataflow concepts also appear in conventional computing, for example in the pipelining and multiple-instruction-issue techniques used in many RISC processors. Although dataflow programming was once seen as a promising route to a new generation of high-performance computing, it may still be too immature to become mainstream in general parallel computing, and it has gained little acceptance in the high-performance community and industry, despite benefits and unique traits that remain useful in today's parallel software environment. Von Neumann-based processors, on which current parallel technology is built, will likely dominate high-performance computing and its applications for years to come.
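The contrast with the control-driven model can be sketched concretely: in a dataflow graph, an operation fires as soon as all of its operand tokens have arrived, with no central program counter, so independent operations may execute in parallel. The tiny scheduler below is an illustrative sketch of this firing rule, not the design of any particular dataflow machine or language.

```python
# Minimal sketch of dataflow evaluation: a node fires when every operand
# token is present, so the add and sub nodes below are independently fireable.
from collections import deque

class Node:
    def __init__(self, fn, arity):
        self.fn, self.arity = fn, arity
        self.args = {}   # slot index -> received token
        self.outs = []   # (consumer node, consumer slot index)

def evaluate(sinks, initial_tokens):
    """initial_tokens: (node, slot, value) triples injected at source nodes."""
    ready = deque()
    for node, slot, value in initial_tokens:
        node.args[slot] = value
        if len(node.args) == node.arity:
            ready.append(node)               # all operands present: fireable
    results = {}
    while ready:
        node = ready.popleft()               # fire in data-availability order
        value = node.fn(*(node.args[i] for i in range(node.arity)))
        if node in sinks:
            results[node] = value
        for consumer, slot in node.outs:     # forward the result token
            consumer.args[slot] = value
            if len(consumer.args) == consumer.arity:
                ready.append(consumer)
    return results

# Dataflow graph for (a + b) * (a - b):
add = Node(lambda x, y: x + y, 2)
sub = Node(lambda x, y: x - y, 2)
mul = Node(lambda x, y: x * y, 2)
add.outs.append((mul, 0))
sub.outs.append((mul, 1))

tokens = [(add, 0, 5), (add, 1, 3), (sub, 0, 5), (sub, 1, 3)]
print(evaluate({mul}, tokens)[mul])   # (5 + 3) * (5 - 3) = 16
```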
Computation Structures Group (CSG)
MIT research group conducting research on dataflow programming.
Growing article, with links to many related topics. [Wikipedia]
Declarative Ada: parallel dataflow programming in a familiar context
From the Proceedings of the 1995 ACM 23rd Annual Conference on Computer Science.
Pure Data (PD)
Real-time graphical dataflow programming environment for audio, video, and graphical processing.
ResearchIndex: Coarse-Grain Dataflow Programming of Conventional Parallel Computers
Granular Lucid (GLU): a coarse-grain dataflow language for programming conventional parallel computers (circa 1994), based on Lucid, an implicitly parallel, multidimensional dataflow language. A GLU program is a Lucid program with imperatively defined data functions and data types.
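Lucid's central idea, that a variable denotes an infinite stream built with operators such as fby ("followed by") and next, can be approximated with Python generators. The sketch below illustrates the classic Lucid equation fib = 0 fby 1 fby (fib + next fib); it is an illustrative encoding, not actual Lucid or GLU code.

```python
# Approximating Lucid streams with Python generators (illustrative only).
from itertools import islice

def fib():
    # Lucid: fib = 0 fby 1 fby (fib + next fib)
    yield 0
    yield 1
    # `next fib` is the fib stream advanced by one position
    yield from (x + y for x, y in zip(fib(), islice(fib(), 1, None)))

print(list(islice(fib(), 8)))   # [0, 1, 1, 2, 3, 5, 8, 13]
```

This encoding re-creates the stream on each recursive reference, so it is exponentially slow for large prefixes; it is meant only to convey Lucid's declarative, stream-equation style.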
Last update: August 7, 2011 at 8:24:04 UTC