Jane C. Blake
The integration of distinct parts to form a useful and effective
whole is the underlying theme for two sets of topics in this
issue. The opening papers describe the integration of programming
tools to create a graphical software development environment. The
second set of papers addresses the integration of large, complex
systems --- systems that encompass all the software and hardware
components needed to serve the user's purpose.
The DEC FUSE software development product is designed to take
advantage of UNIX workstations' graphical capabilities,
supporting such programming languages as C, C++, and Fortran.
Rich Hart and Glenn Lupton review the origins of DEC FUSE in the
FIELD environment developed at Brown University and compare FUSE
with similar environments based on a tool integration model. The
authors present two key aspects of the product design: graphical
user interfaces built on top of UNIX commands and a multicast
messaging mechanism that allows the tools to work together.
A tool recently integrated into the DEC FUSE suite is the Data
Visualizer, which allows software developers to display thousands
of lines of code with associated statistics. Don Zaremba
describes the process of taking the tool from advanced
development through implementation, and relates what the
engineers learned as they adapted current visualization research
to their goals and built prototypes of the technology. He
concludes with a description of the resulting product and plans
for future work.
Our next three papers explore experiences with different aspects
of systems-level engineering and integration. Eric Newcomer's
overview of the Multivendor Integration Architecture (MIA)
effort, initiated by Nippon Telegraph and Telephone (NTT),
highlights many of the factors that make systems integration
challenging. Through standardization, NTT sought to resolve the
costly problem of incompatible application environments. Eric
discusses the MIA's chosen direction based on the need for
portability, interoperability, and a common user interface. He
then describes Digital's contribution in the area of distributed
transaction processing and summarizes the MIA consortium's
successes and continuing work.
A specific object-oriented product developed to integrate systems
applications is the subject of Jim Kirkley and Wick Nichols'
paper. Built on Jacobson's and Rumbaugh's methodologies,
third-party software, and Digital's CORBA-compliant ObjectBroker,
the Framework-based Environment (FBE) product addresses the need
for new and legacy applications to interoperate in a distributed
manufacturing system. The authors step through a typical
integration project and discuss the trade-offs that must be
addressed when the project takes an object view of the system
environment.
In the concluding paper, Owen Tallman describes a major systems
engineering project to manage the ongoing introduction of
software into a large computer network. The project,
commissioned by a large French bank, extended over a network of
clustered data center servers, branch servers, and thousands of
workstations and personal computers. Owen outlines
the customer's requirements and Digital's role as developer of
the automated software deployment facility. He reviews the
configuration management model (CMM) and other models that were
the basis for the project team's work. His discussion of the
implementation encompasses examples that illustrate the
intricacies of a rigorously managed software deployment process.
The editors thank Mikael Rolfhamre of Digital's UNIX Business
Segment, Ed Balkovich of Digital's Corporate Research Group, and
Hank Jakiela of the Systems Business Unit for their help in
developing this issue. At the end of the issue, we also
acknowledge and thank the referees for their valuable reviews
of manuscripts submitted during the past year.
Upcoming topics in the Journal are Digital's high-performance
Fortran compiler and parallel software environment, and the
Sequoia 2000 global change research project.