Path: utzoo!yunexus!geac!daveb
From: daveb@geac.UUCP (David Collier-Brown)
Newsgroups: comp.software-eng
Subject: Re: Writing code/Basics of Program Design)
Summary: Two other paradigms.
Message-ID: <2992@geac.UUCP>
Date: 12 Jul 88 13:33:15 GMT
Article-I.D.: geac.2992
References: <900@td2cad.intel.com> <580001@hpiacla.HP.COM>
Reply-To: daveb@geac.UUCP (David Collier-Brown)
Organization: GEAC Computers, Toronto, CANADA
Lines: 74

In article <580001@hpiacla.HP.COM> scottg@hpiacla.HP.COM (Scott Gulland) writes:
>Most of the projects I have worked on have had between five and forty
>engineers working on them and take one to two years to complete.  The
>first step in all of these has been to develop the functional requirements
>and produce a pseudo user manual.  This precisely defines the desired
>results of the product and helps to drive the later test phases.
>At the completion of this step, the team begins to consider the design
>of the product.
[followed by a good discussion of the waterfall model]

  There are two other basic approaches to a largish project, one of
which is best described as "predictor-corrector", and another best
described as a "modelling" approach.

  I'll leave discussion of predictor-corrector for later (I love it
so much I'm tempted to bias my comments), and talk a little bit
about a modelling approach.

  On a large project I was involved in recently, the design phase
consisted of building a very complete model of the processing that
would be done, and then transforming that model into self-consistent
objects which would provide the behaviors that the model (and the
real-world system) required.
  For lack of another technique, we modelled the system as a whole in
terms of data flows, thus producing an input/output processing
(procedural) model of the system, which we then broke up by
locality-of-reference into a suite of readily identifiable
"objects". We then defined their behaviors to provide all the methods
that would be needed to provide
	1) all the required services to the user
	2) all the required facilities described by the model.
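
  To make that transformation step concrete, here is a minimal
sketch in C.  The domain (a loan-checkout flow) and every name in it
are invented for illustration; the point is only the shape of the
transformation: the data that one "bubble" of the dataflow carried
becomes a struct, and the behaviors the model required become the
functions that operate on that struct.

/*
 * Illustrative sketch: one dataflow "bubble" recast as an object.
 * All names are invented for the example.
 */
#include <stdio.h>
#include <string.h>

struct loan {                      /* state the checkout flow carried */
	char item_id[16];
	char patron_id[16];
	int  days_due;
};

/* Behavior 1: a required service to the user. */
int loan_checkout(struct loan *l, const char *item, const char *patron)
{
	strncpy(l->item_id, item, sizeof l->item_id - 1);
	l->item_id[sizeof l->item_id - 1] = '\0';
	strncpy(l->patron_id, patron, sizeof l->patron_id - 1);
	l->patron_id[sizeof l->patron_id - 1] = '\0';
	l->days_due = 14;          /* policy drawn from the model */
	return 0;
}

/* Behavior 2: a facility the model described (an output flow). */
void loan_report(const struct loan *l)
{
	printf("item %s -> patron %s, due in %d days\n",
	       l->item_id, l->patron_id, l->days_due);
}

int main(void)
{
	struct loan l;

	loan_checkout(&l, "BOOK-0042", "PATRON-7");
	loan_report(&l);
	return 0;
}

The struct plus its functions is the "object"; the language enforces
nothing, so the discipline has to come from the design.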

  I can recommend this method for projects of moderate size (20-odd
person-years), subject to the following caveats:

	1) The requirements are complete, and can be shown to be
	achievable by prototyping or some more formal technique.

	2) The system is not "complex".  By this I mean that it does
	one well-defined thing, not a cross-product of several
	things. It can be as large ("compound") as you like, so long
	as it has a natural structure to guide the model-builders
	and is not just a collection of random kludges.

	3) The model is verifiable and validatable.  Dataflow
	diagrams are an elderly idea, but still valuable and are
	well-supported by tools and manual techniques. We therefore
	could validate the model against both requirements and the
	actual experience of experts in the field, and verify that
	it was self-consistent.  

	4) The transformation into a design is well understood by at
	least one person on the team.  We were fortunate in having a
	manager who had a good grasp of what an "object" was, and
	staff members who could visualize a (storage) object in
	terms that mapped easily onto a database's functions (see
	the sketch below).
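
  The sort of mapping I mean in caveat 4 looks roughly like the
sketch below.  Again, everything here is invented for illustration:
db_put() and db_get() stand in for whatever record-level calls a
real database supplies, not any particular product's API.

/*
 * Illustrative sketch: a "storage object" whose methods are thin
 * wrappers over a database's functions.
 */
#include <stdio.h>
#include <string.h>

static char db_row[64];            /* one-row "database", for the demo */

static int db_put(const char *key, const char *val)
{
	(void) key;                /* a real call would use the key */
	strncpy(db_row, val, sizeof db_row - 1);
	return 0;
}

static int db_get(const char *key, char *val, int len)
{
	(void) key;
	strncpy(val, db_row, len - 1);
	val[len - 1] = '\0';
	return 0;
}

/* The storage object: its state, and methods phrased as db calls. */
struct record {
	char key[16];
};

static int record_store(struct record *r, const char *value)
{
	return db_put(r->key, value);      /* method -> db function */
}

static int record_fetch(struct record *r, char *out, int len)
{
	return db_get(r->key, out, len);   /* method -> db function */
}

int main(void)
{
	struct record r;
	char buf[64];

	strcpy(r.key, "ITEM-1");
	record_store(&r, "on shelf");
	record_fetch(&r, buf, sizeof buf);
	printf("%s: %s\n", r.key, buf);
	return 0;
}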

  Use of a model, even a dataflow model, directly as a structure for
a program carries with it some nasty risks, the most notable of
which is tunnel vision... Transforming a conceptual model into an
executable model (or design) gives one the chance to look at it from
a different direction, with different assumptions.

  --dave c-b

ps: there are several other "modelling" approaches, including
    "Michael Jackson" Design and some varieties of object-oriented
    programming.
-- 
 David Collier-Brown.  {mnetor yunexus utgpu}!geac!daveb
 Geac Computers Ltd.,  | "His Majesty made you a major 
 350 Steelcase Road,   |  because he believed you would 
 Markham, Ontario.     |  know when not to obey his orders"