CIS 375 SOFTWARE ENGINEERING

UNIVERSITY OF MICHIGAN-DEARBORN

DR. BRUCE MAXIM, INSTRUCTOR

Date: 10/13/97

Week 6

  1. Structured analysis and design technique (SADT)

Structured analysis activity diagram:

(Activity Network: an activity box with Inputs entering on the left, Output leaving on the right, Control arrows entering from above, and the Mechanism supporting the activity from below.)

Structured analysis data diagram:

(Data Network: a data box with the Generating Activity on the left, the Resulting Activity on the right, Control entering from above, and Storage Devices supporting it from below.)

Design techniques:

Explain how to interpret the results of SA (control issues).

  1. Structured system analysis:
     (Database way of attacking things)
  2. Requirements dictionary:
     (a.k.a. data dictionary)
     1) Name.
     2) Aliases.
     3) Where something's used & how it is used (producer & consumer).
     4) Content description (motivation for representation).
     5) Supplementary information - restrictions, limitations.
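As an illustration, a dictionary entry with the five parts above can be recorded as a plain structure. The entry contents and the `lookup` helper here are hypothetical, not from the lecture:

```python
# Sketch of one requirements (data) dictionary entry.
# Field names mirror the five parts listed above; the sample
# values are made up for illustration.
entry = {
    "name": "customer_id",
    "aliases": ["cust_id", "client_number"],
    "where_how_used": {
        "producer": "registration form",
        "consumer": "billing report",
    },
    "content_description": "9-digit account number assigned at sign-up",
    "supplementary": "read-only after creation; must be unique",
}

def lookup(dictionary, name):
    """Find an entry by its primary name or by any alias."""
    for e in dictionary:
        if e["name"] == name or name in e["aliases"]:
            return e
    return None

print(lookup([entry], "cust_id")["name"])  # customer_id
```

Recording aliases in the entry is what lets a producer and a consumer that use different names be traced to the same item.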

OBJECT ORIENTED ANALYSIS

OBJECT ORIENTED DESIGN

  • Object.
  • (Class = type vs. Instance = var)
  • Encapsulation
  • (Data & methods)
  • Message passing.
  • (Method call & return)
  • Inheritance.
  • (Polymorphism & reuse)
  • Dynamic bindings.
  • (Types & methods)
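A minimal sketch tying these terms together; the Shape/Circle/Square classes are illustrative, not from the lecture:

```python
import math

class Shape:                        # class = type
    def area(self):                 # invoked by "message passing"
        raise NotImplementedError

class Circle(Shape):                # inheritance: Circle reuses Shape
    def __init__(self, radius):
        self._radius = radius       # encapsulation: data kept with methods

    def area(self):
        return math.pi * self._radius ** 2

class Square(Shape):
    def __init__(self, side):
        self._side = side

    def area(self):
        return self._side ** 2

# Dynamic binding / polymorphism: the same message ("area") is
# dispatched to the right method based on each instance's class.
shapes = [Circle(1.0), Square(2.0)]   # instance = var
for s in shapes:
    print(s.area())
```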
  • IDENTIFY THE PROBLEM OBJECTS

  • (classes) = nouns (not procedure names)
  • Example:
    1) External entities (people & devices).
    2) Things in problem domain (reports, displays, signals).
    3) Occurrences or events (completion of some task).
    4) Roles (manager, engineer, sales person).
    5) Organizational units (division, groups, department).
    6) Structures (sensors, vehicles, computers).

    CRITERIA:

  • (Object or not)
    1) Does object information need to be retained?
    2) Does the object have a set of needed services?
       (Can change its attributes)
    3) Does the object have major attributes?
       (Trivial objects should not be built)
    4) Identify common attributes for all object instances.
    5) Identify common operations for all object instances.
       (If nothing to share, why make it an object?)
  • External entities which produce or consume information must have defined classes.

    SPECIFYING ATTRIBUTES:

  • Similar to building data dictionary.
  • (define in terms of atomic objects)
  • SPECIFYING OPERATIONS:

    1. Include anything needed to manipulate data elements.
    2. Communication among objects.

    OBJECT SPECIFICATION:

    1. Object name.
    2. Attribute description:
      1. Attribute name.
      2. Attribute content.
      3. Attribute data type/structure.
    3. External input to object.
    4. External output from object.
    5. Operation description:
      1. Operation name.
      2. Operation interface description.
      3. Operation processing description.
      4. Performance issues.
      5. Restriction and limitations.
    6. Instance connections.
       (0:1, 1:1, 0:many, 1:many)
    7. Message connections.
  • OBJECT ORIENTED ANALYSIS STEPS:

    1. Class modeling.
       (Build an object model similar to an ER diagram)
    2. Dynamic modeling.
       (Build a finite state machine type model)
    3. Functional modeling.
       (Similar to data flow diagram)
  • CASE STUDY:

    ELEVATOR MODEL.

  • n - elevators.
  • m - floors in building.
  • each floor has two buttons (except ground & top).
  • CLASS MODELING
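A minimal sketch of a class model for this problem; the class and attribute names are assumptions derived from the problem statement, not from the lecture:

```python
# Hypothetical class model for the elevator case study:
# a controller owns n Elevators serving m Floors.
class Button:
    def __init__(self, direction):
        self.direction = direction   # "up" or "down"
        self.lit = False

class Floor:
    def __init__(self, number, is_ground=False, is_top=False):
        self.number = number
        # Ground and top floors get one button; all others get two.
        self.buttons = []
        if not is_top:
            self.buttons.append(Button("up"))
        if not is_ground:
            self.buttons.append(Button("down"))

class Elevator:
    def __init__(self, ident):
        self.ident = ident
        self.current_floor = 1
        self.doors_open = False

class ElevatorController:
    def __init__(self, n_elevators, m_floors):
        self.elevators = [Elevator(i) for i in range(n_elevators)]
        self.floors = [Floor(f, is_ground=(f == 1), is_top=(f == m_floors))
                       for f in range(1, m_floors + 1)]

ctrl = ElevatorController(n_elevators=2, m_floors=5)
print(len(ctrl.floors[0].buttons))   # 1 (ground floor: up only)
print(len(ctrl.floors[2].buttons))   # 2 (middle floor: up & down)
```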

    DYNAMIC MODELING:

  • "Normal" schemas (and 1 or 2 abnormal).
  • -> production rules (describe state transitions).
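One way to render the production rules is a (state, event) -> next-state table. The states and events below are assumptions, covering one normal schema and one abnormal one:

```python
# Sketch: production rules for one elevator's dynamic model,
# written as (state, event) -> next state. The state and event
# names are hypothetical.
TRANSITIONS = {
    ("idle", "request_received"): "moving",
    ("moving", "arrived_at_floor"): "doors_open",
    ("doors_open", "timeout"): "idle",
    # abnormal schema: an obstruction keeps the doors open
    ("doors_open", "obstruction"): "doors_open",
}

def step(state, event):
    """Apply the matching production rule, or stay put if none fires."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["request_received", "arrived_at_floor", "timeout"]:
    state = step(state, event)
print(state)  # idle
```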
  • FUNCTIONAL MODELING:

  • (Identify source & destination node)
  • OBJECT-ORIENTED LIFE CYCLE MODEL:

    FOUNTAIN MODEL:

  • Bottom up design.
Date: 10/15/97

    Week 6

    CLASS RESPONSIBILITY COLLABORATOR MODEL (CRC):

  • Responsibilities:
    1. Distribute system intelligence.
    2. State responsibilities in general terms.
    3. Keep information and related behavior in the same class.
    4. Information attributes should be localized.
    5. Share responsibilities among classes when appropriate.
  • Collaborators: build a CRC card.
    (Build a paper model, see if it works on paper)
  • On card:
    • Class name.
    • Class type.
    • Class characteristics.
    • Responsibility/collaborators.
  • (System is basically acted out)
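As a sketch, a CRC card can be captured as a record and "acted out" by walking its responsibilities. The Elevator content is hypothetical, continuing the case study:

```python
from dataclasses import dataclass, field

@dataclass
class CRCCard:
    class_name: str
    class_type: str
    characteristics: list = field(default_factory=list)
    # list of (responsibility, collaborators) pairs
    responsibilities: list = field(default_factory=list)

card = CRCCard(
    class_name="Elevator",
    class_type="device",
    characteristics=["tangible", "concurrent"],
    responsibilities=[
        ("move to a requested floor", ["ElevatorController", "Floor"]),
        ("open and close doors", ["DoorSensor"]),
    ],
)

# Acting out the paper model: walk each responsibility and
# note which collaborators it needs.
for duty, collaborators in card.responsibilities:
    print(f"{card.class_name} must {duty}, with help from {collaborators}")
```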
  • MANAGEMENT & OBJECT ORIENTED PROJECTS:

    1. Establish a common process framework (CPF).
    2. Use CPF & historic data to estimate time & effort.
    3. Specify products & milestones.
    4. Define Q.A. checkpoints.
    5. Manage changes.
    6. Monitor.
  • Project Metrics:
    1) Number of scenario scripts.
    2) Number of key classes.
    3) Number of support classes.
    4) (# of key classes)/(# of support classes).

    OBJECT ORIENTED ESTIMATING & SCHEDULING:

    1. Develop estimates using effort decomposition, FP, etc.
    2. Use O.O.A. to develop scenario script and count them.
    3. Use O.O.A. to get the number of key classes.
    4. Categorize types of interfaces:
       No U.I. = 2.0         G.U.I. = 2.5
       Text U.I. = 2.25      Complex G.U.I. = 3.0
    5. Use the interface multiplier to derive support classes:
       (# of key classes) * (interface multiplier).
    6. Total classes = key classes + support classes.
    7. Estimate effort: (total classes) * (average # of work units per class)
       (15-20 person days per class).
    8. Cross check the class-based estimate by multiplying
       (avg. # of work units) * (# of scenario scripts).
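The arithmetic above can be sketched as a small function. This assumes the Lorenz-style interface multipliers (no U.I. 2.0, text 2.25, G.U.I. 2.5, complex G.U.I. 3.0); the input values are hypothetical:

```python
# Sketch of the class-based estimate. Multipliers and the
# person-days-per-class range follow the notes; the sample
# inputs (20 key classes, GUI interface) are made up.
MULTIPLIER = {
    "none": 2.0,
    "text": 2.25,
    "gui": 2.5,
    "complex_gui": 3.0,
}

def estimate_person_days(key_classes, interface, days_per_class=18):
    """Support classes = key * multiplier; effort = total classes * work per class."""
    support_classes = key_classes * MULTIPLIER[interface]
    total_classes = key_classes + support_classes
    return total_classes * days_per_class

print(estimate_person_days(key_classes=20, interface="gui"))
# 20 key + 50 support = 70 classes; 70 * 18 person-days = 1260
```

A cross check, per the last step, would multiply the average work units by the number of scenario scripts and compare the two figures.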

    PROJECT SCHEDULING METRICS:

    1. Number of major iterations (around spiral model).
    2. Number of completed contracts.
  • (Goal: at least 1 per iteration)
  • OBJECT ORIENTED MILESTONES:

    1. Contracts completed.
    2. O.O.A. completed.
    3. O.O.D. completed.
    4. O.O.P. completed.
    5. O.O. testing completed.

    SYSTEM DESIGN:

    1. Conceptual Design (function) {for the customer}.
    2. Technical design (form) {for the hw & sw experts}.

    CHARACTERISTICS OF CONCEPTUAL DESIGN:

    1. Written in customer's language.
    2. No techie jargon.
    3. Describes system function.
    4. Should be implementation independent.
    5. Must be derived from requirements document.
    6. Must be cross referenced to requirements document.
    7. Must incorporate all the requirements in adequate detail.

    TECHNICAL DESIGN CONTENTS:

    1. System architecture (sw & hw)
    2. System software structure.
    3. Data.

    DESIGN APPROACHES:

    1. Decomposition (function based, well defined).
       OR
    2. Composition (O.O.D., working from data types not values).
  • PROGRAM DESIGN:

    PROGRAM DESIGN MODULES:

    1. Must contain detailed algorithms.
    2. Data relationships & structures.
    3. Relationships among functions.

    PROGRAM DESIGN GUIDELINES:

    1. Top-down approach vs. bottom-up approach.
       (There are risks and benefits for both)
    2. Modularity & independence (abstraction).
    3. Algorithms:
       1) Correctness.
       2) Efficiency.
       3) Implementation.
    4. Data types:
       1) Abstraction.
       2) Encapsulation.
       3) Reusability.

    WHAT IS A MODULE?

  • A set of contiguous program statements with:
    1) A name.
    2) The ability to be called.
    3) Some type of local environment (variables).
    (Self contained, sometimes can be compiled separately)
  • Modules either contain executable code or create data.

    DETERMINING PROGRAM COMPLEXITY:

  • {More modules - fewer errors in each, but more errors between them}
  • PROGRAM COMPLEXITY FACTORS:

    1. Span of the data item.
       (too many lines of code between uses of a variable)
    2. Span of control.
       (too many lines of code in one module)
    3. Number of decision statements.
       (too many "if - then - else", etc.)
    4. Amount of information which must be understood.
       (too much information to have to know)
    5. Accessibility of information and standard presentation.
       (x,y -> x,y; not y1,y2)
    6. Structured information.
       (good presentation)
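Factor 3 can be made concrete with a crude metric that counts decision keywords in a module's source; the keyword list is a simplified, Python-flavored assumption:

```python
import re

# Hypothetical keyword set; a real tool would parse the language
# rather than pattern-match its source text.
DECISION_KEYWORDS = ("if", "elif", "while", "for", "case")

def decision_count(source):
    """Count decision keywords appearing as whole words."""
    pattern = r"\b(" + "|".join(DECISION_KEYWORDS) + r")\b"
    return len(re.findall(pattern, source))

module = """
if x > 0:
    for i in range(x):
        if i % 2 == 0:
            total += i
"""
print(decision_count(module))  # 3
```

A module whose count climbs past a team's threshold is a candidate for splitting, which trades within-module complexity for interface complexity, per the note above.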