CIS 375 SOFTWARE ENGINEERING

UNIVERSITY OF MICHIGAN-DEARBORN

DR. BRUCE MAXIM, INSTRUCTOR

Date: 11/24/97

Week 12

SOFTWARE RELIABILITY:

  1. Probability estimation.
  2. Reliability: R = MTBF / (1 + MTBF), where 0 < R < 1.
  3. Availability: A = MTBF / (MTBF + MTTR).
  4. Maintainability: M = 1 / (1 + MTTR).
  5. Software Maturity Index (SMI).

Mt = # of modules in current release.

Fc = # of changed modules.

Fa = # of modules added since the previous release.

Fd = # of modules deleted from previous release.

SMI = [Mt - (Fa + Fc + Fd)] / Mt

{As SMI approaches 1, the software becomes more stable.}
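
A minimal sketch of the reliability, availability, maintainability, and SMI formulas above in Python (the MTBF, MTTR, and module counts are hypothetical):

def reliability(mtbf):
    # R = MTBF / (1 + MTBF), so 0 < R < 1
    return mtbf / (1 + mtbf)

def availability(mtbf, mttr):
    # A = MTBF / (MTBF + MTTR)
    return mtbf / (mtbf + mttr)

def maintainability(mttr):
    # M = 1 / (1 + MTTR)
    return 1 / (1 + mttr)

def smi(mt, fa, fc, fd):
    # SMI = [Mt - (Fa + Fc + Fd)] / Mt
    return (mt - (fa + fc + fd)) / mt

print(reliability(100))        # 0.9901...
print(availability(100, 2))    # 0.9804...
print(maintainability(2))      # 0.3333...
print(smi(120, 5, 10, 3))      # 0.85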

ESTIMATING # OF ERRORS:

  1. Error seeding.

(s / S) = (n / N), where S = # of seeded errors, s = # of seeded errors found, n = # of real errors found, and N = total # of real errors; so N is estimated as (S * n) / s.
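
A minimal sketch of the seeding estimate in Python (the counts are hypothetical):

def estimate_total_errors(seeded, seeded_found, real_found):
    # (s / S) = (n / N)  =>  N = (S * n) / s
    return seeded * real_found / seeded_found

# e.g. 20 seeded errors, 16 of them found, 40 real errors found so far
print(estimate_total_errors(20, 16, 40))   # 50.0 real errors estimated in total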

  2. Two groups of independent testers.

E(1) = (x / n) = (# of real errors found by tester 1) / (total # of real errors),
estimated by q / y = (# overlap) / (# found by tester 2)

E(2) = (y / n) = (# of real errors found by tester 2) / (total # of real errors),
estimated by q / x = (# overlap) / (# found by tester 1)

n = q/(E(1) * E(2))

x = 25, y = 30, q = 15

E(1) = (15 / 30) = .5

E(2) = (15 / 25) = .6

n = [15 / (.5)(.6)] = 50 errors
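
A quick check of the two-tester estimate in Python, using the numbers from the example above:

def estimate_errors_two_testers(x, y, q):
    # x = errors found by tester 1, y = by tester 2, q = found by both
    e1 = q / y            # estimated effectiveness of tester 1
    e2 = q / x            # estimated effectiveness of tester 2
    return q / (e1 * e2)  # equivalently x * y / q

print(estimate_errors_two_testers(25, 30, 15))   # 50.0 errors, as in the example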

CONFIDENCE IN SOFTWARE:

  1. S = # of seeded errors.

N = # of actual errors.

C (confidence level) = 1 if n > N

C (confidence level) = [S / (S + N + 1)] if n <= N

  1. Seeded errors found so far.

C = 1 if n > N

C = S / (S + N + 1)

S = # of seeded errors

s = # seeded errors found

N = # of actual errors

n = # of actual errors found so far
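
A minimal sketch of the confidence formula above, assuming (as in the classic Mills result) that all S seeded errors have been found; the counts below are hypothetical:

def seeding_confidence(S, N, n):
    # C = 1 if n > N, otherwise C = S / (S + N + 1)
    if n > N:
        return 1.0
    return S / (S + N + 1)

# e.g. 19 seeded errors (all found), claim of N = 0 actual errors, none found so far
print(seeding_confidence(19, 0, 0))   # 0.95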

INTENSITY OF FAILURE:

  1. Suppose that intensity is proportional to # of faults or errors at start of testing.

Duty time: function A = 90%, function B = 10%.

Suppose 100 total errors: 50 in A, 50 in B.

With each remaining fault contributing intensity K while its function is running:

total intensity = (.9)(50K) + (.1)(50K) = 50K

contribution from B's faults = (.1)(50K) = 5K

contribution from A's faults = (.9)(50K) = 45K

{So removing the errors in heavily-used function A eliminates 45K of the failure intensity, while removing all of B's errors eliminates only 5K.}
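
A small sketch of this duty-cycle weighting in Python (K is a hypothetical per-fault intensity constant):

def weighted_intensity(duty, faults, K=1.0):
    # duty: fraction of execution time per function; faults: remaining faults per function
    return {name: duty[name] * faults[name] * K for name in duty}

contrib = weighted_intensity({"A": 0.9, "B": 0.1}, {"A": 50, "B": 50})
print(contrib)                 # {'A': 45.0, 'B': 5.0}
print(sum(contrib.values()))   # 50.0 total, matching the example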

  1. Zero failure testing.

failures up to time t = a * e^(-b * t)

failure-free test hours needed = [ln(failures / (0.5 + failures)) / ln((0.5 + failures) / (test failures + failures))] * (test hours to last failure)

Example:

33,000 line program.

15 = errors found.

500 = # of hours total testing.

50 = # of hours since last failure.

if we need failure rate 0.03 / 1000 LOC

target failures = (.03)(33) = 0.99 ≈ 1

[ln(1 / (0.5 + 1)) / ln((0.5 + 1) / (15 + 1))] * 450 ≈ 77 hours of failure-free testing
(450 = 500 total test hours - 50 hours since the last failure)
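
A quick check of the zero-failure calculation in Python, using the example's numbers:

import math

def zero_failure_hours(target_failures, test_failures, hours_to_last_failure):
    # failure-free test hours needed before release
    return (math.log(target_failures / (0.5 + target_failures))
            / math.log((0.5 + target_failures) / (test_failures + target_failures))
            * hours_to_last_failure)

print(zero_failure_hours(round(0.03 * 33), 15, 500 - 50))   # ~77.1 hours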

CASE TOOLS:

  1. Analytic tools (used during software development).
  2. Assist in the development and maintenance of software.

UPPER CASE (FRONT-END TOOLS):

LOWER CASE (BACK-END TOOLS):

TAXONOMY OF CASE TOOLS:

  1. Information engineering tools.
  2. Process modeling and management tools.
  3. Project planning tools.
    1. Cost / effort estimation tools.
    2. Project scheduling tools.
  4. Risk analysis tools.
  5. Project management tools (monitor project & maintain management plan).
  6. Requirements tracing tools.
  7. Metrics & management tools.
  8. Documentation tools.
  9. System software tools.
  10. Q.A. tools.
  11. Database management tools.
  12. Software configuration management tools.
  13. Analysis & design tools.
  14. PRO/SIM tools.
  15. Interface design tools.
  16. Prototyping tools.
  17. Programming tools (compilers, editors, etc.).
  18. Integration & testing tools.
  19. Static analysis tools.
  20. Dynamic analysis tools.
  21. Test management tools.
  22. Client/server testing tools.
  23. Reengineering tools.

Date: 11/26/97

Week 12

"LIFE CYCLE" CASE:

  • ("integrated" CASE = I-CASE)
  • I-CASE ENVIRONMENT:

    1. Mechanism for sharing SE information among tools.
    2. Tracking change.
    3. Version control & configuration management.
    4. Direct access to any tool.
    5. Automated support for a work breakdown that integrates the tools.
    6. Support communication among software engineers.
    7. Collect management & technical metrics.

    INTEGRATION ARCHITECTURE:

    1. User interface layer.
    2. Tools management services (TMS).
    3. Object management layer.
    4. Shared repository layer (CASE database).

    CASE REPOSITORY (DB) IN I-CASE:

    Role:

    1. Data integrity.
    2. Information sharing.
    3. Data/tool integration.
    4. Data/data integration.
    5. Methodology enforcement.
    6. Document standardization.

    Content:

    1. Problem to be solved.
    2. Problem domain.
    3. Emerging solution.
    4. Rules pertaining to software process methodology.
    5. Project plan.
    6. Organizational content.

    DBMS:

    1. Non-redundant data storage.
    2. Data access at a high level.
    3. Data independence.
    4. Transaction control.
    5. Security.
    6. Ad hoc queries & reports.
    7. Openness (import/export).
    8. Multi-user support.

    SOFTWARE QUALITY:

    McCall's software quality factors (regression).


    SOFTWARE QUALITY MEASUREMENT PRINCIPLES:

    1. Formulation.
    2. Collection of information.
    3. Analysis of information.
    4. Interpretation.
    5. Feedback.

    ATTRIBUTES OF EFFECTIVE SOFTWARE METRICS:

    1. Simple and computable.
    2. Empirically & intuitively persuasive.
    3. Consistent & objective.
    4. Consistent in use of units & dimension.
    5. Programming language independence.
    6. Provide mechanism for quality feedback.

    SAMPLE METRICS:

    1. Function-based (function points).
    2. Bang metric (DeMarco), based on the ratio RE / FuP (relationships to functional primitives):
      1. RE / FuP < 0.7 indicates a function-strong application.
      2. RE / FuP > 1.5 indicates a data-strong application.
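
    A small sketch of that classification in Python (the "hybrid" label for in-between ratios is an assumption, not stated in the notes):

def classify_application(relationships, functional_primitives):
    ratio = relationships / functional_primitives
    if ratio < 0.7:
        return "function-strong"
    if ratio > 1.5:
        return "data-strong"
    return "hybrid"   # assumption for ratios between 0.7 and 1.5

print(classify_application(12, 30))   # ratio 0.4 -> function-strong
print(classify_application(45, 20))   # ratio 2.25 -> data-strong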

    SPECIFICATION QUALITY METRICS:

    nr = nf + nnf

    where:

    1. nr = total # of requirements in the specification.
    2. nf = # of functional requirements.
    3. nnf = # of non-functional requirements.

    Specificity, Q1 = nai / nr

    1. nai = # of requirements with reviewer agreement.

    Completeness, Q2 = nu / (ni * ns)

    1. nu = unique functions.
    2. ni = # of inputs.
    3. ns = # of states.

    Overall completeness, Q3 = nc / (nc + nnv)

    1. nc = # validated & correct.
    2. nnv = # not validated.
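
    A minimal sketch of these three specification metrics in Python (the counts are hypothetical):

def specificity(n_agreed, n_requirements):              # Q1 = nai / nr
    return n_agreed / n_requirements

def completeness(n_unique, n_inputs, n_states):         # Q2 = nu / (ni * ns)
    return n_unique / (n_inputs * n_states)

def overall_completeness(n_correct, n_not_validated):   # Q3 = nc / (nc + nnv)
    return n_correct / (n_correct + n_not_validated)

print(specificity(40, 50))             # 0.8
print(completeness(18, 5, 4))          # 0.9
print(overall_completeness(45, 5))     # 0.9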

    SOFTWARE QUALITY INDICES:

    IEEE standard - SMI (Software Maturity Index)

    1. SMI = [Mt - (Fa + Fc + Fd)] / Mt
    2. Mt = number of modules in current release.
    3. Fa = modules added.
    4. Fc = modules changed.
    5. Fd = modules deleted.

    COMPONENT LEVEL METRICS:

    1. Cohesion metrics:
      1. Data slice.
      2. Data tokens.
      3. Glue tokens.
      4. Superglue tokens.
      5. Stickiness.
    2. Coupling metrics:
      1. For data & control flow coupling.
      2. For global coupling.
      3. For environmental coupling.
    3. Complexity metrics.
    4. Interface design metrics.
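
    A hedged sketch of slice-based cohesion ratios built from the terms above; the exact formulas are an assumption, since the notes only name the quantities:

def strong_functional_cohesion(superglue_tokens, data_tokens):
    # superglue tokens: data tokens that appear in every data slice of the module
    return superglue_tokens / data_tokens

def weak_functional_cohesion(glue_tokens, data_tokens):
    # glue tokens: data tokens that appear in more than one data slice
    return glue_tokens / data_tokens

print(strong_functional_cohesion(4, 20))   # 0.2
print(weak_functional_cohesion(9, 20))     # 0.45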