Workshop on Software Assessment
November 27, 2001
Final Program

General Chair:
Norman F. Schneidewind, Naval Postgraduate School

Program Co-Chairs:
Noel Samaan, Motorola Labs, Motorola Inc.
Allen Nikora, Jet Propulsion Laboratory, California Institute of Technology


Time
Presentation
Speakers
08:30 - 08:45
Registration
08:45 - 09:00
Welcome and Opening Remarks
Norman F. Schneidewind
General Chair
09:00 - 09:45
Quantifying Architectural Attributes: An Analytical Approach

This presentation identifies a set of external attributes of software architectures, then investigates techniques for quantifying them by means of corresponding internal attributes. The work is illustrated with an industrial-strength example.

W. AbdelMoez, Hany H. Ammar, N. Gradetsky, D. Nassar, M. Shereshevsky

Lane Department of Computer Science and Electrical Engineering
West Virginia University
Morgantown, WV

09:45 - 10:30
Can SRE Add Value in Assessing Security-Based Products?

This presentation describes "work in progress" directed at answering this question. The two COTS systems to which SRE is being applied are firewall products that are (or were) under assessment for a particular Evaluation Assurance Level (EAL) certification; both were being assessed at EAL 4, a level corresponding to use in a high-security environment.

W. W. Everett
SPRE, Inc.
Albuquerque, NM
10:30 - 11:00
Break
11:00 - 11:45
The Process-Based Early Prediction Software Reliability Model

There is a need to predict reliability before the code is tested, and even prior to completion of its development. This reliability model predicts the reliability of new code based upon the development organization's metrics for its previously released code, along with its present development capability.

Samuel Keene
Longmont, Colorado
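
As a purely hypothetical illustration of this kind of early, process-based prediction (this is not Keene's model; the metrics, helper names, and all numbers below are invented), one could regress the fault density of prior releases on organizational process metrics and apply the fit to the new release before any code is tested:

  # Hypothetical illustration, not Keene's model: predicting the fault
  # density of a new release from process metrics of prior releases
  # via ordinary least squares. All numbers are invented.
  import numpy as np

  # Prior releases: [process maturity score, % staff turnover, % code reuse]
  X = np.array([
      [3.0, 12.0, 40.0],
      [3.5, 10.0, 45.0],
      [4.0,  8.0, 55.0],
      [4.0, 15.0, 50.0],
  ], dtype=float)
  y = np.array([6.1, 5.2, 3.8, 4.9])   # faults per KLOC in those releases

  # Fit y ~ X with an intercept column.
  A = np.hstack([np.ones((len(X), 1)), X])
  coef, *_ = np.linalg.lstsq(A, y, rcond=None)

  # Predict fault density for the upcoming release before any testing.
  new_release = np.array([1.0, 4.5, 9.0, 60.0])   # intercept + metrics
  print(f"Predicted fault density: {new_release @ coef:.2f} faults/KLOC")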
11:45 - 12:30
Using Excel to Implement Software Reliability Models

Many practitioners don't have access to software reliability tools, or don't have the knowledge to use them effectively. This presentation shows how to help practitioners by using Excel in combination with reliability tools to perform reliability predictions, fault correction predictions, and model validations, and to produce various plots.

Norman F. Schneidewind
Naval Postgraduate School
Monterey, CA
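
As a rough illustration of the kind of computation such a spreadsheet would automate (this is not Schneidewind's model or workbook; the Goel-Okumoto model stands in as a generic example, and the failure data are invented), the following Python sketch fits a reliability growth curve to cumulative failure counts:

  # Illustrative only: fitting a Goel-Okumoto NHPP reliability growth
  # model, mu(t) = a * (1 - exp(-b * t)), to cumulative failure counts.
  import numpy as np
  from scipy.optimize import curve_fit

  def goel_okumoto(t, a, b):
      """Expected cumulative failures by time t."""
      return a * (1.0 - np.exp(-b * t))

  # Hypothetical weekly cumulative failure counts (not real project data).
  weeks = np.arange(1, 11, dtype=float)
  cum_failures = np.array([12, 21, 28, 34, 38, 41, 43, 45, 46, 47], dtype=float)

  # Least-squares estimates of total faults a and detection rate b.
  (a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(50.0, 0.3))

  remaining = a_hat - cum_failures[-1]          # estimated residual faults
  next_week = goel_okumoto(weeks[-1] + 1, a_hat, b_hat)
  print(f"Estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.3f}")
  print(f"Estimated faults remaining: {remaining:.1f}")
  print(f"Predicted cumulative failures by week 11: {next_week:.1f}")

In Excel, an equivalent fit can be obtained with Solver by minimizing a sum-of-squared-errors cell, with the predictions and plots built from ordinary cell formulas.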
 
12:30 - 13:30
Lunch
13:30 - 14:15
Scalable Generation of Complete Interaction Sequences for Testing Graphical User Interfaces

This presentation describes work in generating test sequences for graphical user interfaces (GUIs). It summarizes research that has been applied to real development efforts. The test generation methods focus on state diagrams and regular events for systematically generating scalable test sequences.

Fevzi Belli
University of Paderborn
Germany
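
To make the flavor of such test generation concrete (a minimal sketch only, not the presenter's method; the GUI model, its states, and its events are invented), the following enumerates event sequences from a toy finite-state model of a file dialog, treating sequences that return to the start state as complete interactions:

  # Minimal sketch: systematically enumerating bounded-length event
  # sequences from a finite-state model of a GUI.
  from collections import deque

  # Hypothetical GUI model: state -> {event: next_state}
  FSM = {
      "Closed":   {"open": "Browsing"},
      "Browsing": {"select": "Selected", "cancel": "Closed"},
      "Selected": {"ok": "Closed", "deselect": "Browsing"},
  }

  def interaction_sequences(start, max_len):
      """Breadth-first enumeration of event sequences returning to start."""
      complete = []
      queue = deque([(start, [])])
      while queue:
          state, events = queue.popleft()
          if events and state == start:
              complete.append(events)   # a complete interaction sequence
              continue
          if len(events) == max_len:
              continue
          for event, nxt in FSM[state].items():
              queue.append((nxt, events + [event]))
      return complete

  for seq in interaction_sequences("Closed", max_len=4):
      print(" -> ".join(seq))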
14:15 - 15:00
Controlling the Effects of Complexity in Software Testing (Testing of GUI Systems)

This presentation focuses primarily on the problem of testing GUI systems. Clearly, for large GUI systems there is insufficient time to exhaustively test all potential combinations. The proposed approach is to select the tests of greatest interest to the user, specifically testing the behaviors the user cares about most. In addition, the total testing time is always limited, and may even be unknown; therefore, more time must be spent actually running tests and less on analysis to determine which tests to run.

Lee White
Department of Electrical Engineering and Computer Science
Case Western Reserve University
Cleveland, Ohio
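
One simple way to picture this kind of selection under a time budget (a hypothetical sketch, not the presenter's technique; the tests, interest weights, and running times are invented) is a greedy choice by user interest per minute of execution:

  # Hypothetical sketch: greedily selecting GUI tests under a limited
  # time budget, favoring behaviors the user values most per unit time.
  tests = [
      # (name, user_interest_weight, estimated_minutes)
      ("open/save round trip",   9, 4),
      ("print preview",          3, 6),
      ("undo/redo chain",        8, 3),
      ("drag-and-drop reorder",  5, 5),
      ("keyboard-only workflow", 7, 2),
  ]

  def select(tests, budget_minutes):
      chosen, used = [], 0
      # Highest interest per minute first.
      for name, w, mins in sorted(tests, key=lambda t: t[1] / t[2], reverse=True):
          if used + mins <= budget_minutes:
              chosen.append(name)
              used += mins
      return chosen, used

  chosen, used = select(tests, budget_minutes=10)
  print(f"Selected ({used} min):", ", ".join(chosen))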
15:00 - 15:30
Break
15:30 - 16:15
Reduction of Cycle Time for System Design Validation: Technology, Process and Training

The presentation elaborates on the author's novel modularized approach, introduced in his ISSRE'01 presentation, to correlating knowledge extracted from the application of reliability engineering models, from characterization of dynamic simulation of executable objects, and from characterization of user profiles and of past test cases that have been executed at least once during past test releases. The presentation also highlights difficulties encountered in deploying such technology within a large organization. Finally, the author intends to stimulate discussion on extending the applicability of the methodology to system validation, on making the technology transition smoother, and on collaboration between educational institutions (i.e., academia) and professional institutions (e.g., the IEEE) to improve software engineering practices.

Noel Samaan
Motorola Labs
Motorola Inc.
Schaumburg, Illinois
16:15 - 17:00
A Practical Software Measurement Mechanism

Any software measurement system must have the following characteristics:

  • Measurements must be meaningful and repeatable.
  • Measurements must be consistent.
This presentation, which provides detail beyond the authors' ISSRE'01 presentation, describes a measurement capability being implemented at the Jet Propulsion Laboratory, consisting of three components:
  • Structural measurement
  • Fault burden computation
  • Fault measurement and identification
This information is used to develop a model from which absolute fault burdens can be estimated.
Allen Nikora
Jet Propulsion Laboratory,
California Institute of Technology

John Munson
University of Idaho
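
A minimal, hypothetical sketch of the structural-measurement-to-fault-burden idea (not JPL's actual system; the metric names, weights, module names, and data are all invented) might standardize raw module metrics, collapse them to one relative-complexity score per module, and apportion a predicted fault burden accordingly:

  # Hypothetical sketch: from raw structural metrics to an estimated
  # per-module share of a predicted total fault burden.
  import numpy as np

  modules = ["parser", "planner", "telemetry", "scheduler"]
  # Rows: modules; columns: raw metrics (LOC, cyclomatic complexity, fan-out).
  raw = np.array([
      [1200, 45, 14],
      [ 800, 30,  9],
      [2400, 90, 25],
      [ 600, 20,  6],
  ], dtype=float)

  # Standardize each metric (z-scores) so no single metric dominates.
  z = (raw - raw.mean(axis=0)) / raw.std(axis=0)

  # Collapse to one score per module; equal weights stand in for the
  # principal-components weighting a real measurement system might derive.
  relative_complexity = z.mean(axis=1)

  # Apportion a predicted total fault burden in proportion to complexity.
  share = np.exp(relative_complexity) / np.exp(relative_complexity).sum()
  predicted_total_faults = 40.0   # assumed output of a calibrated model
  for name, s in zip(modules, share):
      print(f"{name:10s} estimated faults: {predicted_total_faults * s:.1f}")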

17:00 - 17:30
Concluding Remarks and Open Discussion
Norman F. Schneidewind,
Attendees