sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr

         +===================================================+
         +======= TeSting TechniqueS NewSletter (TTN) =======+
         +=======           ON-LINE EDITION           =======+
         +=======              April 1996             =======+
         +===================================================+

TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-Mailed
monthly to support the Software Research, Inc. (SR) user community and
provide information of general use to the worldwide software testing
community.

(c) Copyright 1996 by Software Research, Inc.  Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition pro-
vided that the entire document/file is kept intact and this copyright
notice appears with it.

TRADEMARKS:  STW, Software TestWorks, CAPBAK/X, SMARTS, EXDIFF,
CAPBAK/UNIX, Xdemo, Xvirtual, Xflight, STW/Regression, STW/Coverage,
STW/Advisor and the SR logo are trademarks or registered trademarks of
Software Research, Inc. All other systems are either trademarks or
registered trademarks of their respective companies.

========================================================================

INSIDE THIS ISSUE:

   o  Quality Week 1996: Program Summary and Conference Description
      (Electronic registration: http://www.soft.com/QualWeek/)

   o  Software Negligence and Testing Coverage (Part 4 of 4), by Cem
      Kaner

   o  New Journal: Automated Software Engineering

   o  Frequently Asked Questions about the Space Shuttle Computers (Part
      3 of 3) by Ed Taft (taft@adobe.com)

   o  ISO 9001 Explained:  A New Book for the Software Quality Commun-
      ity.

   o  Special issue of IBM's Systems Journal devoted to Software Quality
      (IBM SJ January 1994)

   o  TTN SUBMITTAL POLICY

   o  TTN SUBSCRIPTION INFORMATION

========================================================================

           NINTH INTERNATIONAL SOFTWARE QUALITY WEEK (QW'96)

                             21-24 May 1996
            Sheraton Palace Hotel, San Francisco, California

             CONFERENCE THEME: QUALITY PROCESS CONVERGENCE
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Advances in technology have swept the computing industry to new heights
of innovation.  The astonishing growth of the InterNet and the WWW, the
maturation of client-server technology, and the emerging developments
with C++ and Sun's Java Language (tm) illustrate the rapid deployment we
are seeing in the 1990s.  For software quality to keep pace, existing
methods, approaches and tools have to be thought of in well-structured
``process models'' that apply quality control and test methods in a
reasoned, practical way.  Quality Process Convergence - making sure that
applied quality techniques produce real results at acceptable costs - is
the key to success.  The Ninth International Software Quality Week
focuses on software testing, analysis, evaluation and review methods
that support and enable process thinking.  Quality Week '96 brings the
best quality industry thinkers and practitioners together to help you
keep the competitive edge.

                          CONFERENCE SPONSORS
                          ^^^^^^^^^^^^^^^^^^^
The QW'96 Conference is sponsored by SR Institute, in cooperation with
the IEEE Computer Society (Technical Council on Software Engineering)
and the ACM.  Members of the IEEE and ACM receive a 10% discount off all
registration fees.

                     TECHNICAL PROGRAM DESCRIPTION
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The Pre-Conference Tutorial Day offers expert insights on ten key topic
areas.  The Keynote presentations give unique perspectives on trends in
the field and recent technical developments in the community, and offer
conclusions and recommendations to attendees.

The General Conference offers four track presentations, mini-tutorials
and a debate:

      Technical Track Topics include:
              OO Testing
              Specifications
              Ada
              Statistical Methods
              Rule-Based Testing
              Class Testing
              Testability

      Applications Track Topics include:
              Decision Support
              Mission-Critical
              Innovative Process
              Internal Risk
              GUI Testing
              New Approaches

      Management Track Topics include:
              QA Delivery
              Testing Topics
              Process Improvement - I
              Process Improvement - II
              Metrics to Reduce Risk
               Process Improvement - III
              Success Stories

      Quick-Start Mini-Tutorial Track includes:
              An Overview of Model Checking
              Software Reliability Engineered Testing Overview
              Teaching Testers: Obstacles and Ideas
              Testing Object-Oriented Software: A Hierarchical Approach
              Best Current Practices in Software Quality
              A History of Software Testing and Verification
              Software Testing: Can We Ship It Yet?


                   Q U A L I T Y   W E E K   ' 9 6
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                  C O N F E R E N C E   P R O G R A M
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

                  TUESDAY, 21 MAY 1996 (TUTORIAL DAY)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Tutorial Day offers ten lectures in two time slots on current issues and
technologies.  You can choose one tutorial from each of the two time
slots.

      Tuesday, 21 May 1996, 8:30 - 12:00 -- AM Half-Day Tutorials
      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Mr. Robert V. Binder (System Consulting Inc.)  "Object-Oriented System
Testing: The FREE Approach"

Dr. Boris Beizer (ANALYSIS) "An Overview Of Testing Unit, Integration,
System"

Dr. Walt Scacchi (University of Southern California) "Understanding
Software Productivity"

Mr. Lech Krzanik (CCC Software Professionals Oy) "BOOTSTRAP: A European
Software Process Assessment and Improvement Method"

Mr. John D. Musa (AT&T Bell Labs) "Software Reliability Engineered Test-
ing"

       Tuesday, 21 May 1996, 1:30 - 5:00 -- PM Half-Day Tutorials
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mr. Hans-Ludwig Hausen (GMD Gesellschaft fur Mathematik und Datenverar-
beitung mbH) "Software Quality Evaluation and Certification"

Dr. Norman F. Schneidewind (Naval Postgraduate School) "Software Relia-
bility Engineering for Client-Server Systems"

Mr. William J. Deibler, Mr. Bob Bamford (Software Systems Quality Con-
sulting) "Models for Software Quality -- Comparing the SEI Capability
Maturity Model (CMM) to ISO 9001"

Mr. Dan Craigen, Mr. Ted Ralston (ORA Canada) "An Overview of Formal
Methods"

Mr. Tom Gilb (Independent Consultant) "Software Inspection"

             22-24 MAY 1996 -- QUALITY WEEK '96 CONFERENCE
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
         Wednesday, 22 May 1996, 8:30 - 12:00 -- OPENING KEYNOTES
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

   Mr. Walter Ellis (Or Equivalent) (Software Process and Metrics) "NSC:
   A Prospectus And Status Report (Keynote)"

   Mr. Tom Gilb (Independent Consultant) "The `Result Method' for Qual-
   ity Process Convergence (Keynote)"

   Prof. Leon Osterweil (University of Massachusetts Amherst) "Perpetu-
   ally Testing Software (Keynote)"

   Dr. Watts Humphrey (Carnegie Mellon University) "What if Your Life
   Depended on Software?" (Keynote)

         Wednesday, 22 May 1996, 1:30 - 5:00 -- PM Parallel Tracks
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Three regular parallel tracks with four papers per track: TECHNOLOGY,
   APPLICATIONS, MANAGEMENT.  (See the Conference Brochure for complete
   details.)

QUICK START TRACK MINI-TUTORIALS

   Mr. Daniel Jackson (Carnegie Mellon University) "An Overview of Model
   Checking"

   Mr. John D. Musa (AT&T Bell Labs) "Software Reliability Engineered
   Testing Overview"

S P E C I A L   E V E N T

   Dr. Boris Beizer, Mr. Tom Gilb (Independent Consultants) "Testing Vs.
   Inspection -- THE GREAT DEBATE"

         Thursday, 23 May 1996, 8:30 - 12:00 -- AM Parallel Tracks
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Three regular parallel tracks with four papers per track: TECHNOLOGY,
   APPLICATIONS, MANAGEMENT.  (See the Conference Brochure for complete
   details.)

QUICK START TRACK MINI-TUTORIALS

   Mr. James Bach (STL) "Teaching Testers: Obstacles and Ideas"

   Mr. Shel Siegel (Objective Quality Inc.)  "Testing Object Oriented
   SW: A Hierarchical Approach"

          Thursday, 23 May 1996, 1:30 - 5:00 -- PM Parallel Tracks
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Three regular parallel tracks with four papers per track: TECHNOLOGY,
   APPLICATIONS, MANAGEMENT.  (See the Conference Brochure for complete
   details.)

QUICK START TRACK MINI-TUTORIALS

   Mr. Tom Drake (NSA Software Engineering Center) "Best Current Prac-
   tices In Software Quality Engineering"

   Prof. Leon Osterweil, Dan Craigen (University of Massachusetts
   Amherst) "A History of Software Testing and Verification"

          Friday, 24 May 1996, 8:30 - 10:00 -- AM Parallel Tracks
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Three regular parallel tracks with four papers per track: TECHNOLOGY,
   APPLICATIONS, MANAGEMENT.  (See the Conference Brochure for complete
   details.)

QUICK START TRACK MINI-TUTORIALS

   Mr. Roger W. Sherman, Mr. Stuart Jenine (Microsoft Corporation)
   "Software Testing: Can We Ship It Yet?"

           Friday, 24 May 1996, 10:30 - 1:00 -- CLOSING KEYNOTES
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

   Mr. Guenther R. Koch (European Software Institute) "The European
   Software Institute As A Change Agent (Keynote)"

   Mr. Clark Savage Turner (Software Engineering Testing) "Legal Suffi-
   ciency of Safety-Critical Testing Process (Keynote)"

   Dr. Boris Beizer (ANALYSIS) "Software *is* Different (Keynote)"

   Dr. Edward Miller (Software Research) "Conference Conclusion"

       R E G I S T R A T I O N   F O R   Q U A L I T Y   W E E K
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Contact SR/Institute at qw@soft.com or FAX +1 (415) 550-3030 or register
electronically at http://www.soft.com/QualWeek/.

========================================================================

           SOFTWARE NEGLIGENCE AND TESTING COVERAGE (Part 4)

                                   by

                  CEM KANER, J.D., PH.D., ASQC-C.Q.E.

Editor's Note:  This article was first published in the Software QA
Quarterly, Volume 2, No. 2, 1995, pp. 18-26.  Copyright (c) Cem Kaner,
1995.  All rights reserved.  It is reprinted in TTN-Online, in four
parts, by permission of the author.

                             (Part 4 of 4)

23. Every time-slice setting. In some systems, you can control the grain
of switching between tasks or processes. The size of the time quantum
that you choose can make race bugs, time-outs, interrupt-related prob-
lems, and other time-related problems more or less likely. Of course,
coverage is a difficult problem here because you aren't just varying
time-slice settings through every possible value. You also have to
decide which tests to run under each setting. Given a planned set of
test cases per setting, the coverage measure looks at the number of set-
tings you've covered.
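
A minimal sketch of such a coverage measure (in Python, with
hypothetical setting values and test names) might count a setting as
covered once every planned test has been run under it:

    # A time-slice setting counts as covered once every planned test has
    # been run under it; the measure is the fraction of settings covered.
    # Setting values and test names below are hypothetical examples.

    time_slice_settings = [1, 5, 10, 50, 100]    # milliseconds (examples)
    planned_tests = {"race_check", "timeout_check", "interrupt_check"}

    runs = set()                      # (setting, test) pairs actually run

    def record_run(setting, test):
        """Record that a planned test was executed under a given setting."""
        runs.add((setting, test))

    # Example: runs recorded during one test cycle.
    record_run(1, "race_check")
    record_run(1, "timeout_check")
    record_run(1, "interrupt_check")
    record_run(100, "timeout_check")

    covered = [s for s in time_slice_settings
               if all((s, t) in runs for t in planned_tests)]
    print("settings covered: %d of %d (%.0f%%)"
          % (len(covered), len(time_slice_settings),
             100.0 * len(covered) / len(time_slice_settings)))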

24. Varied levels of background activity. In a multiprocessing system,
tie up the processor with competing, irrelevant background tasks. Look
for effects on races and interrupt handling. Similar to time-slices,
your coverage analysis must specify

*    categories of levels of background activity (figure out something
that makes sense) and

*    all timing-sensitive testing opportunities (races, interrupts,
etc.).

25. Each processor type and speed. Which processor chips do you test
under?  What tests do you run under each processor? You are looking for:

*    speed effects, like the ones you look for with background activity
testing

*    consequences of processors' different memory management rules

*    floating point operations

*    any processor-version-dependent problems that you can learn about.

26. Every opportunity for file / record / field locking.

27. Every dependency on the  locked (or unlocked) state of a file,
record or field.

28. Every opportunity for contention for devices or resources.

29. Performance of every module / task / object. Test the performance of
a module then retest it during the next cycle of testing. If the perfor-
mance has changed significantly, you are either looking at the effect of
a performance-significant redesign or at a symptom of a new bug.
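
A quick way to automate this comparison is to diff per-module timings
between two cycles and flag large changes.  The Python sketch below is
only an illustration; the module names, timings, and the 20% threshold
are hypothetical.

    # Compare per-module timings between two test cycles and flag modules
    # whose performance changed significantly (hypothetical data).

    previous_cycle = {"parser": 1.20, "report_writer": 3.40, "importer": 0.75}
    current_cycle  = {"parser": 1.25, "report_writer": 5.10, "importer": 0.74}

    THRESHOLD = 0.20        # flag changes larger than 20%

    for module, old_time in previous_cycle.items():
        new_time = current_cycle.get(module)
        if new_time is None:
            print("%s: no timing recorded this cycle" % module)
            continue
        change = (new_time - old_time) / old_time
        if abs(change) > THRESHOLD:
            print("%s: %.2fs -> %.2fs (%+.0f%%) - redesign or new bug?"
                  % (module, old_time, new_time, 100 * change))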

30. Free memory / available resources / available stack space at every
line or on entry into and exit out of every module or object.

31. Execute every line (branch, etc.) under the debug version of the
operating system. This shows illegal or problematic calls to the operat-
ing system.

32. Vary the location of every file. What happens if you install or move
one of the program's component, control, initialization or data files to
a different directory or drive or to another computer on the network?

33. Check the release disks for the presence of every file. It's amazing
how often a file vanishes. If you ship the product on different media,
check for all files on all media.

34. Every embedded string in the program. Use a utility to locate embed-
ded strings. Then find a way to make the program display each string.
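
The locating step can be as simple as scanning the executable for runs
of printable bytes, in the spirit of the Unix "strings" command.  The
Python sketch below is an illustration only; the file name in the usage
comment is hypothetical.

    # Scan a binary file for runs of printable characters so that each
    # embedded string can later be forced onto the screen during testing.

    import string

    PRINTABLE = set(string.printable.encode()) - set(b"\t\n\r\x0b\x0c")

    def embedded_strings(path, min_len=4):
        """Yield runs of at least min_len printable bytes found in the file."""
        with open(path, "rb") as f:
            data = f.read()
        run = bytearray()
        for byte in data:
            if byte in PRINTABLE:
                run.append(byte)
            else:
                if len(run) >= min_len:
                    yield run.decode("ascii")
                run = bytearray()
        if len(run) >= min_len:
            yield run.decode("ascii")

    # Example use (the path is a hypothetical placeholder):
    # for s in embedded_strings("program.exe"):
    #     print(s)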

Operation of every function / feature / data handling operation under:

35.     Every program preference setting.

36.     Every character set, code page setting, or country code setting.

37.     The presence of every memory resident utility (inits, TSRs).

38.     Each operating system version.

39.     Each distinct level of multi-user operation.

40.     Each network type and version.

41.     Each level of available RAM.

42.     Each type / setting of virtual memory management.

43.     Compatibility with every previous version of the program.

44.     Ability to read every type of data available in every readable
input file format. If a file format is subject to subtle variations
(e.g. CGM) or has several sub-types (e.g. TIFF) or versions (e.g.
dBASE), test each one.

45. Write every type of data to every available output file format.
Again, beware of subtle variations in file formats -- if you're writing
a CGM file, full coverage would require you to test your program's
output's readability by every one of the main programs that read CGM
files.

46. Every typeface supplied with the product. Check all characters in
all sizes and styles. If your program adds typefaces to a collection of
fonts that are available to several other programs, check compatibility
with the other programs (nonstandard typefaces will crash some pro-
grams).

47. Every type of typeface compatible with the program. For example, you
might test the program with (many different) TrueType and PostScript
typefaces, and fixed-sized bitmap fonts.

48. Every piece of clip art in the product. Test each with this program.
Test each with other programs that should be able to read this type of
art.

49. Every sound / animation provided with the product. Play them all
under different device (e.g. sound) drivers / devices. Check compatibil-
ity with other programs that should be able to play this clip-content.

50. Every supplied (or constructible) script to drive other machines /
software (e.g. macros) / BBS's and information services (communications
scripts).

51. All commands available in a supplied communications protocol.

52. Recognized characteristics. For example, every speaker's voice
characteristics (for voice recognition software) or writer's handwriting
characteristics (handwriting recognition software) or every typeface
(OCR software).

53. Every type of keyboard and keyboard driver.

54. Every type of pointing device and driver at every resolution level
and ballistic setting.

55. Every output feature with every sound card and associated drivers.

56. Every output feature with every type of printer and associated
drivers at every resolution level.

57. Every output feature with every type of video card and associated
drivers at every resolution level.


58. Every output feature with every type of terminal and associated pro-
tocols.

59. Every output feature with every type of video monitor and monitor-
specific drivers at every resolution level.

60. Every color shade displayed or printed to every color output device
(video card / monitor / printer / etc.) and associated drivers at every
resolution level. And check the conversion to grey scale or black and
white.

61. Every color shade readable or scannable from each type of color
input device at every resolution level.

62. Every possible feature interaction between video card type and reso-
lution, pointing device type and resolution, printer type and resolu-
tion, and memory level. This may seem excessively complex, but I've seen
crash bugs that occur only under the pairing of specific printer and
video drivers at a high resolution setting. Other crashes required pair-
ing of a specific mouse and printer driver, pairing of mouse and video
driver, and a combination of mouse driver plus video driver plus ballis-
tic setting.
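
To get a feel for how quickly this kind of interaction coverage grows, a
small sketch can enumerate the cross product of the configuration
dimensions.  The device lists below are hypothetical examples, not a
recommended test matrix.

    # Enumerate full-interaction configurations across several hardware
    # dimensions (hypothetical device lists).

    from itertools import product

    dimensions = {
        "video":    ["VGA", "SVGA accelerator"],
        "pointing": ["serial mouse", "PS/2 mouse", "trackball"],
        "printer":  ["laser", "inkjet"],
        "memory":   ["4 MB", "8 MB", "16 MB"],
    }

    combos = list(product(*dimensions.values()))
    print("%d full-interaction configurations to cover" % len(combos))
    for combo in combos[:3]:            # show the first few
        print(dict(zip(dimensions.keys(), combo)))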

63. Every type of CD-ROM drive, connected to every type of port (serial
/ parallel / SCSI)  and associated drivers.

64. Every type of writable disk drive / port / associated driver. Don't
forget the fun you can have with removable drives or disks.

65. Compatibility with every type of disk compression software. Check
error handling for every type of disk error, such as full disk.

66. Every voltage level from analog input devices.

67. Every voltage level to analog output devices.

68. Every type of modem and associated drivers.

69. Every FAX command (send and receive operations) for every type of
FAX card under every protocol and driver.

70. Every type of connection of the computer to the telephone line
(direct, via PBX, etc.; digital vs. analog connection and signaling);
test every phone control command under every telephone control driver.

71. Tolerance of every type of telephone line noise and regional varia-
tion (including variations that are out of spec) in telephone signaling
(intensity, frequency, timing, other characteristics of ring / busy /
etc.  tones).

72. Every variation in telephone dialing plans.

73. Every possible keyboard combination. Sometimes you'll find trap
doors that the programmer used as hotkeys to call up debugging tools;
these hotkeys may crash a debuggerless program. Other times, you'll dis-
cover an Easter Egg (an undocumented, probably unauthorized, and possi-
bly embarrassing feature). The broader coverage measure is every possi-
ble keyboard combination at every error message and every data entry
point.  You'll often find different bugs when checking different keys in
response to different error messages.

74. Recovery from every potential type of equipment failure. Full cover-
age includes each type of equipment, each driver, and each error state.
For example, test the program's ability to recover from full disk errors
on writable disks. Include floppies, hard drives, cartridge drives, opt-
ical drives, etc. Include the various connections to the drive, such as
IDE, SCSI, MFM, parallel port, and serial connections, because these
will probably involve different drivers.

75. Function equivalence. For each mathematical function, check the out-
put against a known good implementation of the function in a different
program.  Complete coverage involves equivalence testing of all testable
functions across all possible input values.
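
A minimal sketch of this kind of equivalence check, using Python's
math.sqrt as the known good implementation and a hypothetical Newton's
method routine standing in for the function under test:

    # Compare the implementation under test against a trusted reference
    # implementation over a range of sample inputs.

    import math

    def my_sqrt(x):
        # Hypothetical stand-in for the implementation under test.
        guess = x if x > 1 else 1.0
        for _ in range(50):
            guess = 0.5 * (guess + x / guess)
        return guess

    TOLERANCE = 1e-9
    failures = []
    for i in range(1, 10001):
        x = i / 100.0                   # sample inputs 0.01 .. 100.00
        expected = math.sqrt(x)         # known good reference
        actual = my_sqrt(x)
        if abs(actual - expected) > TOLERANCE * max(1.0, abs(expected)):
            failures.append((x, expected, actual))

    print("%d of 10000 equivalence checks failed" % len(failures))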

76. Zero handling. For each mathematical function, test when every input
value, intermediate variable, or output variable is zero or near-zero.
Look for severe rounding errors or divide-by-zero errors.
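
A companion sketch for zero handling probes a function at zero and
near-zero inputs and watches for divide-by-zero or non-finite results;
the function under test here is a hypothetical example.

    # Exercise a function at zero and near-zero inputs and report
    # divide-by-zero errors and non-finite (inf or NaN) results.

    def reciprocal_diff(a, b):
        # Hypothetical function under test: difference of reciprocals.
        return 1.0 / a - 1.0 / b

    near_zero = [0.0, 1e-308, -1e-308, 1e-15, -1e-15, 1.0]

    for a in near_zero:
        for b in near_zero:
            try:
                result = reciprocal_diff(a, b)
                if result != result or abs(result) == float("inf"):
                    print("a=%r b=%r gave non-finite result %r" % (a, b, result))
            except ZeroDivisionError:
                print("a=%r b=%r raised divide-by-zero" % (a, b))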

77. Accuracy of every graph, across the full range of graphable values.
Include values that force shifts in the scale.

78. Accuracy of every report. Look at the correctness of every value,
the formatting of every page, and the correctness of the selection of
records used in each report.

79. Accuracy of every message.

80. Accuracy of every screen.

81. Accuracy of every word and illustration in the manual.

82. Accuracy of every fact or statement in every data file provided with
the product.

83. Accuracy of every word and illustration in the on-line help.

84. Every jump, search term, or other means of navigation through the
on-line help.

85. Check for every type of virus / worm that could ship with the pro-
gram.

86. Every possible kind of security violation of the program, or of the
system while using the program.

87. Check for copyright permissions for every statement, picture, sound
clip, or other creation provided with the program.

Usability tests of:

88. Every feature / function of the program.

89. Every part of the manual.

90. Every error message.

91. Every on-line help topic.

92. Every graph or report provided by the program.

Localizability / localization tests:

93. Every string. Check the program's ability to display and use each
string when it is modified by changing the length, using high or low
ASCII characters, different capitalization rules, etc.
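
Variants of this kind can be generated mechanically and substituted into
the program's string resources during testing.  The Python sketch below
is an illustration; the base string and the particular variants are
hypothetical examples.

    # Generate localization-style variants of a UI string: longer and
    # shorter text, changed capitalization, high and low ASCII characters.

    def string_variants(s):
        """Return variants exercising length, character set, and case."""
        return [
            s,                              # original
            s * 3,                          # much longer (expansion languages)
            s[:1],                          # much shorter
            s.upper(),                      # capitalization rules differ
            s.lower(),
            s.replace("e", "\u00e9"),       # high (non-ASCII) char: e -> e-acute
            "\x01" + s,                     # low ASCII control character
        ]

    for v in string_variants("Save file"):
        print(repr(v))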

94. Compatibility with text handling algorithms under other languages
(sorting, spell checking, hyphenating, etc.)

95. Every date, number and measure in the program.

96. Hardware and drivers, operating system versions, and memory-resident
programs that are popular in other countries.

97. Every input format, import format, output format, or export format
that would be commonly used in programs that are popular in other coun-
tries.

98. Cross-cultural appraisal of the meaning and propriety of every
string and graphic shipped with the program.

99. Verification of the program against every program requirement and
published specification.

100. Verification of the program against user scenarios. Use the program
to do real tasks that are challenging and well-specified. For example,
create key reports, pictures, page layouts, or other documents to match
ones that have been featured by competitive programs as interesting
output or applications.

101. Verification against every regulation (IRS, SEC, FDA, etc.) that
applies to the data or procedures of the program.

-----------------------------------------------------------------
CEM KANER JD, PhD.     Attorney / ASQC-Certified Quality Engineer
Read Kaner, Falk & Nguyen, TESTING COMPUTER SOFTWARE (2d Ed. VNR)
1060 Highland Court #4, Santa Clara 95050            408-244-7000
-----------------------------------------------------------------

========================================================================

              NEW JOURNAL: Automated Software Engineering

                                Editors:
         W. Lewis Johnson, (University of Southern California)
            Bashar Nuseibeh, (Imperial College, London, UK)

                ISSN 0928-8910 1996, Volume 3 (4 issues)

           Institutional Price $377.00  Private Price $115.00

     Special rate for members of ACM-SIGART available upon request.

Automated Software Engineering is an archival, peer reviewed journal
publishing research, tutorial papers, surveys, and accounts of signifi-
cant industrial experience in the foundations, techniques, tools and
applications of automated software engineering technology.  This
includes the study of techniques for constructing, understanding, adapt-
ing, and modeling software artifacts and processes.  Both automatic sys-
tems and collaborative systems are covered, as are computational models
of human software engineering activities.

Automated Software Engineering is abstracted/indexed in Computer
Abstracts, Engineering Index, COMPENDEX PLUS, Ei Page One, COMPUSCIENCE,
Computer Literature Index, and INSPEC.

SELECTED CONTENTS, 1995:

Inductive Specification Recovery: Understanding Software by Learning
from Example Behaviors (Cohen) * Systematic Incremental Validation of
Reactive Systems via Sound Scenario Generalization (Hall) * Contextual
Local Analysis in the Design of Distributed Systems (Cheung/Kramer) * On
the Reuse of Software: A Case-Based Approach Employing a Repository
(Katalagarianos / Vassiliou) * The Two-Day Workshop on Research Issues
in the Intersection Between Software Engineering and Artificial Intelli-
gence (Held in Conjunction with ICSE-16) (Kontogiannis/Selfridge)

For complete information on this new journal please contact:

                       Kluwer Academic Publishers
                    Order Department * P.O. Box 358
                  Accord Station * Hingham, MA  02018
                Tel (617) 871-6600 * Fax (617) 871-6528
                         E-Mail kluwer@wkap.com

========================================================================

      FREQUENTLY ASKED QUESTIONS ABOUT THE SPACE SHUTTLE COMPUTERS
                              Part 3 of 3

This is the FAQ list just for the Space Shuttle computer systems.  The
information here was collected by Brad Mears during several years of
working in the shuttle flight software arena, then expanded by Ken Jenks
with major assistance from Kaylene Kindt of the NASA/Johnson Space
Center's Engineering Directorate.  If you believe any part of this docu-
ment is in error, contact me and I'll try to resolve the issue thru
further research.  My email address is kjenks@gothamcity.jsc.nasa.gov.

The latest version of this document is available via anonymous FTP at
ftp://explorer.arc.nasa.gov/pub/SPACE/FAQ/shuttle-GPC-FAQ.txt

                   o       o       o       o       o

11) What Operating System do the GPCs use?  A non-standard one of
course. :-)  Seriously, the PASS and BFS each have their very own custom
designed OS.  It ain't like nothin' else in the world.

12) How is the software tested?  Testing flight software is a multi-
stage process.  NASA has several simulators (of varying fidelities) that
are built around one or more of the flight computers.  This allows the
tester to run the actual flight software in real-time on a real com-
puter.

After a programmer changes a module, he tests that module on the first
level (lowest fidelity) simulator.  If it behaves as expected, it is
given to the people who do integration testing.

Eventually, an entire release of the flight software is delivered to the
Shuttle Avionics Integration Laboratory (SAIL).  SAIL is the best avion-
ics simulator in the world.  It uses real flight-ready hardware tied
into environment simulators.  That is, it has 5 real GPCs, 3 CRTs, real
multiplexers, etc.  A full-up simulation in SAIL has an  astronaut in
the simulator cockpit and about 30 engineers at various test stations
supporting the effort.  The SAIL is capable of simulating all phases of
Shuttle flight.  It is the final arbiter of whether or not the software
is ready to fly.


13) Why doesn't NASA use more modern computers?!?!?!?!  This issue
really bugs a lot of people.  A $2000 486 PC is quite a bit faster than
the AP-101 and can hold a whole lot more memory.  So why doesn't NASA
yank out the AP-101 and stuff in a PC?

There are several reasons - reliability, certification, and performance.

Reliability - The AP-101 has been built to withstand extremes in
temperature, vibration, radiation, etc.  PCs haven't.

Certification - Before ANY piece of hardware is allowed on the shuttle,
it has to undergo extensive testing.  Does the case release toxic fumes
if it catches on fire?  Does it operate the same in zero-G as it does on
Earth?  How about at 3-G?  Are there sharp edges that can cut anything?
Is it compatible with the orbiter's power system?  There are thousands
of questions like this.  For a complex piece of flight-critical
hardware, the certification process can be very expensive.  For example,
the upgrade from the AP-101B to the AP-101S took several years and
untold millions of dollars.  And this was for an UPGRADE!  Can you ima-
gine how much longer it would take for a completely new design?

Performance - It is not clear to me that even a fast 486 could meet the
real-time requirements for the shuttle flight software.  The GPCs have
to process a LOT of input data.  If they fail to provide an output on
time, very bad things could happen.  The use of multiple processors in
the GPCs is something a PC just can't compare with.

Besides the issues of reliability, certification, and performance, there
is one more very good reason to stick with the current computers.  They
work!  Yes, they're old and slow and new computers are "better".  So
what?  The current setup does the job in an acceptable manner.  If NASA
had infinite money, then replacing the GPCs could be justified.  But
right now, there are other things the money could be used for.


References:

From: taft@mv.us.adobe.com (Ed Taft), Adobe Systems Incorporated,
Mountain View, CA; Newsgroups: sci.space.shuttle; Subject: Re: I'd like
a copy of the GPC FAQ...; Date: Fri, 29 Apr 1994 17:47:19 GMT

   "Just as a footnote to this excellent FAQ, I'd like to mention that
   there were several interesting in-depth articles on the development
   of the GPC software in the September 1984 issue of Communications of
   the ACM.  This is a publication of the Association for Computing
   Machinery which should be available in any technical library.
   Perhaps this bibliographic reference should be added to the FAQ."
                                      -- Ed Taft      taft@adobe.com

Dennis R. Jenkins (djenkins@iu.net), "Space Shuttle: The History of
Developing the National Space Transportation System," (c) 1992, ISBN
0-9633974-1-9, pp. 158-163.

Also check out the WWW page about the GPCs:
http://www.ksc.nasa.gov/shuttle/technology/sts-newsref/sts-av.html#sts-dps

P. Newbold, et al., "HAL/S Language Specification," Intermetrics Inc.,
Report No. IR-61-5 (Nov. 1974).

========================================================================

            ISO 9001: Interpreted for Software Organizations
                                   by
                            Ronald A. Radice

                      A NEW BOOK BASED ON THE NEW
               JULY 1994 VERSION OF THE ISO 9001 STANDARD

             Contains the Full Text of the ISO 9001 Clauses

FOREWORD by Watts Humphrey

Editor's Note:  We received this in incoming Email and have reprinted it
with credit to Mr. Ron Radice, Software Technology Transition.

ABOUT THE BOOK:

You'll want to read this book:

     - If you are interested in software process improvement (SPI)
     - If you are pursuing ISO 9001 registration
     - Even if you already have ISO 9001 registration
     - If you are not a software organization, but are interested in
       ISO 9001

The book has completed a worldwide technical review in six countries
with over 20 ISO 9001 experts.

As one reviewer said, "This book was needed five years ago when the
standard was in its beginning use.  It is still very much needed.  It
makes a dense subject easy to understand, especially because it takes a
business perspective."

One user recently said, "I have been using your book along with a tem-
plate for a Quality System Manual provided in another book on ISO 9000,
to help me create a Quality System Manual.  I have found your applica-
tion of the standard to software development extremely helpful.  I think
I would have been feeling my way in the dark without it."

"In short, this book is designed to be used.  If you are interested in
ISO 9001 you should use it." -Watts Humphrey

Topics covered include:

Quality Management Systems (QMS):
     -Contents
     -Objectives
     -Best approach
     -What to be sensitive to when developing a QMS
     -How to establish a QMS that meets your business needs

Audits:
     -What are the different types
     -Auditee's responsibilities
     -Auditor's responsibilities
     -How do auditors work

Getting Registered:
     -The steps that you should follow
     -Business reasons to consider
     -Avoiding overkill
     -Taking a pragmatic approach

For each of the 20 Clauses in ISO 9001:
     -Fully stated clause from the standard
     -Encapsulated, easy to read, requirements restatement
     -Explanation of each requirement
     -Interpretation for software
     -Risks associated with each requirement
     -What auditors will look for

ABOUT THE AUTHOR:

Ronald A. Radice is a principal partner in Software Technology Transi-
tion, a company that provides training, consulting services, diagnostic
services, and software engineering methods and tools.  He is a past
Director of the Software Process Program at the Software Engineering
Institute (SEI) at Carnegie Mellon University. Previously he was the
Director of Software Resources at Groupe Bull.  He worked at IBM for 23
years in technical and managerial positions across all aspects of the
software life cycle.  Ron has co-authored Software Engineering: An
Industrial Approach.  From 1984 to 1988 Ron taught software engineering
at the graduate level at Rensselaer Polytechnic Institute.

Contact: PARADOXICON PUBLISHING, PO Box 1095, Andover, MA 01810.

========================================================================

         IBM Systems Journal: Special Issue on Software Quality

Even though it appeared more than two years ago -- in January 1994 --
the content of this special issue of the world-famous IBM Systems
Journal clearly bears note within the Software Testing community.
(Contact your local IBM office to obtain a copy, asking for IBM Systems
Journal, Vol. 33, No. 1, 1994, IBM Issue Order No. G321-0115.)

The special issue was edited by Gene F. Hoffnagle who also wrote the
Preface.

"Software quality: an overview from the perspective of total quality
management", by S. H. Kan, V. R. Basili, and L. N. Shapiro.

"Forging a silver bullet from the essence of software" by R. G. Mays

"Journey to a mature software process," by C. Billings, J. Clifton, B.
Kolkhorst, E. Lee, and W. B. Wingert

"AS/400 software quality management," by S. H. Kan, S. D. Dull, D. N.
Amundson, R. J. Lindner, and R. J. Hedger

"Adopting Cleanroom software engineering with a phased approach," by P.
A.  Hausler, R. C. Linger, and C. J. Trammell

"RE-Analyzer: From source code to structured analysis," by A. B. O'Hare
and E. W.  Troan

"The impact of object-oriented technology on software quality: Three
case histories," by N. P. Capper, R. J. Colgate, J. C. Hunter, and M. F.
James

"Deriving programs using generic algorithms," by V. R. Yakhnis, J. A.
Farrell, and S. S. Schultz

"In-process improvement through defect data interpretation," by I. Bhan-
dari, M.  J. Halliday, J. Chaar, R. Chillarege, K. Jones, J. S. Atkin-
son, C.  Lepori-Costello, P. Y. Jasper, E. D. Tarver, C. C. Lewis, and
M. Yonezawa

Technical forum:  "Programming quality improvement in IBM," by D. L.
Bencher

Technical note: "On reliability modeling and software quality," by A. J.
Watkins.

========================================================================
------------>>>          TTN SUBMITTAL POLICY            <<<------------
========================================================================

The TTN On-Line Edition is forwarded on the 15th of each month to Email
subscribers via InterNet.  To have your event listed in an upcoming
issue, please Email a description of your event or Call for Papers or
Participation to "ttn@soft.com".  The TTN On-Line submittal policy is as
follows:

o  Submission deadlines indicated in "Calls for Papers" should provide
   at least a 1-month lead time from the TTN On-Line issue date.  For
   example, submission deadlines for "Calls for Papers" in the January
   issue of TTN On-Line would be for February and beyond.
o  Length of submitted non-calendar items should not exceed 350 lines
   (about four pages).
o  Length of submitted calendar items should not exceed 68 lines (one
   page).
o  Publication of submitted items is determined by Software Research,
   Inc., and may be edited for style and content as necessary.

========================================================================
----------------->>>  TTN SUBSCRIPTION INFORMATION  <<<-----------------
========================================================================

To request your FREE subscription, to CANCEL your subscription, or to
submit or propose a calendar item or an article send E-mail to
"ttn@soft.com".

TO SUBSCRIBE: Use the keyword "subscribe" in front of your Email address
in the body of your Email message.

TO UNSUBSCRIBE: please use the keyword "unsubscribe" in front of your
Email address in the body of your message.


                     TESTING TECHNIQUES NEWSLETTER
                        Software Research, Inc.
                            901 Minnesota Street
                   San Francisco, CA  94107 USA  

                        Phone: +1 (415) 550-3020
                Toll Free: +1 (800) 942-SOFT (USA Only)
                          FAX: +1 (415) 550-3030
                          Email: ttn@soft.com
                      WWW URL: http://www.soft.com

                               ## End ##