  186–187
  expanding, 156–164, 201
  implementing, 188–190
  interrelationships of, 176
  ordering techniques for, 165–176, 201
  Programmatic Performance Checklist (PPC), 10–37
  Programmatic Recovery Checklist (PRC), 58–108
  selecting, 187
Contract/Subcontract, outline of, 231–233
corporation
  corporate data and, 196
  Policies, Plans, and Processes and, 24–25
Cost of Quality Position, 31, 92
Critical Design Review (CDR), 118
Critical Success Factors (CSFs), 41, 43, 112–113, 144, 269–270
culture, of "generation X," 7
customer
  acceptance of final delivery by, 36–37, 106–107
  approval of Design Review, 45, 127
  approval of System Test Plan/Procedure, 52
  customer data and, 196
  determining needs of, 59–67, 70–71
Customer Meetings, 15–16, 29–30, 65, 90, 96
Customer Processes
  Causes for Action and, 167, 201
  Design and, 42–43, 117–119
  Production/Manufacturing and, 49–50, 138–139
  Project/Program Plan and Technical Plan linkage, 23–24, 75–76, 166
  researching, 162

Data Delivery Matrix, 247–248
Data Item Descriptions (DIDs), 35–36, 132, 147, 152–154, 162
Data Management, 12
  amount of data on time, 35–36, 103–104
  recovery issues, 58, 103–104
Data Plan, 35–36, 103–104, 146–147
Data Sheets, 170
data trail, importance of, 23–24, 195–196
de Bono, Edward, 160
DECISION Systems, Inc., 171
Deming, W. Edwards, 170
Department of Defense (DOD) standards, 162
Design, 39, 42–45
  architecture-level issues in, 43–44, 122
  correctness of, 43, 119–120
  efficiency of, 43, 120–122
  Life Cycle Cost (LCC) and, 44–45, 123–124
  recovery issues, 74, 110, 117–125
  segmentation of, 44, 122–123
  Technical Performance Measures (TPMs) and, 45, 124–125
  traceable to processes, 42–43, 117–119
Design Reviews, 16, 22, 29, 39, 42–43, 45, 66, 88, 90, 96, 122, 150
  completion according to required processes, 45, 125–127
  customer approval of, 45, 127
  Design Review Approval Form, 127, 128, 263–274
  recovery issues, 74, 110, 125–127
Design To Cost (DTC) approach, 44, 124
documentation
  data trail in, 23–24, 195–196
  in Design Review, 127
  importance of, 193–195
  interrelationships in, 5
  library in, 194–195
  of prototype changes, 47–48, 133
  of Specifications, 68
  standard, 196
  Statement of Work (SOW), 10–12, 59–60, 63–64
DOORS, 27–28
drop shipping, 37, 107–108
"Dumping" the Fix, 190

Early, John F., 170
Earned Value Measurement System (EVMS), 90
EIA-649, 151–152, 153
80/20 Rule, 192
85:15 Rule, 166–167, 192, 193, 201
Employee Assistance Programs (EAPs), 100
enterprise data, 118, 196
Enterprise Processes
  Causes for Action and, 167, 201
  Design and, 42–43, 117–119
  Production/Manufacturing and, 49–50, 138–139
  Project/Program Plan and Technical Plan linkage, 24–25, 76–77, 162
  researching, 162
Entity-Relationship Diagrams, 175
ethics, 43, 63, 86
Expansion Methodologies, 156–164, 201
Experience Window, 253–254

FAA Standards, 162
Failure Mode and Criticality Analysis (FMECA), 121, 166, 170, 182–184, 201
Failure Mode Effect Analysis (FMEA), 121, 166, 170, 182–184, 201
  development of, 182–183
  software for, 183
Family of Causes, 157
fast-tracking, 80
Final Delivery, 12
  acceptance by customer without delay, 36–37, 106–107
  avoidance of third-party or drop shipping, 37, 107–108
  recovery issues, 58, 106–108
First Articles, 47, 133
Fishbone Diagram, 167–171
Fishman, George S., 186
Flow Charting, 178
FMECA—Failure Mode and Criticality Analysis, 121, 182–184
Force Field Analysis, 180–182, 201
  development of, 180–182
  software for, 182
Functional Manager, 81

G&A (General and Administrative) expenses, 80
Gallery Walking, 178
Guffey, Mary Ellen, 156–157, 164n

hardware, Test Plans, 142–143
hazardous conditions, 100–101
Histograms, 170, 178
holes
  creation of, 133–136
  eliminating, 186–187

IBM, User Access Guide, 42, 54n
IEEE, 132
implementing Cause Descriptions, 188–190
  "Dumping" the Fix, 190
  "On-Ramps," 189–190
  "Slipping in the Fix," 189, 190
incremental construction, 47, 132
Independent Research and Development (IR&D) programs, 19, 61–62, 69–70
Infinite Innovations Ltd., 159
In-Process Reviews (IPRs), 16–17, 29, 33, 39, 45–46, 66, 89, 90, 95, 96, 118
  approval by appropriate authority, 46, 129–130
  completion according to required processes, 45, 127–129
  In-Process Review Approval Form, 130, 265–266
  recovery issues, 74, 110, 127–130
Interfaces, 68, 72
  Interface Control Document (ICD), 115, 123
  user, 42, 116–117
interpersonal conflict, 34–35, 82, 99–101
Ishikawa, Kaoru, 167–168
Ishikawa Diagram, 167–171
ISO-9000, 132
ISO-9001, 23–25, 75–77
ISO-10007, 151–152, 153
ISO/IEC 12207, 151–152, 153

Janus process, 53–54, 151
Jiro, Kwakita, 172

key functions, defining, 42, 114–115
KJ Method, 171–176

leave time, 97–98
legal responsibility, 1
Lewin, Kurt, 180–181
Life Cycle Cost (LCC), 44–45, 123–124
Light Voting, 178
line design, 50, 139
liquidated damages, 141
load, for System Tests, 53, 55n, 149
Lotus Flower Diagrams, 178
Lowest Replaceable Unit (LRU), 183
Luttman, Robert, 176n

major elements, description of, 42, 115–116
Manhattan Project, 2, 184
Materials, 11–12, 30–33
  Production/Manufacturing and, 50, 140–141
  Purchase Order monitoring, 31–33, 50, 92–95
  Purchase Order preparation, 30–31, 50, 91
  recovery issues, 57–58, 91–95
  vendor performance and, 33, 91–92, 95
  vendor competence and, 31, 91–92
Materials Manager, 31
matrix management, 81, 97, 98
MBWA (Management by Walking Around), 33
McDermott, Robin E., 184
Mean Time Between Failure (MTBF), 41, 112–113
Mean Time To Repair (MTTR), 41, 112–113
metrics, 6, 17
Microsoft, Interface Guidelines, 42, 54n
Milestone Reviews, 22
MIL-STD-61, 151–152
MIL-STD-100, 23–25, 75–77
MIL-STD-245, 12, 60, 64, 71
MIL-STD-483, 151–152, 153
MIL-STD-490, 18, 69
MIL-STD-498, 151–152, 153
MIL-STD-973, 132, 151–154
MIL-STD-1423, 132
MIL-STD-1521, 126–127, 131, 132
MIL-STD-1629, 121
minutes, of meetings, 68, 127
Mission Statement
  Organization and, 26
  Policies, Plans, and Processes and, 24
modifying methods, 196
modules/subsystems
  defining, 41, 113–114
  design and, 51, 143
monitoring
  of Purchase Orders, 31–33, 50, 92–95
  of Specification (Spec), 21–22, 72–73
  of Statement of Work (SOW), 15–16, 65–66
  in Teaming, Alliances, and Subcontracts, 29, 88–89
Monte Carlo Simulation, 184–186, 192, 201
  development of, 184–185
  software for, 186
MTBF (Mean Time Between Failure), 41, 112–113
MTTR (Mean Time To Repair), 41, 112–113
Municipal Government Standards, 162

NASA
  SpecsIntact, 68
  standards of, 162
negotiation
  Negotiation Checklist, 267–268
  of Specification (Spec), 20–21, 71–72
  of Statement of Work (SOW), 14–15, 64–65
  in Teaming, Alliances, and Subcontracts, 28–29, 87–88

"On-Ramps," 189–190
On the Job Training (OJT), 97, 102
ordering Causes for Action
  Affinity Diagrams, 166, 171–176, 201
  Cause and Effect Diagram, 166, 167–171, 201
  85:15 Rule, 166–167, 192, 193, 201
  Relationship Diagrams, 173–176, 201
Organization, 11, 25–26
  mix of personnel, 26, 78–82, 98
  number of personnel, 25, 77–78
  recovery issues, 57, 77–83
  teamwork of personnel, 26, 82–83
out-of-tolerance, 66–67, 94
overlaps
  creation of, 133–136
  eliminating, 186–187
overtime costs, 80

Pacestar Software, 173, 175
Pareto, Vilfredo, 178
Pareto Analysis, 170, 178–180, 192, 193, 201
  development of, 178–180
  software for, 180
Pareto Principle, 178–180, 192, 193
Performance Characteristics, 68, 71–72
Personnel, 12, 33–35
  availability when needed, 34, 97–98
  competence for tasks, 33–34, 95–97, 102
  interpersonal conflict and, 34–35, 82, 99–101
  mix of, 26, 78–82, 98
  number of, 25, 77–78
  recovery issues, 58, 95–101
  salaries/wages equal to or less than bid, 34, 98
  for System Tests, 53, 149–150
  teamwork of, 26, 82–83
  see also Training
PERT Charts, 174
Phoenix Award, xiv, 199
Physical Characteristics, 68, 72
Plans, Progress, and Problems Meetings, 30, 89–90
Plsek, P. E., 170
Policies, Plans, and Processes, 11, 23–25
  Customer Processes, 23–24, 75–76, 162
  Enterprise Processes, 24–25, 76–77, 162
  Policy-to-Plan Trail, 251–252
  Project/Program Processes, 162–164, 201
  recovery issues, 57, 74–77
  Standard Processes, 23, 74–75, 161–162
Policy-to-Plan Trail, 251–252
Posttest Reviews, 29, 89
Preliminary Design Review (PDR), 22–23, 118
Pre Planned Product Improvement (P3I), 44, 124
Pretest Meetings, 29, 89
Probability Density Functions (PDFs), 185
problem-solving process, 72, 156–157
Problem Test Reports (PTRs), 52, 144–146
Production/Manufacturing, 40, 49–50
  line design for, 50, 139
  Materials in, 50, 140–141
  recovery issues, 74, 111, 138–141
  shop orders in, 50, 139–140
  traceability of, 49, 138–139
Profit and Loss (P&L) responsibility, 1
program
  defined, 1
  project versus, 1–2
  requirements control matrix, 4
Programmatic Performance Checklist (PPC), 9–37
  Data Management assertions, 12, 35–36
  Final Delivery assertions, 12, 36–37
  Materials assertions, 11–12, 30–33
  Organization assertions, 11, 25–26
  Personnel assertions, 12, 33–35
  Policies, Plans, and Processes assertions, 11, 23–25
  Quality assertions, 12, 36
  Specification assertions, 11, 17–23
  Statement Of Work (SOW) assertions, 10–17
  Teaming, Alliances, and Subcontracts assertions, 11, 26–30
  Training assertions, 12, 35
Programmatic Recovery Checklist (PRC), 9, 56–108
  Data Management assertions, 58, 103–104
  Final Delivery assertions, 58, 106–108
  Materials assertions, 57–58, 91–95
  Organization assertions, 57, 77–83
  Personnel assertions, 58, 95–101
  Policies, Plans, and Processes assertions, 57, 74–77
  Quality assertions, 58, 104–106
  Specification assertions, 57, 67–74
  Statement Of Work (SOW) assertions, 57, 59–67
  Teams, Alliances, and Subcontracts assertions, 57, 83–90
  Training assertions, 58, 101–103
Program Office (PO), 6
Program Test Plan (PTP), 51
Progress Reviews, 29, 30, 88, 89
project
  defined, 1
  program versus, 1–2
  project data and, 196
  requirements control matrix, 4
Project Advisory Council, 24–25
Project Management Benchmarking Network (PMBN), 160–161
Project/Program Plan, 43
  outline of, 219–222
  overview, 4
  Technical Plan and, 23–25, 74–77
Project/Program Processes, researching, 162–164, 201
Project Reviews, frequency of, 15–16
Prototypes, 39, 46–48
  Change Control Process and, 47–48, 133
  changes accepted by originator of requirements, 47–48, 133
  incremental construction of, 47, 132
  recovery issues, 74, 110, 130–133
  reflection of requirements, 46, 130–132
Purchase Orders (POs), 5–6, 40, 48–49
  completeness of, 49, 136–138
  monitoring of, 31–33, 50, 92–95
  preparation of, 30–31, 50, 91
  recovery issues, 74, 111, 135–138
  sum of all purchases in, 48–49, 135–136

Qualification Requirements, 68, 72
Quality, 12, 36
  characteristics of, 36, 105
  measurement of, 36, 105–106
  Quality Plan, 36, 104–105
  recovery issues, 58, 104–106
Quality America, Inc., 171
Quality Assurance Plan, 31, 68, 92, 104, 237–240
Quality Control Plan, 104
Quality Standards, 36
Quantum Improvement (QI), 192–193
Quantum Process Improvement (QPI), 192

Radar Diagrams, 178
rapid prototyping, 7
Reengineering, 192
Relationship Diagrams, 173–176, 201
  development of, 173–175
  software to support, 175–176
Relex Corporation, 183
requirements, 3
Requirements definition team, 15
Requirements Flow-Down Matrix (RFM)
  Architecture and, 42, 114–115
  described, 3–5, 245–246
  Purchase Orders and, 91, 136, 137
  Subcontracts and, 26–28, 83, 84, 134, 135
Requirements Traceability Checklist (RTC)
  Architecture and, 42, 114–115
  Design and, 43, 119
  Purchase Orders and, 49, 91, 136, 137
  Subcontracts and, 48, 83, 84
  Unit Tests and, 51
Requirements Traceability Matrix (RTM)
  Causes for Action and, 74, 187
  described, 3–6, 241–244
  Design and, 119–122
  Specification and, 22, 73, 74
  Subcontracts and, 26–28, 48, 134
  System Tests and, 52, 147–148
  Unit Tests and, 143, 144
Research and Development (R&D), 114, 122
Review Board, change requests and, 54, 152–153
Risk Mitigation Plan, 6, 28, 62, 69–70, 85, 227–230
Root Cause Analyst, 171
Rubenstein, Reuven Y., 185
Run Charts, 178

salaries/wages, relative to bid for project or program, 34, 98
Scattergrams, 178
schedules, 6
  mix of personnel and, 79–81
Schedule Reviews, 16, 30, 33, 65, 66, 89, 90, 96
Search Tables, 7–8, 156, 157, 188–189
Senge, Peter, 170
Senior Advisory Council, 195, 196
shop orders, 50, 139–140
Show Cause letter, 87
Six Sigma, 192
SkyMark, 160, 171, 180, 182
"Slipping in the Fix," 189, 190
SmartDraw.com, 173, 176
Sobol, Ilya M., 186
software
  for Affinity Diagrams, 173, 175–176
  for brainstorming, 159–160
  for Cause and Effect Process, 170–171
  for Failure Mode Effect Analysis (FMEA), 183
  for Force Field Analysis, 182
  for Monte Carlo Simulation, 186
  for Pareto Analysis, 180
  for Relationship Diagrams, 175–176
  requirements for, 7
  Test Plans, 142–143
Software Configuration Management (SCM), 152
Software Engineering Institute (SEI), 152
Specialty Discipline Studies, 126–127
Specification (Spec), 11, 17–23
  capabilities for completing, 18–20, 69–70
  definition of, 17–18, 67–69, 83
  described, 3–5
  interpretation of, 20, 70–71
  monitoring of, 21–22, 72–73
  negotiation of, 20–21, 71–72
  performance of, 22–23, 73–74
  Policies, Plans, and Processes and, 23
  recovery issues, 57, 67–74
  Requirements Traceability Matrix (RTM) for, 22
  subcontracts and, 26, 85–86
  topics covered in, 21
  types of, 18
  see also Quality
SpecsIntact, 68
Stamatis, Dean H., 184
Standard Processes
  Causes for Action and, 167, 201
  Design and, 42–43, 117–119
  Production/Manufacturing and, 49–50, 138–139
  Project/Program Plan and Technical Plan linkage, 23, 74–75, 161–162
  researching, 161–162
standards
  documentation of, 196
  Quality, 36
Standards Traceability Index (STI), 43
Standards Traceability Matrix (STM), 23–25, 138, 255–258
Statement Of Work (SOW), 10–17
  capabilities for completing, 12–13, 60–62
  definition of, 10–12, 14, 59–60, 83, 118
  described, 3–5
  interpretation of, 13–14, 62–64
  monitoring of, 15–16, 65–66
  negotiation of, 14–15, 64–65
  performance of, 16–17, 66–67
  Policies, Plans, and Processes and, 23
  recovery issues, 57, 59–67
  Specifications and, 22
  subcontracts and, 26, 83, 85
Sterling, John C., 45
Strategic Plan, Policies, Plans, and Processes and, 24
Subcontractor Meetings, 29, 30, 66, 88, 90, 96
Subcontract Requirements Flow-Down Matrix (SRFM), 5, 83
Subcontract Requirements Traceability Matrix (SRTM)
  teaming and, 26–28, 83
  Unit Tests and, 51
Subcontracts, 5–6, 11, 26–30, 40, 48
  capabilities for completing, 28, 84–87
  Contract/Subcontract outline, 231–233
  definition of, 26–28, 83–84
  monitoring of, 29, 88–89
  negotiation of, 28–29, 87–88
  performance of, 29–30, 89–90
  recovery issues, 57, 74, 83–90, 111, 133–135
  Specifications in, 26, 85–86
  tasks allocated in, 28, 48, 84–87, 133–135
Subcontracts/Purchase Order Status List, 138
Sub Program Offices (SPOs), 2
Subsystem Tests, 144, 150
  Unit Tests and, 51, 53, 144
synergy, 158
System Effectiveness Factors, 40, 54, 55n
  consideration of all appropriate, 54, 154–155
  recovery issues, 111, 154–155
System Tests, 40, 52–53
  concurrent tests of all elements, 52, 148–149
  loads in, 53, 55n, 149
  personnel in, 53, 149–150
  procedures approved by customer, 52, 146–147
  recovery issues, 74, 111, 146–150
  results of prior-level tests and, 53, 144, 150
  traceable to requirements, 52, 147–148
  Unit Tests and, 51, 53, 144

tasking, 6
Task Qualification Matrix, 69–70
Teaming, Alliances, and Subcontracts, 11, 26–30
  monitoring in, 29, 88–89
  negotiation in, 28–29, 87–88
  Organization and, 26, 82–83
  performance of, 29–30, 89–90
  recovery issues, 57, 83–90, 111, 133–135
  subcontract definition in, 26–28, 83–84
  tasks within capabilities, 28, 84–87
  see also Subcontracts
Teaming Agreements, 28
Team Meetings, frequency of, 15–16
Technical Interchange Meetings (TIMs), 16, 29, 30, 33, 66, 88, 89, 90, 95, 96
Technical Performance Checklist (TPC), 38–55
  Architecture assertions, 39, 41–42
  Configuration Management assertions, 40, 53–54
  Design assertions, 39, 42–45
  Design Review assertions, 39, 45
  In-Process Review assertions, 39, 45–46
  Production/Manufacturing assertions, 40, 49–50
  Prototype assertions, 39, 46–48
  Purchase Order assertions, 40, 48–49
  Subcontract assertions, 40, 48
  System Effectiveness Factors assertions, 40, 54, 55n
  System Test assertions, 40, 52–53
  Unit Test assertions, 40, 51–52
Technical Performance Measures (TPMs), 45, 124–125
Technical Plan, 43
  outline of, 223–226
  Project/Program Plan and, 23–25, 74–77
Technical Recovery Checklist (TRC), 109–155
  Architecture assertions, 110, 112–117
  Configuration Management assertions, 111, 150–154
  Design assertions, 110, 117–125
  Design Review assertions, 110, 125–127
  In-Process Review assertions, 110, 127–130
  Production/Manufacturing assertions, 111, 138–141
  Prototype assertions, 110, 130–133
  Purchase Order assertions, 111, 135–138
  Subcontract assertions, 111, 133–135
  System Effectiveness Factor assertions, 111, 154–155
  System Test assertions, 111, 146–150
  Unit Test assertions, 111, 141–146
Technology Associates, 186
third-party shipping, 37, 107–108
Tiger Team, 67
Total Quality Leadership (TQL), 192
Total Quality Management (TQM), 170, 192
Training, 12, 35
  adequacy of, 35, 101–102
  economical, 35, 102–103
  On the Job (OJT), 97, 102
  mix of personnel and, 81
  recovery issues, 58, 101–103
  teamwork and, 82–83
Tree Diagram, 168–171

Ulam, Stan, 184
U.S. Army, 84
Unit Test Plan (UTP), 142–143, 144
Unit Tests, 40, 51–52
  forwarded to Subsystem and System Tests, 51, 144
  of individual design elements, 51, 143
  Problem Test Reports (PTRs) and, 52, 144–146
  recovery issues, 74, 111, 141–146
  requirements and, 51, 141–143
user interfaces, definition of, 42, 116–117

vacation time, 97–98
Vendor Evaluation Forms, 32, 85, 93, 260, 261, 262
vendors
  competence of, 31, 91–92
  evaluation of, 32, 85, 93, 260, 261, 262
  performance of, 33, 91–92, 95
Vendor/Subcontractor Database, 28
Version Description Document (VDD), 54, 153–154
Vision
  Organization and, 26
  Policies, Plans, and Processes and, 24

wages/salaries, relative to bid for project or program, 34, 98
Work Breakdown Structure (WBS)
  Architecture and, 42, 113, 114–115
  described, 3–5
  Design and, 44, 122–123
  Prototypes and, 132
  Purchase Orders and, 49
  Subcontracts and, 48, 83
Work Orders, 141
Work Package Leaders, 6
Work Package (WP), described, 3–6
Wormer, James Van, 166–167, 176n, 180
