Introduction to Testing
Testing Definition
• The process of operating a system or component under specified conditions, observing and recording the results, and making an evaluation of some aspect of the system or component (IEEE).
Testing vs. Debugging
• Debugging: the act of attempting to determine the cause of the symptoms of malfunctions detected by testing or by [frenzied] user complaints [Beizer, Boris 1990]
• Testing: the process of establishing confidence that a program or system does what it is supposed to [Hetzel, William 1973]
• Testing: the process of executing a program or system with the intent of finding errors [Myers, 1979]
• Testing: any [author’s opinion] activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results [Hetzel, Bill 1988]
Testing vs. Debugging
• The last definition “includes” reviews by people
• The purpose of testing is to show that a program has bugs [Hetzel, Bill 1988]
• The purpose of debugging is to find the error or misconception that leads to the program’s failure, and to design and implement the program changes that correct the error
Types of Testing
• Formal vs. Informal
– Formal: planned tests; cases and harnesses retained; results tracked. “Process of conducting testing activities and reporting test results in accordance with an approved test plan” [Hetzel, 1988]
• Levels of testing
– Unit testing
– Integration testing
– System testing
– Acceptance testing
Functional Testing
• Testing that ignores the internal mechanisms of a system or component and focuses solely on the outputs generated in response to selected inputs and external conditions
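As a minimal illustration in Python (the `is_leap_year` function and its spec are hypothetical), a functional test drives the unit purely through its specified inputs and outputs and never inspects its internals:

```python
def is_leap_year(year: int) -> bool:
    # Unit under test; its internals are irrelevant to a black-box test.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def test_is_leap_year() -> None:
    # Inputs and expected outputs come straight from the specification:
    # divisible by 4, except centuries, except centuries divisible by 400.
    assert is_leap_year(2024) is True    # divisible by 4
    assert is_leap_year(2023) is False   # not divisible by 4
    assert is_leap_year(1900) is False   # century not divisible by 400
    assert is_leap_year(2000) is True    # century divisible by 400

test_is_leap_year()
```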
Structural Testing
• Testing that takes into account the internal mechanism of a system or component
• Types of structural testing:
– Branch testing
– Path testing
– Statement testing
Branch Testing
• Testing designed to execute each outcome of each decision point in a computer program
Path Testing
• Testing designed to execute all or selected paths through a computer program
Statement Testing
• Testing designed to execute each statement in a computer program
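To make the three structural criteria above concrete, here is an illustrative Python sketch (function and inputs are hypothetical) contrasting statement, branch, and path coverage on one function with two independent decisions:

```python
def classify(x: int, y: int) -> str:
    label = ""
    if x > 0:            # decision 1
        label += "pos-x "
    if y > 0:            # decision 2
        label += "pos-y"
    return label

# Statement coverage: a single test that executes every statement.
assert classify(1, 1) == "pos-x pos-y"

# Branch coverage: each decision must take both its True and its False
# outcome; two tests suffice here.
assert classify(1, 1) == "pos-x pos-y"
assert classify(-1, -1) == ""

# Path coverage: every combination of branch outcomes, four paths in all,
# so the two tests above must be joined by the two mixed cases.
assert classify(1, -1) == "pos-x "
assert classify(-1, 1) == "pos-y"
```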
Alpha and Beta Testing
• Alpha testing: performed by actual customers at the developer’s site
• Beta testing: performed by actual customers at their own site
Regression Testing
• Selective retesting of a system or component to verify that modifications have not caused unintended effects
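A minimal sketch of the idea in Python (the `shipping_cost` function and its values are hypothetical): a suite of input/expected-output pairs recorded from the previously accepted version is rerun after every modification.

```python
def shipping_cost(weight_kg: float) -> float:
    # Recently modified component: flat rate up to 1 kg, then 2.0 per kg.
    if weight_kg <= 1.0:
        return 5.0
    return 5.0 + (weight_kg - 1.0) * 2.0

# Retained regression suite: (input, expected output) pairs recorded from
# the previously accepted version of the component.
REGRESSION_SUITE = [(0.5, 5.0), (1.0, 5.0), (2.0, 7.0), (3.5, 10.0)]

for weight, expected in REGRESSION_SUITE:
    actual = shipping_cost(weight)
    assert actual == expected, f"regression at {weight} kg: got {actual}"
```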
Levels of Testing
• Unit testing
• Interface testing
• Integration testing
• System testing
• Acceptance testing
Unit Testing
• Testing of individual software components or groups of related components (see the sketch below)
• Interface testing: testing conducted to evaluate whether systems or components pass data and control correctly to one another
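As an illustration, here is a minimal unit test of a single component in isolation, using Python’s standard unittest module; the `Stack` class is hypothetical:

```python
import unittest

class Stack:
    """Component under test."""
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

class StackTest(unittest.TestCase):
    def test_pop_returns_most_recently_pushed_item(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)

    def test_pop_on_empty_stack_raises(self):
        with self.assertRaises(IndexError):
            Stack().pop()

if __name__ == "__main__":
    unittest.main()
```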
Integration Testing, System Testing
• Testing in which software components, hardware components, or both are combined and tested to evaluate the interaction between them (see the sketch below)
• System testing: testing conducted on a complete, integrated system to evaluate compliance with specified requirements
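A minimal integration-test sketch in Python (both components are hypothetical): a parser and an evaluator are combined, so the assertions exercise the interaction between them rather than either component alone.

```python
def parse(expr: str) -> tuple[int, str, int]:
    # Component 1: turn "2 + 3" into (2, "+", 3).
    left, op, right = expr.split()
    return int(left), op, int(right)

def evaluate(left: int, op: str, right: int) -> int:
    # Component 2: apply the operator.
    return left + right if op == "+" else left - right

def compute(expr: str) -> int:
    # Integration point: the output of parse feeds directly into evaluate.
    return evaluate(*parse(expr))

assert compute("2 + 3") == 5
assert compute("7 - 4") == 3
```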
Acceptance Testing
• Formal testing conducted to enable a user, customer, or other authorized entity to determine whether to accept a system or component
Testing Documentation (IEEE Std 829)
• Test Plan
• Test Design Description
• Test Case Specification
• Test Procedure Specification
• Test Item Transmittal Report
• Test Log
• Test Incident Report
• Test Summary
Test Plan
• A document describing the scope, approach, resources, and schedule of intended testing activities.
Test Design Description
• A document specifying the details of the testing approach for a software feature by identifying the associated tests
Test Case
• A set of test inputs, execution conditions, and expected results developed to verify compliance with a specific requirement or requirements
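One way to make these elements concrete is to lay them out as a record; the Python sketch below uses illustrative field names, not an IEEE-mandated schema.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class TestCase:
    requirement_id: str              # requirement this case verifies
    inputs: dict[str, Any]           # test inputs
    execution_conditions: list[str]  # environment/state the test assumes
    expected_result: Any             # predicted result

tc = TestCase(
    requirement_id="REQ-017",        # hypothetical requirement identifier
    inputs={"year": 2000},
    execution_conditions=["Gregorian calendar configured"],
    expected_result=True,
)
```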
Test Case Specification
• A document specifying inputs, predicted results, and a set of execution conditions for a test item.
Test Procedure Specification
• A document specifying a sequence of actions for the execution of a test
Test Item Transmittal Report
• A document identifying test items by their current status and location information
Test Log
• A chronological record of relevant details about the execution of tests
Test Incident Report
• A document reporting on any event that occurs during the testing process which requires investigation
Test Summary
• A document summarizing testing activities and results, containing an evaluation of the corresponding test items
Configuration Management
• Applying technical and administrative direction and surveillance to:
– Identify and document the functional and physical characteristics of a configuration item,
– Control changes to those characteristics,
– Record and report change processing and implementation status, and
– Verify compliance with specified requirements (IEEE)
Measure
• A way to ascertain or appraise value by comparing to a standard (IEEE)
Metric
• A quantitative measure of the degree to which a system, component, or process possesses a given attribute (IEEE)
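A commonly cited example of such a metric (not from the original notes) is defect density, the number of known defects normalized by product size:

```latex
\text{defect density} = \frac{\text{number of known defects}}{\text{size in KLOC}}
```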
Definitions
• Mistake: a human action that produces an incorrect result (IEEE)
• Fault: an incorrect step, process, or data definition in a computer program (IEEE)
• Failure: the inability of a system or component to perform its required functions within specified performance requirements (IEEE)
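The Python sketch below (values hypothetical) ties the three terms together: the programmer’s mistake leaves a fault in the code, which surfaces as a failure only when an input actually executes the faulty statement.

```python
def average(values: list[float]) -> float:
    total = 0.0
    for v in values:
        total += v
    # Fault: an incorrect step (the programmer's mistake), dividing by
    # len(values) - 1 instead of len(values).
    return total / (len(values) - 1)

# Failure: executing the fault yields 6.0 where the required result is 4.0.
print(average([2.0, 4.0, 6.0]))
```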
Definitions (cont.)
• Error: the difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition (IEC)
Verification and Validation
• The process of determining whether the requirements for a system or component are complete and correct, the products of each development phase fulfill the requirements or conditions imposed by the previous phase, and the final system or component complies with specified requirements.
Verification Definition
• The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase (IEEE)
Validation Definition
• The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements (IEEE)
Formal Method
• A mathematically sound approach to system specification or design that uses logical inference (a proof system) for verification purposes
Formal Methods vs. Testing
• Testing can only be applied after the program has been finished.
• A formal method can be applied before the program has even been written.
• Testing can only check the program for a finite number of input values.
• A formal method checks it for ALL values.
• Testing an embedded real-time system may never exercise certain execution paths, so errors on those paths go undetected.
• A formal method provides full coverage of such faults in advance (see the sketch below).
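The finite-versus-all-values contrast can be illustrated with a toy Python sketch: tests sample the input space, while an exhaustive check over a deliberately tiny bounded domain examines every value, the way a proof covers all inputs.

```python
def abs_val(x: int) -> int:
    return x if x >= 0 else -x

# Testing: confidence from a finite sample of input values.
for x in [-5, 0, 7]:
    assert abs_val(x) >= 0

# Exhaustive check over a small bounded domain: every value is examined,
# so the property is established for the whole domain rather than sampled.
assert all(abs_val(x) >= 0 for x in range(-1000, 1001))
```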
Engineering Perspective on Formal Methods
• From the engineering perspective, formal methods have to fit into the software life cycle and be compared to S/W development methodologies.
• A formal method involves:
– A notation, such as a specification language, to describe the system’s behavior
– A calculus to analyze and predict the system’s behavior, usually by proof
– (Hopefully) software tools to assist in automating the proof process
Formal Methods’ Main Categories
• Formal methods can be roughly categorized into two groups:
– Model-based (operational approaches)
– Property-based (descriptive approaches)
Model-based vs. Property-based
• Model-based techniques define the system in terms of states and transitions.
• Property-based techniques define the system by means of algebraic and/or logic equations.
Model-based Examples
• Model-based (operational) methods:
– Petri nets
– Timed automata
– Synchronous languages (Esterel)
– Statecharts
– ASTRAL
Property-based Techniques: Examples
• Property-based (descriptive) methods based on logic:
– Temporal logic
– CTL (Computation Tree Logic)
– RTTL (Real-Time Temporal Logic)
• Property-based (descriptive) methods based on algebras:
– VDM
– Z
– LOTOS
– CCS and CSP
– Process algebras
Formal Methods in Practice
• How formal methods work most efficiently in practice:
– Describe a system formally
– Define its desirable properties (usually in a different language)
– Verify the properties (hopefully, with an automatic tool)
Formal Methods and Real-time Systems
• A real-time system is usually described via an operational approach, but its properties are defined via a descriptive approach
Formal Approaches to Verification
• Formal approaches to verification:
– Model checking
– Theorem proving
Model Checking
• A technique that relies on building a finite model of a system, in a certain language, and checking that a desired property holds in that model
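The Python sketch below shows the idea in miniature (the model, a counter modulo 4, is purely illustrative): enumerate every reachable state of a finite model and check that a desired invariant holds in each one.

```python
from collections import deque

def transitions(state: int) -> list[int]:
    # Successor relation of the finite model: a counter modulo 4.
    return [(state + 1) % 4]

def check_invariant(initial: int, holds) -> bool:
    # Breadth-first search over all reachable states.
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        if not holds(s):
            return False        # violated: s is a counterexample state
        for t in transitions(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True                 # the property holds in every reachable state

print(check_invariant(0, lambda s: 0 <= s < 4))   # True
print(check_invariant(0, lambda s: s != 3))       # False: state 3 is reachable
```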
Theorem Proving
• A verification technique in which both the system and its desired properties are expressed as formulas in logic, and verification relies on deriving the properties from the axioms of the logic
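As a toy illustration in Lean 4 (not one of the tools listed later in these notes): the “system” is described by two hypotheses, the desired property is the formula q, and verification is a one-step derivation.

```lean
-- Hypotheses h : p → q and hp : p play the role of the system description;
-- the property q is derived from them by a single inference step.
theorem derive_q (p q : Prop) (h : p → q) (hp : p) : q :=
  h hp
```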
Fault Tolerance
• Software must detect as many software faults as possible
• Software must recover from as many faults as possible (see the sketch below)
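A minimal Python sketch of detect-and-recover (the sensor and the fallback value are hypothetical): the fault is detected where the exception is raised, and the system recovers by continuing with a safe default.

```python
def read_sensor() -> float:
    # Simulated faulty component for the example.
    raise IOError("sensor fault")

def read_temperature(fallback: float = 20.0) -> float:
    try:
        return read_sensor()    # detection: the fault surfaces here
    except IOError:
        return fallback         # recovery: continue with a safe default

print(read_temperature())       # 20.0: the system keeps running
```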
Automatic Tools
• PVS
• SMV
• Murphi
• Cabernet
• HyTech
• Uppaal
• TVS