Software testing: principles and practices / (Record no. 2671)

MARC details
000 -LEADER
fixed length control field 11313nam a2200157 4500
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number 9788177541218 (pb)
040 ## - CATALOGING SOURCE
Transcribing agency CUS
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 005.14
Item number DES/S
100 ## - MAIN ENTRY--PERSONAL NAME
Personal name Desikan, Srinivasan
245 ## - TITLE STATEMENT
Title Software testing: principles and practices /
Statement of responsibility, etc. Srinivasan Desikan and Gopalaswamy Ramesh
260 ## - PUBLICATION, DISTRIBUTION, ETC. (IMPRINT)
Place of publication, distribution, etc. Noida :
Name of publisher, distributor, etc. Pearson,
Date of publication, distribution, etc. 2006.
300 ## - PHYSICAL DESCRIPTION
Extent xviii, 486 p.
505 ## - FORMATTED CONTENTS NOTE
Formatted contents note 1.1 Context of Testing in Producing Software<br/>1.2 About this Chapter<br/>1.3 The Incomplete Car<br/>1.4 Dijkstra's Doctrine<br/>1.5 A Test in Time!<br/>1.6 The Cat and the Saint<br/>1.7 Test the Tests First!<br/>1.8 The Pesticide Paradox<br/>1.9 The Convoy and the Rags<br/>1.10 The Policemen on the Bridge<br/>1.11 The Ends of the Pendulum<br/>1.12 Men in Black<br/>1.13 Automation Syndrome<br/>1.14 Putting it All Together<br/>References<br/>Problems and Exercises<br/>2.1 Phases of Software Project<br/>2.1.1 Requirements Gathering and Analysis<br/>2.1.2 Planning<br/>2.1.3 Design<br/>2.1.4 Development or Coding<br/>2.1.5 Testing<br/>2.1.6 Deployment and Maintenance<br/>2.2 Quality, Quality Assurance, and Quality Control<br/>2.3 Testing, Verification, and Validation<br/>2.4 Process Model to Represent Different Phases<br/>2.5 Life Cycle Models<br/>2.5.1 Waterfall Model<br/>2.5.2 Prototyping and Rapid Application Development Models<br/>2.5.3 Spiral or Iterative Model<br/>2.5.4 The V Model<br/>2.5.5 Modified V Model<br/>2.5.6 Comparison of Various Life Cycle Models<br/>References<br/>Problems and Exercises<br/>3.1 What is White Box Testing?<br/>3.2 Static Testing<br/>3.2.1 Static Testing by Humans<br/>3.2.2 Static Analysis Tools<br/>3.3 Structural Testing<br/>3.3.1 Unit/Code Functional Testing<br/>3.3.2 Code Coverage Testing<br/>3.3.3 Code Complexity Testing<br/>3.4 Challenges in White Box Testing<br/>References<br/>Problems and Exercises<br/>4.1 What is Black Box Testing?<br/>4.2 Why Black Box Testing?<br/>4.3 When to do Black Box Testing?<br/>4.4 How to do Black Box Testing?<br/>4.4.1 Requirements Based Testing<br/>4.4.2 Positive and Negative Testing<br/>4.4.3 Boundary Value Analysis<br/>4.4.4 Decision Tables<br/>4.4.5 Equivalence Partitioning<br/>4.4.6 State Based or Graph Based Testing<br/>4.4.7 Compatibility Testing<br/>4.4.8 User Documentation Testing<br/>4.4.9 Domain Testing<br/>4.5 Conclusion<br/>References<br/>Problems and 
Exercises<br/>5.1 What is Integration Testing?<br/>5.2 Integration Testing as a Type of Testing<br/>5.2.1 Top-Down Integration<br/>5.2.2 Bottom-Up Integration<br/>5.2.3 Bi-Directional Integration<br/>5.2.4 System Integration<br/>5.2.5 Choosing Integration Method<br/>5.3 Integration Testing as a Phase of Testing<br/>5.4 Scenario Testing<br/>5.4.1 System Scenarios<br/>5.4.2 Use Case Scenarios<br/>5.5 Defect Bash<br/>5.5.1 Choosing the Frequency and Duration of Defect Bash<br/>5.5.2 Selecting the Right Product Build<br/>5.5.3 Communicating the Objective of Defect Bash<br/>5.5.4 Setting up and Monitoring the Lab<br/>5.5.5 Taking Actions and Fixing Issues<br/>5.5.6 Optimizing the Effort Involved in Defect Bash<br/>5.6 Conclusion<br/>References<br/>Problems and Exercises<br/>6.1 System Testing Overview<br/>6.2 Why is System Testing Done?<br/>6.3 Functional Versus Non-Functional Testing<br/>6.4 Functional System Testing<br/>6.4.1 Design/Architecture Verification<br/>6.4.2 Business Vertical Testing<br/>6.4.3 Deployment Testing<br/>6.4.4 Beta Testing<br/>6.4.5 Certification, Standards and Testing for Compliance<br/>6.5 Non-Functional Testing<br/>6.5.1 Setting up the Configuration<br/>6.5.2 Coming up with Entry/Exit Criteria<br/>6.5.3 Balancing Key Resources<br/>6.5.4 Scalability Testing<br/>6.5.5 Reliability Testing<br/>6.5.6 Stress Testing<br/>6.5.7 Interoperability Testing<br/>6.6 Acceptance Testing<br/>6.6.1 Acceptance Criteria<br/>6.6.2 Selecting Test Cases for Acceptance Testing<br/>6.6.3 Executing Acceptance Tests<br/>6.7 Summary of Testing Phases<br/>6.7.1 Multiphase Testing Model<br/>6.7.2 Working Across Multiple Releases<br/>6.7.3 Who Does What and When<br/>References<br/>Problems and Exercises<br/>7.1 Introduction<br/>7.2 Factors Governing Performance Testing<br/>7.3 Methodology for Performance Testing<br/>7.3.1 Collecting Requirements<br/>7.3.2 Writing Test Cases<br/>7.3.3 Automating Performance Test Cases<br/>7.3.4 Executing Performance Test Cases<br/>7.3.5 
Analyzing the Performance Test Results<br/>7.3.6 Performance Tuning<br/>7.3.7 Performance Benchmarking<br/>7.3.8 Capacity Planning<br/>7.4 Tools for Performance Testing<br/>7.5 Process for Performance Testing<br/>7.6 Challenges<br/>References<br/>Problems and Exercises<br/>8.1 What is Regression Testing?<br/>8.2 Types of Regression Testing<br/>8.3 When to do Regression Testing?<br/>8.4 How to do Regression Testing?<br/>8.4.1 Performing an Initial "Smoke" or "Sanity" Test<br/>8.4.2 Understanding the Criteria for Selecting the Test Cases<br/>8.4.3 Classifying Test Cases<br/>8.4.4 Methodology for Selecting Test Cases<br/>8.4.5 Resetting the Test Cases for Regression Testing<br/>8.4.6 Concluding the Results of Regression Testing<br/>8.5 Best Practices in Regression Testing<br/>References<br/>Problems and Exercises<br/>9.1 Introduction<br/>9.2 Primer on Internationalization<br/>9.2.1 Definition of Language<br/>9.2.2 Character Set<br/>9.2.3 Locale<br/>9.2.4 Terms Used in This Chapter<br/>9.3 Test Phases for Internationalization Testing<br/>9.4 Enabling Testing<br/>9.5 Locale Testing<br/>9.6 Internationalization Validation<br/>9.7 Fake Language Testing<br/>9.8 Language Testing<br/>9.9 Localization Testing<br/>9.10 Tools Used for Internationalization<br/>9.11 Challenges and Issues<br/>References<br/>Problems and Exercises<br/>10.1 Overview of Ad Hoc Testing<br/>10.2 Buddy Testing<br/>10.3 Pair Testing<br/>10.3.1 Situations When Pair Testing Becomes Ineffective<br/>10.4 Exploratory Testing<br/>10.4.1 Exploratory Testing Techniques<br/>10.5 Iterative Testing<br/>10.6 Agile and Extreme Testing<br/>10.6.1 XP Work Flow<br/>10.6.2 Summary with an Example<br/>10.7 Defect Seeding<br/>10.8 Conclusion<br/>References<br/>Problems and Exercises<br/>11.1 Introduction<br/>11.2 Primer on Object-Oriented Software<br/>11.3 Differences in OO Testing<br/>11.3.1 Unit Testing a set of Classes<br/>11.3.2 Putting Classes to Work Together—Integration Testing<br/>11.3.3 System Testing and 
Interoperability of OO Systems<br/>11.3.4 Regression Testing of OO Systems<br/>11.3.5 Tools for Testing of OO Systems<br/>11.3.6 Summary<br/>References<br/>Problems and Exercises<br/>12.1 What is Usability Testing?<br/>12.2 Approach to Usability<br/>12.3 When to do Usability Testing?<br/>12.4 How to Achieve Usability?<br/>12.5 Quality Factors for Usability<br/>12.6 Aesthetics Testing<br/>12.7 Accessibility Testing<br/>12.7.1 Basic Accessibility<br/>12.7.2 Product Accessibility<br/>12.8 Tools for Usability<br/>12.9 Usability Lab Setup<br/>12.10 Test Roles for Usability<br/>12.11 Summary<br/>References<br/>Problems and Exercises<br/>13.1 Perceptions and Misconceptions About Testing<br/>13.1.1 "Testing is not Technically Challenging"<br/>13.1.2 "Testing Does Not Provide me a Career Path or Growth"<br/>13.1.3 "I Am Put in Testing—What is Wrong With Me?!"<br/>13.1.4 "These Folks Are My Adversaries"<br/>13.1.5 "Testing is What I Can Do in the End if I Get Time"<br/>13.1.6 "There is no Sense of Ownership in Testing"<br/>13.1.7 "Testing is only Destructive"<br/>13.2 Comparison between Testing and Development Functions<br/>13.3 Providing Career Paths for Testing Professionals<br/>13.4 The Role of the Ecosystem and a Call for Action<br/>13.4.1 Role of Education System<br/>13.4.2 Role of Senior Management<br/>13.4.3 Role of the Community<br/>References<br/>Problems and Exercises<br/>14.1 Dimensions of Organization Structures<br/>14.2 Structures in Single-Product Companies<br/>14.2.1 Testing Team Structures for Single-Product Companies<br/>14.2.2 Component-Wise Testing Teams<br/>14.3 Structures for Multi-Product Companies<br/>14.3.1 Testing Teams as Part of "CTO's Office"<br/>14.3.2 Single Test Team for All Products<br/>14.3.3 Testing Teams Organized by Product<br/>14.3.4 Separate Testing Teams for Different Phases of Testing<br/>14.3.5 Hybrid Models<br/>14.4 Effects of Globalization and Geographically Distributed Teams on Product Testing<br/>14.4.1 Business Impact of 
Globalization<br/>14.4.2 Round the Clock Development/Testing Model<br/>14.4.3 Testing Competency Center Model<br/>14.4.4 Challenges in Global Teams<br/>14.5 Testing Services Organizations<br/>14.5.1 Business Need for Testing Services<br/>14.5.2 Differences between Testing as a Service and Product-Testing Organizations<br/>14.5.3 Typical Roles and Responsibilities of Testing Services Organization<br/>14.5.4 Challenges and Issues in Testing Services Organizations<br/>14.6 Success Factors for Testing Organizations<br/>References<br/>Problems and Exercises<br/>15.1 Introduction<br/>15.2 Test Planning<br/>15.2.1 Preparing a Test Plan<br/>15.2.2 Scope Management: Deciding Features to be Tested/Not Tested<br/>15.2.3 Deciding Test Approach/Strategy<br/>15.2.4 Setting up Criteria for Testing<br/>15.2.5 Identifying Responsibilities, Staffing, and Training Needs<br/>15.2.6 Identifying Resource Requirements<br/>15.2.7 Identifying Test Deliverables<br/>15.2.8 Testing Tasks: Size and Effort Estimation<br/>15.2.9 Activity Breakdown and Scheduling<br/>15.2.10 Communications Management<br/>15.2.11 Risk Management<br/>15.3 Test Management<br/>15.3.1 Choice of Standards<br/>15.3.2 Test Infrastructure Management<br/>15.3.3 Test People Management<br/>15.3.4 Integrating with Product Release<br/>15.4 Test Process<br/>15.4.1 Putting Together and Baselining a Test Plan<br/>15.4.2 Test Case Specification<br/>15.4.3 Update of Traceability Matrix<br/>15.4.4 Identifying Possible Candidates for Automation<br/>15.4.5 Developing and Baselining Test Cases<br/>15.4.6 Executing Test Cases and Keeping Traceability Matrix Current<br/>15.4.7 Collecting and Analyzing Metrics<br/>15.4.8 Preparing Test Summary Report<br/>15.4.9 Recommending Product Release Criteria<br/>15.5 Test Reporting<br/>15.5.1 Recommending Product Release<br/>15.6 Best Practices<br/>15.6.1 Process Related Best Practices<br/>15.6.2 People Related Best Practices<br/>15.6.3 Technology Related Best Practices<br/>Appendix A: Test 
Planning Checklist<br/>Appendix B: Test Plan Template<br/>References<br/>Problems and Exercises<br/>16.1 What is Test Automation?<br/>16.2 Terms Used in Automation<br/>16.3 Skills Needed for Automation<br/>16.4 What to Automate, Scope of Automation<br/>16.4.1 Identifying the Types of Testing Amenable to Automation<br/>16.4.2 Automating Areas Less Prone to Change<br/>16.4.3 Automate Tests that Pertain to Standards<br/>16.4.4 Management Aspects in Automation<br/>16.5 Design and Architecture for Automation<br/>16.5.1 External Modules<br/>16.5.2 Scenario and Configuration File Modules<br/>16.5.3 Test Cases and Test Framework Modules<br/>16.5.4 Tools and Results Modules<br/>16.5.5 Report Generator and Reports/Metrics Modules<br/>16.6 Generic Requirements for Test Tool/Framework<br/>16.7 Process Model for Automation<br/>16.8 Selecting a Test Tool<br/>16.8.1 Criteria for Selecting Test Tools<br/>16.8.2 Steps for Tool Selection and Deployment<br/>16.9 Automation for Extreme Programming Model<br/>16.10 Challenges in Automation<br/>16.11 Summary<br/>References<br/>Problems and Exercises<br/>17.1 What are Metrics and Measurements?<br/>17.2 Why Metrics in Testing?<br/>17.3 Types of Metrics<br/>17.4 Project Metrics<br/>17.4.1 Effort Variance (Planned vs Actual)<br/>17.4.2 Schedule Variance (Planned vs Actual)<br/>17.4.3 Effort Distribution Across Phases<br/>17.5 Progress Metrics<br/>17.5.1 Test Defect Metrics<br/>17.5.2 Development Defect Metrics<br/>17.6 Productivity Metrics<br/>17.6.1 Defects per 100 Hours of Testing<br/>17.6.2 Test Cases Executed per 100 Hours of Testing<br/>17.6.3 Test Cases Developed per 100 Hours of Testing<br/>17.6.4 Defects per 100 Test Cases<br/>17.6.5 Defects per 100 Failed Test Cases<br/>17.6.6 Test Phase Effectiveness<br/>17.6.7 Closed Defect Distribution<br/>17.7 Release Metrics<br/>17.8 Summary<br/>References<br/>Problems and Exercises
650 ## - SUBJECT
Keyword Computer Programming
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Koha item type General Books
Holdings
Home library Central Library, Sikkim University
Current library Central Library, Sikkim University
Shelving location General Book Section
Date acquired 13/06/2016
Full call number 005.14 DES/S
Accession number P19915
Date last seen 13/06/2016
Koha item type General Books