Measuring Software Quality in Software Engineering

By Dinesh Thakur

The aim of the software developer is to develop high-quality software within a specified time and budget. To achieve this, software should be developed according to the functional and performance requirements, documented development standards, and the characteristics expected of professionally developed software. Note that private metrics are collected by individual software engineers and then assimilated to obtain project-level measures. The main aim at the project level is to measure both errors and defects. These measures are used to derive metrics that provide insight into the efficacy of individual and group software quality assurance and control activities.

Many measures have been proposed for assessing software quality, such as interoperability, functionality, and so on. However, it has been observed that reliability, correctness, maintainability, integrity, and usability are the most useful, as they provide valuable indicators to the project team.

  • Reliability: The system or software should be able to maintain its performance level under given conditions. Reliability can be defined as the ability of the software product to perform its required functions under stated conditions for a specified period of time or for a specified number of operations. Reliability can be measured using Mean Time Between Failures (MTBF), which is the average time between successive failures. Two related measures are Mean Time To Repair (MTTR), the average time taken to restore the system after a failure occurs, and Mean Time To Failure (MTTF), which describes how long the software can be used before it fails. These two can be combined to calculate MTBF (a short sketch after this list illustrates the arithmetic), that is, MTBF = MTTF + MTTR.
  • Correctness: A system or software must function correctly. Correctness can be defined as the degree to which software performs its specified function. It can be measured in terms of defects per KDLOC (thousand delivered lines of code). For quality assessment, defects are counted over a specified period of time.
  • Maintainability: In software engineering, software maintenance is one of the most expensive and time-consuming activities. Maintainability can be defined as the ease with which a software product can be modified to correct errors, meet new requirements, make future maintenance easier, or adapt to a changed environment. Note that software maintainability is assessed using indirect measures such as Mean Time To Change (MTTC), which can be defined as the time taken to analyze a change request, design the modification, implement the change, test it, and distribute the change to all users. Generally, it has been observed that programs with a lower MTTC are easier to maintain.
  • Integrity: In the age of cyber-terrorism and hacking, software integrity has become an important factor in software development. Software integrity can be defined as the degree to which unauthorized access to the components of software (programs, data, and documents) can be controlled.

For measuring the integrity of software, two attributes are used: threat and security. Threat can be defined as the probability that an attack of a particular type will occur within a given time. Security is the probability that an attack of that type, if it occurs, will be repelled. Using these two attributes, integrity can be calculated with the following equation, where the summation is taken over all types of threat considered.

Integrity = ∑ [1 − (threat × (1 − security))]

  • Usability: Software that is easy to understand and easy to use is always preferred by the user. Usability can be defined as the capability of the software to be understood, learned, and used under specified conditions. Note that software which accomplishes all the user requirements but is not easy to use is often destined to fail.
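
As a rough illustration of how two of these measures can be computed, the following Python sketch combines hypothetical MTTF and MTTR figures into MTBF and applies the integrity equation above to a made-up threat profile. All names and numbers here are assumptions chosen for demonstration only.

def mtbf(mttf_hours, mttr_hours):
    """Mean Time Between Failures = MTTF + MTTR."""
    return mttf_hours + mttr_hours

def integrity(threat_profile):
    """Integrity = sum over threat types of 1 - (threat * (1 - security)).

    threat_profile is a list of (threat, security) probability pairs,
    one pair per type of threat considered.
    """
    return sum(1 - (threat * (1 - security)) for threat, security in threat_profile)

# Hypothetical reliability figures, in hours.
print(mtbf(mttf_hours=480.0, mttr_hours=8.0))   # 488.0

# Single hypothetical threat: 25% chance of attack, 95% chance it is repelled.
print(integrity([(0.25, 0.95)]))                # 1 - 0.25 * 0.05 = 0.9875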

In addition to the aforementioned measures, lack of conformance to software requirements should be avoided, as the requirements form the basis against which software quality is measured. Also, in order to achieve high quality, both explicit and implicit requirements should be considered.

Defect Removal Efficiency (DRE)

Defect removal efficiency (DRE) can be defined as a quality metric that is beneficial at both the project level and the process level. Quality assurance and control activities applied throughout software development are responsible for detecting errors introduced at the various phases of the SDLC. This ability to detect errors (the filtering ability) is measured with the help of DRE, which can be calculated using the following equation.

DRE = E/(E + D)

Where

E = number of errors found before the software is delivered to the user

D = number of defects found after the software is delivered to the user.

The value of DRE is 1 if no defects are found after the software is delivered. As the value of E increases for a given value of D, the overall value of DRE approaches 1. In other words, as more errors are discovered before the software is delivered to the user, fewer defects remain to be found afterwards. Striving for a high DRE improves software quality by encouraging the team to establish methods that detect the maximum number of errors before the software is delivered to the user.
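
A minimal Python sketch of this calculation, using hypothetical counts for E and D:

def dre(errors_before_delivery, defects_after_delivery):
    """Defect Removal Efficiency: DRE = E / (E + D)."""
    return errors_before_delivery / (errors_before_delivery + defects_after_delivery)

# Hypothetical data: 95 errors caught before release, 5 defects reported by users afterwards.
print(dre(95, 5))   # 0.95 -- the closer to 1, the better the filtering ability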

DRE can also be used at different phases of software development. It is used to assess the software team’s ability to find errors at each phase before they are passed on to the next development phase. When DRE is defined in the context of SDLC phases, it can be calculated by the following equation.

DREi = Ei / (Ei + Ei+1)

Where

Ei = number of errors found in phase i

Ei+1 = number of errors that were not detected in phase i but were found in phase i + 1.

The objective of the software team is to achieve a DREi value of 1; in other words, errors should be removed before they are passed on to the next phase.
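
As a sketch, the per-phase form can be computed the same way; the phase names and counts below are hypothetical:

def phase_dre(errors_found_in_phase, errors_leaked_to_next_phase):
    """Phase-wise DRE: DREi = Ei / (Ei + Ei+1)."""
    return errors_found_in_phase / (errors_found_in_phase + errors_leaked_to_next_phase)

# Hypothetical data: (phase, errors found in that phase, errors that slipped to the next phase)
phases = [("requirements", 40, 10), ("design", 30, 6), ("coding", 25, 5)]

for name, e_i, e_next in phases:
    print(name, round(phase_dre(e_i, e_next), 2))
# requirements 0.8, design 0.83, coding 0.83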
