Author: saqibkhan

  • Function Oriented Design

    Function-oriented design is an approach to software design in which the system is decomposed into a set of interacting units or modules, each with a clearly defined function. The system is thus designed from a functional viewpoint.

    Design Notations

    Design Notations are primarily meant to be used during the process of design and are used to represent design or design decisions. For a function-oriented design, the design can be represented graphically or mathematically by the following:


    Data Flow Diagram

    Data-flow design is concerned with designing a series of functional transformations that convert system inputs into the required outputs. The design is described using data-flow diagrams, which show how data flows through a system and how the output is derived from the input through a series of functional transformations.

    Data-flow diagrams are a useful and intuitive way of describing a system. They are generally understandable without specialized training, especially if control information is excluded. They show end-to-end processing: the flow of processing can be traced from the point where data enters the system to the point where it leaves.

    Data-flow design is an integral part of several design methods, and most CASE tools support data-flow diagram creation. Different methods may use different icons to represent data-flow diagram entities, but their meanings are similar.

    The notation used is based on a small set of symbols representing processes, data flows, data stores, and external entities.

    As an example, consider a report generator. It produces a report describing all of the named entities in a data-flow diagram. The user inputs the name of the design represented by the diagram. The report generator then finds all the names used in the data-flow diagram, looks them up in a data dictionary, and retrieves information about each name. This is then collated into a report which is output by the system.

    Data Dictionaries

    A data dictionary lists all data elements appearing in the DFD model of a system. The data items listed include all data flows and the contents of all data stores appearing on the DFDs in the DFD model of the system.

    A data dictionary lists the purpose of all data items and the definition of all composite data items in terms of their component data items. For example, a data dictionary entry may specify that the data item grossPay consists of the components regularPay and overtimePay.

                      grossPay = regularPay + overtimePay

    For the smallest units of data elements, the data dictionary lists their name and their type.
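    As a sketch (all names here are hypothetical), a data dictionary such as the grossPay entry above can be modelled as a simple mapping in which composite items list their components and the smallest data elements record only their type:

```python
# A minimal data dictionary sketch: composite entries list their component
# items; the smallest data elements record only their name and type.
data_dictionary = {
    "grossPay":    {"composed_of": ["regularPay", "overtimePay"]},
    "regularPay":  {"type": "decimal"},
    "overtimePay": {"type": "decimal"},
}

def components(item: str) -> list:
    """Return the component items of a composite entry, or the item itself."""
    entry = data_dictionary[item]
    return entry.get("composed_of", [item])

print(components("grossPay"))   # ['regularPay', 'overtimePay']
```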

    A data dictionary plays a significant role in any software development process because of the following reasons:

    • A data dictionary provides a standard language for all relevant information for use by engineers working on a project. A consistent vocabulary for data items is essential because, in large projects, different engineers tend to use different terms to refer to the same data, which unnecessarily causes confusion.
    • The data dictionary provides the analyst with a means to determine the definition of various data structures in terms of their component elements.

    Structured Charts

    A structure chart partitions a system into black boxes. A black box is a component whose functionality is known to the user without knowledge of its internal design: inputs go in and outputs come out, but how the transformation happens is hidden.

    A structure chart is a graphical representation which shows:

    • System partitions into modules
    • Hierarchy of component modules
    • The relation between processing modules
    • Interaction between modules
    • Information passed between modules

    The notations used in a structure chart include symbols for modules, module invocation, and the data and control information passed between modules.

    Pseudo-code

    Pseudo-code notation can be used in both the preliminary and detailed design phases. Using pseudo-code, the designer describes system characteristics in short, concise, English-language phrases structured by keywords such as If-Then-Else, While-Do, and End.
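    For illustration, a fragment of design pseudo-code in this style (the payroll names are hypothetical, echoing the data dictionary example above) might read:

```
WHILE employees remain DO
    IF overtimeHours > 0 THEN
        grossPay = regularPay + overtimePay
    ELSE
        grossPay = regularPay
    END-IF
END-WHILE
```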

  • Cohesion and Coupling in Software Engineering

    Introduction:

    The design and evaluation of software systems are heavily reliant on the concepts of cohesion and coupling. They describe the arrangement of, and communication between, the modules or constituents of a software system. Building software applications that are resilient, scalable, and maintainable requires an understanding of cohesion and coupling.

    Introduction to Cohesion and Coupling:

    The art of designing manageable and effective software components known as modularization is shaped by coupling and cohesion in software engineering.

    Module interdependence is defined by coupling, whereas component unity is measured by cohesion. Achieving high cohesion and low coupling encourages modular structures that are understandable and maintainable. This complementary relationship helps developers navigate complexity and improves testing, scalability, and teamwork. These guidelines affect project management and customer satisfaction throughout the whole software lifecycle.

    Module Coupling

    In software engineering, coupling is the degree of interdependence between software modules. Two modules that are tightly coupled are strongly dependent on each other, whereas two modules that are loosely coupled are only weakly dependent on each other. Uncoupled modules have no interdependence at all.

    The various types of coupling are described below.

    A good design is one that has low coupling. Coupling is measured by the number of relations between modules: it increases as the number of calls between modules grows or as the amount of shared data becomes large. A design with high coupling is therefore likely to have more errors.

    Types of Module Coupling


    1. No Direct Coupling: There is no direct coupling between M1 and M2. In this case, the modules are subordinate to different modules, so there is no direct coupling between them.

    2. Data Coupling: When data of one module is passed to another module, this is called data coupling.


    3. Stamp Coupling: Two modules are stamp coupled if they communicate using composite data items such as structures, objects, etc. When a module passes a non-global data structure or an entire structure to another module, they are said to be stamp coupled. For example, passing a structure variable in C or an object in C++ to a module.

    4. Control Coupling: Control coupling exists between two modules if data from one module is used to direct the order of instruction execution in the other.

    5. External Coupling: External coupling arises when two modules share an externally imposed data format, communication protocol, or device interface. This is related to communication with external tools and devices.

    6. Common Coupling: Two modules are common coupled if they share information through some global data items.


    7. Content Coupling: Content Coupling exists among two modules if they share code, e.g., a branch from one module into another module.
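    As an illustrative sketch in Python (all function and field names are invented for illustration), the difference between data, stamp, and control coupling can be seen in how much each interface exposes:

```python
# Illustrative sketches of three coupling types; all names are hypothetical.

# Data coupling: only elementary data items cross the interface.
def net_pay(gross_pay: float, tax: float) -> float:
    return gross_pay - tax

# Stamp coupling: a composite structure is passed, even though the callee
# uses only part of it.
def format_address(employee: dict) -> str:
    return f'{employee["street"]}, {employee["city"]}'

# Control coupling: a flag from the caller steers the callee's internal logic.
def print_report(items: list, summary_only: bool) -> str:
    if summary_only:
        return f"{len(items)} items"
    return ", ".join(str(i) for i in items)
```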

    Module Cohesion

    In computer programming, cohesion refers to the degree to which the elements of a module belong together. Thus, cohesion measures the strength of the relationships between pieces of functionality within a given module. In highly cohesive systems, for example, functionality is strongly related.

    Cohesion is an ordinal type of measurement and is generally described as “high cohesion” or “low cohesion.”


    Types of Module Cohesion

    1. Functional Cohesion: Functional cohesion is said to exist if the different elements of a module cooperate to achieve a single function.
    2. Sequential Cohesion: A module is said to possess sequential cohesion if the elements of the module form the components of a sequence, where the output from one component of the sequence is input to the next.
    3. Communicational Cohesion: A module is said to have communicational cohesion if all tasks of the module refer to or update the same data structure, e.g., the set of functions defined on an array or a stack.
    4. Procedural Cohesion: A module is said to have procedural cohesion if its functions are all parts of a procedure in which a particular sequence of steps has to be carried out to achieve a goal, e.g., the algorithm for decoding a message.
    5. Temporal Cohesion: When a module includes functions that are associated only by the fact that they must all be executed within the same span of time, the module is said to exhibit temporal cohesion.
    6. Logical Cohesion: A module is said to be logically cohesive if all the elements of the module perform a similar kind of operation, e.g., error handling, data input, or data output.
    7. Coincidental Cohesion: A module is said to have coincidental cohesion if it performs a set of tasks that are associated with each other very loosely, if at all.
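    To illustrate the two ends of this scale, here is a small Python sketch (hypothetical names): the first function is functionally cohesive, while the second groups unrelated tasks and is only coincidentally cohesive:

```python
# Functional cohesion: every element cooperates to achieve one function
# (computing gross pay; names are hypothetical).
def gross_pay(regular_pay: float, overtime_pay: float) -> float:
    return regular_pay + overtime_pay

# Coincidental cohesion: tasks grouped together only by accident; the
# string task and the numeric task have nothing to do with each other.
def misc_utilities(text: str, n: int):
    reversed_text = text[::-1]   # a string-handling task
    square = n * n               # an unrelated numeric task
    return reversed_text, square
```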

    Difference between Cohesion & Coupling:

    | Aspect           | Coupling                                                                | Cohesion                                                                  |
    |------------------|-------------------------------------------------------------------------|---------------------------------------------------------------------------|
    | Definition       | Level of interdependence among a system's modules or constituent parts. | Degree of focus and relatedness among the elements of a module.           |
    | Focus            | Interaction between modules.                                            | Elements that make up a module.                                           |
    | Impact on change | A change in one module may affect others.                               | Modifications are contained within a module.                              |
    | Flexibility      | High coupling decreases system flexibility, since changes tend to spread. | High cohesion increases system flexibility, as changes are localized.    |
    | Maintenance      | High coupling makes maintenance more difficult due to ripple effects.   | High cohesion makes maintenance easier, as changes are limited in scope.  |
    | Testing          | Isolating and testing coupled modules is more difficult.                | Cohesive modules' functionality is well contained, so testing is simpler. |
    | Reuse            | Coupled modules are less reusable because of their dependencies.        | Cohesive modules' clear, targeted functionality makes them more reusable. |
    | Dependency       | Coupling represents module dependency.                                  | Cohesion represents the purpose and unity of a module.                    |
    | Design goal      | Aim for low coupling to reduce interdependencies.                       | Strive for high cohesion so modules are clear and focused.                |
    | Objective        | Minimize dependencies and interactions for system stability.            | Group elements so they accomplish a clear goal.                           |
    | System impact    | High coupling can cause cascading failures and rigid architectures.     | High cohesion encourages adaptable, maintainable architectures.           |

    Conclusion:

    The quality and maintainability of software systems are greatly impacted by the fundamental software engineering concepts of cohesion and coupling. Strong module cohesion guarantees focused, unambiguous functionality, which facilitates the understanding, testing, and maintenance of code.

    Aiming for low coupling and high cohesion together results in systems that are more adaptable, resilient, and changeable. A well-designed software system achieves maintainability, reusability, and long-term success by striking a harmonious balance between coupling and cohesion. Software engineers can create systems that are not only functional but also flexible enough to adapt to changing user demands and technological breakthroughs by comprehending and putting these principles into practice.

  • Software Design Principles

    Software design principles are concerned with providing means to handle the complexity of the design process effectively. Effectively managing the complexity will not only reduce the effort needed for design but can also reduce the scope of introducing errors during design.

    Following are the principles of Software Design


    Problem Partitioning

    For a small problem, we can handle the entire problem at once, but for a significant problem the strategy is divide and conquer: split the problem into smaller pieces so that each piece can be tackled separately.

    For software design, the goal is to divide the problem into manageable pieces.

    Benefits of Problem Partitioning

    1. Software is easy to understand
    2. Software becomes simple
    3. Software is easy to test
    4. Software is easy to modify
    5. Software is easy to maintain
    6. Software is easy to expand

    These pieces cannot be entirely independent of each other as they together form the system. They have to cooperate and communicate to solve the problem. This communication adds complexity.

    Note: As the number of partitions increases, the cost of partitioning and the communication complexity also increase.


    Abstraction

    An abstraction is a tool that enables a designer to consider a component at an abstract level, without bothering about the internal details of its implementation. Abstraction can be used for existing elements as well as for the component being designed.

    Here, there are two common abstraction mechanisms

    1. Functional Abstraction
    2. Data Abstraction

    Functional Abstraction

    1. A module is specified by the function it performs.
    2. The details of the algorithm to accomplish the functions are not visible to the user of the function.

    Functional abstraction forms the basis for Function oriented design approaches.

    Data Abstraction

    Details of the data elements are not visible to the users of data. Data Abstraction forms the basis for Object Oriented design approaches.
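    A minimal sketch of data abstraction in Python, assuming a stack as the abstract data type: callers use push, pop, and peek without knowing that a list is used internally.

```python
# A stack as an abstract data type: users rely only on push/pop/peek and
# never touch the list hidden inside (illustrative sketch).
class Stack:
    def __init__(self):
        self._items = []   # internal representation, hidden from users

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def peek(self):
        return self._items[-1]

    def is_empty(self) -> bool:
        return not self._items

s = Stack()
s.push(1)
s.push(2)
print(s.pop())   # 2
```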


    Modularity

    Modularity refers to the division of software into separate modules which are separately named and addressed and are later integrated to obtain the completely functional software. It is the only property that allows a program to be intellectually manageable. Single large programs are difficult to understand and read due to the large number of reference variables, control paths, global variables, etc.

    The desirable properties of a modular system are:

    • Each module is a well-defined system that can be used with other applications.
    • Each module has single specified objectives.
    • Modules can be separately compiled and saved in the library.
    • Modules should be easier to use than to build.
    • Modules are simpler from outside than inside.

    Advantages and Disadvantages of Modularity

    In this topic, we will discuss the various advantages and disadvantages of modularity.

    Advantages of Modularity

    There are several advantages of Modularity

    • It allows large programs to be written by several or different people
    • It encourages the creation of commonly used routines to be placed in the library and used by other programs.
    • It simplifies the overlay procedure of loading a large program into main storage.
    • It provides more checkpoints to measure progress.
    • It provides a framework for complete testing and makes the system more accessible to test
    • It produces well-designed and more readable programs.

    Disadvantages of Modularity

    There are several disadvantages of Modularity

    • Execution time may be, though not necessarily, longer
    • Storage size may be, though not necessarily, increased
    • Compilation and loading time may be longer
    • Inter-module communication problems may be increased
    • More linkage required, run-time may be longer, more source lines must be written, and more documentation has to be done

    Modular Design

    Modular design reduces the design complexity and results in easier and faster implementation by allowing parallel development of various parts of a system. We discuss a different section of modular design in detail in this section:

    1. Functional Independence: Functional independence is achieved by developing functions that perform only one kind of task and do not excessively interact with other modules. Independence is important because it makes implementation more accessible and faster. The independent modules are easier to maintain, test, and reduce error propagation and can be reused in other programs as well. Thus, functional independence is a good design feature which ensures software quality.

    It is measured using two criteria:

    • Cohesion: It measures the relative function strength of a module.
    • Coupling: It measures the relative interdependence among modules.

    2. Information Hiding: The principle of information hiding suggests that modules should be characterized by design decisions that are hidden from all other modules. In other words, modules should be specified and designed so that the data contained within a module is inaccessible to other modules that have no need for it.

    The use of information hiding as a design criterion for modular systems provides its most significant benefits when modifications are required during testing and, later, during software maintenance. Because most data and procedures are hidden from other parts of the software, inadvertent errors introduced during modifications are less likely to propagate to different locations within the software.
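    As an illustrative sketch (the module data and function names are hypothetical), information hiding can be approximated in Python by keeping a module's data private by convention and exposing only operations on it:

```python
# Information hiding by convention: _cache is private to this module, and
# other modules are expected to use only the two functions below.
_cache = {}

def store(key, value):
    """Record a value; callers never see how it is kept."""
    _cache[key] = value

def lookup(key, default=None):
    """Retrieve a stored value without exposing the internal structure."""
    return _cache.get(key, default)
```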


    Strategy of Design

    A good system design strategy is to organize the program modules in such a way that they are easy to develop and, later, to change. Structured design methods help developers deal with the size and complexity of programs. Analysts generate instructions for the developers about how code should be composed and how pieces of code should fit together to form a program.

    To design a system, there are two possible approaches:

    1. Top-down Approach
    2. Bottom-up Approach

    1. Top-down Approach: This approach starts with the identification of the main components and then decomposing them into their more detailed sub-components.


    2. Bottom-up Approach: A bottom-up approach begins with the lowest-level details and moves up the hierarchy towards the top-level components. This approach is suitable when building on an existing system.

  • Software Design in Software Engineering

    Introduction:

    Software design is the process of constructing software methods, functions, and objects, along with the overall structure and interaction of your code, so that the resulting functionality satisfies the needs of your users. There are many different methods for designing software, and different developers prefer to do different amounts of design up front versus during the execution phase. In general, the overall design should be carefully considered and reviewed before coding starts. Early in the development cycle it is simpler to test various designs and identify issues than to make a big design change after the majority of the code has been written. To begin, let's review the definition of software design in software engineering.

    What is Software Design:

    Software design is a mechanism to transform user requirements into some suitable form, which helps the programmer in software coding and implementation. It deals with representing the client’s requirement, as described in SRS (Software Requirement Specification) document, into a form, i.e., easily implementable using programming language.

    The software design phase is the first step in the SDLC (Software Development Life Cycle) that moves the concentration from the problem domain to the solution domain. In software design, we consider the system to be a set of components or modules with clearly defined behaviors and boundaries.


    Objectives of Software Design

    Following are the objectives of software design:

    1. Correctness: The design should be correct as per the requirements.
    2. Completeness: The design should include all components, such as data structures, modules, and external interfaces.
    3. Efficiency: Resources should be used efficiently by the program.
    4. Flexibility: The design should be able to accommodate changing needs.
    5. Consistency: There should not be any inconsistency in the design.
    6. Maintainability: The design should be simple enough that it can be easily maintained by other designers.

    Software Design Levels:

    There are three levels of software design.

    1. Architecture Design:

    An architecture is the overall structure of a system and the way that structure provides conceptual integrity to the system. According to the architectural design, the software is a system made up of many interrelated parts. At this point, the designers obtain a broad understanding of the domain of the suggested solution.

    2. High-level design:

    By breaking down the architectural design's "single entity, multiple components" concept, the high-level design presents a less abstract view of subsystems and modules and illustrates how they interact with one another. Implementing the system and its components as modules is the focus of high-level design. It recognizes the modular structure of each subsystem as well as their connections and interactions with one another.

    3. Detailed Design:

    Following the completion of the high-level design, the detailed design process starts. At this stage of software design, every module is thoroughly examined to determine the data structures and algorithms that will be employed. Ultimately, the stage's results are recorded in a module specification document, which outlines each module's interface with other modules as well as its logical structure.

    Principles of Software Design:

    Let’s examine a few software design ideas that help a software engineer build a model of the system or software product that needs to be developed. Before creating a software system, the following concepts should be understood.

    Abstraction:

    Object-oriented programming (OOP) languages include abstraction as one of their core ideas. Its main goal is to handle complexity by keeping the user unaware of internal details. This enables the user to construct increasingly complex reasoning on top of the provided abstraction without needing to comprehend or even consider all of the hidden complexity.

    Modularity:

    Modularity means dividing a system or project into smaller parts in order to reduce its complexity. In design, modularity also means breaking a system up into smaller components that can be constructed separately and then used in different systems for various functions. It is often necessary to divide software into modules in order to deal with monolithic software, which is challenging for software engineers to understand. Modularity in design has consequently emerged as a crucial and popular trend.

    Architecture:

    The design choices pertaining to the overall structure and functionality of a system are represented by its software architecture. Architecture helps stakeholders understand and assess how the system will achieve important features like security, availability, and modifiability. It outlines the relationships and communication between the various parts that make up a software system. For the development team it serves as a foundation and a blueprint for the software application.

    Refinement:

    Refinement is the process of eliminating impurities and raising the quality of something. The idea behind software design refinement is building or presenting the software or system in an increasingly detailed way, elaborating on the system step by step. Refinement is also crucial for locating and fixing potential mistakes.

    Design Patterns:

    Within a specific software design context, a software design pattern is a general reusable solution to a problem that frequently arises. Representing some of the best techniques used by seasoned object-oriented software engineers they serve as templates to address typical software engineering issues. In object-oriented systems a design pattern is a general design that addresses a recurrent design problem in a methodical manner. It covers the issue, the solution, when to apply it, and the consequences. Also, it offers examples and implementation advice.

    Data (or) Information Hiding:


    In its most basic form, information hiding is the act of keeping information hidden from unauthorized parties. When designing software, information hiding is achieved by structuring modules so that data obtained or stored in one module is hidden from, and inaccessible to, other modules.

    Refactoring:

    Refactoring is the process of restructuring existing code without changing its external behavior. By making small adjustments that do not impact the code's external behavior, refactoring aims to improve the internal structure of the code. Programmers and software developers refactor code to enhance the software's implementation, structure, and design. Refactoring thus reduces complexity while improving code readability, and it can help software engineers find errors or weaknesses in their code.

    Conclusion:

    In software design, user requirements are transformed into a format that programmers can use for coding and implementation. It aims to transform the client's requirements, as stated in the Software Requirement Specification (SRS) document, into a form that is simple to implement using a programming language. During the software design phase of the Software Development Life Cycle (SDLC), the emphasis moves from the problem domain to the solution domain, providing guidance on how to fulfill the SRS's requirements. Correctness, completeness, efficiency, flexibility, consistency, and maintainability are the goals of the software design process.

  • Six Sigma

    Six Sigma is the process of improving the quality of output by identifying and eliminating the causes of defects and reducing variability in manufacturing and business processes. The maturity of a manufacturing process can be described by a sigma rating indicating the percentage of defect-free products it creates. A six sigma process is one in which 99.99966% of all opportunities to produce some feature of a component are statistically expected to be free of defects (3.4 defective features per million opportunities).
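    The 3.4 defects-per-million figure can be checked with a short calculation, assuming the conventional 1.5-sigma long-term shift, so that defects are outcomes beyond 4.5 standard deviations:

```python
# Verify the Six Sigma defect rate: with the conventional 1.5-sigma shift,
# defects are outcomes beyond 6.0 - 1.5 = 4.5 standard deviations.
from statistics import NormalDist

sigma_level = 6.0
long_term_shift = 1.5
defect_probability = 1 - NormalDist().cdf(sigma_level - long_term_shift)
dpmo = defect_probability * 1_000_000   # defects per million opportunities
print(round(dpmo, 1))   # 3.4
```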


    History of Six Sigma

    Six Sigma is a set of methods and tools for process improvement. It was introduced by engineer Bill Smith while working at Motorola in 1986. In the 1980s, Motorola was developing Quasar televisions, which were popular, but at the time they suffered from many defects due to picture-quality and sound variations.

    Using the same raw materials, machinery, and workforce, a Japanese firm took over Quasar television production, and within a few months it was producing Quasar TV sets with far fewer defects. This was achieved by improved management techniques.

    Six Sigma was adopted by Bob Galvin, the CEO of Motorola, in 1986 and registered as a Motorola trademark on December 28, 1993; Motorola subsequently became a quality leader.

    Characteristics of Six Sigma

    The Characteristics of Six Sigma are as follows:

    1. Statistical Quality Control: Six Sigma is derived from the Greek letter σ (sigma), which is used to denote standard deviation in statistics. Standard deviation is used to measure variance, an essential tool for measuring non-conformance as far as the quality of output is concerned.
    2. Methodical Approach: Six Sigma is not merely a quality improvement strategy in theory; it features a well-defined, systematic approach to application in DMAIC and DMADV, which can be used to improve the quality of production. DMAIC is an acronym for Define-Measure-Analyze-Improve-Control. The alternative method, DMADV, stands for Define-Measure-Analyze-Design-Verify.
    3. Fact and Data-Based Approach: The statistical and methodical aspects of Six Sigma show the scientific basis of the technique. This accentuates an essential element of Six Sigma: it is fact- and data-based.
    4. Project and Objective-Based Focus: The Six Sigma process is implemented for an organization's project, tailored to its specifications and requirements. The process is flexed to suit the requirements and conditions in which the project operates, to get the best results.
    5. Customer Focus: Customer focus is fundamental to the Six Sigma approach. The quality improvement and control standards are based on specific customer requirements.
    6. Teamwork Approach to Quality Management: The Six Sigma process requires organizations to get organized when it comes to controlling and improving quality. Six Sigma involves a lot of training, depending on the role of an individual in the quality management team.

    Six Sigma Methodologies

    Six Sigma projects follow two project methodologies:

    1. DMAIC
    2. DMADV

    DMAIC

    It specifies a data-driven quality strategy for improving processes. This methodology is used to enhance an existing business process.

    The DMAIC project methodology has five phases:

    1. Define: It covers the process mapping and flow-charting, project charter development, problem-solving tools, and so-called 7-M tools.
    2. Measure: It includes the principles of measurement, continuous and discrete data, and scales of measurement, an overview of the principle of variations and repeatability and reproducibility (RR) studies for continuous and discrete data.
    3. Analyze: It covers establishing a process baseline, how to determine process improvement goals, knowledge discovery, including descriptive and exploratory data analysis and data mining tools, the basic principle of Statistical Process Control (SPC), specialized control charts, process capability analysis, correlation and regression analysis, analysis of categorical data, and non-parametric statistical methods.
    4. Improve: It covers project management, risk assessment, process simulation, and design of experiments (DOE), robust design concepts, and process optimization.
    5. Control: It covers process control planning, using SPC for operational control and PRE-Control.

    DMADV

    It specifies a data-driven quality strategy for designing products and processes. This method is used to create new product or process designs in such a way that the result is a more predictable, mature, and defect-free performance.

    The DMADV project methodology has five phases:

    1. Define: It defines the problem or project goal that needs to be addressed.
    2. Measure: It measures and determines the customer’s needs and specifications.
    3. Analyze: It analyzes the process to meet customer needs.
    4. Design: It designs a process that will meet customer needs.
    5. Verify: It verifies the design's performance and its ability to meet customer needs.
  • People Capability Maturity Model

    PCMM is a maturity framework that focuses on continuously improving the management and development of the human assets of an organization.

    It defines an evolutionary improvement path from ad hoc, inconsistently performed practices to mature, disciplined, and continuously improving development of the knowledge, skills, and motivation of the workforce, which enhances strategic business performance.

    The People Capability Maturity Model (PCMM) is a framework that helps organizations successfully address their critical people issues. Based on the best current practice in fields such as human resources, knowledge management, and organizational development, the PCMM guides organizations in improving their processes for managing and developing their workforces.

    The People CMM defines an evolutionary improvement path from ad hoc, inconsistently performed workforce practices to a mature infrastructure of practices for continuously elevating workforce capability.

    The PCMM consists of five maturity levels that lay successive foundations for continuously improving talent, developing effective methods, and successfully directing the people assets of the organization. Each maturity level is a well-defined evolutionary plateau that institutionalizes a level of capability for developing the talent within the organization.

    The five maturity levels of the People CMM framework are:

    People Capability Maturity Model (PCMM)

    Initial Level: Maturity Level 1

    The Initial Level of maturity includes no process areas. Although the workforce practices of Maturity Level 1 organizations tend to be inconsistent or ritualistic, virtually all of these organizations perform processes that are defined in the Maturity Level 2 process areas.

    Managed Level: Maturity Level 2

    To achieve the Managed Level, Maturity Level 2, managers start to perform basic people management practices such as staffing, managing performance, and adjusting compensation as a repeatable management discipline. The organization establishes a culture, focused at the unit level, of ensuring that people can meet their work commitments. In achieving Maturity Level 2, the organization develops the capability to manage skills and performance at the unit level. The process areas at Maturity Level 2 are Staffing, Communication and Coordination, Work Environment, Performance Management, Training and Development, and Compensation.

    Defined Level: Maturity Level 3

    The fundamental objective of the Defined Level is to help an organization gain a competitive benefit from developing the various competencies that must be combined in its workforce to accomplish its business activities. These workforce competencies represent critical pillars supporting current and future business objectives; the improved workforce practices implemented at Maturity Level 3 therefore become crucial enablers of business strategy.

    Predictable Level: Maturity Level 4

    At the Predictable Level, the organization manages and exploits the capability developed by its framework of workforce competencies. The organization is now able to manage its capability and performance quantitatively. The organization can predict its capability for performing work because it can quantify the capability of its workforce and of the competency-based processes they use in performing their assignments.

    Optimizing Level: Maturity Level 5

    At the Optimizing Level, the entire organization is focused on continual improvement. These improvements are made to the capability of individuals and workgroups, to the performance of competency-based processes, and to workforce practices and activities.

  • Software Engineering Institute Capability Maturity Model (SEICMM)

    The Capability Maturity Model (CMM) is a methodology used to develop and refine an organization’s software development process.

    The model describes a five-level evolutionary path of increasingly organized and consistently more mature processes.

    CMM was developed and is promoted by the Software Engineering Institute (SEI), a research and development center sponsored by the U.S. Department of Defense (DOD).

    Capability Maturity Model is used as a benchmark to measure the maturity of an organization’s software process.

    Methods of SEICMM

    There are two methods of SEICMM:

    Software Engineering Institute Capability Maturity Model (SEICMM)

    Capability Evaluation: Capability evaluation provides a way to assess the software process capability of an organization. The results of a capability evaluation indicate the likely contractor performance if the contractor is awarded a contract. Therefore, the results of the software process capability assessment can be used to select a contractor.

    Software Process Assessment: Software process assessment is used by an organization to improve its process capability. Thus, this type of evaluation is for purely internal use.

    SEI CMM categorizes software development organizations into the following five maturity levels. The various levels of SEI CMM have been designed so that an organization can gradually build its quality system starting from scratch.

    Software Engineering Institute Capability Maturity Model (SEICMM)

    Level 1: Initial

    Ad hoc activities characterize a software development organization at this level. Very few or no processes are defined and followed. Since software production processes are not defined, different engineers follow their own processes and, as a result, development efforts become chaotic. Therefore, it is also called the chaotic level.

    Level 2: Repeatable

    At this level, fundamental project management practices like tracking cost and schedule are established. Size and cost estimation methods, like function point analysis, COCOMO, etc., are used.
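
    As an illustration of the estimation methods mentioned above, basic COCOMO predicts effort and development time from the estimated size in KLOC, using Boehm's published coefficients for the three project modes. This is a minimal sketch of the basic model only; in practice, an organization would calibrate the coefficients against its own historical project data.

```python
# Basic COCOMO: effort = a * KLOC^b (person-months),
#               duration = c * effort^d (months).
# Coefficients are Boehm's published values for the three project modes.

COCOMO_MODES = {
    # mode: (a, b, c, d)
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc: float, mode: str = "organic"):
    a, b, c, d = COCOMO_MODES[mode]
    effort = a * kloc ** b        # person-months
    duration = c * effort ** d    # months
    return effort, duration

effort, duration = basic_cocomo(32.0, "organic")
print(f"Effort: {effort:.1f} person-months, Duration: {duration:.1f} months")
```

    For a 32-KLOC organic-mode project, this gives roughly 91 person-months of effort over about 14 months.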

    Level 3: Defined

    At this level, the methods for both management and development activities are defined and documented. There is a common organization-wide understanding of processes, roles, and responsibilities. Although the processes are defined, the process and product qualities are not yet measured. ISO 9000 aims at achieving this level.

    Level 4: Managed

    At this level, the focus is on software metrics. Two kinds of metrics are collected.

    Product metrics measure the features of the product being developed, such as its size, reliability, time complexity, understandability, etc.

    Process metrics measure the effectiveness of the process being used, such as average defect correction time, productivity, the average number of defects found per hour of inspection, the average number of failures detected during testing per LOC, etc. The software process and product quality are measured, and quantitative quality requirements for the product are met. Various tools like Pareto charts, fishbone diagrams, etc., are used to measure the product and process quality. The process metrics are used to analyze whether a project performed satisfactorily. Thus, the outcome of process measurements is used to evaluate project performance rather than improve the process.
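
    As a concrete illustration, the sketch below computes two of the process metrics named above from a hypothetical defect log; the record layout and the sample data are assumptions made for this example only.

```python
# Two process metrics from above: average defect correction time and
# defects found per hour of inspection. Sample data is illustrative only.

defect_log = [
    # (hours_to_fix, found_during_inspection)
    (4.0, True), (2.5, True), (8.0, False), (1.5, True), (6.0, False),
]
inspection_hours = 10.0

avg_fix_time = sum(hours for hours, _ in defect_log) / len(defect_log)
inspection_defects = sum(1 for _, found in defect_log if found)
defects_per_inspection_hour = inspection_defects / inspection_hours

print(f"Average defect correction time: {avg_fix_time:.1f} hours")        # 4.4
print(f"Defects per hour of inspection: {defects_per_inspection_hour:.2f}")  # 0.30
```

    A Level 4 organization would track such measures across projects and compare them against quantitative quality targets.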

    Level 5: Optimizing

    At this level, process and product metrics are collected. Process and product measurement data are evaluated for continuous process improvement.

    Key Process Areas (KPA) of a software organization

    Except for SEI CMM Level 1, each maturity level is characterized by several Key Process Areas (KPAs) that identify the areas an organization should focus on to improve its software process to the next level. The focus of each level and the corresponding key process areas are shown in the fig.

    Software Engineering Institute Capability Maturity Model (SEICMM)

    SEI CMM provides a series of key areas on which to focus to take an organization from one level of maturity to the next. Thus, it provides a way for gradual quality improvement over several stages. Each stage has been carefully designed such that each one enhances the capability already built up.

  • ISO 9000 Certification

    ISO (International Organization for Standardization) is a group or consortium of 63 countries established to plan and foster standardization. ISO announced its 9000 series of standards in 1987. It serves as a reference for the contract between independent parties. The ISO 9000 standard specifies the guidelines for maintaining a quality system. The ISO standard mainly addresses operational methods and organizational methods such as responsibilities, reporting, etc. ISO 9000 defines a set of guidelines for the production process and is not directly concerned with the product itself.

    Types of ISO 9000 Quality Standards

    ISO 9000 Certification

    The ISO 9000 series of standards is based on the assumption that if a proper process is followed for production, then good quality products are bound to follow automatically. The types of industries to which the various ISO standards apply are as follows.

    1. ISO 9001: This standard applies to the organizations engaged in design, development, production, and servicing of goods. This is the standard that applies to most software development organizations.
    2. ISO 9002: This standard applies to those organizations which do not design products but are only involved in production. Examples of industries in this category include steel and car manufacturing industries that buy the product and plant designs from external sources and are engaged only in manufacturing those products. Therefore, ISO 9002 does not apply to software development organizations.
    3. ISO 9003: This standard applies to organizations that are involved only in the installation and testing of the products. For example, Gas companies.

    How to get ISO 9000 Certification?

    An organization that decides to obtain ISO 9000 certification applies to an ISO registrar's office for registration. The process consists of the following stages:

    ISO 9000 Certification
    1. Application: Once an organization decides to go for ISO certification, it applies to the registrar for registration.
    2. Pre-Assessment: During this stage, the registrar makes a rough assessment of the organization.
    3. Document review and adequacy audit: During this stage, the registrar reviews the documents submitted by the organization and suggests improvements.
    4. Compliance Audit: During this stage, the registrar checks whether the organization has complied with the suggestions made by it during the review or not.
    5. Registration: The Registrar awards the ISO certification after the successful completion of all the phases.
    6. Continued Inspection: The registrar continues to monitor the organization from time to time.
  • Software Quality

    The quality of a software product is defined in terms of its fitness of purpose. That is, a quality product does precisely what the users want it to do. For software products, fitness of purpose is generally interpreted in terms of satisfaction of the requirements laid down in the SRS document. Although “fitness of purpose” is a satisfactory interpretation of quality for many devices such as a car, a table fan, a grinding machine, etc., for software products, “fitness of purpose” is not a wholly satisfactory definition of quality.

    Example: Consider a functionally correct software product, i.e., one that performs all the tasks specified in the SRS document but has an almost unusable user interface. Even though it may be functionally correct, we cannot consider it to be a quality product.

    The modern view of quality associates several quality attributes with a software product, such as the following:

    Portability: A software product is said to be portable if it can readily be made to work in various operating system environments, on multiple machines, with other software products, etc.

    Usability: A software product has better usability if various categories of users can easily invoke the functions of the product.

    Reusability: A software product has excellent reusability if different modules of the product can quickly be reused to develop new products.

    Correctness: A software product is correct if various requirements as specified in the SRS document have been correctly implemented.

    Maintainability: A software product is maintainable if bugs can be easily corrected as and when they show up, new tasks can be easily added to the product, and the functionalities of the product can be easily modified, etc.

    Software Quality Management System

    A quality management system is the principal means used by organizations to ensure that the products they develop have the desired quality.

    A quality system consists of the following:

    Managerial Structure and Individual Responsibilities: A quality system is the responsibility of the organization as a whole. However, every organization has a separate quality department to perform various quality system activities. The quality system of an organization should have the support of the top management. Without support for the quality system at a high level in a company, few members of staff will take the quality system seriously.

    Quality System Activities: The quality system activities encompass the following:

    Auditing of projects

    Review of the quality system

    Development of standards, methods, and guidelines, etc.

    Production of documents for the top management summarizing the effectiveness of the quality system in the organization.

    Evolution of Quality Management System

    Quality systems have evolved considerably over the last five decades. Before World War II, the usual method of producing quality products was to inspect the finished products and remove the defective ones. Since that time, the quality systems of organizations have gone through four stages of evolution, as shown in the fig. The initial product inspection method gave way to quality control (QC).

    Quality control focuses not only on detecting defective products and removing them but also on determining the causes behind the defects. Thus, quality control aims at correcting the causes of defects and not just rejecting the defective products. The next breakthrough in quality methods was the development of quality assurance methods.

    The basic premise of modern quality assurance is that if an organization’s processes are proper and are followed rigorously, then the products are bound to be of good quality. The modern quality functions include guidance for recognizing, defining, analyzing, and improving the production process.

    Total quality management (TQM) advocates that the process followed by an organization must be continuously improved through process measurements. TQM goes a step further than quality assurance and aims at continuous process improvement. TQM goes beyond documenting processes to optimizing them through redesign. A term related to TQM is Business Process Reengineering (BPR).

    BPR aims at reengineering the way business is carried out in an organization. From the above discussion, it can be stated that over the years, the quality paradigm has shifted from product assurance to process assurance, as shown in fig.

    Software Quality
  • Project Monitoring and Control

    Monitoring and Controlling are the processes needed to track, review, and regulate the progress and performance of the project. This process group also identifies any areas where changes to the project management plan are required and initiates the corresponding changes.

    The Monitoring & Controlling process group includes eleven processes, which are:

    Project Monitoring and Control
    1. Monitor and control project work: The generic step under which all other monitoring and controlling activities fall.
    2. Perform integrated change control: The functions involved in making changes to the project plan. When changes to the schedule, cost, or any other area of the project management plan are necessary, the plan is changed and re-approved by the project sponsor.
    3. Validate scope: The activities involved with gaining approval of the project’s deliverables.
    4. Control scope: Ensuring that the scope of the project does not change and that unauthorized activities are not performed as part of the plan (scope creep).
    5. Control schedule: The functions involved with ensuring the project work is performed according to the schedule, and that project deadlines are met.
    6. Control costs: The tasks involved with ensuring the project costs stay within the approved budget.
    7. Control quality: Ensuring that the quality of the project’s deliverables is to the standard defined in the project management plan.
    8. Control communications: Providing for the communication needs of each project stakeholder.
    9. Control Risks: Safeguarding the project from unexpected events that negatively impact the project’s budget, schedule, stakeholder needs, or any other project success criteria.
    10. Control procurements: Ensuring the project’s subcontractors and vendors meet the project goals.
    11. Control stakeholder engagement: The tasks involved with ensuring that all of the project’s stakeholders are left satisfied with the project work.