Kimkorng

Software Engineering

02 JAN, 2024

Chapter 1: Software Engineering

Definition

  • Software Engineering (SE): A discipline that applies engineering principles to software development, ensuring reliable and efficient software products.
  • Components of SE: Programming languages, software design techniques, testing, maintenance, and development.

Software Quality Attributes

  • Portability: Compatibility across different environments and systems.
  • Usability: Ease of use for various users.
  • Reusability: Ability to use components in different applications.
  • Correctness: Accurate implementation of specified requirements.
  • Maintainability: Ease of fixing errors, adding features, and modifying functionalities.

Importance of SE

  • Ensures reliable, economical, and high-quality software systems.
  • Addresses large-scale software needs with proper processes and scientific concepts.
  • Reduces long-term costs and adapts to changing user requirements.

Differences Between SE and CS

  • Computer Science (CS): Focuses on theoretical and fundamental aspects of computing.
  • Software Engineering (SE): Emphasizes practicalities and methodologies for developing and delivering software.

Software Process Activities

  1. Specification: Defining the software and its constraints.
  2. Development: Designing and programming the software.
  3. Validation: Ensuring the software meets customer requirements.
  4. Evolution: Modifying the software to meet evolving needs.

SE vs. System Engineering

  • System Engineering: Involves physical products (e.g., buildings, cars) with tangible and replaceable parts.
  • Software Engineering: Deals with intangible software products that do not wear out and have no physical spare parts to replace.

Costs and Products

  • Software often incurs higher costs than hardware, especially in maintenance.
  • Generic Products: Marketed to any customer (e.g., PC software).
  • Customized Products: Tailored to specific customer needs (e.g., embedded control systems).

General Issues in SE

  • Heterogeneity: Need for software to operate across diverse systems and devices.
  • Business and Social Change: Rapid adaptation to evolving markets and technologies.
  • Security and Trust: Ensuring software reliability and security.

Software Engineering Diversity

  • Different application types (e.g., standalone applications, web-based systems) require varied engineering methods and tools.

Ethics and Professional Responsibility

  • Software engineers must adhere to ethical principles, ensuring confidentiality, competence, respect for intellectual property, and avoiding computer misuse.
  • The ACM/IEEE Code of Ethics provides guidelines for professional conduct.

Case Studies

  • Examples include systems like personal insulin pumps, mental health care management systems, and wilderness weather stations, illustrating real-world applications and challenges in software engineering.

Chapter 2: Software Processes

Software Process Concepts

  • Software Process: A set of activities required to develop a software system.
  • Software Process Model: An abstract representation of a process, describing the process from a particular perspective.

Generic Software Process Models

  1. Waterfall Model: Plan-driven with distinct phases (requirements, design, implementation, testing, maintenance).
  2. Incremental Development: Interleaves specification, development, and validation; can be plan-driven or agile.
  3. Reuse-oriented Software Engineering: Builds systems from existing components, involving component analysis, requirements modification, system design with reuse, and development and integration.

Process Framework

  • Framework Activities: Key tasks that guide the software development process.
  • Umbrella Activities: Support tasks like project management, risk management, quality assurance, configuration management, and reuse management.

Process Activities

  1. Specification: Define what the system should do (requirements engineering).
  2. Design and Implementation: Define system organization and translate design into executable code.
  3. Validation: Ensure the system meets customer requirements.
  4. Evolution: Modify the system to meet changing needs.

Requirements Engineering

  • Feasibility Study: Assess technical and financial viability.
  • Requirements Elicitation and Analysis: Identify stakeholder needs.
  • Requirements Specification: Detail the requirements.
  • Requirements Validation: Confirm the requirements' correctness.

Software Design and Implementation

  • Design Activities: Architectural design, component design, interface design, and database design.
  • Implementation: Transform the design into executable software.

Testing

  • Ensures software meets requirements.
  • Identifies and fixes defects.

Rational Unified Process (RUP)

  • Phases: Inception, elaboration, construction, and transition.
  • Disciplines: Business modeling, requirements, analysis and design, implementation, test, deployment, configuration and change management, project management, and environment.

Capability Maturity Model Integration (CMMI)

  • Levels of Maturity: Initial, repeatable, defined, managed, and optimizing.
  • Key Process Areas (KPAs): Goals, commitments, abilities, activities, monitoring methods, and verification methods.

Importance of Software Processes

  • Order and Discipline: Ensures all necessary activities are completed.
  • Knowledge Transfer: Captures and shares development knowledge.
  • Process Improvement: Incorporates best practices and enhances predictability and efficiency.

Software Process Models:

  1. Spiral Model:
    • Strengths: Emphasizes risk management, integrates quality assurance, no distinction between development and maintenance.
    • Weaknesses: Practical mainly for large projects; depends on skilled risk analysts; best suited to internal projects.
  2. WINWIN Spiral Model:
    • Focuses on stakeholder communication and agreement on "win-win" conditions.
  3. Component-Based Development (CBD):
    • Reuse Approach: Assumes reusability of experience and requires integration into development processes. Involves identifying, selecting, and modifying existing components to meet new requirements.
    • Development Processes: Differentiates between system development (using components) and component development.

Component Development Considerations:

  • Guidelines for Components: Must be well-specified, easy to understand, general, adaptable, and easy to deploy and replace.
  • Consumer Perspective: Considers lifecycle compatibility, updates, and effects of changes.
  • Producer Perspective: Focuses on business goals, functionality, maintenance, and compatibility with system requirements.

Component Lifecycle:

  1. Development with Components:
    • Emphasis on finding reusable units, evaluating effort for using them, and integrating them into the system.
  2. System Design: Evolutionary approach considering feasibility and compatibility of component combinations.
  3. System Implementation: Reduced to creating "glue-code" and adapting components (a minimal sketch follows this list).
  4. System Integration: Involves component adaptation, reconfiguration of assemblies, and managing emerging properties.
  5. Verification and Validation: Ensures components and the overall system meet specifications.
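To illustrate the "glue-code" and component adaptation mentioned in step 3, here is a minimal Python sketch; every component and interface name in it is invented for the example. A reused vendor component is wrapped so that it matches the interface the surrounding system was designed against:

  # Glue-code sketch (all names hypothetical): adapt a reused component to the
  # interface the surrounding system expects.
  from typing import Protocol

  class TemperatureSource(Protocol):      # interface the system was designed against
      def read_celsius(self) -> float: ...

  class VendorThermometer:                # reused off-the-shelf component (assumed API)
      def sample_fahrenheit(self) -> float:
          return 72.5

  class VendorThermometerAdapter:
      """Glue code: converts the vendor's Fahrenheit readings to Celsius."""
      def __init__(self, component: VendorThermometer) -> None:
          self._component = component

      def read_celsius(self) -> float:
          return (self._component.sample_fahrenheit() - 32.0) * 5.0 / 9.0

  def log_temperature(source: TemperatureSource) -> None:
      print(f"temperature: {source.read_celsius():.1f} C")

  log_temperature(VendorThermometerAdapter(VendorThermometer()))

The reused component stays untouched; only the thin adapter would change if the vendor's interface evolved.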

Operation and Maintenance:

  • Maintenance can be complex due to unclear support responsibilities and version incompatibilities.
  • System improvements can be made by updating or adding new components.

Disposal of Components:

  • Dissatisfaction: When a component no longer supports the system or is no longer needed.
  • Obsolescence: When maintenance and support cease due to the availability of better components or lack of producer support.

Component-Based Development Summary:

  • Component Development: Focuses on building reusable units.
  • System Development: Concentrates on evaluating and integrating components.

V-Shaped Software Development Life Cycle (SDLC) Model:

  • Features:
    • Emphasizes verification and validation.
    • Testing is planned in parallel with development phases.
  • Strengths: Tracks progress by milestones, ensures each deliverable is testable.
  • Weaknesses: Struggles with concurrent events, iterations, and dynamic requirements; lacks risk analysis.

Use Cases for V-Shaped Model:

  • Suitable for high-reliability systems (e.g., hospital patient control).
  • Best when all requirements are known upfront and solutions are established.

Chapter 3: Software Metrics

Terminology:

  • Measure: Quantitative indication of some attribute (e.g., number of errors in a review).
  • Measurement: The act of obtaining a measure.
  • Attribute: A property of software products or processes, either qualitative or quantitative.
  • Metric: A quantitative measure of the degree to which a system possesses a given attribute (e.g., number of errors per review).
  • Indicator: A metric or combination of metrics that provides insight into software processes, projects, or products.
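As a minimal sketch of how these terms relate, assuming some invented review data, the Python snippet below collects measures (error counts, reviewed code size), derives metrics from them (errors per review, errors per KLOC), and uses one metric as an indicator by comparing it with an assumed organizational baseline:

  # Sketch with invented data: measures vs. metrics vs. indicators.
  errors_per_review_session = [4, 7, 2, 5]   # measure: errors found in each review
  reviewed_kloc = 12.5                       # measure: thousands of lines of code reviewed

  # Metrics: quantities derived from the measures.
  total_errors = sum(errors_per_review_session)
  errors_per_review = total_errors / len(errors_per_review_session)
  error_density = total_errors / reviewed_kloc           # errors per KLOC

  # Indicator: a metric interpreted against an assumed organizational baseline.
  BASELINE_ERROR_DENSITY = 2.0   # assumed baseline (errors/KLOC), not from the source
  needs_attention = error_density > BASELINE_ERROR_DENSITY

  print(f"errors/review = {errors_per_review:.1f}, errors/KLOC = {error_density:.2f}")
  print("review process needs attention" if needs_attention else "within baseline")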

Measurement Principles:

  1. Formulation: Identifying appropriate software measures and metrics.
  2. Collection: Accumulating data needed to derive metrics.
  3. Analysis: Computing metrics and applying mathematical tools.
  4. Interpretation: Evaluating metrics to gain insights into software quality.
  5. Feedback: Making recommendations based on metrics evaluation to improve software development.

Importance of Software Metrics:

  • Metrics provide a way to determine whether a process is improving.
  • They help establish meaningful goals for improvement.
  • They provide a baseline for measuring improvements.
  • They help identify the causes of defects with the greatest impact on software development.

Characteristics of Effective Software Metrics:

  • Simple and computable
  • Empirically and intuitively persuasive
  • Consistent and objective
  • Consistent in the use of units and dimensions
  • Independent of programming language
  • Provide effective mechanisms for high-quality feedback.

Steps to Create Metrics:

  1. Define the goal of the metric.
  2. Identify the requirements for the metric.
  3. Collect necessary data.
  4. Determine the organizational baseline value for the metric.
  5. Review the metric for usability.

Types of Metrics:

  • Direct Metrics: Metrics that do not depend on other attributes. Examples include:
    • Understanding: Time for a new user to understand software features.
    • Ease of Learning: Time for a new user to learn how to perform basic tasks.

Chapter 4: Software Estimation Techniques

Software estimation is a critical aspect of project management, allowing teams to predict the resources required to complete a project. This chapter covers various estimation techniques, including algorithmic models and expert judgment methods. Here’s a concise overview of the key concepts:

Estimation Techniques

  1. Algorithmic Models:
    • COCOMO (COnstructive COst MOdel):
      • Basic COCOMO: Estimates effort and duration based on the size of the software project (see the sketch after this list).
        • Project Types: Organic (small teams, familiar environment), Semidetached (medium-sized projects, mix of experienced and inexperienced staff), Embedded (complex projects, large teams, high reliability).
      • Intermediate COCOMO: Adds factors for cost drivers like product attributes, hardware constraints, personnel quality, and project attributes.
      • Advanced COCOMO: Uses detailed cost driver attributes for each phase of the software development lifecycle.
    • COCOMO II: An updated version to handle modern software development practices including rapid development and reuse models. It includes modules for application composition, early design, and post-architecture.
  2. Source Lines of Code (SLOC):
    • Estimates effort based on the number of lines of code.
    • Divided into physical SLOC (actual lines of code written) and logical SLOC (number of statements).
  3. Function Points (FP):
    • Language-independent estimation method.
    • Measures functionality delivered to the user based on the complexity of inputs, outputs, user interactions, files, and interfaces.
    • Conversion ratios can be used to convert function points to SLOC for different programming languages.
  4. Expert Judgment:
    • Relies on the experience and intuition of experts.
    • Methods include the Delphi technique, in which experts estimate anonymously and iteratively until a consensus is reached.
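The Basic COCOMO relationships referred to above can be sketched as follows, using the commonly published coefficients for the three project types; the 32-KLOC organic project is an invented example:

  # Basic COCOMO sketch: effort = a * KLOC**b (person-months),
  # duration = c * effort**d (months), with the standard coefficients per project type.
  COEFFICIENTS = {
      "organic":      (2.4, 1.05, 2.5, 0.38),
      "semidetached": (3.0, 1.12, 2.5, 0.35),
      "embedded":     (3.6, 1.20, 2.5, 0.32),
  }

  def basic_cocomo(kloc: float, project_type: str) -> tuple[float, float]:
      a, b, c, d = COEFFICIENTS[project_type]
      effort = a * kloc ** b       # person-months
      duration = c * effort ** d   # months
      return effort, duration

  # Hypothetical 32-KLOC organic project:
  effort, duration = basic_cocomo(32.0, "organic")
  print(f"effort ~ {effort:.1f} person-months, duration ~ {duration:.1f} months")
  print(f"average staffing ~ {effort / duration:.1f} people")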

Comparison of Methods

  • COCOMO:
    • Advantages: Provides clear, detailed estimates; widely recognized and validated.
    • Disadvantages: Requires accurate historical data; complex to apply.
  • Function Points:
    • Advantages: Language-independent, focuses on user perspective.
    • Disadvantages: Can be subjective; hard to automate.
  • Expert Judgment:
    • Advantages: Quick and adaptable to unique projects.
    • Disadvantages: Highly dependent on the expertise and biases of the individuals.

Estimation Steps

  1. Determine the size of the project: Using SLOC or FP (see the sketch after these steps).
  2. Select the appropriate model and parameters: Based on project type and historical data.
  3. Calculate the effort: Using the chosen model’s formulae.
  4. Adjust for cost drivers and scale factors: For more refined estimates.
  5. Review and validate estimates: Through expert judgment or comparison with past projects.
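As a sketch of step 1 (sizing with function points), the snippet below computes unadjusted function points from weighted counts of the five standard function types and applies the value adjustment factor VAF = 0.65 + 0.01 × (sum of the 14 general system characteristic ratings). All counts, complexity choices, ratings, and the SLOC-per-FP conversion ratio are illustrative assumptions:

  # Function-point sketch: unadjusted FP from weighted counts, then
  # VAF = 0.65 + 0.01 * sum of the 14 general system characteristic ratings (0-5 each).
  WEIGHTS = {  # (simple, average, complex) weights per function type
      "external_inputs":          (3, 4, 6),
      "external_outputs":         (4, 5, 7),
      "external_inquiries":       (3, 4, 6),
      "internal_logical_files":   (7, 10, 15),
      "external_interface_files": (5, 7, 10),
  }
  COMPLEXITY = {"simple": 0, "average": 1, "complex": 2}

  # Invented counts: (count, complexity) per function type.
  counts = {
      "external_inputs":          (20, "average"),
      "external_outputs":         (12, "average"),
      "external_inquiries":       (8, "simple"),
      "internal_logical_files":   (6, "complex"),
      "external_interface_files": (2, "average"),
  }

  ufp = sum(n * WEIGHTS[kind][COMPLEXITY[cx]] for kind, (n, cx) in counts.items())

  gsc_ratings = [3] * 14               # assumed ratings for the 14 characteristics
  vaf = 0.65 + 0.01 * sum(gsc_ratings)
  adjusted_fp = ufp * vaf

  SLOC_PER_FP = 53                     # illustrative conversion ratio (often quoted for Java)
  print(f"UFP = {ufp}, adjusted FP = {adjusted_fp:.1f}, ~{adjusted_fp * SLOC_PER_FP:,.0f} SLOC")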

Chapter 5: Analysis Concepts & Principles

Objectives:

  • Understand user and system requirements and their distinct representations.
  • Differentiate between functional and non-functional software requirements.
  • Organize requirements within a software requirements document.
  • Grasp the activities of requirements engineering: elicitation, analysis, validation.
  • Recognize the necessity of requirements management.

Analysis:

  • Identifies system usage, users, and operational contexts.
  • Steps:
    1. Develop an analysis strategy.
    2. Gather requirements through various methods.
    3. Formulate a system proposal.

Principles of Analysis:

  1. Represent the information domain.
  2. Define software functions.
  3. Depict software behavior.
  4. Use models for information, functions, and behavior.
  5. Progress from essential information to detail.

Design:

  • Involves decisions on hardware, software, and infrastructure.
  • Steps:
    1. Design strategy: in-house development vs. outsourcing.
    2. Architecture design: user/system interfaces, forms, reports.
    3. Database & file specifications.
    4. Program paradigms.

Requirements Engineering:

  • The process of defining the services a system must provide and the constraints on its operation.
  • Activities: understanding the problem, prototyping, recording requirements, using multiple views, prioritizing, and eliminating ambiguity.

Types of Requirements:

  • User Requirements: Natural language statements plus diagrams for customers.
  • System Requirements: Detailed descriptions for implementation, forming part of contracts.

Functional vs. Non-functional Requirements:

  • Functional Requirements: System services, reactions to inputs, and specific behaviors.
  • Non-functional Requirements: System properties like reliability, response time, constraints on development processes, standards.
  • Domain Requirements: Constraints from the operational domain.

Requirements Documentation:

  • Describes what is required from system developers, not how to achieve it.
  • Agile methods often forgo extensive documentation, using incremental engineering and 'user stories.'

Structure of a Requirements Document:

  • Preface, Introduction, Glossary, User requirements, System architecture, System requirements specification, System models, System evolution, Appendices, Indexes.

Ways of Writing Requirements:

  • Natural Language: Simple, numbered sentences.
  • Structured Natural Language: Standard forms/templates.
  • Graphical Notations: UML diagrams.
  • Mathematical Specifications: Formal methods for unambiguous specifications.

Chapter 6: Analysis Modeling

Analysis Modeling Methods:

  1. Structured Analysis:
    • Classic modeling method.
    • Focuses on data flow and functional views.
  2. Object-Oriented Analysis:
    • Focuses on the system's object model rather than traditional data/functional views.

Model & Modeling:

  • Model: A simplified representation of a system to predict effects of changes.
  • Modeling: The process of creating a model, ensuring it closely approximates the real system without being overly complex.
  • Validation Techniques: Simulate the model with known inputs and compare outputs.

Basic Elements of Analysis Model:

  1. Data Dictionary:
    • Defines system data using text or symbols.
    • Lists all data items in data flow diagrams (DFDs) and their purposes.
  2. Data Model (ERD):
    • Identifies data objects and their relationships.
    • Illustrates data structure through entity-relationship diagrams.
  3. Functional Model (DFD):
    • Shows data transformation through a system.
    • Includes process specifications detailing each function.
  4. Behavioral Model (State Transition Diagram):
    • Depicts system behavior in response to external events.
    • Includes control specifications for software control aspects.

Data Modeling:

  • Objects: Described by attributes and manipulated within the software (e.g., user, report, event, role, organization unit).
  • Data Model Components:
    • Data Object: Represents composite information essential to software.
    • Data Attributes: Define properties of data objects (name, description, reference).
    • Relationships: Connections between data objects (one-to-one, one-to-many, many-to-many).

Entity Relationship Diagram (ERD):

  • Logical data representation.
  • Depicts relationships among system entities.
  • Entities: Main data objects (e.g., STUDENT, EMPLOYEE).
  • Attributes: Key (identify entities) and non-key (describe entities).
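To make the data-modeling terms above concrete, here is a small Python sketch with invented STUDENT and COURSE entities: each data object carries a key attribute and non-key attributes, and the enrollment lists implement a many-to-many relationship. An ERD itself is a diagram; the code only mirrors the concepts:

  # Sketch of data-modeling concepts with invented entities: key and non-key
  # attributes, and a many-to-many STUDENT-COURSE relationship.
  from dataclasses import dataclass, field

  @dataclass
  class Student:                                # entity: STUDENT
      student_id: str                           # key attribute (identifies the entity)
      name: str                                 # non-key attribute (describes the entity)
      enrolled_in: list["Course"] = field(default_factory=list, repr=False)

  @dataclass
  class Course:                                 # entity: COURSE
      course_id: str                            # key attribute
      title: str                                # non-key attribute
      students: list[Student] = field(default_factory=list, repr=False)

  def enroll(student: Student, course: Course) -> None:
      """Maintain both sides of the many-to-many relationship."""
      student.enrolled_in.append(course)
      course.students.append(student)

  alice = Student("S001", "Alice")
  se101 = Course("SE101", "Software Engineering")
  enroll(alice, se101)
  print(alice.name, "is enrolled in", [c.title for c in alice.enrolled_in])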

Functional Modeling (DFD):

  • DFD Elements:
    • External Entity: Producer/consumer of data outside the system.
    • Process/Activity: Transforms input data to output.
    • Data Flow: Indicates movement of data within the system.
    • Data Store: Storage for permanent data, representing databases/files.

DFD Rules:

  • Inputs differ from outputs.
  • Processes cannot have only inputs or only outputs.
  • Data must flow through the system meaningfully, adhering to defined rules.

Behavioral Models:

  • Model system's dynamic behavior during execution.
  • Stimuli Types: Data (processed by the system) and events (trigger system processing).

State Transition Diagram (STD):

  • Elements:
    • State: Period where system behavior is stable.
    • Initial State: Starting point for new objects.
    • Final State: End point where the object goes out of existence.
    • Event: Significant moments affecting object behavior.
    • Transition: Valid state progressions in response to events.
    • Self-Transition: Source and target states are the same.
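The STD elements above can be illustrated with a small table-driven state machine in Python; the order-lifecycle states and events are invented for the example, with "created" as the initial state, "closed" as the final state, and a self-transition on the "remind" event:

  # Table-driven state-machine sketch for an invented order lifecycle, showing
  # states, events, transitions, a self-transition, and a final state.
  TRANSITIONS = {
      ("created",  "submit"):  "pending",    # transition triggered by an event
      ("pending",  "remind"):  "pending",    # self-transition (same source and target)
      ("pending",  "approve"): "approved",
      ("pending",  "reject"):  "closed",
      ("approved", "ship"):    "closed",
  }
  INITIAL_STATE = "created"                  # starting point for new objects
  FINAL_STATES = {"closed"}                  # the object goes out of existence here

  def next_state(state: str, event: str) -> str:
      """Return the target state, or raise if the event is invalid in this state."""
      try:
          return TRANSITIONS[(state, event)]
      except KeyError:
          raise ValueError(f"event '{event}' not allowed in state '{state}'") from None

  state = INITIAL_STATE
  for event in ["submit", "remind", "approve", "ship"]:
      state = next_state(state, event)
      print(f"after {event!r}: state = {state!r}")
  print("reached final state" if state in FINAL_STATES else "still active")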