Software Development SDLC
Objectives
After studying this chapter, the learner will be able to:
Introduction
In today’s rapidly changing technological world, software development teams must be agile to
adapt to changing requirements and goals. The system development life cycle offers a flexible
framework for software development that can be adapted to the specific needs of different
projects and organisations. By following it, software development teams can ensure that their
software is of high quality, meets user requirements and can be developed and maintained in a
cost-effective manner.
System
A system is a collection of interconnected components or elements that work together to achieve
a common goal or purpose. It refers to a combination of hardware, software, data, and processes
that work together to perform specific functions or tasks. A system can be designed to automate
business processes, manage data, provide services or facilitate communication and collaboration.
Characteristics of a system
Task
Identify a system and outline its entities.
Systems Analysis
Systems analysis focuses on understanding and defining the requirements and objectives of a
software system. It involves studying the existing system and identifying the needs and problems
that the new system should address; this work is carried out by a system analyst.
A system analyst plays a critical role in the development and implementation of information
systems within an organization. Some of the key duties of a system analyst include:
Requirement Gathering: The system analyst works closely with stakeholders, such as
users, managers, and IT staff, to understand their needs and gather requirements for the
system. This involves conducting interviews, workshops, and analyzing existing
processes and systems.
System Design: Based on the gathered requirements, the system analyst designs the
structure and functionality of the information system. This includes creating system
models, flowcharts, and diagrams to illustrate the system's architecture, data flow, and
user interfaces.
Feasibility Analysis: The system analyst assesses the feasibility of implementing the
proposed system. This involves evaluating technical, economic, and operational factors to
determine if the system is viable and aligns with the organization's goals and resources.
System Development: The system analyst collaborates with developers and programmers
to oversee the development and implementation of the system. They provide guidance,
clarify requirements, and ensure that the system is being built according to specifications.
Testing and Quality Assurance: The system analyst is involved in testing the system to
ensure that it functions correctly and meets the requirements. They develop test plans,
conduct testing, and work with users to gather feedback and address any issues or bugs.
User Training and Support: The system analyst assists in training users on how to use the
system effectively. They may develop user manuals, conduct training sessions, and
provide ongoing support to address user questions or concerns.
System Documentation: The system analyst is responsible for documenting the system's
design, requirements, processes, and user manuals. This documentation serves as a
reference for future maintenance, upgrades, and troubleshooting.
Project Management: In some cases, the system analyst may also be involved in project
management activities, such as defining project scope, creating project plans,
coordinating resources, and monitoring progress to ensure timely and successful system
implementation.
Overall, the duties of a system analyst involve understanding user needs, designing system
solutions, overseeing development, ensuring quality, providing user support, and documenting
the system for future reference. They serve as a bridge between business stakeholders and
technical teams to ensure the successful implementation of information systems.
A good analyst possesses several key qualities:
Strong Analytical Skills: They have the ability to break down complex problems, gather
and interpret data, and draw meaningful insights.
Attention to Detail: They pay close attention to details and ensure accuracy in their
analysis.
Critical Thinking: They can think objectively, evaluate information from multiple
perspectives, and make informed decisions.
Domain Knowledge: They have a deep understanding of the industry or subject matter
they are analyzing, enabling them to provide valuable insights.
Communication Skills: They can effectively communicate complex concepts and
findings to both technical and non-technical stakeholders.
Problem-Solving Abilities: They can identify problems, propose solutions, and
implement strategies to address them.
Curiosity and Learning Mindset: They are curious, constantly seeking new knowledge
and staying updated with the latest trends and technologies in their field.
The System Development Life Cycle (SDLC) is a structured approach used in software
development to guide the process of building and maintaining information systems. It consists of
a series of phases that help ensure the successful completion of a software project. The typical
phases of the SDLC include requirements gathering, system analysis, system design, coding,
testing, deployment, and maintenance. Each phase has specific goals and deliverables, and the
process is usually iterative, meaning that feedback and improvements are incorporated
throughout the development cycle. The SDLC provides a framework for managing and
controlling the development process, ensuring that the final system meets the needs of the users
and stakeholders.
The SDLC is a phased approach to analysis and design that holds that systems are best developed
through the use of a specific cycle of analyst and user activities.
Analysts vary on the number of phases in the SDLC, but here we have divided the
cycle into seven phases. Although each phase is presented discretely, it is never
accomplished as a separate step. Instead, several activities can occur simultaneously, and
activities may be repeated. It is more useful to think of the SDLC as accomplished in
phases (with activities in full swing overlapping with others and then tapering off) and not
in separate steps.
Developing a system throughout the SDLC involves continual and clear communication
among the users and system personnel, the professionals responsible for designing and
implementing the information system.
Preliminary investigation
The system analyst is mainly concerned with what is occurring in the organisation and with
identifying its problems.
This stage is critical to the rest of the project because no one wants to waste subsequent
time addressing the wrong problem.
The people involved in the preliminary investigation stage include the users, the system
analyst and the system managers coordinating the project.
Feasibility study
A feasibility study is a systematic analysis and evaluation of the potential success and viability of
a proposed project or business venture. It aims to determine whether the project is technically,
financially, and operationally feasible. A typical feasibility study covers the following areas:
Project Description: Clearly define the objectives, scope, and purpose of the project.
Market Analysis: Assess the target market, competition, and demand for the product or
service.
Technical Feasibility: Evaluate the technical requirements, resources, and capabilities
needed to implement the project.
Financial Feasibility: Conduct a comprehensive financial analysis, including cost
estimation, revenue projections, and return on investment (ROI) calculations (a short
worked example appears below).
Operational Feasibility: Determine the operational requirements, including staffing,
infrastructure, and processes, to ensure the project can be effectively executed.
Legal and Regulatory Considerations: Identify any legal or regulatory constraints that
may impact the project.
Risk Assessment: Identify potential risks and uncertainties associated with the project
and develop risk mitigation strategies.
Environmental Impact: Assess the potential environmental impact of the project and any
necessary measures to mitigate adverse effects.
Conclusion and Recommendations: Summarize the findings of the feasibility study and
provide recommendations on whether to proceed with the project.
A well-conducted feasibility study provides valuable insights and helps stakeholders make
informed decisions about the viability and potential success of a project.
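To make the financial feasibility item above concrete, here is a minimal sketch of a payback and
ROI calculation. All of the figures are hypothetical assumptions chosen only for illustration; a
real study would use estimates gathered during the feasibility analysis.

# Minimal financial feasibility sketch - all figures are illustrative assumptions.
development_cost = 50_000      # one-off cost to build the system
annual_running_cost = 5_000    # yearly maintenance and operations
annual_benefit = 25_000        # yearly savings or extra revenue expected
years = 5                      # evaluation horizon

total_cost = development_cost + annual_running_cost * years
total_benefit = annual_benefit * years

roi = (total_benefit - total_cost) / total_cost * 100
payback_years = development_cost / (annual_benefit - annual_running_cost)

print(f"Total cost over {years} years: {total_cost}")
print(f"Total benefit over {years} years: {total_benefit}")
print(f"ROI: {roi:.1f}%")
print(f"Simple payback period: {payback_years:.1f} years")

With these assumed figures the project returns roughly 67% over five years and pays back its
development cost in about two and a half years; different assumptions would change the outcome.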
Social feasibility
Social feasibility refers to the assessment of whether a proposed project or system is acceptable
and compatible with the social and cultural context in which it will be implemented. It involves
evaluating the potential social impacts, benefits, and risks associated with the project.
Some key considerations in assessing social feasibility include:
By considering social feasibility, organizations can ensure that their projects are socially
responsible, culturally appropriate, and accepted by the communities they impact. It helps
minimize potential conflicts, enhance project success, and promote positive social outcomes.
Schedule feasibility
Schedule feasibility refers to the assessment of whether a proposed project or system can be
completed within the desired timeframe. It involves evaluating the project's timeline, milestones,
and deadlines to determine if they are realistic and achievable.
When assessing schedule feasibility, several factors need to be considered:
By evaluating these factors, a project manager or system analyst can determine if the proposed
schedule is feasible or if adjustments need to be made to ensure a realistic and achievable
timeline for project completion.
Analysis stage
During this phase of the SDLC, the analyst strives to understand what information users need to
perform their jobs. Several methods for determining information requirements involves
interacting directly with users. This phase serves to fill in the picture that the analyst has of the
organisation and its objectives.
The analysis stage is a crucial phase in the System Development Life Cycle (SDLC) where the
existing system is examined and analyzed to identify the requirements for a new or improved
system. The main objective of system analysis is to understand the current system, its strengths,
weaknesses, and areas for improvement.
Data collection
Data collection refers to the process of gathering and acquiring information or data from various
sources.
There are various types of data collection methods, including:
Sensor data collection: Collecting data from sensors or IoT devices.
Social media monitoring: Gathering data from social media platforms.
Web scraping: Extracting data from websites using automated tools.
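As a brief illustration of the web scraping method mentioned above, the sketch below collects the
level-2 headings from a web page using the requests and BeautifulSoup libraries. The URL is a
placeholder, and the sketch assumes the target site permits automated collection.

# Minimal web-scraping sketch (assumes the requests and beautifulsoup4
# packages are installed and that the target site allows scraping).
import requests
from bs4 import BeautifulSoup

url = "https://example.com"  # placeholder URL for illustration

response = requests.get(url, timeout=10)
response.raise_for_status()  # stop if the page could not be fetched

soup = BeautifulSoup(response.text, "html.parser")

# Collect the text of every level-2 heading on the page.
headings = [h.get_text(strip=True) for h in soup.find_all("h2")]
for heading in headings:
    print(heading)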
Questionnaires:
It is a research instrument or tool used to collect information from individuals or groups of
people. It consists of a set of structured questions that are designed to elicit specific responses
from participants.
Advantages:
Disadvantages:
Limited depth: Questionnaires may not allow for in-depth exploration of complex
issues or individual perspectives.
Response bias: Respondents may provide inaccurate or incomplete answers due to
misunderstanding or social desirability bias.
Lack of clarification: Questionnaires do not provide an opportunity for respondents to
seek clarification on unclear questions.
Low response rate: It can be challenging to motivate respondents to complete and
return questionnaires, resulting in a low response rate.
Interviews:
An interview is a formal or informal conversation between two or more individuals, typically
conducted with a specific purpose or objective. Interviews can be conducted in various formats,
including face-to-face interviews, phone interviews, or online interviews.
Advantages:
Rich data: Interviews allow for in-depth exploration of topics, capturing detailed
information and personal experiences.
Flexibility: Interviewers can adapt the questions and probe further based on the
respondent's answers, allowing for a more nuanced understanding.
Clarification: Respondents can seek clarification on questions, leading to more
accurate responses.
Non-verbal cues: Interviews enable the observation of non-verbal cues and emotions,
providing additional insights.
Disadvantages:
Both questionnaires and interviews have their strengths and limitations, and the choice
between them depends on the research objectives, resources, and the nature of the data being
collected.
Observation
It refers to the act of carefully and attentively watching, perceiving, and noting details or
events in order to gather information or gain understanding.
Advantages of Observation:
Disadvantages of Observation:
Observer bias: The presence of an observer may alter the behavior of participants, leading
to observer bias. Participants may modify their behavior consciously or unconsciously,
leading to distorted or inaccurate data.
Limited generalizability: Observational research often focuses on specific individuals or
groups in specific contexts. Therefore, the findings may have limited generalizability to
other populations or settings. It is important to carefully consider the context and
limitations of the observations.
Time and resource-intensive: Observation can be a time-consuming research method,
requiring trained observers, extensive data collection, and analysis. It may also require
access to specific settings or individuals, which can be resource-intensive.
Ethical considerations: Observational research raises ethical concerns, particularly
regarding privacy and informed consent. Researchers must ensure that participants' rights
and confidentiality are protected, and ethical guidelines are followed.
Limited access to internal thoughts and motivations: Although observation can provide
valuable insights into behaviors and actions, it does not directly access participants'
internal thoughts, motivations, or emotions. Supplementing observation with other
research methods may be necessary to gain a more comprehensive understanding.
Record inspection
Record inspection, also known as document inspection or document review, refers to the process
of examining records or documents for various purposes.
Disadvantages:
Time and Resource Intensive: Inspecting records can be a time-consuming and resource-
intensive process, especially when dealing with a large volume of documents. It requires
careful review and analysis, which may demand significant manpower and resources.
Limited Scope: Record inspection is limited to the information contained within the
documents being reviewed. It may not provide a comprehensive understanding of the
entire context or circumstances surrounding the recorded information.
Potential Bias: The interpretation of records during inspection can be influenced by
personal biases or subjective judgments. This may introduce potential inaccuracies or
misinterpretations of the information.
Data Privacy Concerns: Record inspection may involve accessing sensitive or
confidential information. It is important to handle and protect such data in accordance
with applicable privacy laws and regulations.
Incomplete or Missing Information: Records may not always contain all the necessary or
desired information. Some information may be incomplete, outdated, or missing, which
can limit the effectiveness of record inspection.
It is important to consider these advantages and disadvantages when conducting record
inspection and to ensure proper procedures and safeguards are in place to address any
potential limitations or risks.
These methods can be used individually or in combination to collect different types of data
for research, analysis, or decision-making purposes.
Dataflow diagrams
Data Flow Diagrams (DFDs) are graphical representations that depict the flow of data within a
system or process. They are used to visualize how data moves between different components of a
system, including inputs, outputs, processes, and data stores. Here are the key elements of a data
flow diagram:
1. Processes: Represented by circles or rounded rectangles, processes show where incoming data
flows are transformed into outgoing data flows within the system.
2. Data Flows: Represented by arrows, data flows depict the movement of data between
processes, external entities, and data stores. They show the direction and flow of data within the
system.
3. External Entities: Represented by squares or rectangles, external entities are the people,
organisations, or other systems outside the system boundary that send data to it or receive data
from it.
4. Data Stores: Represented by rectangles with parallel lines, data stores are repositories where
data is stored within the system. They can be databases, files, or any other storage medium.
DFDs help in understanding the overall data flow and relationships within a system. They
provide a visual representation of how data is processed, stored, and exchanged, making it easier
to identify potential bottlenecks, redundancies, or areas for improvement. DFDs can be used
during system analysis, design, and documentation phases of software development projects.
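DFDs are normally drawn with a diagramming tool, but their elements can also be captured as
plain data. The sketch below records the external entities, processes, data stores, and data flows
of a hypothetical library-loan system; every name in it is invented for illustration.

# A hypothetical library-loan system described as DFD elements.
# This is only a data sketch; real DFDs are normally drawn graphically.
external_entities = ["Member", "Librarian"]
processes = ["Check out book", "Return book"]
data_stores = ["Book catalogue", "Loan records"]

# Each data flow is (source, data, destination).
data_flows = [
    ("Member", "loan request", "Check out book"),
    ("Check out book", "availability query", "Book catalogue"),
    ("Check out book", "new loan", "Loan records"),
    ("Member", "returned book", "Return book"),
    ("Return book", "loan update", "Loan records"),
    ("Return book", "return confirmation", "Librarian"),
]

for source, data, destination in data_flows:
    print(f"{source} --[{data}]--> {destination}")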
Design Stage
The design stage in the Software Development Life Cycle (SDLC) is a crucial phase where the
system's architecture, components, and interfaces are planned and designed. It involves
translating the requirements gathered during the analysis phase into a detailed design that can be
implemented.
Here are the key activities involved in the design stage:
System Architecture Design: In this activity, the overall structure and organization of the
system are defined. It includes identifying the main components, their relationships, and
the system's high-level behavior.
Database Design: If the system involves a database, the database design activity focuses
on designing the database schema, tables, relationships, and constraints. It ensures that
the data is organized efficiently and can be retrieved and manipulated effectively.
User Interface Design: This activity involves designing the user interface of the system,
considering factors such as usability, accessibility, and user experience. It includes
creating wireframes, mockups, or prototypes to visualize the user interface design.
Component Design: The system's components, such as modules, classes, or functions, are
designed in detail during this activity. It involves defining the internal structure,
interfaces, and behavior of each component (see the interface sketch at the end of this section).
Data Structure Design: This activity focuses on designing the data structures that will be
used within the system. It includes defining the data types, data formats, and data
organization to support the system's functionality and requirements.
Interface Design: If the system needs to interact with external systems or services,
interface design involves specifying the protocols, formats, and methods of
communication between the system and external entities.
Security Design: In this activity, security measures and controls are designed to protect
the system and its data from unauthorized access, breaches, or vulnerabilities. It includes
defining authentication, authorization, encryption, and other security mechanisms.
Testing and Quality Assurance Design: Designing the testing strategy, test cases, and
quality assurance processes are essential during the design stage. It ensures that the
system will be thoroughly tested for functionality, performance, reliability, and other
quality aspects.
The design stage sets the foundation for the implementation phase, providing detailed
specifications for developers to build the system. It aims to ensure that the system will
meet the requirements, be scalable, maintainable, and robust.
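To illustrate component design, the sketch below defines the interface of a hypothetical
notification component as an abstract base class, with one possible concrete implementation. The
class and method names are assumptions made for the example, not part of any real system.

from abc import ABC, abstractmethod

class Notifier(ABC):
    """Design-level interface for a hypothetical notification component."""

    @abstractmethod
    def send(self, recipient: str, message: str) -> bool:
        """Deliver a message and return True on success."""

class EmailNotifier(Notifier):
    """One possible concrete component implementing the interface."""

    def send(self, recipient: str, message: str) -> bool:
        # A real implementation would talk to a mail server here.
        print(f"Emailing {recipient}: {message}")
        return True

# The rest of the system is designed against the Notifier interface,
# so other channels (SMS, push) can be added without changing callers.
notifier: Notifier = EmailNotifier()
notifier.send("user@example.com", "Your order has shipped")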
PROTOTYPING
It’s a working model that is built economically and quickly, with the intention that it will
be modified.
Prototyping is used to help users who aren’t certain determine what they really want in an
information system. For example, to fine-tune system specifications, users are given
copies of system output so that they can make any desired changes before any further
effort is put into programming the output.
A prototype involves users so that they gain a sense of system ownership, hopefully
generating tolerance of minor faults after the new system is operational.
A prototype also demonstrates the system to management so that key personnel can see how the
long, expensive project is progressing.
Prototyping is a valuable approach in the software development process that involves creating a
preliminary version of a system or product.
Here are some reasons and benefits of prototyping:
Disadvantages of prototyping
System development
During this stage some of the following are developed:
Coding
Component integration
Unit testing
Version control
Documentation
This is also known as the coding phase. It is the stage where the software system is built based on
the requirements and design specifications defined in the earlier stages.
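As a small illustration of the coding and unit testing activities listed above, the sketch below
shows a hypothetical module written during this stage together with a unit test for it; the function
and test names are invented for the example.

# pricing module - a small piece of code written during the coding phase.
def apply_discount(price: float, rate: float) -> float:
    """Return the price after applying a discount rate between 0 and 1."""
    if not 0 <= rate <= 1:
        raise ValueError("rate must be between 0 and 1")
    return round(price * (1 - rate), 2)


# Unit tests written alongside the code.
import unittest

class ApplyDiscountTest(unittest.TestCase):
    def test_ten_percent_discount(self):
        self.assertEqual(apply_discount(100.0, 0.10), 90.0)

    def test_invalid_rate_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 1.5)

if __name__ == "__main__":
    unittest.main()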
Testing
Testing is one of the most important activities in the development of a system. Any system,
whether a bridge, a piece of equipment, or an information system, can cause disastrous results for
its users if it is not thoroughly tested and found to be satisfactory. System testing is a critical
phase in the software development life cycle (SDLC) that focuses on evaluating the entire
system's functionality, performance, and reliability.
Here are some commonly used system testing strategies:
Black box testing: is a software testing technique where the internal workings of the system
being tested are not known to the tester. It focuses on testing the functionality of the system by
providing inputs and observing the outputs, without considering the internal code or
implementation details. The goal of black box testing is to uncover defects or issues in the
system’s behavior or functionality. It can be performed at various levels, such as unit testing,
integration testing, system testing and acceptance testing.
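A minimal black-box sketch: the tests below exercise a hypothetical grade() function purely
through its inputs and expected outputs, treating its internal implementation as unknown. The
function and its pass mark are assumptions made for the example.

import unittest

def grade(score):
    """Return 'pass' for scores of 50 or above, otherwise 'fail'."""
    return "pass" if score >= 50 else "fail"

class GradeBlackBoxTest(unittest.TestCase):
    # Only the specified input/output behaviour is checked;
    # the internal implementation of grade() is treated as a black box.
    def test_boundary_pass(self):
        self.assertEqual(grade(50), "pass")

    def test_clear_fail(self):
        self.assertEqual(grade(49), "fail")

    def test_high_score(self):
        self.assertEqual(grade(100), "pass")

if __name__ == "__main__":
    unittest.main()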
Alpha testing: is typically performed by the development team or internal testers in a controlled
environment before the software is released to external users. During alpha testing the testers
simulate real-world scenarios and use the software as intended. They provide feedback on any
issues they encounter, such as bugs, usability problems or missing features. This feedback is
valuable for developers to refine the software and address any issues before wider testing or
release.
Top-Down approach: This is when a system is tested starting with the top-level (most complex)
modules and working down to the simplest ones. The system is initially tested with limited
functionality: most lower-level functions are replaced with stubs that contain minimal
placeholder code. The real functions are gradually added to the program until the complete
program is tested.
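A minimal sketch of top-down testing with stubs, assuming a hypothetical reporting module: the
high-level monthly_report() function is exercised first, while the lower-level routines it depends
on are temporarily replaced by stubs that return fixed values. All names are invented for the
example.

# Top-down testing sketch: lower-level functions are stubs for now.
def fetch_sales():          # stub - will later read from the database
    return [100, 200, 300]

def format_currency(value): # stub - will later apply real formatting rules
    return f"${value}"

def monthly_report():
    """High-level module under test; it drives the lower-level routines."""
    total = sum(fetch_sales())
    return f"Monthly total: {format_currency(total)}"

# The high-level logic can be exercised before the real
# lower-level modules exist.
assert monthly_report() == "Monthly total: $600"
print(monthly_report())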
Bottom-up approach: is a problem-solving or design methodology that starts with the individual
components or details and gradually builds up to create a larger system or solution. It is the
opposite of the top-down approach, which begins with an overall system and then breaks it down
into smaller components. In a bottom-up approach, the focus is on understanding and solving
smaller sub-problems or building blocks, and then integrating them to form a larger system.
White-box testing: is a software testing technique that focuses on examining the internal
structure, code, and logic of a software application. It is also known as clear box testing, glass
box testing, or structural testing. Unlike black-box testing, which tests the functionality of the
software without considering its internal implementation, white-box testing is concerned with the
internal workings of the system.
The main goal of white-box testing is to ensure that the system functions correctly based on its
internal design and implementation. It involves analyzing the internal code, data flow, control
flow, and structure of the software to design test cases that exercise different paths and
conditions within the code.
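A minimal white-box sketch: here the tester reads the code of a hypothetical classify_age()
function and deliberately designs one test case per branch so that every path through the code is
exercised. The function and its rules are assumptions made for the example.

import unittest

def classify_age(age):
    """Classify an age into a category (illustrative example)."""
    if age < 0:
        raise ValueError("age cannot be negative")
    elif age < 18:
        return "minor"
    else:
        return "adult"

class ClassifyAgeWhiteBoxTest(unittest.TestCase):
    # One test per branch in the code above: the error path,
    # the under-18 path, and the 18-and-over path.
    def test_negative_branch(self):
        with self.assertRaises(ValueError):
            classify_age(-1)

    def test_minor_branch(self):
        self.assertEqual(classify_age(17), "minor")

    def test_adult_branch(self):
        self.assertEqual(classify_age(18), "adult")

if __name__ == "__main__":
    unittest.main()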
Beta testing: is a type of software testing performed by a select group of end-users or customers
in a real-world environment. It is conducted after the completion of alpha testing, which is
typically performed by the development team in a controlled environment. Beta testing allows
for a broader and more diverse user base to test the software and provide feedback before its
official release.
Functional Testing: This strategy verifies that the system functions correctly according to the
defined requirements. It involves testing individual features, user interactions, data processing,
and system behavior under different scenarios.
Integration Testing: Integration testing focuses on testing the interaction and compatibility
between different modules or components of the system. It ensures that the integrated system
functions as expected and that data flows correctly between components.
Performance Testing: Performance testing assesses the system's performance under various load
conditions. It measures response times, throughput, scalability, and resource usage to identify
any performance bottlenecks or issues (a small timing sketch appears at the end of this list).
Security Testing: Security testing aims to identify vulnerabilities and weaknesses in the system's
security measures. It includes testing for authentication, authorization, data integrity, encryption,
and protection against common security threats.
Usability Testing: Usability testing evaluates the system's user-friendliness and ease of use. It
involves testing the system's interface, navigation, and user interactions to ensure that it meets
the users' needs and expectations.
Compatibility Testing: Compatibility testing verifies that the system functions correctly across
different platforms, operating systems, browsers, or devices. It ensures that the system is
compatible with the intended environments and configurations.
Regression Testing: Regression testing is performed to ensure that changes or fixes in the system
do not introduce new defects or break existing functionality. It involves retesting previously
tested features to ensure their continued proper functioning.
Stress Testing: Stress testing evaluates the system's behavior and performance under extreme or
peak load conditions. It tests the system's stability, robustness, and recovery capabilities when
subjected to high loads or resource constraints.
Acceptance Testing: Acceptance testing is conducted to determine whether the system meets the
specified acceptance criteria and is ready for deployment. It involves testing the system with
real-world scenarios and user acceptance to gain confidence in its readiness.
Exploratory Testing: Exploratory testing involves ad-hoc and unscripted testing, where testers
explore the system to uncover defects, usability issues, or unexpected behavior. It relies on
testers' knowledge, experience, and creativity to identify potential problems.
The selection of testing strategies depends on the system's requirements, complexity, and
intended use.
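As a small follow-up to the Performance Testing strategy above, the sketch below times a
hypothetical search() operation over repeated calls and reports the average response time. Real
performance testing would normally use a dedicated load-testing tool, so this is only an
illustration under assumed data sizes.

import time

def search(records, target):
    """Hypothetical operation whose response time we want to measure."""
    return [r for r in records if target in r]

records = [f"record-{i}" for i in range(100_000)]

runs = 20
start = time.perf_counter()
for _ in range(runs):
    search(records, "record-9999")
elapsed = time.perf_counter() - start

print(f"Average response time over {runs} runs: {elapsed / runs * 1000:.2f} ms")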
Implementation Stage
This refers to the phase where the software system is developed, tested, and deployed for use by
the end users. It involves translating the design specifications and requirements into a functional
software product. There are several conversion methods used to change over from the old system
to the new system.
Parallel conversion
Both new and old systems are used for a certain period of time after which the old system is no
longer used. Parallel conversion is a method used in software or system implementation where
both the old and new systems run simultaneously for a period of time. During parallel
conversion, both systems are operational, and data and processes are duplicated in both systems.
This allows for a gradual transition and comparison between the old and new systems to ensure
accuracy and minimize disruption.
Parallel conversion offers several advantages, such as reduced risk of data loss or system failure
since the old system is still available as a backup. It also allows for thorough testing and
validation of the new system before fully committing to it. However, parallel conversion can be
resource-intensive and time-consuming since it requires running two systems simultaneously.
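As a small illustration of the idea behind parallel running, the sketch below feeds the same
transactions through a hypothetical old and new payroll calculation and flags any differences
before cutover. Both functions and the sample data are invented for the example.

# Parallel-running sketch: process the same inputs through the old and
# new implementations and flag any discrepancies before cutting over.
def old_net_pay(gross):            # existing (legacy) calculation
    return round(gross * 0.80, 2)  # flat 20% deduction

def new_net_pay(gross):            # replacement calculation
    deduction = 0.20 if gross <= 3000 else 0.25
    return round(gross * (1 - deduction), 2)

sample_payroll = [1500.00, 2800.00, 3500.00]

for gross in sample_payroll:
    old_result = old_net_pay(gross)
    new_result = new_net_pay(gross)
    status = "OK" if old_result == new_result else "MISMATCH"
    print(f"gross={gross:>8.2f}  old={old_result:>8.2f}  "
          f"new={new_result:>8.2f}  {status}")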
Direct conversion
This method is a software implementation strategy where the old system is replaced completely
by the new system at a specific point in time. This means that the new system is implemented
and the old system is immediately discontinued. It is a high-risk approach as there is no fallback
option once the new system is in place. It requires thorough planning, testing, and training to
ensure a smooth transition and minimize disruptions to the business operations.
Pilot conversion
This conversion may prove safe and economical: a system is installed in one location and tried
there before it is installed in other locations. It refers to the stage where a small-scale
implementation or trial run of the new system is conducted before full deployment. It involves
selecting a subset of users or a specific department within an organisation to test the new
software or system in a controlled environment. The purpose of pilot conversion is to identify
any potential issues, gather feedback and make necessary adjustments before rolling out the
system to a larger user base.
Phased conversion
Phased conversion is a software implementation strategy where the new system is introduced
gradually in stages, while the old system is gradually phased out. This approach allows for a
more controlled and gradual transition, reducing the risk of disruptions to business operations. It
involves implementing the new system in specific modules or departments, testing and validating
each phase before moving on to the next. This approach allows for incremental learning,
training, and adjustment, ensuring a smoother transition for users and minimizing the impact on
the organization.
Maintenance stage
After a system has been tested and installed and users trained, it enters the maintenance stage.
The maintenance of a system may span several years, during which time the changing
requirements of users lead to minor modifications to the system. Eventually the system reaches a
point where routine maintenance is no longer sufficient and the SDLC begins again.
There are several types of maintenance commonly used in various industries. Here are some of
the main types:
It's important to note that different industries and organizations may use different terminology or
variations of these maintenance types based on their specific needs and requirements.
Maintenance is needed for several reasons:
Equipment Reliability: Regular maintenance helps ensure the reliability and availability
of equipment and systems. By performing inspections, lubrication, adjustments, and
replacements, potential issues can be identified and addressed before they lead to
breakdowns or failures. This helps minimize unplanned downtime and disruptions to
operations.
Safety: Maintenance plays a crucial role in ensuring the safety of equipment, systems,
and the people who operate or interact with them. Regular inspections and maintenance
help identify and address potential safety hazards, such as faulty wiring, worn-out
components, or malfunctioning safety devices.
Asset Lifespan: Regular maintenance can extend the lifespan of equipment and assets. By
addressing wear and tear, performing timely repairs, and implementing preventive
measures, the lifespan of equipment can be prolonged, reducing the need for premature
replacements.
Cost Savings: Proper maintenance can lead to cost savings in the long run. By addressing
issues early on, maintenance can prevent major breakdowns or failures that may require
costly repairs or replacements. Additionally, well-maintained equipment tends to operate
more efficiently, resulting in energy savings and reduced operational costs.
Regulatory Compliance: Many industries have specific regulations and standards that
require regular maintenance and inspections to ensure compliance. By adhering to these
requirements, organizations can avoid penalties, legal issues, and reputational damage.
Equipment Performance: Maintenance helps optimize the performance of equipment and
systems. By performing tasks like cleaning, calibration, and adjustments, equipment can
operate at its intended efficiency and effectiveness, resulting in improved productivity
and quality of output.
Overall, maintenance is essential for maximizing the reliability, safety, performance, and
lifespan of equipment and systems, while also minimizing costs and ensuring compliance with
regulations.
Evaluation
Evaluation refers to the process of assessing the software system or project at various stages to
determine its effectiveness, quality, and adherence to requirements. Evaluation helps to identify
any gaps, issues, or areas for improvement in the development process.
In the Software Development Life Cycle (SDLC), the evaluation stage typically refers to
the testing and quality assurance phase. This stage occurs after the development of the
software and before its deployment to end-users.
During the evaluation stage, the software is thoroughly tested to ensure that it meets the
specified requirements, functions correctly, and is free from defects or bugs. Different
types of testing, such as unit testing, integration testing, system testing, and user
acceptance testing, are performed to validate the software's functionality, performance,
security, and usability.
The evaluation stage also involves reviewing and analyzing the test results, identifying
and documenting any issues or bugs found, and working on their resolution. The software
may go through multiple iterations of testing and bug fixing until it meets the desired
quality standards.
Additionally, the evaluation stage may include other activities such as code reviews,
performance testing, security audits, and compliance checks, depending on the specific
requirements of the project and the organization.
The main goal of the evaluation stage is to ensure that the software is of high quality,
meets the intended purpose, and is ready for deployment to users or customers. It helps to
identify and rectify any issues or deficiencies before the software is released, minimizing
the risk of potential failures or negative user experiences.
DOCUMENTATION
It is a written or a graphic record of the steps carried out during the development of a system.
Accurate and complete documentation throughout the cycle is important for several reasons. For
example:
Many persons may be involved in the development of a system and many tasks rely on
work already completed by other members of a developing team. Documentation allows
system personnel and management to review what was previously done and understand
why it was done.
Another example is that many system development projects extend over a long period of
time. During that time, there are generally personnel changes or additions to the team.
New personnel can review documentation to understand what is going on and be able to
contribute valuable ideas even in the middle of a project.
User Documentation
A user is a person who will utilize a system once it’s been installed. Users include operators who
run computers and individuals who require information from the system.
User documentation aims to provide users with the necessary information and guidance for the
effective and efficient use of a product or service.
It typically consists of instructions, guides, or manuals that provide information on how to use a
product or service. It may include:
Getting started guide: this guide helps users set up and start using the product or service.
User manual: a comprehensive document that provides detailed instructions on how to
use all the features and functionalities of the product.
FAQs: Frequently Asked Questions that address common queries and provide quick
answers to users.
Troubleshooting Guide: instructions for resolving common issues or errors that users may
encounter.
How to Guides: step by step instructions on performing specific tasks or actions.
Glossary: A list of key terms and their definitions to help users understand the
terminology used in the product or service.
Tips and Tricks: Additional information or shortcuts that can enhance the user
experience or provide useful insights.
Updates and Release Notes: Information about new features, bug fixes, and improvements
introduced in software updates.
Technical Documentation
Technical documentation typically consists of several components, which may vary depending
on the specific product or system being documented and the target audience of the
documentation.
Task
Identify a software development project that was completed in the past few years. Research
the project and gather information on the SDLC model that was used. Analyse the project and
identify the different phases of the SDLC. Evaluate the successes and failures of the project
based on the SDLC model that was used. Identify any challenges or issues that arose during the
project and evaluate how they were addressed and mitigated. Determine whether the SDLC model
used was appropriate for the project and suggest any improvements or changes that could have
been made. Present the findings of the case study in a report or presentation (PowerPoint
presentation).
SUMMARY
The Software Development Life Cycle (SDLC) is a systematic approach to software
development that consists of several phases. Here is a summary of the typical SDLC phases:
Requirements gathering: This phase involves gathering and documenting the software
requirements by understanding the needs and expectations of the stakeholders.
Analysis: In this phase, the requirements are analyzed to identify any inconsistencies,
ambiguities, or gaps. The feasibility of the project is also assessed.
Design: The system architecture and detailed design are created in this phase. It includes
designing the software components, database structure, user interface, and other system
specifications.
Implementation: The actual coding of the software is done in this phase. Developers
write the code according to the design specifications and coding standards.
Testing: The software is tested to ensure that it meets the specified requirements and
functions correctly. Different testing techniques like unit testing, integration testing,
system testing, and user acceptance testing are performed.
Deployment: The software is deployed in the production environment or made available
to end-users. This phase involves installation, configuration, and data migration if
necessary.
Maintenance: After deployment, the software requires ongoing maintenance and support.
Bug fixes, updates, and enhancements are made to ensure its optimal performance and
usability.
END OF CHAPTER QUESTIONS
1. What is the purpose of SDLC?
2. What are the different phases of SDLC?
3. Can you describe the Waterfall model of SDLC?
4. What is the role of system analyst in Software development?
5. What are some common challenges that can occur during SDLC?
6. Briefly explain the following gathering techniques
a) Observation
b) Interview
c) Questionnaire
7. How can SDLC be used to ensure the quality and success of software development
projects?
8. Explain the following testing strategies
a) Bottom-up
b) White-box
c) User acceptance
d) Alpha
9. Explain the following types of documentation
a) User documentation
b) Technical documentation.