ONVIF Client Library Testing (Performance and Security)
Document for Test Plan Enhancement
Document Version: 1.0
Date: April 2, 2025
Project: JCI ONVIF Client Library (Profile S/G/T Support)
Table of Contents
1. Introduction
  1.1 Purpose
  1.2 Scope
2. Performance Testing (Library level)
  2.1 Concurrent Request Handling
    2.1.1 Test Approach
    2.1.2 Test Scenarios
    2.1.3 Metrics and Acceptance Criteria
  2.2 Resource Utilization Monitoring
    2.2.1 Test Approach
    2.2.2 Test Scenarios
    2.2.3 Metrics and Acceptance Criteria
  2.3 Scalability Testing
    2.3.1 Test Approach
    2.3.2 Test Scenarios
    2.3.3 Metrics and Acceptance Criteria
3. Security Testing (To be tested on Linux client)
  3.1 API Security Validation
    3.1.1 Test Approach
    3.1.2 Test Scenarios
    3.1.3 Metrics and Acceptance Criteria
  3.2 Input Validation
    3.2.1 Test Approach
    3.2.2 Test Scenarios
    3.2.3 Metrics and Acceptance Criteria
  3.3 Buffer Overflow Prevention
    3.3.1 Test Approach
    3.3.2 Test Scenarios (Black box testing)
      1. Buffer Boundary Testing (Black Box Approach)
      2. Memory Corruption Testing (Black Box Approach)
      3. String Handling Testing (Black Box Approach)
    3.3.3 Metrics and Acceptance Criteria
4. Module-Specific Testing Methodologies
  4.1 Authentication Module
    4.1.1 Functional Testing
    4.1.2 Performance Testing
    4.1.3 Security Testing
    4.1.4 Test Cases (Examples)
  4.2 Device Discovery Module
    4.2.1 Functional Testing
    4.2.2 Performance Testing
    4.2.3 Security Testing
    4.2.4 Test Cases (Examples)
  4.3 Media Profile Module
    4.3.1 Functional Testing
    4.3.2 Performance Testing
    4.3.3 Security Testing
    4.3.4 Test Cases (Examples)
  4.4 Media Streaming Module
    4.4.1 Functional Testing
    4.4.2 Performance Testing (On Panel)
    4.4.3 Security Testing
    4.4.4 Test Cases (Examples)
  4.5 PTZ Module (Need appropriate camera from JCI)
    4.5.1 Functional Testing
    4.5.2 Performance Testing
    4.5.3 Security Testing
    4.5.4 Test Cases (Examples)
  4.6 Recording Module
    4.6.1 Functional Testing
    4.6.2 Performance Testing
    4.6.3 Security Testing
    4.6.4 Test Cases (Examples) (SD cards are required)
  4.7 Event Handling Module
    4.7.1 Functional Testing
    4.7.2 Performance Testing
    4.7.3 Security Testing
    4.7.4 Test Cases (Examples)
  4.8 Device Management Module
    4.8.1 Functional Testing
    4.8.2 Performance Testing
    4.8.3 Security Testing
    4.8.4 Test Cases (Examples)
  4.9 Imaging Module (Profile T)
    4.9.1 Functional Testing
    4.9.2 Performance Testing
    4.9.3 Security Testing
    4.9.4 Test Cases (Examples)
  4.10 Network Management Module
    4.10.1 Functional Testing
    4.10.2 Performance Testing
    4.10.3 Security Testing
    4.10.4 Test Cases (Examples)
5. Test Environment and Tools
  5.1 Test Environment
    5.1.1 Hardware Requirements
    5.1.2 Network Setup
  5.2 Testing Tools
    5.2.1 Functional Testing Tools
    5.2.2 Performance Testing Tools
    5.2.3 Automation Tools
6. Test Reporting and Metrics
  6.1 Performance Test Reports
    6.1.1 Key Metrics
  6.2 Functional Test Reports
    6.2.1 Key Metrics
    6.2.2 Report Format
  6.3 Release Readiness Assessment
7. Appendices
  Appendix A: Sample Test Cases
    A.1 Performance Test Case Examples
    A.2 Security Test Case Examples
  Appendix B: Testing Schedule Template
1. Introduction
This document outlines advanced testing methodologies for the ONVIF client library that
supports Profiles S, G, and T. It specifically addresses performance testing, security
considerations, and module-specific testing approaches as requested by the client. The testing
framework described herein is designed to ensure the ONVIF client library meets all functional
requirements while maintaining performance efficiency, stability, and security.
1.1 Purpose
The purpose of this supplementary document is to:
● Define comprehensive performance testing methodologies
● Outline security testing considerations and approaches
● Detail module-specific testing strategies
● Propose additional testing parameters for quality assurance
1.2 Scope
This document extends the existing test plan with advanced testing methodologies that focus
on:
● Performance under various load conditions
● Memory management and resource utilization
● Security vulnerabilities and mitigations
● Module-specific testing approaches
● Cross-functional testing parameters
2. Performance Testing (Library level)
Performance testing will evaluate the ONVIF client library's efficiency, stability, and resource
utilization under various conditions, ensuring it meets performance requirements in real-world
scenarios.
2.1 Concurrent Request Handling
2.1.1 Test Approach
Testing the library's ability to handle multiple concurrent requests without degradation in
performance or stability.
2.1.2 Test Scenarios
1. Baseline Testing (To be tested on Linux client)
○ Single request processing time measurement
○ Establish baseline metrics for comparison
2. Gradual Load Increase (To be tested on Linux client)
○ Sequential testing with 3, 5, and up to 10 concurrent requests
○ Measure response times, success rates, and system resource utilization
3. Sudden Load Spike (To be tested on Linux client)
○ Abrupt increase from minimal to maximum expected load
○ Monitor recovery time and system stability
4. Extended Duration Load (To be tested on Linux client)
○ Maintain moderate concurrent load (10 requests) for 24 hours
○ Monitor for degradation in performance over time, including:
■ Response time trends (increasing over time indicates degradation)
■ Memory consumption patterns (baseline, peaks, growth over time)
■ CPU utilization stability (spikes or steady increases)
■ Thread count (to detect thread leaks)
■ Network connection stability (dropped connections)
■ Error rates (if increasing over time)
■ Successful transaction rate (decreasing over time indicates issues)
2.1.3 Metrics and Acceptance Criteria
● No memory leaks or resource exhaustion
● No thread deadlocks or race conditions
● System remains responsive after load tests
Note: All performance testing can be conducted on the final compiled library binary.
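For illustration, the concurrent-load scenarios in 2.1.2 could be driven by a harness along the following lines. This is a minimal sketch, assuming Python with the requests package on the Linux test host; the device URL, the conventional /onvif/device_service endpoint path, and the use of the unauthenticated GetSystemDateAndTime call as a timing probe are illustrative assumptions, not part of the library's API.

import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests  # assumed available on the test host

# GetSystemDateAndTime needs no authentication per the ONVIF Device Management
# specification, which makes it a convenient timing probe for load testing.
SOAP_BODY = """<?xml version="1.0" encoding="UTF-8"?>
<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope">
  <s:Body>
    <GetSystemDateAndTime xmlns="http://www.onvif.org/ver10/device/wsdl"/>
  </s:Body>
</s:Envelope>"""

def timed_request(device_url):
    """Send one request and return (elapsed_seconds, success)."""
    start = time.monotonic()
    try:
        resp = requests.post(device_url, data=SOAP_BODY, timeout=10,
                             headers={"Content-Type": "application/soap+xml"})
        return time.monotonic() - start, resp.status_code == 200
    except requests.RequestException:
        return time.monotonic() - start, False

def run_concurrent(device_url, workers):
    """Fire `workers` simultaneous requests and summarize the response times."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(lambda _: timed_request(device_url), range(workers)))
    times = [t for t, _ in results]
    ok = sum(1 for _, success in results if success)
    print(f"{workers:3d} concurrent: ok={ok}/{workers} "
          f"min={min(times):.3f}s avg={statistics.mean(times):.3f}s max={max(times):.3f}s")

if __name__ == "__main__":
    for load in (1, 3, 5, 10):  # baseline, then the gradual load steps from 2.1.2
        run_concurrent("http://192.0.2.10/onvif/device_service", load)

The same loop, left running at a fixed load of 10 for 24 hours, covers the extended-duration scenario; the resource sampler sketched in 2.2 would run alongside it.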
2.2 Resource Utilization Monitoring
2.2.1 Test Approach
Systematic monitoring of CPU, memory, network, and other system resources during various
operations.
2.2.2 Test Scenarios
1. CPU Utilization Testing (To be tested on Linux client)
○ Monitor CPU usage during resource-intensive operations:
■ Device discovery with network scanning
■ Video stream decoding (Actual usage may vary based on the player
integrated on the panel side for video decoding)
■ PTZ operations
■ Media profile configuration
■ Recording search operations
2. Memory Usage Patterns (To be tested on Linux client)
○ Monitor memory allocation patterns during:
■ Library initialization
■ Device connection establishment
■ Video stream processing (Actual usage may vary based on the player
integrated on the panel side for video decoding)
■ Recording data handling
3. I/O and Network Utilization (To be tested on Linux client)
○ Monitor network bandwidth consumption during:
■ Multiple simultaneous video streams (Actual usage may vary based on
the player integrated on the panel side for video decoding)
■ Device discovery broadcasts
■ Recording downloads
■ Event subscription handling (maintaining active subscriptions to device
events like motion detection, tampering alerts, and receiving/processing
event notifications from multiple devices)
2.2.3 Metrics and Acceptance Criteria
● CPU usage patterns should be predictable and proportional to workload
● Memory usage patterns should be predictable and proportional to workload
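A sampler along these lines could back the monitoring scenarios above. This is a minimal sketch, assuming Python with the psutil package; the process ID of the library's test client is supplied by the harness.

import csv
import time

import psutil  # assumed available on the test host

def sample_process(pid, duration_s, interval_s=1.0, out_path="resource_log.csv"):
    """Log CPU, resident memory, and thread count for `pid` at a fixed interval."""
    proc = psutil.Process(pid)
    start = time.monotonic()
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "cpu_percent", "rss_mb", "threads"])
        while time.monotonic() - start < duration_s:
            writer.writerow([
                round(time.monotonic() - start, 1),
                proc.cpu_percent(interval=interval_s),   # blocks for interval_s
                proc.memory_info().rss / (1024 * 1024),  # resident set size in MB
                proc.num_threads(),                      # rising count => thread leak
            ])

Plotting rss_mb and threads across a 24-hour run makes the degradation indicators listed in 2.1.2 (memory growth, thread leaks) directly visible.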
2.3 Scalability Testing
2.3.1 Test Approach
Evaluate the library's ability to handle increasing numbers of devices, streams, and operations
on the panel.
2.3.2 Test Scenarios
1. Device Scaling (To be tested on Panel)
○ Test with 1, 3, and 8 devices on the network (maximum supported device count
to be confirmed by JCI; cameras to be supplied by JCI)
○ Measure discovery time, memory usage, and CPU utilization
2. Stream Scaling (To be tested on Panel)
○ Test with 1, 3, 4, 8 simultaneous streams (Cannot be tested on Linux client due to
resource constraints - to be tested on panel)
○ Measure frame rates, latency, and resource utilization
3. Profile Scaling (To be tested on Panel)
○ Test with devices having multiple media profiles (3-5)
○ Measure profile switching time and memory impact
2.3.3 Metrics and Acceptance Criteria
● Library functions correctly with the maximum expected device count (8, to be
confirmed by JCI)
● Performance degradation is gradual, not exponential
● Resource utilization scales linearly with workload
3. Security Testing (To be tested on Linux client)
Security testing ensures the ONVIF client library is resistant to common vulnerabilities and
follows secure coding practices.
3.1 API Security Validation
3.1.1 Test Approach
Comprehensive testing of security mechanisms in the library's API implementation.
3.1.2 Test Scenarios
1. Authentication Testing
○ Test with valid credentials
○ Test with invalid credentials
○ Test with expired credentials
○ Test with malformed credentials
○ Test credential storage security
○ Test token-based authentication workflows
2. HTTPS/TLS Implementation Testing
○ Verify proper certificate validation
○ Test with self-signed certificates
○ Test with expired certificates
3. Session Management Testing
○ Test session timeout handling
○ Test concurrent session limits
○ Test session renewal mechanisms
3.1.3 Metrics and Acceptance Criteria
● Authentication properly enforced for all secure operations
● Invalid credentials consistently rejected
● Credentials never logged or exposed in plaintext
● TLS certificate validation properly enforced
● Session tokens properly secured and validated
● Proper HTTP response codes returned for all authentication scenarios (e.g., 401 for
unauthorized access)
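As a sketch of the invalid-credential checks, assuming the device honors HTTP digest authentication (many ONVIF devices instead carry credentials in a WS-Security UsernameToken, in which case the rejection arrives as a SOAP fault rather than a bare 401); the device URL is a placeholder:

import requests
from requests.auth import HTTPDigestAuth

# GetDeviceInformation normally requires authentication, so it makes a good
# rejection probe; the endpoint path is the conventional ONVIF default.
SOAP_BODY = """<?xml version="1.0" encoding="UTF-8"?>
<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope">
  <s:Body>
    <GetDeviceInformation xmlns="http://www.onvif.org/ver10/device/wsdl"/>
  </s:Body>
</s:Envelope>"""

def check_rejection(url, user, password):
    resp = requests.post(url, data=SOAP_BODY, timeout=10,
                         auth=HTTPDigestAuth(user, password),
                         headers={"Content-Type": "application/soap+xml"})
    # Expect 401 (or 400 with a NotAuthorized SOAP fault) and no credential echo.
    assert resp.status_code in (400, 401), f"unexpected status {resp.status_code}"
    assert not password or password not in resp.text, "credential leaked in response"

for bad_user, bad_pass in [("admin", "wrongpass"), ("nosuchuser", "x"), ("", "")]:
    check_rejection("http://192.0.2.10/onvif/device_service", bad_user, bad_pass)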
3.2 Input Validation
3.2.1 Test Approach
Systematic testing of the library's handling of malicious, malformed, or unexpected inputs.
3.2.2 Test Scenarios
1. Boundary Testing
○ Test with minimum and maximum values for all parameters
○ Test with values just outside valid ranges
○ Test with extremely large values
2. Fuzzing Tests
○ Use automated fuzzing script to generate malformed inputs:
■ Invalid character sequences
■ Oversized data fields
■ Command injection patterns
3. Null/Empty Value Testing
○ Test with null values where applicable
○ Test with empty strings
○ Test with whitespace-only strings
3.2.3 Metrics and Acceptance Criteria
● All inputs properly validated before processing
● Malformed inputs rejected with appropriate error codes
● No crashes, hangs, or unexpected behavior with malicious inputs
● Proper error logging for validation failures
● No information leakage in error responses
● Appropriate HTTP response codes for invalid input (e.g., 400 Bad Request)
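The fuzzing scenario could start from a simple generator such as the following sketch; the specific patterns and sizes are illustrative only.

import random
import string

# Common injection-style payloads (SQL, command, format-string, markup, null bytes).
INJECTION_PATTERNS = ["' OR 1=1 --", "$(reboot)", "%s%s%n", "<script>x</script>", "\x00\x00"]

def malformed_inputs(count=100, max_len=8192):
    """Yield a mix of oversized, invalid-character, and injection-style inputs
    matching the three categories in 3.2.2."""
    for _ in range(count):
        kind = random.randrange(3)
        if kind == 0:
            yield "A" * random.choice([255, 256, 1024, max_len])  # oversized fields
        elif kind == 1:
            yield "".join(random.choice(string.printable) for _ in range(64))
        else:
            yield random.choice(INJECTION_PATTERNS)

Each generated value is substituted into one request parameter at a time, and the response is checked for a clean 400-class rejection rather than a crash, hang, or stack trace.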
3.3 Buffer Overflow Prevention
3.3.1 Test Approach
Thorough testing to identify and prevent memory corruption vulnerabilities.
3.3.2 Test Scenarios (Black box testing)
1. Buffer Boundary Testing (Black Box Approach)
● Test with progressively larger inputs until behavior changes (e.g., start with
256-byte inputs, then 512 bytes, and so on)
● Observe response patterns to identify potential buffer limits
● Test all string inputs with common boundary sizes (1023/1024, 2047/2048, 4095/4096
bytes)
● Monitor for crashes, hangs, or unexpected responses that might indicate buffer issues
● Test with boundary characters (null bytes, newlines, quotes) at different positions
2. Memory Corruption Testing (Black Box Approach)
● Run long testing sessions with memory monitoring tools (like monitoring process
memory growth)
● Create and delete objects in rapid succession to stress memory management
● Introduce unexpected operation sequences that might trigger cleanup issues
● Monitor for abnormal process termination or stability issues over time
● Look for degrading performance after repeated operations (sign of memory issues)
3. String Handling Testing (Black Box Approach)
● Test with special characters in all string inputs (quotes, backslashes, control characters)
● Test with Unicode edge cases (zero-width characters, combining characters, right-to-left
marks)
● Test with extremely long strings in all text inputs
● Test with strings that contain SQL, HTML, XML or format string patterns
● Look for truncation, corruption, or incorrect handling in responses
● Test for content type mismatches (providing XML when JSON expected, etc.)
3.3.3 Metrics and Acceptance Criteria
● No memory corruption vulnerabilities identified
● All buffer operations properly bounds-checked
● String operations use safe alternatives to vulnerable functions
● All dynamic memory allocation includes proper size validation
● No exploitable memory safety issues
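As one concrete shape for the buffer boundary testing described above, a probe loop could look like the following sketch; send is a hypothetical caller-supplied function that submits the payload in the input field under test and returns the HTTP status (or raises if the connection drops).

def boundary_probe(send, sizes=(255, 256, 1023, 1024, 2047, 2048, 4095, 4096, 8192)):
    """Submit progressively larger payloads and flag anomalies; a dropped
    connection or timeout at a specific size is a candidate buffer issue."""
    for size in sizes:
        payload = "A" * size
        try:
            status = send(payload)
            print(f"{size:5d} bytes -> HTTP {status}")
        except Exception as exc:
            print(f"{size:5d} bytes -> FAILURE: {exc!r} (investigate as possible overflow)")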
4. Module-Specific Testing Methodologies
This section outlines testing approaches tailored to each major module of the ONVIF client
library.
4.1 Authentication Module
4.1.1 Functional Testing
● Verify authentication with valid credentials
● Verify rejection with invalid credentials
● Test handling of special characters in credentials
● Test credential storage mechanisms
● Test token generation and renewal
● Verify proper HTTP response codes (200 for success, 401 for unauthorized, etc.)
4.1.2 Performance Testing
● Measure authentication response time
● Test with multiple simultaneous authentication requests
● Test memory usage during authentication operations
4.1.3 Security Testing
● Test against replay attacks
● Test authentication timeout handling
4.1.4 Test Cases (Examples)
1. Basic Authentication Success
○ Precondition: Valid username and password, device online
○ Action: Authenticate to device
○ Expected Result: Authentication successful, token returned, HTTP 200 OK
response
2. Authentication with Special Characters
○ Precondition: Username and password with special characters (@#$%^&*)
○ Action: Authenticate to device
○ Expected Result: Authentication successful, characters properly encoded, HTTP
200 OK response
3. Authentication Timeout Handling
○ Precondition: Valid credentials, slow network
○ Action: Authenticate with 500ms timeout
○ Expected Result: Appropriate timeout error, retry mechanism activated, HTTP
408 Request Timeout response
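Several of the cases above (notably the replay-attack test in 4.1.3) hinge on the WS-Security UsernameToken that ONVIF specifies for password-digest authentication, where PasswordDigest = Base64(SHA1(nonce + created + password)). A minimal sketch of the token construction a test harness would need:

import base64
import hashlib
import os
from datetime import datetime, timezone

def ws_username_token(username, password):
    """Build the WS-Security UsernameToken fields used by ONVIF password-digest
    authentication: Digest = Base64(SHA1(raw_nonce + created + password))."""
    nonce = os.urandom(16)
    created = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    digest = hashlib.sha1(nonce + created.encode() + password.encode()).digest()
    return {
        "Username": username,
        "PasswordDigest": base64.b64encode(digest).decode(),
        "Nonce": base64.b64encode(nonce).decode(),
        "Created": created,
    }

For the replay test, a previously captured token (same Nonce and Created values) is resent verbatim; a compliant device rejects it.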
4.2 Device Discovery Module
4.2.1 Functional Testing
● Verify WS-Discovery message formation
● Test discovery across different network segments
● Test response parsing from different device vendors (Dependency on JCI for the
camera)
● Verify proper HTTP response codes for all discovery operations
4.2.2 Performance Testing
● Measure discovery time with varying device counts
● Test memory usage during discovery operations
● Test CPU utilization during network scanning
● Test with multiple concurrent discovery operations
4.2.3 Security Testing
● Test against discovery response flooding
● Test network isolation scenarios (how the library behaves when devices are on different
subnets, when firewalls are present, or when only certain ports are accessible)
4.2.4 Test Cases (Examples)
1. Basic Network Discovery
○ Precondition: Multiple ONVIF devices on network (Dependency on JCI for the
cameras)
○ Action: Perform discovery operation
○ Expected Result: All devices discovered with correct endpoints, proper HTTP
response codes
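Because WS-Discovery probes are plain SOAP-over-UDP messages to the multicast group 239.255.255.250 on port 3702, an independent reference prober is easy to script for cross-checking the library's discovery results. A minimal sketch (ProbeMatch parsing omitted; a full test would extract the XAddrs endpoint from each reply):

import socket
import time
import uuid

PROBE = """<?xml version="1.0" encoding="UTF-8"?>
<e:Envelope xmlns:e="http://www.w3.org/2003/05/soap-envelope"
            xmlns:w="http://schemas.xmlsoap.org/ws/2004/08/addressing"
            xmlns:d="http://schemas.xmlsoap.org/ws/2005/04/discovery">
  <e:Header>
    <w:MessageID>uuid:{mid}</w:MessageID>
    <w:To>urn:schemas-xmlsoap-org:ws:2005:04:discovery</w:To>
    <w:Action>http://schemas.xmlsoap.org/ws/2005/04/discovery/Probe</w:Action>
  </e:Header>
  <e:Body><d:Probe/></e:Body>
</e:Envelope>"""

def discover(timeout_s=3.0):
    """Multicast one Probe and collect the addresses of responding devices."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(0.5)
    sock.sendto(PROBE.format(mid=uuid.uuid4()).encode(), ("239.255.255.250", 3702))
    responders, deadline = set(), time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            _data, addr = sock.recvfrom(65535)  # _data holds the ProbeMatch XML
            responders.add(addr[0])
        except socket.timeout:
            continue
    return sorted(responders)

print(discover())

Comparing this list against the library's own discovery output also gives a device-count cross-check for the scaling scenarios in 2.3.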
4.3 Media Profile Module
4.3.1 Functional Testing
● Verify profile retrieval from devices
● Test profile configuration operations
● Test creation, modification, and deletion of profiles
● Test audio configuration within profiles
● Test metadata configuration within profiles
● Verify proper HTTP response codes for all operations
4.3.2 Performance Testing
● Measure profile retrieval time
● Test with devices having many profiles
● Test memory usage during profile operations
● Test concurrent profile operations
4.3.3 Security Testing
● Test against profile configuration injection
● Verify proper validation of profile parameters
● Test handling of malformed profile data
4.3.4 Test Cases (Examples)
1. Profile Enumeration
○ Precondition: Device with multiple profiles
○ Action: Retrieve all profiles
○ Expected Result: Complete profile list with correct attributes, HTTP 200 OK
response
2. Profile Creation
○ Precondition: Device supporting profile creation
○ Action: Create custom profile with specific parameters
○ Expected Result: Profile created with requested configuration, HTTP 201 Created
response
3. Profile Modification
○ Precondition: Existing profile
○ Action: Modify resolution and framerate
○ Expected Result: Profile updated with new parameters, HTTP 200 OK response
4.4 Media Streaming Module
4.4.1 Functional Testing
● Verify RTSP URL generation
● Test stream parameter configuration
● Test streaming with metadata
● Verify proper HTTP/RTSP response codes for all operations
4.4.2 Performance Testing (On Panel)
● Measure stream initialization time
● Test with multiple simultaneous streams
● Test memory and CPU usage during streaming
● Test network bandwidth utilization
● Test frame rate stability over time
● Test with different resolution and bitrate combinations
4.4.3 Security Testing
● Test stream authentication mechanisms
4.4.4 Test Cases (Examples)
1. Basic Stream Establishment
○ Precondition: Valid profile with video source
○ Action: Request stream URL and establish stream
○ Expected Result: Stream successfully established, frames received, proper
RTSP response codes
2. Multiple Stream Handling
○ Precondition: Device supporting multiple streams
○ Action: Establish 3 simultaneous streams
○ Expected Result: All streams working correctly, no frame drops, proper resource
management
3. Stream Stability Test (can be tested from panel only)
○ Precondition: Single stream established
○ Action: Maintain stream for 24 hours
○ Expected Result: Stream remains stable, no memory leaks, consistent frame rate
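Frame-rate stability (test case 3) can be spot-checked with any RTSP-capable reader. A minimal sketch using OpenCV, where the stream URL is a placeholder for the URI obtained from the device via GetStreamUri:

import time

import cv2  # OpenCV, assumed available on the panel-side test host

def measure_fps(rtsp_url, window_s=10, duration_s=60):
    """Read the stream and log the received frame rate per window; a falling
    trend over a long run indicates the degradation 4.4.2 looks for."""
    cap = cv2.VideoCapture(rtsp_url)
    assert cap.isOpened(), "stream failed to open"
    start = window_start = time.monotonic()
    frames = 0
    while time.monotonic() - start < duration_s:
        ok, _frame = cap.read()
        if ok:
            frames += 1
        now = time.monotonic()
        if now - window_start >= window_s:
            print(f"{now - start:7.1f}s: {frames / (now - window_start):5.1f} fps")
            frames, window_start = 0, now
    cap.release()

measure_fps("rtsp://user:pass@192.0.2.10:554/profile1")  # placeholder URL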
4.5 PTZ Module (Need appropriate camera from JCI)
4.5.1 Functional Testing
● Verify PTZ capability detection
● Test absolute movement operations
● Test relative movement operations
● Test continuous movement operations
● Test stop operations
● Test preset management (set, get, move to preset)
● Test home position functionality
● Test auxiliary commands
● Test speed control for movements
● Verify proper HTTP response codes for all PTZ operations
4.5.2 Performance Testing
● Measure command response time
● Test with rapid sequences of commands
● Test memory usage during PTZ operations
● Test simultaneous PTZ control of multiple cameras
4.5.3 Security Testing
● Test against unauthorized PTZ control
● Verify proper validation of movement parameters
● Test handling of malformed PTZ commands
4.5.4 Test Cases (Examples)
1. Absolute Position Movement
○ Precondition: PTZ-capable camera, known coordinate system
○ Action: Move to specific coordinates (x,y,z)
○ Expected Result: Camera moves to requested position, HTTP 200 OK response
2. Preset Operations
○ Precondition: PTZ-capable camera
○ Action: Set preset, move away, return to preset
○ Expected Result: Camera returns precisely to preset position, HTTP 200 OK
response
3. Continuous Movement with Stop
○ Precondition: PTZ-capable camera
○ Action: Start continuous movement, stop after delay
○ Expected Result: Movement starts and stops cleanly, no overshoot, proper HTTP
response codes
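The rapid command-sequence measurement in 4.5.2 could be harnessed as below; move and stop are hypothetical caller-supplied wrappers around the library's ContinuousMove and Stop operations (both standard ONVIF PTZ requests).

import time

def rapid_sequence(move, stop, repetitions=50):
    """Issue rapid move/stop pairs and report per-command latency figures."""
    latencies = []
    for _ in range(repetitions):
        for command in (move, stop):  # move and stop are hypothetical wrappers
            t0 = time.monotonic()
            command()
            latencies.append(time.monotonic() - t0)
    latencies.sort()
    print(f"p50={latencies[len(latencies) // 2] * 1000:.0f} ms  "
          f"max={latencies[-1] * 1000:.0f} ms")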
4.6 Recording Module
4.6.1 Functional Testing
● Verify recording capabilities detection
● Test recording configuration
● Test recording control (start, stop, pause)
● Test recording search operations
● Test recording status monitoring
● Verify proper HTTP response codes for all recording operations
4.6.2 Performance Testing
● Measure search response time with varying recording sizes
● Test memory usage during recording operations
● Test with multiple simultaneous recording operations
4.6.3 Security Testing
● Test against unauthorized recording access (verifying authentication is required for
recording access, confirming recording access follows proper permission models, testing
if recordings can be accessed without proper credentials)
● Verify proper validation of recording parameters (time ranges, recording identifiers,
search filters, export formats)
● Test handling of malformed recording commands
4.6.4 Test Cases (Examples) (SD cards are required)
1. Recording Search by Time
○ Precondition: Device with recordings
○ Action: Search for recordings in specific time range
○ Expected Result: Correct recordings returned with metadata, HTTP 200 OK
response
4.7 Event Handling Module
4.7.1 Functional Testing
● Verify event capability detection
● Test event subscription
● Test event notification handling
● Test different event types (motion, tampering, etc.)
● Test subscription renewal
● Test subscription cancellation
● Verify proper HTTP response codes for all operations
4.7.2 Performance Testing
● Measure event propagation latency
● Test with high event frequency
● Test memory usage during event processing
● Test with multiple simultaneous subscriptions
4.7.3 Security Testing
● Verify proper validation of event parameters (ensuring only valid event types are
subscribed to, parameters are within valid ranges, and event handling is secure)
● Test handling of malformed event data
4.7.4 Test Cases (Examples)
1. Event Subscription and Notification
○ Precondition: Device with event capabilities
○ Action: Subscribe to motion events, generate motion
○ Expected Result: Events received with correct data, HTTP 200 OK response
2. Subscription Renewal
○ Precondition: Active subscription near expiration
○ Action: Renew subscription before expiration (subscription renewal involves
creating an event subscription with a short termination time, monitoring for
"subscription about to expire" signals, sending renewal request, verifying
continuous event reception without gaps, and confirming subscription reference
remains valid)
○ Expected Result: Subscription extended, no events missed, HTTP 200 OK
response
3. Multiple Event Type Handling
○ Precondition: Device supporting multiple event types
○ Action: Subscribe to several event types
○ Expected Result: All event types correctly received and processed, proper HTTP
response codes
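The renewal case above reduces to keeping a renew call ahead of the device's granted termination time. A minimal scheduler sketch, where renew is a hypothetical caller-supplied wrapper around the library's subscription Renew request:

import threading

def keep_subscription_alive(renew, stop_event, termination_s=60, margin_s=10):
    """Renew the subscription with a safety margin before each deadline so no
    notification gap occurs; set `stop_event` to end the loop."""
    while not stop_event.wait(termination_s - margin_s):
        renew()  # hypothetical wrapper around the subscription Renew call

Started on a daemon thread alongside the event receiver, this keeps the subscription reference valid for the duration of the test.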
4.8 Device Management Module
4.8.1 Functional Testing
● Verify device information retrieval (manufacturer, model, firmware version)
● Test device capability discovery (supported profiles, services)
● Test device status monitoring (operation status, system health)
● Test time synchronization capabilities (NTP configuration)
● Test system configuration functions (reboot, factory reset)
● Test network configuration management (IP settings, DNS)
● Test user account management (create, modify, delete users)
● Verify proper HTTP response codes for all operations (200 for success, appropriate error
codes)
4.8.2 Performance Testing
● Measure response time for device information queries
● Test concurrent device information requests
● Test memory usage during configuration operations
● Test system stability during configuration changes
4.8.3 Security Testing
● Test against unauthorized system configuration access
● Verify proper validation of configuration parameters
● Test handling of malformed system commands
● Test secure storage of device credentials and settings
● Verify access control enforcement for administrative functions
4.8.4 Test Cases (Examples)
1. Basic Device Information Retrieval
○ Precondition: Device online and accessible
○ Action: Request device information (GetDeviceInformation)
○ Expected Result: Complete device information returned (manufacturer, model,
serial number, firmware version), HTTP 200 OK response
2. Device Capability Discovery
○ Precondition: Device online and accessible
○ Action: Request device capabilities (GetCapabilities)
○ Expected Result: All supported capabilities correctly reported, HTTP 200 OK
response
3. System Reboot Command
○ Precondition: Device online with administrative access
○ Action: Issue system reboot command
○ Expected Result: Command accepted, device reboots, connection re-established
after reboot, HTTP 200 OK response
4. Network Configuration
○ Precondition: Device online with administrative access
○ Action: Modify network settings (IP configuration, DNS)
○ Expected Result: Network settings updated correctly, HTTP 200 OK response
4.9 Imaging Module (Profile T)
4.9.1 Functional Testing
● Verify imaging settings retrieval (brightness, contrast, saturation)
● Test imaging settings configuration
● Test focus control (auto focus, manual focus)
● Test exposure settings (mode, compensation, shutter speed)
● Test white balance configuration
● Test wide dynamic range settings
● Test image stabilization controls
● Test day/night switching functionality
● Verify proper HTTP response codes for all operations
4.9.2 Performance Testing
● Measure response time for imaging setting changes
● Test concurrent imaging setting operations
● Test memory usage during imaging operations
● Measure command execution time for focus operations
4.9.3 Security Testing
● Test against unauthorized imaging control access
● Verify proper validation of imaging parameters
● Test handling of malformed imaging commands
● Verify access control for imaging settings
4.9.4 Test Cases (Examples)
1. Image Settings Retrieval
○ Precondition: Device online with imaging capabilities
○ Action: Request current imaging settings
○ Expected Result: Complete imaging settings returned with correct values, HTTP
200 OK response
2. Brightness Adjustment
○ Precondition: Device online with imaging capabilities
○ Action: Modify brightness setting to a specific value
○ Expected Result: Brightness successfully changed, value persisted, HTTP 200
OK response
3. Auto Focus Operation
○ Precondition: Device with auto focus capability
○ Action: Trigger auto focus operation
○ Expected Result: Focus operation completes successfully, focus improves, HTTP
200 OK response
4. Day/Night Mode Switching
○ Precondition: Device with day/night capability
○ Action: Switch between day and night modes
○ Expected Result: Mode changes successfully, image characteristics adapt to
selected mode, HTTP 200 OK response
4.10 Network Management Module
4.10.1 Functional Testing
● Test network interface configuration
● Verify DNS settings management
● Test NTP configuration
● Test network service enablement/disablement
● Test DHCP client configuration
● Test zero-configuration networking
● Test hostname configuration
● Verify proper HTTP response codes for all operations
4.10.2 Performance Testing
● Test network configuration change response time
● Measure system stability during network changes
● Test system recovery after network disruptions
4.10.3 Security Testing
● Test against unauthorized network configuration access
● Verify proper validation of network parameters
● Test handling of malformed network commands
● Test for information leakage in network configurations
4.10.4 Test Cases (Examples)
1. Network Interface Configuration
○ Precondition: Device with network management capabilities
○ Action: Modify network interface settings
○ Expected Result: Network settings updated successfully, changes effective,
HTTP 200 OK response
2. NTP Server Configuration
○ Precondition: Device with time synchronization support
○ Action: Configure NTP server settings
○ Expected Result: NTP settings updated, time synchronization occurs, HTTP 200
OK response
5. Test Environment and Tools
This section outlines the recommended test environment and tools for comprehensive testing of
the ONVIF client library.
5.1 Test Environment
5.1.1 Hardware Requirements
● Test laptop with minimum specifications:
○ CPU: 8 cores
○ RAM: 16 GB
○ Storage: 250 GB SSD
○ Network: Gigabit Ethernet
● ONVIF-compliant devices for testing:
○ Minimum 5 different camera models from different manufacturers
○ Mix of Profile S, G, and T compliant devices
○ Devices with varying capabilities (PTZ, recording, etc.)
5.1.2 Network Setup
● Isolated test network to prevent external interference
● Configurable network switches for VLAN segregation
● Network traffic capture capabilities
5.2 Testing Tools
5.2.1 Functional Testing Tools
● ONVIF Device Manager (ODM)
● ONVIF Client Test Tool
● Linux client (./IQOnvif Client) for API testing
● Wireshark for network analysis
5.2.2 Performance Testing Tools
● Custom scripts for load testing
● Resource monitoring tools (htop, perfmon)
5.2.3 Automation Tools
● Custom test scripts for load and performance testing and monitoring
6. Test Reporting and Metrics
This section outlines the reporting structure and key metrics for evaluating the ONVIF client
library's quality.
6.1 Performance Test Reports
6.1.1 Key Metrics
● Response times (minimum, maximum, average)
● Throughput (requests per second, bandwidth)
● Resource utilization (CPU, memory, network)
● Concurrent request handling capability
● Stability metrics (crash rate)
6.2 Functional Test Reports
6.2.1 Key Metrics
● Test case execution coverage
● Pass/fail rates by module
6.2.2 Report Format
● Test execution summary
● Detailed test case results
● Bug tracking and status
● Module-specific test coverage
● Regression test results
6.3 Release Readiness Assessment
● Consolidated view of all testing results
● Blocking issues summary
● Risk assessment for known issues
● Recommendation for release/hold decision
7. Appendices
Appendix A: Sample Test Cases
A.1 Performance Test Case Examples
Test ID: TC-PERF-CONCUR-001
Title: API Concurrent Request Handling
Description: Test the library's ability to handle multiple concurrent API requests
Test Type: Performance
Priority: High
Preconditions:
- Test environment set up with 10 ONVIF cameras
- Library initialized with default configuration
Steps:
1. Initialize the test setup with timing measurement
2. Prepare 10 concurrent device discovery requests
3. Execute all requests simultaneously
4. Measure response time for each request
5. Calculate average, min, max response times
6. Monitor CPU and memory usage during the test
Expected Results:
- All requests complete successfully
- Average response time < 500ms
- CPU usage remains below 70%
- No memory leaks detected
A.2 Security Test Case Examples
Test ID: TC-SEC-AUTH-001
Title: Authentication with Invalid Credentials
Description: Test the library's handling of invalid authentication credentials
Test Type: Security
Priority: Critical
Preconditions:
- Test environment set up with secured ONVIF camera
- Library initialized with default configuration
Steps:
1. Attempt authentication with incorrect username
2. Attempt authentication with incorrect password
3. Attempt authentication with empty credentials
4. Attempt authentication with oversized credentials (1000+ characters)
5. Attempt authentication with special characters
6. Monitor error handling and response codes
Expected Results:
- All invalid authentication attempts rejected
- Appropriate error codes returned (HTTP 401 Unauthorized)
- No sensitive information revealed in error messages
- No memory corruption or crashes
- Failed attempts properly logged
Appendix B: Testing Schedule Template
Phase | Test Type                                         | Duration | Dependencies        | Deliverables
1     | Environment Setup                                 | 1 day    | Hardware, network   | Test environment readiness report
2     | Functional Testing on Linux client                | 3 days   | Phase 1             | Functional test report
3     | Performance and Security Testing on Linux client  | 3 days   | Phase 2             | Performance test report
4     | Smoke Testing on Panel                            | 3 days   | Phases 2 and 3      | Integration test report
5     | Final Reporting                                   | 1 day    | All previous phases | Comprehensive test summary