Chapter Three
Scope: Narrow (core stages of GPU rendering) vs. Broad (includes asset management, draw calls)
Example: Narrow (Vertex → Fragment → Framebuffer) vs. Broad (Load model → Set camera → Submit → Draw)
Discussion Point 4
How do you think the rendering process impacts the realism and performance of graphics applications?
The rendering process significantly affects both the realism and performance of graphics applications.

REALISM:
• Lighting and Shading: Advanced techniques like global illumination, ray tracing, and ambient occlusion help simulate how light interacts with objects, creating realistic shadows, reflections, and refractions. These techniques mimic how light behaves in the real world, enhancing visual fidelity.
• Textures and Materials: The application of high-resolution textures and complex materials (e.g., using shaders to simulate skin, water, or metal) can contribute to a lifelike appearance. The realism of an object can depend on how accurately these materials reflect light and how they are affected by environmental conditions.
• Depth of Field and Motion Blur: Simulating camera effects like depth of field and motion blur can make the scene feel more immersive and realistic by emulating how cameras and human eyes perceive the world.

PERFORMANCE:
• Complexity of Algorithms: Advanced rendering techniques that produce realism often require more computational resources. For example, ray tracing, which calculates the path of light rays, is much more computationally expensive than rasterization. This can slow down performance, especially in real-time applications such as video games.
• Level of Detail (LOD): Optimizing performance often involves techniques like LOD, where less detailed models are rendered when objects are far away from the camera. This reduces the number of computations, improving performance while maintaining visual quality for close-up objects (see the sketch after this list).
• Parallel Processing and Hardware Acceleration: The use of GPU acceleration, which allows parallel processing of multiple rendering tasks, can help balance the tradeoff between performance and realism. However, not all devices have access to such hardware, limiting performance in some cases.
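The LOD idea can be made concrete with a small sketch. The C function below is not taken from any particular engine; the select_lod name, the LodLevel values, and the distance thresholds are all assumptions chosen for illustration. It simply picks which version of a mesh to draw from the distance between the camera and the object.

#include <math.h>

/* Hypothetical LOD selection: choose a mesh resolution from the distance
   between the camera and the object (thresholds chosen arbitrarily). */
typedef enum { LOD_HIGH, LOD_MEDIUM, LOD_LOW } LodLevel;

LodLevel select_lod(float obj_x, float obj_y, float obj_z,
                    float cam_x, float cam_y, float cam_z)
{
    float dx = obj_x - cam_x, dy = obj_y - cam_y, dz = obj_z - cam_z;
    float dist = sqrtf(dx * dx + dy * dy + dz * dz);

    if (dist < 10.0f)  return LOD_HIGH;    /* close: full-detail mesh    */
    if (dist < 50.0f)  return LOD_MEDIUM;  /* mid-range: simplified mesh */
    return LOD_LOW;                        /* far away: coarse mesh      */
}

A renderer would call such a function once per object per frame and submit the corresponding mesh, trading triangle count for performance exactly as described above.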
Discussion Point 5
Why is rendering considered both an artistic and computational process?
Rendering is considered both artistic and computational because it blends creative expression with technical execution.
A reference model in computer graphics defines a standardized framework for understanding and implementing the rendering pipeline. It outlines the main stages and data flow involved in transforming a 3D scene into a 2D image.

Importance in rendering:
• Clarifies Process Flow: The reference model acts as a blueprint, helping to break down the rendering process into manageable and logical stages.
• Promotes Modularity: Each stage can be independently developed and optimized, supporting modular software and hardware design.
• Ensures Consistency: By following a well-defined reference model, different systems (e.g., graphics APIs like OpenGL or Direct3D) produce consistent rendering behavior.
• Enables Optimization: Identifying where performance bottlenecks occur (e.g., geometry processing or pixel shading) becomes easier when the stages are clearly defined.
How does the reference model help both software developers and hardware designers in implementing rendering systems?

For Software Developers:
• Guides Implementation: It provides a roadmap for building rendering engines and graphics applications, with clearly defined input-output relations at each stage.
• API Design and Usage: Libraries and APIs (e.g., OpenGL, Vulkan) are structured based on the reference model, helping developers to use these tools effectively.
• Debugging and Optimization: Understanding the model helps in isolating bugs or performance issues in specific pipeline stages (e.g., vertex transformation, rasterization).
• Shader Programming: The reference model defines where shaders (vertex, fragment/pixel, geometry) operate, enabling precise control over rendering behavior.

For Hardware Designers:
• Architecture Planning: Helps design GPUs that mirror the logical structure of the rendering pipeline, dedicating specific units to vertex processing, rasterization, pixel shading, etc.
• Parallelization Opportunities: Clarifies independent operations that can be parallelized (e.g., per-vertex or per-pixel calculations), which is crucial for GPU performance.
• Resource Allocation: Informs decisions on memory usage, register allocation, and processing power distribution across pipeline stages.
Rendering Process vs. Reference Model

Feature             Rendering Process                           Reference Model
Scope               Implementation-specific                     Specification-defined (e.g., OpenGL spec)
Abstraction Level   Low-level implementation                    High-level conceptual
Purpose             Execute rendering tasks                     Understand rendering logic
API Dependency      API & GPU specific                          None (theoretical)
Customization       Defines "how it happens" (e.g., shaders)    Describes "what happens"
Layer               Role
Application         Sends draw commands and data, and handles application logic
OpenGL API          Interface to access rendering functionality
OpenGL Driver       Converts API calls into GPU instructions
GPU (Hardware)      Executes rendering tasks
Graphics Pipeline   Processes geometry into pixels
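As a rough illustration of how these layers cooperate, every call in the snippet below is issued by the application layer, passes through the OpenGL API and driver, and is ultimately executed by the GPU. It assumes a window and OpenGL context already exist (for example one created with GLUT, as shown later in this chapter).

#include <GL/gl.h>

/* Application layer: issues commands through the OpenGL API.
   The driver translates each call into GPU instructions, and the
   graphics pipeline turns the submitted geometry into pixels. */
void draw_triangle(void)
{
    glClear(GL_COLOR_BUFFER_BIT);      /* API call -> driver -> GPU      */
    glBegin(GL_TRIANGLES);             /* begin immediate-mode geometry  */
    glVertex2f(-0.5f, -0.5f);          /* vertices enter the pipeline    */
    glVertex2f( 0.5f, -0.5f);
    glVertex2f( 0.0f,  0.5f);
    glEnd();
    glFlush();                         /* push buffered commands to GPU  */
}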
OpenGL Architecture

Polynomial Evaluator: This stage is responsible for evaluating mathematical functions like curves and surfaces using polynomial equations. It processes data that represents curves or surfaces in an efficient way before passing it to the next stage.
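A minimal sketch of this stage in practice: OpenGL's evaluator functions glMap1f and glEvalCoord1f ask the polynomial evaluator to generate points along a cubic Bezier curve. The control-point values and the number of sampled points below are arbitrary example choices, not prescribed ones.

#include <GL/gl.h>

/* Four control points for a cubic Bezier curve (example values only). */
static GLfloat ctrlpoints[4][3] = {
    { -4.0f, -4.0f, 0.0f }, { -2.0f,  4.0f, 0.0f },
    {  2.0f, -4.0f, 0.0f }, {  4.0f,  4.0f, 0.0f }
};

void draw_bezier_curve(void)
{
    /* Define a 1D evaluator: parameter t in [0, 1], 3 floats per
       control point (stride), 4 control points (order). */
    glMap1f(GL_MAP1_VERTEX_3, 0.0f, 1.0f, 3, 4, &ctrlpoints[0][0]);
    glEnable(GL_MAP1_VERTEX_3);

    /* The polynomial evaluator turns each t into a vertex on the curve. */
    glBegin(GL_LINE_STRIP);
    for (int i = 0; i <= 30; i++)
        glEvalCoord1f((GLfloat) i / 30.0f);
    glEnd();
}

Calling draw_bezier_curve() from a display callback draws the evaluated curve as a line strip made of 31 sampled points.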
OpenGL 4.x: advanced GPU and compute features (e.g., tessellation, compute shaders).
Why? When modifying an object, it makes the most sense to do so in its local space, while operations that depend on the positions of other objects are most naturally expressed in world coordinates, and so on.
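A small fixed-function OpenGL sketch of this idea: the chair geometry is assumed to be emitted in its own local space by a hypothetical draw_chair_local() helper, the translation places the chair in world coordinates, and the rotation and scale act about the chair's own origin (the specific numbers are arbitrary).

#include <GL/gl.h>

/* Hypothetical helper assumed to emit the chair's vertices in its own
   local space, centered at the local origin (0, 0, 0). */
void draw_chair_local(void);

void place_chair(void)
{
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();

    /* World-space operation: position the chair relative to the scene. */
    glTranslatef(50.0f, 0.0f, 20.0f);

    /* Local-space operations: rotate and scale about the chair's own origin
       (applied to the vertices before the translation takes effect). */
    glRotatef(45.0f, 0.0f, 1.0f, 0.0f);
    glScalef(2.0f, 2.0f, 2.0f);

    draw_chair_local();
    glPopMatrix();
}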
Real World Analogy
Object/Local Coordinate System (OCS)
  Description: Defines the local space of an object, independent of other objects in the scene. Position and orientation are relative to the object's own origin.
  **Position**: Relative to the object's local origin.
  **Transformation**: Only affects the object.
  **Usage**: Modeling individual objects.
  Example: A chair has its own local coordinates, where the seat might be centered at (0, 0, 0). Any modifications, like rotation or scaling, are applied locally.

World Coordinate System (WCS)
  Description: Represents the entire 3D scene as seen from a global perspective. Objects are placed and transformed relative to a global origin (0, 0, 0).
  **Position**: Relative to the global origin.
  **Transformation**: Applies to all objects in the world.
  **Usage**: Placing objects in a scene.
  Example: A city model where buildings, cars, and roads are positioned based on the overall scene, such as placing a building at (50, 20, 0) relative to the global origin.

View/Camera Coordinate System (VCS)
  Description: The space relative to the camera or viewer. Defines how objects are transformed based on the camera's position and orientation.
  **Position**: Objects are transformed based on camera placement.
  **Transformation**: Camera-centric.
  **Usage**: Rendering scenes from the camera's perspective.
  Example: A first-person camera view in a game, where the camera is at the origin and all objects are viewed from this point, facing along the negative Z-axis.

Clip Coordinate System
  Description: The system after applying projection transformations. Coordinates are in a range before being clipped by the viewing frustum.
  **Position**: Affected by projection (perspective or orthographic).
  **Transformation**: Pre-clipping, defines visible regions.
  **Usage**: Clipping unnecessary parts.
  Example: A scene where only objects within the view frustum are rendered, and objects outside this range are clipped off and not drawn.

Normalized Device Coordinates (NDC)
  Description: A normalized space where coordinates are in a range of [-1, 1] for all axes. Helps in standardizing object placement for rendering regardless of device specifications.
  **Position**: Normalized between [-1, 1].
  **Transformation**: Post-projection.
  **Usage**: Maps visible parts to a standardized unit cube for rendering.
  Example: After projection, an object might have coordinates (0.3, 0.2, 0.8) in NDC, meaning it is within the visible range of the screen and ready for rasterization.

Window/Screen Coordinate System
  Description: The final coordinate system used for rendering. Coordinates are mapped to pixel positions on the physical screen, such as in a 1920x1080 resolution.
  **Position**: Mapped to pixel space.
  **Transformation**: From NDC to pixel coordinates.
  **Usage**: Rendering on a specific display or screen resolution.
  Example: An object at (0, 0) in NDC is mapped to (960, 540) in screen coordinates, placing it at the center of a 1920x1080 display.

Eye Space or Camera Space
  Description: A space where all coordinates are transformed relative to the camera's position. Used in lighting and shading calculations.
  **Position**: Relative to the camera's origin.
  **Transformation**: Used for lighting/shading.
  **Usage**: Calculations of lighting effects like specular or diffuse.
  Example: In a 3D scene, lighting calculations are done in eye space to determine how light reflects based on the object's position relative to the camera (eye).

Texture Coordinate System
  Description: A 2D system used to map textures to objects. Coordinates are typically expressed in terms of **u** and **v**.
  **Position**: Relative to the texture.
  **Transformation**: Applied to textures before rendering.
  **Usage**: Texture mapping on objects (2D to 3D).
  Example: A 512x512 texture is mapped to a cube. The texture coordinates (0, 0) might represent the bottom-left corner, and (1, 1) the top-right corner of the texture.
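The clip, NDC, and window coordinate systems above are tied together by simple arithmetic. The sketch below is not a library function; the clip_to_window name and the Pixel struct are illustration choices. It applies the perspective divide and then the standard viewport mapping, using the 1920x1080 window from the example above.

/* Map a clip-space point to window (pixel) coordinates:
   clip -> NDC by perspective divide, then NDC [-1, 1] -> pixels. */
typedef struct { float x, y; } Pixel;

Pixel clip_to_window(float xc, float yc, float zc, float wc,
                     int width, int height)
{
    /* Perspective divide: clip coordinates -> normalized device coordinates */
    float x_ndc = xc / wc;
    float y_ndc = yc / wc;
    (void) zc;                      /* depth would map to the depth buffer */

    /* Viewport transform: NDC [-1, 1] -> [0, width] x [0, height] */
    Pixel p;
    p.x = (x_ndc + 1.0f) * 0.5f * (float) width;
    p.y = (y_ndc + 1.0f) * 0.5f * (float) height;
    return p;
}

/* Example: NDC (0, 0), i.e. clip (0, 0, z, 1), maps to (960, 540) on a
   1920x1080 window, which is the center of the screen. */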
Coordinate Systems and Transformations
Steps in Forming an Image
1. Specify geometry (world coordinates)
2. Specify camera (camera coordinates)
3. Project (window coordinates)
4. Map to viewport (screen coordinates)
Each step uses transformations, and every transformation is equivalent to a change in coordinate systems.
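In legacy OpenGL with GLU, these four steps map roughly onto the calls below. The setup_view name, the field of view, the camera position, and the clipping distances are arbitrary example values, not prescribed ones.

#include <GL/gl.h>
#include <GL/glu.h>

void setup_view(int window_width, int window_height)
{
    /* Step 4. Map to viewport: NDC -> screen (window) coordinates */
    glViewport(0, 0, window_width, window_height);

    /* Step 3. Project: camera coordinates -> clip coordinates */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, (double) window_width / window_height, 0.1, 100.0);

    /* Step 2. Specify camera: world -> camera (eye) coordinates */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, 0.0, 5.0,   /* eye position  */
              0.0, 0.0, 0.0,   /* look-at point */
              0.0, 1.0, 0.0);  /* up vector     */

    /* Step 1. Specify geometry in world coordinates after this point,
       e.g. with glBegin()/glVertex3f() or vertex arrays. */
}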
Synthetic Camera
Application Structure
• Configure and open window
• Initialize OpenGL state
• Register input callback functions (render, resize)

Fragment from the end of the display() callback:

glBegin(GL_TRIANGLES);
glVertex2f(-0.5f, -0.5f);
glVertex2f(0.5f, -0.5f);
glVertex2f(0.0f, 0.5f);
glEnd();
glFlush(); // Render the scene
}

Fragment from the end of main():

glutDisplayFunc(display);
glutMainLoop();
return 0;
}
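Putting the fragments above back together, a complete minimal program might look like the sketch below. The window size and title are assumptions; single buffering with glFlush() matches the fragment shown above.

#include <GL/glut.h>

/* Display callback: draws a single triangle. */
void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
    glVertex2f(-0.5f, -0.5f);
    glVertex2f( 0.5f, -0.5f);
    glVertex2f( 0.0f,  0.5f);
    glEnd();
    glFlush();                                    // Render the scene
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);                        // Configure GLUT
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);  // Single-buffered RGB window
    glutInitWindowSize(500, 500);                 // Assumed window size
    glutCreateWindow("OpenGL Triangle");          // Assumed window title
    glutDisplayFunc(display);                     // Register render callback
    glutMainLoop();                               // Enter the event loop
    return 0;
}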
Code Description
Note: Please create a folder named “GL” inside the include folder if
it does not already exist and copy glut.h to this folder.
2. Copy glut.lib, glut32.lib
SysWOW64 is a folder that is present exclusively on 64-bit operating systems. It’s located
under C:\Windows, and it contains the 32-bit file components and resources that the
operating system needs.
When we compare the SysWOW64 and System32 folders, they have almost the same structure. That is because SysWOW64 is essentially the 32-bit counterpart of System32 on a 64-bit operating system.
Copy glut32.dll:
From the same bin folder, copy glut32.dll to:
A .dll file is a dynamic link library, which is a collection of code and data that
other programs can use to perform tasks.
Configure Visual Studio Project
Create a New Project:
End of Chapter 3