Close Range Photogrammetry - Part 1

This document outlines the topics covered in a course on close range photogrammetry. The syllabus includes cameras for close range photogrammetry, camera calibration, image coordinate measurement, adjustment of image triangulation, geometric and non-geometric project factors, accuracy and reliability, 3D modelling and visualization, and applications in fields such as civil engineering, architecture, archaeology, medicine and police work. It also covers recent developments in close range photogrammetry, including automation, laser scanning, and future trends comparing photogrammetry with laser scanning.

Close Range Photogrammetry

Deni Suwardhi
Monday, 9th September 2019
3D Modeling & Information System Division
Remote Sensing & Geographical Information Science
Research Group – FITB – ITB

Monday, September 09, 2019 Close Range Photogrammetry, Computer Vision and 3D Modeling 1
Syllabus
1) Cameras for close range photogrammetry
2) Calibration of cameras
3) Image coordinate measurement
4) Adjustment of image triangulation in close range photogrammetry
5) Geometric and non-geometric project factors
6) Accuracy and reliability
7) 3D models and visualization

Syllabus
8) Applications of close range photogrammetry in civil engineering and industry
9) Applications of close range photogrammetry in architecture and archaeology
10) Applications of close range photogrammetry in medicine and police practice
11) Automation in close range photogrammetry
12) Laser scanning
13) The future of close range photogrammetry compared with laser scanning
Overview
• Recent Developments in Close-range Photogrammetry
• Photography
– The First Part of Photogrammetry
• Mathematical Foundations of Photogrammetry,
Metrology
– The Second Part of Photogrammetry
• Definitions
• Equipment
• Mathematical Explanations
• Working
• Applications

Recent Developments in Close-range Photogrammetry

Close Range Photogrammetry (CRP)

 Photogrammetry is a measurement technique in which the 3D coordinates of points on an object are calculated from measurements made in two or more photographic images taken from different positions.

 CRP is generally used with object-to-camera distances of not more than 300 meters (984 feet).
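The two-image principle above can be made concrete with a small numerical sketch (not part of the original slides): two hypothetical pinhole cameras with assumed intrinsics observe the same object point, and its 3D coordinates are recovered by linear (DLT) triangulation.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two images.

    P1, P2 : 3x4 camera projection matrices.
    x1, x2 : (u, v) image coordinates of the same point in each image.
    Returns the 3D point in object space.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector of A
    # with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical setup: two identical cameras, the second shifted by a
# 1 m stereo base along X, both looking down the +Z axis.
K = np.array([[1000.0, 0, 500], [0, 1000.0, 500], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, -0.1, 5.0])         # point 5 m in front
x1h = P1 @ np.append(X_true, 1.0)
x1 = x1h[:2] / x1h[2]
x2h = P2 @ np.append(X_true, 1.0)
x2 = x2h[:2] / x2h[2]

X_est = triangulate(P1, P2, x1, x2)
print(np.round(X_est, 6))  # recovers the original point
```

With noisy image measurements the same system no longer has an exact solution, and the SVD step returns the least-squares intersection of the two rays.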
CRP
 Network/geometry of camera positions and orientations

Vertical Aerial Photographs

ITB Campus in Jalan Ganesa 10 at approx. 30 meters, and the same campus at approx. 200 meters.
High Oblique Aerial Photographs

CRP

It can be broadly divided into two main parts:

 Acquiring data from the object to be measured by taking the necessary photographs.
 Reducing the photographs (perspective projections) into maps or spatial coordinates (orthogonal projections).

CRP
 Dramatic advances in automatic digital image analysis have opened up new applications and made photogrammetry accessible to a broader field of users who lack specific knowledge of photogrammetry.
 Megatrends such as Industry 4.0, building
information modelling (BIM) and the digital
transformation are driving the advancement of 3D
measurement technologies, both in terms of high-
end systems and low-cost sensors.

CRP
 New technical challenges such as autonomous
driving or unmanned aerial vehicles are demanding
innovative sensor systems, but also extended maps
and models of the environment.
 Modern approaches such as structure from motion
(SfM), simultaneous localization and mapping
(SLAM) or visual odometry are now being combined
with classical photogrammetric methods.

CRP
 New technical challenges such as autonomous driving or unmanned aerial vehicles (UAVs or ‘drones’) are demanding innovative (hybrid) sensor systems. There is a huge range of image-recording devices available for close-range photogrammetry purposes (see Figure 1 for a few examples).

Monday, March 24, 2008 EE 5358 Computer Vision 13


Acquisition of Data: Cameras

Cameras can be broadly classified into two groups:

• Metric
  • Single cameras
  • Stereometric cameras
• Non-metric

Cameras
Typical sensors can be roughly classified as
follows:
 Action and fisheye cameras
 Cameras for the consumer market
 Cameras for professional applications
 Industrial cameras
 Metric cameras for explicit photogrammetric
applications
 High-speed cameras
 Panoramic cameras
 Multi-camera systems
Cameras
 This technique can provide data on vehicle dynamics or motions with accuracy equal to or better than inertial measurement units, and has been used on multiple NESC-sponsored projects. Maximum flexibility is provided in choosing the number of cameras, the types of lenses, and the placement of those cameras in accessible locations for a flight experiment or ground-test facility, without overlapping camera-view constraints.
Cameras
 Practically all modern imaging sensors are designed
based on CMOS technology.
 Their availability ranges from mass products, e.g.
smartphone cameras (usually with a rolling shutter),
to specific high-performance sensors used for special
applications such as high-speed imaging.
 Sensors with a global shutter are required for most dynamic applications, where the camera and object move relative to each other.

Metric Cameras
A photogrammetric camera that enables geometrically accurate reconstruction of the optical model of the object scene from its stereo photographs.

Single Cameras
• Total depth of field
• Photographic material
• Nominal focal length
• Format of photographic material
• Tilt range of camera axis and number of intermediate stops

Metric Cameras (contd.)

Stereometric Cameras
• Base Length
• Nominal Focal Length
• Operational Range
• Photographic Material
• Format of photographic material
• Tilt range of optical axes and
number of intermediate tilt stops

Non-metric Cameras
Cameras that have not been designed especially for photogrammetric purposes:
• A camera whose interior orientation is completely or partially unknown and frequently unstable.

Non-metric Cameras
Advantages
• General availability
• Flexibility in focusing range
• Price is considerably less than for metric cameras
• Can be hand-held and thereby oriented in any direction
Disadvantages
• Lenses are designed for high resolution at the expense of high distortion
• Instability of interior orientation (changes after every exposure)
• Lack of fiducial marks
• Absence of level bubbles and orientation provisions precludes the determination of exterior orientation before exposure
Hybrid sensor systems
 An increasing number of hybrid systems are available
in which camera sensors are combined with additional
measuring devices.
 The most popular examples are terrestrial laser
scanners equipped with one or more cameras for the
acquisition of panorama images or for recording colour
values for each point of a laser scan.

Hybrid sensor systems
 As an example, the Leica BLK 360 scanner (Fig. 2a) includes an additional thermal camera that can measure temperatures.
 The new Leica RTC 360 scanner (Fig. 2b) includes five cameras that are used for visual odometry in a SLAM approach in order to measure the trajectory and pose when the scanner is moved to the next station.
 Other examples of hybrid systems are low-cost sensors running on tablets or mobile devices that combine IMU, GNSS, time-of-flight (ToF), laser triangulation or RGB cameras for low-to-medium-accuracy 3D scanning. Fig. 2c shows a handheld tablet device of this kind.
Measurement of tie points
 Multi-image photogrammetry requires overlapping
images which are connected by corresponding points
(homologous or tie points).
 By means of targets, which may be coded to define a
certain point number, the process of finding
correspondences and approximations for orientation is
relatively easy.



Measurement of tie points
 Using natural features as tie points, certain detectors
and descriptors allow for the matching of similar
features using different criteria.
 As examples, operators like SIFT, SURF or ORB
provide robust feature detection and matching.
 However, for a reliable match it is recommended to
acquire the images with a large relative overlap, e.g.
90% from image to image.
 Matching becomes weak or even impossible if large
object areas are imaged without sufficient textures.
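The matching criteria mentioned above can be illustrated with a small sketch (not from the slides): descriptors are compared by nearest-neighbour distance with Lowe's ratio test, which rejects ambiguous correspondences. The 128-D "descriptors" here are synthetic stand-ins for SIFT-like vectors.

```python
import numpy as np

def match_descriptors(desc1, desc2, ratio=0.8):
    """Nearest-neighbour matching with Lowe's ratio test.

    desc1, desc2 : arrays of shape (n, d), one feature descriptor per
    row.  A match (i, j) is accepted only when the best distance is
    clearly smaller than the second-best, which discards features
    that look similar to several candidates.
    """
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        j, k = np.argsort(dists)[:2]
        if dists[j] < ratio * dists[k]:
            matches.append((i, j))
    return matches

# Hypothetical data: 5 random "descriptors"; the second image sees
# the same features slightly perturbed and in a different order.
rng = np.random.default_rng(0)
desc1 = rng.normal(size=(5, 128))
order = [3, 0, 4, 1, 2]
desc2 = desc1[order] + rng.normal(scale=0.01, size=(5, 128))

print(match_descriptors(desc1, desc2))
```

Real detectors (SIFT, SURF, ORB) add scale- and rotation-invariant keypoint detection on top of this matching step.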
Orientation
 The calculation of the exterior orientations of all
images (also called alignment) is a prerequisite for
subsequent 3D object reconstruction.
 Basically, the process starts with a complex procedure
for finding approximate values of all unknown
parameters by a clever combination of relative and
absolute orientations, space resections and
intersections.



Orientation
 The final optimization of all parameters is done by
bundle adjustment which minimizes the residuals of
all observations (image measurements) in one process
in order to determine the desired calibration and
orientation parameters, and the 3D coordinates of all
tie points.
 If control points are available, they are integrated to
define the final coordinate system and to compensate
for the datum defect of a photogrammetric network.
 The 3D coordinates of all measured points provide a
sparse point cloud of the object surface.
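The quantity that bundle adjustment minimizes can be sketched directly. The following is only the cost function, not a full adjustment, and the two-camera network is a hypothetical example: for every observation, the residual is the difference between the measured image point and the reprojection of the current 3D estimate.

```python
import numpy as np

def reproject(K, R, t, X):
    """Project a 3D point X (world frame) into an image with
    intrinsics K and exterior orientation R, t (world -> camera)."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

def residuals(cams, points, obs):
    """Stacked image residuals -- the quantity a bundle adjustment
    drives towards zero over all orientation, calibration and
    tie-point parameters simultaneously.

    cams   : list of (K, R, t) tuples, one per image
    points : (n, 3) array of current tie-point estimates
    obs    : {(cam_index, point_index): observed (u, v)}
    """
    return np.concatenate([
        reproject(*cams[ci], points[pi]) - uv
        for (ci, pi), uv in obs.items()
    ])

# Hypothetical two-camera network observing three tie points.
K = np.array([[800.0, 0, 400], [0, 800.0, 300], [0, 0, 1]])
R = np.eye(3)
cams = [(K, R, np.zeros(3)), (K, R, np.array([-0.5, 0.0, 0.0]))]
pts = np.array([[0.0, 0.0, 4.0], [0.3, -0.2, 5.0], [-0.4, 0.1, 6.0]])

# Simulate perfect measurements; the residuals then vanish, i.e.
# this network is already "adjusted".
obs = {(ci, pi): reproject(*cams[ci], pts[pi])
       for ci in range(2) for pi in range(3)}
print(np.abs(residuals(cams, pts, obs)).max())  # ~0
```

In practice a non-linear least-squares solver iterates over this residual vector, exploiting the sparse structure of the normal equations.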
Orientation
 Bundle adjustment is the critical part in a
photogrammetric orientation process.
 Hence, statistical quality parameters (sigma values, RMS of object points) should be analyzed carefully to provide a picture of the internal precision of the adjustment.
 However, the real accuracy must be checked by
independent reference values (see below).



3D modelling
 Digital mono plotting
 Monoscopic multi-image measurement
 Digital stereo plotting
SfM and SLAM
 Basically, the structure-from-motion approach is a
complex procedure where subsequent images with
high overlap are oriented automatically by means of
feature detection, feature matching and robust
sequential orientation.
 Based on RANSAC procedures and linear estimation models, datasets with a high number of outliers can be processed reliably.
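The robust-estimation idea behind RANSAC can be shown with a deliberately simple stand-in problem (fitting a line rather than a five-point essential matrix; all data here are synthetic): hypotheses are fitted to random minimal samples, and the one with the largest consensus set wins.

```python
import numpy as np

def ransac_line(points, n_iter=200, tol=0.1, rng=None):
    """Minimal RANSAC sketch: fit y = a*x + b to 2-D points with outliers.

    Repeatedly fits a model to a random minimal sample (2 points) and
    keeps the hypothesis supported by the most inliers -- the same
    scheme SfM pipelines use for relative orientation, only with a
    simpler model.
    """
    rng = rng or np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = points[rng.choice(len(points), 2, replace=False)]
        if x1 == x2:
            continue                      # degenerate sample
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = np.abs(points[:, 1] - (a * points[:, 0] + b)) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Final least-squares fit on the consensus set only.
    x, y = points[best_inliers].T
    a, b = np.polyfit(x, y, 1)
    return a, b, best_inliers

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 40)
pts = np.column_stack([x, 2.0 * x + 1.0])
pts[::5, 1] += rng.uniform(5, 10, size=8)   # 20% gross outliers
a, b, inl = ransac_line(pts)
print(round(a, 2), round(b, 2))  # recovers slope 2.0, intercept 1.0
```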



SfM and SLAM
 Simultaneous localization and mapping (SLAM)
approaches are often used in dynamic environments
(e.g. moving robots) to measure the route (pose) of the
sensor and the unknown environment simultaneously.
 Image-based SLAM algorithms are also called ‘visual
odometry’. Since the geometric configuration of image
sequences is often weak, additional sensors (e.g. IMU)
and Kalman filtering are included.



Dense point clouds
 After successful bundle adjustment, a dense point
cloud of the object surface can be calculated, if
necessary.
 The objective is to derive 3D coordinates for every pixel
(or in a specific resolution in object space).
 Today, the most successful approaches are based on
semi-global matching (SGM) which looks for best
matches along epipolar lines by minimizing a
particular cost function.
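As a rough illustration of the cost-minimization idea, the sketch below reduces SGM to a single left-to-right aggregation path along one epipolar line. Real SGM aggregates over 8 or 16 paths in 2D; the absolute-difference cost and the penalty values P1 and P2 here are illustrative choices, not the method's prescribed ones.

```python
import numpy as np

def sgm_scanline(left, right, max_disp, P1=0.1, P2=1.0):
    """Semi-global matching reduced to one aggregation path.

    C[x, d] is the matching cost of assigning disparity d to pixel x
    along one epipolar line; the SGM recurrence adds a small penalty
    P1 for disparity changes of +/-1 and a larger penalty P2 for
    bigger jumps, smoothing the result while allowing depth edges.
    """
    n = len(left)
    C = np.full((n, max_disp + 1), 1e9)          # invalid = huge cost
    for d in range(max_disp + 1):
        C[d:, d] = np.abs(left[d:] - right[:n - d])
    L = np.empty_like(C)
    L[0] = C[0]
    for x in range(1, n):
        prev, m = L[x - 1], L[x - 1].min()
        for d in range(max_disp + 1):
            step = [prev[d], m + P2]
            if d > 0:
                step.append(prev[d - 1] + P1)
            if d < max_disp:
                step.append(prev[d + 1] + P1)
            # Subtracting m keeps aggregated costs from growing unboundedly.
            L[x, d] = C[x, d] + min(step) - m
    return L.argmin(axis=1)

# Synthetic scanline pair: the right signal is the left one shifted
# by two pixels, so the true disparity is 2 wherever it is defined.
left = np.arange(12, dtype=float)
right = np.append(left[2:], left[:2])
print(sgm_scanline(left, right, max_disp=4))
```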



Dense point clouds
 SGM is a robust method that can interpolate
textureless areas and create surface models that
sufficiently preserve sharp edges in object space.
 Compared to the orientation process above, the
generation of dense point clouds requires much higher
computational effort.



Orthophotos
 Even in close-range photogrammetry, the generation
of (true) orthophotos has become an important
product, especially for UAV applications or in
architectural and archaeological projects.
 Since the process described above generates
orientation parameters and a dense surface model,
orthophotos can be derived directly from the acquired
images.



Orthophotos
 However, a photogrammetric point cloud usually
describes the visible surface of the object, i.e. with
vegetation or other disturbing objects included.
 Before a final true orthophoto can be produced, the
surface model may be subject to (manual) cleaning,
filtering or other types of post-processing.



3D modelling from Point Cloud
 In many applications, further processing of point
clouds is required, e.g. for the production of
architectural plans or 3D models for BIM or facility
management.
 Although a number of semi-automated software
approaches are available for the extraction of certain
elements (e.g. planes, pipes), the generation of final
products often requires manual processing.
 This holds true for any kind of point cloud, including those from laser scanning or other technologies.



3D modelling from Point Cloud
 Semantic modelling, i.e. automatic classification of
object parts, is still an ongoing research task that must
involve human knowledge of the object and the
application.
 Recent machine learning approaches demonstrate promising solutions for at least part of the modelling and interpretation process.



Accuracy and verification
 Professional use of photogrammetry usually involves
specifications for high-quality results.
 In industrial applications, the verification of the
achieved accuracy with respect to accepted guidelines
is most important.
 In most cases, standardized characteristics such as the maximum length-measurement error have to be reported, including traceability to the standard unit metre, e.g. by measurement of calibrated reference artefacts.



Accuracy and verification
 In non-industrial fields, e.g. cultural heritage or topographic surveying, comparison against independent check points is a well-established method for deriving accuracy figures.
 However, these points must not be included in the orientation process; they should be measured independently so that they also reflect subsequent processes such as dense matching.
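The accuracy figures derived from such check points are typically RMS coordinate errors. A minimal sketch (the point values below are hypothetical):

```python
import numpy as np

def checkpoint_rms(measured, reference):
    """Accuracy figures from independent check points.

    Returns the RMS coordinate error per axis and a combined 3D RMS,
    computed over points that were *not* used in the orientation.
    """
    d = np.asarray(measured, float) - np.asarray(reference, float)
    per_axis = np.sqrt((d ** 2).mean(axis=0))
    rms_3d = np.sqrt((d ** 2).sum(axis=1).mean())
    return per_axis, rms_3d

# Hypothetical check-point coordinates and residuals, in metres.
reference = np.array([[10.0, 20.0, 5.0], [15.0, 25.0, 6.0]])
measured = reference + np.array([[0.03, 0.0, 0.04], [-0.03, 0.0, -0.04]])
per_axis, rms_3d = checkpoint_rms(measured, reference)
print(per_axis, rms_3d)  # per-axis RMS 0.03 / 0 / 0.04 m, 3D RMS 0.05 m
```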



Applications
 Industry 4.0 is characterized by a number of significant changes in production, e.g. a higher degree of individualized products, shorter life cycles and more flexible manufacturing lines.
Example: Robot-based industrial inspection
 A new direction of online systems uses robots to drive
a surface sensor to specific areas of an object. Since the
mechanical positioning accuracy of a robot is not high
enough to provide the exterior orientation (pose) of
the scanning device directly, a camera system observes
the actual position of the sensor with respect to a pre-
calibrated field of targets.
 This concept provides high flexibility to adapt to specific measurement conditions. Hence, it allows the integration of optical 3D measurement devices into flexible production lines.
Example: Robot-based industrial inspection
 Figure 3a shows a system in which a set of ceiling-
mounted cameras measures the 6DOF pose of a fringe
projection system. Figure 3b illustrates a robot-based
system in which a camera is attached to the surface
sensor which permanently measures a set of reference
targets.

Example: UAV photogrammetry
 UAV-based photogrammetry has become a standard
measuring technology now that remotely piloted drones
with high capabilities for autonomous flying are available.
 With the relevant flight permissions, UAVs can be used as
imaging platforms for a wide area of applications, e.g. road
mapping, observation of construction sites, archaeological
surveys, topographic mapping, environmental monitoring,
etc.
 In most cases, recorded images are processed by SfM
software that automatically generates point clouds or
orthophotos.

Example: UAV photogrammetry
 Since many users have only limited photogrammetry
skills, they do not always properly understand the
impact of image configurations, distribution of control
points or camera calibration issues.
 Consequently, the quality of results not only varies
from project to project, but can also vary within a
certain area of measurement within a project.

Example: UAV photogrammetry
 The professional use of UAV photogrammetry usually
requires a high-quality camera/lens system, camera
stabilization, dense image overlaps, sufficient intersection
angles and a suitable distribution of control points, just as
with aerial photogrammetry.
 Camera calibration can be particularly difficult when
using non-professional cameras and/or weak flight
configurations.

Example: UAV photogrammetry
 Although the drone was guided manually and the camera
is not designed for measurements, configuring flights at
different heights and the additional circular arrangement
of images enabled the simultaneous calibration of the
camera.
 Together with additional terrestrial images taken with a 24 MP DSLR camera, a consistent and accurate 3D model with an average accuracy of 0.6 pixels (GSD between 5 and 10 mm) could be generated using an SfM approach (RealityCapture).

Example: UAV photogrammetry
 Figure 4 shows an example of the use of an amateur
drone (DJI Mavic Pro) for recording the roof areas of an
ancient church.

Example: Underwater weld measurement
 In this example of measuring welding seams on underwater steel constructions, the objective is to measure the weld surface with an accuracy and resolution of about 30 µm at a distance of about 50 mm.
 Two variants of a prototype system have been
developed.

Example: Underwater weld measurement
 One solution consists of a laser line projector and two
cameras that observe the projected laser line for stereo
matching.
 The second version uses one or two cameras that are
moved across a reference field with control points in
order to derive the exterior orientations. The surface
within overlapping areas is reconstructed by
photogrammetric image matching.

Conclusion
 Recent trends and developments from photogrammetry
and computer vision indicate a continuous change of
classical measurement technologies which could be
classified as a paradigm shift.
 A wide area of new applications is now addressed,
leading to new prospects and challenges.
 However, whilst new automated imaging technologies increasingly cover dynamic scene recordings, improper use of these methods by users with limited skills may lead to unreliable or unexpected results.
 Hence, appropriate teaching concepts for students, as well as lifelong learning opportunities for practitioners, are urgently required.
Data Reduction

• Analog 1900 to 1960

• Analytical 1960 onwards

• Semi-analytical

• Digital 1980 onwards



Photography
– The First Part of Photogrammetry

Photography
 The three main considerations for good photography
are:
1) Field of View
2) Focusing
a) Depth of Field
3) Exposure
a) Aperture
b) Shutter Speed
c) ISO

Field of View
 The camera’s field of view defines how much it sees and is a
function of the focal length of the lens and the size (often
called the format) of the digital sensor. For a given lens, a
larger format sensor has a larger field of view.
 In general, there is a tradeoff between the field of view of a lens and accuracy. Although wider-angle lenses need less room around the object, they also tend to be less accurate.
Field of View
 Similarly, for a given sensor size, a shorter focal length lens has a wider field of view. The relationship between format size, lens focal length and field of view is shown in Figure 2.
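The format/focal-length relationship follows directly from the pinhole geometry: the angular field of view across one sensor dimension is 2·atan(sensor size / (2·focal length)). A small sketch with two common (illustrative) sensor widths:

```python
import math

def field_of_view_deg(sensor_size_mm, focal_length_mm):
    """Angular field of view across one sensor dimension."""
    return math.degrees(2 * math.atan(sensor_size_mm / (2 * focal_length_mm)))

# Horizontal FOV of a full-frame (36 mm wide) sensor behind a 50 mm lens:
print(round(field_of_view_deg(36, 50), 1))   # ~39.6 degrees
# The same lens on a smaller APS-C sensor (about 23.5 mm wide) sees less:
print(round(field_of_view_deg(23.5, 50), 1))
```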
Focusing
 One consideration for normal photography is, of
course, focusing the lens so the image is sharp. The
range of acceptable sharpness is called the depth of
focus.
 The depth of focus of a lens is a function of many
factors, including: the focal length of the lens, the
format size, the distance from the camera to the
object, the size of the object, and the f-number of the
lens.

Focusing
 As you can appreciate from all the factors listed above, the depth of focus can be a complex function. Figure 3 demonstrates the relationship between f-number, focus distance and the resultant depth of focus.

What is depth of field (DOF)?
 DOF refers to how much of a photograph appears to be in focus. If the main subject is in focus but the foreground or background is blurred, the photo is said to have a shallow DOF. If most or all of the photo is in focus, including the foreground and background, the photo is said to have a deep DOF. DOF is determined by the distance between the camera and the subject, as well as the aperture (aka f-stop) and focal length of the lens.
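The dependence on focus distance, aperture and focal length can be quantified with the standard thin-lens DOF formulas; the lens and circle-of-confusion values below are illustrative choices, not values from the slides.

```python
def depth_of_field_mm(f, N, c, s):
    """Near/far limits of acceptable sharpness (thin-lens formulas).

    f : focal length in mm        N : f-number (aperture)
    c : circle of confusion in mm s : focus distance in mm
    """
    H = f * f / (N * c) + f                      # hyperfocal distance
    near = s * (H - f) / (H + s - 2 * f)
    far = s * (H - f) / (H - s) if s < H else float("inf")
    return near, far

# 50 mm lens at f/8 focused at 5 m, full-frame circle of confusion 0.03 mm:
near8, far8 = depth_of_field_mm(50, 8, 0.03, 5000)
print(near8, far8)   # roughly 3.4 m to 9.5 m in front of the camera
# Stopping down to f/16 deepens the zone of sharpness:
near16, far16 = depth_of_field_mm(50, 16, 0.03, 5000)
print(near16, far16)
```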

 Distance: Moving the camera closer to your subject can


cause blurring in the background and foreground of
Monday, September 09, 2019
Close Range Photogrammetry, Computer Vision and 3D Modeling 67
Image Exposure
 For photogrammetry purposes, it is desirable to set the
targets bright and the background dim. When retro-
reflective targeting is used, the target and background
exposures are almost completely independent of each
other.
 The target exposure is completely determined by the flash
power while the background exposure is determined by the
ambient illumination.
 The amount of background exposure is controlled by the
shutter time and f-number. Shutter speed, aperture, and
ISO all work together to control the amount of light that
enters the camera and influences exposure.
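The shutter/aperture trade-off ("reciprocity") can be expressed with the exposure value, EV = log2(N²/t), where N is the f-number and t the shutter time in seconds; a higher ISO then shifts the EV needed for the same scene. A brief sketch:

```python
import math

def exposure_value(f_number, shutter_s):
    """Exposure value: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_s)

# Opening the aperture by exactly one stop (dividing N by sqrt(2))
# while halving the shutter time leaves the exposure unchanged:
ev1 = exposure_value(8.0, 1 / 125)
ev2 = exposure_value(8.0 / math.sqrt(2), 1 / 250)
print(round(ev1, 2), round(ev2, 2))  # identical EV values
```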
Image Exposure
 Figure 4 demonstrates the relationship between
shutter speed and the f-number.

Aperture
 Aperture is the diaphragm opening in the lens through
which light passes to the image sensor.
 Cameras that have the option to adjust the aperture
provide control over the depth of field in a
photograph.
 When a wide aperture (indicated by a lower f-stop
number) is used, more light is passed to the image
sensor; this creates a shallow depth of field.
 By using a narrow aperture (indicated by a higher f-stop number), less light passes to the image sensor, creating a deep depth of field.
