Machine Learning
Logistic Regression
Azam Asilian Bidgoli
These slides include content from the work of:
Richard Zemel, Raquel Urtasun, and Sanja Fidler, University of Toronto
Overview
Classification: predicting a discrete-valued target
Binary classification: predicting a binary-valued target
Multiclass classification: predicting a discrete-valued target with more than two possible values
Examples of binary classification
predict whether a patient has a disease, given the presence or absence of various symptoms
classify e-mails as spam or non-spam
predict whether a financial transaction is fraudulent
Overview
Binary linear classification
classification: given a D-dimensional input x ∈ R^D, predict a
discrete-valued target
binary: predict a binary target t ∈ {0, 1}
• Training examples with t = 1 are called positive examples, and
training examples with t = 0 are called negative examples.
• The choice t ∈ {0, 1} or t ∈ {−1, +1} is for computational convenience.
linear: model prediction y is a linear function of x, followed by a
threshold r:
z = w^T x + b
y = 1 if z ≥ r, else y = 0
Some Simplifications
Eliminating the threshold
We can assume without loss of generality (WLOG) that the threshold r = 0:
w^T x + b ≥ r  ⟺  w^T x + (b − r) ≥ 0,
so we can simply absorb the threshold into the bias.
Eliminating the bias
Add a dummy feature x0 which always takes the value 1. The weight
w0 = b is then equivalent to a bias (same as for linear regression).
Simplified model
Receive input x ∈ R^{D+1} with x0 = 1:
z = w^T x
y = 1 if z ≥ 0, else y = 0
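As a concrete (if trivial) illustration of the simplified model, here is a minimal NumPy sketch; the weight and input values are made up for this example:

import numpy as np

def predict(w, x):
    # x includes the dummy feature x0 = 1, so w[0] plays the role of the bias
    z = w @ x
    return 1 if z >= 0 else 0

x = np.array([1.0, 0.5, -2.0])   # hypothetical input, with x0 = 1 prepended
w = np.array([0.3, 1.0, 0.7])    # hypothetical weights
print(predict(w, x))             # prints 0, since z = 0.3 + 0.5 - 1.4 = -0.6 < 0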
Examples
Let’s consider some simple examples to examine the properties of our
model
Let’s focus on minimizing the training set error, and forget about whether
our model will generalize to a test set.
Examples
NOT
x0 x1 t
1 0 1
1 1 0
• Suppose this is our training set, with the dummy feature x0 included.
• Which conditions on w0, w1 guarantee perfect classification?
When x1 = 0, need: z = w0x0 + w1x1 ≥ 0 ⟺ w0 ≥ 0
When x1 = 1, need: z = w0x0 + w1x1 < 0 ⟺ w0 + w1 < 0
• Example solution: w0 = 1, w1 = −2
• Is this the only solution?
Examples
AND
z = w0x0 + w1x1 + w2x2
x0 x1 x2 t
1  0  0  0    need: w0 < 0
1  0  1  0    need: w0 + w2 < 0
1  1  0  0    need: w0 + w1 < 0
1  1  1  1    need: w0 + w1 + w2 ≥ 0
Example solution: w0 = −1.5, w1 = 1, w2 = 1
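A quick NumPy check of both example solutions (NOT and AND), confirming they classify every training case correctly:

import numpy as np

# NOT example: w = [1, -2]
X_not = np.array([[1, 0], [1, 1]])
t_not = np.array([1, 0])
print((X_not @ np.array([1, -2]) >= 0).astype(int), t_not)       # predictions match targets

# AND example: w = [-1.5, 1, 1]
X_and = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]])
t_and = np.array([0, 0, 0, 1])
print((X_and @ np.array([-1.5, 1, 1]) >= 0).astype(int), t_and)  # predictions match targets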
The Geometric Picture
Input Space, or Data Space, for the NOT example
x0 x1 t
1 0 1
1 1 0
Training examples are points
Weights (hypotheses) w can be represented by half-spaces
H+ = {x : w^T x ≥ 0}, H− = {x : w^T x < 0}
▪ The boundaries of these half-spaces pass through the origin
The boundary is the decision boundary: {x : w^T x = 0}
▪ In 2-D it’s a line, but in higher dimensions it is a hyperplane
If the training examples can be perfectly separated by a linear
decision rule, we say the data is linearly separable.
The Geometric Picture
[Figure: data space (left) and weight space (right) for the NOT example; the weight-space constraints are w0 ≥ 0 and w0 + w1 < 0]
Weights (hypotheses) w are points
Each training example x specifies a half-space w must lie in to be
correctly classified: w^T x ≥ 0 if t = 1, and w^T x < 0 if t = 0.
For the NOT example:
x0 = 1, x1 = 0, t = 1 ⟹ (w0, w1) ∈ {w : w0 ≥ 0}
x0 = 1, x1 = 1, t = 0 ⟹ (w0, w1) ∈ {w : w0 + w1 < 0}
The region satisfying all the constraints is the feasible region; if
this region is nonempty, the problem is feasible, otherwise it is infeasible.
The Geometric Picture
The AND example requires three dimensions, including the dummy one.
To visualize data space and weight space for a 3-D example, we can look
at a 2-D slice.
The visualizations are similar.
The feasible set will always have a corner at the origin.
The Geometric Picture
Visualizations of the AND example
[Figure: data space (left) and weight space (right)]
x0 x1 x2 t
1 0 0 0
1 0 1 0
1 1 0 0
1 1 1 1
Data space:
- Slice for x0 = 1 and example solution w0 = −1.5, w1 = 1, w2 = 1
- decision boundary: w0x0 + w1x1 + w2x2 = 0 ⟹ −1.5 + x1 + x2 = 0
Weight space:
- Slice for w0 = −1.5 for the constraints:
- w0 < 0
- w0 + w2 < 0
- w0 + w1 < 0
- w0 + w1 + w2 ≥ 0
Summary — Binary Linear Classifiers
Summary: Targets t ∈ {0, 1}, inputs x ∈ R^{D+1} with x0 = 1, and the
model is defined by weights w and
z = w^T x,  y = 1 if z ≥ 0, else y = 0
How can we find good values for w?
If the training set is linearly separable, we could solve for w using
linear programming (a small sketch follows at the end of this slide)
We could also apply an iterative procedure known as the perceptron
algorithm (but this is primarily of historical interest).
If it’s not linearly separable, the problem is harder
Data is almost never linearly separable in real life.
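As mentioned above, a linearly separable training set lets us find w by linear programming. Here is a minimal, hypothetical sketch using scipy.optimize.linprog; the margin trick for the strict inequalities and the toy AND data are additions for illustration, not from the slides:

import numpy as np
from scipy.optimize import linprog

# Toy AND data with dummy feature x0 = 1
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
t = np.array([0, 0, 0, 1])

# Want w.x >= margin when t = 1 and w.x <= -margin when t = 0.
# Rewrite both as A_ub @ w <= b_ub (a pure feasibility problem, so the objective is 0).
margin = 1.0
signs = np.where(t == 1, -1.0, 1.0)
A_ub = signs[:, None] * X
b_ub = -margin * np.ones(len(t))
res = linprog(c=np.zeros(X.shape[1]), A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * X.shape[1], method="highs")
print(res.success)   # True: the data is linearly separable
print(res.x)         # one separating weight vector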
Towards Logistic Regression
First Try: 0-1 Loss
Binary linear classification with 0-1 loss:
z = w^T x,  y = 1 if z ≥ 0 else 0,  L0-1(y, t) = 0 if y = t, 1 if y ≠ t
The cost J is the misclassification rate:
J = (1/N) Σ_i L0-1(y^(i), t^(i))
Problems with 0-1 loss
To minimize the cost, we need to find a critical point.
But the gradient is zero almost everywhere!
Small changes to the weights have no effect on the loss.
Also, 0-1 loss is discontinuous at z = 0, where the gradient is undefined.
[Plot: 0-1 loss as a function of z, for t = 0]
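A small numerical illustration of the flat-gradient problem; the random toy data and the finite-difference step are arbitrary choices for this example, not from the slides:

import numpy as np

def cost_01(w, X, t):
    # misclassification rate of the thresholded linear model
    y = (X @ w >= 0).astype(int)
    return np.mean(y != t)

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])  # dummy feature + 2 random features
t = rng.integers(0, 2, size=20)
w = rng.normal(size=3)

eps = 1e-4
for j in range(3):
    w_plus = w.copy(); w_plus[j] += eps
    # finite-difference estimate of dJ/dw_j -- almost always exactly 0
    print((cost_01(w_plus, X, t) - cost_01(w, X, t)) / eps)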
Second Try: Squared Loss for Linear Regression
Choose a loss function that is easier to optimize.
How about the squared loss from linear regression?
z = w^T x,  LSE(z, t) = (1/2)(z − t)^2
Treat the binary targets as continuous values.
Make final predictions y by thresholding z at 1/2.
Problems with Squared Loss
If t = 1, the loss is larger for z = 10 than for z = 0:
LSE(10, 1) = (1/2)(10 − 1)^2 = 40.5, while LSE(0, 1) = (1/2)(0 − 1)^2 = 0.5.
Making a correct prediction with high confidence should be good,
but here it incurs a large loss.
Third Try: Logistic Activation Function
For binary targets, there is no reason to predict values outside [0, 1].
Let’s squash the predictions y into [0, 1].
The logistic function σ(z) = 1 / (1 + e^(−z)) is a sigmoid (S-shaped) function.
This results in a linear model with a logistic non-linearity:
z = w^T x,  y = σ(z),  LSE(y, t) = (1/2)(y − t)^2
σ is called an activation function.
Problems with Logistic Activation Function
Suppose that t = 1 and z is very negative (z ≪ 0).
Then the prediction y = σ(z) ≈ 0 is badly wrong.
However, the weights appear to be at a critical point: dy/dz = σ(z)(1 − σ(z)) ≈ 0,
so by the chain rule ∂L/∂wj ≈ 0 for every weight. The gradient vanishes exactly
where the model is most wrong.
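A quick numerical check of this point, with made-up input and weight values:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([1.0, 3.0, -2.0])      # hypothetical input with dummy feature
w = np.array([-5.0, -1.0, 1.0])     # hypothetical weights giving a very negative z
t = 1.0

z = w @ x                           # z = -10: very negative
y = sigmoid(z)                      # y ~ 4.5e-5: badly wrong for t = 1
grad = (y - t) * y * (1 - y) * x    # gradient of the squared loss w.r.t. w
print(z, y, grad)                   # gradient entries are ~1e-5 -- nearly zero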
Logistic Regression
Interpret y ∈ [0, 1] as the estimated probability that t = 1.
Heavily penalize the extreme mis-classification cases when t = 0, y = 1 or t = 1, y = 0.
Cross-entropy loss (aka log loss) captures this intuition:
LCE(y, t) = −t log y − (1 − t) log(1 − y)
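A small NumPy sketch of this loss; computing it from z via logaddexp (rather than from y directly) is a common trick to avoid log(0), and is an implementation detail not covered on the slides:

import numpy as np

def cross_entropy_from_y(y, t):
    # direct translation of the formula; fails numerically when y is exactly 0 or 1
    return -t * np.log(y) - (1 - t) * np.log(1 - y)

def cross_entropy_from_z(z, t):
    # numerically stable version using log(1 + e^(-z)) = logaddexp(0, -z)
    return t * np.logaddexp(0, -z) + (1 - t) * np.logaddexp(0, z)

z = np.array([-5.0, 0.0, 5.0])
t = np.array([1.0, 1.0, 0.0])
y = 1 / (1 + np.exp(-z))
print(cross_entropy_from_y(y, t))
print(cross_entropy_from_z(z, t))   # same values, but safe for extreme z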
Logistic Regression
Cross-Entropy Loss w.r.t z, assuming t = 1
Comparing Loss Functions for t = 1
Gradient Descent for Logistic Regression
How do we minimize the cost J for logistic regression? Unfortunately, there is no direct solution.
Use gradient descent, since the logistic cross-entropy loss is a convex function of w:
initialize the weights to something reasonable and repeatedly adjust them in the
direction of steepest descent,
w ← w − α ∂J/∂w,
where α is the learning rate. A standard initialization is w = 0.
Gradient of Logistic Loss
Back to logistic regression:
LCE(y, t) = −t log y − (1 − t) log(1 − y),  y = σ(z),  z = w^T x
Therefore
∂LCE/∂wj = (∂LCE/∂y)(∂y/∂z)(∂z/∂wj) = (y − t) xj
Gradient descent update for logistic regression:
wj ← wj − (α/N) Σ_i (y^(i) − t^(i)) xj^(i)
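Putting the update rule together, here is a minimal NumPy sketch of batch gradient descent for logistic regression; the toy data, learning rate, and iteration count are arbitrary choices for illustration:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
N, D = 100, 2
X = np.column_stack([np.ones(N), rng.normal(size=(N, D))])   # dummy feature x0 = 1
t = (X[:, 1] + X[:, 2] > 0).astype(float)                    # toy targets

w = np.zeros(D + 1)        # standard initialization w = 0
alpha = 0.1                # learning rate
for _ in range(1000):
    y = sigmoid(X @ w)                  # predictions in (0, 1)
    grad = X.T @ (y - t) / N            # average of (y - t) * x over the training set
    w -= alpha * grad                   # gradient descent update
print(w)
print(np.mean((y >= 0.5) == t))         # training accuracy after fitting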
Gradient Descent for Logistic Regression
Comparison of gradient descent updates:
Linear regression:  w ← w − (α/N) Σ_i (y^(i) − t^(i)) x^(i),  with y = w^T x
Logistic regression:  w ← w − (α/N) Σ_i (y^(i) − t^(i)) x^(i),  with y = σ(w^T x)
Not a coincidence! These are both examples of generalized linear models. But we won't go into further detail.
Notice: the 1/N in front of the sums comes from averaging the losses. This is why you
need a smaller learning rate when the cost is a sum of losses rather than an average.
Main Takeaways on Logistic Regression 1/2
What is the main motivation for using logistic regression?
When the data isn’t linearly separable, we cannot classify it perfectly.
Use a loss function and minimize average loss.
Why did we try 0-1 loss first? What’s the problem with it?
Natural choice for classification.
The gradient is zero almost everywhere, and there is a discontinuity at z = 0.
Why did we try squared loss next? What’s the problem with it?
Easier to optimize.
Large penalty for a correct prediction with high confidence.
Main Takeaways on Logistic Regression 2/2
Why did we try logistic activation function next? What’s the problem with it?
Prediction ∈ [0, 1].
An extreme mis-classification case appears to be at a critical point (the gradient vanishes).
Why did we try cross-entropy loss next?
Heavily penalizes extreme mis-classification.
How do we apply gradient descent to logistic regression?
Derive the update rule.
Main Takeaways on Basic Concepts
Compare and contrast KNN and Linear Classifiers.
What is a hypothesis, the hypothesis space, and the inductive bias?
A hypothesis is a function from the input space to the target space.
A hypothesis space contains a collection of hypotheses.
Define parametric and non-parametric algorithms. Give examples.
A parametric algorithm has a fixed number of parameters, independent of the
size of the training set.
Examples: linear regression, logistic regression.
A non-parametric algorithm has no fixed set of parameters; it is defined in terms
of the data.
Examples: k-nearest-neighbours, decision trees.
Linear Classifiers vs. KNN
Linear classifiers and KNN have very different decision boundaries:
[Figure: decision regions of a linear classifier (left) vs. k-nearest-neighbours (right)]