Add inverse gamma distribution and fix sign bug in `PowerTransform` (#104501)
Conversation
🔗 Helpful links: see artifacts and rendered test results at hud.pytorch.org/pr/104501.
Note: links to docs will display an error until the docs builds have completed. ✅ No failures as of commit 655b449 with merge base 3db0095. (This comment was automatically generated by Dr. CI and updates every 15 minutes.)
Looks great! I have only one comment re: the `.__init__()` method.
Thanks for your patience!
torch/distributions/inverse_gamma.py (outdated diff)
I think we'll want something like

```python
neg_one = -base_dist.rate.new_ones(())
PowerTransform(neg_one)
```

so that if the default tensor type differs from the `concentration` and `rate` passed in, we'll still get a valid `PowerTransform`. Does that sound right?
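A minimal sketch of why this suggestion matters, assuming a `float64` base distribution while the default tensor type remains `float32`: building the exponent with `new_ones` makes it inherit the dtype (and device) of `rate` instead of the global default.

```python
import torch
from torch.distributions import Gamma
from torch.distributions.transforms import PowerTransform

# Base distribution whose parameters deliberately differ from the
# default tensor type (float32) to illustrate the dtype mismatch.
base_dist = Gamma(
    torch.tensor(0.5, dtype=torch.float64),
    torch.tensor(1.0, dtype=torch.float64),
)

# new_ones(()) creates a scalar with the same dtype/device as rate,
# so the exponent stays consistent with the base distribution.
neg_one = -base_dist.rate.new_ones(())
transform = PowerTransform(neg_one)

assert transform.exponent.dtype == torch.float64
assert float(transform.exponent) == -1.0
```

By contrast, `PowerTransform(torch.tensor(-1.0))` would pick up the default `float32` dtype and live on the default device, which can cause dtype/device mismatches inside the transformed distribution.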
Force-pushed from 2b93766 to 3a672cd.
LGTM
@pytorchbot merge this please
❌ 🤖 pytorchbot command failed: Try
@pytorchbot merge |
Merge failed. Reason: Approval needed from one of the following:
Force-pushed from 3a672cd to 528c606.
I think the test failures are unrelated in this case.
Force-pushed from 528c606 to a746098.
Force-pushed from a746098 to 3c6a0b2.
@pytorchbot merge

Merge failed. Reason: This PR needs a label. If not, please add the appropriate label. To add a label, you can comment to pytorchbot. For more information, see the details for the Dev Infra team. Raised by workflow job.
@pytorchbot merge

Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Thank you, @ezyang!
This PR comprises a few small contributions:

1. `PowerTransform` returned a sign of `+1` irrespective of the exponent. However, it should return the sign of the exponent because the gradient has the same sign as the exponent. That issue has been fixed.
2. Added tests to catch errors akin to 1. in the future.
3. Added an `InverseGamma` distribution as a `TransformedDistribution` with `PowerTransform(-1)` and a `Gamma` base distribution. The `InverseGamma` is often used as a prior for the length scale of Gaussian processes to aggressively suppress short length scales (see [here](https://betanalpha.github.io/assets/case_studies/gaussian_processes.html#323_Informative_Prior_Model) for a discussion).

Note: I added a `positive` constraint for the support of the inverse gamma distribution because `PowerTransform(-1)` can fail for `nonnegative` constraints if the random variable is zero.

```python
>>> torch.distributions.InverseGamma(0.5, 1.0).log_prob(torch.zeros(1))
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-8-758aa22deacd> in <module>
----> 1 torch.distributions.InverseGamma(0.5, 1.0).log_prob(torch.zeros(1))

~/git/pytorch/torch/distributions/transformed_distribution.py in log_prob(self, value)
    140         """
    141         if self._validate_args:
--> 142             self._validate_sample(value)
    143         event_dim = len(self.event_shape)
    144         log_prob = 0.0

~/git/pytorch/torch/distributions/distribution.py in _validate_sample(self, value)
    298         valid = support.check(value)
    299         if not valid.all():
--> 300             raise ValueError(
    301                 "Expected value argument "
    302                 f"({type(value).__name__} of shape {tuple(value.shape)}) "

ValueError: Expected value argument (Tensor of shape (1,)) to be within the support (GreaterThan(lower_bound=0.0)) of the distribution InverseGamma(), but found invalid values: tensor([0.])
```

This differs from the scipy implementation:

```python
>>> scipy.stats.invgamma(0.5).pdf(0)
0.0
```

Pull Request resolved: pytorch#104501
Approved by: https://github.com/fritzo, https://github.com/ezyang
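The sign fix in item 1 can be sanity-checked without torch: for `x > 0` the derivative of `x ** e` is `e * x ** (e - 1)`, whose sign equals the sign of the exponent `e`. A stdlib-only sketch using a central finite difference:

```python
import math

def power_grad(x, e, h=1e-6):
    # Central finite-difference estimate of d(x**e)/dx at x.
    return ((x + h) ** e - (x - h) ** e) / (2 * h)

# For positive x, the gradient's sign matches the exponent's sign,
# which is what PowerTransform.sign should report after the fix.
for e in (-2.0, -1.0, 0.5, 3.0):
    grad = power_grad(2.0, e)
    assert math.copysign(1.0, grad) == math.copysign(1.0, e)
```

This is exactly the property the old implementation violated by returning `+1` unconditionally, e.g. for `PowerTransform(-1)` the map `x -> 1/x` is strictly decreasing on the positive reals.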
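The change of variables behind item 3 can also be verified with a stdlib-only sketch: the inverse-gamma log density equals the `Gamma(concentration, rate)` log density evaluated at `1/x` plus the log-absolute-Jacobian of `PowerTransform(-1)`, which is `-2 * log(x)`.

```python
import math

def gamma_log_prob(y, conc, rate):
    # Gamma(concentration, rate) log density.
    return (conc * math.log(rate) - math.lgamma(conc)
            + (conc - 1.0) * math.log(y) - rate * y)

def inverse_gamma_log_prob(x, conc, rate):
    # Closed-form inverse-gamma log density:
    # a*log(b) - lgamma(a) - (a + 1)*log(x) - b/x
    return (conc * math.log(rate) - math.lgamma(conc)
            - (conc + 1.0) * math.log(x) - rate / x)

def via_transform(x, conc, rate):
    # TransformedDistribution view: Gamma log density at 1/x plus
    # log|d(1/x)/dx| = -2*log(x), the PowerTransform(-1) Jacobian term.
    return gamma_log_prob(1.0 / x, conc, rate) - 2.0 * math.log(x)

for x in (0.1, 0.5, 2.0):
    assert abs(inverse_gamma_log_prob(x, 0.5, 1.0)
               - via_transform(x, 0.5, 1.0)) < 1e-12
```

The `b/x` term also shows why `x = 0` is excluded: the transform `1/x` diverges there, consistent with the `positive` support constraint, even though the density itself tends to zero (the value scipy reports for the pdf at zero).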
cc @fritzo @neerajprad @alicanb @nikitaved @lezcano