ENH: Add proximal operator for second kind of KL-divergence. #561
Conversation
if g is not None and g not in space:
    raise TypeError('{} is not an element of {}'.format(g, space))


class ProximalCConjKL(Operator):
This name should not be the same as in the other implementation (copy-paste error).
Should be added to the slow tests as well. We have tests for proximals there.
The functional is not well-defined without a prior g. Hence, if g is
omitted this will be interpreted as if g is equal to the one-element.

Write something about similarities and differences between the two
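The convention described in the docstring above (an omitted `g` is interpreted as the one-element) can be sketched as follows. This is a hypothetical illustration with plain numpy arrays, not ODL's actual factory code; the function name and signature are made up for the example.

```python
import numpy as np

def proximal_cconj_kl_factory(x, g=None):
    """Hypothetical sketch of the prior-defaulting convention.

    If the prior ``g`` is omitted, it is interpreted as the one-element
    of the space, since the functional is not well defined without it.
    """
    if g is None:
        g = np.ones_like(x)  # "one-element" stand-in for a numpy array
    elif g.shape != x.shape:
        raise TypeError('{!r} is not an element of the space of {!r}'
                        .format(g, x))
    return g  # a real factory would go on to build the proximal with this g

x = np.array([1.0, 2.0, 3.0])
print(proximal_cconj_kl_factory(x))  # -> [1. 1. 1.]
```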
Yes, and also make sure to add a See Also part.
Updated according to comments. Left to do
EDIT: regarding the naming suggestion, see http://ieeexplore.ieee.org/document/1056144/?arnumber=1056144&tag=1
The commit also adds tests for it.
KL first version:
- update doc
- change: no data implicitly means the one element (otherwise not well defined)
- add to slow-test suite for proximal operators

KL second version:
- move larger test to slow-test suite
- add to slow-test suite for proximal operators
- update doc
- minor performance improvements
Force-pushed from be46dc9 to 8ee1eb2
The naming issue remains: what to call it? And what to call the other one? I vote for calling this one … After naming, is this one ok to merge?
"""

# TODO: Update Notes-section in doc above.
I guess this should be done.
prox_val = prox(x)

# Explicit computation:
x_verify = x - lam * lambertw(1.0 / lam * sigma * g * np.exp(x / lam))
x_verify = x - lam * lambertw(sigma / lam * g * np.exp(x / lam))
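The suggested form can be checked numerically: the prox of the convex conjugate of the lam-scaled KL cross entropy satisfies the first-order optimality condition sigma * g * exp(u / lam) + u - x = 0. A minimal sketch using scipy's Lambert W function; the variable names follow the diff, but the values are arbitrary test data, and this is not the actual test code from the PR.

```python
import numpy as np
from scipy.special import lambertw

# Variables named as in the diff; values are arbitrary positive test data.
x = np.array([0.5, 1.0, 2.0])
g = np.array([1.0, 2.0, 0.5])   # positive prior
lam, sigma = 2.0, 0.5

# Reviewer's suggested closed form via the Lambert W function
u = x - lam * lambertw(sigma / lam * g * np.exp(x / lam)).real

# u should satisfy the prox optimality condition of the convex conjugate
# of lam * KL cross entropy: sigma * g * exp(u / lam) + u - x = 0
residual = sigma * g * np.exp(u / lam) + u - x
print(np.max(np.abs(residual)))  # close to 0 up to floating point
```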
Gave some nit-picking comments, but I guess input is always good. Once done, do the rename (and make sure you also rename tests etc.) and you can merge.
Rename the proximal factory for the second kind of KL divergence to kl_cross_entropy. Also update doc for both KL divergences: mention the existence of the other.
Updated according to comments, and renamed to kl_cross_entropy.
@aringh Please write a release note on this.
It adds the proximal operator for the convex conjugate of the second kind of the KL-divergence:
f(x) = sum_i (x_i log(x_i) - x_i log(g_i) + g_i - x_i)
The naming is an issue here... what should we call the two different KL-divergences? The current name is temporary.
The two functionals are highly related, as the only difference is that the prior and the variable change places in the expression. However, they are two different functionals with somewhat different characteristics.
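The "prior and variable change places" relation can be made concrete: evaluating one variant with its arguments swapped gives the other. A small illustration in plain numpy; the helper names are hypothetical, and the first-kind formula is one common convention, not necessarily the exact form used in the PR.

```python
import numpy as np

def kl(x, g):
    # First kind (one common convention): sum_i (x_i - g_i + g_i * log(g_i / x_i))
    return np.sum(x - g + g * np.log(g / x))

def kl_cross_entropy(x, g):
    # Second kind, as in the PR description:
    # sum_i (x_i log(x_i) - x_i log(g_i) + g_i - x_i)
    return np.sum(x * np.log(x) - x * np.log(g) + g - x)

x = np.array([0.5, 1.5, 2.0])
g = np.array([1.0, 1.0, 3.0])

# Swapping the prior and the variable turns one functional into the other
print(np.isclose(kl_cross_entropy(x, g), kl(g, x)))  # True
```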
EDIT: It also contains a few updates to the doc of the proximal operator for the previous KL-divergence (typos).