
ENH: Add proximal operator for second kind of KL-divergence. #561

Merged
aringh merged 3 commits into master from second_type_kl_divergence on Sep 14, 2016

Conversation

@aringh (Member) commented Sep 5, 2016

It adds the proximal operator for the convex conjugate of the second kind of the KL-divergence: f(x) = sum_i (x_i log(x_i) - x_i log(g_i) + g_i - x_i).
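For concreteness, a minimal NumPy sketch of this functional (my own illustration, not code from the PR; it assumes x and g are elementwise strictly positive so the logarithms are defined):

```python
import numpy as np

def kl_cross_entropy(x, g):
    """Cross-entropy KL: sum_i (x_i*log(x_i) - x_i*log(g_i) + g_i - x_i).

    Assumes x > 0 and g > 0 elementwise; the first two terms are
    written together as x*log(x/g).
    """
    x = np.asarray(x, dtype=float)
    g = np.asarray(g, dtype=float)
    return np.sum(x * np.log(x / g) + g - x)
```

As a sanity check, the value is zero when x equals g, as expected of a divergence.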

The naming is an issue here... what should we call the two different KL-divergences? The current name is temporary.

The two functionals are closely related: the only difference is that the prior and the variable change places in the expression. However, they are two different functionals with somewhat different characteristics.

EDIT: It also contains a few updates to the doc of the proximal operator for the previous KL-divergence (typo fixes).

if g is not None and g not in space:
    raise TypeError('{} is not an element of {}'.format(g, space))

class ProximalCConjKL(Operator):
@aringh (Member Author):
This name should not be the same as in the other implementation (copy-paste error).

@adler-j (Member) commented Sep 5, 2016

Should be added to slow tests as well. We have tests for proximals there.

The functional is not well-defined without a prior g. Hence, if g is
omitted this will be interpreted as if g is equal to the one-element.
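A minimal sketch of how that default could look inside the proximal factory, assuming an ODL-like space with a `one()` method (hypothetical placement, not the PR's actual code):

```python
if g is None:
    # A missing prior is interpreted as the one-element of the space.
    g = space.one()
```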

Write something about similarities and differences between the two

Yes, and also make sure to add a See Also part

@aringh (Member Author) commented Sep 6, 2016

Updated according to comments. Left to do:

  • Resolve naming. Calling the new one cross_entropy or kl_cross_entropy?
  • Update doc with interlinking comments. I will do this when the name issue is resolved.

EDIT: regarding the naming suggestion, see http://ieeexplore.ieee.org/document/1056144/?arnumber=1056144&tag=1

KL first version:
- update doc
- change: no data implicitly means the one element (otherwise not well-defined)
- add to slow-test suite for proximal operators

KL second version:
- move larger test to slow-test suite
- add to slow-test suite for proximal operators
- update doc
- minor performance improvements
@aringh force-pushed the second_type_kl_divergence branch from be46dc9 to 8ee1eb2 on September 6, 2016
@aringh (Member Author) commented Sep 12, 2016

The naming issue remains: what to call it? And what to call the other one? I vote for calling this one kl_cross_entropy. Any other suggestions?

After naming, is this one ok to merge?


"""

# TODO: Update Notes-section in doc above.

I guess this should be done.

prox_val = prox(x)

# Explicit computation:
x_verify = x - lam * lambertw(1.0 / lam * sigma * g * np.exp(x / lam))
@adler-j (Member) commented Sep 13, 2016:
x_verify = x - lam * lambertw(sigma / lam * g * np.exp(x / lam))
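As a self-contained sketch of the closed form this test verifies (my own reconstruction, not the PR's code; the name `prox_cconj_kl_cross_entropy` is hypothetical), the proximal of the convex conjugate of `lam` times the cross-entropy KL divergence can be evaluated pointwise with the Lambert W function:

```python
import numpy as np
from scipy.special import lambertw

def prox_cconj_kl_cross_entropy(x, sigma, lam=1.0, g=None):
    """Minimize sigma*(lam*f)^*(u) + 0.5*(u - x)^2 pointwise.

    For f(x) = sum_i (x_i*log(x_i/g_i) + g_i - x_i), the conjugate is
    (lam*f)^*(u) = lam * sum_i g_i*(exp(u_i/lam) - 1). Setting the
    gradient of the prox objective to zero gives
        sigma * g * exp(u/lam) + u - x = 0,
    whose solution is u = x - lam * W(sigma/lam * g * exp(x/lam)).
    """
    x = np.asarray(x, dtype=float)
    if g is None:
        g = np.ones_like(x)  # missing prior interpreted as the one-element
    return x - lam * lambertw(sigma / lam * g * np.exp(x / lam)).real
```

A quick sanity check: as sigma tends to zero, W of a small argument is approximately the argument itself, so the prox tends to the identity, as it should.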

@adler-j (Member) commented Sep 13, 2016

Gave some nit-picking comments, but I guess input is always good. Once done, do the rename (and make sure you also rename the tests etc.) and you can merge.

Rename the proximal factory for the second kind of KL divergence to kl_cross_entropy.
Also update doc for both KL divergences: mention the existence of the other.
@aringh (Member Author) commented Sep 14, 2016

Updated according to comments, and renamed to kl_cross_entropy. Merging after the checks are done.

@aringh aringh merged commit 10d2cdb into master Sep 14, 2016
@aringh aringh deleted the second_type_kl_divergence branch September 19, 2016 12:35
@kohr-h (Member) commented Sep 21, 2016

@aringh Please write a release note on this.

@aringh mentioned this pull request on Sep 21, 2016
aringh added a commit that referenced this pull request Sep 22, 2016
Add notes for pulls #561 and #582.
kohr-h pushed a commit that referenced this pull request Oct 21, 2016
Add notes for pulls #561 and #582.
kohr-h pushed a commit that referenced this pull request Oct 31, 2016
Add notes for pulls #561 and #582.