Conversation

@devmotion (Contributor) commented Nov 24, 2025

Based on JuliaNLSolvers/NLSolversBase.jl#168 and JuliaNLSolvers/LineSearches.jl#187.

The diff would be cleaner if #1195 (which depends on #1209) were available on the master branch.

Based on #1212.


Currently, tests are failing due to missing definitions of `NLSolversBase.value_jvp!` etc. for `ManifoldObjective`.

@pkofod (Member) commented Nov 24, 2025

this one also has conflicts for the same reason as the AD PR, I suppose

@devmotion devmotion force-pushed the dmw/jvp branch 4 times, most recently from e601dc9 to 33f43b0 Compare November 25, 2025 00:02
@pkofod (Member) commented Nov 25, 2025

I see you're hitting all the wrappers

@devmotion devmotion force-pushed the dmw/jvp branch 3 times, most recently from 2e8626a to 28b8899 Compare November 25, 2025 10:48
elseif !NLSolversBase.isfinite_value(d)
TerminationCode.ObjectiveNotFinite
elseif !NLSolversBase.isfinite_gradient(d)
TerminationCode.GradientNotFinite
Contributor Author:

@pkofod a random bug I came across when fixing these lines

Member:

I'm wondering if I had that `f_calls` there for a reason

Contributor Author:

The logic is not changed, it's just moved to NLSolversBase to avoid having to expose `jvp(obj)` (IMO such calls are quite unsafe in general, so I'd like to avoid introducing new ones at least).

My point is just that currently the termination code is `GradientNotFinite` when the objective function is not finite.
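For illustration, a minimal toy sketch (hypothetical names, not the actual Optim.jl code) of the bug class being discussed: ordering the finiteness checks so a non-finite objective is reported as `ObjectiveNotFinite` rather than falling through to `GradientNotFinite`.

```julia
# Toy sketch: check the objective value before the gradient, so a
# non-finite objective gets the correct termination code.
function termination_code(value, gradient)
    if !isfinite(value)
        return :ObjectiveNotFinite
    elseif !all(isfinite, gradient)
        return :GradientNotFinite
    else
        return :NoTermination
    end
end
```

With this ordering, `termination_code(NaN, [1.0])` gives `:ObjectiveNotFinite`, while `termination_code(1.0, [NaN])` gives `:GradientNotFinite`.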

Member:

I understood the part about the wrong code but had missed the part about the new function. Great

initial_x = copy(initial_x)
retract!(method.manifold, initial_x)

value_gradient!!(d, initial_x)
Contributor Author:

@pkofod why would we ever want to call the `!!` methods explicitly? If the input differs from the one that the value and gradient were evaluated with, the value/gradient will be recomputed anyway.

Member:

An alternative is to create new instances of `OnceDifferentiable`, but the "problem" is Fminbox, which has outer and inner loops. In the previous inner loop you may evaluate the objective at x, but then Fminbox updates a parameter that essentially changes the function, and then the stored value is no longer correct.

Without getting too specific here, you can think of it as the EBEs in Laplace's method. If you have an outer objective that performs an EBE optimization on the inside, then a `OnceDifferentiable` for the outer objective of the marginal loglikelihood cannot know whether we changed the inner EBEs or not.

As I said above, maybe a better approach is to construct a new `OnceDifferentiable` per outer iteration.

I believe other users have relied on it in the past as well, because they are doing the same: allocating/constructing the `OnceDifferentiable` once and then solving a sequence of `optimize` calls with some parameter that's updated between calls. I'm not completely sure whether it's documented, but the idea is that each initial evaluation and each `reset!` for Fminbox forces an evaluation even if x is not updated.
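To make the staleness concrete, here is a hedged toy sketch (a hypothetical `CachedObjective`, not NLSolversBase's actual implementation) of a cache keyed on `x` only: after an outer loop mutates a parameter the closure depends on, re-evaluating at the same `x` silently returns the old value.

```julia
# Toy objective that caches the last evaluation, keyed on x only.
mutable struct CachedObjective
    f::Function              # closure over some mutable parameter
    last_x::Vector{Float64}  # input of the last evaluation
    last_f::Float64          # cached objective value
end

function value!(d::CachedObjective, x)
    if x != d.last_x
        d.last_x = copy(x)
        d.last_f = d.f(x)
    end
    return d.last_f  # stale if d.f itself changed since the last call!
end

mu = Ref(1.0)
d = CachedObjective(x -> mu[] * sum(abs2, x), Float64[], NaN)
x = [2.0]
value!(d, x)   # 4.0: evaluated with mu = 1
mu[] = 10.0    # outer loop updates the parameter...
value!(d, x)   # ...but this still returns the stale 4.0, since x is unchanged
```

A forced re-evaluation (the `!!` methods) or a fresh objective instance per outer iteration are two ways around this.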

Contributor Author:

> In the previous inner loop you may evaluate the objective at x but then Fminbox updates a parameter that changes the function essentially and then the stored value is not correct.

It sounds like we should just use `NLSolversBase.clear!` instead of only resetting the number of calls in the algorithm's loop?

Member:

Yes, we can call `clear!` before `value!`, `gradient!`, etc. are called in those places
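A hedged sketch of that pattern (toy `clear!`/`value!`, standing in for the NLSolversBase functions of the same name): invalidate the cache before re-evaluating, so the stored value cannot be reused after the parameter changed.

```julia
# Toy cached objective with an explicit cache-invalidation function.
mutable struct Cached
    f::Function
    last_x::Union{Nothing,Vector{Float64}}
    last_f::Float64
end

# Invalidate the cache so the next value! call must re-evaluate.
clear!(d::Cached) = (d.last_x = nothing; d)

function value!(d::Cached, x)
    if d.last_x === nothing || x != d.last_x
        d.last_x = copy(x)
        d.last_f = d.f(x)
    end
    return d.last_f
end

mu = Ref(1.0)
d = Cached(x -> mu[] * sum(abs2, x), nothing, NaN)
x = [2.0]
value!(d, x)   # 4.0
mu[] = 10.0    # the parameter changes...
clear!(d)      # ...so clear the cache before re-evaluating
value!(d, x)   # 40.0, even though x is unchanged
```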

bw.Fb = value(bw.b, x)
bw.Ftotal = bw.mu * bw.Fb
NLSolversBase.value(obj::BarrierWrapper) = obj.Ftotal
function NLSolversBase.value!(bw::BarrierWrapper, x)
Member:

I didn't check CI yet, but I think these may cause failures because the multiplier could have been updated

Member:

I guess you can rewrite these to check whether `mu` was updated, in the same manner...
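A hedged sketch of that idea (toy struct; the field names here are hypothetical, not the actual `BarrierWrapper`): cache the multiplier that the total was last computed with, and recompute when it differs from the current `mu`.

```julia
mutable struct ToyBarrier
    mu::Float64        # current barrier multiplier
    mu_cached::Float64 # multiplier that Ftotal was last computed with
    Fb::Float64        # cached barrier value
    Ftotal::Float64    # cached mu * Fb
end

function total!(bw::ToyBarrier)
    if bw.mu != bw.mu_cached
        bw.Ftotal = bw.mu * bw.Fb  # recompute with the updated multiplier
        bw.mu_cached = bw.mu
    end
    return bw.Ftotal
end

bw = ToyBarrier(0.5, 0.5, 3.0, 1.5)
bw.mu = 2.0  # outer loop updates the multiplier
total!(bw)   # 6.0, recomputed because mu != mu_cached
```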

@devmotion devmotion force-pushed the dmw/jvp branch 3 times, most recently from 170a47b to aa176a1 Compare November 25, 2025 22:53
results = optimize(dfbox, x, _optimizer, options, state)
stopped_by_callback = results.stopped_by.callback
dfbox.obj.f_calls[1] = 0
# TODO: Reset everything? Add upstream API for resetting call counters?
Member:

As discussed above, `clear!` should suffice

Contributor Author:

It didn't, but maybe I did something wrong. I spent a day trying to fix the errors, but then decided to revert to not touching any of the `!!` etc. functionality in this PR. The problem - and why I wanted to get rid of them in the first place - is that the JVP function messes up any such implicit assumptions. There's no guarantee anymore that during/after the line search the gradient will be available, or that a new gradient is used at all in the line search. IMO one should (at some point, but maybe not in this PR) get rid of the `value(d)` etc. API and any such implicit assumptions completely. If a specific gradient etc. has to be passed, it should be passed explicitly; and IMO the objective function structs should be treated as black boxes that just perform some optimizations by caching evaluations.

@devmotion devmotion force-pushed the dmw/jvp branch 6 times, most recently from 6525ce4 to 6d92ce1 Compare December 1, 2025 08:49
@pkofod pkofod closed this in #1212 Dec 1, 2025
@pkofod pkofod reopened this Dec 1, 2025
@devmotion devmotion force-pushed the dmw/jvp branch 2 times, most recently from b5c9710 to 6394bf3 Compare December 1, 2025 20:10
@codecov codecov bot commented Dec 1, 2025

Codecov Report

❌ Patch coverage is 89.28571% with 12 lines in your changes missing coverage. Please review.
✅ Project coverage is 85.95%. Comparing base (a466ee4) to head (af0ab0a).

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| src/multivariate/solvers/constrained/fminbox.jl | 72.41% | 8 Missing ⚠️ |
| src/Manifolds.jl | 92.30% | 1 Missing ⚠️ |
| src/multivariate/solvers/first_order/adam.jl | 50.00% | 1 Missing ⚠️ |
| src/multivariate/solvers/first_order/adamax.jl | 50.00% | 1 Missing ⚠️ |
| ...ariate/solvers/second_order/newton_trust_region.jl | 50.00% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #1210      +/-   ##
==========================================
- Coverage   86.77%   85.95%   -0.82%     
==========================================
  Files          45       44       -1     
  Lines        3539     3482      -57     
==========================================
- Hits         3071     2993      -78     
- Misses        468      489      +21     

@devmotion (Contributor Author)

Locally (macOS aarch64) and on macOS (aarch64) in CI, all tests pass, but on Ubuntu and Windows 4 tests fail...
