
Conversation

@kevinsung (Collaborator) commented Aug 1, 2025

This improves performance but something is wrong. The optimization with complex numbers gives a worse result than before.

>       np.testing.assert_allclose(energy, -108.58613393502857)
E       AssertionError:
E       Not equal to tolerance rtol=1e-07, atol=0
E
E       Mismatched elements: 1 / 1 (100%)
E       Max absolute difference among violations: 5.03683129e-05
E       Max relative difference among violations: 4.63855845e-07
E        ACTUAL: array(-108.586084)
E        DESIRED: array(-108.586134)

tests/python/variational/orbital_optimization_test.py:86: AssertionError
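The failure is a relative-tolerance miss: with `atol=0`, `assert_allclose` only allows a difference of `rtol * |desired| ≈ 1.1e-5`, while the observed difference is `≈ 5.0e-5`. A minimal reproduction of the check, using the (rounded) values from the pytest output above:

```python
import numpy as np

actual = -108.586084    # values as rounded in the pytest report
desired = -108.586134

# np.testing.assert_allclose passes when |actual - desired| <= atol + rtol * |desired|
rtol, atol = 1e-7, 0.0
diff = abs(actual - desired)
tol = atol + rtol * abs(desired)
print(diff > tol)  # True: the difference exceeds the tolerance, so the assertion fails
```

This matches the reported max relative difference of `4.6e-7`, a few times larger than the `1e-7` tolerance.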

@kevinsung kevinsung marked this pull request as draft August 1, 2025 00:27
@q-inho (Contributor) commented Aug 1, 2025

I'm not sure whether it's appropriate to comment on a draft PR, but I'd like to share a quick thought.
From my reading, the current _generator_to_parameters misses that each variational parameter controls two elements of the anti-Hermitian generator.
Do you think multiplying the gradient contributions by 2, and accounting for the sign of the imaginary part when converting back to the parameter vector, would solve the issue?
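For readers following along: I haven't inspected ffsim's actual parameter conversion, but the point about one parameter controlling two matrix elements can be illustrated with a hypothetical builder for the inverse mapping (a common convention: strictly lower triangle, real parts first, then imaginary parts):

```python
import numpy as np

def params_to_antihermitian(params, dim):
    """Build an anti-Hermitian matrix from a real parameter vector.

    Hypothetical convention for illustration: the first dim*(dim-1)//2
    entries fill the real parts of the strictly lower triangle, the
    remaining entries fill the imaginary parts.
    """
    n = dim * (dim - 1) // 2
    rows, cols = np.tril_indices(dim, k=-1)
    mat = np.zeros((dim, dim), dtype=complex)
    mat[rows, cols] = params[:n] + 1j * params[n:]
    mat -= mat.conj().T  # enforce A = -A^dagger; each parameter now sets TWO elements
    return mat
```

Because `A[j, i] = -conj(A[i, j])`, every parameter appears in two entries of the matrix, so a chain-rule gradient with respect to that parameter collects contributions from both entries, with signs that differ depending on whether the parameter feeds the real or the imaginary part. That is where the proposed factor of 2 and the sign handling would come from.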

@kevinsung (Collaborator, Author) commented

> I'm not sure whether it's appropriate to comment on a draft PR, but I'd like to share a quick thought. From my reading, the current _generator_to_parameters misses that each variational parameter controls two elements of the anti-Hermitian generator. Do you think multiplying the gradient contributions by 2, and accounting for the sign of the imaginary part when converting back to the parameter vector, would solve the issue?

Sorry, I don't quite understand what you mean, but feel free to open a pull request if you have a solution 🙂.

@kevinsung (Collaborator, Author) commented Aug 1, 2025

It turns out the issue was jax-ml/jax#4891.
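Independent of the specifics of the linked jax issue, a central-difference check against the analytic gradient is a quick way to catch factor-of-2 or conjugation bugs in gradients of real-valued objectives built from complex arithmetic. A generic numpy sketch (not ffsim code; the objective and parameter layout here are made up for illustration):

```python
import numpy as np

def fd_gradient(f, params, eps=1e-6):
    """Central-difference gradient of a real-valued f over real parameters."""
    grad = np.empty_like(params)
    for k in range(params.size):
        step = np.zeros_like(params)
        step[k] = eps
        grad[k] = (f(params + step) - f(params - step)) / (2 * eps)
    return grad

# Toy objective built from complex arithmetic: the first half of the
# parameter vector holds real parts, the second half imaginary parts.
def objective(params):
    z = params[:2] + 1j * params[2:]
    return float(np.sum(np.abs(z) ** 2))

params = np.array([1.0, 2.0, 3.0, 4.0])
analytic = 2 * params  # d|z|^2/dx = 2x, d|z|^2/dy = 2y
numeric = fd_gradient(objective, params)
print(np.allclose(analytic, numeric))  # True
```

A missing factor of 2, or a dropped sign on the imaginary-part components, shows up immediately as a mismatch in such a comparison.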
