`jax.numpy.nancumsum` brings different results with `numpy.nancumsum` · Issue #28669 · jax-ml/jax · GitHub
Closed
apiqwe opened this issue May 10, 2025 · 1 comment

apiqwe commented May 10, 2025

Description

I found that jax.numpy.nancumsum returns different results from numpy.nancumsum in the following case.
I suspect this may be a bug in JAX.

import numpy as np
import jax.numpy as jnp

print(np.nancumsum([0.1], dtype=bool))
print(jnp.nancumsum(jnp.array([0.1]), dtype=bool))

Output:

[ True]
[False]
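
For context: NumPy documents the dtype argument of nancumsum (and cumsum) as the type of the returned array and of the accumulator in which the elements are summed, so the input is effectively converted to bool before accumulation and the nonzero value 0.1 becomes True. A minimal NumPy-only check of that reading:

import numpy as np

# Converting the element to bool first keeps nonzero values truthy ...
print(np.asarray([0.1]).astype(bool))    # [ True]
# ... which matches what nancumsum returns with dtype=bool.
print(np.nancumsum([0.1], dtype=bool))   # [ True]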

System info (python version, jaxlib version, accelerator, etc.)

jax:    0.6.0
jaxlib: 0.6.0
numpy:  2.2.3
python: 3.10.12 (main, Feb  4 2025, 14:57:36) [GCC 11.4.0]
device info: NVIDIA GeForce RTX 4090-1, 1 local devices
process_count: 1
platform: uname_result(system='Linux', node='4db45dc420f8', release='6.11.0-25-generic', version='#25~24.04.1-Ubuntu SMP PREEMPT_DYNAMIC Tue Apr 15 17:20:50 UTC 2', machine='x86_64')


$ nvidia-smi
Wed May  7 16:26:08 2025       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.230.02             Driver Version: 535.230.02   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce RTX 4090        Off | 00000000:01:00.0 Off |                  Off |
| 36%   32C    P2              36W / 450W |    407MiB / 24564MiB |      1%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
                                                                                         
+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
+---------------------------------------------------------------------------------------+
apiqwe added the bug label May 10, 2025
jakevdp (Collaborator) commented May 10, 2025

This seems like an unimportant corner case involving the detailed path by which floats are cast to bools in the context of reductions. It's very similar to #28646 in that way. This strikes me as not particularly important unless it has caused issues in a real-world use case.
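
As an illustration only (a sketch of how such a cast-ordering discrepancy can arise, not a description of JAX's actual lowering), the two answers can be reproduced in plain NumPy depending on where the float-to-bool conversion happens:

import numpy as np

x = np.array([0.1])

# Converting the float element directly to bool keeps it truthy,
# which is the NumPy result reported above.
print(np.cumsum(x.astype(bool), dtype=bool))      # [ True]

# A hypothetical route through an integer intermediate truncates 0.1 to 0
# before the bool conversion, which yields the JAX-like answer.
print(np.cumsum(x.astype(np.int32), dtype=bool))  # [False]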

jakevdp closed this as completed May 10, 2025
jakevdp self-assigned this May 10, 2025