Add cross product operator #818
Conversation
Looks great! Thanks for doing this
/build
Hi @mfzmullen, I apologize I didn't see this earlier, but the documentation needs to be updated:
@cliffburdick no worries, my bad for missing that in the first place! Just pushed
/build
Hi @mfzmullen, now it's getting:
You can test this yourself by running "make sphinx" if you want to run it in a local environment.
/build
@cliffburdick got the documentation built locally with the latest push. Again, my apologies for the mess with that, but thank you for your patience.
/build
Merged. Thanks @mfzmullen!
@mfzmullen it looks like the unit test is failing:
/build
@cliffburdick strange, I played with the tolerance a bit for the half types and wasn't getting any failures with the current setting after a few dozen runs. I can look at it a bit more.
Here is the error:
@cliffburdick if I change the threshold to 0.06 for the half types, I had it pass twice and fail once after 623 iterations. What failure probability is acceptable?
I don't think there would be any uncertainty in these calculations, so I wouldn't expect repeating to help. 0.06 should be fine.
While there isn't any uncertainty, each repetition uses different randomly generated inputs, so the difference between the numpy and matx versions will change on each repetition. I'll push a change soon.
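For reference, the kind of half-precision discrepancy being discussed can be reproduced with a small standalone NumPy sketch. This is a hypothetical illustration, not the actual MatX unit test: it computes a batched cross product in float16, compares it against a float64 reference on the same quantized inputs, and checks the worst-case difference against the proposed 0.06 threshold.

```python
import numpy as np

# Hypothetical sketch of the tolerance check being discussed (not the
# actual MatX test). Fixed seed so the run is reproducible.
rng = np.random.default_rng(0)
a = rng.uniform(-1, 1, (1000, 3)).astype(np.float16)
b = rng.uniform(-1, 1, (1000, 3)).astype(np.float16)

# Reference cross product computed in double precision from the same
# fp16-quantized inputs, so only the arithmetic precision differs.
ref = np.cross(a.astype(np.float64), b.astype(np.float64))
half = np.cross(a, b).astype(np.float64)

max_err = np.max(np.abs(half - ref))
print(max_err)  # typically well below 0.06 for inputs in [-1, 1]
```

Because the inputs are bounded by 1, each product term carries at most a few float16 rounding steps (eps is about 1e-3), so a 0.06 threshold leaves generous headroom.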
NumPy can use the same seed for the rng so you get the same number each time. That's how we do most of our tests currently so we have some predictability.
Oh got it. I see that in some of the generators now, but not in the ones I used as examples (toeplitz). I can add that to my PR.
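The fixed-seed approach described above can be sketched in a few lines of NumPy (a minimal illustration of the idea, not the MatX test harness itself): two generators created with the same seed produce identical streams, so test inputs drawn this way are fully reproducible across runs.

```python
import numpy as np

# Two generators seeded identically produce the same random stream,
# so tests that draw inputs this way behave the same on every run.
rng1 = np.random.default_rng(1234)
rng2 = np.random.default_rng(1234)

a = rng1.standard_normal(5)
b = rng2.standard_normal(5)
print(np.array_equal(a, b))  # True
```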
It looks like it's still failing with a larger threshold. I will make it even larger since it appears to be just a precision issue:
That difference is already less than thresh (=0.08). Is that with a double or float, perhaps? All these tests passed on my local machine. Don't the tests need to pass before merging to main? Just curious why they seem to be failing sometimes even with the consistent seed in the rng now.
It looks like it's working now. Thanks!
Takes in two operators of sizes A0 x ... x An x 3 (or 2) and B0 x ... x Bn x 3 (or 2) with matching batch sizes and computes the cross product. If both input operators have size 2 on the last dimension, the output's last dimension has size 1 and contains only the z component of the result. This differs from the NumPy implementation, which drops the rank of the output by 1 when both inputs are in-plane vectors. Following NumPy's approach does not seem straightforward in MatX, since `Rank()` is a static method and the sizes of A and B (e.g., 2 or 3 on the last dimension) may not be known at the time the rank is determined, even though those sizes would affect the rank of `cross`.
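For comparison, NumPy's rank-dropping behavior for the in-plane case can be seen with a short example (illustrative only; note that NumPy deprecates the 2-element-vector form of `np.cross` as of version 2.0):

```python
import numpy as np

# 3-element vectors: the last dimension is preserved in the output.
a3 = np.ones((4, 3))   # batch of 4 three-element vectors
b3 = np.ones((4, 3))
print(np.cross(a3, b3).shape)  # (4, 3)

# 2-element (in-plane) vectors: NumPy drops the last dimension and
# returns only the scalar z component per batch element, so the
# output rank is one less than the inputs'.
a2 = np.array([[1.0, 0.0]])
b2 = np.array([[0.0, 1.0]])
print(np.cross(a2, b2))        # [1.]  -- shape (1,), z component only
```

The MatX choice described above instead keeps the last dimension (with size 1) in the 2-element case, so the operator's rank stays static.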