I am having an issue because it keeps trying to take the square root of an interval that contains negative numbers. I think the reason for this is that I am computing lots of "squares" as a*a, for example, which means the resulting interval is much wider than it needs to be.
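To make the problem concrete, here's a minimal sketch (the Interval type and function names are illustrative, not the real code): squaring via a generic interval product produces a spurious negative lower bound, which is exactly what later feeds a negative number into sqrt.

```ts
type Interval = { lo: number; hi: number };

// Generic interval product: takes the extremes over the four corner
// products. It cannot know that both operands are the same variable.
function mul(a: Interval, b: Interval): Interval {
  const ps = [a.lo * b.lo, a.lo * b.hi, a.hi * b.lo, a.hi * b.hi];
  return { lo: Math.min(...ps), hi: Math.max(...ps) };
}

// Dedicated square: the result can never be negative.
function square(a: Interval): Interval {
  const lo2 = a.lo * a.lo, hi2 = a.hi * a.hi;
  return a.lo <= 0 && a.hi >= 0
    ? { lo: 0, hi: Math.max(lo2, hi2) }   // interval straddles zero
    : { lo: Math.min(lo2, hi2), hi: Math.max(lo2, hi2) };
}

const x: Interval = { lo: -1, hi: 2 };
console.log(mul(x, x));  // { lo: -2, hi: 4 } -- spuriously negative
console.log(square(x));  // { lo: 0, hi: 4 }  -- tight, sqrt stays happy
```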
OK, so interval arithmetic is actually much harder than I thought. (Story of my life).
I made multiplication detect if the two arguments are the same and turn into squaring in that case. This isn't a completely general solution in the evaluateInterval() case, because it is detecting whether the un-simplified arguments are the same; ideally it would be looking at the arguments after simplify() has had a chance to clean them up. For the code generation cases it is fine.
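A sketch of that workaround, with a toy expression type standing in for the real tree (the real evaluateInterval() is structured differently):

```ts
type Interval = { lo: number; hi: number };
type Expr =
  | { op: "var"; name: string }
  | { op: "mul"; a: Expr; b: Expr };

function mul(a: Interval, b: Interval): Interval {
  const ps = [a.lo * b.lo, a.lo * b.hi, a.hi * b.lo, a.hi * b.hi];
  return { lo: Math.min(...ps), hi: Math.max(...ps) };
}

function square(a: Interval): Interval {
  const lo2 = a.lo * a.lo, hi2 = a.hi * a.hi;
  return a.lo <= 0 && a.hi >= 0
    ? { lo: 0, hi: Math.max(lo2, hi2) }
    : { lo: Math.min(lo2, hi2), hi: Math.max(lo2, hi2) };
}

// Structural equality on the *un-simplified* tree -- the weakness noted
// above: `x*x` is caught, but two trees that would only simplify() to the
// same expression are not.
function same(a: Expr, b: Expr): boolean {
  if (a.op === "var" && b.op === "var") return a.name === b.name;
  if (a.op === "mul" && b.op === "mul")
    return same(a.a, b.a) && same(a.b, b.b);
  return false;
}

function evalInterval(e: Expr, env: Map<string, Interval>): Interval {
  switch (e.op) {
    case "var": return env.get(e.name)!;
    case "mul":
      return same(e.a, e.b)
        ? square(evalInterval(e.a, env))
        : mul(evalInterval(e.a, env), evalInterval(e.b, env));
  }
}

const env = new Map([["x", { lo: -1, hi: 2 }]]);
const x: Expr = { op: "var", name: "x" };
console.log(evalInterval({ op: "mul", a: x, b: x }, env)); // { lo: 0, hi: 4 }
```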
Now it is getting NaN for the distance value, not yet sure why.
I had messed up the rotation matrix multiplication.
And now it is always getting an interval that crosses 0.
Oh, one issue is it was cloning the Ivec3 wrong.
It looks like it is almost working on the sphere now. Still not working properly on the default object.
My suspicion is that the reason it's not working is that the interval tells you the ray might cross the surface of the object, rather than that it definitely does. So if I split an interval in half and find that neither half crosses the surface, currently I just keep splitting down to the end of the second half and then say that's where the surface must be, but really I need to back out into the second half of the parent interval and search there instead.
Or, maybe I'm wrong and it actually can provide tight bounds?
So yeah, with the sphere it is definitely getting a distance interval that contains zero even for Z intervals that don't intersect the surface.
So a distance interval containing zero tells you that the Z interval might intersect the surface, it doesn't tell you that it does intersect the surface. And that's even before we have any rotation matrix effect. The rotation matrix will transform my nice neat 1-dimensional interval into an annoying 3-dimensional interval.
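Here's a worked example of that (the numbers and helper names are made up for illustration): a ray passing 1.2 units from a unit sphere, so the true distance never drops below 0.2, but once a 45-degree rotation mixes the ray parameter into two coordinates, the dependency problem makes the interval straddle zero anyway.

```ts
type I = { lo: number; hi: number };
const add = (a: I, b: I): I => ({ lo: a.lo + b.lo, hi: a.hi + b.hi });
const addC = (a: I, c: number): I => ({ lo: a.lo + c, hi: a.hi + c });
const scale = (a: I, s: number): I =>
  s >= 0 ? { lo: a.lo * s, hi: a.hi * s } : { lo: a.hi * s, hi: a.lo * s };
const square = (a: I): I => {
  const m = Math.max(a.lo * a.lo, a.hi * a.hi);
  return a.lo <= 0 && a.hi >= 0
    ? { lo: 0, hi: m }
    : { lo: Math.min(a.lo * a.lo, a.hi * a.hi), hi: m };
};
const sqrtI = (a: I): I =>
  ({ lo: Math.sqrt(Math.max(0, a.lo)), hi: Math.sqrt(a.hi) });

// Ray p(t) = (t, 1.2, 0) with t in [-2, 2]: closest approach to the unit
// sphere is 0.2, so the true distance range is [0.2, ~1.33].
const t: I = { lo: -2, hi: 2 };

// Rotate the frame 45 degrees about z: x' = (x - y)/sqrt2, y' = (x + y)/sqrt2.
const s = Math.SQRT1_2;
const xr = scale(addC(t, -1.2), s);  // t appears here...
const yr = scale(addC(t, 1.2), s);   // ...and here: the dependency problem.

const dist = addC(sqrtI(add(square(xr), square(yr))), -1);
console.log(dist); // roughly { lo: -1, hi: 2.2 } -- contains zero, false alarm
```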
I think the plan is to alternate between doing a step of binary search and a step of sphere tracing. 50% of the time (maybe) the binary search will let you jump through half the remaining space, and the rest of the time sphere tracing will let you make a jump. Or does that just get you the worst of both worlds?
Well an initial implementation that alternates between "interval halving" and "sphere tracing" does come up with better answers than "interval halving" alone.
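For reference, the shape of that loop, assuming a point distance function dist(t) along the ray and an interval version distI() over a span of it (both hypothetical stand-ins for the real code):

```ts
type I = { lo: number; hi: number };

function trace(
  dist: (t: number) => number,   // distance at a single ray parameter
  distI: (span: I) => I,         // bounds on the distance over a span
  tMax: number, eps = 1e-4, maxSteps = 15,
): number | null {
  let span: I = { lo: 0, hi: tMax };
  for (let i = 0; i < maxSteps; i++) {
    // Interval-halving step: if the near half provably contains no surface,
    // discard it; otherwise narrow into it. (Note this has the weakness
    // described earlier: it never backs out to the far half if the near
    // half turns out to be a false alarm.)
    const mid = 0.5 * (span.lo + span.hi);
    span = distI({ lo: span.lo, hi: mid }).lo > 0
      ? { lo: mid, hi: span.hi }
      : { lo: span.lo, hi: mid };

    // Sphere-tracing step from the front of whatever remains. Only safe if
    // dist() is a true lower bound -- which is exactly the next problem.
    const d = dist(span.lo);
    if (d < eps) return span.lo;
    span = { lo: Math.min(span.lo + d, span.hi), hi: span.hi };
    if (span.hi - span.lo < eps) return span.lo;
  }
  return null; // ran out of steps without pinning down the surface
}
```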
The problem is that we now have the issue where the ray can step through the geometry if the field over-estimates the distance to the surface.
Also on the default model, even near the pathological-case surface, it is taking 3 ms for 15 steps of interval halving (interleaved with 15 steps of sphere tracing) and only 1 ms for 300 steps of ordinary sphere tracing, which makes the whole interval arithmetic thing pointless.
I wonder if we can get the tree nodes to propagate up a value for the maximum amount they will overestimate the distance by?
So I basically have 2 issues with sphere tracing: space deformations that over-estimate the distance to the surface (so the ray can step through geometry), and the parallel surfaces problem.
So one option is I can make the space deformations divide by their assumed Lipschitz constant, and that way they will become lower-bound estimates!
And in fact maybe I only need to divide by the local Lipschitz "constant". I basically just need to make sphere tracing work, I don't need to be mathematically rigorous.
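As a sketch of what I mean, for a twist deformation (the Lipschitz bound and the bounding radius here are assumptions, and none of this is rigorous):

```ts
type Vec3 = [number, number, number];
type Sdf = (p: Vec3) => number;

function twist(child: Sdf, k: number, rMax = 2.0): Sdf {
  // Assumed Lipschitz bound for a twist of angular rate k, valid out to
  // radius rMax: the deformation stretches space by at most ~1 + |k|*rMax.
  const L = 1 + Math.abs(k) * rMax;
  return ([x, y, z]) => {
    const a = k * y;                 // twist angle grows along y
    const c = Math.cos(a), s = Math.sin(a);
    // Evaluate the child in the deformed frame, then shrink the result by
    // 1/L so it is (approximately) a lower bound again.
    return child([c * x - s * z, y, s * x + c * z]) / L;
  };
}

// e.g. a twisted unit sphere:
const sphere: Sdf = ([x, y, z]) => Math.hypot(x, y, z) - 1;
const twisted = twist(sphere, 0.5);
console.log(twisted([0, 0.5, 1.5]));
```

The cost is smaller steps everywhere the deformation is active, but the ray no longer punches through.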
And then we only have the parallel surfaces problem.
So one approach would be to figure out a way to use some type of depth fog or something to make it just not look bad instead of making it go away. And then the user can naturally rotate it into an orientation that makes it easier to see.
One idea is to say that if we hit the 500-step limit or whatever, then instead of reporting that we "didn't hit the surface", we just say that we did hit the surface, wherever we ended up. Then at least we'll get the colour of the surface instead of holes through the surface.
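In loop form, a minimal sketch (dist() and the constants are placeholders):

```ts
type Hit = { hit: boolean; t: number; steps: number };

function march(dist: (t: number) => number, tMax: number): Hit {
  const MAX_STEPS = 500, EPS = 1e-4;
  let t = 0;
  for (let steps = 0; steps < MAX_STEPS; steps++) {
    const d = dist(t);
    if (d < EPS) return { hit: true, t, steps };
    t += d;
    if (t > tMax) return { hit: false, t: tMax, steps };
  }
  // Step budget exhausted: claim a hit wherever we ended up, rather than
  // punching a hole through the surface.
  return { hit: true, t, steps: MAX_STEPS };
}
```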
So that's an improvement already:
Just that the colour is wrong because the normal is wrong.
So then what if we blend in some fog based on the number of steps?
Wahey! Problem solved!
It now looks like this, with edge detection off:
So not only does that "not look bad", it actually looks quite cool, and you can argue that it is giving the user extra information by making depth discontinuities more obvious.
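The fog blend itself is tiny; something along these lines (a minimal sketch -- the linear blend curve and the names are illustrative):

```ts
type Rgb = [number, number, number];

// The more steps the march took, the closer the ray was skimming a surface,
// so fade the surface colour toward the background colour.
function shade(surface: Rgb, background: Rgb,
               steps: number, maxSteps = 500): Rgb {
  const fog = Math.min(1, steps / maxSteps); // 0 = clean hit, 1 = gave up
  return [
    surface[0] * (1 - fog) + background[0] * fog,
    surface[1] * (1 - fog) + background[1] * fog,
    surface[2] * (1 - fog) + background[2] * fog,
  ];
}

console.log(shade([1, 0.5, 0.2], [0.1, 0.1, 0.15], 250)); // halfway fogged
```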
One idea would be to make a shader that just does thousands of square roots, and one that does thousands of multiplications, etc., and see what frame rate they all run at, to come up with an estimate of how many of each operation we can do per second.
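A minimal sketch of that benchmark, assuming a browser with WebGL2 (the shader body, the dependent-sqrt chain, and the gl.finish()-based timing are all crude guesses; disjoint timer queries would be more accurate):

```ts
const OPS = 4096;                      // sqrt operations per fragment

const canvas = document.createElement("canvas");
canvas.width = canvas.height = 1024;
document.body.appendChild(canvas);
const gl = canvas.getContext("webgl2")!;

const vsSrc = `#version 300 es
in vec2 pos;
void main() { gl_Position = vec4(pos, 0.0, 1.0); }`;

// A dependent chain of sqrts, seeded from gl_FragCoord so the compiler
// cannot constant-fold the loop away.
const fsSrc = `#version 300 es
precision highp float;
out vec4 colour;
void main() {
  float x = gl_FragCoord.x * 1e-6 + 2.0;
  for (int i = 0; i < ${OPS}; i++) x = sqrt(x + 1.0);
  colour = vec4(vec3(fract(x)), 1.0);
}`;

function compile(type: number, src: string): WebGLShader {
  const s = gl.createShader(type)!;
  gl.shaderSource(s, src);
  gl.compileShader(s);
  return s;
}

const prog = gl.createProgram()!;
gl.attachShader(prog, compile(gl.VERTEX_SHADER, vsSrc));
gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fsSrc));
gl.linkProgram(prog);
gl.useProgram(prog);

// One big triangle covering the viewport.
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1, -1, 3, -1, -1, 3]),
              gl.STATIC_DRAW);
const loc = gl.getAttribLocation(prog, "pos");
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

let last = performance.now();
function frame() {
  gl.drawArrays(gl.TRIANGLES, 0, 3);
  gl.finish();                         // crude sync; timer queries are better
  const now = performance.now();
  const gsqrt =
    canvas.width * canvas.height * OPS / ((now - last) / 1000) / 1e9;
  console.log(`~${gsqrt.toFixed(2)} Gsqrt/s`);
  last = now;
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```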