I'm trying to draw lines with different thicknesses using the geometry shader approach from here:
https://forum.libcinder.org/topic/smooth-thick-lines-using-geometry-shader
It works fine on my development machine (some Intel HD GPU). However, on my target machine (an Nvidia NVS 300, yes, it's old) I get different results; see the attached images. The NVS 300 leaves gaps in my sine signal, while the Intel GPU renders it the way I want and expect.
It's frustrating, because I just can't figure out why; I expected both to render the same. With native debugging enabled I get no errors in the debug output. I disabled culling with CullMode.None. Could it be z-fighting? I don't know much about that, but I experimented with the RasterizerStateDescription and DepthBias properties without success; nothing changed at all. Am I missing something there?
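For reference, this is roughly how I set up the rasterizer state. A minimal sketch only, assuming Direct3D 11 via SharpDX.Direct3D11; `device` and `context` stand in for my existing Device and DeviceContext:

```csharp
using SharpDX.Direct3D11;

// Rasterizer setup: culling off, default depth bias values
// (one of the combinations I tried; changing them had no visible effect).
var rasterizerDesc = new RasterizerStateDescription
{
    FillMode = FillMode.Solid,
    CullMode = CullMode.None,       // culling disabled, as mentioned above
    IsDepthClipEnabled = true,
    DepthBias = 0,
    DepthBiasClamp = 0.0f,
    SlopeScaledDepthBias = 0.0f
};

var rasterizerState = new RasterizerState(device, rasterizerDesc);
context.Rasterizer.State = rasterizerState;
// ... draw calls for the line geometry (which feeds the geometry shader) follow here ...
```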
I'm developing the application with SharpDX, by the way.
Any clues or help would be very welcome.