I'm making a game with a custom graphics engine built on Direct3D 11. While working to reduce the game's system RAM usage, I noticed something that surprised me and went against my expectations:
Textures created with D3D11_USAGE_DEFAULT or D3D11_USAGE_IMMUTABLE (and a D3D11_CPU_ACCESS_FLAG of 0) increase my system RAM usage by roughly the size of the texture (e.g., a 1024x1024 32bpp texture adds about 4MB of system RAM usage). I had thought the point of D3D11_USAGE_DEFAULT and (especially) D3D11_USAGE_IMMUTABLE was to put the texture in VRAM rather than system RAM?
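For reference, here's a simplified sketch of how I'm creating these textures (the helper name and parameters are just for illustration; error handling omitted):

```cpp
#include <d3d11.h>

// Create a GPU-read-only texture from pre-loaded 32bpp RGBA pixel data.
ID3D11Texture2D* CreateImmutableTexture(ID3D11Device* device,
                                        const void* pixels,
                                        UINT width, UINT height)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_IMMUTABLE;          // GPU read-only after creation
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    desc.CPUAccessFlags = 0;                     // no CPU access requested

    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem = pixels;
    init.SysMemPitch = width * 4;                // 32bpp => 4 bytes per pixel

    ID3D11Texture2D* tex = nullptr;
    device->CreateTexture2D(&desc, &init, &tex);
    return tex;
}
```

Even with textures created this way, system RAM usage goes up by about the texture's size once they're created.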
I might expect this behavior on a system with integrated graphics and shared memory, but I'm seeing this on a desktop with no integrated graphics and only a GTX 1070 GPU.
So am I just not understanding how this works? Is there any way I can make sure textures are allocated only in VRAM?
Thanks for your help!