10-bit is a bit of a gimmick for desktop computing at this point, even when you're just on the regular Windows desktop. Setting a monitor to 10-bit mode only tells the graphics driver to send the signal to the monitor using 10-bit deep color. Meanwhile, in that exact same menu in the Nvidia Control Panel where you change output color depth, you'll see "32-bit true color" listed right next to it, and that is what's actually used to render the desktop: 8 bits each for R, G, and B, plus an 8-bit alpha channel used for compositing. That's still only 8-bit color, so you're not actually taking advantage of 10-bit.
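To make the naming less confusing, here's a rough sketch of what "32-bit true color" means at the pixel level. The channel order and packing here are just illustrative, real framebuffer formats vary:

```c
#include <stdint.h>
#include <stdio.h>

/* "32-bit true color": four 8-bit channels packed into one 32-bit word.
   The A-R-G-B order here is illustrative; actual formats differ. */
static uint32_t pack_argb8888(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
           ((uint32_t)g << 8)  |  (uint32_t)b;
}

int main(void)
{
    /* Fully opaque, pure red: the alpha byte is spent on compositing,
       not on color precision. Each color channel has only 256 levels. */
    uint32_t pixel = pack_argb8888(255, 255, 0, 0);
    printf("pixel = 0x%08X (8 bits per channel)\n", pixel);
    return 0;
}
```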
To be fair, fullscreen applications can bypass the desktop compositor entirely, so they can output 10-bit color if they want to. Unfortunately, the status quo for a lot of game engines is still plain 8-bit, since we've been rendering at that depth for decades and it's easier to program for: memory is already laid out in 8-bit chunks, and a full RGBA pixel fits neatly into 32 bits. Unless you're playing a modern game that specifically supports HDR and 10-bit output, 10-bit support is going to be hard to come by.
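For context, when a game does go 10-bit, the usual framebuffer layout squeezes three 10-bit channels plus 2 bits of alpha into the same 32-bit budget (the R10G10B10A2 family of formats in DirectX, for example). A quick sketch of that packing, again with an illustrative layout, just to show where the bits go:

```c
#include <stdint.h>
#include <stdio.h>

/* 10-bit color in the same 32-bit budget: 10+10+10 bits of R/G/B
   leaves only 2 bits for alpha (an R10G10B10A2-style layout). */
static uint32_t pack_rgb10a2(uint16_t r, uint16_t g, uint16_t b, uint8_t a)
{
    return ((uint32_t)(a & 0x3)   << 30) |
           ((uint32_t)(b & 0x3FF) << 20) |
           ((uint32_t)(g & 0x3FF) << 10) |
            (uint32_t)(r & 0x3FF);
}

int main(void)
{
    /* Pure red at full 10-bit intensity: 1024 levels per channel
       instead of 256, but almost nothing left over for alpha. */
    uint32_t pixel = pack_rgb10a2(1023, 0, 0, 3);
    printf("pixel = 0x%08X (10 bits per color channel, 2-bit alpha)\n", pixel);
    return 0;
}
```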
People in this thread are bringing up wide gamuts and saying games are sRGB or DCI-P3, but that has little to do with color depth. With 8-bit color, a red value of 255 means a more saturated red on DCI-P3 than on sRGB; essentially, you're just stretching that same 0-255 range over a wider set of colors. That gives you a higher quantization error, which is why 10-bit color typically comes paired with wide gamuts in the first place: the extra steps help offset the coarser spacing.
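As a back-of-the-envelope illustration (the idea that DCI-P3 is roughly 25% larger than sRGB is only an approximation, and real gamut math isn't linear like this), the step between adjacent code values scales with gamut width divided by the number of levels:

```c
#include <stdio.h>

int main(void)
{
    /* Rough illustration of quantization step size vs. bit depth and gamut.
       The ~25% figure for DCI-P3 vs. sRGB is an approximation, and steps
       are treated as linear purely for the sake of the comparison. */
    double srgb_width = 1.00;   /* baseline gamut "width"          */
    double p3_width   = 1.25;   /* roughly 25% wider, approximate  */

    double step_8bit_srgb = srgb_width / 256.0;
    double step_8bit_p3   = p3_width   / 256.0;
    double step_10bit_p3  = p3_width   / 1024.0;

    printf("8-bit sRGB step:    %.5f (baseline)\n", step_8bit_srgb);
    printf("8-bit DCI-P3 step:  %.5f (~%.0f%% coarser)\n",
           step_8bit_p3, (step_8bit_p3 / step_8bit_srgb - 1.0) * 100.0);
    printf("10-bit DCI-P3 step: %.5f (finer than 8-bit sRGB again)\n",
           step_10bit_p3);
    return 0;
}
```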