There are ways of forcing Dolby Vision on HDR10-only capable projectors. Use an HDFury, for one. It presents a spoofed EDID that tricks the DV-compliant source device (disc player, streamer, or local media player like the newest KScape model with Dolby Vision, or a Zidoo or Dune) into sending low-latency, player-led Dolby Vision rather than plain HDR10, via a backwards-compatible signal technique called RGB tunneling, in which the 12-bit DV signal rides inside a standard signal that many non-Dolby-Vision-capable HDR displays will still accept. The source device handles the dynamic Dolby Vision metadata processing, rather than the display.
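For anyone curious what the EDID side of this looks like, here is a rough Python sketch (not HDFury's actual firmware logic, just an illustration under my reading of CTA-861) that checks whether an EDID advertises a Dolby Vision vendor-specific video data block. The whole trick amounts to showing the source an EDID that contains such a block even though the projector's own EDID does not. The offsets, tag codes, and the Dolby OUI value are my assumptions here, so treat the details as approximate.

```python
# Rough illustration only: scan an EDID's CTA-861 extension block for a
# Dolby Vision Vendor-Specific Video Data Block (VSVDB).
# The HDFury approach boils down to presenting the source an EDID that
# DOES contain this block, even though the projector's own EDID doesn't.

DOLBY_OUI = bytes([0x46, 0xD0, 0x00])   # 00-D0-46, stored least-significant byte first (assumed)

def has_dolby_vision_vsvdb(edid: bytes) -> bool:
    if len(edid) < 256 or edid[126] == 0:      # byte 126: number of extension blocks
        return False
    ext = edid[128:256]                        # first 128-byte extension block
    if ext[0] != 0x02:                         # tag 0x02 = CTA-861 extension
        return False
    dtd_start = ext[2]                         # data block collection ends where DTDs begin
    i = 4
    while i < dtd_start:
        tag = ext[i] >> 5                      # top 3 bits: data block tag
        length = ext[i] & 0x1F                 # low 5 bits: payload length
        payload = ext[i + 1 : i + 1 + length]
        # Extended tag 0x01 = Vendor-Specific Video Data Block
        if tag == 0x07 and length >= 4 and payload[0] == 0x01:
            if payload[1:4] == DOLBY_OUI:
                return True                    # Dolby Vision capability advertised
        i += 1 + length
    return False

# Hypothetical usage with a raw 256-byte EDID dump:
# with open("edid.bin", "rb") as f:
#     print(has_dolby_vision_vsvdb(f.read()))
```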
I did that with my JVC 4K projector and it worked wonders; it is true Dolby Vision via a backdoor method. The HDFury Vertex, which I went with, allows custom Dolby Vision calibration scripts to be uploaded for fine-tuning, as well as manual entry of Dolby Vision display values that help the DV playback device send the right HDR values for the particular display you are using.
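As a side note on those display values: I believe the luminance figures you enter (the display's realistic peak and minimum brightness) end up on the SMPTE ST 2084 PQ scale that the Dolby pipeline works in. The sketch below is just the standard nits-to-PQ conversion so you can see what, say, a 100-nit projector peak corresponds to; the 12-bit code column is my own illustrative quantization, not necessarily how the HDFury or the player stores it.

```python
# Illustrative only: SMPTE ST 2084 (PQ) inverse EOTF, i.e. nits -> PQ level.
# Dolby Vision tone-mapping targets are luminance figures that land on this
# PQ scale; the 12-bit code shown is an assumed quantization for illustration.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq(nits: float) -> float:
    """Return the normalized (0..1) PQ value for an absolute luminance in cd/m2."""
    y = max(nits, 0.0) / 10000.0
    num = C1 + C2 * y ** M1
    den = 1.0 + C3 * y ** M1
    return (num / den) ** M2

# Example: a projector that genuinely peaks around 100-150 nits on screen
for peak in (0.005, 100, 150, 1000):
    pq = nits_to_pq(peak)
    print(f"{peak:>8} nits -> PQ {pq:.4f}  (12-bit code ~{round(pq * 4095)})")
```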
So as not to take this thread too far off-topic, is there a good place to read up on this further?