Reddit user Glennwing posted a great explanation of how the current breed of 4K 144 Hz displays actually work:
I’m seeing a lot of user reviews for the new 4K 144 Hz monitors, and it seems like everyone mentions that the image looks noticeably worse at 144 Hz. I keep expecting these posts to say “due to the 4:2:2 chroma subsampling”, but instead they say “I’m not sure why” or something like that, both here and on various forums. Monitor companies have done their usual good job of “forgetting” to inform people of this limitation: most early adopters are apparently unaware that these monitors are not actually capable of full 4K 144 Hz, even though the subsampling was mentioned in the AnandTech article a month or two ago. In any case, I want to make people aware of what chroma subsampling is, and that these first-gen 4K 144 Hz monitors use it.
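The underlying constraint is link bandwidth: 3840 × 2160 pixels at 144 Hz and 10 bits per channel needs roughly 36 Gbit/s of pixel data, more than DisplayPort 1.4 can carry (about 25.9 Gbit/s effective), so the chroma channels get halved to squeeze under the limit. To make the effect concrete, here's a minimal sketch of what 4:2:2 subsampling does to an image. This is an illustration, not the monitor's actual pipeline; it assumes numpy, BT.709 luma coefficients, and simple nearest-neighbour chroma reconstruction.

```python
import numpy as np

def subsample_422(rgb):
    """Simulate a 4:2:2 chroma subsample/reconstruct round trip.

    rgb: float array of shape (H, W, 3), values in [0, 1], W even.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    # RGB -> YCbCr (BT.709): full-resolution luma (Y) plus two
    # colour-difference channels (Cb, Cr).
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748

    # 4:2:2 keeps Y at full resolution but stores chroma at half the
    # horizontal resolution: average each pair of adjacent samples...
    cb_half = (cb[:, 0::2] + cb[:, 1::2]) / 2
    cr_half = (cr[:, 0::2] + cr[:, 1::2]) / 2

    # ...then stretch the chroma back out for display.
    cb2 = np.repeat(cb_half, 2, axis=1)
    cr2 = np.repeat(cr_half, 2, axis=1)

    # YCbCr -> RGB
    r2 = y + 1.5748 * cr2
    b2 = y + 1.8556 * cb2
    g2 = (y - 0.2126 * r2 - 0.0722 * b2) / 0.7152
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0, 1)

# A one-pixel-wide red stripe on grey is a worst case: brightness
# survives (Y is untouched), but the colour smears across neighbours.
img = np.full((4, 8, 3), 0.5)
img[:, 3] = [1.0, 0.0, 0.0]
print(subsample_422(img)[0, 2:5])
```

This is also why the artifact shows up most on fine colored detail like text and UI edges, and why it's easy to miss in video, where the content was usually 4:2:0 subsampled to begin with.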
Basically, if you value image quality and want to use 144 Hz, then skip this generation of screens.