This is quite a tough question, but you really need to see the difference in person. First, TVs may say "200Hz" on the box, but the panel is usually only 50Hz, with motion interpolation, "fast motion assist" or whatever jargon the marketing uses to make up the 200Hz figure.
This means the picture on a $3000 HDTV bought for gaming doesn't refresh as quickly as a 60Hz LCD monitor. The picture quality will also be a little worse, because the same resolution is stretched across a much bigger screen, so each pixel is physically larger and the pixel density is lower, which is where 4K comes into play.
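To put rough numbers on the pixel-size point, here's a quick pixel-density sketch (the screen sizes are just example figures, not anything from the question):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# Example sizes (assumed for illustration): a 42" 1080p HDTV,
# a 24" 1080p monitor and a 31.5" 4K monitor.
print(f'42" 1080p HDTV:    {ppi(1920, 1080, 42):.0f} PPI')
print(f'24" 1080p monitor: {ppi(1920, 1080, 24):.0f} PPI')
print(f'31.5" 4K monitor:  {ppi(3840, 2160, 31.5):.0f} PPI')
```

The 42" TV works out to roughly 52 PPI against roughly 92 PPI for the 24" monitor, which is exactly the gap 4K closes.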
Input lag is also an issue, especially for gaming, as TVs don't respond as quickly as an LCD monitor. Then we have 120Hz LCD monitors, which are true 120Hz, refreshing 120 times per second for liquid-smooth gaming. In my opinion that's the ultimate for gaming, especially first-person shooters.
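To see what those refresh rates mean in practice, the time between frames is just 1000 divided by the refresh rate; a quick sketch:

```python
# Frame-to-frame interval in milliseconds for common refresh rates.
for hz in (50, 60, 120):
    print(f"{hz:>3} Hz -> a new frame every {1000 / hz:.1f} ms")
```

So a true 120Hz monitor delivers a fresh frame roughly every 8.3ms versus 16.7ms at 60Hz, and any extra processing a TV does for its "200Hz" modes only adds lag on top of that.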
As for support for greater resolutions, yes - the hardware supports it. Most of AMD's and NVIDIA's latest GPUs support right up to 4K (Ultra HD) @ 3840x2160. This is done either over HDMI (at 30Hz for now) or DisplayPort at 60Hz. 4K is quite expensive right now, with ASUS' 31.5-inch 4K-capable monitor priced at $3,999.
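The 30Hz/60Hz split comes down to link bandwidth. A rough back-of-the-envelope calculation (ignoring blanking and encoding overhead, so treat the numbers as ballpark only):

```python
# Uncompressed video data rate: width x height x refresh x bits per pixel.
def gbit_per_s(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

print(f"3840x2160 @ 30Hz: {gbit_per_s(3840, 2160, 30):.1f} Gbit/s")
print(f"3840x2160 @ 60Hz: {gbit_per_s(3840, 2160, 60):.1f} Gbit/s")
```

60Hz 4K needs roughly twice the data rate of 30Hz 4K (about 12 Gbit/s before overhead), which is more than current HDMI can carry but within what DisplayPort handles, hence the 30Hz limit over HDMI for now.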
Now, multi-monitor gaming is an entirely new ball game. The same rules about lag apply, and while it would be cheaper to build a 3-screen setup (Eyefinity or Surround Vision) out of 42-inch (or bigger) HDTVs, the experience wouldn't be as good as, say, triple 60Hz (or 120Hz) LCD monitors.
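One thing to keep in mind with triple-screen setups, whether they're TVs or monitors, is how many pixels the GPU has to push each frame. A quick comparison (bezel compensation ignored):

```python
# Total pixels rendered per frame for a few example setups.
setups = {
    "single 1080p screen": (1, 1920, 1080),
    "triple 1080p (3x1)":  (3, 1920, 1080),
    "single 4K screen":    (1, 3840, 2160),
}
base = 1920 * 1080
for name, (count, w, h) in setups.items():
    total = count * w * h
    print(f"{name:<22} {total:>10,} px ({total / base:.1f}x single 1080p)")
```

Triple 1080p is three times the pixels of a single screen and a single 4K panel is four times, so whichever route you take, budget for GPU grunt accordingly.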
You're better off with a single big HDTV for games you'd play with a controller, like driving games and some RPGs, and for first-person shooters, a 120Hz monitor, or three!