JasonVP
Member
Executive Summary: This is a resource management challenge. I get that. Bear with me, please.
I was fortunate enough to pick up one of those Asus PG27UQ displays; the ones that can do 4K/144Hz. Once I had it, I upped the resolution of all of my games from 1440p to 4K, and of course the Titans started sweating a bit more. That was expected and understood. But for the most part, with the games I play, 4K/140 is doable at very high settings; of course I disable AA, AO, post-processing, and one or two other useless bits that make no visible difference but cost buttloads of frames. So we're good.
But I need to get that 4K goodness from one machine over to the other. And here's where the fun begins (or ends, depending on your patience level).
OBS With NDI
Easy to set up, but it doesn't work well. This is the resource management problem. While my cards are able to sweat and push 4K/140 in most cases, they'll routinely kiss or hit 100% utilization. As we know, that causes rendering lag in the OBS process on the gaming rig, which gets sent to the streaming rig, and then folks watching my stream see it. No bueno. This isn't a solution I can use unless something else can be done to help. Yes, I'm frame capping at 140FPS, but that's as low as I want to go; it was sorta the point behind buying this display.
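For context on why a saturated GPU causes OBS rendering lag: at a 140FPS cap, each frame gets about 7ms of GPU time, and OBS's compositor/encoder needs a slice of that same GPU. A rough back-of-envelope sketch (the ~1ms OBS cost per frame is an assumption for illustration; actual cost varies by scene and encoder):

```python
# Frame-time budget math at a 140FPS cap.
# If the game already consumes the whole budget, OBS's own render
# pass pushes total frame time over budget and frames get dropped.

target_fps = 140
game_frame_ms = 1000 / target_fps        # ~7.14 ms: the game's whole budget
obs_cost_ms = 1.0                        # assumed OBS composite+encode cost

# With OBS sharing the GPU, each frame now takes longer to produce:
effective_fps = 1000 / (game_frame_ms + obs_cost_ms)

print(f"frame budget: {game_frame_ms:.2f} ms")
print(f"with OBS on the same GPU: ~{effective_fps:.0f} FPS")  # ~123 FPS
```

Even a modest per-frame overhead eats the cap when there is zero GPU headroom, which matches the rendering-lag symptom described above.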
NDI Scan Converter
I've tried version 3.7, which is their latest. This is akin to screen scraping and sending everything over the network. And it works. Mostly. The problem is that it costs me in-game frame rate, sometimes severely; with Rainbow Six Siege, for instance, 80FPS if not more! If I uncap my frame rate without Scan Converter running, it'll shoot right up to 180-190FPS. When I fire Scan Converter back up, I'm lucky to get 100FPS in the same section of the game. That's obnoxiously bad!
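Part of why screen-scraping capture is so expensive is that whole frames have to be read back from the GPU before they're compressed for NDI, and that readback stalls the render pipeline. Some napkin math on the raw volume involved (the 32-bit-per-pixel capture format is an assumption; capture rate assumed to track the display):

```python
# Raw frame-readback volume for scraped 4K capture, before any
# NDI compression happens on the CPU.

width, height = 3840, 2160
bytes_per_pixel = 4                      # assumed BGRA capture format
fps = 140

frame_bytes = width * height * bytes_per_pixel       # ~33.2 MB per frame
readback_gbps = frame_bytes * fps * 8 / 1e9          # raw bits/s off the GPU

print(f"{frame_bytes / 1e6:.1f} MB per frame")       # 33.2 MB
print(f"{readback_gbps:.1f} Gb/s of readback")       # ~37.2 Gb/s
```

That traffic fits on PCIe, but synchronous readbacks of ~33MB frames force the GPU to stall between renders, which is consistent with the kind of frame-rate collapse described above.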
Just Use the Capture Card, Dude
This is what I'm left with at this point. The problem here is an interesting NVidia-specific one. I can't have my three 4K displays (one of them is 144, the other two are 60) connected and have the 4K60 Pro card connected AND have them all run at their max refresh/resolutions. As soon as I connect up the capture card, the gaming rig sees it as a 4th display, and down-clocks the Asus panel to 120Hz for some reason. Automatically. If I disconnect either of the 4K/60 displays or the capture card, it bounces right back up to 144Hz.
A bandwidth or throughput issue of some sort, presumably. Either way, this is what I'd term suboptimal. The resulting stream looks fine, and even though Display Cloning does cost some SLI performance, it's nothing like Scan Converter; the result on the streaming rig is a smooth, consistent video feed.
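The 144→120Hz fallback is at least consistent with link-bandwidth limits: a quick calculation (8-bit RGB, blanking overhead ignored, so real figures run somewhat higher) shows 4K/144 sits above DisplayPort 1.4's ~25.92 Gb/s usable data rate while 4K/120 sits below it. Whether the driver applies that limit per-link or across the whole display engine once a fourth sink appears is a guess, but the 120Hz cutoff itself lines up with the math:

```python
def scanout_gbps(w, h, hz, bits_per_pixel=24):
    """Uncompressed video bandwidth in Gb/s; blanking intervals ignored."""
    return w * h * hz * bits_per_pixel / 1e9

pg27uq_144 = scanout_gbps(3840, 2160, 144)   # ~28.7 Gb/s
pg27uq_120 = scanout_gbps(3840, 2160, 120)   # ~23.9 Gb/s
uhd_60     = scanout_gbps(3840, 2160, 60)    # ~11.9 Gb/s (each 4K/60 sink)

DP14_USABLE = 25.92  # Gb/s after 8b/10b encoding, HBR3

print(f"4K/144 needs ~{pg27uq_144:.1f} Gb/s (over the {DP14_USABLE} limit)")
print(f"4K/120 needs ~{pg27uq_120:.1f} Gb/s (fits)")
```

So 120Hz is the highest refresh that fits full 8-bit RGB in DP1.4 bandwidth; why the driver only enforces it once the capture card becomes a fourth display is the open question.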
At this point, I've settled on using the capture card. I don't like it at all, especially since it uses a poop HDMI 2.0 interface instead of a DisplayPort 1.4 one. But that's how it goes when you primarily cater to the console market. I'm curious if anyone has any ideas here, though. Maybe some tweaks I can make to Scan Converter so it doesn't absolutely nuke my game's frame rate? Or something I can do to help OBS besides limiting the frame rate any further?
Machine 1 - Gaming Rig:
- Intel 7900X de-lidded, water cooled, OC'd to 4.7GHz
- 64GB of DDR4 OC'd to 4GHz
- Two Titan X Pascal GPUs, water cooled, OC'd
Machine 2 - Streaming Rig:
- Intel 5960X, water cooled (stock speed)
- 64GB of DDR4
- NVidia GTX1050
- Elgato 4K60 Pro capture card