If you want the premium experience, just accept that big biz has you by the gonads. Let's start by saying that DisplayPort is the unwanted child that does everything important, except nobody acts like it exists. It's great, and the gaming-oriented tech folks want to sell it to you, but it came late to the party and lacks the network effects that HDMI has, so it's widely unsupported. Great, except HDMI is a POS because of how convoluted and messy the specification itself is, particularly when there's no consistent labeling. At least with USB 3.0, if you saw the blue plasticky bit you knew it was 3.0. HDMI connectors come in colors that are all over the place, color coding that companies pay extra for hoping it'll be good marketing. I'm sure a professional who spends their entire life doing this understands it perfectly. I am a simpleton when it comes to this, and all I know is that a male cable and a female cable come together with an electrifying chemistry, or in this case, a working monitor.
My journey begins when I foolishly thought I was being smart buying Dell's Cyber Monday deal on a monitor. I snagged the 24" for $70, and it has a sort of archaic look to it that screams boring business. The issue, however, is that it only has an HDMI port and a VGA port. I couldn't get it to work on my AMD graphics card, even though the card had enough ports, because I already had 3 monitors hooked up. I was being cheeky trying to get a fourth monitor and I paid the price -- [[Hardware XXXX: 4 monitors is the perfect number]]. The trouble is I'm using an R9 380, an older graphics card, along with older monitors that have all been serving me faithfully since at least 2016. That's right, these bad boys have been with me through thick and thin and have been performing wonderfully. I thought I was being smart back then by future-proofing my setup and getting a 144Hz 1080p one. I don't think 4K is really necessary for a monitor, but a higher refresh rate is absolutely noticeable and is a game changer. Even today that monitor works fantastically, except for one problem: it relies on legacy connectors.
Three monitors work great. Nice job, AMD. However, the moment you add a 4th monitor, AMD starts pouting and sulking and only three will light up. Technically it's supposed to work, but you need 3 of the monitors to be the exact same model.. or was it 2? The answer will surprise you, because nobody knows. (As far as I can tell, the real rule is that these older cards only have a couple of independent display clocks for their legacy outputs, so any monitors beyond that limit either have to share identical timings -- read: be the same model -- or hang off DisplayPort.) I didn't know any of that either, so I went on down to my local overpriced Best Buy to pick up the cheapest adapter I could find, and it still cost me about the same as a 6' cable on Amazon... which is absurd. Ironic that we call the Amazon premium the price of convenience, when going down to the local tech store is also paying for convenience. That's another story for another day.
Anyway, it didn't work. So now I had this black screen looking down on me every day, and I couldn't handle it, so I began researching how to solve it. "Get an MST adapter, it's plug-and-play," they said, and so I did. I spent $80 to get a $70 monitor working with my setup. Except it didn't work. It turns out that, for some reason, the MST hub I bought isn't compatible with my graphics card, and none of the screens connected to it worked. So now I have 3 black screens staring down at me, and I have to crane my neck to view the side monitor because, well, that was just how the wiring ended up.
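Before blaming the hub, the one sanity check worth running is asking the OS what it actually detects. Here's a minimal sketch using the third-party `screeninfo` package (`pip install screeninfo`); it just lists whatever displays the OS considers active, which tells you quickly whether a screen is dead or merely disabled:

```python
# Ask the OS how many displays it actually sees right now.
# Uses the third-party `screeninfo` package (pip install screeninfo);
# works on both Windows and Linux.
from screeninfo import get_monitors

monitors = get_monitors()
print(f"OS reports {len(monitors)} active display(s):")
for m in monitors:
    print(f"  {m.name}: {m.width}x{m.height} at ({m.x}, {m.y})")
```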
So I tell myself that maybe MST isn't the way to go. Okay, so then I heard some people mention a DisplayPort-to-HDMI active adapter that would do the trick. I bought one of those. Then I read the fine print and realized that they come in different flavors and are unidirectional. Naturally, I accidentally bought the wrong direction: one that expects a DisplayPort monitor and an HDMI signal, which is the exact opposite of what I needed (a DisplayPort signal from the card going into an HDMI monitor). That was a dead end.
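If it helps, here's the rule that finally stuck in my head, written out as a toy snippet. To be clear, this is just a mental model of adapter directionality, not any real API: adapters are named source-to-sink, and an active one only converts in that one direction.

```python
# Mental model only: which box do you buy for a given GPU port and
# monitor port? Adapters are named source-to-sink and are one-way.

# (source port on the GPU, sink port on the monitor) -> what you need
ADAPTER_NEEDED = {
    ("DisplayPort", "DisplayPort"): "plain cable",
    ("HDMI", "HDMI"): "plain cable",
    ("DisplayPort", "HDMI"): "DP-to-HDMI adapter (active, or passive via a DP++ port)",
    ("HDMI", "DisplayPort"): "HDMI-to-DP active converter (a different, pricier box)",
}

def what_do_i_buy(gpu_port: str, monitor_port: str) -> str:
    return ADAPTER_NEEDED[(gpu_port, monitor_port)]

# My situation: DisplayPort out of the card, HDMI-only monitor.
print(what_do_i_buy("DisplayPort", "HDMI"))
# The box I actually bought solved ("HDMI", "DisplayPort") -- the opposite.
```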
Then I got the idea of upgrading my GPU, especially since it'd last me a good 3-6 years. AMD has all the power levels but is painfully lacking in software support. It's a horse without a saddle: you can ride it if you want to spend too much time figuring out how to ride bareback, but it's going to be painful. On the other hand, NVIDIA is all sunshine and sugar cubes. This was also the time when I discovered Stable Diffusion and the text2image rage that was going on. So I got an NVIDIA card. Suddenly the 4th monitor works. Fantastic, but the NVIDIA card doesn't have a DVI-whatever port (dual-link DVI, the thing my 144Hz monitor actually needs for its full refresh rate), so now that monitor is stuck at 60Hz.
Simple solution, I thought: I'd seen that there are HDMI cables with 144Hz support, so I whip open the ol' Amazon to find a 144Hz HDMI cable and overnight it, because I'm not paying the Best Buy premium again. ANNNND it turns out that for HDMI 1.4 or 2.0 or any of these specifications to actually work, not only do your graphics card and the cable need to support it, but your monitor needs to support it too.
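For the curious, the bandwidth math is simple enough to sketch. The blanking figures below are rough reduced-blanking-style guesses rather than anything pulled from my monitor's EDID, and the limits are the standard TMDS pixel-clock caps for each spec, so treat this as a back-of-the-envelope estimate:

```python
# Back-of-the-envelope check: can a given HDMI spec carry 1080p at 144 Hz?
# Blanking values are approximate CVT-reduced-blanking figures, not from
# any particular monitor's EDID.

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=80, v_blank=31):
    """Pixel clock = total pixels per frame * frames per second."""
    h_total = h_active + h_blank   # horizontal blanking (rough estimate)
    v_total = v_active + v_blank   # vertical blanking (rough estimate)
    return h_total * v_total * refresh_hz / 1e6

# Max TMDS pixel clocks allowed by each spec at 8-bit color:
HDMI_1_4_MAX_MHZ = 340   # ~10.2 Gbit/s on the wire
HDMI_2_0_MAX_MHZ = 600   # ~18 Gbit/s on the wire

clock = pixel_clock_mhz(1920, 1080, 144)
print(f"1080p@144Hz needs roughly {clock:.0f} MHz")   # ~320 MHz
print(f"Fits HDMI 1.4? {clock <= HDMI_1_4_MAX_MHZ}")  # True, barely
print(f"Fits HDMI 2.0? {clock <= HDMI_2_0_MAX_MHZ}")  # True, easily
```

By that math, 1080p at 144Hz squeaks in just under HDMI 1.4's ceiling, which is exactly why the chain is so fragile: plenty of monitors from that era only ever implemented 60Hz or 120Hz on their HDMI inputs, no matter what the cable and graphics card could do.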
In the end, the obvious answer was staring me right in the face: throw away the perfectly good old monitor and buy a new, expensive one so I can enjoy that quality experience I expected. That is exactly not what I did. I'll settle for 60Hz until the hole in my pocket recovers and/or one of my monitors starts to fail. On the other hand, the crippled 3060 I got works beautifully for Stable Diffusion, and I can render a standard image in 4-8 seconds. If you are interested in checking it out, here is my affiliate link. I got it for $389, and the price-to-value is off the charts. Obviously the 3080 or the 3090 have much better raw performance, but they are much worse value, and you could literally buy a used car for what they cost.
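For reference, here's roughly what that 4-8 second render loop looks like with the Hugging Face `diffusers` library. The model id, step count, and prompt are illustrative assumptions, not my exact settings:

```python
# Rough sketch of running Stable Diffusion on the 3060 via Hugging Face's
# `diffusers` library. Model id and settings are placeholders, not my
# exact setup -- adjust to taste.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed SD 1.x checkpoint
    torch_dtype=torch.float16,         # half precision fits the 3060's 12 GB
).to("cuda")

image = pipe(
    "a lonely fourth monitor, finally turned on, digital art",
    num_inference_steps=25,            # fewer steps = faster renders
).images[0]
image.save("fourth_monitor.png")
```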