Uh no it is not.
HDMI 1.4a could do 125Hz at 1080p after using ToastyX's patch. I ran 120Hz @ 1080p on my Seiki 4K TV for quite a while; I happen to have the 50" model. Even Blur Busters can confirm it is true 120Hz, and it can be bumped to 125Hz.
HDMI 2.0 has the bandwidth to do 144Hz at 1440p. It can run 144Hz at 1080p too. It's just that not many people know how to patch it, and not many TVs with HDMI 2.0 can legitimately do 120Hz or 144Hz without using duplication techniques.
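For a rough sanity check on those bandwidth claims, you can estimate the pixel clock a mode needs and compare it against the HDMI 1.4 cap (~340 MHz pixel clock / 10.2 Gbps TMDS) and the HDMI 2.0 cap (~600 MHz / 18 Gbps). A minimal sketch, assuming CVT-RB-style blanking figures rather than the exact timings a patched EDID would use:

```python
# Ballpark HDMI mode check: pixel clock ~= total_h * total_v * refresh.
# The blanking values are rough CVT-RB-style guesses, not real EDID timings.

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=45):
    """Approximate pixel clock in MHz for a given mode."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

HDMI_1_4_MAX_MHZ = 340.0   # ~10.2 Gbps TMDS link, ~340 MHz pixel clock cap
HDMI_2_0_MAX_MHZ = 600.0   # ~18 Gbps TMDS link, ~600 MHz pixel clock cap

modes = {
    "1080p @ 120 Hz": (1920, 1080, 120),
    "1080p @ 125 Hz": (1920, 1080, 125),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "2160p @ 60 Hz":  (3840, 2160, 60),
    "2160p @ 120 Hz": (3840, 2160, 120),
}

for name, (w, h, hz) in modes.items():
    clk = pixel_clock_mhz(w, h, hz)
    ok_14 = "fits" if clk <= HDMI_1_4_MAX_MHZ else "too much"
    ok_20 = "fits" if clk <= HDMI_2_0_MAX_MHZ else "too much"
    print(f"{name}: ~{clk:.0f} MHz pixel clock | HDMI 1.4: {ok_14} | HDMI 2.0: {ok_20}")
```

With these approximations, 1080p at 120-125Hz comes out under the HDMI 1.4 cap, 1440p @ 144Hz and 4K @ 60Hz only fit HDMI 2.0, and 4K above 60Hz fits neither, which lines up with the claims above.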
So why in the world of Satan's asshole would you talk when you know absolutely JACK **** about the subject?!?!
Here is some life advice that you and millions like you should have been taught as soon as you learned how to talk:
If you do not know ******* dick all about the subject at hand, keep your retarded opinions to yourself! If you know your **** and what is coming out of your mouth is indeed fact, then proceed to state your point; if it's idiot opinions, then shut the **** up!
So I guess in order for me to learn anything I need to have it? You can learn about **** even if you don't have it. That is how I can learn about a solar panel even if I don't have one. I could learn about a ******* nuclear power plant; I don't need to own one. As for the 4K monitor ****: it's a goddamn monitor, why would it not be able to reach a higher refresh rate? We might be limited now, but it can happen.
I never said you have to have something in order to know about it, but if you don't know about something, don't ******* talk about it.
and as of ******* now and for the next few years, 4K monitors do NOT go past 60Hz.
what may or may not be one day is neither here nor there.
well SLI is rarely if ever supported by any video games... so you probably did yourself a favor there, buddy, seeing as it only ***** most games up when it's turned on
honestly just take the other video card out and sell it on eBay or use it as a backup. never buy two video cards ever again, just buy one really good one. trust me.
the refresh rate of a monitor is how many times it updates per second
it's similar to frames per second
most PC monitors and TVs are 60Hz; some higher-end ones are 120Hz,
which provides smoother images. A 120Hz screen can also be used for 3D: it alternates between showing the left eye's image and the right eye's image at 120Hz, and people watching wear glasses that are synced with the monitor, covering the correct eye at the correct moment, so you see one image with your right eye at 60Hz and a different image with your left eye at 60Hz (this is active 3D; passive 3D, what you see at movie theatres, is vastly different).
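To put rough numbers on that, here's a quick arithmetic sketch of the frame interval at a few refresh rates and the effective per-eye rate under the active-3D alternation just described (nothing hardware-specific, just the math):

```python
# Frame interval for common refresh rates, plus the effective per-eye rate
# when an active-3D display alternates left/right images on every refresh.
for refresh_hz in (60, 120, 144):
    interval_ms = 1000.0 / refresh_hz   # time each refresh is on screen
    per_eye_hz = refresh_hz / 2         # active 3D: each eye gets every other refresh
    print(f"{refresh_hz} Hz: a new image every {interval_ms:.2f} ms "
          f"(~{per_eye_hz:.0f} Hz per eye in active 3D)")
```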
Yea, what's the deal with jellyfish? Like are they a solid or a liquid? And they don't have brains, yet somehow they're animals. And how come some of them never die? ******* jellyfish. They make no sense to me.
Jellyfish are about 95% water ... which is why they look like a gelatinous blob when out of water, but they are technically a solid.
while true jellyfish do not have bones, brains, blood, or hearts, they are classified in the kingdom Animalia (making them animals). It has to do with the makeup of their cells ... algae and bacteria don't have a separate nucleus in their cells, they have a cell wall, and they can propagate asexually; plants do have cell nuclei, but they have rigid cell walls made of cellulose; animal cells have only a plasma membrane, no cell wall, and a separate nucleus.
finally, it is theoretically possible for some jellyfish to be immortal, because if their cells are damaged (from attack, sickness, old age, or anything else) they can undergo a process that transforms them into a new type of cell ... however, most jellyfish only live for 1 to 5 years
I'm not some sort of console pheasant, but I'm not a die-hard Reese's Pieces Mustard Face either. I don't know if I would be able to tell the difference between 60fps and 120fps, or if I would care.
Differences of 10 FPS are easily unnoticed, but a difference of 60 is pretty big.
When I first saw 120fps, it seemed so strange to me.
It was a massive difference
By now I'm pretty used to it. I use a 60Hz monitor, so everything is usually 50-60 FPS, but my parents own a 144Hz screen. The comparison is noticeable, but not as jarring since I'm used to it.
The difference is actually really huge, but you don't miss it when you don't have it (imo). This is what I enjoy about the current advancement in graphics and the like.
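The frame-time arithmetic backs that up: a 60-to-120 FPS jump halves the time each frame sits on screen, while a 10 FPS bump barely moves it. A small sketch of those deltas (pure arithmetic, not a claim about what any individual can actually perceive):

```python
# How much sooner each frame arrives for a few frame-rate jumps.
jumps = [(50, 60), (60, 70), (60, 120), (120, 144)]
for old_fps, new_fps in jumps:
    saved_ms = 1000.0 / old_fps - 1000.0 / new_fps
    print(f"{old_fps} -> {new_fps} FPS: each frame arrives {saved_ms:.1f} ms sooner")
```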
Hey look, it's Metroid Prime. I remember that ************ made me so pissed as a kid. That was the first time I ever got so mad at a game I actually threw the controller.
why do people reply to random image rolls and talk about their lives in relation to the picture? besides, we all know Quadraxis was the biggest ******* prick in that series, next to Mogenar
ok u got me on the eyes bit. GJ, just assumed higher frequency meant more strain on the eyes.
but doubling the rate at which information is being piped to your monitor means the transistors in your video card's memory are being flipped at an increased rate. like any piece of memory, the transistors slowly die out over time the more you use them, and although your video card may still work after 10 years, I guarantee it won't be at the same speed as when you first got it.
Your graphics card isn't working twice as hard. It's working at exactly the same pace. It's just under higher stress.
Your graphics card works in a set frequency range, generally around 1 GHz. If you go to a 144Hz monitor it doesn't run at 2 GHz; that's not how it works. It makes literally no difference what refresh rate you're using.
I didn't say the rate it crunches data is doubled, though, did I? Its output is doubled; it's having to flip the gates on cached memory at double the rate it would otherwise... leading to a quicker death of said transistors.
nah transistors bug out all the time. we use a bandwidth analyzer at work to evaluate video cards used in diagnostic medical equipment. worst I ever saw was a card that only had access to 40% of its original memory after being in a system for only 2 years. mind you this was a scanner in an emergency department. somehow this was causing the host computer to shut off intermittently. the card would get very hot after being used for a very short time.
Well would you look at that, you made my point for me
"Get very hot"
"assuming it's not overheating"
Who'd have guessed that using something above its safe operating temperature lowers its lifespan? Not to mention cached memory has really nothing to do with refresh rate. It's outputting at whatever frame rate it's pushing, not the refresh rate. Big difference.
increased output definitely is linked to decreased reliability; I have no idea why anyone would argue otherwise... unless they were just overly concerned with being right.
the increased heat is due to increased resistance on the impeded circuit... i.e., the more transistors you bust up, the more heat you begin to generate.
i have a 120Hz screen in my laptop but i always leave it at 60Hz because i can never get over 60 FPS in games anyway. only reason i would bother is if i wanted to watch a 3D movie, and yes, it's a 3D screen