Posted by: staticx576
Posted by: MrNick01
"Lag would be that frames are already there and it takes a second to display to the monitor, where as low framerate is the frames need to be rendered first." - And what, my friend would take the moniter "a second" to display an already received signal?
That was an example, and an exaggeration at that; you know this, so stop being difficult. Want a real example? Take a mile-long cable: since electrical signals travel at a finite speed, it takes time for the signal from the video card to reach the display.
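To put a number on the cable example: here's a back-of-the-envelope sketch of the one-way propagation delay over a mile of cable. The 0.66c velocity factor is an assumed, typical value for copper cable, not a spec for any particular video link.

```python
# Rough propagation delay of a video signal over a mile-long cable.
# The velocity factor (~0.66c) is an assumption typical of copper cable.
SPEED_OF_LIGHT_M_S = 299_792_458
VELOCITY_FACTOR = 0.66          # assumed fraction of c in the cable
CABLE_LENGTH_M = 1_609.344      # one mile in metres

signal_speed_m_s = SPEED_OF_LIGHT_M_S * VELOCITY_FACTOR
delay_s = CABLE_LENGTH_M / signal_speed_m_s
print(f"one-way delay: {delay_s * 1e6:.1f} microseconds")
```

That works out to roughly 8 microseconds, which is tiny next to a 60 Hz frame interval (about 16.7 ms) but nonzero, which is the point: it adds delay without touching the framerate.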
Posted by: MrNick01
The GPU renders the images, converts the digital image into an analog format, and then sends this signal to the monitor, which displays the images (for all practical purposes) in real time.
Have you heard of DVI? It's all digital, so there's no need for analog. Yes, the conversion to analog would cost time if you were using an analog display, but it wouldn't affect the framerate; that delay might be considered lag.
Posted by: MrNick01
Last I checked, low framerates and lag are one and the same (unless you are referring to connection-based lag). It is absurd to think otherwise, because in such a case a remedy for lag would be to replace your monitor.
How dense are you that you cannot separate the two? As you put it, a remedy for lag would be to replace your monitor, and by your own reasoning that would in fact raise your framerate, which is simply not true.
This is all semantics anyway. Lag, as the term is commonly used, means that something takes longer to arrive than you want it to, whereas low framerate means the video card renders the game at a certain speed. In common terms, each frame reaches the display with the same delay regardless of how fast the game is rendered.
This is a completely different point than the one I was trying to make, and your examples are just a tad impractical.