I don't think we're talking about 64-bit GPUs. I was thinking of 64-bit CPUs. 64-bit GPUs are OLD.
Posted by: ProgramLog
Always up for learning something, so:
What's the difference between 32-bit and 64-bit gaming? And how do you find out whether your graphics card supports it or not?
Bits are the ons and offs a computer works in. A 64-bit CPU can fetch and work on 64 ons and offs (64 bits of data) at a time each cycle, versus 32 for a 32-bit CPU. The more bits you can handle at once, the better off you are.
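If you want to see this on your own machine, here's a tiny C sketch. It just checks the pointer size of the program, which on typical desktop systems matches whether the binary was built for 32-bit or 64-bit mode (that pointer size equals word size is an assumption, but it holds on common PC targets):

#include <stdio.h>

int main(void) {
    /* sizeof(void *) is the pointer size in bytes; times 8 gives bits.
       On typical desktop targets this tells you whether the program
       was compiled as a 32-bit or a 64-bit binary. */
    printf("this build is %zu-bit\n", sizeof(void *) * 8);
    return 0;
}

Note this reports what the *program* was built as, not the maximum your CPU supports; a 32-bit build on a 64-bit CPU will still print 32.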
Each bit is one on/off (even though I'm talking about CPUs, this is a good example). Imagine you want to change frames: that requires the computer to flip pixels on your monitor. It might turn one pixel off (or to a different color), but it all operates in 0s and 1s, which are ons and offs. The CPU would use a bit to change the color of that pixel, then change the rest of the pixels on the monitor for the frame to go by. Get what I'm saying yet?
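To make that pixel idea concrete, here's a toy C sketch. Everything in it (the framebuffer array, the toggle_pixel function) is made up for illustration, and real displays use many bits per pixel for color, not one, but it shows the on/off point: flipping a pixel really is flipping a single bit, and the advantage of a 64-bit CPU is that it can flip 64 of them in one operation.

#include <stdint.h>
#include <stdio.h>

/* Toy 1-bit-per-pixel framebuffer, 64 pixels wide: each uint64_t row
   holds 64 on/off pixels, so a 64-bit CPU can update a whole row in
   one operation instead of 64 separate ones. */
enum { WIDTH = 64, HEIGHT = 4 };
static uint64_t framebuffer[HEIGHT];

/* Toggle a single pixel: flip one bit, i.e. one "on/off". */
static void toggle_pixel(int x, int y) {
    framebuffer[y] ^= (uint64_t)1 << x;
}

int main(void) {
    toggle_pixel(3, 1);            /* flip one pixel */
    framebuffer[2] = ~(uint64_t)0; /* turn a whole 64-pixel row on at once */
    for (int y = 0; y < HEIGHT; y++)
        printf("row %d: %016llx\n", y, (unsigned long long)framebuffer[y]);
    return 0;
}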