pigdos
I learned in my basic digital logic and design courses that the clock rate
determines the rate at which flip-flops/registers can read in data. My
question is: how can a GPU running at, say, 500 MHz clock in AGP data at a
rate of 2+ GB/sec? I suppose if the registers clock in data on both the rising
and falling edges it wouldn't be an issue. Is this what happens?
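For concreteness, here's a back-of-the-envelope sketch of where the "2+ GB/sec" figure comes from, assuming the commonly published AGP 8x parameters (a 32-bit data bus, a 66 MHz base clock, and 8 transfers per base-clock cycle via source-synchronous strobing); note the bus interface clock here is separate from the GPU's 500 MHz core clock:

```python
# Back-of-the-envelope AGP 8x bandwidth, using the commonly published figures
# (assumed for illustration): 32-bit bus, 66 MHz base clock, 8 transfers/clock.
bus_width_bytes = 32 // 8        # AGP data bus is 32 bits = 4 bytes wide
base_clock_hz = 66_000_000       # AGP base clock, ~66 MHz
transfers_per_clock = 8          # AGP 8x moves 8 words per base-clock cycle

bandwidth = bus_width_bytes * base_clock_hz * transfers_per_clock
print(f"{bandwidth / 1e9:.3f} GB/s")  # ~2.112 GB/s, i.e. the quoted "2+ GB/sec"
```

So the peak number falls out of multiple transfers per clock on a modest bus clock, not from running the interface at the GPU core frequency.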