Re: Just what is a clock buffer anyway?
by MrTeal on 29/11/2012, 18:26:31 UTC
OK, those explanations make good sense, and are more or less in line with what I expected.

I imagine that a clock buffer, at one end, receives the signal from a clock source and demodulates it (digitizes it, quantizes it, whatever).  Then, based on its now digital interpretation of the input signal, it creates one or more new signals (maybe it has multiple outputs), which may have different characteristics than the input signal, such as amplitude (voltage), wave shape, and probably a delay (offset) relative to the input signal.
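Just to make that mental model concrete, here's a toy numeric sketch (in Python, with made-up threshold, output-swing, and delay values, not anything from a real part) of "compare the input to a threshold, then regenerate a clean output with its own levels and some delay":

[code]
import numpy as np

# Toy model: turn a noisy input clock into a clean square wave.
# All values (threshold, output swing, delay) are illustrative only.
fs = 1e9                       # 1 GS/s simulation sample rate
t = np.arange(0, 100e-9, 1/fs)
f_clk = 25e6                   # 25 MHz input clock
noisy_in = 0.6*np.sin(2*np.pi*f_clk*t) + 0.05*np.random.randn(t.size)

threshold = 0.0                # switching threshold the buffer compares against
v_high, v_low = 3.3, 0.0       # buffer drives its own output levels
delay_samples = 3              # a few ns of propagation delay through the buffer

digital = noisy_in > threshold                       # "quantize" the input
regenerated = np.where(digital, v_high, v_low)       # new amplitude and wave shape
regenerated = np.roll(regenerated, delay_samples)    # offset relative to the input
[/code]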

I can see this being done at the board level, but does it also make sense at the chip (die) level?

In VLSI design there is the concept of fan-out, which is the number of gates a gate has to drive. The larger the load on a gate, the more its capacitance slows the rise and fall of the signal edge. You can't just take a clock source and hook it up to a couple hundred points around the chip, because the capacitance is such that a minimum-sized transistor can't drive it. You can increase the drive capability of the circuit by cascading stages, making each stage about 4x larger than the last (see FO4), and by buffering the signal (rough sizing sketch below).

Just increasing the drive capability of your main clock source isn't always the best answer though, and local clock buffers are often used for different logic blocks. A clock buffer is basically two inverters in series. They aren't a cure-all though. You still run into skew, where the signal from your clock source arrives later at one part of the chip than another, and jitter, where the period of the clock isn't regular.

If they already have a working design "without flaws", they'd better be damned careful adding clock buffers. Depending on how synchronous the design is, it's not trivial to change a lot in your clocking system without introducing new problems.
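As a back-of-the-envelope illustration of the fan-out point, here's a small Python sketch of sizing a buffer chain where each stage drives roughly 4x its own input capacitance. The capacitance numbers are invented for illustration, not from any real process:

[code]
import math

# Rough FO4-style buffer chain sizing (all capacitances are made up).
c_in = 2e-15        # input capacitance of a minimum-sized inverter (~2 fF)
c_load = 2e-12      # total capacitance of the clock net being driven (~2 pF)

fanout_per_stage = 4                  # each stage drives ~4x its own input cap
total_fanout = c_load / c_in          # overall ratio the chain must cover
n_stages = math.ceil(math.log(total_fanout, fanout_per_stage))

print(f"Total fan-out: {total_fanout:.0f}x -> {n_stages} buffer stages")
for i in range(n_stages + 1):
    print(f"Stage {i}: ~{fanout_per_stage**i}x minimum size")
[/code]

The takeaway is that the number of stages only grows logarithmically with the load, which is why a handful of cascaded buffers can drive a clock net feeding hundreds of flip-flops.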