Estimating signal rise/fall time is a common task in digital board design. A typical scenario is a CMOS output driving an external load that is modeled as a capacitor. We often use an RC model to calculate the delay, but the CMOS I-V curve is not linear. This spreadsheet shows how to use the real CMOS I-V curve to get the delay. The idea is to calculate, at each voltage level, the delta time needed to charge the capacitor to the next level; summing the delta times then gives the rise/fall time.

The I-V curve used in the spreadsheet comes from the formula below:

It can also come from measurement or an IBIS model. The IO level is 2.5 V and Vt is 0.5 V. The maximum driving current needs to be given; normally it is something like 4 mA, 8 mA, 12 mA, etc. In addition to the rise time, the equivalent R at each voltage level is also shown. As expected, the equivalent R varies.
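The per-step calculation can be sketched in Python. The square-law I-V model, the 8 mA drive, and the 10 pF load below are illustrative assumptions, not taken from the xls; the actual curve could equally come from measurement or an IBIS model as noted above:

```python
# Sketch of the spreadsheet method: numerically sum dt = C*dV / I(V).
# The I-V curve here is an ASSUMED square-law pull-up model
# (Vdd = 2.5 V, Vt = 0.5 V, Imax = 8 mA); swap in a measured or
# IBIS-derived I(V) table for real designs.

VDD, VT, IMAX = 2.5, 0.5, 8e-3   # volts, volts, amps (8 mA drive assumed)
C = 10e-12                        # 10 pF load capacitance (assumed)
VOV = VDD - VT                    # gate overdrive of the pull-up

def i_drive(v_out):
    """Charging current at output voltage v_out (square-law model)."""
    vds = VDD - v_out             # voltage across the pull-up device
    if vds >= VOV:
        return IMAX               # saturation: roughly constant current
    # triode (linear) region, scaled so I = IMAX at the region boundary
    k = 2 * IMAX / VOV**2
    return k * (VOV * vds - vds**2 / 2)

def rise_time(v_start, v_end, steps=10000):
    """Sum the delta-t needed to charge C through each small voltage step."""
    dv = (v_end - v_start) / steps
    t = 0.0
    for n in range(steps):
        v = v_start + (n + 0.5) * dv   # midpoint of this voltage step
        t += C * dv / i_drive(v)       # dt = C*dV / I(V)
    return t

t_rise = rise_time(0.1 * VDD, 0.9 * VDD)   # 10%-90% rise time
print(f"10-90% rise time: {t_rise * 1e9:.2f} ns")

# One possible definition of the equivalent R at a voltage level: the
# resistor that would source the same instantaneous current at that point.
for v in (0.5, 1.25, 2.0):
    print(f"R_eq at {v:.2f} V: {(VDD - v) / i_drive(v):.1f} ohm")
```

As in the spreadsheet, the equivalent R is not constant: it shrinks as the output approaches the rail, because the device leaves saturation and its I-V slope steepens.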

Great answer, thanks for sharing.

nice

KSen,

Interesting xls. I played with it a little by changing the drive strength and load capacitance values. Let's define the MOS driver resistance R as Vdd/I_drive, where Vdd is 2.5 V in your xls. The rise time can then be approximated as 1.2RC instead of the usual 2.2RC, which I think makes sense: R is largest at Vds = 2.5 V, and as Vds decreases the driving current stays the same, so R also decreases. In the extreme case where the driving MOS is a constant current source, the rise time is just C*V/I = RC. The 1.2RC (rather than RC) comes from the linear region of the MOS.
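The 1.2RC observation can be checked numerically by integrating dt = C*dV/I(V) over a sweep of drive strengths and loads. The square-law pull-up model below is an assumption for illustration (not necessarily what the xls uses), with R defined as Vdd/I_drive as in the post:

```python
# Check the 1.2*R*C approximation against numerical integration of
# dt = C*dV / I(V), using an ASSUMED square-law pull-up model
# (Vdd = 2.5 V, Vt = 0.5 V). R is defined as Vdd / I_drive.

VDD, VT = 2.5, 0.5
VOV = VDD - VT   # gate overdrive

def i_drive(v_out, i_max):
    """Charging current at v_out for a driver with max current i_max."""
    vds = VDD - v_out
    if vds >= VOV:
        return i_max                          # saturation region
    k = 2 * i_max / VOV**2
    return k * (VOV * vds - vds**2 / 2)       # triode region

def rise_time(c_load, i_max, steps=20000):
    """10%-90% rise time by summing dt = C*dV / I(V)."""
    v0, v1 = 0.1 * VDD, 0.9 * VDD
    dv = (v1 - v0) / steps
    return sum(c_load * dv / i_drive(v0 + (n + 0.5) * dv, i_max)
               for n in range(steps))

for i_max in (4e-3, 8e-3, 12e-3):
    for c_load in (5e-12, 10e-12):
        r = VDD / i_max
        ratio = rise_time(c_load, i_max) / (r * c_load)
        print(f"I={i_max*1e3:.0f} mA, C={c_load*1e12:.0f} pF: "
              f"t_rise/(R*C) = {ratio:.2f}")
```

Under this model the ratio comes out near 1.2 regardless of drive strength or load, since the shape of the I-V curve (and hence the ratio) is scale-invariant in both I and C.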