Question #625bd

1 Answer
Feb 5, 2018

Increasing the resistance of the circuit slows the rate at which the capacitor discharges.
See the discussion that follows for more detail...

Explanation:

To give a qualitative explanation (without formulas and numbers), start with this: once the capacitor is charged, it holds a limited store of excess electrons on its negative plate (and an equal deficit on the positive plate), and that charge stays put until the two plates are connected by an external circuit. This separation of charge is what creates the potential difference between the plates of the capacitor.

Once a circuit is connected across the terminals of the capacitor, electrons begin to flow from the negative plate, through the circuit, and finally onto the positive plate. This movement reduces the charge on both plates equally, and it continues until both plates are neutral and the potential difference between them has fallen to zero.

Therefore, if we can reduce the rate at which charge moves through the circuit, we also increase the time it will take before the capacitor is discharged.

Increasing the resistance of the circuit reduces the current, and slows the rate at which the capacitor discharges.
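To make that statement concrete, here is a minimal numerical sketch (the capacitance, starting voltage, and resistor values below are made up purely for illustration; they are not from the question). It repeatedly applies Ohm's law to move a little charge off the plates each time step, and shows that a ten-times-larger resistance takes roughly ten times as long to discharge the same capacitor:

```python
# Numerical sketch: discharge the same capacitor through two different resistors
# and compare how long each takes. All component values are illustrative.

C = 100e-6   # capacitance in farads (100 microfarads, made-up value)
V0 = 12.0    # initial voltage across the plates, in volts (made-up value)
dt = 1e-5    # time step in seconds

def time_to_discharge(R, fraction=0.1):
    """Time for the capacitor voltage to fall to `fraction` of its starting value."""
    q = C * V0                    # initial charge on the plates (Q = C*V)
    t = 0.0
    while q > fraction * C * V0:
        v = q / C                 # voltage across the plates right now
        i = v / R                 # current through the resistor (Ohm's law)
        q -= i * dt               # charge carried off the plates in this step
        t += dt
    return t

for R in (1_000, 10_000):         # 1 kΩ versus 10 kΩ
    print(f"R = {R:6d} Ω: voltage down to 10% after about {time_to_discharge(R):.2f} s")
```

With the larger resistor, less current flows at every instant, so the same store of charge takes longer to drain; the RC product discussed next puts an exact number on that.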

Just one peek at the math:

If you multiply the capacitance C of the capacitor by the resistance R of the external circuit, the product (RC, of course) has units of seconds and tells you the time needed for the voltage to drop to 36.8% ($e^{-1}$) of its original value. Since the voltage and the charge remaining on the plates are directly proportional, this also means that only 36.8% of the charge remains on the plates after one time constant RC.
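
Written out explicitly (this is the standard result for a discharging RC circuit, stated here for completeness rather than quoted from the question):

$$V(t) = V_0\, e^{-t/(RC)}, \qquad Q(t) = C\,V(t) = Q_0\, e^{-t/(RC)}$$

At $t = RC$ this gives $V = V_0 e^{-1} \approx 0.368\,V_0$. As a quick worked example with made-up values, $R = 10\ \text{k}\Omega$ and $C = 100\ \mu\text{F}$ give a time constant $RC = (10^4\ \Omega)(10^{-4}\ \text{F}) = 1\ \text{s}$, so after one second only about 36.8% of the original charge is left.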