Post date: Feb 26, 2015 5:21:20 PM
Heat is generated in an electric circuit when current flows: the greater the current, the greater the heat output. For designing a heating element, this tells us to pick something with a low resistance, so that lots of current can flow. The question is how small we can make the element's resistance and still have a good heating element: 10 Ohm, 1 Ohm, 0.001 Ohm?
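To put rough numbers on the question, here is a minimal Python sketch of the naive calculation, ignoring everything except the element's resistance; the 12 V supply is an assumed example value, not one from the original circuit. For a fixed voltage, Ohm's law (I = V/R) and Joule heating (P = I^2 R) combine to give P = V^2/R:

```python
# Naive model: the element is the only resistance in the circuit.
# Ohm's law gives I = V / R; Joule heating gives P = I**2 * R = V**2 / R.
V_SUPPLY = 12.0  # assumed example supply voltage, in volts

for r_element in (10.0, 1.0, 0.001):  # the candidate resistances, in ohms
    current = V_SUPPLY / r_element
    power = current ** 2 * r_element  # heat output of the element
    print(f"R = {r_element:8} ohm -> I = {current:10.1f} A, P = {power:12.1f} W")
```

On these (assumed) numbers the naive model predicts a 0.001 Ohm element drawing 12,000 A and dissipating 144 kW from a 12 V battery, which should already look suspicious.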
Back to the last problem
Recall the simple heating element circuit from the previous example, which is made of a resistor (with resistance Relement), a battery (supplying a voltage Vsupply) and connecting wires.
By having a small resistance, we maximise the current and therefore the heating - for the simple circuit we used Ohm's law to find the current, I = Vsupply/Relement. By this reasoning, as we let the resistance of the element go towards zero we should get infinite current out - clearly this is nonsense and we must be missing something from our model!
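We can watch the prediction blow up explicitly with a quick sweep (same assumed 12 V supply as above), letting Relement fall by factors of a thousand:

```python
# The same naive model, sweeping the element resistance towards zero:
# I = V / R grows without bound, which is the unphysical prediction.
V_SUPPLY = 12.0  # assumed example supply voltage, in volts

r_element = 1.0
while r_element >= 1e-9:
    current = V_SUPPLY / r_element
    print(f"R = {r_element:.0e} ohm -> I = {current:.1e} A")
    r_element /= 1000.0
```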
In solving this problem previously we ignored two things:
the wires have some small resistance (Rwire).
the battery has an internal resistance (Rint).
Why do I care about these now? Well, as we make the resistance of the element closer to zero, these small resistances become comparable to Relement itself, so we can no longer ignore their contributions.
To answer this we have to revisit our simple circuit and start to think about the things we left out before. Our new circuit diagram looks like this, where we now also have Rwire and Rint (the wire and internal resistances) in series with Relement.
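With the extra resistances in series, the current is set by the total resistance, I = Vsupply/(Rint + Rwire + Relement), and only the I^2 Relement part of the heat is produced in the element itself. Here is a minimal sketch of the revised calculation, with assumed example values for the supply and both extra resistances:

```python
# Revised model: the wire resistance (R_wire) and the battery's internal
# resistance (R_int) sit in series with the element, so they limit the current.
# All values below are assumed examples, not from the original post.
V_SUPPLY = 12.0  # volts
R_INT = 0.5      # ohm, battery internal resistance
R_WIRE = 0.1     # ohm, connecting-wire resistance

for r_element in (10.0, 1.0, 0.6, 0.001):  # ohms
    current = V_SUPPLY / (R_INT + R_WIRE + r_element)  # series resistances add
    p_element = current ** 2 * r_element  # heat produced in the element only
    print(f"R_element = {r_element:8} ohm -> I = {current:6.2f} A, "
          f"P_element = {p_element:7.2f} W")
```

With these (assumed) numbers the element's heat output no longer runs away: it peaks when Relement matches Rint + Rwire (here at 0.6 Ohm) and falls towards zero as Relement shrinks further, because the current saturates at Vsupply/(Rint + Rwire) while an ever larger share of the heat is produced in the wires and battery instead of the element.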