What is a load resistor?

What is the difference between a load resistor and a normal one?

The term "load voltage" is used to describe the voltage drop caused by a series-connected ammeter in a particular situation. The term "load resistance" is used to describe situations where each additional unit of current flowing through the device increases the load voltage by a fixed amount (e.g., if every milliampere flowing through the device adds an additional millivolt of drop, the load resistance is one ohm). A very common means of sensing current flowing into or out of a circuit is to put a resistor in series with one branch of the circuit and then measure the voltage drop across that resistor. In most such designs, very little current flows into, out of, or through the voltage-measurement circuit; almost all of the current flows through the resistor.
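The millivolt-per-milliampere relationship above is just Ohm's law applied to the sense resistor. A minimal sketch (the resistor value and measured drop are illustrative, not from the question):

```python
# Shunt-based current sensing: the current through the sense resistor is
# inferred from the voltage drop across it via Ohm's law (I = V / R).
# With a 1-ohm shunt, every milliampere of current adds one millivolt of drop.

def sensed_current(v_drop, r_sense):
    """Infer the current (amperes) through a sense resistor from the
    voltage drop (volts) measured across it."""
    return v_drop / r_sense

# Example: 50 mV measured across a 1-ohm shunt implies 50 mA of current.
print(sensed_current(0.050, 1.0))  # 0.05
```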

In general, I've heard the term "current sense resistor" used to describe the physical device and the term "load" used to describe the effect seen by the circuit being monitored. Note that from the standpoint of the circuit being monitored, the ideal "load resistance" would be zero ohms or, failing that, as small as possible. On the other hand, from the standpoint of the device making the measurement, a larger value for the current sense resistor will, up to a point, make the current measurement easier and more accurate. The "purpose" of the current sense resistor is not to apply a load voltage to the circuit under observation; rather, it is to generate a voltage that the voltage-measuring circuit can see. The fact that such a voltage acts as a load voltage is an unfortunate side effect.
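That trade-off can be made concrete with a hypothetical sizing exercise: a larger sense resistor gives the measuring circuit more volts per ampere (finer resolution per ADC count), but imposes a larger burden voltage on the circuit being monitored. The ADC reference, bit depth, and candidate resistor values below are assumptions for illustration only:

```python
# Trade-off when choosing a sense resistor value: measurement resolution
# versus the "load voltage" imposed on the monitored circuit.

def lsb_current(v_ref, bits, r_sense):
    """Smallest current step (amperes) resolvable by an ADC with full-scale
    voltage v_ref and the given bit depth, reading the drop across r_sense."""
    return (v_ref / 2 ** bits) / r_sense

def burden_voltage(i_load, r_sense):
    """Voltage drop (volts) the sense resistor steals from the circuit."""
    return i_load * r_sense

# Compare candidate shunt values with a 3.3 V, 12-bit ADC and a 1 A load.
for r in (0.01, 0.1, 1.0):
    print(f"R = {r:4.2f} ohm: resolution = {lsb_current(3.3, 12, r) * 1000:.3f} mA/count, "
          f"burden at 1 A = {burden_voltage(1.0, r) * 1000:.0f} mV")
```

The 1-ohm shunt resolves sub-milliampere steps but drops a full volt at 1 A, while the 10-milliohm shunt barely loads the circuit but gives the ADC far less signal per count; real designs often add an amplifier so a small shunt can still be read accurately.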