Ohm (unit)
Definition and Overview
The ohm (symbol: Ω) is the SI derived unit of electrical resistance in the International System of Units (SI). Named after the German physicist Georg Simon Ohm, it is defined as the resistance between two points of a conductor when a constant potential difference of one volt, applied to these points, produces in the conductor a current of one ampere, the conductor not being the seat of any electromotive force.
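Expressed compactly in terms of other SI units, the definition amounts to one volt per ampere: 1 Ω = 1 V/A.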
Historical Background
The concept of resistance was first introduced by Georg Simon Ohm in his 1827 publication, "Die galvanische Kette, mathematisch bearbeitet" (The Galvanic Circuit Investigated Mathematically). Ohm's work laid the foundation for the quantitative treatment of electrical resistance and its measurement, leading to the eventual adoption of the ohm as the unit of electrical resistance.
Derivation from Base SI Units
In the SI, the ohm is a derived unit. Expressed in base units, it is one kilogram meter squared per second cubed per ampere squared (kg·m²·s⁻³·A⁻²).
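This expression follows from the definitions of the volt and the watt. Since 1 W = 1 kg·m²·s⁻³ and 1 V = 1 W/A = 1 kg·m²·s⁻³·A⁻¹, the ohm reduces to:
1 Ω = 1 V/A = 1 kg·m²·s⁻³·A⁻².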
Measurement and Calculations
The resistance of a component can be calculated using Ohm's law, which states that the current through a conductor is proportional to the voltage across it. Rearranged, the resistance (R) equals the voltage (V) divided by the current (I): R = V/I. This relation gives the resistance in ohms when the voltage is expressed in volts and the current in amperes.
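For example, if a potential difference of 12 volts across a component produces a current of 2 amperes, its resistance is R = 12 V / 2 A = 6 Ω.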
Applications and Importance
The ohm is fundamental to the analysis of electrical and electronic circuits, as it quantifies the opposition a material or component presents to the flow of electric current. This resistance is crucial in controlling and managing electrical energy in circuits, making the ohm integral to the design and operation of electronic devices.