What is the need for standard units?
The internationally accepted standard units are the SI units. The International System of Units, universally known as the SI (from the French Système International d'Unités), is necessary because the lack of a standard unit of measurement would cause confusion and waste time in constantly converting from one unit to another. For this reason, the SI system has been adopted all over the world as the standard system of measurement.
We need standard units to be able to communicate facts, measurements, and durations clearly and precisely. At least, that is the intent. The metric system was formally adopted as the SI in 1960, yet the problem of standard units for the whole world remains unsolved: the United States is one of the few developed countries that has not converted to the metric system (Celsius, meters, grams, etc.) and still uses the English system (Fahrenheit, feet, yards, miles, pounds, ounces, etc.). There is more cooperation on standard units in the engineering and scientific communities.
“By the eighteenth century, dozens of different units of measurement were commonly used throughout the world. Length, for example, could be measured in feet, inches, miles, spans, cubits, hands, furlongs, palms, rods, chains, leagues, and more. The lack of common standards led to a lot of confusion and significant inefficiencies in trade between countries. At the end of the century, the French government sought to alleviate this problem by devising a system of measurement that could be used throughout the world. In 1790, the French National Assembly commissioned the Academy of Science to design a simple decimal-based system of units; the system they devised is known as the metric system. In 1960, the metric system was officially named the Système International d'Unités (or SI for short) and is now used in nearly every country in the world except the United States. The metric system is almost always used in scientific measurement.”
“One might think people would have a very good number sense, but as it turns out, people do not. Experiments have shown that the average person has a number sense that is around four. People groups in the world today that have not developed finger counting have a hard time discerning the quantity four. They tend to use the quantities one, two, and many, which would include four. …
So what separates people from the rest of the animal kingdom? It may include many things, but the ability to count is very much one of them. Counting, which usually begins at the end of our own hands or fingers, is usually taught by another person or possibly by circumstance. It is something that we should never take lightly for it has helped advance the human race in countless ways.”
Old customs are hard to change. It might be more practical to “eyeball” a measurement instead of using a ruler, a scale, a measuring cup or other instrument. I wondered if there is some biological aspect to number recognition. I was surprised to learn that humans are not really different in their number sense than other animals.
“There was little need for a numeric system until groups of people formed clans, villages and settlements and began a system of bartering and trade that in turn created a demand for currency.”
Human counting ability has been extended by more advanced mathematical concepts and tools. Engineering and scientific notation were developed to express very small or very large numbers. The origin of scientific notation is disputed: some sources credit Descartes with inventing it, others Archimedes. The term “scientific notation” itself, however, first came into use in the 1960s among computer scientists.
“The Oxford English Dictionary (http://www.oed.com/) keeps records of the first time that any particular word appears in print. The first recorded use of the term scientific notation goes back to 1961 in the third edition of the New international dictionary of the English language. The next recorded use of the word in the Oxford English Dictionary comes from 1963 in Digital Computer Technol. & Design, “The power of the base appearing in an expression which is in scientific notation in effect indicates the position of the point.” The 1963 quote above makes it clear that scientific notation referred to any number of the form first number times (second number raised to third number). In modern usage, the second number is always 10 in scientific notation, and the more general term exponential notation can be used when this second number is different.”
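The form described above, a first number times 10 raised to a power, maps directly onto the e-notation that programming languages use for floating-point literals. A minimal sketch in Python (the helper name `to_scientific` is ours, not from any source):

```python
def to_scientific(x, sig_figs=3):
    """Format a number in scientific notation: mantissa times 10 to an exponent."""
    return f"{x:.{sig_figs - 1}e}"

# Very large and very small numbers become compact and easy to compare.
print(to_scientific(6300000000000000000))    # -> 6.30e+18
print(to_scientific(0.00000000000000000016)) # -> 1.60e-19
```

The `e+18` here plays exactly the role of the “power of the base” in the 1963 quote: it indicates the position of the point.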
What is the standard unit used to measure Voltage? Current? Resistance?
"The volt is the quantity of electric potential energy per unit charge measured in joules per coulomb.
James Prescott Joule first proved that heat was a form of energy so the unit of energy most commonly used by physicists, joules, was named after him.
A joule is the amount of energy required (or work performed) when applying a force of 1 newton (1 newton=0.2248 pounds) over a distance of 1 meter. There is a famous legend about apples falling on Isaac Newton's head, and a small apple weighs roughly one newton. So think of a joule as roughly the energy required to lift an apple from the floor to a table, roughly 1 meter.
One can also understand how much energy a joule is by noting that a 100 watt light bulb emits 100 joules of energy each second. To convert between joules and calories, 1 calorie equals 4.186 joules.
An erg is a very small energy unit that is still occasionally used by scientists. An erg is 1 ten millionth of a joule. (1 erg=1E-7 joule) Think of an erg as very roughly the amount of energy it takes a flea to jump."
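The conversions in the passage above (watts to joules per second, calories to joules, ergs to joules) can be checked with a few lines of arithmetic. A small sketch using the constants quoted in the text:

```python
JOULES_PER_CALORIE = 4.186  # 1 calorie = 4.186 joules, as quoted above
ERGS_PER_JOULE = 1e7        # 1 erg = 1e-7 joule

# A 100-watt bulb emits 100 joules of energy each second.
bulb_watts = 100
seconds = 60
energy_joules = bulb_watts * seconds  # 6000 J in one minute

energy_calories = energy_joules / JOULES_PER_CALORIE
energy_ergs = energy_joules * ERGS_PER_JOULE

print(f"{energy_joules} J = {energy_calories:.1f} cal = {energy_ergs:.1e} erg")
```

One minute of the bulb's light is about 1433 calories, or 60 billion ergs, which gives a feel for how differently sized these energy units are.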
“Between 1785 and 1787, the French physicist Charles Augustine de Coulomb performed a series of experiments involving electric charges, and eventually established what is nowadays known as Coulomb's law. According to this law, the force acting between two electric charges is radial, inverse-square, and proportional to the product of the charges. Two like charges repel one another, whereas two unlike charges attract.”
“Opposite charges will produce an attractive force while similar charges will produce a repulsive force. The greater the charges, the greater the force. The greater the distance between them, the smaller the force.
Quantity of charge can be measured in either elementary charges (an elementary charge is the amount of charge on one electron or proton) or in Coulombs. An elementary charge is a very tiny unit of charge. Since it is so small it is not usually a convenient unit to measure typical amounts of charge. It would be similar to measuring distances from one town to the next, in millimeters.
On the other hand, a coulomb is an incredibly large unit of charge. It is actually too large a unit of charge for talking about electrostatics (stationary charges) but it is an appropriately sized unit as we begin describing the quantity of charge moved in an electric circuit.
Unfortunately, we are stuck with either one unit or the other: 1 Coulomb = 6.3×10^18 elementary charges; 1 elementary charge = 1.6×10^-19 Coulomb.”
A volt is the potential difference across which 1 joule of work moves 1 Coulomb of electric charge. A Coulomb is an astronomical number of electrons: a trillion is a million million, and a Coulomb is 6.3 billion billion electrons, in other words 6.3 million trillion electrons.
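The two charge units quoted above convert by a single multiplication. A quick sketch using the rounded figures from the quote (more precise values exist, but these match the text; the function names are ours):

```python
ELEMENTARY_CHARGES_PER_COULOMB = 6.3e18   # rounded figure from the quote
COULOMBS_PER_ELEMENTARY_CHARGE = 1.6e-19

def coulombs_to_electrons(q_coulombs):
    """Number of elementary charges carried by q_coulombs of charge."""
    return q_coulombs * ELEMENTARY_CHARGES_PER_COULOMB

def electrons_to_coulombs(n_electrons):
    """Charge in coulombs carried by n_electrons elementary charges."""
    return n_electrons * COULOMBS_PER_ELEMENTARY_CHARGE

print(coulombs_to_electrons(1))     # 6.3e+18 electrons in one coulomb
print(electrons_to_coulombs(1e12))  # even a trillion electrons is a tiny charge
```

A trillion electrons comes out to only about 1.6×10^-7 coulombs, which makes concrete the point that the coulomb is an incredibly large unit.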
“Electric current is measured as the amount of electric charge transferred per unit time.” The SI unit of electric current is the ampere defined as 1 coulomb per second.
“In order for a current to exist in a conductor, there must be an electromotive force (emf), or potential difference, between the conductor’s ends. An electric cell, a battery of cells, and a generator are all sources of electromotive force.”
Electrical resistance is a material’s opposition to the flow of electrons; it is measured in ohms.
“An electron traveling through the wires and loads of the external circuit encounters resistance. Resistance is the hindrance to the flow of charge. For an electron, the journey from terminal to terminal is not a direct route. Rather, it is a zigzag path which results from countless collisions with fixed atoms within the conducting material. The electrons encounter resistance - a hindrance to their movement. While the electric potential difference established between the two terminals encourages the movement of charge, it is resistance which discourages it. The rate at which charge flows from terminal to terminal is the result of the combined effect of these two quantities.”
Ohm’s law describes the relationship between voltage (V), current (I), and resistance (R); V = I R or I = V/R or R = V/I.
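The three forms of Ohm's law are simple enough to sketch directly. A minimal example (the function names are ours):

```python
def voltage(current_amps, resistance_ohms):
    """V = I * R"""
    return current_amps * resistance_ohms

def current(voltage_volts, resistance_ohms):
    """I = V / R"""
    return voltage_volts / resistance_ohms

def resistance(voltage_volts, current_amps):
    """R = V / I"""
    return voltage_volts / current_amps

# A 9-volt battery across a 450-ohm resistor drives 0.02 A (20 mA),
# and the three forms are consistent with one another.
print(current(9, 450))     # I = 0.02 A
print(voltage(0.02, 450))  # V = 9.0 V
print(resistance(9, 0.02)) # R = 450.0 ohms
```

Knowing any two of the three quantities fixes the third, which is why a single measurement with a multimeter often suffices in practice.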
So many of the measurements and instruments in biomedical engineering are derived from electronic sources that a degree of familiarity with the physical relationships is essential.
When people want or need to cooperate in doing or building things, there must be a way for them to agree on the size of the parts each is making.
If this were not the case, things would not fit together.
However, if we all agree on how long a meter is and we make a room 10 meters long and 10 meters wide, we know that we need 100 square meters of carpet to cover its floor. Without this standard unit of measurement, when we went out to buy carpet we might get too much or too little.
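The carpet example is just multiplication, but it shows why a shared unit matters: buyer and seller compute the same area only because they mean the same thing by "meter". A tiny sketch:

```python
def carpet_area(length_m, width_m):
    """Area in square meters for a rectangular room measured in meters."""
    return length_m * width_m

# Both the homeowner and the carpet shop arrive at the same figure.
print(carpet_area(10, 10))  # 100 square meters of carpet needed
```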