By Vitaly MELNIKOV, Dr. Sc. (Phys. & Math.), department head, All-Russia Research Institute of the Metrological Service (VNIIMS);
Kirill BRONNIKOV, Dr. Sc. (Phys. & Math.), department head, VNIIMS
Our research center dates back to the year 1900 when, at the initiative of the great Russian scientist Dmitry Mendeleyev, a Chamber of Commercial Weights and Measures, one of the first in Russia, was set up. Today VNIIMS is the linchpin of our country's metrological service...
Among our research staff of two hundred and sixty we have 64 degree-holding scientists - eleven Doctors and fifty-three Candidates of Sciences. All of them are concerned with a wide range of weights-and-measures matters and everything that goes with them - state standards, certification and the like. Gravimetry is an important part of our activities. Besides basic research, we are involved in economic, legal and organizational matters bearing on the unified countrywide system of weights and measures.
Among other things, our research center (VNIIMS) operates as the state reference center for pressure measurements (in the range 10² to 2.5×10⁵ Pa). Our microprocessor-controlled set-point devices (used to verify pressure gauges and related instruments) boost verification efficiency thirtyfold. Such reference-input elements have no analogues elsewhere in the world.
Now take the accuracy of machining and surface finishing in industry, a problem to which we give a good deal of attention. Our basic standard on the parameters and characteristics of workpiece surface roughness has been adopted in other countries as an international standard.
Our experts are also active in electrical engineering. They have developed a family of differential high-voltage meters for 100, 400 and 800 kV with an error of only 0.01-0.05 percent. These meters carry the trademark DVINA.
Or take chemistry where unified standards are all-important for assays. To help chemists, our center
has developed standard specimens of pure hydrocarbons C₁-C₄ for the calibration of gas chromatographs and mass spectrometers; we have also drawn up codes and specifications for chromatograph testing and checking.
As you see, our VNIIMS Institute caters to a very wide range of human activity. It tests and certifies seismometers and infrasound vibration transducers; it checks the form and quality of workpiece surfaces; it verifies three-dimensional measuring machines (used to check the forms and dimensions of sundry items, say, turbine rotor blades) and high-precision resistance thermal converters... And we are doing lots of other things besides.
And yet there is something else we are particularly proud of. This is our method of determining the gravitational constant. Any basic physical theory, you see, has to deal with constants describing the stability of various processes and substances. Such values are very important, for they are independent of time and place and have the same magnitude everywhere - at the present-day level of accuracy, at least. Such constants are known as the Fundamental Physical Constants (FPCs). But this concept cannot be defined in rigorous terms, nor can the set of constants be fixed once and for all: the constants are by and large dimensional and tend to be superseded by new, more general ones. Also, there are certain correlations between the old and the new constants. Here's a typical example.
Until recently physicists knew of four types of basic physical interactions: gravitational, electromagnetic, strong and weak. A set of constants was determined accordingly. These are the four values characterizing each of the above interactions: the velocity of light, Planck's constant* (also called the quantum of action), the electron charge, and the proton (or electron) mass; to these we must add three cosmological parameters as well as the Boltzmann constant** and Joule's equivalent, or the mechanical equivalent of heat. The latter two serve as conversion factors between temperature, on the one hand, and energy and mechanical quantities, on the other.
Today a new theory has been conceptualized: it integrates the electromagnetic and weak interactions into what we call the electroweak interaction. This theory has proven effective in numerous experiments on elementary particles at laboratories around the world. Hence the set of FPCs is change-prone. For instance, the constant characterizing strong interactions is now giving way to a parameter proper to this phenomenon, used in quantum chromodynamics (QCD)***; the constants of macroscopic phenomena (the gravitational and cosmological constants), however, remain the same.
Yet should a new theory be devised to cover all four interactions, we will get yet another set of constants. All kinds of options have been suggested in the last few decades - supergravity, superstrings, and what is called M-theory.
Besides the list of adopted basic, fundamental constants there is another important facet: their measurement accuracy, which varies. While the velocity-of-light constant is taken to be absolutely exact, the situation is different for the microscopic (atomic) constants - e.g. the electron mass and charge and other parameters; here the relative error is between 10⁻⁶ and 10⁻⁸, and in some cases it may be as large as 10 percent or even higher.
The set and character of basic constants are determined by physical laws directly linked to the birth and evolution of the universe. Some of these constants may be viewed as limiting values for definite states of matter. Say, in relativistic theories the velocity of light is the highest possible speed of any object; in a similar way, the electron charge is the smallest amount of electricity. And so forth. In turn, combinations of constants are taken as natural scales for the basic physical units of time, length and mass. For example, in 1983 the meter was redefined in terms of the velocity of light - as the distance light travels in vacuum in 1/299,792,458 of a second - replacing the earlier wavelength-based definition and, before that, the platinum-iridium bar standard.
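The idea of building natural scales out of combinations of constants can be illustrated with the Planck units, formed purely from the velocity of light, the (reduced) Planck constant and the gravitational constant. The sketch below is only an illustration with rounded constant values, not part of the institute's work:

```python
import math

# Rounded SI values of the three constants, treated here as given inputs
c = 2.998e8        # velocity of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J*s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2

# Natural scales for length, time and mass built from the constants alone
planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
planck_time = math.sqrt(hbar * G / c**5)    # ~5.4e-44 s
planck_mass = math.sqrt(hbar * c / G)       # ~2.2e-8 kg

print(f"Planck length: {planck_length:.3e} m")
print(f"Planck time:   {planck_time:.3e} s")
print(f"Planck mass:   {planck_mass:.3e} kg")
```

No single one of the three constants fixes these scales; only their combination does, which is why the Planck units are a standard example of such natural scales.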
All these constants may be tentatively divided into four groups. The first one comprises universal FPCs, such as Planck's constant, whereby all processes and phenomena are divided into quantum and nonquantum (the micro- and macroworlds); to some extent the velocity of light also belongs to this group: here motion is viewed either as relativistic (if velocities are close to that of light) or as nonrelativistic. The second group takes in constants characterizing the basic interactions. The third group encompasses the constants of elementary particles of matter, such as the proton and electron masses. And finally, the fourth group contains the conversion factors - the Boltzmann constant and the mechanical equivalent of heat (Joule's equivalent), and partly the velocity of light.
* Planck's constant - a basic constant of quantum theory, designated by h and equal to 6.626×10⁻³⁴ J·s. - Ed.
** Boltzmann constant - a physical constant equal to the ratio of the universal gas constant to Avogadro's number. - Ed.
*** See: Yu. Simonov, V. Shevchenko, "Quarks: Trapped and Liberated", Science in Russia, No. 2, 1998. - Ed.
Yet the division of constants into the above four groups should not be understood in absolute terms as something immutable: as we have said, FPCs may move from one group to another as new evidence appears. For example, the electron charge was first ranked within the third group only, while now it has "moved" into the second group, for in combination with other constants it accounts for the electroweak interaction. As to the velocity of light, it has already "visited" all four groups.
One particular constant is of primary significance for many contemporary lines of research, be it gravitational-relativistic theory or the attempt to develop a unified theory of the basic physical interactions. This is the gravitational constant, G. Without it we cannot accurately measure the masses and mean densities of the planets of the solar system and ultimately construct reliable models of them; we cannot couple mechanical and electromagnetic quantities; we cannot identify new physical interactions, geophysical effects and so forth.
Hence the conclusion: the problem of an accurate value of G and of its stability as a constant is central to the understanding of many natural phenomena. This problem involves three essential steps:
1) determining the absolute value of this constant;
2) its possible variations in time (slow ones, comparable to the rate of expansion of the universe, or slower);
3) its possible variations with distance (or novel non-Newtonian interactions).
Let's recall: G is the proportionality factor in the formula expressing Newton's law of gravitation. Its value has been determined in many laboratory measurements using the torsion balance associated with the English physicist and chemist Henry Cavendish (1731-1810); the relative error margin is 10⁻³. However, scientists of three countries - Russia, the United States and France - have succeeded in improving the measuring accuracy to 10⁻⁴. This looks like the limit of the possible (because we cannot eliminate or fully account for the impact of sundry ambient objects, or for errors due to the instability of the suspension filaments, among other things). Or else we are dealing with some new kind of physics.
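The formula in question is Newton's law of gravitation, F = G·m₁·m₂/r². Since the force is directly proportional to G, the quoted relative error of G carries over one-to-one into any force computed from it. A minimal sketch (the masses and distance are invented, Cavendish-style numbers):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
rel_err_G = 1e-4   # best laboratory relative error mentioned in the text

def gravitational_force(m1, m2, r):
    """Newton's law: attraction between point masses m1 and m2 at distance r."""
    return G * m1 * m2 / r**2

# Illustrative setup: a 158 kg source mass and a 0.73 kg test mass 0.2 m apart
F = gravitational_force(158.0, 0.73, 0.2)
print(f"Force: {F:.3e} N, known only to within +/- {F * rel_err_G:.1e} N")
```

Because F is linear in G, halving the uncertainty of G would halve the uncertainty of every such computed force, which is why the hunt for extra decimal places matters.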
FPC values are not only of fundamental but also of metrological significance. The contemporary system of standards for various quantities is chiefly based on quantum physics phenomena. Therefore the stability of the constants is of cardinal importance. Yet natural laws have been formulated and verified in experiments on earth and in near space only over a relatively short span - the last two or three centuries. The times and distances involved are as nothing compared with the age and dimensions of the universe. Accordingly, being unable to rule out a priori slow variations of the constants (commensurate with the rate of evolution of the universe), we should recheck the corresponding values again and again with ever better accuracy.
This problem cropped up simultaneously with attempts to explain the connection between phenomena of the micro- and macroworld. The British physicist Paul Dirac (1902-1984) was the first to conceptualize this in his large numbers hypothesis. According to it, very large numbers should not occur at random in physical theories but should be related to the age of the universe. And since this age is ever changing, the gravitational constant must likewise be in a state of "flux".
That original hypothesis of Dirac's was followed by a good many hypotheses and theories allowing for the variation of some FPCs. And that may bring us to important astrophysical, cosmological, geophysical and other consequences and effects. Our VNIIMS Institute has developed a method that makes it possible to predict the gravitational constant's annual variation at the level of 10⁻¹²-10⁻¹³ and to visualize one of the possible scenarios of the origin and evolution of the universe.
Experimental data obtained from various sources - the rate of coral growth, the slowdown of pulsed electromagnetic radiation from pulsars and, most reliable of all, laser ranging of the moon - have so far pinned this value down only to within 10⁻¹¹-10⁻¹². Predicting the various fluctuations of FPCs both in time and in space is rather problematic in the absence of a well-conceived unified theory of interactions. For the gravitational constant this problem can be tackled by means of the SEE (Satellite Energy Exchange) project, now at the gestation stage and involving researchers of VNIIMS and the Gravitational Society in collaboration with US counterparts from the Universities of Tennessee and Virginia; also taking part are experts of the National Laboratory at Oak Ridge, Tennessee, and of the Marshall aerospace center.
The idea of this experiment is based on an effect seen in the movement of Saturn's satellites. A capsule composed of two coaxial cylinders is lifted into a 1,500 km-high circumterrestrial orbit. Within this capsule two bodies are in free motion: one large, dubbed the "shepherd", with a mass of about 500 kg, and the other a small "particle" weighing only about 100 g. During the flight the two objects move like satellites in orbit; however, the small body, gaining extra energy from the shepherd's gravitation, changes its trajectory all along.
Laser-interferometer sensors keep tabs on the position of both objects with respect to the capsule, to an accuracy of up to a micron. The readings, taken at small intervals, are loaded into a computer and compared with theoretical trajectories calculated for different values of G. The computed trajectory closest to the experimental one yields the sought-for value. According to theoretical estimates, the gravitational constant can thus be determined within an error limit of 10⁻⁶, or perhaps even better.
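The matching procedure just described can be caricatured in a few lines: generate "measured" positions with some true G, integrate candidate trajectories for a grid of G values, and keep the candidate with the smallest squared deviation. This is only a schematic one-dimensional sketch with invented numbers, not the actual SEE data pipeline:

```python
def integrate(G, m_shepherd=500.0, x0=1.0, v0=0.0, dt=1.0, steps=200):
    """Euler integration of a test particle attracted by the shepherd mass."""
    x, v, traj = x0, v0, []
    for _ in range(steps):
        a = -G * m_shepherd / x**2   # Newtonian attraction along one axis
        v += a * dt
        x += v * dt
        traj.append(x)
    return traj

G_true = 6.674e-11
measured = integrate(G_true)  # stands in for the laser-interferometer readings

# Grid search: the candidate trajectory closest to the "measured" one wins
candidates = [6.670e-11, 6.672e-11, 6.674e-11, 6.676e-11]
best = min(candidates,
           key=lambda G: sum((m - s)**2
                             for m, s in zip(measured, integrate(G))))
print(f"Recovered G = {best:.4e}")
```

A real analysis would fit G continuously (together with initial conditions and perturbations) rather than pick from a grid, but the principle - least-squares comparison of computed and observed trajectories - is the same.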
But that is not all. The same experiment may give new data on other parameters of the gravitational interaction. It may be possible to achieve a fantastic level of accuracy - 10⁻¹⁵! - in checking the equivalence principle, which says that all bodies, irrespective of mass and chemical composition, have the same free-fall acceleration. Today, let us recall, the best accuracy is about 10⁻¹². Even the slightest deviation would usher in a fundamental revision of our views on the nature of gravitation and the other physical interactions. Eventually we would get a new picture of the physical world.
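A common way to quantify such a test is the Eötvös parameter, η = 2|a₁ − a₂|/(a₁ + a₂), the fractional difference between the free-fall accelerations of two test bodies; the accuracies quoted in the text (10⁻¹², hoped-for 10⁻¹⁵) are bounds on η. The sketch below simply evaluates this definition on invented numbers:

```python
def eotvos_parameter(a1, a2):
    """Fractional difference of the free-fall accelerations of two test bodies."""
    return 2.0 * abs(a1 - a2) / (a1 + a2)

g = 9.80665                     # nominal free-fall acceleration, m/s^2
a_platinum = g
a_titanium = g * (1.0 + 1e-12)  # hypothetical violation at today's best accuracy

eta = eotvos_parameter(a_platinum, a_titanium)
print(f"eta = {eta:.2e}")
```

If the equivalence principle holds exactly, η is zero for every pair of materials; the experiment's task is to push the upper bound on η ever lower.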
Such accuracy is possible only in a space laboratory, where one essential condition can be met: within the capsule, all extraneous forces save the gravitational pull are excluded. Jet microengines will keep the capsule steady with respect to the "shepherd", which serves as the reference body, exactly on the axis. Besides, the position of the capsule with respect to the earth should be monitored with great precision, for the earth's gravitational pull, which affects the movement inside, changes substantially from point to point. Of much help here will be modern space navigation systems, which make it possible to pinpoint the position of a spacecraft at any moment to within a centimeter. In the selfsame experiment we can also look for time variations of G at the level of 10⁻¹³ to 10⁻¹⁴ per year.
This comprehensive space experiment is scheduled for the year 2005. It will be concerned not merely with yet another constant, fundamental though it is; it will also help us find clues to essential problems bearing on gravitation and its interdependence with other physical phenomena.
Prepared by Arkady MALTSEV