If you take a flashlight and shine it at a wall or through a prism or do pretty much anything with the light, you'll find that the effects of the light change in direct proportion to the amount of light the flashlight's putting out. Twice the light, twice the reflection from the wall. Twice the light, twice the brightness in each color component of the prism spectrum. Nothing really changes qualitatively as you change the light intensity; the only difference is quantitative.
This property of light is called linear optics. Until the last half-century or so it was simply called "optics." The name changed because there are physical situations in which the response of a material to light changes qualitatively with the intensity of the light. For instance, we now know that the refractive index of a material changes slightly when it is exposed to intense light, which can produce all kinds of strange phenomena such as self-focusing and filamentation. Other effects, such as frequency doubling and Raman scattering, crop up as well, and have found uses in practical technology. Many green laser pointers exploit a nonlinear effect: an infrared laser is focused into a nonlinear crystal that doubles the frequency of the light into the green range. Shine infrared light of everyday intensity on that very same crystal, and nothing of the sort happens.
So why is it that we don't see nonlinear optical effects when we stand in sunlight or turn on our headlights? Well, we might correctly surmise that (at least classically) light interacts with materials mostly by virtue of the electromagnetic field causing the electrons orbiting the atoms to wiggle around. The wiggling electron is an accelerating electric charge, which radiates; that radiation combines with the original electromagnetic wave and thereby slightly alters it. We might expect that the linear nature of this process would start to fail when the strength of the electric field of the incoming light becomes comparable in magnitude to the electric field that holds the atomic electrons to the nucleus in the first place.
We want to know how much electric field is being felt by the electron, holding it in its orbital around the nucleus. The electric field produced by a point charge - in this case the nucleus, with charge e - at a distance r is given by:

$$E = \frac{e}{4\pi\epsilon_0 r^2}$$
For a hydrogen atom, the classical separation between the nucleus and the electron is the Bohr radius, which is about 5.29 x 10^-11 meters. We can look up the electron charge and electric constant pretty easily, and plugging into the equation, the characteristic electric field inside an atom is about 5.14 x 10^11 volts per meter. This is a pretty stout electric field - air breaks down with a lightning flash at a mere three million or so volts per meter.
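The arithmetic above is easy to reproduce in a few lines of Python, using standard CODATA values for the constants:

```python
import math

# Physical constants (SI units, CODATA values)
e = 1.602176634e-19        # elementary charge, C
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
a0 = 5.29177e-11           # Bohr radius, m

# Coulomb field of the nucleus at the Bohr radius: E = e / (4 pi eps0 r^2)
E_atomic = e / (4 * math.pi * eps0 * a0**2)
print(f"{E_atomic:.3g} V/m")  # roughly 5.14e11 V/m
```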
So how intense does a light beam need to be before you get field strengths that high? We have an equation for that as well - the derivation is too long for this post but you can find it in any E&M textbook if you're interested. The intensity of light as a function of its peak field strength E is:

$$I = \frac{1}{2}c\epsilon_0 E^2$$
Where c is the speed of light. Plugging in our value for E, we get that the intensity of the light is about 3.5 x 10^20 watts per square meter. For reference, this is about a hundred thousand trillion times more intense than direct sunlight. No wonder we don't see nonlinear optics much.
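Carrying the calculation through numerically (the 5.14 x 10^11 V/m atomic field comes from the estimate above):

```python
c = 2.99792458e8           # speed of light, m/s
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
E_atomic = 5.14e11         # characteristic atomic field from above, V/m

# Peak intensity corresponding to peak field E: I = (1/2) c eps0 E^2
I_threshold = 0.5 * c * eps0 * E_atomic**2
print(f"{I_threshold:.2g} W/m^2")  # about 3.5e20 W/m^2

# Compare with direct sunlight, roughly 1.4e3 W/m^2 above the atmosphere
print(f"{I_threshold / 1.4e3:.1g} times sunlight")
```

The ratio comes out to a few times 10^17, i.e. a few hundred thousand trillion, in line with the figure quoted above.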
But on the other hand, if you take a relatively ordinary laboratory laser that emits 1 mJ pulses with a duration of 35 femtoseconds and focus it down to about a square micron, suddenly you have light of about 100 times the threshold intensity without breaking a sweat.
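The focused-laser numbers check out the same way (taking one square micron for the focal spot, as quoted above):

```python
pulse_energy = 1e-3        # J (1 mJ pulse)
pulse_duration = 35e-15    # s (35 fs)
spot_area = 1e-12          # m^2 (about one square micron)

peak_power = pulse_energy / pulse_duration   # ~2.9e10 W
peak_intensity = peak_power / spot_area      # W/m^2

threshold = 3.5e20         # characteristic intensity from above, W/m^2
print(f"{peak_intensity:.2g} W/m^2, about {peak_intensity / threshold:.0f}x threshold")
```

This gives roughly 80 times the threshold, consistent with the back-of-the-envelope "about 100 times" figure.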
Now this is just a back-of-the-envelope calculation. It's easily possible to see nonlinear optical effects at vastly lower powers than our hydrogen-field calculation. But this kind of argument does give an idea as to why they're uncommon with everyday incoherent light.
For two equal point charges separated by a distance r, the electric field between them is given by:
With the equation you've written, it would be more accurate to say that's the field due to one point charge +e, which would be the field felt by the other charge. But your quote makes it seem like you're looking at the total field due to both charges, in which case you'd have to use the dipole field.
Good point. I've reworded it a bit for clarity.
Dear Matt Springer, I have a silly question.
What is the source of an electric field? Is it only charges?
Electric charges can be the source of an electric field.
A changing magnetic field can also be the source of an electric field; look up Faraday's law of induction to read about it.