I want to thank all readers for their interest in the post and the discussions we had during the past months about 5G health issues. There were many supporting comments and requests for clarification, but also quite a few critical comments on both of my published posts (What’s wrong with the “studies” and 5G health issues explained).
We thought it would be worthwhile to summarize a few interesting discussion topics in a new post, since future readers will probably not dig through the whole discussion thread. To keep this amendment at a manageable length, I have taken the liberty of shortening, grouping, and summarizing the topics rather than repeating everything.
My first group deals with radiated power, frequencies, and exposure times. Let me quote a few of the statements:
These frequencies have never been used in consumer applications!
Comparing any radio transmitter to 60 W light bulb proves that the writer may lack knowledge about light and radio waves …
Most of the 50 W lightbulb will go to heat and not light.
What is the need for so many dipoles in one antenna?
Plus, on a video blog, someone used the figures I provided as a basis to dream up a real horror scenario (as he puts it). I will deal with this further below.
It seems to me that I have to repeat a few key facts: Concerning the interaction of electromagnetic radiation with matter, in particular with living tissues, there are only two relevant parameters: frequency and power.
All electromagnetic radiation is a stream of massless particles, the photons, which carry the energy. The total energy is the sum of the energies of the transmitted photons. A single photon must carry enough energy to break or alter a chemical bond before the free radicals and other unpleasant chemistry that biologists worry about can appear in our bodies. If the photon’s energy is not high enough, all that ever happens is that the whole molecule is agitated – an effect that we call heating in daily life.
These are the thermal effects that are controlled by limits on transmitted power. You feel them as heat on your skin in sunlight, and sunlight arrives with several hundred watts of power per square meter. Tower radiation is limited to 20 watts per antenna, plus a total limit per location, and I have shown you the milliwatt and microwatt levels that finally reach people on the street. On top of that, the photons of microwave radiation are roughly 10,000 times weaker than those of visible light. It takes UV light, X-rays, or worse to create the claimed medical effects.
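To put rough numbers on the photon argument, here is a small back-of-the-envelope sketch using E = h·f. The specific frequencies and the “a few eV per chemical bond” threshold are round illustrative values I chose for this sketch, not figures taken from any particular study:

```python
# Photon energy E = h * f for a few relevant frequencies.
# A rough check of the "microwave photons are far too weak to break chemical
# bonds" argument; the few-eV bond threshold is a ballpark figure for illustration.

PLANCK_H = 6.626e-34   # Planck constant, J*s
EV = 1.602e-19         # 1 electronvolt in joules

frequencies_hz = {
    "5G mid-band (3.5 GHz)":    3.5e9,
    "5G high-band (28 GHz)":    28e9,
    "visible light (~540 THz)": 540e12,
    "UV-C (~1.2 PHz)":          1.2e15,
}

for name, f in frequencies_hz.items():
    energy_ev = PLANCK_H * f / EV
    print(f"{name:26s} photon energy ≈ {energy_ev:.2e} eV")

# Typical chemical bonds need a few eV to break; visible light sits just below
# that and UV reaches it, while even a 28 GHz photon falls short by roughly
# four orders of magnitude.
```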
Coming to the light bulb example, which created a heated discussion: with a conventional light bulb, essentially all electric power fed to the filament is turned into electromagnetic radiation. Most of it is actually infrared (heat), and only a small fraction – around 5% – is visible light. Still, both heat and light consist of more energetic photons than microwaves. So, in fact, a light bulb is a much more “dangerous” object than a mobile phone (20 – 60 watts of IR and light vs. 200 milliwatts of microwave).
The topic of “lack of experience” with what is called “5G frequencies” is next. The 5G standards define a lot of frequency bands. Not all of the defined frequency bands will be used – at least not in the near term. Manufacturers and phone makers will not be able to design and produce radio equipment for every band. As reported by the industry association GSA, at present, out of 1,560 licensed bands worldwide (FDD mode), the most popular ones are:
| Band # | ~ Frequency | Total networks | Frequency previously used |
|---|---|---|---|
| 3 | 1,800 MHz | 420 | GSM (2G) |
| 7 | 2,600 MHz | 240 | LTE (4G) |
| 20 | 800 MHz | 225 | TV |
| 1 | 2,000 MHz | 140 | UMTS (3G) |
| 8 | 900 MHz | 105 | GSM |
These frequency ranges cover 85% of the total licensed spectrum worldwide. In terms of the size of radio networks (and thus the number of base stations and antennas), the most popular band is n78 at 3,500 MHz, previously allocated to a technology called WiMAX, which saw considerable deployment worldwide. T-Mobile US is even rolling out a 600 MHz 5G network. It must be quite disappointing to fear-mongers and conspiracy believers that these are all well-established, previously used frequencies. Billions of WiFi routers worldwide use the 5 GHz ISM band, and have done so for more than 10 years.
The much-discussed higher frequency bands beyond 20 GHz are not widely used today, except in some congested inner-city areas in the US. But they have already been used before, for LMDS (n257) and for satellite TV communication (bands n258 – n261). There is nothing really new here either. TV and other communication satellites transmit at a 200 – 300 watt power level. The signals are transmitted through large parabolic dishes and arrive at the Earth’s surface, from 35,000 km above, at a level of -100 dBm to -110 dBm. That is still within the range of normal cell phone tower radiation. We have long-term experience with this kind of radiation.
A general “common sense” argument may be in order here: all current radio technologies use very similar radio chipset designs; there is no magic design secret. And all commercial operators of radio technology want to use as little electrical energy as possible, because electricity is one of the biggest items on their monthly bill. Nobody is going to use extra power if it is not strictly required. If a mobile phone can detect and decode signals received at -80 dBm or less, then the transmission power is adjusted so that the phone receives just about that level and not much more.
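As a toy illustration of this “no more power than needed” principle, here is a minimal sketch of a transmitter stepping its output towards a target receive level. The -80 dBm target, the 120 dB path loss, and the 1 dB step are made-up example numbers; real networks use standardized power-control procedures, so take this only as a sketch of the idea:

```python
# Minimal sketch: the transmitter lowers (or raises) its output until the
# receiver just reaches a target level. All numbers are illustrative only.

TARGET_RX_DBM = -80.0     # level the phone can still decode comfortably
PATH_LOSS_DB = 120.0      # assumed total path loss for this example
STEP_DB = 1.0             # adjustment step per control cycle

tx_power_dbm = 43.0       # start at the 20 W (43 dBm) regulatory cap
for _ in range(100):
    rx_power_dbm = tx_power_dbm - PATH_LOSS_DB
    error_db = rx_power_dbm - TARGET_RX_DBM
    if abs(error_db) < STEP_DB:
        break
    # Step down if the phone gets more than it needs, up if it gets less.
    tx_power_dbm -= STEP_DB if error_db > 0 else -STEP_DB

print(f"Settled at {tx_power_dbm:.0f} dBm ≈ {10 ** (tx_power_dbm / 10) / 1000:.2f} W")
# With a 120 dB path loss the loop settles near 40 dBm (10 W), well below the cap.
```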
I have already discussed the “antenna gain” issue in an amendment to the first post, so I think it will be enough to repeat a few key points:
Finally, let us deal with the related question concerning the number of dipoles in one antenna.
An array of dipoles creates a directional effect. All dipoles transmit the antenna’s input signal in full sync. If you stand perpendicular to the antenna plane, the signals from all dipoles reach you in sync and the waves simply add up. When you look at the antenna from an angle, the distance between you (your phone) and each dipole varies a little, which shifts the waves from the individual dipoles slightly relative to each other (see figure below).
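For readers who like numbers, here is a minimal sketch of that adding-up effect, computing the so-called array factor of a column of dipoles. The eight elements and half-wavelength spacing are generic textbook assumptions, not the parameters of any particular 5G antenna:

```python
# Sketch of why an array of synchronized dipoles is directional: the array
# factor of N equally spaced, equally fed elements.
import numpy as np

N = 8                 # number of dipoles in the column (assumed)
d_over_lambda = 0.5   # element spacing in wavelengths (assumed)

angles = np.radians(np.arange(-90, 91, 15))   # angle off the antenna's broadside
# Phase of each element's wave as seen by a far-away observer at that angle.
element_idx = np.arange(N)
phase = 2 * np.pi * d_over_lambda * np.outer(np.sin(angles), element_idx)
# Sum the (complex) contributions of all dipoles and normalize to the peak.
array_factor = np.abs(np.exp(1j * phase).sum(axis=1)) / N

for a, af in zip(np.degrees(angles), array_factor):
    gain_db = 20 * np.log10(max(af, 1e-6))
    print(f"{a:+5.0f}°  relative level {gain_db:6.1f} dB")
# At 0° (perpendicular to the antenna plane) all waves add up; a few tens of
# degrees off, the small per-dipole shifts largely cancel each other out.
```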
I’ll take your word about the 4th power, though I was taught the “inverse square law,” i.e., second power.
Can you show us how to derive your formula for path loss? It is noted that the Hata model is not suitable for micro-cell planning where the antenna is below roof height, and its maximum carrier frequency is 1500 MHz. It is also not valid for 1800 MHz and 1900 MHz systems.
Why is the frequency not visible in the formula? How would the formula look for higher frequencies?
Are we talking about power or power per square meter (power density) here?
Let me start with a general disclaimer. The distance-dependent formula for the path loss that I used in my post comes from a specific model used at Ericsson [Ericsson9999] for a frequency of 1 GHz. There are many alternative models out there, and when people in the business talk about the Okumura-Hata model, they no longer refer to the original data and model. Things have developed a lot over the past 50 years, as you can imagine, and there are models available for each and every frequency that may be in use in the future.
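For those who asked how the frequency enters such a formula: below is a sketch of the textbook Okumura-Hata formula and its COST-231 extension (which covers 1800/1900 MHz). This is not the Ericsson-internal variant I quoted, and the 30 m mast and 1.5 m handset heights are just typical example values:

```python
# Textbook Okumura-Hata median path loss (urban, 150-1500 MHz) and the
# COST-231 extension (1500-2000 MHz), shown so readers can see where
# frequency, antenna heights, and distance enter the result.
import math

def hata_urban_db(f_mhz, d_km, h_base_m=30.0, h_mobile_m=1.5):
    """Okumura-Hata median path loss for an urban macro cell, in dB."""
    # Mobile-antenna correction term for a small/medium city.
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m - (1.56 * math.log10(f_mhz) - 0.8)
    return (69.55 + 26.16 * math.log10(f_mhz) - 13.82 * math.log10(h_base_m)
            - a_hm + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

def cost231_urban_db(f_mhz, d_km, h_base_m=30.0, h_mobile_m=1.5, c_db=0.0):
    """COST-231 Hata extension for 1500-2000 MHz (c_db = 3 for metropolitan centers)."""
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m - (1.56 * math.log10(f_mhz) - 0.8)
    return (46.3 + 33.9 * math.log10(f_mhz) - 13.82 * math.log10(h_base_m)
            - a_hm + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km) + c_db)

for d in (0.5, 1.0, 2.0):
    print(f"{d:.1f} km: 900 MHz -> {hata_urban_db(900, d):.1f} dB, "
          f"1800 MHz -> {cost231_urban_db(1800, d):.1f} dB")
# The distance exponent (44.9 - 6.55*log10(h_base)) / 10 is about 3.5 for a
# 30 m mast: steeper than free space (exponent 2), which is where the
# "roughly 4th power" remark comes from.
```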
I really don’t want to sit down here and explain the parameters (I think you believe me that I could). Let me get two fundamental facts out:
The main attenuation models are:
You see: the situation is quite complex. These models make certain assumptions about the ground, antenna heights, and so on, and they rely on empirical adjustment data. Rather than developing the theory here, I would like to give you an impression of some of these models and how they behave depending on frequency and distance. The charts are taken from the website CloudRF.com, where you can find a lot more if needed. Two observations may be in order:
People asked me to compare the mobile network values with satellite radiation. Communication and TV satellites transmit in the frequency range between 6 GHz and 20 GHz (there are higher frequencies, too); this already-reserved range is one reason why 5G needs to go up to 28 GHz for high-band transmission. Very roughly, the satellite data are:
That gives you about -120 dBm of received power for geostationary orbits, and around -80 dBm for MEO and LEO orbits, which is well within the ballpark of mobile network field power.
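As a rough sanity check of the geostationary figure, here is a simple free-space link budget. The 250 W transmitter (within the 200 – 300 W range mentioned above), the 30 dBi dish gain, and the 12 GHz downlink are illustrative assumptions, not measured data:

```python
# Rough free-space link budget for a geostationary TV satellite.
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

tx_power_w = 250.0          # within the 200-300 W range mentioned above
tx_gain_dbi = 30.0          # assumed gain of the satellite's transmit dish
distance_m = 35_786e3       # geostationary altitude
freq_hz = 12e9              # typical Ku-band TV downlink (assumed)

eirp_dbm = 10 * math.log10(tx_power_w * 1000) + tx_gain_dbi
rx_dbm = eirp_dbm - fspl_db(distance_m, freq_hz)   # into an isotropic (0 dBi) receiver
print(f"FSPL ≈ {fspl_db(distance_m, freq_hz):.1f} dB, received ≈ {rx_dbm:.1f} dBm")
# Lands around -120 dBm; a rooftop dish then adds ~35-40 dB of receive gain,
# which is why a set-top box can still decode the signal.
```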
Last question to be discussed in this section: power density is usually stated in watts per square meter (W/m²), and my figures confused a few readers. But factoring in the receiving antenna’s gain, or equivalently its effective aperture, turns power density into “real” received power values. And when it comes to estimating effects on the human body, the effective aperture of a human is far below 1 m². Summing this up, the radiation received by a human from mobile towers during the day is about 1% of the already low microwatt figures that the Hata model delivers.
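One way to make the aperture point concrete is the standard relation between power density and received power, A_e = G·λ²/(4π). I am not claiming this is exactly how the “about 1%” figure was derived; take it as an illustrative sketch with assumed numbers:

```python
# Converting power density (W/m^2) into received power via an effective
# aperture. For an isotropic receiver (G = 1) at 1 GHz the aperture is only
# ~0.007 m^2, i.e. well below 1 m^2. Example values only.
import math

freq_hz = 1e9                      # 1 GHz, matching the path-loss model above
wavelength_m = 3.0e8 / freq_hz
aperture_m2 = wavelength_m ** 2 / (4 * math.pi)   # isotropic effective aperture

power_density_uw_per_m2 = 10.0     # assumed ambient power density, µW/m^2
received_uw = power_density_uw_per_m2 * aperture_m2

print(f"A_e ≈ {aperture_m2 * 1e4:.1f} cm^2 "
      f"-> {received_uw:.3f} µW received from {power_density_uw_per_m2} µW/m^2")
# A 0.3 m wavelength gives A_e ≈ 0.0072 m^2, so the received power is well
# under 1% of the power crossing a full square meter.
```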
Several topics have been brought up in the comments that I did not address in the original blog. I cannot discuss them all here, so I picked a few and summed them up. The topics:
5G is causing COVID-19, or is at least strongly related to the spread of the disease.
The real long-term effect is non-stop chronic exposure of about 1 V/m everywhere from mobile towers.
A long list of respected scientists issues warnings and calls for a moratorium and further in-depth studies of the medical risks associated with 5G.
You are not an expert, and you are referring only to a small group of controversial ICNIRP/WHO experts.
The COVID-19 topic was obviously brought up after I wrote the original post. The allegations are widespread:
I am not an expert in this area, and I never claimed to be one. But as a professional scientist, I know how to tell bad science from good science. My understanding is that leading virologists worldwide have a unanimous position on this: the SARS-CoV-2 virus is a member of the well-known virus family Coronaviridae (order Nidovirales). It has its natural reservoir in bats. It has one of the largest genomes of any known RNA virus, which makes speculation about a lab creation completely unreasonable. Closely related viruses like SARS-CoV-1 and MERS-CoV have made the transition to humans before – without the presence of 5G and, in fact, without any widespread mobile technology in the areas where this happened.
One of the comments brought up the famous memorandum and asked for my opinion. I really don’t want to go through everything mentioned there again; much of it has been discussed in part 2. I know most of the signatories; they have been largely the same names for more than 20 years. And the document is not what it pretends to be: this is not a memorandum of “science”; the list is a colorful mix of activists, journalists, family doctors, etc. Nobody of “weight” in the scientific community.
The book by Arthur Firstenberg, “Microwaving the Planet,” was mentioned and recommended in a comment. If you ask me, this book is conspiracy theory at its finest. And if you look at Firstenberg’s vita, you will find that he dropped out of medical school due to an illness which he then attributed to electromagnetic hypersensitivity brought on by numerous X-rays. This story sounds very familiar and is a typical source of EMF claims: some people tend to project their problems onto external causes. You’ve got a headache; it must be the cell tower nearby.
For the claimed long-term effects of mobile network radiation, please check the following chart. It shows, by example, that the claimed medical consequences of mobile phone radiation simply do not show up in the data. The chart compares brain cancer incidence with the growth of mobile phone use in the US from 1977 to 2006; brain cancer is the disease most frequently claimed to be connected with mobile communication. Comparable charts exist for basically every harm that mobile technology has been accused of causing in humans. There is not the slightest hint of correlation, let alone causation.
Concerning the expertise issue brought up in the comments: when it comes to medical effects, I rely on published, peer-reviewed medical research. And I have to disagree with the claim that I only refer to a “small controversial group of WHO experts.” The WHO is certainly not a small group of conspiring scientists. The European Union’s Scientific Committee on Emerging and Newly Identified Health Risks (SCENIHR) has reached the same conclusion; please see their unambiguous final verdict.
The overwhelming majority (and I am talking 99% here) of medical experts do not see the slightest problem; there is just this small, noisy group. I am still waiting to see a single study out of that circle that has made it through the expert review cycle of a leading medical journal. Show me one, and we can have a serious discussion. To those who say this is part of a suppression mechanism applied by some corrupted “science elite”: the vast majority of such papers are rejected for simple quality reasons. See the snapshot from Elsevier’s journal Toxicology, with the rejection of a study claiming DNA damage from microwave exposure.