
Is this anything to worry about? 5G health issues explained

Author: Kurt Behnke

26.03.2019


Introduction

Announcements of a new mobile network generation (5G) have triggered a series of alarming claims about associated health threats. This is nothing new: the phenomenon has been with us since the 1990s. Similar claims were made around the launch of UMTS (3G) in 2000, and again at the start of LTE in 2010. This time the wave is considerably stronger, mainly because of social networks, which spread alleged "bad news" and alarming stories literally at light speed. Public opinion turns first and foremost against the cell towers, as they are the visible landmarks of the technology. Each time a tower is built, or planned to be built, in a city or near a rural settlement, a new discussion about the health effects of mobile phone or network radiation flares up. It is about time to put things back into perspective. In this post I would like to deal with the reality of mobile network radiation.

My first statement here is: the major exposure of humans to mobile radio technology comes from handheld phones, not from base stations!

The reason is very simple: the power of electromagnetic radiation drops extremely fast with distance from the transmitter. See the post https://www.grandmetric.com/blog/2018/02/20/explained-pathloss/ by Mateusz Buczkowski in this blog for a general introduction to the concept of path loss.

For a frequency of 1 GHz (a typical range for mobile phone networks), the path loss measured in decibels is

PL(r) [dB] = 7.3 + 37.6 · log10(r)

where r is the distance from the source to the measurement point in meters. This formula is due to the Japanese scientists Okumura and Hata, who ran endless series of measurements and compiled them into empirical formulas. The Okumura-Hata formulas are internationally accepted and are part of the mobile phone standards and acceptance rules. There are variants for different environments (urban, rural) and frequencies, but they all show the same pattern. In very simple terms the formula says: the radiated power drops with almost the 4th power of distance (the coefficient 37.6 corresponds to an exponent of 3.76).

 

Tower/Base Station Perspective

Let us try it out. Antenna transmission power is anywhere between 250 mW (24 dBm) for a small cell and 120 W (about 50 dBm) for the largest 5G MIMO arrays. A typical 2G, 3G, or 4G antenna has a transmission power of 20 W (43 dBm).

Let us quickly apply this to a user standing relatively close to the transmitter:

A small cell is comparable to a WLAN access point, and you can come pretty close to one. We assume a distance of 10 m and get a path loss of 7.3 + 37.6 · log10(10) = 44.9 dB. Subtracting the path loss from the transmission power gives 24 dBm − 45 dB = −21 dBm, which corresponds to approximately 8 µW (a µW is one millionth of a watt).

A 5G macro cell antenna will be placed on a tower or on the roof of a tall building, so some 30 m above ground; we assume a position at 100 m distance from the antenna. The path loss calculates to 7.3 + 37.6 · log10(100) = 82.5 dB. The received power is 50 dBm − 82.5 dB ≈ −32 dBm, which is less than one µW.
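If you want to check these numbers yourself, here is a minimal Python sketch of the two calculations above. It uses the 1 GHz formula quoted earlier; the function names are mine, chosen for illustration:

```python
import math

def path_loss_db(r_m: float) -> float:
    """Empirical ~1 GHz path loss from the post: 7.3 + 37.6*log10(r), r in meters."""
    return 7.3 + 37.6 * math.log10(r_m)

def received_dbm(tx_dbm: float, r_m: float) -> float:
    """Received power in dBm: transmit power minus path loss (antenna gains ignored)."""
    return tx_dbm - path_loss_db(r_m)

def dbm_to_microwatt(dbm: float) -> float:
    """Convert dBm to microwatts (0 dBm = 1 mW = 1000 microwatts)."""
    return 10 ** (dbm / 10) * 1000

# Small cell: 24 dBm at 10 m -> about -21 dBm, i.e. roughly 8 microwatts
print(received_dbm(24, 10), dbm_to_microwatt(received_dbm(24, 10)))

# 5G macro cell: 50 dBm at 100 m -> about -32 dBm, i.e. less than 1 microwatt
print(received_dbm(50, 100), dbm_to_microwatt(received_dbm(50, 100)))
```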

A light bulb has an energy consumption of about 60 W, and the emitted light and heat are in that range. At home your distance to a light bulb will be 2-3 meters. The impact of the light bulb on your body is therefore more than a million times higher. It is the general consensus in medical and biological research that the only effect of microwave radiation, such as that used in mobile networks, is heating of the exposed object.

 

Amendment

A post on another blog made reference to the text above. Its author used some of my figures to construct a true "horror" case of mobile radiation, in which radiation in the kilowatt range would hit humans. I want to show with this amendment why his construction is a misconception. The blogger essentially builds on his misreading of the term "antenna gain": he claims that I omitted antenna gain from my high-level calculations, and that antenna gain would turn my innocent-looking figures into a real power monster.

What is antenna gain? The term sounds like a hidden amplifier, which is a complete misconception; "directional gain" would be a much better fit and should be used in the technical literature. Antennas are passive, with no electrical power connected. They simply take the radio frequency signal from the transmission circuitry and convert it into electromagnetic waves. Since it does not make much sense to radiate in all spherical directions (upwards, downwards), antennas are constructed to focus the radiation into a solid angle, typically 120 degrees wide and 15 to 20 degrees high. For all who have seen the video, I would like to add the basic construction of such antennas (see figure below):

Directional antenna gain

You see the enclosures mounted on the pole on the left-hand side, and next to them a schematic drawing showing the internals. The enclosure is what people see when they look at a cell tower. Inside there are 12 vertical beams in a 2×6 arrangement: the dipoles. These dipoles are the radiating elements. Each dipole gets only a fraction of the total transmission energy supplied to the antenna. This is where the video goes wrong: its author assumes that each dipole gets the full 20 watts and, with "thousands of dipoles", arrives at his key message. The total energy supplied to the antenna, and radiated by it, does not change through this arrangement, though. Wave interference makes the wavefronts generated by the dipoles add up or cancel out depending on the direction, so the energy is focused into the angle shown in the drawing. The 5G "Massive MIMO" technology just uses more dipoles (such as 64 in an 8×8 or 128 in a 16×8 configuration instead of just 2×6) and feeds them with dynamically delayed signals, so that the "beam" can move and sweep an area. And never ever are there "thousands of dipoles" in an antenna construction. No digital signal processor available today could do the MIMO mathematics (complex matrix multiplication) for that many elements simultaneously and in real time.
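To make the "focusing, not amplifying" point concrete, here is a small numeric sketch of my own (not part of the original article): a 12-element array fed with a fixed total power. The peak of the beam is about N times an isotropic radiator, but the power averaged over all directions stays at exactly what was fed in:

```python
import numpy as np

N = 12        # dipoles, as in the 2x6 panel described above
d = 0.5       # element spacing in wavelengths (half-wave, a typical choice)
theta = np.linspace(0.0, np.pi, 20_001)   # angle from the array axis

# Feed each dipole 1/N of the total power (amplitude 1/sqrt(N)), so the
# total power delivered to the antenna is 1 regardless of how many dipoles.
n = np.arange(N)[:, None]
field = np.exp(2j * np.pi * d * n * np.cos(theta)).sum(axis=0) / np.sqrt(N)
pattern = np.abs(field) ** 2   # radiated power per direction vs. isotropic

print(pattern.max())           # ~12: the main beam is ~N times stronger (~10.8 dB)

# Energy check: the sin-weighted average over the sphere comes back as ~1.0.
# The array redistributes power over directions; it does not create any.
dtheta = np.pi / (theta.size - 1)
print((pattern * np.sin(theta)).sum() * dtheta / 2.0)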

Antenna gain is the result of this radiation focus: the antenna in the picture has an antenna gain of 15 dB, which just tells you that in the main transmission direction there is about 30 times more power than to the side. The total radiation emitted by the antenna remains unchanged.

By the way: the total transmission power is limited by legal and regulatory requirements. And the regulatory administrations in all countries that I know of add up the total radiation level from a tower; they do not just consider a single antenna.

 

Mobile Phone Perspective

Let us have a look at phone radiation, then. The phone next to your head transmits at a maximum of about 200 mW (23 dBm). That is at least 10,000 times more than the signal received from the tower. Typical transmission power values of phones are a lot lower, though. The base station at the tower controls the power of the phone: it sets the phone's transmission power to a level at which all phone signals are received at approximately the same strength. If you are near a tower, your phone transmits at its minimum level (again below one milliwatt). Only if reception from the tower is very bad is your phone commanded to increase its transmission power. It may sound paradoxical, but more mobile base stations mean lower overall radiation levels.

Phone transmission power has gone down since the first generations of mobile communication. In GSM, phones were allowed to transmit up to 1.0 W (sometimes even 2 W). You may remember that 20 years ago the typical heavy user was holding the phone against the head, making voice calls all the time. With today's smartphones the typical user hardly makes phone calls any more, and instead holds the phone about 1 m away from the face for screen interaction.

The impact of phone radiation has gone down dramatically since the early 2000s. If there were any health effect from mobile phone radiation, we should have started to see it by now: over the past 20 years, millions of users were exposed to higher radiation levels than today. For example, there is simply no sign of the increase in cancer rates that some people have been predicting since the year 2000. None of the studies habitually cited in alarmist news has ever passed scientific quality review; they have been rejected for selection bias, too-small sample sizes, and many other reasons. The WHO and national health administrations give very critical reviews of these studies. If you are interested in more details about those aspects, see the following post.

Author

Kurt Behnke

Kurt Behnke received his PhD in Mathematics from the University of Hamburg (Germany) in 1981, and his second degree (Habilitation) in 1986. He published more than 20 original research papers, joined the Max Planck Institute of Mathematics in Bonn as a guest researcher, was a Visiting Fellow at Warwick University, and was awarded a Heisenberg Research Grant from the German Research Council in 1987. In 1991 he made a career shift to telecommunications, where he worked for Philips Communications on OSI system management applications for fiber transmission systems (SDH). In 1993 he joined the California-based startup Raynet and helped create the first optical fiber access system. From 1997 on he worked for Ericsson in various national and international assignments, including a position as Director of Customer Operations for T-Mobile International and one as Head of Managed Services North Africa and Middle East. He retired in 2017 and keeps himself busy lecturing part time on data communications and mobile networks at a local university, along with occasional consultancy assignments in the areas of LTE, 5G and machine-type communications.

45 Comments
Auro
23 October 2019 at 20:03

Thank you so much for this very informative article!

 
Rommel
20 January 2020 at 12:55

Since you proclaim that you are an expert, what kind of worst case scenario test did you do? If there ever was a test, how long was the duration, how many cell transmitters did you use on the same spot, and how many cellphones side by side did you test? Kindly publish your experiment procedure, duration and data. Since you are a PhD you know what worst case scenario means…. tsk tsk tsk tsk

 
Dr Kurt Behnke
30 March 2020 at 11:57

This type of comment doesn't make it easy to give a serious response. I know that conspiracy theorists use "ad hominem" tactics to attack unwelcome facts. You start with an implicit personal attack and you end with a similar one. No, I am not "proclaiming" to be an expert, and I have never claimed to be one. I just spent more than 25 years in this industry, and I have a background that allows me to work confidently with figures and numbers. And then, what does your question have to do with being an expert or not?

But let me try to respond to your question anyway: simulations at 3GPP have included 1 million devices per square kilometer (that is one per square meter). In normal city environments you can "fit" up to 10 base stations into a square kilometer. Football stadiums are equipped with up to 70 base stations. On a football afternoon there are up to 80,000 visitors in the arena, and you can safely assume that all of them carry a mobile phone. Those are "worst case" scenarios. When it comes to one "spot", all countries that I know of have tough regulatory constraints on the radiation power installed in the same location. Internationally there is an agreed upper limit of 60 V/m field strength. Providers have to apply for permission, with detailed information on the equipment used, etc.

With all this said, the conclusion is that the radiation impact is 99% from devices, not from towers. That can be verified exactly by measurements, and you can take those measurements everywhere. And the impact from devices is essentially the impact of your personal device, carried by you: even the accumulated power of 1,000 devices in a crowd of people near you is negligible compared to what you hold in your hand. Distance matters a lot when it comes to radiation.

 
Omonigho Anthony E.
21 April 2020 at 08:22

Very clear explanation, Dr., much appreciated

 
Michael Klaus
14 May 2020 at 00:31

The example of the football stadium is a worst case scenario only in the range of officially intended use. Should a hacker/government succeed in directing the maximum possible number of beams at one point, how much power would this exert on the volume (e.g. compared to the active denial system)?
There’s a fear that 5G might be used as a weapon, and having this ‘catastrophic case’ number is utterly important to demonstrate how (un)realistic it is.

 
Dr Kurt Behnke
26 May 2020 at 08:08

The scenario you point out is “bad sci-fi” at best. I’ll give you a technical rebuttal, but I would also like to point out where the “weapon 5G” story comes from.

5G mm-waves (28 GHz and above) propagate more or less in straight lines: no massive reflection, no bending around corners, only line of sight. This means you "illuminate" only the square or street in which the base station is located. High-frequency waves encounter a massive path loss, as detailed in another comment above: within 100 m from the source the signal attenuates by more than 120 dB (a factor of 10^-12). With this in mind, let us play your "worst case and weapon game", to stop this story once and for all.

I don't want to give your "evil hacker" unlimited "god-like" power; he will have to stay within the physical limits of the system. But I will ignore the technical obstacles he would encounter, such as:
– The technology does not allow direct manipulation of the beamforming (but we assume he has got that). In fact, the beams sweep the whole area all the time; otherwise there would be no chance for new customers to sign on.
– The O&M system includes a couple of AI and rule-based components, plus intrusion detection and prevention (but we may assume he is an employee with admin rights in the Network Operations Center (NOC)).
I assume an open square of 100×200 meters, with 66 base stations equally distributed around it, which means a pole roughly every 9 m along the 600 m perimeter. My next assumption: each of these base stations has 3 radio heads (transmission amplifiers) that each generate 60 watts of output for massive MIMO antennas with 40 dB of antenna gain (equivalent to a 6 m satellite dish, and of similar size: 4,000 transmitting dipoles). Note that the legal limits are 20 W at best, with an accepted antenna gain of 16 dBi. And we assume that our hacker directs all of the transmitters to send their signals towards a single point of the square, simultaneously. We pick 100 points at random on the square and calculate the path loss and the received power from the Friis free-space formula, which includes distance and frequency. The point with the highest received radiation level and the average are included in the table below. You will notice that at no point do we get more than 0.2 milliwatts of received radiation, which is 1/1,000 of the phone radiation. For anybody wanting to hurt people, millimeter waves suck.
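For readers who want to follow along, the Friis free-space path loss mentioned here is the standard textbook formula; a minimal sketch is below. (The received-power table itself is not reproduced here, and any end-to-end figure additionally needs assumptions about transmit EIRP and the receiving aperture.)

```python
import math

def friis_fspl_db(d_m: float, f_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f / c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * d_m * f_hz / c)

# A 28 GHz mmWave signal already loses ~101 dB over 100 m of free space
print(friis_fspl_db(100, 28e9))
```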

This is also the point where the alleged military frequency story breaks down. Decades ago, US Army research actually did some experiments with 60 GHz microwave weapons, once the technology allowed generating that frequency (which is non-trivial). The 60 GHz band was not chosen because it was anything like "lethal", but because you could hope to aim and direct a beam at a target, something that is hopeless at 5 GHz frequencies. To my knowledge they tried a transmission of 100,000 watts, in order to create a non-destructive weapon. It seems to have resulted in the conclusion that it would be more efficient and more destructive to throw the massive antenna at the enemy directly. 10-kilowatt transmitters have also been tried as a non-lethal police weapon; this failed too.

 
Stavros
18 February 2020 at 15:34

It's nonsense and manipulation. Why don't you compare a 60 W X-ray transmitter to the light bulb? Even amateurs know that different wavelengths cannot be compared, because the impact on the human body is always different.

 
Dr Kurt Behnke
30 March 2020 at 11:59

I did the comparison of mobile radiation with a light bulb as an illustration. Both are electromagnetic waves. The frequency of mobile radiation (also known as microwave) is much lower (by a factor of about 100,000) than the frequency of light, which means that the photons of light are more energetic (by that factor) than the photons of microwave radiation. Thus the damage that light from a light bulb can cause to living tissue is much bigger than the possible damage caused by microwaves of the same energy.
And I am not comparing X-rays, because X-ray frequencies are another 1,000 times higher than those of light. X-ray photons are a lot more aggressive than light photons, and 100,000,000 times more aggressive than mobile phone microwave radiation.
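The photon-energy comparison is easy to reproduce with E = h·f; a one-off sketch (the exact ratio depends on which mobile and light frequencies you pick):

```python
h = 6.626e-34              # Planck constant, J*s

e_mobile = h * 3.5e9       # photon energy of a mid-band (3.5 GHz) mobile signal
e_light = h * 5.4e14       # photon energy of green visible light (~540 THz)

print(e_mobile, e_light)
print(e_light / e_mobile)  # ~1.5e5: light photons carry ~10^5 times more energy
```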

 
hexalm
19 April 2020 at 00:38

Just a minor quibble, but a 60W bulb rating isn’t comparable to an antenna power output rating. Most of the 60W of the bulb will be radiated as heat rather than light/EM as incandescent bulbs are only 5% efficient.

 
Dr Kurt Behnke
22 May 2020 at 13:12

Yes, but heat radiation is again electromagnetic radiation. And eventually 100% of the 60 watts goes into radiation, be it visible light, a minor portion of UV, or infrared.

 
Fjeldfross Ph.D.
20 April 2020 at 03:05

While I basically agree with your analogy to the light bulb from an energy point of view, there is an effect of wavelength/frequency. Microwaves work by exciting water molecules, causing them to vibrate and generate heat by friction. This doesn't happen with light because its higher frequency doesn't have the same effect. So the effect of the radiation on the body can differ between frequencies. Neither microwaves nor visible light (is 'visible light' a tautology?) have enough energy to cause cellular damage, because they are not in the ionising part of the spectrum. Microwaves penetrate tissues to only a very small depth; microwave cooking depends mostly on conduction of heat from the surface layers towards the centre of the mass.

 
Bub Dub
11 April 2020 at 21:24

Frequencies at visible light and below are non-ionizing, and the worst they can do is heat you up. If you don't feel heat coming from the tower, you're not being affected. That's why a lightbulb can only heat your skin, but it can't give you skin cancer, whereas being in the sun for an hour can: ultraviolet radiation is ionizing at its higher frequencies.

 
Dr Kurt Behnke
4 May 2020 at 09:28

I really do not know what you are trying to tell me here, except throwing terms like "nonsense", "manipulation", "lack of knowledge" and "ignorance" (just a quick selection from your 8 lines of text) at me. I think you are disqualifying yourself with such a comment. Let me recall that I have made three main points:
– If there were any impact on human bodies from the electromagnetic radiation associated with mobile communication, it would come from the phone in the user's hand. Not from a tower, however close it might be, and not from other devices nearby.
– The transmission power of mobile phones is strictly limited by law and regulation. It is more than an order of magnitude lower than the power from a nearby 60 W light bulb. And the energy of the photons of microwave radiation is 10,000 times lower than the energy of photons of visible light; without high photon energy, there is no impact on body chemistry and health.
– All 'studies' that claim to prove negative impacts on human health and wellbeing show major flaws in methodology, data collection and analysis. The major international studies, such as those conducted by the WHO, show no impact.
If you want to criticize any of these points, please do so. But I would expect solid arguments based on science, not on conspiracies. And no insults, please.

 
Stavros
18 February 2020 at 15:29

Comparing any radio transmitter to a 60 W light bulb proves that the writer may lack knowledge about the light and radio wave spectrum. It's total nonsense. It's the typical "scientific position" that ignores everything other than what it is currently analyzing.

 
Dr Kurt Behnke
30 March 2020 at 12:00

I did the comparison of mobile radiation with a light bulb as an illustration. Both are electromagnetic waves. The frequency of mobile radiation (also known as microwave) is much lower (by a factor of about 100,000) than the frequency of light, which means that the photons of light are more energetic (by that factor) than the photons of microwave radiation. Thus the damage that light from a light bulb can cause to living tissue is much bigger than the possible damage caused by microwaves of the same energy.
And I am not comparing X-rays, because X-ray frequencies are another 1,000 times higher than those of light. X-ray photons are a lot more aggressive than light photons, and 100,000,000 times more aggressive than mobile phone microwave radiation.

 
Richard Cliffe
20 April 2020 at 23:06

Thank you Dr Behnke for your detailed explanation and patience. It is helpful to have the detailed explanation and actual numbers. I spent a lifetime in military telecomms and heavy radar and now find myself having to spend time in my retirement debunking the faux science used to propagate 5G conspiracy theories.

 
Fred Nemano PhD
4 April 2020 at 03:01

You clearly do not understand the nature of energy (transmitted and incident) and the electromagnetic spectrum – hence your confusion with light bulb energy vs. radio waves… The physics of EMR is well established (for many decades, in fact). And it is absolutely science! Your comments dismissive of the scientific position betray what I can only describe, with all due respect, as ignorance. Unless of course you are proposing a different 'physics' to that of Maxwell, Einstein etc.

By way of a brief illustration, let's assume you are placed in front of THREE 60 W directional narrow-beam alpha, beta and gamma energy generators/transmitters – at a distance of, say, 2 m.

You will find that alpha particles cannot penetrate intact skin, while beta particles can partially penetrate skin, causing "beta burns". Gamma and x-rays can pass through a person, damaging cells in their path.

 
Adam
11 April 2020 at 07:01

Please note that alpha and beta radiation are not forms of electromagnetic radiation, and thus are not comparable to the situation described in the blog post regarding exposure to a light bulb vs. that of a cell tower, which are both sources of EM radiation.

 
Joshua
18 April 2020 at 20:59

What on earth are you talking about? Alpha and beta radiation are NOT electromagnetic radiation… they are highly energetic particles of matter. Visible light, 5G and infrared are all part of the electromagnetic spectrum (photons) and therefore directly comparable.

 
Richard Cliffe
20 April 2020 at 22:59

Fred, alpha radiation is not electromagnetic radiation (it is a helium nucleus – a particle) and neither is beta radiation (electrons – particles). You may have dearly held beliefs on the subject, but your comments add no value because they are meaningless. A 16-year-old school science student would recognize your comments as such.

 
Steve Quon, PhD, Physics
27 April 2020 at 00:01

Fred, to clarify: you are comparing alpha (doubly ionized He) and beta particles (free electrons) with EM radiation, that is, particle emission versus EM wave radiation. The "burns" are due to kinetic-energy collisions. Gamma and x-ray EM radiation can cause ionization of electrons. As far as 5G frequencies go, we are talking about RF absorption near the water absorption frequency of 2.45 GHz, which can cause heating.

 
Dr Kurt Behnke
4 May 2020 at 09:29

Again, I really do not know what you are trying to tell me here, except throwing terms like "nonsense", "manipulation", "lack of knowledge" and "ignorance" (just a quick selection from your 8 lines of text) at me. I think you are disqualifying yourself with such a comment. Let me recall that I have made three main points:
– If there were any impact on human bodies from the electromagnetic radiation associated with mobile communication, it would come from the phone in the user's hand. Not from a tower, however close it might be, and not from other devices nearby.
– The transmission power of mobile phones is strictly limited by law and regulation. It is more than an order of magnitude lower than the power from a nearby 60 W light bulb. And the energy of the photons of microwave radiation is 10,000 times lower than the energy of photons of visible light; without high photon energy, there is no impact on body chemistry and health.
– All 'studies' that claim to prove negative impacts on human health and wellbeing show major flaws in methodology, data collection and analysis. The major international studies, such as those conducted by the WHO, show no impact.
If you want to criticize any of these points, please do so. But I would expect solid arguments based on science, not on conspiracies. And no insults, please.

 
Martin Ericson
24 February 2020 at 18:09

Your conversion of dBm to W is wrong in all examples.

 
Dr Kurt Behnke
4 May 2020 at 09:09

How so? That is a very broad statement of yours! I offer a challenge: I convert a watt value into dBm, and you show me what you think is wrong with it. 20 watts = 20,000 milliwatts, and 10 · log10(20,000) = 43, up to a few digits after the decimal point. Which tells us: 20 watts (by the way, the power limit for mobile transceivers on cell towers) corresponds to 43 decibels relative to a reference value of 1 milliwatt, which we shorten to 43 decibel-milliwatts, or 43 dBm.
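The conversion is a one-liner either way; a quick Python sketch for anyone who wants to verify it:

```python
import math

def watt_to_dbm(p_watt: float) -> float:
    """dBm = 10 * log10(P / 1 mW)."""
    return 10 * math.log10(p_watt * 1000)

def dbm_to_watt(p_dbm: float) -> float:
    """Inverse conversion: P[W] = 10**(dBm/10) / 1000."""
    return 10 ** (p_dbm / 10) / 1000

print(watt_to_dbm(20))        # ~43.0 dBm, the 20 W tower transceiver example
print(dbm_to_watt(24) * 1e3)  # 24 dBm small cell -> ~251 mW
```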

 
Grant
30 March 2020 at 17:04

Thank you for your time and effort in explaining these concepts. It is refreshing to hear an expert in the field of EM explain the magnitude differences that the fear-mongering, google-expert, tin-hat brigades conveniently disregard. I like the way you bring the visual spectrum into the explanation – something that everyone can relate to. I am an electronic engineer myself who specializes in wireless communication and am thus constantly bombarded with these conspiracy theories and asked for explanations. I now conveniently refer all queries to your site. Vielen Dank!

 
Jonathon
5 April 2020 at 02:44

Hi, thanks for the explanation. Got a question, if you don't mind answering. I was wondering about the wattage of the satellites being used, as you mentioned power drops off over distance. I've been struggling to find much info on how it actually transmits from the satellites, but if the power is mega high, isn't there a danger of it interacting with planes or even birds in some way? I'm guessing if something was to be hit with a beam in the 500-watt range they'd know about it.

 
Dr Kurt Behnke
4 May 2020 at 09:12

Thanks for your question. Just to clarify at the beginning: there are no satellites in 5G or any other mobile communication system. 5G is all terrestrial; the professional abbreviation for mobile networks is PLMN = Public Land Mobile Network. There is a lot of satellite communication, though, and it works through basically the same kind of electromagnetic wave transmission that we use in land-based mobile and fixed communications.

There are different types of satellite orbits to begin with: Low Earth Orbit (LEO) up to 2,000 km of altitude, High Earth Orbit (HEO) with geostationary positions over the equator at approximately 36,000 km altitude, and Medium Earth Orbit (MEO), which designates everything in between. The ISS is in LEO; the GPS, GALILEO and GLONASS positioning satellites are in MEO. The TV satellites and all data transmission satellites are in HEO positions; otherwise you would need electronic tracking on your satellite TV dish to stay tuned.

Transmission power of satellites is surprisingly low; it is quite common that each transponder transmits 200-300 watts (corresponding to 53 to 55 dBm). They use pretty large parabolic antenna dishes to focus that power into a smaller angle. There is a massive path loss over the long distance to the surface (and given the 200 to 1,000 km altitude of a Low Earth Orbit, I consider even an airplane at 10 km altitude as being on the surface).

The GPS signal arrives down here at a reasonable level to be received by the small antennas of mobile phones and fitness gadgets. It is a very low bandwidth signal, so there is only little thermal noise involved. The TV-sat signal arrives here at a power level that is on the edge of detectability: thermal noise on the link is already higher than the signal level. It needs a rather big dish (gain = 25 dB) and the so-called LNB (gain = 15 dB) to bring the signal to a level where it can be detected and decoded by TV-sat receivers.

To compare with mobile phone signals: depending on distance to site, the signal received by your phone from the cell tower will be between -50 dBm and -115 dBm. Which is another confirmation of my point in the blog post.
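A rough link-budget sketch with the figures from this reply. Note that the satellite's transmit antenna gain and the Ku-band downlink frequency are my own illustrative assumptions, not figures from the reply:

```python
import math

def fspl_db(d_m: float, f_hz: float) -> float:
    """Free-space path loss: 20*log10(4*pi*d*f / c)."""
    return 20 * math.log10(4 * math.pi * d_m * f_hz / 3.0e8)

tx_dbm = 55            # ~300 W transponder (from the reply above)
tx_gain_db = 30        # ASSUMED gain of the satellite's transmit dish
d_m = 36_000e3         # geostationary altitude, meters
f_hz = 12e9            # ASSUMED Ku-band TV downlink frequency

at_dish = tx_dbm + tx_gain_db - fspl_db(d_m, f_hz)
print(at_dish)            # ~ -120 dBm at an isotropic antenna on the ground

# The receiving dish (+25 dB) and the LNB (+15 dB) lift this to roughly -80 dBm
print(at_dish + 25 + 15)
```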

 
Arjen
9 April 2020 at 15:27

Quick question around the units.

When you run the calculations, you talk about watts rather than watts per square meter. What am I to infer? That it is the total power output through a cross-section of the beam? Or just through a 1-meter square, and that the Okumura-Hata formula already includes the inverse square law?

 
Dr Kurt Behnke
22 May 2020 at 13:00

The Okumura-Hata formulas do indeed include the aperture of an (idealized) isotropic antenna – hence the unit is watts and not watts/m². A normal dipole, as used in most mobile phones, has an antenna gain of 2.15 dBi (which I included in the calculations I did, but did not explicitly write down).

Antenna arrays and MIMO are a very difficult subject for phones: they define a directional preference, and would thus require the user to aim at the transmitter, which most users would consider an inconvenience, to put it mildly. Thus MIMO in phones is more or less "transmit diversity", which does not add antenna gain.

 
Wes Groleau
14 April 2020 at 01:58

I'll take your word about the 4th power, though I was taught the "inverse square law", i.e., second power. Either way leads to the same conclusion: the phone in my hand does more than the tower next door. However, you cite a formula for 1 GHz. Doesn't 5G work on 60 GHz?

 
Dr Kurt Behnke
22 May 2020 at 13:04

OK, this requires clarifying a few things.
1. "5G uses 60 GHz" is a massive but very common misunderstanding. 5G uses a lot of frequency bands, with the lowest currently defined at 450 MHz. The most popular one at the moment is 3.4 GHz – 3.8 GHz, just above the main LTE/4G frequency of 2.6 GHz. By the way: those were the frequencies reserved for the now discontinued WiMax technology, and nobody complained about them in 2010. A comprehensive list of frequency bands used country by country is found here: https://en.wikipedia.org/wiki/List_of_5G_NR_networks. High frequencies for 5G start at 28 GHz; between 6 GHz and just above 20 GHz, spectrum is reserved for satellite communication. There are a couple of high-frequency bands defined, but hardly used; most installations of the so-called FR2 bands are purely experimental.
2. Microwave radiation propagates differently depending on frequency. For the low frequency range (450 MHz to 6 GHz) the exponents vary between 3.5 (450 MHz) and 3.9 (6 GHz); that is what I described as 4th order. A pure "plane terrestrial propagation model" would result in a clear-cut inverse 4th power. In reality, waves are reflected from the ground and bend around corners, then recombine at the receiver, which eventually gives the path loss formula that we use.
If we look at the high frequencies, their propagation characteristics are very different from those of the lower frequencies. They tend to follow straight lines, they are not bent around corners, and they are reflected only by very smooth surfaces (like the glass walls of modern office buildings). Thus there is a "Line of Sight" (LOS) attenuation and a "Non Line of Sight" (NLOS) attenuation. LOS does indeed follow an inverse-square free-space propagation formula, but keep in mind that this also depends on wavelength, not just on distance. NLOS has repeatedly been measured by research teams. A measurement project at NYU arrived at

PL_NLOS(d) [dB] = 68 + 45 · log10(d)

for non-line-of-sight, with d measured in meters, where 28 GHz line of sight would mean

PL_LOS(d) [dB] = 62 + 20 · log10(d).

This means: at a distance of 100 m from the source you experience a path loss of 102 dB (LOS) and 158 dB (NLOS). There is a factor of 400,000 (!) between the two, so the effect of high-band waves is restricted to the immediate neighborhood of the source (a square, a street, etc.).
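A tiny sketch of the two measured 28 GHz fits quoted above, confirming the ~400,000x gap at 100 m:

```python
import math

def pl_los_db(d_m: float) -> float:
    """28 GHz line-of-sight fit from the NYU measurements quoted above."""
    return 62 + 20 * math.log10(d_m)

def pl_nlos_db(d_m: float) -> float:
    """28 GHz non-line-of-sight fit."""
    return 68 + 45 * math.log10(d_m)

d = 100
print(pl_los_db(d), pl_nlos_db(d))                  # 102 dB vs. 158 dB
print(10 ** ((pl_nlos_db(d) - pl_los_db(d)) / 10))  # ~4e5: factor between the two
```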

 
arthur heath
14 April 2020 at 14:25

WHAT ARE THE REAL LONG TERM EFFECTS OF EXPOSURE, AND CAN IT CAUSE BIOPHYSICAL RESPONSES IN HUMAN BEINGS AS WELL AS ANIMALS, REDUCING OUR ABILITY TO FIGHT OFF BIO-CHEMICAL OR VIRAL ATTACKS?

 
Dr Kurt Behnke
22 May 2020 at 13:05

What is "real long term"? Mobile phones and networks have existed since about the 1940s. Mass deployment of 2G started more than 20 years ago. High-frequency microwaves (up to 50 or 60 GHz) have been in use since the 1960s, when the first communication satellites were sent up to orbit the earth. And we have thousands of such satellites up there today, with exactly the same kind of radiation, providing satellite TV around the world. If we haven't seen any long-term effects yet (and we haven't), we can safely say: there are none.
Conspiracy believers just don't want to notice that modern technology is able to draw so much more out of the same or even less. Compare the computing power of your first clumsy microcomputer with that of your smartphone today.

 
Fabio Coelho
16 April 2020 at 00:26

Dear Kurt,
Thanks very much for the effort and time you put into this article. I have some background knowledge to understand it and to push back on my sceptic friends – who very often fall for the well-laid quasi-pseudo-scientific arguments that 5G poses health risks – but your explanations have really helped me to a better argument.
Now, I have two questions for you, if you would be so kind as to comment, which you may or may not have covered already:
1. One common argument is that these frequencies have never been used before in consumer applications… and that this is the wavelength used in a weapon, the Active Denial System. I couldn't actually find the power for the ADS anywhere…
2. I understand TV broadcast bands have been moved from analogue to digital to free up the UHF band for 4G. Weren't the analogue TV transmission stations powered in the kW to MW range?
And if they operated below 3 GHz, similar to microwave cookers, at a longer and more penetrating wavelength, shouldn't we be seeing whole towns and villages near TV broadcast stations with a higher count of cancers, immune diseases or whatever for the past … Why isn't anyone talking about analogue TV effects on health over the past 50 years or so?

 
Dr Kurt Behnke
22 May 2020 at 13:08

Thanks for your question.

Answer to number 1: No, that is not true. All the frequencies now in discussion for 5G have been in technical use across the globe for a long time already. When we talk about "bulk" 5G this is plainly obvious: those frequencies have been used by terrestrial TV broadcast (below 1 GHz), by GSM and UMTS systems (between 1 GHz and 3 GHz, roughly), and by WiMax (3 GHz to 4 GHz). On top of that, GPS uses 1.5 GHz from some 50 satellites, and since your phone is receiving those signals, you can safely assume that their power is comparable to the signal from the base station you are connected to. WiFi at 2.4 GHz and 5 GHz is in widespread use (I assume that every household with internet access has a WiFi router working in the house).
For the higher frequencies: from 6 GHz upwards we have the satellite range. All communication satellites, including satellite TV, broadcast between 6 GHz and some 80 GHz. Received power levels are somewhat weaker (that is why we use parabolic dish antennas). The 5G high frequencies were put into the gaps not yet occupied by satellites.

Answer to number 2: Yes, absolutely correct. TV towers typically transmit 100 kilowatts EIRP, which is several thousand times more powerful than any radio base station (20 watts). And we do not see any such effect.
I would like to give you a counter-argument, just in case this comes up in your discussions. The historical origin of the "mobile networks cause cancer" story is, to my best knowledge, in England, some time after WW2. Radar (which I forgot to mention earlier) uses microwaves up to 300 GHz. It happened that the operating staff of flight radar stations had a higher rate of cancer, and an early conclusion traced this back to the microwaves. To make a very long story short: the reason was that the early radio valves used to create microwaves actually also created X-rays at a considerable rate, which of course was the reason for the cancer cases among the staff. Today there is no radio valve involved in generating microwaves.

 
R
17 April 2020 at 22:21

I agree with you that a person's exposure from the mobile is far higher than from the base station. But the loss formula you use has no factor for carrier frequency, yet we know the loss grows by 6 dB for every doubling in frequency. Is the formula just for the nominal 2 GHz LTE and legacy bands? For example, we see 28 GHz mmWave lose 109 dB at 250 m, while 900 MHz travels much farther.

 
Dr Kurt Behnke
22 May 2020 at 13:12

I built the frequency in and used a formula for 1 GHz. I didn't want to bother the reader with the truly complex story of the dependency on frequency. I also skipped the "environment part" of the formula: there are different path loss formulas for rural, urban and suburban regions, and there is also a log-normal statistical component.

Back to your question: yes, path loss increases by 6 dB per doubling of frequency. On the other hand, 28 GHz has very different propagation characteristics. While for the lower frequency bands the reflection from the ground is responsible for the inverse 4th power loss, in the high bands propagation is very similar (though not quite identical) to that of your infrared TV remote control: it goes direct, without reflection. In contrast to the infrared case (with its much higher frequency) you still have some shadowing loss. That is, we should expect a high, frequency-dependent initial attenuation, plus an inverse square law for the distance-dependent factor.

The attenuation with distance is dramatic: for example, 145 dB over a distance of 200 m. In the model in my post, a 1 GHz wave would lose just over 90 dB over the same distance.

 
D. Atanasov, M.SC.
20 April 2020 at 20:50

Thank you for this article, Dr Behnke.
I have a question: what is the need for so many dipoles in one antenna when they do not work simultaneously? Thank you!

 
Dr Kurt Behnke
22 May 2020 at 13:13

The dipole array generates a directional effect. The input signal to the antenna is split equally among the dipoles, and they transmit their signals in sync. If you stand in front of the antenna (at a distance, in the main direction), you get all the signals in sync, and the waves add up to the original strength. When you look at the antenna from an angle, the distance between you (your phone) and each dipole varies slightly. That shifts the waves from the individual dipoles a little relative to each other, and the overlay of these waves at the receiving antenna reduces the energy received. So you have a strong wave in the direction perpendicular to the antenna, and a weak signal (or no signal) from the side.
A dish antenna like the ones used for satellite TV has the same effect, but much stronger. Mobile networks cannot use such a strong directional effect.

 
Richard Corfield
21 April 2020 at 11:05

Worth noting that the higher power towers will be on the lower frequencies, those that until recently were used for TV. If you look at the information for our local TV transmitter you see that it’s putting out about 500,000W in total! These phone towers are really small in comparison. You’d not want to climb a TV transmitter tower while it was turned on! But we accept TV as comforting, old, technology.

 
Gary Lewis, New Zealand
18 May 2020 at 00:13

Good morning Kurt.
Congratulations on a well balanced and sane presentation. I thought you deflected some fairly nasty personal attacks with dignity and grace.
Here in New Zealand, we have experienced criminals setting fire to 8 cell towers in the last two days, but our police will almost certainly track them down and prosecute.
Kind regards to you.

 
Dr Kurt Behnke
26 May 2020 at 08:12

Thank you very much. And sorry to hear that. The same is happening in a few places across Europe and North America. It is a shame to see how far conspiracy belief can drive people.

 
Josh Higgs
7 June 2020 at 18:48

Finally some common sense, free of conspiracy and anecdotes. Thank you Kurt for taking the time to explain this, and thanks Google for putting it in my search results.

 
Bob Hay
12 June 2020 at 04:24

Thank you for your excellent comments. I teach a grad engineering class in RF design and your analysis is spot on. I also spent an hour listening to a bunch of arguments against permitting 5G base stations in our community and was amazed by how many conspiracy theories there are about the dangers posed by these base stations. People use big technical words they do not understand, things like "pulsed energy", which, of course, is pretty much the same as 4G. As you point out, anyone worried about the effects of RF energy should immediately stop using cellphones, WiFi, Bluetooth, and, for that matter, microwave ovens. The radiation of 5G from base stations is minuscule in comparison.

 
