Is this anything to worry about? 5G health issues explained

26.03.2019

Introduction

Announcements of a new mobile network generation (5G) have triggered a series of alarming claims about associated health threats. This is nothing new: the phenomenon has been with us since the 1990s. Similar claims were made around the launch of UMTS (3G) in 2000 and at the start of LTE in 2010. This time the wave is considerably stronger, mainly because social networks spread alleged "bad news" and alarming stories literally at light speed. Public opinion turns first and foremost against the cell towers, as they are the visible landmarks of the technology. Each time a tower is built, or planned, in a city or near a rural settlement, a new discussion flares up about the health effects of mobile phone or network radiation. It is about time to put things back into perspective. In this post I would like to deal with the reality of mobile network radiation.

My first statement here is: the main exposure of humans to mobile radio technology comes from handheld phones, not from base stations!

The reason is very simple: the power of electromagnetic radiation drops extremely fast as you move away from the transmitter. See the post https://www.grandmetric.com/blog/2018/02/20/explained-pathloss/ by Mateusz Buczkowski on this blog for a general introduction to the concept of path loss.

For a frequency of 1 GHz (a typical range for mobile phone networks) the path loss in decibels is

PL(r) = 7.3 + 37.6 · log10(r)

where r is the distance from the source to the measurement point in meters. This formula is due to the Japanese scientists Okumura and Hata, who carried out endless series of measurements and compiled them into empirical formulas. The Okumura-Hata formulas are internationally accepted and part of the mobile phone standards and acceptance rules. There are variants for different environments (city, rural) and frequencies, but they all show the same pattern. In very simple terms the formula says: the radiated power goes down with almost the 4th power of distance.
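For readers who want to experiment with the numbers, here is a minimal Python sketch of this simplified formula. The coefficients (7.3 and 37.6) are the ones used in the worked examples of this post; a full Okumura-Hata model also depends on frequency, environment, and antenna heights, so treat this as an illustration rather than a planning tool.

```python
import math

def path_loss_db(r_m: float) -> float:
    """Simplified ~1 GHz Okumura-Hata-style path loss (illustrative).

    The 7.3 dB offset and 37.6 dB/decade slope are the values used in
    the worked examples of this post; real models vary with frequency,
    environment and antenna heights.
    """
    return 7.3 + 37.6 * math.log10(r_m)

for r in (10, 100, 1000):
    print(f"{r:>5} m -> {path_loss_db(r):6.1f} dB path loss")
```

Note the 37.6 dB/decade slope: ten times the distance costs almost 40 dB, which is the "almost 4th power" statement in decibel form.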

 

Tower/base station perspective

Let us try it out. Antenna transmission power ranges from 250 mW (24 dBm) for a small cell up to 120 W (50 dBm) for the largest 5G MIMO arrays. A typical 2G, 3G, or 4G antenna has a transmission power of 20 W (43 dBm).

Let us quickly apply this to a user standing at a relatively small distance from the transmitter:

A small cell is comparable to a WLAN access point, and you can come pretty close. Assume a distance of 10 m: the path loss is 7.3 + 37.6 · log10(10) = 44.9 dB. Subtracting the path loss from the transmission power gives 24 dBm - 44.9 dB ≈ -21 dBm, which corresponds to approximately 8 µW (a µW is one millionth of a watt).

A 5G macro cell antenna is placed on a tower or on the roof of a high building, some 30 m above ground; assume a position 100 m away from the antenna. The path loss works out to 7.3 + 37.6 · log10(100) = 82.5 dB. The received power is 50 dBm - 82.5 dB ≈ -32 dBm, which is less than one µW.
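The two results above can be checked with a couple of lines of Python; the path-loss values 44.9 dB and 82.5 dB are the ones derived in the text.

```python
def dbm_to_microwatt(dbm: float) -> float:
    """Convert a dBm level (decibels relative to 1 mW) to microwatts."""
    return 10 ** (dbm / 10) * 1000.0

# Small cell: 24 dBm transmit power minus 44.9 dB path loss at 10 m
print(dbm_to_microwatt(24 - 44.9))    # ≈ 8.1 µW
# Macro cell: 50 dBm transmit power minus 82.5 dB path loss at 100 m
print(dbm_to_microwatt(50 - 82.5))    # ≈ 0.56 µW
```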

A light bulb consumes about 60 W, and the emitted light and heat are in that range. At home, your distance to a light bulb will be 2-3 meters. The impact of the light bulb on your body is therefore more than a million times higher. It is the general consensus in medical and biological research that the only effect of microwave radiation, such as that used in mobile networks, is heating of the target object.

 

Amendment

A video blog post made reference to the article above. Its author used some of my figures to construct a true "horror" case of mobile radiation, in which radiation in the kilowatt range would hit humans. I want to show with this amendment why his construction is a misconception. The blogger's argument hinges on his understanding of the term "antenna gain": he claims that I omitted antenna gain from my high-level calculations, and that antenna gain would turn my innocent-looking figures into a real power monster.

What is antenna gain? The term sounds like a hidden amplifier, which is a complete misconception. "Directional gain" would be a much better fit and should be used in the technical literature. Antennas are passive, with no electrical power connected. They simply receive the radio-frequency signal from the transmission circuitry and convert it into electromagnetic waves. Since it does not make much sense to radiate in all spherical directions (upwards, downwards), antennas are constructed to focus the radiation into a solid angle, typically 120 degrees wide and 15 to 20 degrees high. For all who have seen the video, I would like to add the basic construction of such antennas (see figure below):

Directional antenna gain

You see the enclosures mounted on the pole on the left-hand side, and a schematic drawing next to it showing the internal structure. The enclosure is what people see when they look at a cell tower. Inside you see 12 vertical beams in a 2×6 arrangement: the dipoles. These dipoles are the radiating elements, and each dipole receives only a fraction of the total transmission energy supplied to the antenna. This is where the video goes wrong: its author assumes that each dipole gets the full 20 watts and, with "thousands of dipoles", arrives at his key message. The total energy supplied to and radiated by the antenna does not change through this arrangement, though. Wave interference causes the wavefronts generated by the dipoles to add up or cancel out depending on the direction, so the energy is focused into the angle shown in the drawing. The 5G "massive MIMO" technology just uses more dipoles (for example 64 in an 8×8 or 128 in a 16×8 configuration instead of just 2×6) and feeds them with dynamically delayed signals, so that the "beam" can move and sweep an area. And never are "thousands of dipoles" used in antenna construction: no digital signal processor available today can do the MIMO mathematics (complex matrix multiplications) for that many elements simultaneously and in real time.

Antenna gain is the result of this radiation focus: the antenna in the picture has an antenna gain of 15 dB, which simply tells you that in the main transmission direction there is about 30 times more power than to the side. The total radiation emitted by the antenna remains unchanged.
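The claim that focusing changes the direction of the energy but not its total amount can be illustrated numerically. The sketch below models an idealized 12-element array with half-wavelength spacing and equal power split across elements; these are assumptions for illustration, not the actual antenna design. The peak directional power comes out at about N = 12 times the spherical average, while the average itself stays at 1, i.e. the radiated total is unchanged.

```python
import numpy as np

N = 12                                   # dipoles, as in the 2x6 panel above
theta = np.linspace(0.0, np.pi, 100_001)
psi = np.pi * np.cos(theta)              # inter-element phase, spacing = lambda/2
# Field of N equally fed elements, total input power normalized to 1:
field = np.exp(1j * np.outer(np.arange(N), psi)).sum(axis=0)
power = np.abs(field) ** 2 / N           # radiated power per direction

peak = power.max()                       # main lobe: ~N times the average
dtheta = theta[1] - theta[0]
avg = (power * np.sin(theta)).sum() * dtheta / 2   # average over the sphere
print(f"peak ≈ {peak:.1f}, spherical average ≈ {avg:.3f}")
```

The interference pattern redistributes the power (peak ≈ 12, average ≈ 1); nothing is amplified, which is exactly the point about "directional gain" above.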

By the way, the total transmission power is limited by legal and regulatory requirements. And the regulatory administrations in all countries I know of add up the total radiation level from a tower, rather than considering just a single antenna.

 

Mobile Phone Perspective

Let us look at phone radiation, then. The phone next to your head transmits at most about 200 mW (23 dBm). That is at least 10,000 times more than the signal received from the tower. Typical transmission power values of phones are a lot lower, though. The base station at the tower controls the power of the phone: it sets the phone's transmission power to a level at which all phone signals are received at approximately the same strength. If you are near a tower, your phone transmits at the minimum level (again, below one milliwatt). Only if the reception from the tower is very bad will your phone be commanded to increase its transmission power. It may sound paradoxical, but more mobile base stations mean lower overall radiation levels.

Phone transmission power has gone down since the first generations of mobile communication. In GSM, phones were allowed to transmit up to 1 W (sometimes even 2 W). You may remember that 20 years ago the typical heavy user held the phone against the head and made voice calls all the time. With today's smartphones the typical user hardly makes phone calls any more, and instead holds the phone about 1 m away from the face for screen interaction.

The impact of phone radiation has gone down dramatically since the early 2000s. If there were any health effect from mobile phone radiation, we should have started to see it by now: for the past 20 years we have had millions of users exposed to higher radiation than today. For example, there is simply no sign of the increase in cancer rates that some people have been predicting since the year 2000. None of the studies constantly cited in alarmist news has ever passed scientific quality review; they have been rejected for selection bias, sample sizes that are too small, and many other reasons. The WHO and national health administrations review these studies very critically. If you are interested in more details on these aspects, see the following post.

Author

Kurt Behnke

Kurt Behnke received his PhD in Mathematics from the University of Hamburg (Germany) in 1981, and his second degree (Habilitation) in 1986. He published more than 20 original research papers, joined the Max Planck Institute of Mathematics in Bonn as a guest researcher, was a Visiting Fellow at Warwick University, and was awarded a Heisenberg Research Grant by the German Research Council in 1987. In 1991 he made a career shift to telecommunications, where he worked for Philips Communications on OSI system management applications for fiber transmission systems (SDH). In 1993 he joined the California-based startup Raynet and helped create the first optical fiber access system. From 1997 on he worked for Ericsson in various national and international assignments, including a position as Director of Customer Operations for T-Mobile International and one as Head of Managed Services North Africa and Middle East. He retired in 2017 and keeps himself busy lecturing part-time on data communications and mobile networks at a local university, with occasional consultancy assignments in the areas of LTE, 5G and machine-type communications.

77 Comments
Auro
23 October 2019 at 20:03

Thank you so much for this very informative article!

 
Rommel
20 January 2020 at 12:55

Since you proclaim that you are an expert, what kind of worst-case scenario test did you do? If there ever was a test, how long was its duration, how many cell transmitters did you use on the same spot, and how many cellphones side by side did you test? Kindly publish your experiment procedure, duration and data. Since you are a PhD you know what a worst-case scenario means…. tsk tsk tsk tsk

 
Dr Kurt Behnke
30 March 2020 at 11:57

This type of comment doesn’t make it easy to give a serious response. I know that conspiracy theorists are using the “ad hominem” tactics for attacking unwelcome facts. You start with an implicit personal attack and you end with a similar one. No, I am not “proclaiming” to be an expert, and I have never claimed to be one. Just spent more than 25 years in this industry, and have a background that allows me to confidently work with figures and numbers. And then, what does your question have to do with being an expert or not?

But let me try to respond to your question anyway. Simulations at 3GPP have included 1 million devices per square kilometer (that is one per square meter). In normal city environments you can "fit" up to 10 base stations into a square kilometer. Football stadiums are equipped with up to 70 base stations, and on a match afternoon there are up to 80,000 visitors in the arena, all of whom you can safely assume carry a mobile phone. Those are "worst case" scenarios. When it comes to a single "spot", all countries that I know of have tough regulatory constraints on the radiation power installed in one location: internationally there is an agreed upper limit of 60 V/m field strength, and providers have to apply for permission with detailed information on the equipment used.

With all this said: the conclusion is that the radiation impact is 99% from devices, not from towers. And that can be verified exactly by measurements. And you can take those measurements everywhere. And the impact from devices is solely the impact from your personal device, carried by you. Even the accumulated power of 1000 devices in a crowd of people near you is negligible compared to what you hold in your hand. Distance matters a lot when it comes to radiation.

 
Omonigho Anthony E.
21 April 2020 at 08:22

Very clear explanation Dr. much appreciated

 
Michael Klaus
14 May 2020 at 00:31

The example of the football stadium is a worst case scenario only in the range of officially intended use. Should a hacker/government succeed in directing the maximum possible number of beams at one point, how much power would this exert on the volume (e.g. compared to the active denial system)?
There’s a fear that 5G might be used as a weapon, and having this ‘catastrophic case’ number is utterly important to demonstrate how (un)realistic it is.

 
Dr Kurt Behnke
26 May 2020 at 08:08

The scenario you point out is “bad sci-fi” at best. I’ll give you a technical rebuttal, but I would also like to point out where the “weapon 5G” story comes from.

5G mm waves (spectrum above 28 GHz) propagate more or less in straight lines: no massive reflection, no bending around corners, only line of sight. This means you "illuminate" only the square or street where the base station is located. High-frequency waves encounter a massive path loss, as detailed in another comment above. Within 100 m from the source the signal can attenuate by more than 120 dB (a factor of 10^12). With this in mind, let us play your "worst case and weapon" game, to stop this story once and for all.

I don't want to give your "evil hacker" unlimited "god-like" power; he will have to stay within the physical limits of the system. But I will ignore the technical limitations he would encounter, such as:
– Technology does not allow direct manipulation of the beamforming (but we assume he has got that). In fact, the beams sweep the whole area all the time; otherwise new customers would have no chance to sign on.
– The O&M system includes a couple of AI and rule-based components, plus intrusion detection and prevention (but we may assume he is an employee with admin rights in the Network Operations Center (NOC)).
I assume an open square of 100×200 meters and 66 base stations distributed around it, which means a pole every 20 m. My next assumption: each of these base stations has three radio heads (transmission amplifiers) that each generate 60 watts of output for massive MIMO antennas with 40 dB of antenna gain (equivalent to a 6 m satellite dish, and of similar size – 4,000 transmitting dipoles). Note that the legal limits are 20 W at best, with an accepted antenna gain of 16 dBi. And we will assume that our hacker directs all of the transmitters to send their signals towards a single point of the square, simultaneously. We pick 100 points at random on the square and calculate the path loss and the received power from the Friis free-space formula, which includes distance and frequency. The point with the highest received radiation level and the average are included in the table below. You will notice that at no point do we get more than 0.2 milliwatts of received radiation, which is 1/1,000 of the phone radiation. For anybody wanting to hurt people, millimeter waves are a very poor choice.
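The Friis machinery behind this scenario can be sketched in a few lines. These are generic link-budget helpers; the 60 W / 40 dBi figures are the deliberately exaggerated ones from the scenario above, and the resulting levels depend strongly on the assumed distances, so this is an illustration of the method, not a reproduction of the table.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(d_m: float, f_hz: float) -> float:
    """Friis free-space path loss in dB."""
    return 20 * math.log10(4 * math.pi * d_m * f_hz / C)

def rx_dbm(tx_dbm: float, gain_dbi: float, d_m: float, f_hz: float) -> float:
    """Received power in dBm at an isotropic receiver, free-space conditions."""
    return tx_dbm + gain_dbi - fspl_db(d_m, f_hz)

# One 60 W (47.8 dBm) radio head with the assumed 40 dBi gain at 28 GHz:
for d in (50, 100, 200):
    mw = 10 ** (rx_dbm(47.8, 40.0, d, 28e9) / 10)
    print(f"{d:>4} m -> {mw:.3f} mW received")
```

Even with these absurdly inflated parameters, free-space attenuation at 28 GHz pulls single-link levels down to fractions of a milliwatt over typical distances on such a square.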

This is also the point where the alleged military frequency story breaks down: Decades ago, US Army research actually did some experiments around 60 GHz microwave weapons, after the technology allowed to generate this frequency (which is non-trivial). The 60 GHz were not chosen because they were anything like “lethal”, but because you could hope to aim and direct a beam towards a target. Something that is hopeless for 5 GHz frequencies. According to my knowledge they tried a transmission of 100,000 Watt, in order to create a non-destructive weapon. It seems to have resulted in the conclusion that it would be more efficient and more destructive to throw the massive antenna at the enemy directly. 10 Kilowatt transmitters have been tried as a non-lethal police weapon; this failed too.

 
Stavros
18 February 2020 at 15:34

Its nonsense and manipulation. Why don’t you compare 60W X-Ray transmitter to the light bulb? Even amateurs know that different wave length cannot be compared because the impact on the human body is always different.

 
Dr Kurt Behnke
30 March 2020 at 11:59

I did the comparison of mobile radiation with light bulb as an illustration. Both are electromagnetic waves. The frequency of mobile radiation (also known as microwave) is much lower (by a factor of 100,000) than the frequency of light. Which means that the photons of light are more energetic (by that factor) than the photons of microwave radiation. Thus the damage that light from a light bulb can cause on living tissue is much bigger than the possible damage caused by microwave of same energy.
And I am not comparing X-Ray, because X-Ray frequencies are 1000 times higher than those of light. X-Ray photons are a lot more aggressive than light photons, and 100,000,000 times more aggressive than mobile phone microwave radiation.
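The photon-energy argument can be made concrete with Planck's relation E = h·f. The example frequencies below are assumptions for illustration (1 GHz for a mobile signal, roughly 550 THz for visible light, and an X-ray frequency 1000 times that of light); molecules need a few eV to ionize, which is why only the upper end of this range is ionizing.

```python
H  = 6.62607015e-34   # Planck constant, J*s
EV = 1.602176634e-19  # joules per electron-volt

def photon_ev(f_hz: float) -> float:
    """Photon energy in eV at frequency f (E = h*f)."""
    return H * f_hz / EV

microwave = photon_ev(1e9)      # mobile signal: ~4e-6 eV
visible   = photon_ev(5.5e14)   # green light:  ~2.3 eV
xray      = photon_ev(5.5e17)   # soft X-ray:   ~2.3 keV
print(f"light/microwave photon energy ratio: {visible / microwave:.0f}")
```

The exact ratio depends on which mobile band and which color of light you pick, but the microwave photon always sits many orders of magnitude below the ionization threshold.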

 
hexalm
19 April 2020 at 00:38

Just a minor quibble, but a 60W bulb rating isn’t comparable to an antenna power output rating. Most of the 60W of the bulb will be radiated as heat rather than light/EM as incandescent bulbs are only 5% efficient.

 
Dr Kurt Behnke
22 May 2020 at 13:12

Yes, but heat radiation is again electromagnetic radiation, and eventually 100% of the 60 watts goes into radiation: be it visible light, a minor portion of UV, or infrared.

 
Fjeldfross Ph.D.
20 April 2020 at 03:05

While I basically agree with your analogy to the light bulb from an energy point of view, there is an effect of wavelength/frequency. Microwaves work by using the resonant frequency of water molecules, causing them to vibrate and generate heat by friction. This doesn't happen with light because its higher frequency doesn't have the same effect. So the effect of the radiation on the body can be different between different frequencies. Neither microwaves nor visible light (is 'visible light' a tautology?) have enough energy to cause cellular damage because they are not in the ionising spectrum. Microwaves penetrate tissues to only a very small depth. Microwave cooking depends mostly on conduction of heat in the surface layers towards the centre of the mass.

 
DP
26 August 2021 at 01:53

Comparing the “radiation” of the light bulb with the 5G radiation at frequencies from 3000 to 39000 MHz is not right. Just think of the bulb radiation pattern, kind of an isotropic radiator pattern. I agree with Mr. Fjeldfross Ph.D. Such a comparison IS NOT RIGHT AT ALL.

All those TALKING HEADS who swear that is not harmful (probably isn’t) should be put to live in a place close to a BTS with their families included if they are so convinced 100% that there is no problem at all. Let’s see how many that Talk the talk also do walk the walk!

Do you get paid you to write this? If so, who pays?

 
Kurt Behnke
25 October 2021 at 18:16

I have worked in buildings with less than 50m distance to major mobile sites during office hours (8 – 12 hours per day) in Düsseldorf, Germany 1997 – 2001, and 2005 -2007, 2013 – 2017 next to Vf Germany headquarter, in Bonn, Germany (2001 – 2005) just across the street from T-Mobile headquarters, in Cairo, Egypt, Smart Village, from 2008 to 2012. Been on many active sites during configuration and test runs as PM for network services. 70 years of age, still alive and well.

There is simply ZERO evidence for human health hazards from mobile. To the contrary: the explicit statistics of glioma, for example from the United States and Germany between 1990 and 2015, show a constant level of cases over 25 years, a period in which mobile communication became commonplace and basically everyone started using phones. The heaviest exposure was certainly during the late 1990s and early 2000s, when mobile communication meant voice calls with the phone directly at your head. That is 20 years in the past, and still no effect. What are you guys chasing here?

 
Bub Dub
11 April 2020 at 21:24

Wavelengths at visible and below are non-ionizing, and the worst they can do is heat you up. If you don't feel heat coming from the tower, you're not being affected. That's why a lightbulb can only heat your skin but can't give you skin cancer, whereas being in the sun for an hour can: ultraviolet radiation at higher frequencies is ionizing.

 
Dr Kurt Behnke
4 May 2020 at 09:28

I really do not know what you are trying to tell me here, except throwing terms like "nonsense", "manipulation", "lack of knowledge" and "ignorance" (just a quick selection from your 8 lines of text) at me. I think you are disqualifying yourself with such a comment. Let me recall that I have made three main points:
– If there were any impact on human bodies from the electromagnetic radiation associated with mobile communication, it would come from the phone in the user's hand. Not from a tower, however close it might be, and not from other devices nearby.
– The transmission power of mobile phones is strictly limited by law and regulation. It is more than an order of magnitude lower than the power from a nearby 60 W light bulb, and the energy of microwave photons is roughly 100,000 times lower than that of visible-light photons. Without high photon energy, there is no impact on body chemistry and health.
– All 'studies' that claim to prove negative impacts on human health and wellbeing show major flaws in methodology, data collection and analysis. The major international studies, such as those conducted by the WHO, show no impact.
If you want to criticize any of these points, please do so. But I would expect solid arguments based on science, not on conspiracies. And no insults, please.

 
Stavros
18 February 2020 at 15:29

Comparing any radio transmitter to 60W light bulb proofs that a writer may have lack knowledge about light and radio waves spectrum. Its total nonsense. Its typical “scientific position” that ignores everything other than what it is currently analyzing.

 
Dr Kurt Behnke
30 March 2020 at 12:00

I did the comparison of mobile radiation with light bulb as an illustration. Both are electromagnetic waves. The frequency of mobile radiation (also known as microwave) is much lower (by a factor of 100,000) than the frequency of light. Which means that the photons of light are more energetic (by that factor) than the photons of microwave radiation. Thus the damage that light from a light bulb can cause on living tissue is much bigger than the possible damage caused by microwave of same energy.
And I am not comparing X-Ray, because X-Ray frequencies are 1000 times higher than those of light. X-Ray photons are a lot more aggressive than light photons, and 100,000,000 times more aggressive than mobile phone microwave radiation.

 
Richard Cliffe
20 April 2020 at 23:06

Thank you Dr Behnke for your detailed explanation and patience. It is helpful to have the detailed explanation and actual numbers. I spent a lifetime in military telecomms and heavy radar and now find myself having to spend time in my retirement debunking the faux science used to propagate 5G conspiracy theories.

 
Fred Nemano PhD
4 April 2020 at 03:01

You clearly do not understand the nature of energy (transmitted and incident) and the electromagnetic spectrum – hence your confusion with light bulb energy vs. radio waves….The physics of EMR is well established (for many decades in fact). And it is absolutely science! Your comments dismissive of the scientific position betray what I can only describe, with all due respect, as ignorance. Unless of course your are proposing a different ‘physics’ to that of Maxwell, Einstein etc.

By way of a brief illustration, let assume you are placed in front of THREE 60w direction narrow beam – alfa, beta and gamma energy generators/transmitters – at a distance of say 2m.

You will find that Alpha particles cannot penetrate intact skin while Beta particles can partially penetrate skin, causing “beta burns”. Gamma and x-rays can pass through a person damaging cells in their path.

 
Adam
11 April 2020 at 07:01

Please note that alpha and beta radiation are not forms of electromagnetic radiation, thus are not comparable to this situation described in the blog post regarding exposure to a light bulb vs that of a cell tower, which are both sources of EM radiation.

 
Joshua
18 April 2020 at 20:59

What on earth are you talking about? Alpha and beta radiation is NOT electromagnetic radiation… they are highly energetic particles of matter. Visible light, 5G, infrared are all part of the electromagnetic spectrum (photons) and therefore directly comparable

 
Richard Cliffe
20 April 2020 at 22:59

Fred, Alpha radiation is not electromagnetic radiation (it is a helium nucleus – a particle) and neither is beta radiation (electrons- a particle). You may have dearly held beliefs on the subject but your comments add no value because they are meaningless. A 16 year old school science student would recognize your comments as such.

 
Steve Quon, PhD, Physics
27 April 2020 at 00:01

Fred, to clarify: you are comparing alpha (doubly ionized He) and beta particles (free electrons) with EM radiation, that is, particle emission versus EM wave radiation. The "burns" are due to kinetic-energy collisions. Gamma and X-ray EM radiation can cause ionization of electrons. As far as 5G frequencies are concerned, we are talking about RF absorption near the water absorption frequency of 2.45 GHz, which can cause heating.

 
Dr Kurt Behnke
4 May 2020 at 09:29

Again, I really do not know what you are trying to tell me here, except throwing terms like "nonsense", "manipulation", "lack of knowledge" and "ignorance" (just a quick selection from your 8 lines of text) at me. I think you are disqualifying yourself with such a comment. Let me recall that I have made three main points:
– If there were any impact on human bodies from the electromagnetic radiation associated with mobile communication, it would come from the phone in the user's hand. Not from a tower, however close it might be, and not from other devices nearby.
– The transmission power of mobile phones is strictly limited by law and regulation. It is more than an order of magnitude lower than the power from a nearby 60 W light bulb, and the energy of microwave photons is roughly 100,000 times lower than that of visible-light photons. Without high photon energy, there is no impact on body chemistry and health.
– All 'studies' that claim to prove negative impacts on human health and wellbeing show major flaws in methodology, data collection and analysis. The major international studies, such as those conducted by the WHO, show no impact.
If you want to criticize any of these points, please do so. But I would expect solid arguments based on science, not on conspiracies. And no insults, please.

 
Martin Ericson
24 February 2020 at 18:09

Your conversion of dBm to W is wrong in all examples.

 
Dr Kurt Behnke
4 May 2020 at 09:09

How so? That is a very broad statement. I offer a challenge: I convert a watt value into dBm, and you show me what you think is wrong with it. 20 watts = 20,000 milliwatts, and 10 · log10(20,000) ≈ 43, up to a few digits after the decimal point. Which tells us: 20 watts (by the way, the power limit for mobile transceivers on cell towers) corresponds to 43 decibels relative to a reference value of 1 milliwatt, which we shorten to 43 decibel-milliwatts, or 43 dBm.
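The conversion in both directions, in a few lines of Python, for anyone who wants to check the figures used throughout the post:

```python
import math

def w_to_dbm(p_watt: float) -> float:
    """Watts -> dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(p_watt * 1000.0)

def dbm_to_w(dbm: float) -> float:
    """dBm -> watts."""
    return 10 ** (dbm / 10) / 1000.0

print(w_to_dbm(20))      # ≈ 43.0  (the 20 W macro transceiver)
print(dbm_to_w(24))      # ≈ 0.25  (the 24 dBm small cell, in watts)
```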

 
Grant
30 March 2020 at 17:04

Thank you for your time and effort in explaining these concepts. It is refreshing to hear from an expert in the field of EM explaining the magnitude differences that the fear-mongering, google-expert, tin-hat-brigades conveniently disregard. I like the way you bring in the visual spectrum in the explanation – something that everyone can relate to. I am an electronic engineer myself who specializes in wireless communication and am thus constantly bombarded with these conspiracy theories and asked for explanations. I now conveniently refer all queries to your site. Vielen Dank!

 
Jonathon
5 April 2020 at 02:44

Hi, thanks for the explanation. Got a question, if you don't mind answering. I was wondering about the wattage of the satellites being used, as you mentioned power drops off over distance. I've been struggling to find much info on how it actually transmits from the satellites, but if the power is mega high isn't there a danger of it interacting with planes or even birds in some way? I'm guessing if something was hit with a beam in the 500-watt range they'd know about it.

 
Dr Kurt Behnke
4 May 2020 at 09:12

Thanks for your question. Just to clarify at the beginning: There are no satellites in 5G or any mobile communication. 5G is all terrestrial stuff — the professional abbreviation for the mobile networks is PLMN = Public Land Mobile Network. There is a lot of satellite communication, though. It works through basically the same kind of electromagnetic wave transmissions that we use in land-based mobile and fixed communications.

There are different types of satellite orbits to begin with: Low Earth Orbit (LEO) up to 2,000 km altitude, geostationary orbits over the equator at approximately 36,000 km altitude, and Medium Earth Orbit (MEO) for everything in between. The ISS is in LEO; the GPS, GALILEO and GLONASS positioning satellites are in MEO at roughly 20,000 km. TV satellites and most data transmission satellites are in geostationary positions; otherwise you would have to continuously re-aim your satellite TV dish to stay tuned.

Transmission power of satellites is surprisingly low; it is quite common that each transponder transmits 200 – 300 Watts (corresponding to decibel values of 53 to 55 dBm). They are using pretty large parabolic antenna dishes to focus that power into a smaller angle. There is a massive path loss on the long distance to the surface (and given 200 to 1000 km height of a “Low Earth Orbit” I consider even an airplane at 10 km altitude as being on the surface).

The GPS signal arrives down here at a reasonable level to be received by the small antennas of mobile phones and fitness gadgets. It is a very low bandwidth signal, so that there is only little thermal noise involved. The TV-Sat signal arrives here with a power level that is on the edge of recognizability. Thermal noise on the link is already higher than the signal level. It needs a rather big dish (gain = 25 dB) and the so-called LNB (gain = 15 dB) to bring the signal to a level where it can be detected and decoded by TV Sat receivers.

To compare with mobile phone signals: depending on distance to site, the signal received by your phone from the cell tower will be between -50 dBm and -115 dBm. Which is another confirmation of my point in the blog post.
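A rough sketch of such a satellite link budget in Python; the 33 dBi satellite antenna gain and the 12 GHz Ku-band downlink are assumed, illustrative values, chosen only to show the order of magnitude.

```python
import math

def fspl_db(d_m: float, f_hz: float) -> float:
    """Friis free-space path loss in dB."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * d_m * f_hz / c)

# Hypothetical GEO TV downlink: 55 dBm transponder output, ~33 dBi
# satellite antenna gain (assumed), ~12 GHz, ~36,000 km slant range.
pl = fspl_db(36_000e3, 12e9)
rx = 55 + 33 - pl          # at an isotropic receiver, before dish and LNB
print(f"path loss ≈ {pl:.0f} dB, received ≈ {rx:.0f} dBm")
```

A level around -117 dBm at an isotropic antenna is indeed at the edge of detectability, which is why the dish and LNB gains mentioned above are needed.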

 
Arjen
9 April 2020 at 15:27

Quick question around the units.

When you run the calculations, you are talking about watt rather than watt per square meter. What am I to infer? That it is the total power output through a cross section of the beam? Or just through a 1 meter square and that the Okumura-Hata formula already includes the inverse square law?

 
Dr Kurt Behnke
22 May 2020 at 13:00

The Okumura-Hata formulas do indeed include the aperture of an (idealized) isotropic antenna – hence the unit is watts and not watts/m². A normal dipole, as used in most mobile phones, has an antenna gain of 2.15 dBi (which I included in the calculations but did not explicitly write down).

Antenna arrays and MIMO are a difficult subject for phones: they create a directional preference and would therefore require the user to aim the phone at the transmitter, which most users would consider an inconvenience, to put it mildly. Thus MIMO in phones is more or less "transmit diversity", which does not add antenna gain.

 
Wes Groleau
14 April 2020 at 01:58

I’ll take your word about the 4th power, though I was taught the “inverse square law,” i.e., second power. Either way leads to the same conclusion that the phone in my hand does more than the tower next door. However, you cite a formula for 1GHz. Doesn’t 5G Work on 60 GHz?

 
Dr Kurt Behnke
22 May 2020 at 13:04

Ok. This requires to clarify a few things.
1. “5G use 60 GHz” is a massive, but very common misunderstanding. 5G uses a lot of frequency bands, with the lowest currently defined at 450 MHz. The most popular one at the moment is the 3.4 GHz – 3.8 GHz, so just above the main LTE/4G frequency of 2.6 GHz. By the way: those were the frequencies reserved for the now discontinued WiMax technology – nobody complained about them in 2010. A comprehensive list of frequency bands used country by country is found here: https://en.wikipedia.org/wiki/List_of_5G_NR_networks. High frequencies for 5G start at 28 GHz. Between 6 GHz and just above 20 GHz spectrum is reserved for satellite communication. There are a couple of High Frequency bands defined, but hardly used. Most installations of the so-called FR2 bands are purely experimental.
2. Microwave radiation propagates differently depending on frequency. For the low frequency range (450 MHz to 6 GHz) the exponents vary between 3.5 (450 MHz) and 3.9 (6 GHz). That is what I described as 4th order. A pure "plane terrestrial propagation model" would result in a clear-cut inverse 4th power; in reality, waves are reflected off the ground and bend around corners to rejoin at the receiver, which eventually yields the path loss formula that we use.
If we look at high frequencies, their propagation characteristics are very different from those of the lower frequencies. They tend to follow straight lines, they are not bent around corners, and they are reflected only by very smooth surfaces (like the glass walls of modern office buildings). Thus there is a "Line of Sight" (LOS) attenuation and a "Non Line of Sight" (NLOS) attenuation. LOS does indeed follow an inverse-square free-space propagation formula, but keep in mind that this also depends on wavelength, not just on distance. NLOS has repeatedly been measured by research teams. A measurement project at NYU for 28 GHz resulted in PL_NLOS(d) = 68 + 45 log10(d) dB, with d now measured in meters, where line of sight would mean PL_LOS(d) = 62 + 20 log10(d) dB. Which means: at a distance of 100 m from the source, you experience a path loss of 102 dB (LOS) or 158 dB (NLOS). There is a factor of 400,000 (!) between the two. Which means that the effect of high-band waves is restricted to the immediate neighborhood of the source (a square, a street, etc.).
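The two 28 GHz figures quoted in the reply can be reproduced with a few lines of Python (a sketch of the quoted fits, not an official channel-model implementation):

```python
import math

def pl_los_28ghz(d_m):
    """28 GHz Line-of-Sight fit quoted above: 62 + 20*log10(d)."""
    return 62 + 20 * math.log10(d_m)

def pl_nlos_28ghz(d_m):
    """28 GHz Non-Line-of-Sight fit (NYU measurements): 68 + 45*log10(d)."""
    return 68 + 45 * math.log10(d_m)

d = 100  # meters
gap_db = pl_nlos_28ghz(d) - pl_los_28ghz(d)   # 56 dB at 100 m
linear_factor = 10 ** (gap_db / 10)           # roughly 400,000
```

The 56 dB gap converts to a linear power factor of 10^5.6, about 4 x 10^5, which is the 400,000 in the text.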

 
arthur heath
14 April 2020 at 14:25

What are the real long-term effects of exposure, and can it cause biophysical responses in human beings as well as animals, reducing our ability to fight off biochemical or viral attacks?

 
Dr Kurt Behnke
22 May 2020 at 13:05

What is "real long term"? Mobile phones and networks have existed since about the 1940s. Mass deployment of 2G started more than 20 years ago. High-frequency microwaves (up to 50 or 60 GHz) have been in use since the 1960s, when the first communication satellites were sent up to orbit the earth. And we have thousands of such satellites up there today, with exactly the same kind of radiation, providing satellite TV around the world. If we haven't seen any long-term effects yet (and we haven't), we can safely say: there are none.
Conspiracy believers just don't want to notice that modern technology is able to draw so much more out of the same or even less power. Compare the computing power of your first clumsy microcomputer with that of your smartphone today.

 
Fabio Coelho
16 April 2020 at 00:26

Dear Kurt,
Thanks very much for the effort and time put into this article. I have some background knowledge to understand and refute my sceptic friends – who very often fall for the well-laid pseudo-scientific arguments that 5G poses health risks – but your explanations have really helped me towards a better argument.
Now, I have two questions for you, if you would be so kind as to comment, which you may or may not have covered already:
1. One common argument is that these frequencies have never been used before in consumer applications… and that this is the wavelength used in a weapon, the Active denial system. I couldn’t actually find the power for the ADS anywhere…
2. I understand TV broadcast bands, have been moved from analogue to digital to free up the UHF band for 4G. Weren’t the analogue TV emission stations powered in the kW to MW?
And analogue TV operated below 3 GHz, similar to microwave cookers, at a longer and more penetrating wavelength. Should we, therefore, be seeing whole towns and villages near TV broadcast stations with a higher count of cancers, immune diseases or whatever for the past … Why isn't anyone talking about analogue TV's effects on health over the past 50 years or so?

 
Dr Kurt Behnke
22 May 2020 at 13:08

Thanks for your question.

Answer to number 1: No, that is not true. All the frequencies now in discussion for 5G have already been in technical use across the globe for a long time. For "bulk" 5G this is plainly obvious: those frequencies have been used by terrestrial TV broadcast (sub-1 GHz), by GSM and UMTS systems (between 1 GHz and 3 GHz, roughly), and by WiMax (3 GHz to 4 GHz). On top of that, GPS uses 1.5 GHz from some 50 satellites, and since your phone receives those signals, you can safely assume their received power is no higher than the signal from the base station you are connected to. WiFi at 2.4 GHz and 5 GHz is in widespread use (I assume that every household with internet access has a WiFi router working in the house).
For the higher frequencies: up from 6 GHz we have got the satellite range. All communication satellites, including satellite TV, broadcast between 6 GHz and some 80 GHz. Received power levels are somewhat weaker (that is why we use the parabolic dish antennas). The 5G high frequencies were put in the gaps not yet occupied by satellites.

Answer to number 2: Yes, absolutely correct. TV towers typically transmit 100 kilowatts EIRP, which is several thousand times more powerful than any radio base station (20 watts). And we do not see health effects there either.
I would like to give you a counter-argument, just in case this comes up in your discussions. The historical origin, to my best knowledge, of the "mobile networks cause cancer" story is in England, some time after WW2. Radar (I forgot to mention that earlier) uses microwaves up to 300 GHz. It happened that operating staff of flight radar stations had a higher rate of cancer, and an early conclusion traced this back to the microwaves. To make a very long story short: the reason was that the early radio valves used to generate microwaves actually also produced X-rays at a considerable rate, which of course was the reason for the cancer cases among the staff. Today there is no radio valve involved in microwave generation.

 
R
17 April 2020 at 22:21

I agree with you that a person’s exposure from the mobile is far higher than from the base station. The loss formula you use has no factor for carrier frequency, yet we know range drops 6dB for every doubling in frequency. Is the formula just for nominal 2GHz LTE and legacy bands? For example, we see 28GHz mmWave loses 109dB at 250m but 900MHz travels much farther.

 
Dr Kurt Behnke
22 May 2020 at 13:12

I have built in the frequency and used a formula for 1 GHz. I didn't want to bother the reader with the truly complex story of the dependency on frequency. I have also skipped the "environment part" of the formula: there are different formulas for path loss in rural, urban and suburban regions, and there is also a "log-normal statistical component".

Back to your question: yes, path loss increases by 6 dB per doubling of frequency. On the other hand, at 28 GHz you have different propagation characteristics. While for the lower frequency bands the reflection on the ground is responsible for the inverse 4th power loss, in the high band propagation is very similar (not quite, though) to your infrared TV remote control: the propagation is direct, without reflection. In contrast to the infrared case (with its much higher frequency) you still have some shadowing loss. That is: we should expect a large frequency-dependent initial attenuation, plus an inverse-square law for the distance-dependent factor.

The attenuation with distance is dramatic nonetheless: e.g. 145 dB over a distance of 200 m, where in the model in my post the 1 GHz wave would lose just over 90 dB over the same distance.

 
D. Atanasov, M.SC.
20 April 2020 at 20:50

Thank you for this article, Dr Behnke.
I have a question. What is the need for so many dipoles in one antenna when they do not work simultaneously. Thank you!

 
Dr Kurt Behnke
22 May 2020 at 13:13

The dipole array generates a directional effect. The input signal to the antenna is split equally among the dipoles, and they transmit their signals in sync. If you stand in front of the antenna (at a distance, in the main direction), you receive all the signals in sync, and the waves add up to the original strength. When you look at the antenna from an angle, the distance between you (your phone) and each dipole varies slightly. That makes a small shift of the waves from each dipole relative to each other. The overlay of these waves in the receiving antenna reduces the energy received. So you have a strong wave in the direction perpendicular to the antenna, and a weak signal (or no signal) from the side.
A dish antenna like the ones used for satellite TV has the same effect, but much stronger. Mobile networks cannot use such a strong directional effect.
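The in-sync addition described above can be illustrated numerically. This sketch is my own toy model (ideal isotropic elements at half-wavelength spacing, not the geometry of any real base station antenna), summing the phase-shifted contributions of the elements:

```python
import cmath
import math

def array_factor(n_elements, angle_deg, spacing_wavelengths=0.5):
    """Normalized magnitude of the far-field sum of n in-phase elements.

    angle_deg is measured from broadside (the direction perpendicular
    to the antenna). 1.0 means all waves add fully in phase.
    """
    psi = 2 * math.pi * spacing_wavelengths * math.sin(math.radians(angle_deg))
    total = sum(cmath.exp(1j * psi * n) for n in range(n_elements))
    return abs(total) / n_elements

broadside = array_factor(8, 0)   # all 8 waves arrive in sync
off_axis = array_factor(8, 20)   # small relative shifts partly cancel
```

For 8 elements, the broadside value is 1.0, while 20 degrees off axis the sum drops to roughly a fifth, which is exactly the directional effect described in the reply.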

 
Richard Corfield
21 April 2020 at 11:05

Worth noting that the higher power towers will be on the lower frequencies, those that until recently were used for TV. If you look at the information for our local TV transmitter you see that it’s putting out about 500,000W in total! These phone towers are really small in comparison. You’d not want to climb a TV transmitter tower while it was turned on! But we accept TV as comforting, old, technology.

 
Gary Lewis, New Zealand
18 May 2020 at 00:13

Good morning Kurt.
Congratulations on a well balanced and sane presentation. I thought you deflected some fairly nasty personal attacks with dignity and grace.
Here in New Zealand, we have experienced criminals setting fire to 8 cell towers in the last two days, but our police will almost certainly track them down and prosecute.
Kind regards to you.

 
Dr Kurt Behnke
26 May 2020 at 08:12

Thank you very much. And sorry to hear that. The same is happening in a few places across Europe and North America. It is a shame to see how far conspiracy belief can drive people.

 
Matt
27 May 2020 at 18:46

Hello Dr
Thank you for answering these questions regarding 5G, I have enjoyed reading the full page.
Would you be so kind as to possibly address a couple of things I have recently come across regarding this technology?
There are currently concerns circulating regarding 5G potential interactions with oxygen molecules.
The story is that when 5G towers are on, the waves can potentially spin oxygen molecules, hindering the body's ability to absorb oxygen as effectively as normal.
This is a concern for anyone with any kind of lung problems or low immune system.
Secondly, there are a number of articles that make reference to a Norwegian study done, demonstrating a negative effect on different sweat glands on the body, causing bacteria that is largely immune to drug treatments.
Thank you for your time.
Regards. Matt

 
Josh Higgs
7 June 2020 at 18:48

Finally some common sense, free of conspiracy and anecdotes. Thank you Kurt for taking the time to explain this, and thanks Google for putting it in my search results.

 
Bob Hay
12 June 2020 at 04:24

Thank you for your excellent comments. I teach a grad engineering class in RF design and your analysis is spot on. I also spent an hour listening to a bunch of arguments against permitting 5G base stations in our community and was amazed by how many conspiracy theories there are about the dangers posed by these base stations. People use big technical words they do not understand, things like "pulsed energy", which, of course, is pretty much the same as in 4G. As you point out, anyone worried about the effects of RF energy should immediately stop using cellphones, WiFi, Bluetooth, and, for that matter, microwave ovens. The radiation from 5G base stations is minuscule in comparison.

 
Art
24 June 2020 at 10:49

Can you please explain the derivation of your equation? There are several limitations of the Hata model that should be considered: it is based on test results from carrier frequencies of 150 MHz to 1500 MHz, base station distances from 1 km to 20 km, base station antenna heights (hb) from 30 m to 200 m, and mobile antenna heights (hm) from 1 m to 10 m. It is noted that the Hata model is not suitable for micro-cell planning where the antenna is below roof height, and its maximum carrier frequency is 1500 MHz; it is also not valid for 1800 MHz and 1900 MHz systems. At the higher frequencies and distances used by 5G technology it may well be the case that the equation is also not valid, and in addition the urban/suburban environments in Japan are likely to be rather cluttered compared to other parts of the world.

 
Kurt Behnke
16 June 2021 at 17:51

I don't want to go into detail. You are right; the original Okumura-Hata covers only a limited amount of spectrum. Since then, about 20 different models have been established, basically covering all of the usable microwave spectrum. After all, this is needed to do reasonable network planning. All the formulas that I know have the Okumura-Hata structure: Attenuation = A + 10·B·log10(distance), where A collects all the distance-independent terms like wavelength, tower height, environment, …, and B is the propagation coefficient, a number between 2 and 4. For 5G high frequencies (note: only a small fraction of 5G comes at high frequencies), B is typically very small (close to 2, where 2 means direct line of sight, no reflection or shadowing), but A is very large. For long wavelengths, like in the UHF range (400 – 800 MHz), you will have B = 3.6 and an A of low order.

This was not a post on radio network planning. You need to understand that I picked a rather typical figure, rather than going through all the options (which would have taken a book).
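The generic A + 10·B·log10(d) structure described in the reply is easy to experiment with. A small sketch (the A values below are illustrative placeholders of my own, not taken from any planning standard):

```python
import math

def path_loss_db(d_m, a, b):
    """Okumura-Hata-shaped model: A + 10*B*log10(d).

    a bundles the distance-independent terms (frequency, antenna
    heights, environment); b is the propagation coefficient (2..4).
    """
    return a + 10 * b * math.log10(d_m)

# mmWave-like: small B (near line of sight) but large A;
# UHF-like: larger B but a small A.
mmwave_200m = path_loss_db(200, 62, 2.0)   # ~108 dB
uhf_200m = path_loss_db(200, 30, 3.6)      # ~113 dB
```

Despite the very different coefficients, the two models give comparable losses at 200 m, which illustrates why both the exponent B and the offset A matter.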

 
L
24 June 2020 at 13:27

1. You are comparing mobile phones vs base stations, and you neglect the fact that base stations cause non-stop chronic radiation exposure. Base stations often cause exposures > 1 V/m, which is comparable to mobile phone radiation at a distance of a few cm, and the exposure durations are incomparable.
2. You say: "The impact of phone radiation since the early 2000s has dramatically gone down." You are talking about short-term exposure. What about long-term exposure from base stations?
3. You are missing the point because you are not an expert on the biological effects of RF radiation and you are referencing a small group of controversial ICNIRP/WHO experts. Read this: https://www.emfscientist.org (over 250 experts), and this: http://www.orsaa.org/uploads/6/7/7/9/67791943/orsaa_submission_to_icnirp.pdf

 
Kurt Behnke
16 June 2021 at 17:31

Honestly: I am just shaking my head here.
1. I have made an explicit comparison in another comment just a minute ago; please go and search. Base stations do not matter when it comes to the total impact of electromagnetic radiation.
2. I am talking about long-term effects, too. Mobile towers radiate in total only a tiny fraction of what they did in the 1990s, mainly due to very fine-grained power control. The electricity bill is one of the biggest items in network operational cost, and the industry is keen on not wasting electricity. 5G will bring another factor of 1/100 in energy efficiency: thanks to modern electronics, 5G can control the transmission power of each single bit.
3. Your emfscientist.org list is well known. It contains all the conspiracy sources that have been around since the early 1990s, and only a fraction of them could call themselves "experts", at least in this field. Some of them have been openly criticized for completely unscientific work. There is not a single publication from this group that has ever appeared in a leading journal, and in one case a person on the list was fired by his university after his assistant confessed that their measurement results had simply been made up. The EMF database at RWTH Aachen University (emf-portal.de) lists over 30,000 publications on the subject, which makes this one of the best-studied fields in history when it comes to possible impacts on humans; EMF is far better understood than any chemical. Antonio Guterres has probably done the only sensible thing he could do: put the appeal right into the trash. Nothing has been heard of it since. The authors only try to make themselves look more important than they really are.

 
steve zl1
28 June 2020 at 07:24

Hi Kurt. I'm interested, in layman's terms, in a figure for the wattage or microvolts an average cell tower transmits, i.e. the near-field radiation maximum, whether in front of the beam at ground level at an optimum distance or directly below the tower. I did see a figure above, but can't be sure that's the maximum. The comments above seem to have gone off track a bit about the light bulb thing; you could take it or leave it, not that important. No one seems to argue with the rest of your statement, which is the topic.

 
Kurt Behnke
16 June 2021 at 18:05

The ICNIRP gives recommendations, and most countries follow them. The allowed values in electrical field strength are frequency dependent; they range from 38 V/m (800 MHz) to 61 V/m (2600 MHz and beyond). The European Commission decided early on that it would follow ICNIRP. The higher value for higher frequencies corresponds to the stronger attenuation: the high-frequency signal strength goes down much faster with distance. In North America the limits are proposed by IEEE and set by the FCC; they are 47.5 V/m for 900 MHz up to (again) 61 V/m for 2600 MHz and beyond.

All measurements need to be taken in a standardized manner, in a short distance from the antenna, in antenna main direction.

To link this to power: 50 V/m corresponds to about 6.6 W/m² in intensity, i.e. 38 dBm (logarithmic value) per square meter.
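For readers who want to reproduce that conversion: for a plane wave the intensity is S = E²/Z₀, with Z₀ ≈ 377 Ω the free-space wave impedance and E taken as an RMS value. A quick Python sketch:

```python
import math

Z0 = 377.0  # free-space wave impedance in ohms

def field_to_intensity(e_rms):
    """Plane-wave power density in W/m^2 from an RMS field in V/m."""
    return e_rms ** 2 / Z0

def intensity_to_dbm(s_w_per_m2):
    """Express an intensity (per square meter) in dBm."""
    return 10 * math.log10(s_w_per_m2 * 1000)

# The reference levels quoted above:
s_50 = field_to_intensity(50)   # ~6.6 W/m^2, ~38 dBm per m^2
s_61 = field_to_intensity(61)   # ~9.9 W/m^2, ~40 dBm per m^2
```

The 61 V/m limit thus corresponds to roughly 10 W/m², consistent with the ICNIRP reference levels for the higher mobile bands.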

 
Jonathan
4 July 2020 at 15:01

Oh trust me when I say when these anti-science crowds win their respective 5G NIMBY wars the next thing they would whine about is how 5G sucks due to the lack of coverage which THEY caused to begin with.

There’s no cure for stupid.

 
sse
6 July 2020 at 09:23

What about the cost of converting your 4G grid to 5G? And which countries gain the most from this bazaar? I think these are the main questions that don't appear here.

 
Kurt Behnke
16 June 2021 at 17:43

Upgrading a 4G network to 5G is a major undertaking. It usually goes through an intermediate step, where the 4G nodes stay in control of the 5G transmitters on the same mast, and all traffic goes through the 4G core network. Only when an area is fully deployed with 5G, and the (cloud-based) core network is ready to roll, are things switched over. That is currently happening, area by area, in a number of countries (Switzerland, Germany, Finland, Sweden).

The upgrade of the radio nodes will be a simple one, as long as the 4G node was delivered by a major supplier and not before, say, 2018; in that case it is a software upgrade. Older nodes may need what is called a "fork lift upgrade" – you understand that term, I guess. Backhaul and backbone also need to be upgraded (more capacity), and of course data centers for the cloud-based 5G core need to be built or established.

The cost depends on the size of the network, of course. A small to medium network has 1,000 to 3,000 radio nodes, larger ones (such as in Germany or France) have some 30,000 radio nodes, and large countries (China, US, India) have several hundred thousand radio nodes. So there are way too many factors to provide a meaningful answer, but I would guess it ranges from millions to billions of US dollars per network.

Apart from 5G being beneficial to every operator there is, due to its wider set of applications and business opportunities, the beneficiaries are of course the network infrastructure suppliers: Ericsson (Sweden), Nokia (Finland), Huawei and ZTE (China), Samsung (Korea), NEC (Japan). Plus the smartphone makers, which you all know: each 5G phone is a new phone, and the total phone business is many times bigger than the network infrastructure business.

 
Manfred Wagner (Electronics engineer ret.)
2 August 2020 at 15:50

Thank you for your efforts. Great work and clear explanations. And I'm with Jonathan: all attempts to inform the "stupid" fall on deaf ears, attract personal abuse and are topped off with silly comments like "it causes Covid-19 to propagate, it's the source of the untraceable mystery cases". The major problem with this being the facts, i.e. these cases also appear in places where there is NO 5G.

 
jB
11 August 2020 at 02:19

In regard to the comment that there have been no long term effects from EMF radiation in general, and cell phone and cell tower radiation in particular, I suggest you read the book The Invisible Rainbow by Arthur Firstenberg.
It is well documented and quite informative in regard to the subject of this blog.

 
Tim Dyson
11 August 2020 at 05:08

Dr Kurt Behnke, all I can say is you have the patience of a saint. I may have missed it in the text or comments but what I think would be interesting would be a comparison between the exposure due to typical distribution 5G and one’s typical home WiFi. Different frequencies and transmission strength I know, but, given that most of the angst is over remote transmission exposure and not the device in the hand (weird, very weird) it would be interesting to see the wattage exposure difference between 5G and home Wifi.

 
Owen Jones
14 March 2021 at 09:10

Thank you very much Dr Behnke. I agree, some absolutely excellent explanations here which are very helpful indeed. A comparison of received power from WiFi vs 4G/5G would be interesting. Some years ago I was installing a WiFi network in an office in Hong Kong but ran into trouble because the local press was running scare stories about WiFi access points frying people's brains. We ended up having to install the APs invisibly above the ceiling grid, which placed them in a Faraday cage. The result was that the users complained constantly about poor WiFi connectivity while, paradoxically, the APs ran constantly at maximum power, which probably meant that the users were receiving a higher radiation dose than if we had simply placed the APs in the optimum locations.

 
Kurt Behnke
17 March 2021 at 16:18

That is actually a frequently asked question, also on other fora. A few things are pretty obvious:
– WiFi uses pretty much the same frequency ranges (on slightly different bands, of course).
– These WiFi bands are not exclusive to WiFi; other technologies use them as well (like Bluetooth). They are called ISM bands (Industrial, Scientific, Medical) in most countries and are open for everyone, according to the regulatory proposal of the International Telecommunication Union that most countries have turned into national legislation.
– Free for everyone to use means that everyone has to "behave"; otherwise it would end in total chaos, and no one would really benefit. That's why there are side conditions, set by national regulators, that differ slightly from country to country and from band to band. For WiFi this essentially means that the output power of all transmitters is limited by law, and this applies to access points and devices equally.
– there are a couple of different schemes for the different frequency ranges:
– WiFi 2.4 GHz: in Europe the transmitter output power (EIRP, so including all antenna gains) is limited to 100 mW. In decibels this is 20 dBm (decibel-milliwatts). Since most of the received power arrives by direct line of sight (you are in the same room and can see the access point), the attenuation is about 42 dB at 1 m distance, 48 dB at 2 m, 56 dB at 5 m and 62 dB at 10 m from the transmitter.
Which means that the received power is between -22 dBm (1 m) and -42 dBm (10 m). That is a little higher than the received power from a public 4G/5G network (where you typically get values from -50 dBm downwards), but certainly not significantly different. The main impact is still from your device, which sits beside you and transmits 100 mW, which gets down to -36 dBm when the radiation hits you. There is no transmit power adaptation in WiFi 2.4 GHz (as there is in 4G and 5G).

– WiFi 5 GHz: in principle the power story is the same, with a few variations. The WiFi spectrum at 5 GHz is split into two bands and three parts in total. 5150 MHz – 5250 MHz: transmission power is 200 mW, thus double that of WiFi 2.4, but the attenuation is 4 times stronger, so the received power is lower by 3 dB at all distances; use of this band is strictly restricted to indoors. 5250 MHz – 5350 MHz: again transmission is limited to 200 mW, but only if your device has transmission power control (I'll explain this further down); if not, then it is 100 mW, like WiFi 2.4. The upper band from 5470 MHz to 5725 MHz allows a whopping 1000 mW, under the assumption that transmit power control is installed. The upper band also requires Dynamic Frequency Selection from all devices. TPC and DFS are checked by the authorities before they approve the distribution of devices.

So what is TPC and what is DFS? Well, the point is here that the 5GHz frequencies are very congested. Professional services, like Weather radar, traffic radar etc. are using the frequencies. The general rule of authorities is: you may use those bands, but you are a low priority user. You may under no circumstances disturb the professional services, but you have to accept, that they disturb you. TPC reduces the power as much as possible, given the received power values from the other side (this is similar to power control in 4G/5G). DFS changes the frequency channel as soon as it “senses” other radiation on your channel. Which may be a professional service.

So all in all: WiFi is pretty much the same story as mobile networks, when it comes to radiation. The impact (if any) comes from the device near you, and not from the access point. And all power values in question are below 0.1% of a MilliWatt.
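The 2.4 GHz numbers above can be reproduced with a short sketch (assuming, as in the reply, a line-of-sight loss of about 42 dB at 1 m plus inverse-square spreading):

```python
import math

def wifi_rx_dbm(tx_dbm, d_m, loss_at_1m_db=42.0):
    """Received power: transmit power minus line-of-sight attenuation,
    modeled as a fixed 1 m loss plus 20 dB per decade of distance."""
    return tx_dbm - (loss_at_1m_db + 20 * math.log10(d_m))

# 100 mW access point = 20 dBm EIRP
rx_1m = wifi_rx_dbm(20, 1)    # -22 dBm
rx_10m = wifi_rx_dbm(20, 10)  # -42 dBm
```

Both values match the figures given in the reply, and both sit in the same general range as the -50 dBm and below typical of a public 4G/5G downlink.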

 
 
Johnny J
17 December 2020 at 21:53

I have 2 follow-up questions on your article above:

A) You note that electromagnetic radiation from a light bulb 10 m away is millions of times higher. How so? It looks like you have not factored in that white light has a frequency of ~500 THz on average, and attenuates a lot faster than a comparable millimeter wave. My calculations using free-space propagation tell me that a 100 W light bulb should result in ~ -110 dBm of power 6 ft away, which is way below your numbers.

B) Most use cases (video streaming, browsing, etc.) are downloads: the base station transmitting to the mobile phone. A 5G base station (macro or small cell) is continuously transmitting to serve hundreds if not thousands of users streaming videos or downloading data, unlike a mobile phone, which is mostly idle unless you are actively using it. While the power from the phone is higher, it is a lot less exposure to EM radiation compared to a base station that is always transmitting.

What am i missing here?

 
Kurt Behnke
22 December 2020 at 12:51

Are you trying to tell me that I cannot see a light bulb positioned at 2 m distance on a table, with no obstructions blocking the way? That is what your free-space propagation calculation suggests. You claim that I have to factor the frequency into the light bulb equation. My response is: no, you are not allowed to do that, simply because light propagation, like all electromagnetic free-space propagation, goes simply as 1/r².
The Friis formula that you are referring to is strictly for radio frequencies; that is the way it was discovered and derived. Friis combined two things in his seminal paper: the inverse-square law and the experimental value of the aperture of a point-shaped antenna (as the limit value of a series of shorter and shorter dipoles). 100 watts at 2 m distance gives a power density of about 2000 mW/m², which means that your body is "hit" by about 2000 mW = 33 dBm, and your eye surface, which I approximate by 10 cm², will get 1/1000 of that amount: 3 dBm.
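The light-bulb arithmetic is plain inverse-square spreading and can be checked in a few lines (the 10 cm² eye aperture is the same rough approximation used in the reply):

```python
import math

def intensity_w_per_m2(p_watts, r_m):
    """Isotropic inverse-square spreading: S = P / (4*pi*r^2)."""
    return p_watts / (4 * math.pi * r_m ** 2)

s = intensity_w_per_m2(100, 2)       # ~2 W/m^2 at 2 m from a 100 W bulb
eye_area_m2 = 10 * 1e-4              # 10 cm^2 converted to m^2
eye_mw = s * eye_area_m2 * 1000      # ~2 mW through the eye aperture
eye_dbm = 10 * math.log10(eye_mw)    # ~3 dBm
```

The result reproduces the 3 dBm figure above, orders of magnitude above the -110 dBm the Friis formula would (wrongly, for a light bulb) predict.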

Just one more remark on the non-applicability of Friis' formula to high-frequency (non-antenna-based) radiation: if you took your remark seriously, then all X-rays, gamma rays and cosmic rays would have a reach of a few millimeters only. X-rays would then be completely useless, and gamma rays would not be dangerous, due to their extremely short reach.

Your second remark I have actually dealt with before. Just a few words: there is a fundamental difference between radioactive radiation and electromagnetic radiation. In the radioactive case (alpha and beta), the radiation consists of particles carrying a lot of energy. Each of these particles is capable of destroying a chemical bond and creating a free radical in human tissue. Thus even at low intensity, exposure time matters, since the number of hits accumulates over time.
For electromagnetic waves, there is a relatively sharp boundary at the border between visible light and UV. Everything with a frequency higher than visible light consists of photons with an individual energy of more than 1 electron volt. Such photons are able to kick electrons in molecules and atoms out of their positions, break chemical bonds and create ions and free radicals in tissue, just like radioactive particles. So there you have damage done by individual photons, which can be very harmful at high intensity, but which also accumulates over time even at low intensity.
Radio-frequency electromagnetic waves are in the low gigahertz range: anything between 500 MHz and 5 GHz, with a planned extension up to 50 GHz (the Ka-band used for satellite communication today). That is about a million times less energy per photon, and it has been shown again and again that the only effect these photons have is a thermal one. When a photon of such frequency and energy hits a molecule, the molecule starts to oscillate a tiny bit faster, but loses this energy very quickly to its neighborhood (an effect called dissipation). Only if the photons come in dense showers, as in a microwave oven, can the heating effect be large enough to damage tissue by overheating. There is simply no accumulation over time, due to heat dissipation.
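The per-photon comparison can be made concrete with E = h·f. A sketch with rounded physical constants (the 500 THz and 2.45 GHz frequencies are representative picks of mine, not values from the comment):

```python
H = 6.626e-34    # Planck constant in J*s (rounded)
EV = 1.602e-19   # joules per electron volt (rounded)

def photon_energy_ev(freq_hz):
    """Energy of a single photon of the given frequency, in eV."""
    return H * freq_hz / EV

visible = photon_energy_ev(500e12)   # ~2 eV: enough to affect chemistry
wifi = photon_energy_ev(2.45e9)      # ~1e-5 eV: far too weak to ionize
ratio = visible / wifi               # a few hundred thousand
```

Against the 500 MHz lower end of the radio range the factor reaches a full million, which is the order of magnitude quoted above.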

We have all been exposed to the high-band frequencies for decades already. A lot of satellites transmit in this frequency range, including satellite TV and GPS. If there were an accumulating effect, we should have seen some of it by now (and we clearly haven't). And of course there is no evidence of any health problem created by mobile phone networks through cell tower radiation.

 
Possy from egypt
26 December 2020 at 11:07

I am from Egypt. I live on the ninth floor, and there is a network antenna directed at the house at exactly the same horizontal level, fifty meters away. In front of me are more than five towers at a distance of two hundred meters, also at the same horizontal level. And unfortunately, most unfortunately, on the north side there is a roof with more than twenty antennas on the same horizontal plane.
Is this dangerous?
Note that there are no buildings higher than mine in any direction.

 
Kurt Behnke
29 December 2020 at 13:02

Hi there! Greetings to Egypt (I assume you live in Cairo or Alexandria). I stayed in Cairo from 2008 to 2011, working for Ericsson as regional head of service business for North Africa. And I can tell you: we did have problems finding sites. We installed for Vodafone and Etisalat. One of the problems there is that you have a lot of "unofficial" buildings: there is no building permit, hence no limit on the number of antennas put there. And since many of these buildings have no electricity supply, you have to put a diesel genset at ground level. That, together with the fuel tank, creates more risk for the neighborhood than all the talk about EMF.

But to your question: 50 m is pretty close, but still not dangerous. Outside your flat, at a maximum transmission power of 20 watts and a common 2×12 antenna shape, you will see a signal strength of about −10 dBm. That equates to 1/10 of a milliwatt per square meter, and it dominates all the other sources by orders of magnitude. Inside, count on 10 dB of additional attenuation; if your house is concrete, it is at least 15 dB. That gives you −20 to −25 dBm, which is 1/100 to 1/300 of a milliwatt.
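The numbers above can be reproduced as a rough link-budget sketch. The transmit power (20 W) and the wall attenuation (10 to 15 dB) are quoted in this thread; the ~17 dBi antenna gain and the ~70 dB path loss at 50 m are assumptions chosen to match the quoted −10 dBm figure, not official values:

```python
import math

# Rough link budget for the 50 m example (assumed values: 20 W transmit
# power, ~17 dBi panel-antenna gain, ~70 dB path loss at 50 m, 15 dB
# wall attenuation for a concrete building).

def dbm(p_milliwatt: float) -> float:
    """Convert a power in milliwatts to dBm."""
    return 10 * math.log10(p_milliwatt)

tx_dbm = dbm(20_000)     # 20 W = 43 dBm
antenna_gain_db = 17     # assumed typical sector panel
path_loss_db = 70        # approximate figure for ~50 m
wall_db = 15             # concrete wall penetration loss

outside = tx_dbm + antenna_gain_db - path_loss_db
inside = outside - wall_db
print(f"outside: {outside:.0f} dBm, inside: {inside:.0f} dBm")
# roughly -10 dBm outside and -25 dBm indoors
```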
I don't quite understand the picture of the north side. Are the antennas directed away from the house? The backplane of the rectangular antenna housing is made of reflecting material, which acts as an almost perfect shield. If the situation is not like that, please tell me. I understand that living 5 m away from an antenna is not a good feeling, but it is more of a psychological issue, unless you live right in the center of the beam a few meters away. Nothing is impossible in Cairo.

 
Alex
8 January 2021 at 17:37

Greetings Kurt Behnke,
Thank you very much for all the great information. I have a question regarding the maximum allowable transmission power, in dBW or dBm, for 4G and 5G base stations.
You took the value of 50 dBm for 5G and 43 dBm for 4G, but according to the 3GPP document:
https://www.etsi.org/deliver/etsi_ts/138100_138199/138104/16.04.00_60/ts_138104v160400p.pdf
page 48 says that the wide area BS does not have a limit on output power!
Searching further, I found that the low frequency bands in the FR1 NR region will be transmitted by the wide area BS; you can see that in Table 6.6.4.2.1 and Table 6.6.4.2.2 on page 61, under unwanted emissions.
Could you please give me a document link for the maximum transmission power of 4G and 5G BS? I need this for my work.
Many thanks in advance.

 
Kurt Behnke
13 January 2021 at 16:00

Hi there Alex,

The absolute limits on power output are not in the realm of ETSI, ITU, or 3GPP. What they do in the reference you sent is separate the smaller base stations (local area, medium range) from the full-size (wide area) nodes, and thereby define the smaller types. The power limits for base stations and base station sites are set in the legislation of each state/nation, but they are broadly aligned across all countries I know.

I tried to find a more suitable reference (such as the US), but I don't know which country you are in, or which language those documents would be in. Let me give the German reference, just as a typical example. The source is the official network regulator, the Bundesnetzagentur, acting on behalf of the federal government (the equivalent of the FCC in the US). The link is below:

https://www.bundesnetzagentur.de/DE/Sachgebiete/Telekommunikation/Unternehmen_Institutionen/EMF/emf-node.html

For your convenience I have translated into English / the German source is cited below:
LTE 800 MHz: 38 V/m, converted into a power density of 3.9 W/m²
GSM 900 MHz: 41 V/m, converted into 4.6 W/m²
GSM/LTE 1800 MHz: 58 V/m, converted into 9.0 W/m²
UMTS/LTE 2600 MHz: 61 V/m, converted into 10.0 W/m²
5G 3600 MHz: 61 V/m, converted into 10.0 W/m²

Original (German):

LTE 800 MHz 38 V/m umgerechnet in Leistungsflussdichte 3,9 W/m2
GSM 900 MHz 41 V/m umgerechnet in Leistungsflussdichte 4,6 W/m2
GSM/LTE 1800 MHz 58 V/m umgerechnet in Leistungsflussdichte 9,0 W/m2
UMTS/LTE 2600 61 V/m umgerechnet in Leistungsflussdichte 10,0 W/m2
5G 3600 MHz 61 V/m umgerechnet in Leistungsflussdichte 10,0 W/m2

http://www.gesetze-im-internet.de/bemfv/__3.html
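The conversion in the table above can be reproduced from the field strength alone: for a plane wave in free space, the power density is S = E²/Z₀, where Z₀ ≈ 377 Ω is the impedance of free space. A short sketch (the band labels are copied from the table; small differences from the official figures are down to rounding):

```python
# Convert regulatory field-strength limits (V/m) into power density
# (W/m^2) using S = E^2 / Z0 for a free-space plane wave.

Z0 = 376.73  # ohms, impedance of free space

def power_density(e_volts_per_m: float) -> float:
    """Plane-wave power density in W/m^2 for a given field strength."""
    return e_volts_per_m ** 2 / Z0

for band, e in [("LTE 800", 38), ("GSM 900", 41),
                ("GSM/LTE 1800", 58), ("UMTS/LTE 2600", 61),
                ("5G 3600", 61)]:
    print(f"{band}: {e} V/m -> {power_density(e):.1f} W/m^2")
```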

Best regards

Kurt

 
Possy from egypt
13 January 2021 at 15:39

Greetings from Egypt
I would like to thank you very much for the response
To clarify: there are about five towers two hundred meters away, and I live in a building surrounded by them; most are at the same horizontal level, to my right and in front of me.
To the right are two towers; one site hosts four operators.
I have firm information that they operate at 80 power.
The one closest to me is 50 meters away.
Counting them, I find more than twenty antennas pointing towards the area from all directions.
I am very scared because I have babies.
Is there any solution or shielding I can use?
How do I calculate the amount of energy that reaches me if each antenna emits 80 power?
Thank you so much.
God be with you.

 
Kurt Behnke
13 January 2021 at 16:19

Greetings.
What is "80 power"? Power is measured in units of watts, milliwatts, or decibels referenced to a watt or milliwatt. It could also be an electric field strength (V/m). Without knowing that, it is difficult to answer.

If I assume 80 watts for a moment, that's 80,000 milliwatts or 49 dBm. Just for the calculation. In reality what you probably have is 4×20, since 20 watts is a power limit commonly agreed on by national regulators. That is true for Egypt, as you know that I know. No supplier (not the Chinese vendors, nor Ericsson) delivers a radio unit with 80 watts of output power; it doesn't make sense for them, since it couldn't be sold anywhere.

At 50 m distance from the antenna the path loss is in the range of −70 decibels (approximately), a factor of 1/10,000,000 of the original value; electromagnetic radiation falls off extremely quickly with distance. 49 − 70 = −21 dBm, which is roughly 1/100 of a milliwatt. Even with 20 or so base stations in your neighborhood you would still see something in the range of a few tenths of a milliwatt. Your phone transmits about a thousand times that signal strength.

I know it sounds extreme, with all those sturdy metal structures in sight. The towers are needed to carry the wind load of the antenna assembly, and the antenna assembly is needed to hold the transmitting dipoles at a spacing that allows them to generate a directed signal.

Best regards

Kurt

 
Jaspreet Kaur
16 June 2021 at 12:33

what is the best height and location to place 5G antennas?

 
Kurt Behnke
16 June 2021 at 18:19

The answer to your question depends a lot on environment and purpose; there is no general rule, and it is always a compromise. The rule of thumb is usually "the higher the better", since height avoids reflections (for example off the ground) and shadowing. In rural areas this is what matters most.

In urban environments, you have to take what you get, since you end up mostly on rooftops. Attenuation also matters less there, because you want a certain density of sites in order to reach the needed capacity. That means your cells need to be small: a few hundred meters in inner cities. This even forces the network operator to reduce the transmission power to a minimum, in order to avoid interference between different sites.

Network planning is a highly specialized and difficult job. Network planners are among the highest-paid people in the industry; they bring experience, a lot of training, and highly sophisticated tools, which include models of ground material, foliage, settlement density, hills, and water in a detailed three-dimensional map.

I have never done detailed network planning myself. What I have done is "nominal network planning", where you assess the number of sites to be built in a certain area. This is usually based on uplink considerations (phones need to get through to the tower antenna), and tower height is one of your variables. The goal is to develop a business case for a rollout. When the network planners come in later, they usually shake their heads and start their own work, but on the whole the business case usually stands up: some points don't need a tower, elsewhere you need more than one base station. We usually worked out cases with average tower heights between 20 m and 40 m. Note that every meter of tower costs a four-digit amount of initial investment.

 
Ivan
5 July 2021 at 12:37

Dear Kurt,
Thanks for the article and for taking the time to answer all the questions. I am afraid I have to partially disagree with you about the levels of RF radiation that penetrate buildings, which you mentioned in some of the answers above. While your calculation makes perfect sense, I am afraid the reality is often different. Concrete does partially decrease the amount of radiation that penetrates a building, but there are also windows, which do very little or nothing in that regard; also, not all walls are made of concrete. Nowadays lighter materials are used, and it would be interesting to see where they stand in terms of RF isolation.
People react to this topic out of fear for their health and the health of their children. As soon as you say "lock yourself in a concrete cage and you should be fine", people are alarmed, and with good reason. Radiation pollution is getting out of control, and it is about to get even worse considering the number of newly announced antennas to support the 5G network.
I have a pretty powerful new router installed in my home. The RF measuring device maxes out instantly in close proximity to it, but as soon as I get 2 m away the RF level falls to 0.0018 mW/m², while I still have excellent Wi-Fi performance in my entire flat. On the other hand, the antenna tower near my flat gives me 2,500 mW/m², with constant peaks of over 20,000 mW/m², and that is through the walls from some 40 m away: 1,000 to 10,000 times the router's radiation. I am genuinely concerned about the wellbeing of my family. I am well aware of the official recommendations, and that those numbers fall under them, but here is the thing: no expert will recommend putting a baby to sleep next to a router, while many experts tell you that 10,000 times more powerful radiation is acceptable.
Here is what I think, and a question for you. Radiation from the antenna falls off dramatically on the other side of my apartment, which perfectly proves your point from the beginning. My opinion is that progress cannot be stopped, but the way we are doing it is wrong: instead of denying the problem we should put more effort into battling it. This leads nowhere. I am going to install very expensive radiation shielding in the walls of all the rooms that face the antenna, because I do not wish to risk my kids' health on something that no experts have reached a consensus on yet. I suppose many of my neighbors will do the same, because they all have children. This will leave us with a lot of wasted money, but it will also reduce the efficiency of the installed antenna; by how much I don't know, but it could be considerable, because the building is right in front of the antenna.
So here is the question: What level of RF radiation would you feel comfortable having in your children's bedroom, and what would be the best and safest option for the future when it comes to building antennas, in terms of reducing the radiation? The antenna solution near my building seems ridiculous: it is some 35-40 m high, while the buildings around it are 30 m high. If the radiation falls off so dramatically inside my apartment, what remains of the signal when it passes through? And if what remains is enough, why not build higher antennas, so they are further away from both people and obstacles, and make them less powerful in the first place? In my parents' home, for example, I measure 0.000 mW/m² with peaks up to 0.014, and they have absolutely no issues using their phones. Would a bigger number of less powerful antennas be the solution, following this logic? Thank you.

 
Kurt Behnke
5 July 2021 at 13:32

Can I give you some honest advice? Trash this "toy-store" measurement instrument. It is worth nothing and only serves to confuse you. We have done measurements on behalf of authorities and network operators to assess both the field attenuation and the indoor penetration of houses of different types, windows and all included.

The typical wall penetration figure is 15 dB (the median across a sample of several hundred buildings, rural, urban, and dense urban, across Germany), with a standard deviation of 6 dB. That is what we confirmed through our measurements, taken with a $25k high-quality device from Rohde & Schwarz, the world's leading test and measurement equipment company; I insist that my figures are correct. I see these crap measurements all over the place: people buy a kit from an internet store and believe they have accurate information that way. We had cases over here where litigation was based on such measurements, and they crumbled miserably in court once the operator insisted that professional measurement equipment be used. It is an irresponsible business to sell such things (and I believe a lot of the "protection" stuff, like shielding, is the true business model behind these conspiracy stories).

To give you a concrete argument why your instrument cannot be accurate: if, as you say, your tower is 30-40 m high and stands 40 m from your flat or house, then the line-of-sight distance to the antenna is about 50 m. With pure line-of-sight attenuation (no wall penetration considered), I get an attenuation factor of 0.0000002, corresponding to −66 dB. If you measure 2,500 mW/m², that is, 2.5 W/m², then the tower must be pumping out about 10 megawatts of radiation. That's a bit ridiculous, don't you think? In reality, even at 50 m, pure line-of-sight attenuation does not fully apply, so the attenuation is a bit stronger; add at least 9 dB of wall penetration on top, and the tower would need to produce about 100 megawatts per antenna. You would need a power plant per tower.
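The back-of-the-envelope check above can be written out directly. Following the same rough logic as the reply (treating the −66 dB figure as an overall attenuation factor applied to the measured density), the implied transmitter power comes out around 10 MW:

```python
# Working backwards from the claimed 2.5 W/m^2 at ~50 m line of sight,
# with the ~-66 dB attenuation figure quoted in the reply.

attenuation = 10 ** (-66 / 10)  # about 2.5e-7
claimed = 2.5                   # W/m^2, as read off the gadget
implied_source = claimed / attenuation
print(f"implied transmit power: {implied_source / 1e6:.0f} MW")
# around 10 MW, versus the ~20 W a real base station transmits
```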

 
Ivan
6 July 2021 at 15:44

Thanks for the quick reply, Kurt; I really appreciate it. I see now that I put the decimal point the way the device shows it: it is actually 2.5 mW/m², not 2,500, with peaks at 20 mW/m². As I said, I am aware that this figure falls under the regulations, but my concern remains, as the figure is still 1,000 to 10,000 times higher than the radiation I measure in other parts of the flat and in my parents' house. The instrument I used is one of the popular solutions that goes for around 250 EUR. I know it is not completely accurate; it is clearly stated that it measures with ±20% accuracy, but even if it were ±200% it would still be indicative, and it is consistent in showing higher radiation values in the rooms directly exposed to the antenna. You didn't tell me what amount of radiation in mW/m² you would be comfortable with in your own home. I see in your resume that you worked for Ericsson; I have colleagues at Ericsson who are currently setting up a 5G network in Luxembourg and who have also worked on projects in Germany, France, Spain, India, and other countries, and none of them could answer that question.

 

 
