Much ado was made this week of Facebook launching a solar-powered drone to bring the internet to all. It may sound like a noble goal, but beware of the physics.
Facebook’s push is another example of Silicon Valley stepping outside its realm of expertise to try to do something that probably won’t work. Google is attempting the same thing, but with balloons instead of drones. The idea is this: build lots of tiny aircraft and send them everywhere in order to relay the signal to people who don’t have it. Keep in mind that Facebook and Google are both held together by people whose predominant expertise is writing computer programs…
Now, the reason I call this set of ideas tilting at windmills is simple: they are trying things somewhat at random, hoping the technology will catch up just in time for implementation, without thinking about what making it work will mean for the world around them. Silicon Valley is trying desperately to roll out the 5G wireless standard in the next five years. Here is a list of the fundamental criteria for 5G, taken from Wikipedia:
- Data rates of tens of megabits per second for tens of thousands of users
- 1 gigabit per second simultaneously to many workers on the same office floor
- Several hundreds of thousands of simultaneous connections for massive wireless sensor network
- Spectral efficiency significantly enhanced compared to 4G
- Coverage improved
- Signalling efficiency enhanced
- Latency reduced significantly compared to LTE
Sounds kinda wonderful, doesn’t it? Got to admit, downloading a whole movie in seconds to minutes instead of hours would be wicked nice!
Now, the trick here is how. How do you do this?
The way that the physics of light interfaces with electronics is strongly tied to clock speed in a computer, which is how fast your processor makes calculations (after a fashion). As transistors have gotten smaller, the energy needed to switch them has gone down, and switching rates have gone way up in conjunction with the number of transistors present. Modern 4G technology interfaces with light at carrier frequencies of 2.5 GHz and 5 GHz (roughly 5 billion switchings a second), using modulation bands 40 MHz to 60 MHz wide around those frequencies in order to transfer the data. With compression routines based on frequency convolution, this microwave light can transfer between 5 and 12 million bits per second, with bursts as high as 100 million bits per second. 5G will supposedly do 100 to 1,000 times that: on the order of 12 billion bits per second. In order to handle this enhancement of speed, 5G will be broadcast at microwave frequencies of something like 40 to 60 GHz.
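The link between channel bandwidth and achievable bit rate above can be sketched with the Shannon–Hartley capacity theorem. This is a back-of-envelope illustration, not a model of any real 4G or 5G modem: the signal-to-noise ratio is an assumed value, and real systems fall well short of the Shannon ceiling.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed SNR of 100 (20 dB) -- purely illustrative, not a measured value.
snr = 100

# A 40 MHz channel, roughly the 4G-era modulation band cited above:
c_4g = shannon_capacity_bps(40e6, snr)

# A 400 MHz channel, the kind of contiguous bandwidth that only exists
# up at the proposed millimeter-wave 5G frequencies:
c_5g = shannon_capacity_bps(400e6, snr)

print(f"40 MHz channel:  {c_4g / 1e6:.0f} Mbit/s ceiling")
print(f"400 MHz channel: {c_5g / 1e9:.2f} Gbit/s ceiling")
```

The point of the sketch: at a fixed SNR, capacity scales linearly with bandwidth, and the wide channels needed for gigabit rates are only available at much higher carrier frequencies.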
One thing all these wonderful stats aren’t telling you about is a physical phenomenon called penetration depth. As light travels through any medium, the intensity of the wave is gradually absorbed by that medium and reduced until the signal effectively disappears. As you drive away from a radio station, the 1/r^2 reduction of power with distance and the atmospheric attenuation set by penetration depth eventually conspire to make your radio reception go away. For the broadcast frequencies used by radio stations, that distance is a couple hundred miles. Such stations have good range in large part because they don’t need to transfer horrifyingly large quantities of data: everybody gets the same broadcast, and that broadcast doesn’t need a very high frequency to carry the information required. AM radio sits at only hundreds of kHz (hundreds of thousands of switchings per second, roughly 10,000-fold slower than 4G).
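The two loss mechanisms in that paragraph, geometric spreading and absorption, can be combined in a few lines. The transmitter power and attenuation length below are assumed, illustrative numbers, not measured atmospheric values:

```python
import math

def received_intensity(power_w, distance_m, atten_length_m):
    """Free-space 1/r^2 spreading multiplied by exponential
    atmospheric absorption: I(r) = P / (4*pi*r^2) * exp(-r / L)."""
    spreading = power_w / (4 * math.pi * distance_m ** 2)
    absorption = math.exp(-distance_m / atten_length_m)
    return spreading * absorption

# Illustrative only: a 50 kW AM transmitter with an assumed
# attenuation length of 300 km at its broadcast frequency.
for km in (10, 100, 300):
    i = received_intensity(50e3, km * 1e3, 300e3)
    print(f"{km:>4} km: {i:.3e} W/m^2")
```

At short range the 1/r^2 spreading dominates; the exponential absorption term is what eventually kills the signal outright, and its length scale L is the frequency-dependent part that the rest of the argument turns on.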
Now, 4G is exactly the same thing as old-timey radio, except that everybody with a smartphone gets their own unique broadcast and then broadcasts signals back to the tower in return. Everybody is their own TV station. That, of course, is why 10,000-fold faster was needed.
Now, penetration depth for a 4G station is different from that of a 200 kHz radio station. As it turns out, penetration depth is (to a rough approximation) inversely proportional to frequency, so the penetration depth of a 4G signal in the atmosphere is about 1/10,000th that of an AM radio signal. Receivers have gotten more sensitive since the AM radio revolution, but the physics conspires to limit reception distances for a 4G signal to just a couple of miles. Literally, once you are out of direct line of sight of a 4G cell tower, the signal is gone. This is why there are now ugly cellphone-tower trees everywhere you look!
The signal doesn’t go very far, so the antennas need to be everywhere.
5G is going to sit at roughly 10-fold higher frequency than 4G. That means 5G has roughly 1/10th the penetration depth of 4G. This changes miles into hundreds of yards.
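Under the rule of thumb used here, that penetration depth scales as 1/frequency, the whole chain from AM to 4G to 5G is simple arithmetic. (The carrier values below are the representative numbers from the text; with a 40 GHz carrier the 5G ratio works out to 16 rather than the round factor of ten.)

```python
# Representative carrier frequencies from the text (Hz).
f_am = 200e3   # AM radio: hundreds of kHz
f_4g = 2.5e9   # 4G: ~2.5 GHz
f_5g = 40e9    # proposed 5G: 40-60 GHz millimeter wave

# Under the depth-proportional-to-1/f rule of thumb:
print(f"4G penetration depth vs. AM: 1/{f_4g / f_am:,.0f}")  # prints 1/12,500
print(f"5G penetration depth vs. 4G: 1/{f_5g / f_4g:,.0f}")  # prints 1/16

# If ~200 miles of AM range shrinks to a couple of miles at 4G, then
# another factor of 10-16 turns those miles into hundreds of yards.
```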
In order to support 5G, there will need to be many, many more antennas: think a factor of ten in range, and even more in sheer count, since each antenna covers a smaller area. You can’t get around it physically. You will need one in your home and probably one directly in your car.
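A naive area-coverage argument suggests the antenna multiplier could be even larger than a factor of ten, since cutting the range in half quadruples the number of discs needed to tile the same area. The service area and the per-technology ranges below are assumptions for illustration, not engineering figures:

```python
import math

def antennas_needed(area_km2, range_km):
    """Antennas required to tile an area, assuming each one covers a
    disc of radius equal to its usable range (ignores overlap/terrain)."""
    per_antenna_km2 = math.pi * range_km ** 2
    return math.ceil(area_km2 / per_antenna_km2)

city = 1000.0  # km^2: an assumed city-sized service area

n_4g = antennas_needed(city, 3.0)   # "a couple miles" of 4G range
n_5g = antennas_needed(city, 0.3)   # "hundreds of yards" of 5G range

print(f"4G towers: {n_4g}")
print(f"5G sites:  {n_5g}")
print(f"ratio:     {n_5g / n_4g:.0f}x")
```

Because coverage area goes as range squared, a tenfold drop in range pushes the site count up by roughly a hundredfold in this toy model.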
Now then, what does this mean for our merchants of signal from above? Google and Facebook simply cannot deliver 5G service directly from their balloons or drones. Do you really think anybody can support the infrastructure to float a balloon or drone every couple hundred yards, no higher than a couple hundred yards, everywhere, all the time? For one thing, helium is a precious commodity that is ultimately scarcer than fossil fuels. 5G will certainly happen, but is this really the way we want it implemented?
I think I personally would be out skeet shooting. People will eventually cry “enough.” Do we really need Silicon Valley doodads flying everywhere, 24/7? Get ready for it… people will try it.