Okay, so, I’ve been messing around with antennas for a while now, specifically for LTE stuff, and I gotta say, figuring out the right length for different bands can be a real pain. But, you know, after a lot of trial and error, I’ve kinda got the hang of it, and I thought I’d share what I’ve learned in case anyone else is pulling their hair out over this.
So, first off, I started by digging into how to calculate this stuff. The basic idea is that the length of your antenna is tied to the wavelength of the frequency you’re using. I found this formula: λ (that’s wavelength, by the way) equals the speed of light divided by the frequency. It sounds complicated, but the speed of light is always the same, around 300,000 kilometers per second. So you can simplify the formula to λ = 300 / f, where f is the frequency in MHz and λ comes out in meters. Pretty straightforward, right?
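Just to make that concrete, here’s a tiny Python sketch of the formula (the function name is just something I made up for this post):

```python
def wavelength_m(freq_mhz):
    """Wavelength in meters: lambda = 300 / f, with f in MHz."""
    return 300.0 / freq_mhz

# e.g. a 500 MHz signal has a wavelength of about 0.6 m
print(wavelength_m(500))  # 0.6
```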
Once I had the wavelength, I learned that ideally you want your antenna to be half that length (λ/2). But the antenna gets fed in the middle at the connection point, so you split that in half again, and each side (each leg of the dipole) ends up being a quarter of the wavelength (λ/4). That’s what most people online seem to recommend, anyway.
For example, I was working with a 700 MHz frequency for LTE Band 13. Instead of the metric formula, I used the old ham-radio shortcut for dipoles, 468 divided by the frequency in MHz, which gives you the half-wave length in feet with the usual shortening factor already built in: 468 divided by 700 gave me roughly 0.668 feet, or about 8 inches. But remember, that’s for a half-wave antenna. For a quarter-wave, which is what I needed for each side, I divided that by 2, ending up with about 4 inches. I built one and tried it out, and surprisingly the signal was pretty good, so I can say the formula really works. And if you’re working with a 500 MHz system, the same math puts a quarter-wave element at about 6 inches. So you can see: the higher the frequency, the shorter the wavelength, and the shorter the antenna is going to be.
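If you want to sanity-check those numbers yourself, here’s the rough Python I’d use (the 468/f version is the classic ham shortcut that spits out the half-wave length in feet with the roughly 5% shortening already baked in; the function names are just mine):

```python
def half_wave_inches(freq_mhz):
    """Half-wave element length in inches, via the 468/f (feet) shortcut."""
    return 468.0 / freq_mhz * 12

def quarter_wave_inches(freq_mhz):
    """Quarter-wave length in inches: half the half-wave element."""
    return half_wave_inches(freq_mhz) / 2

for f in (700, 500):
    print(f"{f} MHz: half-wave ~{half_wave_inches(f):.1f} in, "
          f"quarter-wave ~{quarter_wave_inches(f):.1f} in")
# 700 MHz: half-wave ~8.0 in, quarter-wave ~4.0 in
# 500 MHz: half-wave ~11.2 in, quarter-wave ~5.6 in
```

That 5.6 inches for 500 MHz is the slightly shortened practical length; the pure free-space quarter wavelength is closer to 5.9 inches, which is why I rounded it to "about 6 inches" above.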
Now, here’s where I messed up a few times. I thought, “Oh, I’ll just use a super long cable to get the antenna outside.” Turns out, that’s a bad idea. From what I gathered, you really want to keep those cables as short as possible, definitely under 10 meters. Longer cables mean more signal loss, and that just defeats the whole purpose. So I ended up rearranging my setup to minimize the cable length. I also learned not to put omni-directional antennas too close to each other; I tried to keep them at least one wavelength apart, and that made a noticeable difference.
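To put an actual number on that spacing rule of thumb, here’s the quick check I do (just reusing the λ = 300 / f math from above):

```python
def min_omni_spacing_m(freq_mhz):
    """Rule of thumb: keep omni antennas at least one wavelength apart."""
    return 300.0 / freq_mhz

# At 700 MHz, one wavelength is about 0.43 m, so roughly 43 cm of separation
print(round(min_omni_spacing_m(700), 2))  # 0.43
```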
Then, after I learned the basic rules and formulas, I started reading some articles with practical examples. I found a study that used a Quasi-Yagi antenna design for LTE applications. I think that’s some pretty high-tech stuff, but the basic wavelength calculation is the same as what I did.
The gist of it:
- Figure out the frequency you’re working with.
- Use the formula λ = 300 / f (f in MHz) to get the wavelength in meters.
- Divide the wavelength by 4 to get the ideal length for each side of your antenna.
- Keep those cables short!
- For omni-directional antennas, make sure they are at least one wavelength apart.
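And if it helps, here’s a little all-in-one Python sketch that runs through that checklist for a given frequency (same simple formulas as above, nothing fancy, and the names are just my own):

```python
def antenna_cheat_sheet(freq_mhz):
    """Print the basic antenna numbers for a given frequency in MHz."""
    wavelength_m = 300.0 / freq_mhz           # lambda = 300 / f
    quarter_wave_cm = wavelength_m / 4 * 100  # ideal length for each side
    quarter_wave_in = quarter_wave_cm / 2.54
    spacing_m = wavelength_m                  # omni antennas: at least 1 wavelength apart
    print(f"{freq_mhz} MHz:")
    print(f"  wavelength        ~{wavelength_m:.2f} m")
    print(f"  each side (1/4 λ) ~{quarter_wave_cm:.1f} cm (~{quarter_wave_in:.1f} in)")
    print(f"  omni spacing      at least ~{spacing_m:.2f} m")

antenna_cheat_sheet(700)  # the LTE Band 13 example from above
```

This one uses the plain free-space numbers, so the lengths come out a touch longer than the 468/f shortcut from earlier, which trims them down by roughly 5%.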
It took me a while to get this down, with lots of tweaking and testing. But once I figured out the basics and started paying attention to cable lengths and antenna spacing, things started to click. I wouldn’t call myself an expert, but I’m definitely getting better results now. Hopefully, this little write-up helps someone else avoid some of the headaches I went through!