7 replies
September 2022

mmorasch

Really enjoyed reading it, thanks for taking the time to do the write-up!

One question I have is: how did you install Caddy? Did you build from source? As you correctly saw spikes when Go’s garbage collector was running: when building with Go 1.19 or later, you have the option to specify GOMEMLIMIT, a soft RAM limit that tells the garbage collector when to run. As your target server has 4GB of RAM, my assumption is that this would improve performance quite a bit in the more CPU-intensive tests.
On the other hand, I am very impressed that nginx almost never oversteps 32MB of memory used.
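For reference, a minimal sketch of setting that limit when running Caddy (GOMEMLIMIT is a real Go 1.19+ runtime knob; the 3GiB value here is just an illustrative choice for a 4GB host, leaving headroom for the OS):

```shell
# One-off run with a soft memory limit for the Go runtime:
GOMEMLIMIT=3GiB caddy run --config /etc/caddy/Caddyfile

# Or, for a systemd-managed Caddy, in a drop-in unit file:
#   [Service]
#   Environment=GOMEMLIMIT=3GiB
```

The runtime then tries to keep total Go heap usage under the limit by running the collector more aggressively as it is approached, rather than on the default pacing alone.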

1 reply
September 2022 ▶ mmorasch

tylerjl

Thank you for taking the time to read it! :smile:

My system under test was a NixOS 22.05 machine, which builds Caddy using this package. Nix is sort of funky to read, but it’s built with go 1.17. Maybe in the future I can tweak the build environment to see what kind of difference that change might make, but I’d certainly be curious as well.
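In case it’s useful to anyone else on Nix: a hypothetical overlay sketch for rebuilding the nixpkgs caddy package against a newer Go toolchain so GOMEMLIMIT support is compiled in. The attribute names (`buildGoModule`, `buildGo119Module`) follow common nixpkgs conventions, but the exact override may need adjusting for a given nixpkgs revision:

```nix
# Overlay: swap the Go toolchain used to build caddy.
final: prev: {
  caddy = prev.caddy.override {
    buildGoModule = prev.buildGo119Module;
  };
}
```

This only changes the compiler; the GOMEMLIMIT value itself would still be set in the service’s environment at runtime.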

October 2022

eva2000

Thanks for sharing your benchmarks. In the past I’ve done some Nginx vs Caddy benchmarks and noticed that as you add more response headers to requests, Caddy’s performance drops more than Nginx’s does (see links below). So I’ve always been mindful to test with real-world response headers, or equivalents, for both web servers.

So I’m curious: in your proxy testing with the latest Caddy versions, does the same performance overhead still exist as you add more response headers?
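For anyone wanting to reproduce that variable, a minimal sketch of a Caddy test site that tacks on extra response headers (the header names and values are made up purely for benchmarking; `header` is Caddy’s standard directive for this):

```
# Caddyfile
:8080 {
	header X-Bench-1 "value-1"
	header X-Bench-2 "value-2"
	respond "ok"
}
```

The nginx equivalent would use `add_header X-Bench-1 "value-1";` lines in the matching `server` or `location` block, so both servers emit the same set of headers per response.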

Looking forward to more of your tests in future :slight_smile:

1 reply
October 2022 ▶ eva2000

tylerjl

Performance measurements that include response headers are something I haven’t tested yet. Since I published this research I’ve received lots of helpful input about additional variables to test, and headers are one of the variables I think would be helpful to include. I’ve recorded your suggestion here for if and when I do another round of tests :slightly_smiling_face:

1 reply
October 2022 ▶ tylerjl

eva2000

Cheers much appreciated :slight_smile:

September 2024

ethereal-engineer

Signed up/in just to say thank you for your incredibly thorough and detailed piece. Although it is now two years old, it still helps educate and explain. I, too, love Caddy, having used nginx and apache before it for many years. This helps me understand and gives me an improved perspective to know how to ask the right questions of performance of my reverse proxy setup. So again, thank you! GREAT work.

November 2024

Nilankar

Wow, the benchmark is great.
Nice blog, thank you very much!