Most distros force you to dynamically link every dependency if you want them to package your software, so the default build for most projects is dynamic.
You've stumbled on a holy war between distros and guys like you, me, and Linus Torvalds[1] who want to deploy a binary and just have it work everywhere.
Sure, this is understandable when we're building an OS.
But here we are using Docker so we have full control over the application we are building. Why this craziness of having these huge images containing who knows what?
Is it just because we can? And the cloud providers like us when we do it?
It mostly happened because we carried over the previous assumptions, practices, and limitations when moving to containers.
I agree with you and the parent commenter that this should be the default, but some people are against static linking, even in cases where dynamic linking provides no advantage.
The naive solution, building and shipping with the full golang image, is nearly 1GB. Why carry this extra complexity around?
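For comparison, here's a minimal sketch of the multi-stage approach: build a statically linked binary in the full golang image, then copy just that one file into an empty `scratch` image. It assumes a single `main` package at the module root and a hypothetical binary name `/app`; adjust paths and the Go version tag to taste.

```dockerfile
# Stage 1: build in the big golang image (it never ships).
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
# CGO_ENABLED=0 forces a fully static binary with no libc dependency,
# so it can run in an image that contains nothing else.
RUN CGO_ENABLED=0 go build -o /app .

# Stage 2: the final image is just the binary, typically a few MB.
FROM scratch
COPY --from=build /app /app
ENTRYPOINT ["/app"]
```

The final image contains only your binary, so there's no "who knows what" left to audit.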