Simple but important updates, like, say, openssl, become massive distro updates. There's a reason everyone went with shared libraries once they became stable.
A system with shared libraries also needs an update for security fixes; there is no avoiding the update step. The only difference is the size of the update. If the update process is robust, the size of the update shouldn't matter, should it?
The matter isn't the binary size of the update. It's the number of people and parties involved in the update: for a bunch of statically linked applications, you need to involve all of the application maintainers/vendors. For a bunch of dynamically linked applications depending on a shared library, you (ideally) just need to involve that one library's maintainers/vendor.
And "involve" might mean waiting for a bunch of unresponsive external parties, each on their own timeline, to produce a new binary for a security issue they might not even care about.
It does matter. If there is a problem with openssl, only the openssl maintainers have to push an update, and everything on your system is secure again.
If everything is statically linked, you have to wait for the maintainer of every single program on your system to rebuild and push an update. You're basically guaranteed that there is always _something_ missing important patches.
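To make this concrete: on Linux you can see which binaries would automatically pick up a shared-library fix by inspecting their runtime dependencies with `ldd`. A rough sketch (the paths and the `libssl` soname are assumptions; adjust for your distro):

```shell
# A dynamically linked binary lists the shared libraries it loads at
# runtime; a statically linked one reports "not a dynamic executable".
ldd /bin/ls

# List binaries in /usr/bin that depend on the shared OpenSSL library.
# Every program printed here is fixed by a single libssl update; any
# statically linked equivalent would need its own rebuild and release.
for f in /usr/bin/*; do
    ldd "$f" 2>/dev/null | grep -q 'libssl' && echo "$f"
done
```

Distro package managers automate the other direction too, e.g. `dnf repoquery --whatrequires openssl-libs` or `apt-cache rdepends libssl3`, which is exactly the bookkeeping that breaks down once applications bundle their own copies.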