One thing I noticed while repeatedly running docker build against a Dockerfile I was developing is that re-downloading the same packages on every build gets old fast. Run the build, wait, something isn't right, tweak the Dockerfile, run the build again, and oh look, it's downloading all those packages again. I'm sure everyone who plays with Dockerfiles knows exactly what I'm talking about.
Then I was thinking about working on my laptop, possibly without an internet connection, and it hit me: that would make it basically impossible to work on the Dockerfiles I wanted to. That's when I remembered apt-cacher-ng. You see, I'd been meaning to set up an apt-cacher-ng server (or equivalent) for my home network for a while; I just never got around to it. But thinking about how to get around this potential problem finally pushed me to try apt-cacher-ng with docker.
Long story short, I set up a quick apt-cacher-ng server (in a docker container, of course) and configured another container to use it for apt. I installed a bunch of packages, deleted the container, then went to install the same packages again. It was amazing: on the second install, the download phase finished almost instantly. No waiting whatsoever.
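In case you want to try it, here's a minimal sketch of what that setup looks like. The image tag `my-apt-cacher-ng` and the `172.17.0.1` bridge address are assumptions; use whatever image you built and whatever address your containers can reach the cache on. apt-cacher-ng listens on port 3142 by default.

```
# 1. Run the cache server in its own container
#    (my-apt-cacher-ng is a hypothetical image name -- build your own).
docker run -d --name apt-cacher -p 3142:3142 my-apt-cacher-ng

# 2. In the Dockerfile you're iterating on, point apt at the proxy
#    BEFORE any apt-get lines, so every install goes through the cache.
#    172.17.0.1 is the usual docker0 bridge address on the host; adjust as needed.
cat > Dockerfile <<'EOF'
FROM ubuntu
RUN echo 'Acquire::http::Proxy "http://172.17.0.1:3142";' \
    > /etc/apt/apt.conf.d/01proxy
RUN apt-get update && apt-get install -y build-essential
EOF
docker build -t test-image .
```

The first build downloads and caches the debs; subsequent builds (or other containers using the same proxy line) pull them from the cache instead of the internet.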
I'd publish an apt-cacher-ng docker image, but to do it right you'll probably want to use volumes to keep the cached debs outside of your image, and it's really nothing difficult anyway. There's also a decent example on the docker website: http://docs.docker.com/examples/apt-cacher-ng/.
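The volume part is one flag. A sketch, assuming the same hypothetical image name as before; `/var/cache/apt-cacher-ng` is apt-cacher-ng's default cache directory, and `/srv/apt-cacher` is just an example host path:

```
# Keep the cached debs on the host, so they survive the cache
# container being deleted or rebuilt.
docker run -d --name apt-cacher -p 3142:3142 \
    -v /srv/apt-cacher:/var/cache/apt-cacher-ng my-apt-cacher-ng
```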
So if you've been complaining about re-downloading packages all the time, give this a try.