This blog is managed on GitHub, and not only the markdown but also all the images are versioned with git. The problem with managing binary files like images in git, however, is size.
GitHub also recommends keeping repositories under 1GB:

> Keep your repository small; ideally under 1GB, and definitely under 5GB. The smaller your repository, the faster cloning and maintenance will be. Individual files are strictly limited to 100MB. For details, see "Working with large files".
So, for blogs or content sites that use a lot of binary files, it's ideal to keep images in a storage service like S3 and only reference them from git. But that's a hassle, right? You want to manage the images together with everything else. That means using something like Git LFS. But that's also a hassle.
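For reference, the Git LFS setup being dismissed here is only a few commands. This is a minimal sketch; the file patterns are just examples, and the real friction tends to come later (LFS storage quotas, every collaborator needing the LFS client installed):

```shell
# One-time per machine: install the LFS filters into your git config.
git lfs install

# Tell LFS to handle image files instead of storing them as git objects.
git lfs track "*.png" "*.jpg"

# The tracking rules are written to .gitattributes, which must be committed.
git add .gitattributes
git commit -m "Track images with Git LFS"
```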
Isn't there an easier way? After some research, I found that AWS CodeCommit is a good candidate.
It's especially excellent in terms of high availability and durability:

> AWS CodeCommit stores your repositories in Amazon S3 and Amazon DynamoDB. Encrypted data is redundantly stored across multiple facilities, ensuring high availability and durability.
The CodeCommit documentation itself also says:

> With CodeCommit, you can store anything from code to binaries.
So you really can commit anything. It seems to use S3 under the hood, so it probably handles binary data well (I don’t know the details).
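Once set up, a CodeCommit repository works like any other git remote over HTTPS. A minimal sketch, assuming the AWS CLI is already configured with an IAM user that has CodeCommit permissions; the repository name `blog` and the region `us-east-1` are placeholders:

```shell
# Let git obtain short-lived credentials from the AWS CLI on each request.
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true

# Create the repository (or do this in the AWS console instead).
aws codecommit create-repository --repository-name blog --region us-east-1

# From here on it is ordinary git: clone, commit images, push.
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/blog
```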
The downside compared to GitHub is that you can’t manage issues, but being able to manage binary data together is a big plus.
If you’re struggling with binary data, why not consider CodeCommit as an option?