

This is a possible philosophy for monorepo maintenance without submodules.

We want to avoid submodules because it is annoying to have to commit to two separate repositories every time you make a change that has a submodule and a non-submodule component.

Every directory with a Makefile or analogous should build and test itself.

Imagine if submodules were treated exactly like regular directories: just request a tree SHA, and a DNS-like mechanism resolves your request, looking first at your local ~/.git, then at closer servers (your enterprise's mirror / cache), and ending up on GitHub. External directories could then be relied upon only at specified versions, or you could track every file and subdirectory under them directly at their latest versions.

Imagine if GitHub allowed per-file / per-directory metadata like stars and permissions, so you could store all your personal stuff under a single repo. Imagine storing huge blobs directly in the repo without any ugly third-party extensions.

Until git starts supporting this natively (i.e. submodules that can track only subdirectories), we can support it with some metadata in a git-tracked file, where sha refers to the usual SHA of the entire external repository.
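Until then, a minimal sketch of what such a git-tracked metadata file could look like. The format and every field name below are entirely hypothetical, not an existing git feature:

```yaml
# external-dirs.yml (hypothetical format, for illustration only)
external:
  - path: third_party/libfoo     # where the directory lives in the monorepo
    repo: https://github.com/example/libfoo
    sha: <sha>                   # the usual SHA of the entire external repository
```

A build wrapper could read this file and fetch each external directory at its pinned SHA, giving submodule-like pinning without a second commit step.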
#Git clone branch from remote code
The dream: any directory can have web interface metadata. Imagine having the entire code base of your enterprise in a single monorepo, without ugly third-party tools like repo.
#Git clone branch from remote download
You can also download only certain files with the much more understandable:

```
git clone --depth 1 --filter=blob:none --no-checkout \
  https://github.com/cirosantilli/test-git-partial-clone-big-small
```

But that method for some reason downloads files one by one very slowly, making it unusable unless you have very few files in the directory.

Another less verbose but failed attempt was:

```
git clone --depth 1 --filter=blob:none --sparse \
  https://github.com/cirosantilli/test-git-partial-clone-big-small
```

But that downloads all files in the toplevel directory: the sparse-checkout step is unfortunately still needed.
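The toplevel-files behaviour of --sparse can be reproduced offline against a throwaway local repository (all repository names and file contents below are invented for the demo; uploadpack.allowfilter must be enabled on the serving repository for --filter to take effect over file://):

```shell
set -eu
tmp=$(mktemp -d)
cd "$tmp"

# "Server" repo: one toplevel file plus a subdirectory.
git init -q src
cd src
mkdir big
printf 'toplevel' > top.txt
printf 'nested'   > big/huge.txt
git add .
git -c user.email=demo@example.com -c user.name=demo commit -qm init
git config uploadpack.allowfilter true
cd ..

# The failed attempt from the text, run against the local repo:
git clone -q --depth 1 --filter=blob:none --sparse "file://$tmp/src" dst
cd dst
ls   # top.txt was checked out even though we only wanted a subdirectory
```

The listing shows top.txt but not big/, confirming that --sparse checks out all toplevel files by default.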
#Git clone branch from remote update
The --filter option was added together with an update to the remote protocol, and it truly prevents objects from being downloaded from the server.

TODO: also prevent download of unneeded tree objects. The above method downloads all Git tree objects (i.e. directory listings, but not actual file contents). We can confirm that by running:

```
git ls-files
```

and seeing that it lists the directories' large files, such as big/0. In most projects this won't be an issue, but the perfectionist in me would like to avoid them. I've also created a very extreme repository with some very large tree objects (100 MB) under the directory big_tree: let me know if anyone finds a way to clone just the small/ directory from it!
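The tree-objects claim can also be checked offline with a throwaway local repository (names invented for the demo): even when the worktree is restricted to small/, git ls-files still lists big/0, because blob:none downloads every tree object (directory listing) while skipping the file contents:

```shell
set -eu
tmp=$(mktemp -d)
cd "$tmp"

# "Server" repo with big/0 and small/0, echoing the test repositories.
git init -q src
cd src
mkdir big small
printf 'x' > big/0
printf 'y' > small/0
git add .
git -c user.email=demo@example.com -c user.name=demo commit -qm init
git config uploadpack.allowfilter true
cd ..

# blob:none skips file contents but still downloads every tree object.
git clone -qn --filter=blob:none "file://$tmp/src" dst
cd dst
git sparse-checkout set --no-cone small
git checkout -q

git ls-files   # lists big/0 and small/0: the directory listings are local
ls             # but only small/ was actually checked out
```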

git clone --filter + git sparse-checkout downloads only the required files.

E.g., to clone only the files in subdirectory small/ of this test repository:

```
git clone -n --depth=1 --filter=tree:0 \
  https://github.com/cirosantilli/test-git-partial-clone-big-small-no-bigtree
cd test-git-partial-clone-big-small-no-bigtree
git sparse-checkout set --no-cone small
git checkout
```

You could also select multiple directories for download with:

```
git sparse-checkout set --no-cone small small2
```

This method doesn't work for individual files, however, but there is another method that does.

In this test, the clone is basically instantaneous, and we can confirm that the cloned repository is very small as desired:

```
du --apparent-size -hs *
```

The test repository contains:

- a big/ subdirectory with 10x 10 MB files
- small/ and small2/ subdirectories with 1000 files of size one byte each
- 9 files on toplevel (because certain previous attempts would download toplevel files)

All contents are pseudo-random and therefore incompressible, so we can easily notice if any of the big files were downloaded: you would get 100 MB extra, and it would be very noticeable.

On the above, git clone downloads a single object, presumably the commit:

```
Cloning into 'test-git-partial-clone-big-small'...
remote: Counting objects: 100% (1/1), done.
remote: Total 1 (delta 0), reused 1 (delta 0), pack-reused 0
```

And then the final checkout downloads the files we requested:

```
remote: Enumerating objects: 3, done.
remote: Counting objects: 100% (3/3), done.
remote: Compressing objects: 100% (3/3), done.
remote: Total 3 (delta 0), reused 3 (delta 0), pack-reused 0
remote: Total 253 (delta 0), reused 253 (delta 0), pack-reused 0
```
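Since the test repository requires network access, the same mechanism can be verified end-to-end against a throwaway local repository (names and contents below are invented for the demo; uploadpack.allowfilter must be enabled on the serving repository for --filter to take effect over file://):

```shell
set -eu
tmp=$(mktemp -d)
cd "$tmp"

# Create a "server" repo with a big/ and a small/ directory.
git init -q src
cd src
mkdir big small
printf 'pretend this is huge' > big/b.txt
printf 'tiny'                 > small/s.txt
git add .
git -c user.email=demo@example.com -c user.name=demo commit -qm init
# Allow partial-clone filters over the file:// transport.
git config uploadpack.allowfilter true
cd ..

# The method from the text: filtered clone without checkout,
# then restrict the worktree to small/ only.
git clone -qn --filter=tree:0 "file://$tmp/src" dst
cd dst
git sparse-checkout set --no-cone small
git checkout -q

ls   # only small/ is present in the worktree
```

After the final checkout, small/s.txt exists in the worktree while big/ was never materialized, which is exactly the behaviour the test repository demonstrates over the network.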
