Skip upload of known LFS objects to forks
Release notes
The client should not be required to upload already known LFS objects to forks.
Problem to solve
As a developer working from home on a typical asymmetric home Internet connection, I want to use the workflow of pulling from the original repository and pushing to a fork in order to use Merge Requests.
When the fork is initially created, all existing LFS objects are made available to it. On the client side, one fetch from the fork is needed to make them known locally: git-lfs inspects the remote branches to learn which LFS objects are already known to the (forked) repository.
All LFS objects created after the initial fork, however, are pushed to the fork from the client. This is a huge issue for home Internet connections with asymmetric speeds. On a DSL connection with 16 Mbit/s downstream and 1 Mbit/s upstream, data that takes 4 minutes to download from the original repository takes over an hour to push to the fork.
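As a back-of-the-envelope check (assuming the nominal link speeds above and ignoring protocol overhead), the asymmetry alone accounts for the difference:

```python
# Rough transfer-time estimate for the DSL link described above.
# Assumed figures; ignores protocol overhead and competing traffic.

DOWNSTREAM_MBIT_S = 16     # downstream bandwidth in Mbit/s
UPSTREAM_MBIT_S = 1        # upstream bandwidth in Mbit/s
DOWNLOAD_MINUTES = 4       # observed download time for the LFS objects

payload_mbit = DOWNSTREAM_MBIT_S * DOWNLOAD_MINUTES * 60   # megabits moved
payload_mbyte = payload_mbit / 8                           # ~480 MB
upload_minutes = payload_mbit / UPSTREAM_MBIT_S / 60       # ~64 minutes

print(f"payload: ~{payload_mbyte:.0f} MB, upload: ~{upload_minutes:.0f} min")
```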
GitLab currently tells the client to upload the objects even though it already has them, and then throws the uploaded data away (disk usage does not increase).
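The negotiation happens over the git-lfs batch API: before transferring any data, the client announces the objects it intends to upload and the server replies with per-object actions. A rough sketch of today's exchange (the OID, size, and URL are invented for illustration) looks like this:

```python
# Sketch of the git-lfs batch API exchange during a push to the fork.
# The OID, size, and URL below are invented for illustration.

# Client -> server: POST <fork>.git/info/lfs/objects/batch
batch_request = {
    "operation": "upload",
    "transfers": ["basic"],
    "objects": [
        {"oid": "a3b5f2...c9", "size": 503316480},  # ~480 MB object
    ],
}

# Server -> client, current behaviour: an "upload" action is returned even
# though the object is already known, so the client transfers the full
# payload only for the server to discard it.
batch_response_today = {
    "transfer": "basic",
    "objects": [
        {
            "oid": "a3b5f2...c9",
            "size": 503316480,
            "actions": {
                "upload": {"href": "https://gitlab.example.com/.../a3b5f2...c9"},
            },
        },
    ],
}
```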
Intended users
People working from home on repositories with large files.
User experience goal
The user should be able to create merge requests from home by pushing to a forked repository that uses LFS, without having to spend hours uploading redundant data.
Proposal
According to https://github.com/git-lfs/git-lfs/issues/4362, the git-lfs client offers LFS objects through the batch API, and the server may tell the client that an object is already known, in which case the client skips the upload. When the client offers an LFS object that is already known to the forked repository or to its parent, GitLab should tell the client not to upload it.
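A minimal server-side sketch of the proposed behaviour follows; lfs_object_exists, fork_parent_of, and upload_url_for are hypothetical helpers standing in for whatever lookups GitLab actually uses, not real GitLab APIs:

```python
# Hypothetical sketch of the proposed batch API behaviour on the server.
# lfs_object_exists(), fork_parent_of(), and upload_url_for() are assumed
# helpers for illustration only; they are not actual GitLab functions.

def batch_upload_response(project, requested_objects):
    """Answer an 'upload' batch request, omitting the upload action for
    objects already known to the fork or to its fork parent."""
    parent = fork_parent_of(project)  # None if the project is not a fork
    response_objects = []
    for obj in requested_objects:
        known = lfs_object_exists(project, obj["oid"]) or (
            parent is not None and lfs_object_exists(parent, obj["oid"])
        )
        entry = {"oid": obj["oid"], "size": obj["size"]}
        if not known:
            # Only genuinely new objects get an upload action; objects
            # returned without actions are skipped by the git-lfs client.
            entry["actions"] = {
                "upload": {"href": upload_url_for(project, obj["oid"])}
            }
        # If the object is known only to the parent, the server could link
        # it into the fork's LFS storage here instead of asking for bytes.
        response_objects.append(entry)
    return {"transfer": "basic", "objects": response_objects}
```

The key point is that omitting the actions entry is already understood by the git-lfs client as "the server has this object, nothing to transfer", so no client-side change should be required.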
Further details
Permissions and Security
No changes to the permission model.
Documentation
This issue proposes a performance optimization (which would make forks with LFS objects practical to use), but does so in a way that is completely transparent to users.