Friday, September 3, 2021

Error while syncing the local git repository to remote repository | error: unpack failed: error Shallow object update failed | Shallow vs Full Cloning


I was using a Bitbucket Pipeline to sync the local repository (specific branches) to a remote git repository. The sync had been working without any issue, but it recently stopped working with the exception below.

bitbucket-pipelines.yml:


image: atlassian/default-image:2

pipelines:
    branches:
      dev:
        - step:
           script:
             - git remote add sync https://testuser:[email protected]/test/test.git
             - git checkout dev
             - git pull    
             - git push sync dev

      uat:
        - step:
           script:
             - git remote add sync https://testuser:[email protected]/test/test.git
             - git checkout uat
             - git pull    
             - git push sync uat 

Error:
 
"git push sync dev
error: unpack failed: error Shallow object update failed: The object xxxxxxxxxxxxxxx is being referenced but does not exist.
To https://testuser:[email protected]/test/test.git
 ! [remote rejected] dev -> dev (Shallow object update failed: The object xxxxxxxxxxxxx is being referenced but does not exist.)
error: failed to push some refs to 'https://testuser:[email protected]/test/test.git'"



After analysis, the root cause of the issue turned out to be that the Bitbucket Pipeline performs a shallow clone (with a specific depth) by default.

A shallow clone is a repository created by limiting the depth of the history that is cloned from an original repository. A shallow clone is created using the --depth option when calling the clone command, followed by the number of commits that you want to retrieve from the remote repository.
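
For example, a shallow clone that contains only the most recent commit of the dev branch could be created like this (using the same placeholder repository URL as in the pipeline above):

# Clone only the dev branch, with history truncated to the latest commit
git clone --branch dev --depth 1 https://bitbucket.org/test/test.git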

--depth <depth> - Create a shallow clone with a history truncated to the specified number of commits. Implies --single-branch unless --no-single-branch is given to fetch the histories near the tips of all branches. If you want to clone submodules shallowly, also pass --shallow-submodules.

A shallow clone helps improve clone performance by pulling down only the latest commits rather than the entire repository history.

In a full clone, git downloads the complete history of all branches by default. This can sometimes cause performance issues for large repositories, but it is required for use cases such as repository syncing.
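
A quick way to tell whether a working copy is a shallow or a full clone is to ask git directly (a small check that assumes Git 2.15 or later, run from inside the cloned repository):

# Prints "true" for a shallow clone and "false" for a full clone
git rev-parse --is-shallow-repository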

git clone --branch="dev" --depth 50 https://x-token-auth:[email protected]/$BITBUCKET_REPO_FULL_NAME.git $BUILD_DIR
Cloning into '/opt/atlassian/pipelines/agent/build



The issue can be resolved by enabling a full clone instead of a shallow clone in the pipeline configuration:

bitbucket-pipelines.yml:

image: atlassian/default-image:2

clone:
  depth: 'full'

pipelines:
    branches:
      dev:
        - step:
           script:
             - git remote add sync https://testuser:[email protected]/test/test.git
             - git checkout dev
             - git pull    
             - git push sync dev

      uat:
        - step:
           script:
             - git remote add sync https://testuser:[email protected]/test/test.git
             - git checkout uat
             - git pull    
             - git push sync uat 

With depth: 'full', the --depth option is no longer added to the clone command, and the pipeline completes successfully - the local branch is synced with the remote repository.
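
As an alternative, if changing the clone depth for the whole pipeline is not desirable, the shallow clone could also be converted to a full clone inside the step itself before pushing. This is only a sketch based on the dev step above, not part of the original pipeline:

- step:
    script:
      # Fetch the missing history so the default shallow clone becomes a full clone
      - git fetch --unshallow
      - git remote add sync https://testuser:[email protected]/test/test.git
      - git checkout dev
      - git pull
      - git push sync dev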


