
Friday, September 3, 2021

Error while syncing the local git repository to remote repository | error: unpack failed: error Shallow object update failed | Shallow vs Full Cloning


I was using a Bitbucket Pipeline to sync the local repository (specific branches) to a remote git repository. The sync had been working without any issue, but it recently stopped working with the exception below.

bitbucket-pipelines.yml:


image: atlassian/default-image:2

pipelines:
    branches:
      dev:
        - step:
           script:
             - git remote add sync https://testuser:[email protected]/test/test.git
             - git checkout dev
             - git pull    
             - git push sync dev

      uat:
        - step:
           script:
             - git remote add sync https://testuser:[email protected]/test/test.git
             - git checkout uat
             - git pull    
             - git push sync uat 

Error:
 
"git push sync dev
error: unpack failed: error Shallow object update failed: The object xxxxxxxxxxxxxxx is being referenced but does not exist.
To https://testuser:[email protected]/test/test.git
 ! [remote rejected] dev -> dev (Shallow object update failed: The object xxxxxxxxxxxxx is being referenced but does not exist.)
error: failed to push some refs to 'https://testuser:[email protected]/test/test.git'"



After analysis, the root cause of the issue turned out to be that the Bitbucket Pipeline performs a shallow clone (with a specific depth) by default.

A shallow clone is a repository created by limiting the depth of the history cloned from the original repository. It is created by passing the --depth option to the clone command, followed by the number of commits to retrieve from the remote repository.
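
For example, a clone limited to just the most recent commit (the repository URL is a placeholder):

git clone --depth 1 https://bitbucket.org/workspace/repo.git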

--depth <depth> - Create a shallow clone with a history truncated to the specified number of commits. Implies --single-branch unless --no-single-branch is given to fetch the histories near the tips of all branches. If you want to clone submodules shallowly, also pass --shallow-submodules.

A shallow clone helps improve clone performance by pulling down just the latest commits instead of the entire repository history.

In a full clone, git downloads the complete history of all branches by default. This can sometimes cause performance issues, but it is needed for use cases such as repository syncing. In this case, the pipeline's clone step was performing a shallow clone with a depth of 50:

git clone --branch="dev" --depth 50 https://x-token-auth:[email protected]/$BITBUCKET_REPO_FULL_NAME.git $BUILD_DIR
Cloning into '/opt/atlassian/pipelines/agent/build
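
A quick way to confirm that a working copy is shallow (assuming Git 2.15 or later, which added this flag):

git rev-parse --is-shallow-repository   # prints "true" for a shallow clone, "false" otherwise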



The issue can be resolved by enabling a full clone instead of a shallow clone in the pipeline configuration.

bitbucket-pipelines.yml:

image: atlassian/default-image:2

clone:
  depth: 'full'

pipelines:
    branches:
      dev:
        - step:
           script:
             - git remote add sync https://testuser:[email protected]/test/test.git
             - git checkout dev
             - git pull    
             - git push sync dev

      uat:
        - step:
           script:
             - git remote add sync https://testuser:[email protected]/test/test.git
             - git checkout uat
             - git pull    
             - git push sync uat 

Now the depth option is no longer added during the clone, and the pipeline completes successfully - the local branch is synced with the remote repository.
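
As an alternative to configuring a full clone, the existing shallow clone could in principle be converted to a full clone inside the step itself before pushing - a sketch, not tested in this pipeline; note that git fetch --unshallow fails if the repository is already complete:

             - git fetch --unshallow origin
             - git push sync dev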



Saturday, June 22, 2019

Error extracting plugin descriptor: 'Goal: * already exists in the plugin descriptor for prefix: *

 Error extracting plugin descriptor: 'Goal: validate already exists in the plugin descriptor for prefix: test

I was getting the below error while installing a custom Maven plugin with mvn clean install.


org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-plugin-plugin:3.2:descriptor (default-descriptor) on project test: Error extracting plugin descriptor: 'Goal: validate already exists in the plugin descriptor for prefix: test
Existing implementation is: test.MyMojo
Conflicting implementation is: test.MyMojo'
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)

The configuration is shown below.

MyMojo.java

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;

/**
 * Goal which validate the content.
 * @goal validate
 * @phase VERIFY
 */

@Mojo (name="validate", defaultPhase=LifecyclePhase.VERIFY, requiresProject=false )
public class MyMojo extends AbstractMojo
{
    @Parameter (property="message", defaultValue="Default Message")
    private String message;

    public void execute() throws MojoExecutionException
    {
        System.console().writer().println(message);
    }
}

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>test</groupId>
  <artifactId>test</artifactId>
  <packaging>maven-plugin</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>test Maven Mojo</name>
  <url>http://maven.apache.org</url>
  
    <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
  
  <dependencies>
    <dependency>
      <groupId>org.apache.maven</groupId>
      <artifactId>maven-plugin-api</artifactId>
      <version>3.0</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
  <dependency>
      <groupId>org.apache.maven.plugin-tools</groupId>
      <artifactId>maven-plugin-annotations</artifactId>
      <version>3.4</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.maven</groupId>
      <artifactId>maven-core</artifactId>
      <version>3.2.5</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</project>

After analysis, the root cause is that the javadoc tag @goal validate and the @Mojo annotation (name="validate") both generate the same goal - validate - so descriptor generation fails with the duplicate exception.

/**
 * Goal which validate the content.
 * @goal validate
 * @phase VERIFY
 */

@Mojo (name="validate", defaultPhase=LifecyclePhase.VERIFY, requiresProject=false )

The build succeeds after removing the @goal validate tag from the javadoc comment.

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;

/**
 * Goal which validate the content.
 * @phase VERIFY
 */

@Mojo (name="validate", defaultPhase=LifecyclePhase.VERIFY, requiresProject=false )
public class MyMojo extends AbstractMojo
{
    @Parameter (property="message", defaultValue="Default Message")
    private String message;

    public void execute() throws MojoExecutionException
    {
        System.console().writer().println(message);
    }
}
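
With the duplicate gone, the installed plugin can be invoked directly as a quick smoke test. The fully qualified groupId:artifactId:version:goal form is used here because the short test prefix would otherwise need to be registered under pluginGroups in settings.xml (a sketch based on the coordinates in the pom above):

mvn clean install
mvn test:test:1.0-SNAPSHOT:validate -Dmessage="Hello from MyMojo"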

Tuesday, December 25, 2018

com.google.appengine.tools.admin.AdminException: Unable to stage app: Cannot get the System Java Compiler. Please use a JDK, not a JRE - Google SDK App Deployment

I was getting the below exception while deploying the application through the Google Cloud SDK command-line interface - gcloud app deploy appengine-web.xml.


Unable to stage:
java.lang.RuntimeException: Cannot get the System Java Compiler. Please use a JDK, not a JRE.
at com.google.appengine.tools.admin.Application.compileJspJavaFiles(Application.java:1297)
at com.google.appengine.tools.admin.Application.compileJsps(Application.java:1273)
at com.google.appengine.tools.admin.Application.populateStagingDirectory(Application.java:983)
at com.google.appengine.tools.admin.Application.createStagingDirectory(Application.java:875)
at com.google.appengine.tools.admin.AppAdminImpl.stageApplication(AppAdminImpl.java:539)
at com.google.appengine.tools.admin.AppAdminImpl.stageApplicationWithDefaultResourceLimits(AppAdminImpl.java:492)
at com.google.appengine.tools.admin.AppCfg$StagingAction.execute(AppCfg.java:2508)
at com.google.appengine.tools.admin.AppCfg.executeAction(AppCfg.java:363)
at com.google.appengine.tools.admin.AppCfg.<init>(AppCfg.java:211)
at com.google.appengine.tools.admin.AppCfg.<init>(AppCfg.java:118)
at com.google.appengine.tools.admin.AppCfg.main(AppCfg.java:114)
com.google.appengine.tools.admin.AdminException: Unable to stage app: Cannot get the System Java Compiler. Please use a JDK, not a JRE.
at com.google.appengine.tools.admin.AppAdminImpl.stageApplication(AppAdminImpl.java:543)
at com.google.appengine.tools.admin.AppAdminImpl.stageApplicationWithDefaultResourceLimits(AppAdminImpl.java:492)
at com.google.appengine.tools.admin.AppCfg$StagingAction.execute(AppCfg.java:2508)
at com.google.appengine.tools.admin.AppCfg.executeAction(AppCfg.java:363)
at com.google.appengine.tools.admin.AppCfg.<init>(AppCfg.java:211)
at com.google.appengine.tools.admin.AppCfg.<init>(AppCfg.java:118)
at com.google.appengine.tools.admin.AppCfg.main(AppCfg.java:114)
Caused by: java.lang.RuntimeException: Cannot get the System Java Compiler. Please use a JDK, not a JRE.
at com.google.appengine.tools.admin.Application.compileJspJavaFiles(Application.java:1297)
at com.google.appengine.tools.admin.Application.compileJsps(Application.java:1273)
at com.google.appengine.tools.admin.Application.populateStagingDirectory(Application.java:983)
at com.google.appengine.tools.admin.Application.createStagingDirectory(Application.java:875)
at com.google.appengine.tools.admin.AppAdminImpl.stageApplication(AppAdminImpl.java:539)
... 6 more

The Java/Javac versions and the JAVA_HOME and PATH variables were configured properly.


The Java runtime was also configured properly in appengine-web.xml.
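
For reference, a typical appengine-web.xml for the Java 8 standard runtime looks roughly like the snippet below (the application id and version are placeholders, not the actual values used here):

<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
  <application>my-app-id</application>
  <version>1</version>
  <runtime>java8</runtime>
  <threadsafe>true</threadsafe>
</appengine-web-app>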


After a little struggle, the root cause of the issue turned out to be a wrong ordering of the PATH environment variable.

Other Java executables earlier in the PATH caused the SDK to pick up the JRE instead of the JDK for the deployment.



The issue was fixed by changing the order of the PATH entries - moving the JDK bin path above the other Java-related entries.
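
For reference, on a Windows setup this means placing the JDK's bin directory ahead of the other Java entries in PATH - a rough sketch with placeholder install paths (adjust to the actual directories; the same ordering idea applies on other platforms). Running javac -version from the same shell is a quick way to confirm a JDK is being picked up, since javac ships only with a JDK:

:: Before - a JRE entry (e.g. Oracle's javapath) is resolved first
PATH=C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\Program Files\Java\jdk1.8.0_191\bin;...

:: After - the JDK bin directory is resolved first
PATH=C:\Program Files\Java\jdk1.8.0_191\bin;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;...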


Friday, May 5, 2017

How to display the git tags based on the environment in Jenkins parameter


This post explains how to display git tags based on the environment in a Jenkins parameter - a dynamic list of tag names filtered by environment name, e.g. QA, UAT, PROD (the environment name should be included in the tag name when the tag is created).

Select "This build is parameterized" in Jenkins job configuration
Add new parameter of type Extensible Choice
Enter the name "Tag" and select the Choice Provider as "System Groovy Choice Parameter"
Enter the below script in "Groovy System Script"

// List all tags on the remote repository (update the credentials and URL for your repository)
def gettags = "git ls-remote -t https://username:[email protected]/project/repo.git".execute()
def tags = []
def t1 = []
// Each line of ls-remote output has the form "<sha>\t<ref>"
gettags.text.eachLine { tags.add(it) }
for (i in tags)
{
   // Take the ref part, then strip the "^{}" suffix and the "refs/tags/" prefix to get the plain tag name
   def tagName = i.split()[1].replaceAll('\\^\\{\\}', '').replaceAll('refs/tags/', '')
   // Keep only the tags that contain the environment name
   if (tagName.contains('QA'))
      t1.add(tagName)
}
// Remove duplicates (annotated tags appear twice: once as the tag ref and once as the peeled "^{}" ref)
t1 = t1.unique()
return t1

Change the git repository details and the string used to filter the tags, e.g. QA, UAT, PROD.


Click on "Run the script now" to test the script - this will display the filtered tags.
Finally, save the configuration.
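
Once saved, the value selected at build time is available as the Tag parameter and can be consumed in a build step - for example, a shell step that checks out the chosen tag (a sketch; adjust to how the job fetches the repository):

git checkout "tags/$Tag"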

Tuesday, April 25, 2017

Error while submitting the Eloqua form - Value must not contain any URL


I was getting the following error while submitting the Eloqua form

<!DOCTYPE html>
<html>
<body bgcolor="#ffffff">
<div align="center" style="margin: 60px;">
<!-- CONFIRMATION PAGE TITLE -->
<div align="left" style="width: 400px; font-size: 14pt; font-family: Tahoma, Arial, Helevtica; font-weight: bold;">
<img src="/EloquaImages/ConfirmationPage/error.gif" width="32" height="50" border="0" align="left">  The Information Provided is Incomplete or Invalid. </div>
<!-- CONFIRMATION PAGE INFORMATION -->
<div align="left" style="width: 400px; font-size: 10pt; font-family: Arial, Helevtica; padding-left: 45px; padding-top: 10px; padding-right: 45px;">
<p>Reference- Value must not contain any URL&#39;s<br/></p>
</div>
</body>
</html>

The Reference field is configured as hidden in Eloqua, and a URL was being sent as its input.

Based on my reading, the Oracle Eloqua 483 release enabled the "Must Not Contain URL" validation by default on all hidden fields, and users were not able to modify this validation. The Eloqua 487 release then gave users the ability to modify the validation on hidden fields.

Disable "Must Not Contain URL" validation in the hidden field that expecting the URL as input.