
Jenkins Interview Questions and Answers

Jenkins is an open-source automation server that helps developers automate various parts of the software development process. It is primarily used for Continuous Integration and Continuous Delivery processes and ensures that code changes are tested, integrated, and delivered to end users quickly and with high quality. Check out this list of top Jenkins interview questions and answers. We have covered topics like scheduling, building pipelines, source code management, environment directive, triggers, continuous integration, deployment, builds, Git, Jira, Docker, and AWS, so this list is equally beneficial for beginners, intermediate and expert professionals. Prepare better with the best interview questions and answers, and walk away with top interview tips. These Jenkins interview questions with answers will boost your core interview skills and help you perform better. Be smarter with every interview.


Beginner

The user needs to open the console output for the build and check whether any file changes were missed while building the project. If nothing turns up there, a better approach is to clean and update the local workspace, replicate the problem on the local machine, and try to solve it from there.

To make sure the Jenkins build is not broken, we first need to perform a successful clean install on the local machine with all unit tests. Then we make sure that all code changes are checked in without any issues. Finally, we synchronize with the repository to make sure that all required configuration changes and any other differences are checked into the repository.

Below are the ways of scheduling builds in Jenkins; a short pipeline sketch follows the list:

  • Builds can be triggered by source code management commits.
  • Builds can be triggered sequentially after completion of other builds.
  • Builds can be scheduled to run at a specified time using cron expressions.
  • Builds can be requested manually.
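A minimal declarative pipeline sketch illustrating the trigger types listed above (the upstream job name and the schedules are placeholder values, not from the original article):

pipeline {
    agent any
    triggers {
        cron('H 2 * * *')                                 // scheduled build at a fixed time
        pollSCM('H/15 * * * *')                           // build when SCM polling detects new commits
        upstream(upstreamProjects: 'upstream-project',
                 threshold: hudson.model.Result.SUCCESS)  // build after another job completes successfully
    }
    stages {
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
    }
}

Manual builds need no trigger at all; they are started with the Build Now button.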

This is one of the most frequently asked Jenkins interview questions for freshers in recent times. 

Please follow the steps below:

  • Move a job from one installation of Jenkins to another by copying the corresponding job directory.
  • Make a copy of an already existing job by cloning the job directory under a different name.
  • Rename an existing job by renaming its directory.

The Role-based Authorization Strategy plugin allows us to create three different types of roles, as described below:

  • Global roles: Roles such as admin, job creator, and anonymous can be created with this option. The user can set Overall, Slave, Job, View, and SCM permissions on a global basis.
  • Project roles: These allow setting Job and Run permissions on a per-project basis.
  • Slave roles: These set node-related permissions only.

Creating a chain of jobs in Jenkins is the process of automatically starting the next job after one job executes successfully. This approach lets the user build multi-step build pipelines or trigger a rebuild of a project when one of its dependencies is updated.

A Jenkins file is a text file that contains the definition of a Jenkins Pipeline and it is generally checked into source control. 

  • Audit trail for the Pipeline can be monitored
  • It serves as a single source of truth for the Pipeline, which can be viewed and edited by multiple members of the project.
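As a rough illustration, a minimal declarative Jenkinsfile kept at the root of the repository might look like the sketch below (the stage contents are placeholders):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Compiling the project...'
            }
        }
        stage('Test') {
            steps {
                echo 'Running unit tests...'
            }
        }
    }
}

Because this file lives next to the application code, every change to the pipeline goes through the same review and history as any other change.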

Jenkins supports a wide range of Source Code Management tools; a few of them are mentioned below:

  • AccuRev
  • CVS
  • Subversion
  • Git
  • Mercurial
  • Perforce
  • Clearcase
  • RTC

This is one of the most frequently asked Jenkins interview questions for freshers in recent times.

The environment directive specifies a sequence of key-value pairs which will be defined as environment variables for all steps, or for stage-specific steps, depending on where the environment directive is located within the Pipeline. This directive supports a special helper method, credentials(), which can be used to access pre-defined credentials by their identifier in the Jenkins environment.
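A short sketch of the environment directive, with one pipeline-level variable and one stage-level variable bound through credentials(); the credential ID 'my-deploy-key' is an assumed example and must exist in the Jenkins credential store:

pipeline {
    agent any
    environment {
        APP_ENV = 'staging'                                // visible to all stages
    }
    stages {
        stage('Deploy') {
            environment {
                DEPLOY_KEY = credentials('my-deploy-key')  // stage-specific, resolved from Jenkins credentials
            }
            steps {
                echo "Deploying to ${APP_ENV}"
            }
        }
    }
}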

The stages section defines a conceptually distinct subset of the work performed through the entire Pipeline. It contains a sequence of one or more stage directives, and it is where the bulk of the "work" described by a Pipeline will be located. At a minimum, it is recommended that stages contain at least one stage directive for each discrete part of the continuous delivery process, such as Build, Test, and Deploy.

This is one of the most frequently asked Jenkins interview questions and answers for freshers in recent times.

Jenkins is an open source automation server written in Java. As an extensible automation server, Jenkins can be used as a simple CI server or turned into the continuous delivery hub for any project.


Continuous Delivery (CD) is a software development methodology in which teams produce software in short cycles, ensuring that the product can be reliably released at any time and, when releasing the product, doing so manually. It aims at building, testing, and releasing software with greater speed and frequency. The approach reduces the cost, time, and risk of delivering changes by allowing more incremental updates to applications in production. A straightforward and repeatable deployment process is important for continuous delivery. CD contrasts with continuous deployment, a similar approach in which software is also delivered in short cycles but through automated deployments rather than manual ones. The flowchart below demonstrates the Continuous Delivery workflow; hopefully it is easier to understand with visuals.

continuous delivery workflow

Intermediate

Jenkins is an open source tool written in Java, with plugins built for Continuous Integration purposes. Jenkins is used to build and test your software projects continuously, making it easier for developers to integrate changes into the project and easier for users to obtain a fresh build. It also lets you continuously deliver your software by integrating with a large number of testing and deployment technologies. With Jenkins, organizations can accelerate the software development process through automation. Jenkins integrates development life-cycle processes of all kinds, including build, document, test, package, stage, deploy, static analysis and much more. Jenkins facilitates Continuous Integration with the help of various plug-ins, which allow the integration of the various DevOps stages. If we need a specific use case, we have to install the plug-in for that tool, for example Git, Maven 2, Amazon EC2, HTML Publisher and so on.

The simple workflow of Jenkins

Advantages of Jenkins include:

  • It is an open-source tool with great community support.
  • It is easy to install and configure.
  • It has 1000+ plugins to ease your work. If a plugin does not exist, you can code it and share it with the community.
  • It is free of cost.
  • It is built with Java and hence it is portable to all major platforms.

Continuous integration is a process in which all development work is integrated as early as possible. The resulting artifacts are automatically built and tested. This process allows errors to be detected as early as possible. Jenkins is a popular open source tool used to perform continuous integration and builds. Before CI, nightly builds were common, where code was pulled only at night; CI ensures the early detection of defects.

continuous integration in Jenkins

Continuous Integration has numerous advantages: 

  • Bid a fond farewell to long and tense integrations
  • Increase visibility, enabling greater communication
  • Catch issues early and stop them from growing
  • Spend less time debugging and more time adding features
  • Build on a solid foundation
  • Stop waiting to find out whether your code is going to work
  • Reduce integration problems, allowing you to deliver software more rapidly

Here is a rundown of the top 8 Continuous Integration tools:

  • Jenkins: Jenkins is an open-source CI tool written in Java.
  • TeamCity: TeamCity is a mature CI server originating from the labs of the JetBrains company. A great solution overall, but due to its complexity and price it is better suited to enterprise needs.
  • Travis CI: A mature solution that offers both hosted and on-premises variants, loved and used by many teams, and well documented.
  • GoCD: GoCD is the newest Cruise Control incarnation from the ThoughtWorks company. Excluding the commercial support that ThoughtWorks offers, GoCD is free of charge. It is available for Windows, Mac, and various Linux distributions.
  • Bamboo: This is an Atlassian offering. A great on-premises CI tool that originally offered a cloud version as well; Bitbucket Pipelines replaced the cloud version. Pipelines is a modern and fast cloud CI tool integrated into Bitbucket. It has a free trial for 30 days, and paid plans after that.
  • GitLab CI: GitLab CI is an integral part of the open-source Rails project GitLab, which was open-sourced by the company GitLab Inc.
  • CircleCI: Another cloud alternative that comes from the company of the same name. CircleCI currently only supports GitHub, and the list of supported languages includes Java, Ruby/Rails, Python, Node.js, PHP, Haskell, and Scala.
  • Codeship: Codeship comes in two different versions: Basic and Pro. The Basic version offers an out-of-the-box Continuous Integration service but doesn't have Docker support, and its main purpose is to build applications with common workflows through the UI. The Pro version offers greater flexibility and Docker support.

SCM stands for Source Code Management, which is an integral part of any development project in the current IT world. It is very critical to manage source code in an efficient way, and there are several SCM tools available. Some advantages of using SCM:

  1. Backups: Always assume that your PC could be sucked into a monstrous black hole at any moment, and work to limit the loss from that. Source control enables you to easily push finished work to a remote host, so little work is lost with a single PC.
  2. Record of work: On many projects there are times when you need to return the code to a previous state to see how something was done, whether a bug was present, or to work out why the current code is broken. Source control makes this easy.
  3. Deployment: Having source control enables you to automate builds and deployments. Nobody should FTP files around any longer, as that is prone to human error.
  4. Versioning: It's easy to swap between versions of your code, letting you switch between fixing a small bug in production and completely reworking your key functionality.
  5. Simple project setup: As a project's source code can also include configuration scripts, install files and so on, setting up a project can be as easy as cloning the repository and running a few scripts to set up an environment that is identical to every other developer's.

Jenkins supports AccuRev, CVS, Subversion, Git, Mercurial, Perforce, ClearCase and RTC. A plugin exists for each of them, and as you probably already know, Jenkins isn't limited to just that list; anyone can create an SCM plugin for other options if they want to.


Expect to come across this, one of the most important Jenkins interview questions for experienced professionals in product management, in your next interviews.

In Jenkins, under the job configuration, we can define various build triggers. Simply find the 'Build Triggers' section and check the 'Build periodically' checkbox. With the periodic build you can schedule the build definition by date or day of the week and the time at which to execute the build. The format of the 'Schedule' textbox is as follows:

MINUTE (0-59), HOUR (0-23), DAY (1-31), MONTH (1-12), DAY OF THE WEEK (0-7)

On the job configuration page, we scroll down to the Build Triggers section. Since we intend to create a scheduled job, we select the checkbox marked Build periodically. When we select this checkbox, a text box is displayed with the label Schedule. We need to provide a value in a cron-compliant format. There is extensive information available on the page if we click the question mark next to the textbox.

Let's type */2 * * * * here, which represents an interval of two minutes:

Build in Jenkins

After clicking out of the text box, we can see information directly underneath it. It tells us when the job will run next. Let's save the job; in around two minutes, we should see the status of the first execution of the job:

Project simple job

Since we’ve configured the job to run every two minutes, we should see multiple build numbers when we go back to the job dashboard after waiting for some time.

In the present DevOps world, continuous delivery and deployment are essential to delivering high-quality software faster than ever before. Jenkins is an open-source continuous integration server written in Java. It is by far the most widely used tool for managing continuous integration builds and delivery pipelines. It helps developers in building and testing software continuously. It increases the scale of automation and is quickly gaining popularity in DevOps circles. One of the key advantages of Jenkins is that it requires little maintenance and has a built-in GUI tool for easy updates. Jenkins also provides customized solutions, as there are more than 400 plugins to help build and test virtually any project. By implementing the right setup, you get almost immediate feedback: you will always know if the build broke, what the reason for the failure was, and how you can revert it.

Advantages of Jenkins include: 

  1. Bug tracking is simple at an early stage in the development environment.
  2. It provides a huge number of plugins and support.
  3. Iterative improvement to the code.
  4. Build failures are caught at the integration stage.
  5. For each code commit, an automatic build report notification is generated.
  6. To notify developers about build success/failure, it is integrated with an LDAP mail server.
  7. It achieves continuous integration, agile development and test-driven development.
  8. With simple steps, the Maven release project is automated.

If you were to diagram the conventional software development life cycle on a piece of paper, the left half of the chart would most likely include tasks such as design and development, while the right side would probably include client feedback, stress testing, and production staging. To shift left in DevOps implies a desire to take many of those tasks that usually occur near the end of the application development process and move them into earlier stages. In some cases, this may mean incorporating static code analysis routines into each build. Another way to perform a DevOps shift left is to create production-ready artifacts at the end of each Agile sprint so that clients and stakeholders can get incremental reports on how development is progressing. Proper DevOps means shifting left as much as possible.

Relative costs to fix software defects

Shifting left testing in software production allows developers to catch and fix issues earlier.

Expect to come across this, one of the most important Jenkins interview questions for experienced professionals in product management, in your next interviews.

Prerequisites: Before you proceed to install Jenkins on your Windows system, there are a few requirements for installing Jenkins on your machine.

Hardware requirements: 

  • You need at least 256 MB of RAM in your PC or workstation to install Jenkins, and at least 1 GB of space on your hard drive for Jenkins.

 Software Requirements:

  • Since Jenkins runs on Java, you need either the Java Development Kit (JDK) or the Java Runtime Environment (JRE).
  • Release types: Jenkins offers two kinds of releases based on organizational needs: Long-term support releases and Weekly releases.
  • Long-term support release (LTS): Long-term support releases are made available at regular intervals. They are stable and are thoroughly tested. This release is intended for end users.
  • Weekly release: Weekly releases are made available every week by fixing bugs in the previous version. These releases are intended for plugin developers. We will use the LTS release, but the process remains the same for the Weekly release.

Move a job from one installation of Jenkins to another by simply copying the corresponding job directory. Make a copy of an existing job by cloning the job directory under a different name. Rename an existing job by renaming its directory. Note that if you change a job name, you should update any other job that attempts to call the renamed job. These tasks can be done even while Jenkins is running. For changes like these to take effect, you have to click "reload config" to force Jenkins to reload the configuration from disk.

Follow these steps to move or copy Jenkins from one server to another:

  • Copy the related job directory to move a job from one installation of Jenkins to another.
  • Make a copy of an already existing job by cloning the job directory under a different name.
  • Rename an existing job by renaming its directory.

We can also try one of the plug-ins that provide job export options, such as the Job Importer Plug-in. The Jenkins CLI can also be used if we have a small number of jobs, but at the enterprise level we usually have a large number of jobs, so it is not widely used.

Ansible is a powerful tool for IT automation and can be used in a CI/CD process to provision the target environment and to deploy an application on it. Here we look at how to use Ansible for environment provisioning and application deployment in a Continuous Integration/Continuous Delivery (CI/CD) process using a Jenkins Pipeline. Jenkins is a well-known tool for CI/CD automation. Shell scripts are often used for provisioning environments or for deploying applications during the pipeline flow. Although this can work, it is cumbersome to maintain and reuse scripts in the long run.

The motivation behind using Ansible in the pipeline flow is to reuse roles and Playbooks for provisioning, leaving Jenkins as a process orchestrator rather than a shell script agent.

Use of Ansible in Jenkins

The diagram above represents the following architecture components:

  • GitHub is where our project is hosted and where Jenkins will poll for changes to start the pipeline flow.
  • SonarQube is our source code analysis server. If anything goes wrong during the analysis (for example, insufficient unit tests), the flow is interrupted. This step is important to ensure source code quality.
  • Nexus is the artifact repository. After a successful compilation, unit tests, and quality analysis, the binaries are uploaded to it. Later those binaries will be downloaded by Ansible during the application deployment.
  • The Ansible Playbook, which is a YAML file kept with the application source code, deploys the Spring Boot app onto a CentOS machine.
  • Jenkins is our CI/CD process orchestrator. It is responsible for assembling all of the pieces, resulting in the application successfully deployed on the target machine.

People using DevOps practices rely on several key performance indicators (KPIs) to judge the success of their DevOps efforts:

  • Deployment frequency: The ability to make code changes quickly and effectively is a key competitive advantage for any organization that needs to deliver new features rapidly to customers and respond to their feedback.
  • Speed of deployment: Frequent code deployment depends in large part on being able to move quickly from committed code to that code running successfully in the production environment.
  • Failure rate: It is great to deploy more often and more quickly, but if changes fail just as frequently, you have gained nothing. Failed deployments can bring services down, resulting in lost revenue and frustrated customers. DevOps practices can have a major effect on the failure rate.
  • Time to recovery: When a service goes down, the ability to recover quickly can have a huge effect on business results. It is not surprising, then, that large web companies like Google, Etsy, Netflix, and Amazon push the envelope in their efforts to improve time to recovery, routinely breaking their own applications and infrastructure to find, and prepare against, anything that can go wrong.

Groovy is the default scripting language used in development from JMeter version 3.1 onwards. Apache Groovy is a dynamic object-oriented programming language that is used as a scripting language for the Java platform. Apache Groovy comes with some useful features such as Java compatibility and development support. The Groovy plugin adds the ability to execute Groovy code directly.

To configure the available Groovy installations on your system, go to the Jenkins configuration page, find the 'Groovy' section and fill in the form as shown.

Groovy in Jenkins

If you don't configure any Groovy installation and select the (Default) option in a job, the plugin will fall back to running the plain groovy command, assuming you have the groovy binary on the default path of the given machine.

 A must-know for anyone looking for top Jenkins interview questions, this is one of the frequently asked interview questions for Jenkins.

For defining a Pipeline in SCM, a Jenkinsfile is a text file that contains the definition of a Jenkins Pipeline and is checked into source control. Consider the following Pipeline, which executes a basic three-stage continuous delivery pipeline. Creating a Jenkinsfile, which is checked into source control, provides several immediate benefits:

  1. Code review/iteration on the Pipeline
  2. Audit trail for the Pipeline
  3. A single source of truth for the Pipeline, which can be viewed and edited by multiple members of the project.

Please find below Pipeline which implements a basic three-stage continuous delivery pipeline:

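A sketch of such a basic three-stage continuous delivery Jenkinsfile; the step bodies are placeholders to be replaced with the project's real build, test, and deploy commands:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}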

We ensure that we perform a successful clean install on our local machine with all unit tests. Then we make sure that we check in all code changes. We then synchronize with the repository to make sure that all required config and POM changes and any differences are checked into the repository. You have to follow the steps mentioned below to ensure that the project build does not break:

  • Perform a clean and successful install on your local machine with all unit tests.
  • Ensure all code changes are checked in correctly.
  • Synchronize with the repository to make sure that all of the differences and changes related to config and other settings are saved in the repository.

There is a Build Failure Analyzer plug-in which tells us the reason for a build failure in case the build fails even after following the above steps. The plugin comes with an empty knowledge base of failure causes. Populating this knowledge base is done using the link "Failure Cause Management". The link is shown if the permission UpdateCauses is set for the current user. Press "Create New" and add a name and a description for the failure cause. The description should contain the reason why the build failed as well as possible solutions for the build failure.

Break in Jenkins

The basic difference is that one is a corporate offering while the other is a community offering. Considering all aspects, it is interesting to know that Hudson is the original name for Jenkins; Jenkins is the newer offering by the core developers of Hudson. To understand why, you have to know the history of the project. It was initially open source and supported by Sun Microsystems. Like much of what Sun did, it was genuinely open, but there was a bit of benign neglect. The source, trackers, website, and so on were hosted by Sun on their closed java.net environment.

Then Oracle bought Sun. For various reasons, Oracle has not been shy about using what it sees as its assets. Those include some control over the strategic direction of Hudson, and especially control over the product name. Many users and developers weren't happy with that and chose to leave. So it comes down to what Hudson versus Jenkins offers. Both Oracle's Hudson and Jenkins have the code. Hudson has Oracle and Sonatype corporate support and the brand. Jenkins has the vast majority of the core developers, the community, and (so far) considerably more active work.

There are various ways to re-trigger the pipeline in an automated way; triggers are defined in Jenkins. Some pipelines work with sources like GitHub, BitBucket, or other triggers first, and these are then implemented to perform a particular action.

A build trigger may be used for different purposes depending on the context of the project.

  • For instance, an enterprise might like to have a CI/CD pipeline set up using Jenkins. In this case, an organization can set up build triggers to trigger downstream builds, for example:
    • Integration tests
    • Code health check-up
    • Load tests
    • End-to-end tests
    • Deployment
  • The above steps will be chained to the parent job and can be activated one by one or in parallel depending on the stage (this is where the build trigger is used: trigger the downstream build if the parent build is a success).
  • Build periodically can be used for regularly running jobs (for example, if we have a team set up which deploys master every evening). We can then set up the jobs to build periodically late at night at a fixed time (the job can also be triggered based on success as explained above).
  • Polling SCM keeps checking for any new code additions by checking the commit history and triggering a build thereafter: polling your repository and building based on that.

Continuously polling/monitoring SCM tools like Git or Subversion to identify new commits is considered a waste of clock cycles and is not considered best practice, so we should avoid this process. A better approach, which is the industry standard and more popular, is the reverse of the above: the build is triggered from the source code tool whenever new code is committed or existing code changes. This is very easy to configure with GitHub or GitLab using a post-commit hook that runs every time a code commit succeeds. This setup eliminates the need for constantly monitoring source code, as the post-commit hook triggers the build whenever any code gets committed.
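For comparison, a pipeline that still relies on polling would declare a pollSCM trigger as in the sketch below (the schedule value is illustrative); with a webhook/post-commit hook configured on the Git host and the matching Jenkins plugin, no polling schedule is needed at all, because the SCM notifies Jenkins after each commit:

pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')   // poll roughly every five minutes; unnecessary when webhooks are used
    }
    stages {
        stage('Build') {
            steps {
                echo 'Triggered by a new commit'
            }
        }
    }
}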


We can follow the steps below to copy Jenkins jobs from the primary server to other servers:

  1. As a first step, copy the job directory from the primary installation of Jenkins to the other one.
  2. We can also make another copy of an already configured job by cloning the job directory and renaming it.
  3. Rename an already configured job by giving it another name.

We can also try one of the plug-ins that provide job export options, such as the Job Importer Plugin. The Jenkins CLI can also be used if we have a small number of jobs, but at the enterprise level we usually have a large number of jobs, so it is not widely used.

Don't be surprised if this question pops up as one of the top interview questions on Jenkins pipeline in your next interview.

Continuous integration

CI is a DevOps practice where developers integrate code regularly at short intervals. The integration is accompanied by builds and tests in an automated way. Automated testing is optional in CI, but it is implied. The biggest advantage of CI is frequent integration, which leads to early detection of errors and early remediation: we can then work on the changes which led to a specific build error or code issue. Since integration is frequent, fixing becomes easier.

Continuous integration

Continuous delivery 

CD is the next step after CI, so it can be considered an extension of continuous integration which facilitates releasing new changes to the production environment in a controlled way. The code base always has to be in a deployable state to support a CD pipeline in DevOps. Continuous delivery ensures seamless delivery of any change, whether code or configuration, to production. This is very useful for complex systems where several developers work on the same code base, which makes keeping the code base in a deployable state very challenging. The advantages of CD are early detection of production issues and quick fixes for them.

Continuous delivery

Continuous deployment

This practice is one more step beyond continuous delivery. Continuous delivery releases changes to production in a controlled way, but continuous deployment releases changes to production automatically once all changes pass through the production pipeline stages successfully. It ensures no human intervention. The feedback loop is much faster if we enable continuous deployment for our project, supporting a fast-to-market theme. It is similar to continuous delivery, but the release happens seamlessly.

Continuous deployment

Maven and Ant are classified as build tools, but Maven has the additional advantage that it supports project management, dependency management, and a standard project layout. Jenkins is a continuous integration tool, which is much more than a build tool. Ant is the oldest of the lot and a widely used build tool. A build tool is useful for creating binary artifacts like a JAR file or a WAR file. We can easily set up a continuous delivery pipeline using Jenkins/Hudson which triggers automatic builds and tests and deploys the code base to production. The gang of four (Ant, Maven, Jenkins, and Hudson) are tools for building, unit testing, continuous integration and project management.

Jenkins and Hudson are siblings from the same family: one is an enterprise offering and the other is open source supported by the dev community. Maven offers more than just another build tool. In Ant we need to set up everything, like the source, build and target directories, while Maven has a predefined structure for these; it is said that convention rules Maven while Ant follows configuration. A continuous integration tool helps us trigger a build whenever new changes are committed to source code, verifying whether the build compiles successfully; when compilation succeeds, it runs the unit tests and deploys the build at a scheduled time.

Difference between Maven, Ant, and Jenkins
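As a rough sketch of how the tools complement each other, Jenkins can orchestrate the pipeline while Maven (invoked through a shell step) performs the actual build and unit tests; the mvn command below assumes Maven is installed on the agent:

pipeline {
    agent any
    stages {
        stage('Build and Unit Test') {
            steps {
                sh 'mvn -B clean verify'   // Maven resolves dependencies, compiles and runs the unit tests
            }
        }
    }
}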

This is a common yet one of the most important Jenkins pipeline interview questions and answers for experienced professionals, don't miss this one.

Continuous Delivery is the ability to get changes of all types, including new features, configuration changes, bug fixes and experiments, into production, or into the hands of users, safely and quickly in a sustainable way. We achieve this by ensuring our code is always in a deployable state, even in the face of teams of thousands of engineers making changes every day. We thereby completely eliminate the integration, testing and hardening phases that traditionally followed "dev complete", as well as code freezes.

Developers work in their local environment to make changes in the source code and push them to the code repository. When a change is detected, Jenkins runs several tests and code-standard checks to verify whether the changes are good enough to deploy. Upon a successful build, the result is reviewed by the developers. The change is then deployed manually to a staging environment where the customer can examine it. When all of the changes are approved by the developers, testers, and customers, the final result is deployed manually to the production server to be used by the end users of the product. In this way, Jenkins follows a Continuous Delivery approach and is known as a Continuous Delivery tool, although there are manual steps involved.

Continuous Delivery

Advanced

Below are the steps to deploy a custom build of a core plugin:

  •  Stop Jenkins.
  •  Copy the custom HPI to $Jenkins_Home/plugins.
  •  Delete the previously expanded plugin directory.
  •  Make an empty file called <plugin>.hpi.pinned.
  •  Start Jenkins. 

The Jenkins Job DSL Plugin is a Domain Specific Language (DSL) plugin that allows users to describe jobs using the Groovy scripting language. The plugin manages the scripts and the updating of the Jenkins jobs which are created and maintained as a result.
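A small, assumed Job DSL seed script (Groovy) showing the idea; the repository URL, schedule and shell command are placeholders:

job('example-generated-job') {
    scm {
        git('https://github.com/example/repo.git')   // hypothetical repository
    }
    triggers {
        scm('H/15 * * * *')                          // poll the SCM every ~15 minutes
    }
    steps {
        shell('echo "Built by a Job DSL seed script"')
    }
}

Running this script through the Job DSL plugin generates (and keeps regenerating) the corresponding Jenkins job.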

  • CI (Continuous integration)

Continuous Integration is a process that helps developers to integrate code into a shared repository several times. Each check-in is verified by an automated build process, allowing teams to detect problems early. 

  • CD (Continuous delivery)

Continuous delivery is an extension of a continuous integration process which helps to make sure that new changes can be released to the customers quickly in a sustainable way.

With the help of the input directive, the user can prompt for input using an input step. The stage is paused after any options have been applied and before entering the stage's agent or evaluating its condition. Once the input is approved, the stage continues. Any parameters provided as part of the input submission will be available in the environment for the rest of the stage.
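A hedged sketch of the input directive in a declarative stage; the message, parameter name and default value are illustrative only:

pipeline {
    agent any
    stages {
        stage('Deploy') {
            input {
                message 'Deploy to production?'
                ok 'Yes, deploy'
                parameters {
                    string(name: 'TARGET_ENV', defaultValue: 'prod', description: 'Target environment')
                }
            }
            steps {
                echo "Deploying to ${env.TARGET_ENV}"   // the submitted parameter is available for the rest of the stage
            }
        }
    }
}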

The scripted pipeline is similar to the declarative pipeline in that both are built on top of the underlying pipeline sub-system. The scripted pipeline is based on Groovy as a general-purpose language, so most of the features and benefits available in Groovy can be used in a scripted pipeline too. In brief, it is a highly flexible tool that can be used for multiple continuous delivery pipelines.
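A minimal scripted pipeline sketch: scripted syntax starts with a node block and allows ordinary Groovy constructs (variables, loops, conditionals) to be mixed in freely:

node {
    stage('Build') {
        echo 'Building...'
    }
    stage('Test') {
        def suites = ['unit', 'integration']   // plain Groovy is allowed here
        for (s in suites) {
            echo "Running ${s} tests"
        }
    }
}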

Blue Ocean is a project that rethinks the user experience of Jenkins, modeling and presenting the process of software delivery by surfacing information that’s important to development teams with as few clicks as possible, while still staying true to the extensibility that is core to Jenkins. While this project is in the alpha stage of development, the intent is that Jenkins users can install Blue Ocean side-by-side with the Jenkins Classic UI via a plugin.

A pipeline is a collection of jobs that brings the product from version control into the hands of the end users by using automation tools. It is a feature used to incorporate continuous delivery into our software development workflow. Over the years, there have been several Jenkins pipeline releases including Jenkins Build Flow, the Jenkins Build Pipeline plugin, Jenkins Workflow, and so on. What are the key features of these plugins? They represent multiple Jenkins jobs as one complete workflow, as a pipeline. These pipelines are a collection of Jenkins jobs which trigger each other in a defined sequence.

Let me clarify this with an example. Suppose I'm developing a small application on Jenkins and I want to build, test and deploy it. To do this, I will allot three jobs to perform each process: job1 would be for building, job2 would perform the tests and job3 would handle deployment. I can use the Jenkins Build Pipeline plugin to perform this task. After creating the three jobs and chaining them in a sequence, the Build Pipeline plugin will run these jobs as a pipeline.

Step 1:  For creating a build pipeline, First create a set of jobs.

build pipeline

Step 2:  Now create a build pipeline view and in the configure section add the first job that you want to run in the pipeline

 pipeline view

Step 3: Now for every job, add a post-build action, which can be configured in many ways, for example a manual trigger or an automatic trigger on the success of the build.

Automatic trigger

Step 4: Once all the downstream jobs are configured, just build the p1 job and go to the new pipeline view that you have created.

New pipeline

Note: You won't see a pipeline until the first job is started. In this example, after the p1 job is started you will start seeing a pipeline.

Jobs in Jenkins can be triggered periodically (on a schedule, specified in the configuration), or when source changes in the project have been detected, or they can be triggered automatically by requesting the URL:

http://YOURHOST/jenkins/job/PROJECTNAME/build

In the project configuration, there should be a Build Triggers section. This controls how frequently Jenkins polls your SCM for code changes. When using SVN, you can afford to check frequently because the check isn't expensive, so you can tell Jenkins to check every minute or so. Set this to Poll SCM and set the schedule to something like */n * * * * (replace n with your poll interval in minutes).

Source code management

It is critical to back up Jenkins with its data and configuration. This includes job configs, build logs, plugins, plugin configuration, and so on. Jenkins ThinBackup is a plugin for backing up Jenkins. It backs up all of the data based on your schedule and handles backup retention as well.

To begin, first install the plugin:

  1. Go to Manage Jenkins -> Manage Plugins.
  2. Click the Available tab and search for "Thin backup".
  3. Install the plugin and restart Jenkins.

Once installed, follow the steps given below for configuring the backup settings:

  1. Go to Manage Jenkins -> ThinBackup.
  2. Click the Settings option.
  3. Enter the backup options as shown and save them. All of the options are self-explanatory. The backup directory you specify should be writable by the user that is running the Jenkins service. All the Jenkins backups will be saved to the backup directory you specify.

ThinBackup Configuration

It is not a smart idea to keep the Jenkins backup inside Jenkins itself. It is a must to move the thin backups to cloud storage or some other backup location, so that even if the Jenkins server crashes you will still have all of the data. If you are on AWS, Azure or Google Cloud, you can transfer the backups to their respective storage solutions.

There is an alternative way: backing up the Jenkins home folder. This contains all of your build job configurations, your slave node configurations, and your build history. To create a backup of your Jenkins setup, just copy this directory.

You can see where your Jenkins home is with:

echo $JENKINS_HOME

And for example, if you only want to back up the jobs you can go to:

cd $JENKINS_HOME/jobs

And make a backup for that folder.

All that configuration will be a bunch of XML files.

If you are using the official Jenkins docker image, the home will be on: /var/jenkins_home

There are various ways to start and stop Jenkins. We have commands/ plug-ins to achieve the same.

Firstly, we can use any one of the below commands to start Jenkins manually using Jenkins URL:

  • (Jenkins_url)/restart: Forces a restart without waiting for builds to complete.
  • (Jenkin_url)/safeRestart: Allows all running builds to complete.

We should always use safeRestart, as it waits for running builds to complete before restarting the Jenkins server. There are other ways as well to start/stop Jenkins: we can open a console/command line and go to our Jenkins installation directory. The commands below produce the same result:

to stop:

jenkins.exe stop

to start:

jenkins.exe start

to restart:

jenkins.exe restart

The SafeRestart plugin can also be used and is pretty useful. Once you install the plug-in, it adds a "Restart Safely" link to the main menu:

Jenkins Manually

In Red Hat Linux you can use the commands below as well:

  • To know the status of Jenkins: sudo service jenkins status
  • To start Jenkins: sudo service jenkins start
  • To stop Jenkins: sudo service jenkins stop
  • To restart Jenkins: sudo service jenkins restart

We can check the console output in Jenkins to investigate the output of the build. This can help us identify whether any file changes were missed during the commit. If the console output is of no help, we can use a local copy of the workspace to replicate the issue.

Last time we had a failing build, with Jenkins showing a red ball to prove it. The first place to look when we have a failing build is the console output. You can get to the console output via the main menu on the left of your project page.

Console output

Jenkins employs the cron syntax to schedule jobs within the tool.

Five asterisks are the core of the cron syntax, each separated by a space. The first asterisk represents minutes, the second represents hours, the third the day of the month, the fourth the month itself and the fifth the day of the week. For instance, to schedule a build job to pull from GitHub each Friday at 6:30 p.m., the syntax would be: 30 18 * * 5.

A cron expression is a string containing five or six fields separated by spaces that represents a set of times, normally as a schedule to execute some routine. In some uses of the cron format there is also a seconds field at the start of the pattern, in which case the cron expression is a string comprising 6 or 7 fields. Cron syntax also supports special characters; support for each special character depends on the specific distribution and version of cron.

  • Asterisk (*): The asterisk indicates that the cron expression matches all values of the field. E.g., using an asterisk in the fourth field (month) indicates every month.
  • Slash (/): Slashes describe increments of ranges. For instance, 4-59/15 in the first field (minutes) indicates the 4th minute of the hour and every 15 minutes thereafter. The form "*/..." is equivalent to the form "first-last/...", that is, an increment over the largest possible range of the field.
  • Comma (,): Commas are used to separate items of a list. For instance, using "MON,WED,FRI" in the fifth field (day of the week) means Mondays, Wednesdays, and Fridays.
  • Hyphen (-): Hyphens define ranges. For instance, 2000-2010 indicates every year between 2000 and 2010, inclusive.
  • Percent (%): Percent signs (%) in the command, unless escaped with a backslash (\), are changed into newline characters, and all data after the first % is sent to the command as standard input.
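Illustrative schedule strings in a declarative triggers block, using the characters described above (the schedules are example values only; Jenkins also accepts H to spread load across a time range):

pipeline {
    agent any
    triggers {
        // minute  hour  day-of-month  month  day-of-week
        cron('30 18 * * 5')          // every Friday at 18:30
        // cron('H/15 * * * *')      // roughly every 15 minutes
        // cron('0 8-18/2 * * 1-5')  // every two hours from 08:00 to 18:00, Monday to Friday
    }
    stages {
        stage('Scheduled build') {
            steps {
                echo 'Running on schedule'
            }
        }
    }
}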

SVN polling

One of the most frequently posed Jenkins scenario based interview questions, be ready for this conceptual question.

The procedure to use a third-party tool, for example Artifactory, Node, SonarQube or Git, normally follows a four-step process:

  1. The third-party tool must be installed.
  2. A Jenkins plugin that supports the third-party tool must be installed through the Jenkins administrator console.
  3. The third-party tool must be configured in the Tools tab of the Manage Jenkins area of the administrator console.
  4. Finally, the plug-in can be used from inside a Jenkins build job. The plugin will then facilitate communication between the Jenkins build job and the third-party tool.

The JENKINS_HOME folder contains a file named config.xml. When security is enabled, this file contains an XML element named useSecurity that is set to true. By changing this setting to false, security will be disabled the next time Jenkins is restarted. <useSecurity>false</useSecurity>

Disabling security should always be both a last resort and a temporary measure. Once any configuration issues are resolved, be sure to re-enable Jenkins security and restart the CI server.

Please find the steps below :

  • Go to $JENKINS_HOME in the file system and find the config.xml file.
  • Open this file in an editor.
  • Search for the <useSecurity>true</useSecurity> element in this file.
  • Replace true with false.
  • Remove the authorizationStrategy and securityRealm elements.
  • Start Jenkins.

When Jenkins comes back, it will be in an unsecured mode where everybody gets full access to the system. If this still doesn't work, try renaming or deleting config.xml.

Agent: The simple answer is that node is for scripted pipelines and agent is for declarative pipelines. In declarative pipelines, the agent directive is used for specifying which agent/slave the job/task is to be executed on. This directive only allows you to specify where the task is to be executed: which agent, slave, label or Docker image. In scripted pipelines, the node step can be used for executing a script/step on a particular agent, label, or slave. The node step optionally takes the agent or label name and then a closure with code that will be executed on that node.

Declarative pipelines are a newer extension of the pipeline DSL: essentially a pipeline script with a single pipeline block, containing sections with arguments (called directives) that must follow a particular syntax. The purpose of this newer format is that it is stricter and therefore should be easier for those new to pipelines, allow for graphical editing and much more. Scripted pipelines are the fallback for advanced requirements.
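A side-by-side sketch of the two forms; the label 'linux' is an assumed example and must match a label configured on your agents.

Declarative pipeline, using the agent directive:

pipeline {
    agent { label 'linux' }   // run the whole pipeline on an agent carrying this label
    stages {
        stage('Build') {
            steps {
                echo 'Building on a linux-labelled agent'
            }
        }
    }
}

Scripted pipeline, using the node step:

node('linux') {               // allocate an executor on an agent carrying this label
    stage('Build') {
        echo 'Building on a linux-labelled agent'
    }
}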

In the default configuration of Jenkins 1.x, Jenkins does not perform any security checks. This means the ability of Jenkins to launch processes and access local files is available to anyone who can access the Jenkins web UI, and more.

Securing Jenkins has two aspects to it:

  1. Access control, which ensures users are authenticated when accessing Jenkins and their activities are authorized.
  2. Securing Jenkins against external threats.

You should secure access to the Jenkins UI so that users are authenticated and an appropriate set of permissions is given to them. This setting is controlled mainly in two ways:

  • Security Realm, which determines users and their passwords, as well as which groups the users belong to.
  • Authorization Strategy, which determines who has access to what.

You may use external LDAP or Active Directory as the security realm, and you may choose the "everyone gets full access once logged in" mode for the authorization strategy. Or you may let Jenkins run its own user database and perform access control based on the permission/user matrix.

Some important security considerations:

  • Global security should be enabled.
  • Jenkins should be integrated with appropriate plugins.
  • Automate the process of setting rights and privileges.
  • Limit physical access to folders.
  • Periodically run security audits.

A staple in Jenkins interview questions and answers for experienced, be prepared to answer this one using your hands-on experience.

As a first step we need to visit the Jenkins URL. Once we are on the Jenkins page, we click "New Job" and then "Build a free-style software project". This job consists of several components: an SCM, for instance CVS or Subversion, where our source code resides; optional triggers to control when builds will be performed by Jenkins; some sort of script that performs the build (Ant, Maven, shell script and so on), where the actual work happens; and optional steps to collect information out of the build, for instance archiving the artifacts along with recording Javadoc and test results.

Please find the below steps for creating jobs in Jenkins :

  • Stage 1: First of all we need to sign in to the Jenkins dashboard to create a freestyle job. Usually, Jenkins is hosted locally at http://localhost:8080; if we have configured Jenkins in some other way, use the hosted URL to get to our Jenkins dashboard.

Jenkins dashboard

  • Stage 2: In the upper left-hand side of our dashboard click the "New Item".

upper left-hand

  • Stage 3: On the following screen, we need to enter the name of the item we would like to create. We will create "Hello world" for this exercise: choose Freestyle project and click OK.

upper left-hand

  • Stage 4: Let's enter some project details as in the screen below:

upper left-hand

  • Step 5: Now we can click on the Source Code Management section and enter our repository URL. We have a test repository hosted at https://github.com/*/*.git

Source Code Management

We can also leverage a local repo if we have one.

If our GitHub repo is private, Jenkins will first validate our login credentials with GitHub and, after validation, pull the source code from our GitHub repo.

  • Stage 6: Once all configuration settings are completed, it's time to build the code. We can change the settings under the build section to build the code whenever we need. The build can also be scheduled to trigger at a specified time.

In the build section we need to do the two things below:

  1. First click on "Add build step".
  2. Then click on "Execute Windows batch command" to enter the commands we want to run during the build process.

Execute Windows batch command

Let's add some sample java commands to test the workflow :

We can add below commands for our testing :

javac HelloWorld.java
java HelloWorld

commands for our testing

  • Stage 7: Once we are done with all the changes, we can click Apply and then save the project.
  • Stage 8: Click the Build Now button on the left-hand side to trigger the build.

left-hand side to trigger the build

  • Stage 9: Under build history section we can check the status of every build triggered.

every build triggered

  • Step 10: Click on the build number and then Click on console output to see the status of the build you run. It should show you a success message, provided you have followed the setup properly.

Console Output

Some most useful plugins in Jenkins:

  • Amazon ECS Container Service

The objective of the Amazon ECS Container Service plugin is to manage Jenkins agents hosted in the cloud and, at the same time, to help in deploying Docker-based applications to the cloud. Each Jenkins build is carried out in a separate Docker container which gets cleaned up afterwards.

  • Dashboard View Plugin

This dashboard gives a bird's-eye view of the status of the tasks configured in Jenkins and is used for monitoring purposes. It also helps us track the time taken to build the configured jobs and the total duration across all the jobs.

  • View Job Filters Plugin

It creates views for Jenkins jobs. We can have a view of build status and different triggers.

  • Build Pipeline Plugin

This gives us a clear sight of the jobs making up our build pipeline, and a better view of upstream and downstream jobs. Additionally, it helps us define manual triggers for tasks which need some customization before executing. It is one of the critical plugins, as it has built-in support for scripts which help in building complex DevOps pipelines.

  • Git Plugin

This plug-in is needed if we need to access GitHub, and at the same time it works as a repository browser for other SCM providers.

  • GitHub Integration Plugin

This is one of the basic plug-ins which we need to get the source code from a code repository hosted with Git. It helps us in scheduling builds and pulling the code base at regular intervals from Git into Jenkins. The build gets triggered automatically once scheduled.

An agent describes where the build pipeline, or specific stages of it, will be executed. Inside the pipeline block, an agent is declared at the top level, and its use at the stage level is optional.

  1. The agent section determines whether the entire pipeline or a particular stage runs in a given Jenkins environment, depending on where the agent section is located. The agent section must be declared at the top level inside the pipeline block, and its use at the stage level is optional. You can visit the URL below to get more details on agent syntax in the Jenkins pipeline: https://jenkins.io/doc/book/pipeline/syntax/#agent
  2. There are many ways to create an agent/node, but we can follow the tutorial below to learn the steps to create one: https://devopscube.com/setup-slaves-on-jenkins-2/
  3. Different parameters are supported by agent to cover various use cases. These parameters work at the top level of the pipeline block or under a stage directive as well, as illustrated below.
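Illustrative agent declarations (the label and Docker image are placeholder values; the docker form also assumes the Docker Pipeline plugin is installed):

pipeline {
    agent none                                        // no global agent; each stage declares its own
    stages {
        stage('Build') {
            agent { label 'linux' }                   // run on any agent carrying this label
            steps {
                echo 'Building on a labelled agent'
            }
        }
        stage('Package') {
            agent { docker { image 'maven:3-jdk-11' } }   // run inside a Docker container
            steps {
                echo 'Packaging inside a container'
            }
        }
    }
}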

Jenkins agent

The Jenkins environment variables often prove useful when you have to write some advanced shell scripts. Moreover, if you know how to inject environment variables into the Jenkins build process, it can open up a whole new world of technical possibilities, as it gives you access to some of the product's internals.

A simple way to get the Jenkins environment variables list from your local installation is to append env-vars.html to the server's URL. For a locally hosted Jenkins server, the URL would be http://localhost:8080/env-vars.html.

Jenkins environment variable

The easiest way to see how these Jenkins environment variables work is to create a freestyle job, echo each entry in the list and see the value Jenkins assigns to each property. By default, a few environment variables are always available in a build job. Some of them are available only when a relevant plug-in is configured in the Jenkins setup; for example, some Git-related variables are available only when the Git plugin is configured.
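A quick sketch of such a job as a pipeline that echoes a few of the environment variables Jenkins provides by default:

pipeline {
    agent any
    stages {
        stage('Show variables') {
            steps {
                echo "Job name:     ${env.JOB_NAME}"
                echo "Build number: ${env.BUILD_NUMBER}"
                echo "Build URL:    ${env.BUILD_URL}"
                echo "Workspace:    ${env.WORKSPACE}"
            }
        }
    }
}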

This, along with other Jenkins basic questions for freshers, is a regular feature in Jenkins interviews, be ready to tackle it with the approach mentioned below.

A Jenkins pipeline is configured to build a project by pulling it from source control and then ensuring that the build goes through different stages such as unit, performance, and user acceptance testing. Once every stage succeeds, it also facilitates deployment to an application server. Overall, the stages any project goes through can be classified into three broad categories:

  • Build - This stage ensures the code is extracted from the code repository and compiled; in case of any failure, developers learn why the build broke. This is a very critical stage in the build pipeline, and subsequent stages are triggered only when the project exits this stage successfully.
  • Test - This stage ensures the build is unit, performance, and user tested, so issues are caught at an early stage.
  • Deploy - This stage takes care of the deployment request once testing is successful. It is the last stage in the pipeline.

The above stages can be further divided into smaller steps, which helps in understanding what the three primary stages of a Jenkins pipeline cover (a minimal pipeline sketch follows this list):

  • Pull the code from the source repository using the proper plugin; for Git-hosted source code we can use the Git plugin, and so on.
  • Once the code is extracted, compile it using a compatible build tool, such as the Maven plugin for Java code.
  • Conformance with coding standards is also well supported by some of the plugins available in Jenkins; we can use the Checkstyle plugin for this.
  • Code health checks are well supported by tools such as SonarQube, PMD, or FindBugs.
  • By incorporating Groovy syntax, we can easily get manual sign-off from business users.
  • We can run different tests to measure the application's performance under load.
  • It also helps in packaging the application into a ready-to-deploy form, for example the WAR format for a Java web application project.
  • It also facilitates deployment of binaries to an artifact repository such as Nexus.
  • It also helps in storing most of the reports for future reference.
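
Pulling those pieces together, here is a minimal sketch of a three-stage declarative pipeline for a Maven-based Java project. The repository URL, branch, Maven goals, and deployment step are placeholders rather than a definitive implementation.

```groovy
// Minimal Build / Test / Deploy sketch for a Maven project.
// Repository URL, goals and the deploy step are illustrative placeholders.
pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                git url: 'https://github.com/example/sample-app.git', branch: 'main'
                sh 'mvn -B clean compile'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn -B test'
            }
            post {
                always {
                    // Publish JUnit results so failures are visible early.
                    junit 'target/surefire-reports/*.xml'
                }
            }
        }
        stage('Deploy') {
            steps {
                sh 'mvn -B package'
                echo 'Deploy the WAR from target/ to the application server here'
            }
        }
    }
}
```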

There are two ways we can start a Jenkins node agent:

  1. We can start a Node agent from a browser window itself.
  2. There is another way of starting an agent from the command line as well.

When we start an agent from a browser window, a JNLP file is downloaded; running this file spawns a separate process that launches Jenkins jobs. A different process runs when the agent is launched from the command line: an agent JAR file is required on the client machine and is launched from the command line, but it still refers to the agent JNLP file available on the Jenkins server. The command line starts a process on the client machine that communicates with the Jenkins master and triggers build jobs when it identifies idle clock cycles.

There are five key areas in which tools can assist in a DevOps transition:

  • Configuration management: There are various configuration management tools; Chef, Puppet, and Ansible are considered among the best.
  • Source code management: This helps an enterprise manage source code in such a way that a distributed team can work efficiently. Several popular source code tools are available; Git, GitHub, and GitLab are a few of them.
  • CI: Jenkins is the leading CI tool and the most popular one if industry trends are considered. There are other CI tools as well, such as Concourse CI and Atlassian's Bamboo.
  • Containerization: Docker is the market leader in this area, but there are others as well, such as rkt and LXD.
  • Collaboration: JIRA from Atlassian is an excellent project management product and at the same time can be a very solid collaboration tool for an organization. It is designed specifically to capture, assign, and prioritize tasks across the execution of a project, and its simple, intuitive interface helps anyone with no prior knowledge get a grasp of it quickly.

Jenkins has two ways of creating pipelines: declarative and scripted. Scripted pipelines, also called traditional pipelines, depend on Groovy syntax. The declarative one is easier, as its syntax is more simplified. Declarative pipelines were introduced with Pipeline 2.5 and are the newer offering that streamlines the traditional Groovy syntax of Jenkins pipelines with some specific rules; for example, no semicolons may be used as statement separators, and the whole definition must be enclosed inside a top-level pipeline block.

The regular syntax structure for a declarative pipeline is shown below:
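A minimal skeleton, with illustrative stage and step names:

```groovy
// Generic declarative pipeline skeleton; stage and step contents are illustrative.
pipeline {
    agent any                 // where the pipeline (or a stage) runs

    stages {
        stage('Example stage') {
            steps {
                echo 'Steps for this stage go here'
            }
        }
    }
}
```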

The declarative pipeline has three major sections, explained below:

  • Pipeline: The section of the script where we write the declarative pipeline. It is an umbrella under which all other sections reside.
  • Agent: This defines where the pipeline starts executing.
  • Stage: A stage groups the steps that make up one logical part of the pipeline.

A staple in Jenkins advanced interview questions with answers, be prepared to answer this one using your hands-on experience. This is also one of the top interview questions to ask a Jenkins Developer.

The Jenkins backup plugin is used to take a backup of configurations and settings so they can be restored later if there is any failure. We can follow the steps below to back up our settings using the ThinBackup plugin.

Step 1: Log in to the Jenkins server and click on Manage Jenkins.


Step 2: Click on Manage Plugins, which opens the plugin management page.


Step 3: Open the Available tab and search for ThinBackup in the filter box. Select it, and the plugin will be installed in the background.


Step 4: After successful installation, open the ThinBackup section and click Settings.


Step 5: Fill in the basic details, such as the backup directory, and save the settings. The backup will be stored in that backup directory.


Step 6: We can test whether the backup works by clicking Backup Now.


Step 7: Navigate to the backup directory defined in the ThinBackup settings to check whether a backup exists.

Description

Jenkins is a very important tool in DevOps that, with the help of plugins, automates the rest of the delivery process once changes are made in a source code repository (SCR). If you are applying for any DevOps role or any cloud-native practitioner role, you can encounter many Jenkins interview questions. A Jenkins course also helps you prepare for these interviews with hands-on projects.

Following are the most frequently asked interview questions and answers on Jenkins. These questions can be useful for both freshers and experienced professionals looking to clear the interview.

When an organization is seeking to fill vacancies for DevOps roles, it mostly looks for the following set of skills in an individual:

  • An individual should be fluent in web languages like Ruby, Python, PHP or Java.
  • An individual should be aware of the various infrastructure automation tools like Chef, Puppet, Ansible, SaltStack or Windows PowerShell DSC.
  • Interpersonal skills that help you communicate and collaborate across teams and roles.

This skill set will make you eligible for the interview. If you are looking to build more job-relevant skills, consider taking some IT programming courses along the way. The Jenkins basic interview questions and answers mentioned here will help you prepare for the interview. Here is a list of the top Jenkins interview questions and answers.
