Categories
Bibliography DevOps DevSecOps-Security-Privacy Software Engineering

B078Y98RG8

See: The Phoenix Project: A Novel about IT, DevOps, and Helping Your Business Win

Fair Use Source:

Categories
AWS Azure Cloud DevOps GCP Linux Operating Systems Software Engineering

List of Linux containers

“Linux containers are implementations of operating system-level virtualization for the Linux operating system. Several implementations exist, all based on the virtualization, isolation, and resource management mechanisms provided by the Linux kernel, notably Linux namespaces and cgroups.[1] Implementations referenced below include LXC,[2] LXD,[3] rkt,[4] systemd-nspawn,[7] Podman,[8] Charliecloud,[9] and Bottlerocket.[10]” (WP)

See also

References

  1. ^ Rosen, Rami. “Namespaces and Cgroups, the basis of Linux Containers” (PDF). Retrieved 18 August 2016.
  2. ^ “LXC – Linux Containers”. linuxcontainers.org. Retrieved 2014-11-10.
  3. ^ “LXD”. linuxcontainers.org. Retrieved 2021-02-11.
  4. ^ “Rkt container engine”.
  5. ^ “CNCF Archives RKT”. CNCF. Retrieved 19 August 2019.
  6. ^ “Red Hat to Acquire CoreOS”. Red Hat, Inc. Retrieved 30 January 2018.
  7. ^ Poettering, Lennart. “systemd For Administrators, Part XXI”. Retrieved 2 July 2016.
  8. ^ Rootless containers with Podman and fuse-overlayfs, CERN Workshop, 2019-06-04.
  9. ^ https://hpc.github.io/charliecloud/. Retrieved 4 October 2020.
  10. ^ “Bottlerocket is a Linux-based operating system purpose-built to run containers”.

Categories

” (WP)

Sources:

Fair Use Sources:

Categories
Cloud DevOps DevSecOps-Security-Privacy Linux Software Engineering

DevOps toolchain

See also: CloudOps, toolchain

“A DevOps toolchain is a set or combination of tools that aid in the delivery, development, and management of software applications throughout the systems development life cycle, as coordinated by an organization that uses DevOps practices.

Generally, DevOps tools fit into one or more activities, which support specific DevOps initiatives: Plan, Create, Verify, Package, Release, Configure, Monitor, and Version Control.[1][2]” (WP)

Toolchains

“In software, a toolchain is the set of programming tools that is used to perform a complex software development task or to create a software product, which is typically another computer program or a set of related programs. In general, the tools forming a toolchain are executed consecutively so the output or resulting environment state of each tool becomes the input or starting environment for the next one, but the term is also used when referring to a set of related tools that are not necessarily executed consecutively.[3][4][5]

As DevOps is a set of practices that emphasizes the collaboration and communication of both software developers and other information technology (IT) professionals, while automating the process of software delivery and infrastructure changes, its implementation can include the definition of the series of tools used at various stages of the lifecycle; because DevOps is a cultural shift and collaboration between development and operations, there is no one product that can be considered a single DevOps tool. Instead a collection of tools, potentially from a variety of vendors, are used in one or more stages of the lifecycle.[6][7]” (WP)

Stages of DevOps

Further information: DevOps

Plan

Plan is composed of two things: “define” and “plan”.[8] This activity refers to the business value and application requirements. Specifically “Plan” activities include:

  • Production metrics, objects and feedback
  • Requirements
  • Business metrics
  • Update release metrics
  • Release plan, timing and business case
  • Security policy and requirement

A combination of IT personnel will be involved in these activities: business application owners, software developers, software architects, continual release management, security officers and the organization responsible for managing the production of IT infrastructure.

Create

Create is composed of the building (see also build automation), coding, and configuring of the software development process.[8] The specific activities are:

Tools and vendors in this category often overlap with other categories. Because DevOps is about breaking down silos, this is reflected in the activities and product solutions.

Verify

Verify is directly associated with ensuring the quality of the software release: activities designed to ensure that code quality is maintained and that the highest quality is deployed to production.[8] The main activities in this are:

Solutions for Verify-related activities generally fall under four main categories: test automation, static analysis, test lab, and security.

Packaging

Packaging refers to the activities involved once the release is ready for deployment, often also referred to as staging or Preproduction / “preprod”.[8] This often includes tasks and activities such as:

  • Approval/preapprovals
  • Package configuration
  • Triggered releases
  • Release staging and holding

Release

Release-related activities include scheduling, orchestrating, provisioning, and deploying software into production and targeted environments.[9] The specific Release activities include:

  • Release coordination
  • Deploying and promoting applications
  • Fallbacks and recovery
  • Scheduled/timed releases

Solutions that cover this aspect of the toolchain include application release automation, deployment automation and release management.

Configure

Configure activities fall under the operation side of DevOps. Once software is deployed, there may be additional IT infrastructure provisioning and configuration activities required.[8] Specific activities include:

  • Infrastructure storage, database and network provisioning and configuring
  • Application provision and configuration.

The main types of solutions that facilitate these activities are continuous configuration automation, configuration management, and infrastructure as code tools.[10]

Monitor

Monitoring is an important link in a DevOps toolchain. It allows IT organizations to identify issues with specific releases and to understand the impact on end-users.[8] Monitor-related activities include:

  • Performance of IT infrastructure
  • End-user response and experience
  • Production metrics and statistics

Information from monitoring activities often impacts Plan activities required for changes and for new release cycles.

Version Control

Version Control is an important link in a DevOps toolchain and a component of software configuration management. Version Control is the management of changes to documents, computer programs, large websites, and other collections of information.[8] Version Control-related activities include:

  • Non-linear development
  • Distributed development
  • Compatibility with existent systems and protocols
  • Toolkit-based design

Information from Version Control often supports Release activities required for changes and for new release cycles.

See also

References

  1. ^ Edwards, Damon. “Integrating DevOps tools into a Service Delivery Platform”. dev2ops.org.
  2. ^ Seroter, Richard. “Exploring the ENTIRE DevOps Toolchain for (Cloud) Teams”. infoq.com.
  3. ^ “Toolchain Overview”. nongnu.org. 2012-01-03. Retrieved 2013-10-21.
  4. ^ “Toolchains”. elinux.org. 2013-09-08. Retrieved 2013-10-21.
  5. ^ Imran, Saed; Buchheit, Martin; Hollunder, Bernhard; Schreier, Ulf (2015-10-29). Tool Chains in Agile ALM Environments: A Short Introduction. Lecture Notes in Computer Science, vol. 9416. pp. 371–380. doi:10.1007/978-3-319-26138-6_40. ISBN 978-3-319-26137-9.
  6. ^ Loukides, Mike (2012-06-07). “What is DevOps?”.
  7. ^ Gartner Market Trends: DevOps – Not a Market, but Tool-Centric Philosophy That Supports a Continuous Delivery Value Chain (Report). Gartner. 18 February 2015.
  8. a b c d e f g Avoid Failure by Developing a Toolchain that Enables DevOps (Report). Gartner. 16 March 2016.
  9. ^ Best Practices in Change, Configuration and Release Management (Report). Gartner. 14 July 2010.
  10. ^ Roger S. Pressman (2009). Software Engineering: A Practitioner’s Approach (7th International ed.). New York: McGraw-Hill.

Categories

Sources:

Fair Use Sources:

Categories
DevOps Java Kotlin Software Engineering

Apache Maven build automation tool

Developer(s): Apache Software Foundation
Initial release: 13 July 2004
Stable release: 3.6.3 / 25 November 2019[1]
Repository: Maven Repository
Written in: Java
Type: Build tool
License: Apache License 2.0
Website: maven.apache.org

Maven is a build automation tool used primarily for Java projects. Maven can also be used to build and manage projects written in C#, Ruby, Scala, and other languages. The Maven project is hosted by the Apache Software Foundation, where it was formerly part of the Jakarta Project.

Maven addresses two aspects of building software: how software is built, and its dependencies. Unlike earlier tools like Apache Ant, it uses conventions for the build procedure, and only exceptions need to be written down. An XML file describes the software project being built, its dependencies on other external modules and components, the build order, directories, and required plug-ins. It comes with pre-defined targets for performing certain well-defined tasks such as compilation of code and its packaging. Maven dynamically downloads Java libraries and Maven plug-ins from one or more repositories such as the Maven 2 Central Repository, and stores them in a local cache.[2] This local cache of downloaded artifacts can also be updated with artifacts created by local projects. Public repositories can also be updated.

Maven is built using a plugin-based architecture that allows it to make use of any application controllable through standard input. A C/C++ native plugin is maintained for Maven 2.[3]

Alternative build tools like Gradle and sbt do not rely on XML, but keep the key concepts Maven introduced. With Apache Ivy, a dedicated dependency manager was developed as well that also supports Maven repositories.[4]

Apache Maven has support for reproducible builds.[5][6]

History

The number of artifacts on Maven’s central repository has grown rapidly

Maven, created by Jason van Zyl, began as a sub-project of Apache Turbine in 2002. In 2003, it was voted on and accepted as a top-level Apache Software Foundation project. Maven’s first critical milestone release, v1.0, came in July 2004. Maven 2 was declared v2.0 in October 2005 after about six months in beta cycles. Maven 3.0 was released in October 2010 and was mostly backwards compatible with Maven 2.

Maven 3.0 information began trickling out in 2008. After eight alpha releases, the first beta version of Maven 3.0 was released in April 2010. Maven 3.0 has reworked the core Project Builder infrastructure resulting in the POM’s file-based representation being decoupled from its in-memory object representation. This has expanded the possibility for Maven 3.0 add-ons to leverage non-XML based project definition files. Languages suggested include Ruby (already in private prototype by Jason van Zyl), YAML, and Groovy.

Special attention was given to ensuring backward compatibility of Maven 3 to Maven 2. For most projects, upgrading to Maven 3 will not require any adjustments of their project structure. The first beta of Maven 3 saw the introduction of a parallel build feature which leverages a configurable number of cores on a multi-core machine and is especially suited for large multi-module projects.

Syntax

A directory structure for a Java project auto-generated by Maven

Maven projects are configured using a Project Object Model, which is stored in a pom.xml file. An example file looks like:

<project>
  <!-- model version is always 4.0.0 for Maven 2.x POMs -->
  <modelVersion>4.0.0</modelVersion>
  <!-- project coordinates, i.e. a group of values which uniquely identify this project -->
  <groupId>com.mycompany.app</groupId>
  <artifactId>my-app</artifactId>
  <version>1.0</version>
  <!-- library dependencies -->
  <dependencies>
    <dependency>
      <!-- coordinates of the required library -->
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <!-- this dependency is only used for running and compiling tests -->
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>

This POM only defines a unique identifier for the project (coordinates) and its dependency on the JUnit framework. However, that is already enough for building the project and running the unit tests associated with the project. Maven accomplishes this by embracing the idea of Convention over Configuration, that is, Maven provides default values for the project’s configuration.

The directory structure of a normal idiomatic Maven project has the following directory entries:

Directory name – Purpose
project home – Contains the pom.xml and all subdirectories.
src/main/java – Contains the deliverable Java source code for the project.
src/main/resources – Contains the deliverable resources for the project, such as property files.
src/test/java – Contains the testing Java source code (JUnit or TestNG test cases, for example) for the project.
src/test/resources – Contains resources necessary for testing.

The command mvn package will compile all the Java files, run any tests, and package the deliverable code and resources into target/my-app-1.0.jar (assuming the artifactId is my-app and the version is 1.0).

Using Maven, the user provides only configuration for the project, while the configurable plug-ins do the actual work of compiling the project, cleaning target directories, running unit tests, generating API documentation and so on. In general, users should not have to write plugins themselves. Contrast this with Ant and make, in which one writes imperative procedures for doing the aforementioned tasks.

Design

Project Object Model

A Project Object Model (POM) provides all the configuration for a single project. General configuration covers the project’s name, its owner and its dependencies on other projects. One can also configure individual phases of the build process, which are implemented as plugins. For example, one can configure the compiler-plugin to use Java version 1.5 for compilation, or specify packaging the project even if some unit tests fail.

Larger projects should be divided into several modules, or sub-projects, each with its own POM. One can then write a root POM through which one can compile all the modules with a single command. POMs can also inherit configuration from other POMs. All POMs inherit from the Super POM[7] by default. The Super POM provides default configuration, such as default source directories, default plugins, and so on.
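As a minimal sketch of such a multi-module setup (the module names here are hypothetical), a root POM declares pom packaging and lists its sub-projects, so that a single build from the root builds every module:

<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.mycompany.app</groupId>
  <artifactId>my-app-parent</artifactId>
  <version>1.0</version>
  <!-- "pom" packaging marks this POM as an aggregator/parent rather than a buildable artifact -->
  <packaging>pom</packaging>
  <!-- each module is a subdirectory containing its own pom.xml -->
  <modules>
    <module>my-app-core</module>
    <module>my-app-web</module>
  </modules>
</project>

A child POM can in turn reference these coordinates in a <parent> element to inherit shared configuration, in the same way that every POM ultimately inherits from the Super POM.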

Plug-ins

Most of Maven’s functionality is in plug-ins. A plugin provides a set of goals that can be executed using the command mvn [plugin-name]:[goal-name]. For example, a Java project can be compiled with the compiler-plugin’s compile-goal[8] by running mvn compiler:compile.

There are Maven plugins for building, testing, source control management, running a web server, generating Eclipse project files, and much more.[9] Plugins are introduced and configured in a <plugins>-section of a pom.xml file. Some basic plugins are included in every project by default, and they have sensible default settings.
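For illustration, a minimal sketch of such a <plugins> section, placed inside a <build> element of the pom.xml; it configures the standard maven-compiler-plugin, with the plugin version and Java level shown here only as example values:

<build>
  <plugins>
    <plugin>
      <!-- plugin coordinates are resolved from a repository like any other artifact -->
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.8.1</version>
      <configuration>
        <!-- example values: compile sources and emit bytecode for Java 1.8 -->
        <source>1.8</source>
        <target>1.8</target>
      </configuration>
    </plugin>
  </plugins>
</build>

Without such a section, the defaults inherited from the Super POM still apply, which is why the small POM shown earlier is already buildable.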

However, it would be cumbersome if the archetypal build sequence of building, testing and packaging a software project required running each respective goal manually:

  • mvn compiler:compile
  • mvn surefire:test
  • mvn jar:jar

Maven’s lifecycle concept handles this issue.

Plugins are the primary way to extend Maven. Developing a Maven plugin can be done by extending the org.apache.maven.plugin.AbstractMojo class. Example code and explanation for a Maven plugin to create a cloud-based virtual machine running an application server is given in the article Automate development and management of cloud virtual machines.[10]

Build lifecycles

The build lifecycle is a list of named phases that can be used to give order to goal execution. One of Maven’s standard lifecycles is the default lifecycle, which includes the following phases, in this order:[11]

  • validate
  • generate-sources
  • process-sources
  • generate-resources
  • process-resources
  • compile
  • process-test-sources
  • process-test-resources
  • test-compile
  • test
  • package
  • install
  • deploy

Goals provided by plugins can be associated with different phases of the lifecycle. For example, by default, the goal “compiler:compile” is associated with the “compile” phase, while the goal “surefire:test” is associated with the “test” phase. When the mvn test command is executed, Maven runs all goals associated with each of the phases up to and including the “test” phase. In such a case, Maven runs the “resources:resources” goal associated with the “process-resources” phase, then “compiler:compile”, and so on until it finally runs the “surefire:test” goal.
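As a sketch of how such an association can be declared explicitly (the plugin choice and the echoed message are only an illustration), a goal is bound to a phase through an <executions> block within the POM’s <plugins> section:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <id>announce-package</id>
      <!-- bind this plugin's "run" goal to the "package" phase of the default lifecycle -->
      <phase>package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <echo message="packaging finished"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>

With this in place, mvn package (or any later phase) also runs the bound goal when the package phase is reached.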

Maven also has standard phases for cleaning the project and for generating a project site. If cleaning were part of the default lifecycle, the project would be cleaned every time it was built. This is clearly undesirable, so cleaning has been given its own lifecycle.

Standard lifecycles enable users new to a project to accurately build, test and install every Maven project by issuing the single command mvn install. By default, Maven packages the POM file in generated JAR and WAR files. Tools like diet4j[12] can use this information to recursively resolve and run Maven modules at run-time without requiring an “uber”-jar that contains all project code.

Dependencies

A central feature in Maven is dependency management. Maven’s dependency-handling mechanism is organized around a coordinate system identifying individual artifacts such as software libraries or modules. The POM example above references the JUnit coordinates as a direct dependency of the project. A project that needs, say, the Hibernate library simply has to declare Hibernate’s project coordinates in its POM. Maven will automatically download the dependency and the dependencies that Hibernate itself needs (called transitive dependencies) and store them in the user’s local repository. Maven 2 Central Repository[2] is used by default to search for libraries, but one can configure the repositories to be used (e.g., company-private repositories) within the POM.
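A sketch of such a repository declaration inside the POM (the id and URL are placeholders for a hypothetical company-private repository; Maven Central remains available unless explicitly overridden):

<repositories>
  <repository>
    <!-- hypothetical internal repository, searched in addition to Maven Central -->
    <id>company-internal</id>
    <url>https://repo.example.com/maven2</url>
  </repository>
</repositories>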

The fundamental difference between Maven and Ant is that Maven’s design regards all projects as having a certain structure and a set of supported task work-flows (e.g., getting resources from source control, compiling the project, unit testing, etc.). While most software projects in effect support these operations and actually do have a well-defined structure, Maven requires that this structure and the operation implementation details be defined in the POM file. Thus, Maven relies on a convention on how to define projects and on the list of work-flows that are generally supported in all projects.[13]

There are search engines such as The Central Repository Search Engine[14] which can be used to find out coordinates for different open-source libraries and frameworks.

Projects developed on a single machine can depend on each other through the local repository. The local repository is a simple folder structure that acts both as a cache for downloaded dependencies and as a centralized storage place for locally built artifacts. The Maven command mvn install builds a project and places its binaries in the local repository. Then other projects can utilize this project by specifying its coordinates in their POMs.

Interoperability

Add-ons to several popular integrated development environments targeting the Java programming language exist to provide integration of Maven with the IDE’s build mechanism and source editing tools, allowing Maven to compile projects from within the IDE, and also to set the classpath for code completion, highlighting compiler errors, etc. Examples of popular IDEs supporting development with Maven include Eclipse,[15] IntelliJ IDEA,[16] and NetBeans.[17]

These add-ons also provide the ability to edit the POM or use the POM to determine a project’s complete set of dependencies directly within the IDE.

Some built-in features of IDEs are forfeited when the IDE no longer performs compilation. For example, Eclipse’s JDT has the ability to recompile a single Java source file after it has been edited. Many IDEs work with a flat set of projects instead of the hierarchy of folders preferred by Maven. This complicates the use of SCM systems in IDEs when using Maven.[15][16][17]

See also

References

  1. ^ “Apache Projects Releases”. projects.apache.org.
  2. a b “Index of /maven2/”. Archived from the original on 2018-09-17. Retrieved 2009-04-15.
  3. ^ Laugstol, Trygve. “MojoHaus Native Maven Plugin”.
  4. ^ “IBiblio Resolver | Apache Ivy™”.
  5. ^ “Reproducible/Verifiable Builds – Apache Maven – Apache Software Foundation”. cwiki.apache.org.
  6. ^ “Reproducible Builds in Java – DZone Java”. dzone.com.
  7. ^ Super POM
  8. ^ Punzalan, Edwin. “Apache Maven Compiler Plugin – Introduction”.
  9. ^ Marbaise, Karl-Heinz; Porter, Brett; van Zyl, Jason; Lundberg, Dennis; Lamy, Olivier; Margulies, Benson. “Maven – Available Plugins”.
  10. ^ Amies, Alex; Zou P X; Wang Yi S (29 Oct 2011). “Automate development and management of cloud virtual machines”. IBM developerWorks. IBM.
  11. ^ Porter, Brett. “Maven – Introduction to the Build Lifecycle”.
  12. ^ “diet4j – put Java JARs on a diet, and load maven modules as needed”.
  13. ^ “Maven: The Complete Reference”. Sonatype. Archived from the original on 21 April 2013. Retrieved 11 April 2013.
  14. ^ The Central Repository Search Engine.
  15. ^ “maven.apache.org/eclipse-plugin.html”. Archived from the original on May 7, 2015.
  16. ^ “IntelliJ IDEA :: Features”.
  17. ^ “MavenBestPractices – NetBeans Wiki”.



Categories

” (WP)

Sources:

Fair Use Sources:

Categories
Software Engineering

Package manager – Package management system – Software package (installation)

A package manager or package-management system is a collection of software tools that automates the process of installing, upgrading, configuring, and removing computer programs for a computer’s operating system in a consistent manner.[1]

A package manager deals with packages, distributions of software and data in archive files. Packages contain metadata, such as the software’s name, description of its purpose, version number, vendor, checksum (preferably a cryptographic hash function), and a list of dependencies necessary for the software to run properly. Upon installation, metadata is stored in a local package database. Package managers typically maintain a database of software dependencies and version information to prevent software mismatches and missing prerequisites. They work closely with software repositories, binary repository managers, and app stores.

Package managers are designed to eliminate the need for manual installs and updates. This can be particularly useful for large enterprises whose operating systems typically consist of hundreds or even tens of thousands of distinct software packages.[2]

Functions

Illustration of a package manager being used to download new software. Manual actions can include accepting a license agreement or selecting some package-specific configuration options.

A software package is an archive file containing a computer program as well as necessary metadata for its deployment. The computer program can be in source code that has to be compiled and built first.[3] Package metadata include package description, package version, and dependencies (other packages that need to be installed beforehand).

Package managers are charged with the task of finding, installing, maintaining or uninstalling software packages upon the user’s command. Typical functions of a package management system include:

  • Working with file archivers to extract package archives
  • Ensuring the integrity and authenticity of the package by verifying their checksums and digital certificates, respectively
  • Looking up, downloading, installing, or updating existing software from a software repository or app store
  • Grouping packages by function to reduce user confusion
  • Managing dependencies to ensure a package is installed with all packages it requires, thus avoiding “dependency hell”

Challenges with shared libraries

Computer systems that rely on dynamic library linking, instead of static library linking, share executable libraries of machine instructions across packages and applications. In these systems, complex relationships between different packages requiring different versions of libraries result in a challenge colloquially known as “dependency hell”. On Microsoft Windows systems, this is also called “DLL hell” when working with dynamically linked libraries. Good package management is vital on these systems.[4] The Framework system from OPENSTEP was an attempt at solving this issue, by allowing multiple versions of libraries to be installed simultaneously, and for software packages to specify which version they were linked against.

Front-ends for locally compiled packages

System administrators may install and maintain software using tools other than package management software. For example, a local administrator may download unpackaged source code, compile it, and install it. This may cause the state of the local system to fall out of synchronization with the state of the package manager’s database. The local administrator will be required to take additional measures, such as manually managing some dependencies or integrating the changes into the package manager.

There are tools available to ensure that locally compiled packages are integrated with the package management. For distributions based on .deb and .rpm files as well as Slackware Linux, there is CheckInstall, and for recipe-based systems such as Gentoo Linux and hybrid systems such as Arch Linux, it is possible to write a recipe first, which then ensures that the package fits into the local package database.

Maintenance of configuration

Particularly troublesome with software upgrades are upgrades of configuration files. Since package managers, at least on Unix systems, originated as extensions of file archiving utilities, they can usually only either overwrite or retain configuration files, rather than applying rules to them. There are exceptions to this that usually apply to kernel configuration (which, if broken, will render the computer unusable after a restart). Problems can be caused if the format of configuration files changes; for instance, if the old configuration file does not explicitly disable new options that should be disabled. Some package managers, such as Debian‘s dpkg, allow configuration during installation. In other situations, it is desirable to install packages with the default configuration and then overwrite this configuration, for instance, in headless installations to a large number of computers. This kind of pre-configured installation is also supported by dpkg.

Repositories

To give users more control over the kinds of software that they are allowing to be installed on their system (and sometimes due to legal or convenience reasons on the distributors’ side), software is often downloaded from a number of software repositories.[5]

Upgrade suppression

When a user interacts with the package management software to bring about an upgrade, it is customary to present the user with the list of actions to be executed (usually the list of packages to be upgraded, and possibly giving the old and new version numbers), and allow the user to either accept the upgrade in bulk, or select individual packages for upgrades. Many package managers can be configured to never upgrade certain packages, or to upgrade them only when critical vulnerabilities or instabilities are found in the previous version, as defined by the packager of the software. This process is sometimes called version pinning.

For instance:

  • yum supports this with the syntax exclude=openoffice*[6]
  • pacman with IgnorePkg= openoffice[7] (to suppress upgrading openoffice in both cases)
  • dpkg and dselect support this partially through the hold flag in package selections
  • APT extends the hold flag through the complex “pinning” mechanism[8] (Users can also blacklist a package[9])
  • aptitude has “hold” and “forbid” flags
  • portage supports this through the package.mask configuration file

Cascading package removal

Some of the more advanced package management features offer “cascading package removal”,[7] in which all packages that depend on the target package and all packages that only the target package depends on, are also removed.

Comparison of commands

Although the commands are specific for every particular package manager, they are to a large extent translatable, as most package managers offer similar functions.

Action | zypper[10] | pacman | apt | dnf (yum) | portage
install package | zypper in PKG | pacman -S PACKAGE | apt install PACKAGE | dnf install PACKAGE | emerge PACKAGE
remove package | zypper rm -RU PKG | pacman -R PACKAGE | apt remove PACKAGE | dnf remove --nodeps PACKAGE | emerge -C PACKAGE or emerge --unmerge PACKAGE
remove package + orphans | zypper rm -u --force-resolution PKG | pacman -Rs PACKAGE | apt autoremove PACKAGE | dnf remove PACKAGE | emerge -c PACKAGE or emerge --depclean PACKAGE
update software database | zypper ref | pacman -Sy | apt update | dnf check-update | emerge --sync
show updatable packages | zypper lu | pacman -Qu | apt list --upgradable | dnf check-update | emerge -avtuDN --with-bdeps=y @world or emerge --update --pretend @world
delete orphans + config | zypper rm -u | pacman -Rsn $(pacman -Qdtq) | apt autoremove | dnf erase PKG | emerge --depclean
show orphans | zypper pa --orphaned --unneeded | pacman -Qdt | – | package-cleanup --quiet --leaves --exclude-bin | emerge -caD or emerge --depclean --pretend
update all | zypper up | pacman -Syu | apt upgrade | dnf update | emerge --update --deep --with-bdeps=y @world

The Arch Linux Pacman/Rosetta wiki offers an extensive overview.[11]

Prevalence

Package managers like dpkg have existed since as early as 1994.[12]

Linux distributions oriented to binary packages rely heavily on package management systems as their primary means of managing and maintaining software. Mobile operating systems such as Android (Linux-based), iOS (Unix-like), and Windows Phone rely almost exclusively on their respective vendors’ app stores and thus use their own dedicated package management systems.

Comparison with installers

A package manager is often called an “install manager”, which can lead to confusion between package managers and installers. The differences include:

  • Shipped with – Package manager: usually, the operating system. Installer: each computer program.
  • Location of installation information – Package manager: one central installation database. Installer: entirely at the installer’s discretion; it could be a file within the app’s folder, or among the operating system’s files and folders. At best, installers may register themselves with an uninstallers list without exposing installation information.
  • Scope of maintenance – Package manager: potentially all packages on the system. Installer: only the product with which it was bundled.
  • Developed by – Package manager: one package manager vendor. Installer: multiple installer vendors.
  • Package format – Package manager: a handful of well-known formats. Installer: there could be as many formats as the number of apps.
  • Package format compatibility – Package manager: can be consumed as long as the package manager supports it; either newer versions of the package manager keep supporting it or the user does not upgrade the package manager. Installer: the installer is always compatible with its archive format, if it uses any; however, installers, like all computer programs, may be affected by software rot.

Comparison with build automation utility

Most software configuration management systems treat building software and deploying software as separate, independent steps. A build automation utility typically takes human-readable source code files already on a computer, and automates the process of converting them into a binary executable package on the same computer. Later a package manager typically running on some other computer downloads those pre-built binary executable packages over the internet and installs them.

However, both kinds of tools have many commonalities:

  • For example, the dependency graph topological sorting used in a package manager to handle dependencies between binary components is also used in a build manager to handle the dependency between source components.
  • For example, many makefiles support not only building executables, but also installing them with make install.
  • For example, every package manager for a source-based distribution – Portage, Sorcery, Homebrew, etc. – supports converting human-readable source code to binary executables and installing it.

A few tools, such as Maak and A-A-P, are designed to handle both building and deployment, and can be used as either a build automation utility or as a package manager or both.[13]

Common package managers and formats

Universal package manager

Also known as a binary repository manager, a universal package manager is a software tool designed to optimize the download and storage of binary files, artifacts and packages used and produced in the software development process.[14] These package managers aim to standardize the way enterprises treat all package types. They give users the ability to apply security and compliance metrics across all artifact types. Universal package managers have been referred to as being at the center of a DevOps toolchain.[15]

Package formats

Main articles: Package format and File archive

Each package manager relies on the format and metadata of the packages it can manage. That is, package managers need groups of files to be bundled for the specific package manager along with appropriate metadata, such as dependencies. Often, a core set of utilities manages the basic installation from these packages and multiple package managers use these utilities to provide additional functionality.

For example, yum relies on rpm as a backend. Yum extends the functionality of the backend by adding features such as simple configuration for maintaining a network of systems. As another example, the Synaptic Package Manager provides a graphical user interface by using the Advanced Packaging Tool (apt) library, which, in turn, relies on dpkg for core functionality.

Alien is a program that converts between different Linux package formats, supporting conversion between Linux Standard Base (LSB) compliant .rpm packages, .deb, Stampede (.slp), Solaris (.pkg) and Slackware (.tgz, .txz, .tbz, .tlz) packages.

In mobile operating systems, Google Play consumes the Android application package (APK) format, while Windows Store uses the APPX and XAP formats. (Both Google Play and Windows Store have eponymous package managers.)

Free and open source software systems

By the nature of free and open source software, packages under similar and compatible licenses are available for use on a number of operating systems. These packages can be combined and distributed using configurable and internally complex packaging systems to handle many permutations of software and manage version-specific dependencies and conflicts. Some packaging systems of free and open source software are also themselves released as free and open source software. One typical difference between package management in proprietary operating systems, such as Mac OS X and Windows, and those in free and open source software, such as Linux, is that free and open source software systems permit third-party packages to also be installed and upgraded through the same mechanism, whereas the package managers of Mac OS X and Windows will only upgrade software provided by Apple and Microsoft, respectively (with the exception of some third party drivers in Windows). The ability to continuously upgrade third party software is typically added by adding the URL of the corresponding repository to the package management’s configuration file.

Application-level package managers

See also: List of software package management systems § Application-level package managers

Besides the system-level package managers, there are some add-on package managers for operating systems with limited capabilities and for programming languages in which developers need the latest libraries.

In contrast to system-level package managers, application-level package managers focus on a small part of the software system. They typically reside within a directory tree that is not maintained by the system-level package manager, such as c:\cygwin or /usr/local/fink. However, this might not be the case for the package managers that deal with programming libraries, leading to a possible conflict as both package managers may claim to “own” a file and might break upgrades.

Impact

Ian Murdock had commented that package management is “the single biggest advancement Linux has brought to the industry”, that it blurs the boundaries between operating system and applications, and that it makes it “easier to push new innovations […] into the marketplace and […] evolve the OS”.[16]

See also

” (WP)

Sources:

Fair Use Sources:

Categories
C Language C# .NET C++ Cloud Data Science - Big Data DevOps Django Web Framework Flask Web Framework Go Programming Language Java JavaScript Kotlin PowerShell Python Ruby Software Engineering Spring Framework Swift TypeScript

Integrated Development Environment (IDE)

“An integrated development environment (IDE) is a software application that provides comprehensive facilities to computer programmers for software development. An IDE normally consists of at least a source code editor, build automation tools, and a debugger. Some IDEs, such as Visual Studio, NetBeans and Eclipse, contain the necessary compiler, interpreter, or both; others, such as SharpDevelop and Lazarus, do not.” (WP)

“The boundary between an IDE and other parts of the broader software development environment is not well-defined; sometimes a version control system or various tools to simplify the construction of a graphical user interface (GUI) are integrated. Many modern IDEs also have a class browser, an object browser, and a class hierarchy diagram for use in object-oriented software development.” (WP)

Categories
Bibliography

Techopedia.com

Fair Use Source: techopedia.com (TcPd)

Categories
Cloud DevOps Linux Operating Systems

Kubuntu Linux Operating System

Official website: kubuntu.org

Kubuntu (/kʊˈbʊntuː/ kuu-BUUN-too)[4] is an official flavour of the Ubuntu Linux distribution operating system that uses the KDE Plasma Desktop instead of the GNOME desktop environment. As part of the Ubuntu project, Kubuntu uses the same underlying systems. Every package in Kubuntu shares the same repositories as Ubuntu,[5] and it is released regularly on the same schedule as Ubuntu.[6]

Kubuntu was sponsored by Canonical Ltd. until 2012 and then directly by Blue Systems. Now, employees of Blue Systems contribute upstream, to KDE and Debian, and Kubuntu development is led by community contributors. During the changeover, Kubuntu retained the use of Ubuntu project servers and existing developers.[7]

Kubuntu was born on 10 December 2004 at the Ubuntu Mataro Conference in Mataró, Spain.

(WP)

Sources:

Fair Use Sources:

Categories
Cloud DevOps Linux Operating Systems

Linux Mint Operating System

Official website: linuxmint.com

Linux Mint is a community-driven Linux distribution based on Ubuntu which itself is based on Debian, and bundled with a variety of free and open-source applications.[5][6] It can provide full out-of-the-box multimedia support for those who choose (by ticking one box during its installation process) to include proprietary software such as multimedia codecs.[7]

The Linux Mint project was created by Clément Lefèbvre and is actively maintained by the Linux Mint Team and community.[8]

Linux Mint began in 2006 with a beta release, 1.0, code-named ‘Ada’,[9] based on Kubuntu. Linux Mint 2.0 ‘Barbara’ was the first version to use Ubuntu as its codebase. It had few users until the release of Linux Mint 3.0, ‘Cassandra’.[10][11]

(WP)

Sources:

Fair Use Sources:

Categories
Cloud DevOps Linux Operating Systems

Ubuntu Linux Operating System


Ubuntu (/ʊˈbʊntuː/ uu-BUUN-too)[7] is a Linux distribution based on Debian and composed mostly of free and open-source software.[8][9][10] Ubuntu is officially released in three editions: Desktop,[11] Server,[12] and Core[13] for Internet of things devices[14] and robots.[15][16] All the editions can run on the computer alone, or in a virtual machine.[17] Ubuntu is a popular operating system for cloud computing, with support for OpenStack.[18] Ubuntu’s default desktop has been GNOME since version 17.10.[19]

Ubuntu is released every six months, with long-term support (LTS) releases every two years.[7][20][21] As of 22 October 2020, the most recent long-term support release is 20.04 (“Focal Fossa”), which is supported until 2025 under public support and until 2030 as a paid option. The latest standard release is 20.10 (“Groovy Gorilla”), which is supported for nine months.

Ubuntu is developed by Canonical,[22] and a community of other developers, under a meritocratic governance model.[7][23] Canonical provides security updates and support for each Ubuntu release, starting from the release date and until the release reaches its designated end-of-life (EOL) date.[7][24][25] Canonical generates revenue through the sale of premium services related to Ubuntu.[26][27]

Ubuntu is named after the Nguni philosophy of ubuntu, which Canonical indicates means “humanity to others” with a connotation of “I am what I am because of who we all are”.[7]

(WP)

Sources:

Fair Use Sources:

Categories
Cloud DevOps Linux Operating Systems

Tails Linux Operating System

The Amnesic Incognito Live System


Tails, or The Amnesic Incognito Live System, is a security-focused Debian-based Linux distribution aimed at preserving privacy and anonymity.[4] All its incoming and outgoing connections are forced to go through Tor,[5] and any non-anonymous connections are blocked. The system is designed to be booted as a live DVD or live USB, and will leave no digital footprint on the machine unless explicitly told to do so. Tails ships with support for UEFI Secure Boot.

History:

Tails was first released on 23 June 2009. It is the next iteration of development on Incognito, a discontinued Gentoo-based Linux distribution.[7] The Tor Project provided financial support for its development in the project’s early days.[6] Tails also received funding from the Open Technology Fund, Mozilla, and the Freedom of the Press Foundation.[8]

Laura Poitras, Glenn Greenwald, and Barton Gellman have each said that Tails was an important tool they used in their work with National Security Agency whistleblower Edward Snowden.[9][10][11]

From release 3.0, Tails requires a 64-bit processor to run.[12]

Bundled software:

Networking

Note: Because Tails includes uBlock Origin (unlike the standard Tor Browser Bundle), a website could attempt to detect that a visitor is using Tails, whose user base is smaller than the Tor Browser Bundle’s, by checking whether its advertising is blocked.[14] This can be avoided by disabling uBlock Origin.

(WP)

Sources:

Fair Use Sources:

Categories
Cloud DevOps Linux Operating Systems

Raspberry Pi OS Linux Operating System

Raspberry Pi OS[3] (formerly Raspbian) is a Debian-based Linux distribution operating system for Raspberry Pi. Since 2015 it has been officially provided by the Raspberry Pi Foundation as the primary operating system for the Raspberry Pi family of compact single-board computers.[4] The first version of Raspbian was created by Mike Thompson and Peter Green as an independent project.[5] The initial build was completed in June 2012.[6]

Previous Raspberry Pi OS versions were 32-bit only and based on Raspbian, taking the name Raspbian. Since the more recent 64-bit versions no longer use Raspbian, the name was changed to Raspberry Pi OS for both the 64-bit and 32-bit versions. As of 1 February 2021, the 64-bit version is in beta and is not suitable for general use.[7][8]

Raspberry Pi OS is highly optimized for the Raspberry Pi line of compact single-board computers with ARM CPUs. It runs on every Raspberry Pi except the Pico microcontroller. Raspberry Pi OS uses a modified LXDE as its desktop environment with the Openbox stacking window manager, along with a unique theme. The distribution is shipped with a copy of the algebra program Wolfram Mathematica[4] and a version of Minecraft called Minecraft: Pi Edition, as well as a lightweight version of the Chromium web browser.”

(WP)

Sources:

Fair Use Sources:

Categories
Cloud DevOps Linux Operating Systems

Knoppix Linux Operating System

KNOPPIX (/ˈknɒpɪks/ KNOP-iks)[2] is a Linux distribution operating system based on Debian, designed to be run directly from a CD / DVD (Live CD) or a USB flash drive (Live USB), one of the first of its kind for any operating system. Knoppix was developed by, and named after, Linux consultant Klaus Knopper. When a program is started, it is loaded from the removable medium and decompressed into a RAM drive. The decompression is transparent and on-the-fly.

Although KNOPPIX is primarily designed to be used as a Live CD, it can also be installed on a hard disk like a typical operating system. Computers that support booting from USB devices can load KNOPPIX from a live USB flash drive or memory card.

There are two main editions: the traditional compact-disc (700 megabytes) edition and the DVD (4.7 gigabytes) “Maxi” edition. However, it appears that the CD edition has not been updated since June 2013.[3] Each main edition has two language-specific editions: English and German.

KNOPPIX mostly consists of free and open source software, but also includes some proprietary software, as long as it fulfils certain conditions.[4]

Knoppix can be used to copy files easily from hard drives with inaccessible operating systems. The Live CD can also be used to run Linux software quickly and relatively safely without installing another OS.”

(WP)

Sources:

Fair Use Sources:

Categories
Cloud DevOps Linux Operating Systems

Kali Linux Operating System

Kali Linux is a Debian-derived Linux distribution designed for digital forensics and penetration testing.[3] It is maintained and funded by Offensive Security.[4]

Kali Linux has around 600[5] pre-installed penetration-testing programs (tools), including Armitage (a graphical cyber attack management tool), Nmap (a port scanner), Wireshark (a packet analyzer), Metasploit (a penetration testing framework, awarded as the best penetration testing software), John the Ripper (a password cracker), sqlmap (an automatic SQL injection and database takeover tool), Aircrack-ng (a software suite for penetration-testing wireless LANs), the Burp Suite and OWASP ZAP web application security scanners,[6][7] etc.

It was developed by Mati Aharoni and Devon Kearns of Offensive Security through the rewrite of BackTrack, their previous information security testing Linux distribution based on Knoppix. Originally, it was designed with a focus on kernel auditing, from which it got its name Kernel Auditing Linux. The name is sometimes incorrectly assumed to come from Kali the Hindu goddess.[8][9] The third core developer, Raphaël Hertzog, joined them as a Debian expert.[10][11]

Kali Linux is based on the Debian Testing branch. Most packages Kali uses are imported from the Debian repositories.[12]

Kali Linux’s popularity grew when it was featured in multiple episodes of the TV series Mr. Robot. Tools highlighted in the show and provided by Kali Linux include Bluesniff, Bluetooth Scanner (btscanner), John the Ripper, Metasploit Framework, Nmap, Shellshock, and Wget.[13][14][15]

(WP)

Sources:

Fair Use Sources:

Categories
Cloud DevOps Linux Operating Systems

Debian GNU/Linux Operating System

Debian (/ˈdɛbiən/),[5][6] also known as Debian GNU/Linux, is a Linux distribution composed of free and open-source software, developed by the community-supported Debian Project, which was established by Ian Murdock on August 16, 1993. The first version of Debian (0.01) was released on September 15, 1993,[7] and its first stable version (1.1) was released on June 17, 1996.[8] The Debian Stable branch is the most popular edition for personal computers and servers. Debian is also the basis for many other distributions, most notably Ubuntu.

Debian is one of the oldest operating systems based on the Linux kernel. The project is coordinated over the Internet by a team of volunteers guided by the Debian Project Leader and three foundational documents: the Debian Social Contract, the Debian Constitution, and the Debian Free Software Guidelines. New distributions are updated continually, and the next candidate is released after a time-based freeze.

Since its founding, Debian has been developed openly and distributed freely according to the principles of the GNU Project. Because of this, the Free Software Foundation sponsored the project from November 1994 to November 1995. When the sponsorship ended, the Debian Project formed the nonprofit organization Software in the Public Interest to continue financially supporting development.”

(WP)

Sources:

Fair Use Sources: