
How Docker Streamlines the Onboarding Process and Sets Up Developers for Success

By: Yiwen Xu
22 January 2025 at 21:00

Nearly half (45%) of developers say they don’t have enough time for learning and development, according to a developer experience research study by Harness and Wakefield Research. Additionally, developer onboarding is a slow and painful process, with 71% of executive buyers saying that onboarding new developers takes at least two months. 

To accelerate innovation and bring products to market faster, organizations must empower developers with robust support and intuitive guardrails, enabling them to succeed within a structured yet flexible environment. That’s where Docker fits in: We help developers onboard quickly and help organizations set up the right guardrails to give developers the flexibility to innovate within the boundaries of company policies. 


Setting up developer teams for success 

Docker is recognized as one of the most used, desired, and admired developer tools, making it an essential component of any development team’s toolkit. Developers who are new to Docker can get up and running quickly with Docker’s integrated development workflows, verified secure content, and accessible learning resources and community support.

Streamlined developer onboarding

When new developers join a team, Docker Desktop can significantly reduce the time and effort required to set up their development environments. Docker Desktop integrates seamlessly with popular IDEs, such as Visual Studio Code, allowing developers to containerize directly within familiar tools, accelerating learning within their usual workflows. Docker Extensions expand Docker Desktop’s capabilities and establish new functionalities, integrating developers’ favorite development tools into their application development and deployment workflows. 

Developers can also use the Docker extension for GitHub Copilot for seamless onboarding, with assistance for containerizing applications, generating Docker assets, and analyzing project vulnerabilities. In fact, the Docker extension is a top choice among developers in GitHub Copilot’s extension leaderboard, as highlighted by Visual Studio Magazine.

Docker Build Cloud integrates with Docker Compose and CI workflows, making it a seamless transition for dev teams. Verified content on Docker Hub gives developers preconfigured, trusted images, reducing setup time and ensuring a secure foundation as they onboard onto projects. 

Docker Scout provides actionable insights and recommendations, allowing developers to enhance their container security awareness, scan for vulnerabilities, and improve security posture with real-time feedback. And, Testcontainers Cloud lets developers run reliable integration tests, with real dependencies defined in code. With these tools, developers can be confident about delivering high-quality and reliable apps and experiences in production.  
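
As a quick illustration of how lightweight this is in practice, here is how a developer might check an image with Docker Scout from the command line (the image name is a placeholder; Docker Scout ships with Docker Desktop):

# Summarize the security posture of an image with Docker Scout
docker scout quickview myorg/myapp:latest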

Continuous learning with accessible knowledge resources

Continuous learning is a priority for Docker, with a wide range of accessible resources and tools designed to help developers deepen their knowledge and stay current in their containerization journey.

Docker Docs offers beginner-friendly guides, tutorials, and AI tools to guide developers through foundational concepts, empowering them to quickly build their container skills. Our collection of guides takes developers step by step to learn how Docker can optimize development workflows and how to use it with specific languages, frameworks, or technologies.

Docker Hub’s AI Catalog empowers developers to discover, pull, and integrate AI models into their workflows, bridging the gap between innovation and implementation. 

Docker also offers regular webinars and tech talks that help developers stay updated on new features and best practices and provide a platform to discuss real-world challenges. If you’re a Docker Business customer, you can even request additional, customized training from our Docker experts. 

Docker’s partnerships with educational platforms and organizations, such as Udemy Training and LinkedIn Learning, ensure developers have access to comprehensive training — from beginner tutorials to advanced containerization topics.

Docker’s global developer community

One of Docker’s greatest strengths is its thriving global developer community, offering organizations a unique advantage by connecting them with a wealth of shared expertise, resources, and real-world solutions.

With more than 20 million monthly active users, Docker’s community forums and events foster vibrant collaboration, giving developers access to a collective knowledge base that spans industries and expertise levels. Developers can ask questions, solve challenges, and gain insights from a diverse range of peers — from beginners to seasoned experts. Whether you’re troubleshooting an issue or exploring best practices, the Docker community ensures you’re never working in isolation.

A key pillar of this ecosystem is the Docker Captains program — a network of experienced and passionate Docker advocates who are leaders in their fields. Captains share technical knowledge through blog posts, videos, webinars, and workshops, giving businesses and teams access to curated expertise that accelerates onboarding and productivity.

Beyond forums and the Docker Captains program, Docker’s community-driven events, such as meetups and virtual workshops (Figure 1), provide developers with direct access to real-world use cases, innovative workflows, and emerging trends. These interactions foster continuous learning and help developers and their organizations keep pace with the ever-evolving software development landscape.

Figure 1: Docker DevTools Day 1.0 Meetup in Singapore.

For businesses, tapping into Docker’s extensive community means access to a vast pool of knowledge, support, and inspiration, which is a critical asset in driving developer productivity and innovation.

Empowering developers with enhanced user management and security

In previous articles, we looked at how Docker simplifies complexity and boosts developer productivity (the right tool) and how to unlock efficiency with Docker for AI and cloud-native development (the right process).

To scale and standardize app development processes across the entire company, you also need to have the right guardrails in place for governance, compliance, and security, which is often handled through enterprise control and admin management tools. Ideally, organizations provide guardrails without being overly prescriptive and slowing developer productivity and innovation. 

Modern enterprises require a layered security approach, beginning with trusted content as the foundation for building robust and compliant applications. This approach gives your dev teams a good foundation for building securely from the start. 

Throughout the software development process, you need a secure platform. For regulated industries like finance and public sectors, this means fortified dev environments. Security vulnerability analysis and policy evaluation tools also help inform improvements and remediation. 

Additionally, you need enterprise controls and dashboards that ensure enterprise IT and security teams can confidently monitor and manage risk. 

Setting up the right guardrails 

Docker provides a number of admin tools to safeguard your software with integrated container security in the Docker Business plan. Our goal is to improve security and compliance of developer environments with minimal impact on developer experience or productivity. 

Centralized settings for improved dev environments security 

Docker provides developer teams with access to a vast library of trusted and certified application content, including Docker Official Images, Docker Verified Publisher, and Docker Trusted Open Source content. Coupled with advanced image and registry management rules — with tools like Image Access Management and Registry Access Management — you can ensure that your developers only use software that satisfies your company’s security policies. 

With a solid foundation to build securely from the start, your organization can further enhance security throughout the software development process. Docker ensures software supply chain integrity through vulnerability scanning and image analysis with Docker Scout. Rapid remediation capabilities paired with detailed CVE reporting help developers quickly find and fix vulnerabilities, resulting in speedy time to resolution.
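
As a concrete sketch (the image name is a placeholder), both the scanning and the remediation advice are available directly from the Docker CLI:

# List known CVEs for an image
docker scout cves myorg/myapp:1.2.3

# Ask for remediation recommendations, such as base-image updates
docker scout recommendations myorg/myapp:1.2.3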

Although containers are generally secure, container development tools still must be properly secured to reduce the risk of security breaches in the developer’s environment. Hardened Docker Desktop is an example of Docker’s fortified development environments with enhanced container isolation. It lets you enforce strict security settings and prevent developers and their containers from bypassing these controls. With air-gapped containers, you can further restrict containers from accessing network resources, limiting where data can be uploaded to or downloaded from.

Continuous monitoring and managing risks

With the Admin Console and Docker Desktop Insights, IT administrators and security teams can visualize and understand how Docker is used within their organizations and manage the implementation of organizational configurations and policies (Figure 2). 

These insights help teams streamline processes and improve efficiency. For example, you can enforce sign-in for developers who don’t sign in to an account associated with your organization. This step ensures that developers receive the benefits of your Docker subscription and work within the boundaries of the company policies. 
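
One documented way to enforce sign-in is a registry.json file that names the allowed organization. Here is a minimal sketch for macOS, assuming an organization called myorg (check Docker’s enforce sign-in documentation for the paths on other platforms):

# Create registry.json so Docker Desktop requires sign-in to "myorg"
sudo mkdir -p "/Library/Application Support/com.docker.docker"
echo '{"allowedOrgs": ["myorg"]}' | \
  sudo tee "/Library/Application Support/com.docker.docker/registry.json"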

Figure 2: Docker Desktop Insights Dashboard provides information on product usage.

For business and engineering leaders, full visibility and governance over the development process help ensure compliance and mitigate risk while driving developer productivity. 

Unlock innovation with Docker’s development suite

Docker is the leading suite of tools purpose-built for cloud-native development, combining a best-in-class developer experience with enterprise-grade security and governance. With Docker, your organization can streamline onboarding, foster innovation, and maintain robust compliance — all while empowering your teams to deliver impactful solutions to market faster and more securely. 

Explore the Docker Business plan today and unlock the full potential of your development processes.


Mastering Docker and Jenkins: Build Robust CI/CD Pipelines Efficiently

16 January 2025 at 20:17

Hey there, fellow engineers and tech enthusiasts! I’m excited to share one of my favorite strategies for modern software delivery: combining Docker and Jenkins to power up your CI/CD pipelines. 

Throughout my career as a Senior DevOps Engineer and Docker Captain, I’ve found that these two tools can drastically streamline releases, reduce environment-related headaches, and give teams the confidence they need to ship faster.

In this post, I’ll walk you through what Docker and Jenkins are, why they pair perfectly, and how you can build and maintain efficient pipelines. My goal is to help you feel right at home when automating your workflows. Let’s dive in.


Brief overview of continuous integration and continuous delivery

Continuous integration (CI) and continuous delivery (CD) are key pillars of modern development. If you’re new to these concepts, here’s a quick rundown:

  • Continuous integration (CI): Developers frequently commit their code to a shared repository, triggering automated builds and tests. This practice prevents conflicts and ensures defects are caught early.
  • Continuous delivery (CD): With CI in place, organizations can then confidently automate releases. That means shorter release cycles, fewer surprises, and the ability to roll back changes quickly if needed.

Leveraging CI/CD can dramatically improve your team’s velocity and quality. Once you experience the benefits of dependable, streamlined pipelines, there’s no going back.

Why combine Docker and Jenkins for CI/CD?

Docker allows you to containerize your applications, creating consistent environments across development, testing, and production. Jenkins, on the other hand, helps you automate tasks such as building, testing, and deploying your code. I like to think of Jenkins as the tireless “assembly line worker,” while Docker provides identical “containers” to ensure consistency throughout your project’s life cycle.

Here’s why blending these tools is so powerful:

  • Consistent environments: Docker containers guarantee uniformity from a developer’s laptop all the way to production. This consistency reduces errors and eliminates the dreaded “works on my machine” excuse.
  • Speedy deployments and rollbacks: Docker images are lightweight. You can ship or revert changes at the drop of a hat — perfect for short delivery cycles where minimal downtime is crucial.
  • Scalability: Need to run 1,000 tests in parallel or support multiple teams working on microservices? No problem. Spin up multiple Docker containers whenever you need more build agents, and let Jenkins orchestrate everything with Jenkins pipelines.

For a DevOps junkie like me, this synergy between Jenkins and Docker is a dream come true.

Setting up your CI/CD pipeline with Docker and Jenkins

Before you roll up your sleeves, let’s cover the essentials you’ll need:

  • Docker Desktop (or a Docker server environment) installed and running. You can get Docker for various operating systems.
  • Jenkins downloaded from Docker Hub or installed on your machine. These days, you’ll want jenkins/jenkins:lts (the long-term support image) rather than the deprecated library/jenkins image (a quick-start run command is sketched after this list).
  • Proper permissions for Docker commands and the ability to manage Docker images on your system.
  • A GitHub or similar code repository where you can store your Jenkins pipeline configuration (optional, but recommended).

Pro tip: If you’re planning a production setup, consider a container orchestration platform like Kubernetes. This approach simplifies scaling Jenkins, updating Jenkins, and managing additional Docker servers for heavier workloads.
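
If you take the Docker Hub route, a minimal local quick start might look like this (the port mappings and volume name are conventional choices, not requirements):

# Run the Jenkins LTS image with a persistent home volume
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  jenkins/jenkins:lts

# Fetch the initial admin password once Jenkins has started
docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword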

Building a robust CI/CD pipeline with Docker and Jenkins

After prepping your environment, it’s time to create your first Jenkins-Docker pipeline. Below, I’ll walk you through common steps for a typical pipeline — feel free to modify them to fit your stack.

1. Install necessary Jenkins plugins

Jenkins offers countless plugins, so let’s start with a few that make configuring Jenkins with Docker easier:

  • Docker Pipeline Plugin
  • Docker
  • CloudBees Docker Build and Publish

How to install plugins:

  1. Open Manage Jenkins > Manage Plugins in Jenkins.
  2. Click the Available tab and search for the plugins listed above.
  3. Install them (and restart Jenkins if needed).

Code example (plugin installation via CLI):

# Install plugins using Jenkins CLI
java -jar jenkins-cli.jar -s http://<jenkins-server>:8080/ install-plugin docker-workflow
java -jar jenkins-cli.jar -s http://<jenkins-server>:8080/ install-plugin docker-plugin
java -jar jenkins-cli.jar -s http://<jenkins-server>:8080/ install-plugin docker-build-publish

Pro tip (advanced approach): If you’re aiming for a fully infrastructure-as-code setup, consider using Jenkins configuration as code (JCasC). With JCasC, you can declare all your Jenkins settings — including plugins, credentials, and pipeline definitions — in a YAML file. This means your entire Jenkins configuration is version-controlled and reproducible, making it effortless to spin up fresh Jenkins instances or apply consistent settings across multiple environments. It’s especially handy for large teams looking to manage Jenkins at scale.
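
Here’s a hedged sketch of wiring JCasC into the containerized Jenkins from earlier. The configuration-as-code plugin reads the file path from the CASC_JENKINS_CONFIG environment variable (this assumes a jenkins.yaml in your current directory and the plugin installed):

# Mount a JCasC file and point the plugin at it
docker run -d --name jenkins \
  -p 8080:8080 \
  -v "$(pwd)/jenkins.yaml:/var/jenkins_home/casc_configs/jenkins.yaml" \
  -e CASC_JENKINS_CONFIG=/var/jenkins_home/casc_configs/jenkins.yaml \
  jenkins/jenkins:lts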


2. Set up your Jenkins pipeline

In this step, you’ll define your pipeline. A Jenkins “pipeline” job uses a Jenkinsfile (stored in your code repository) to specify the steps, stages, and environment requirements.

Example Jenkinsfile:

pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git branch: 'main', url: 'https://github.com/your-org/your-repo.git'
            }
        }
        stage('Build') {
            steps {
                script {
                    dockerImage = docker.build("your-org/your-app:${env.BUILD_NUMBER}")
                }
            }
        }
        stage('Test') {
            steps {
                sh "docker run --rm your-org/your-app:${env.BUILD_NUMBER} ./run-tests.sh"
            }
        }
        stage('Push') {
            steps {
                script {
                    docker.withRegistry('https://index.docker.io/v1/', 'dockerhub-credentials') {
                        dockerImage.push()
                    }
                }
            }
        }
    }
}

Let’s look at what’s happening here:

  1. Checkout: Pulls your repository.
  2. Build: Builds a Docker image (your-org/your-app) tagged with the Jenkins build number.
  3. Test: Runs your test suite inside a fresh container, ensuring Docker containers create consistent environments for every test run.
  4. Push: Pushes the image to your Docker registry (e.g., Docker Hub) if the tests pass.

Reference: Jenkins Pipeline Documentation.

3. Configure Jenkins for automated builds

Now that your pipeline is set up, you’ll want Jenkins to run it automatically:

  • Webhook triggers: Configure your source control (e.g., GitHub) to send a webhook whenever code is pushed. Jenkins will kick off a build immediately.
  • Poll SCM: Jenkins periodically checks your repo for new commits and starts a build if it detects changes.

Which trigger method should you choose?

  • Webhook triggers are ideal if you want near real-time builds. As soon as you push to your repo, Jenkins is notified, and a new build starts almost instantly. This approach is typically more efficient, as Jenkins doesn’t have to continuously check your repository for updates. However, it requires that your source control system and network environment support webhooks.
  • Poll SCM is useful if your environment can’t support incoming webhooks — for example, if you’re behind a corporate firewall or your repository isn’t configured for outbound hooks. In that case, Jenkins routinely checks for new commits on a schedule you define (e.g., every five minutes), which can add a small delay and extra overhead but may simplify setup in locked-down environments.

Personal experience: I love webhook triggers because they keep everything as close to real-time as possible. Polling works fine if webhooks aren’t feasible, but you’ll see a slight delay between code pushes and build starts. It can also generate extra network traffic if your polling interval is too frequent.
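
If you choose webhooks with GitHub, you can create the hook in the repository settings UI or script it against the GitHub REST API. A sketch (the repo, token, and Jenkins URL are placeholders; /github-webhook/ is the endpoint exposed by the Jenkins GitHub plugin):

# Register a push webhook that notifies Jenkins on every push
curl -X POST \
  -H "Authorization: Bearer $GITHUB_TOKEN" \
  -H "Accept: application/vnd.github+json" \
  https://api.github.com/repos/your-org/your-repo/hooks \
  -d '{"name": "web", "active": true, "events": ["push"], "config": {"url": "https://jenkins.example.com/github-webhook/", "content_type": "json"}}'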

4. Build, test, and deploy with Docker containers

Here comes the fun part — automating the entire cycle from build to deploy:

  1. Build Docker image: After pulling the code, Jenkins calls docker.build to create a new image.
  2. Run tests: Automated unit or acceptance tests run inside a container spun up from that image, ensuring consistency.
  3. Push to registry: Assuming tests pass, Jenkins pushes the tagged image to your Docker registry — this could be Docker Hub or a private registry.
  4. Deploy: Optionally, Jenkins can then deploy the image to a remote server or a container orchestrator (Kubernetes, etc.).

This streamlined approach ensures every step — build, test, deploy — lives in one cohesive pipeline, preventing those “where’d that step go?” mysteries.

5. Optimize and maintain your pipeline

Once your pipeline is up and running, here are a few maintenance tips and enhancements to keep everything running smoothly:

  • Clean up images: Routine cleanup of Docker images can reclaim space and reduce clutter (an example command follows this list).
  • Security updates: Stay on top of updates for Docker, Jenkins, and any plugins. Applying patches promptly helps protect your CI/CD environment from vulnerabilities.
  • Resource monitoring: Ensure Jenkins nodes have enough memory, CPU, and disk space for builds. Overworked nodes can slow down your pipeline and cause intermittent failures.

Pro tip: In large projects, consider separating your build agents from your Jenkins controller by running them in ephemeral Docker containers (also known as Jenkins agents). If an agent goes down or becomes stale, you can quickly spin up a fresh one — ensuring a clean, consistent environment for every build and reducing the load on your main Jenkins server.
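
For the image cleanup mentioned above, a simple scheduled job on each build node goes a long way. One possibility (the retention window is a judgment call; tune it to your build cadence):

# Remove stopped containers, unused networks, dangling build cache,
# and anything unused that is older than 7 days (168 hours)
docker system prune -af --filter "until=168h"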

Why use Declarative Pipelines for CI/CD?

Although Jenkins supports multiple pipeline syntaxes, Declarative Pipelines stand out for their clarity and resource-friendly design. Here’s why:

  • Simplified, opinionated syntax: Everything is wrapped in a single pipeline { ... } block, which minimizes “scripting sprawl.” It’s perfect for teams who want a quick path to best practices without diving deeply into Groovy specifics.
  • Easier resource allocation: By specifying an agent at either the pipeline level or within each stage, you can offload heavyweight tasks (builds, tests) onto separate worker nodes or Docker containers. This approach helps prevent your main Jenkins controller from becoming overloaded.
  • Parallelization and matrix builds: If you need to run multiple test suites or support various OS/browser combinations, Declarative Pipelines make it straightforward to define parallel stages or set up a matrix build. This tactic is incredibly handy for microservices or large test suites requiring different environments in parallel.
  • Built-in “escape hatch”: Need advanced Groovy features? Just drop into a script block. This lets you access Scripted Pipeline capabilities for niche cases, while still enjoying Declarative’s streamlined structure most of the time.
  • Cleaner parameterization: Want to let users pick which tests to run or which Docker image to use? The parameters directive makes your pipeline more flexible. A single Jenkinsfile can handle multiple scenarios — like unit vs. integration testing — without duplicating stages.

Declarative Pipeline examples

Below are sample pipelines to illustrate how declarative syntax can simplify resource allocation and keep your Jenkins controller healthy.

Example 1: Basic Declarative Pipeline

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing...'
            }
        }
    }
}
  • Runs on any available Jenkins agent (worker).
  • Uses two stages in a simple sequence.

Example 2: Stage-level agents for resource isolation

pipeline {
    agent none  // Avoid using a global agent at the pipeline level
    stages {
        stage('Build') {
            agent { docker 'maven:3.9.3-eclipse-temurin-17' }
            steps {
                sh 'mvn clean package'
            }
        }
        stage('Test') {
            agent { docker 'openjdk:17-jdk' }
            steps {
                sh 'java -jar target/my-app-tests.jar'
            }
        }
    }
}
  • Each stage runs in its own container, preventing any single node from being overwhelmed.
  • agent none at the top ensures no global agent is allocated unnecessarily.

Example 3: Parallelizing test stages

pipeline {
    agent none
    stages {
        stage('Test') {
            parallel {
                stage('Unit Tests') {
                    agent { label 'linux-node' }
                    steps {
                        sh './run-unit-tests.sh'
                    }
                }
                stage('Integration Tests') {
                    agent { label 'linux-node' }
                    steps {
                        sh './run-integration-tests.sh'
                    }
                }
            }
        }
    }
}
  • Splits tests into two parallel stages.
  • Each stage can run on a different node or container, speeding up feedback loops.

Example 4: Parameterized pipeline

pipeline {
    agent any

    parameters {
        choice(name: 'TEST_TYPE', choices: ['unit', 'integration', 'all'], description: 'Which test suite to run?')
    }

    stages {
        stage('Build') {
            steps {
                echo 'Building...'
            }
        }
        stage('Test') {
            when {
                expression { return params.TEST_TYPE == 'unit' || params.TEST_TYPE == 'all' }
            }
            steps {
                echo 'Running unit tests...'
            }
        }
        stage('Integration') {
            when {
                expression { return params.TEST_TYPE == 'integration' || params.TEST_TYPE == 'all' }
            }
            steps {
                echo 'Running integration tests...'
            }
        }
    }
}
  • Lets you choose which tests to run (unit, integration, or both).
  • Only executes relevant stages based on the chosen parameter, saving resources.

Example 5: Matrix builds

pipeline {
    agent none

    stages {
        stage('Build and Test Matrix') {
            matrix {
                agent {
                    label "${PLATFORM}-docker"
                }
                axes {
                    axis {
                        name 'PLATFORM'
                        values 'linux', 'windows'
                    }
                    axis {
                        name 'BROWSER'
                        values 'chrome', 'firefox'
                    }
                }
                stages {
                    stage('Build') {
                        steps {
                            echo "Build on ${PLATFORM} with ${BROWSER}"
                        }
                    }
                    stage('Test') {
                        steps {
                            echo "Test on ${PLATFORM} with ${BROWSER}"
                        }
                    }
                }
            }
        }
    }
}
  • Defines a matrix of PLATFORM x BROWSER, running each combination in parallel.
  • Perfect for testing multiple OS/browser combinations without duplicating pipeline logic.


Using Declarative Pipelines helps ensure your CI/CD setup is easier to maintain, scalable, and secure. By properly configuring agents — whether Docker-based or label-based — you can spread workloads across multiple worker nodes, minimize resource contention, and keep your Jenkins controller humming along happily.

Best practices for CI/CD with Docker and Jenkins

Ready to supercharge your setup? Here are a few tried-and-true habits I’ve cultivated:

  • Leverage Docker’s layer caching: Optimize your Dockerfiles so stable (less frequently changing) layers appear early. This drastically reduces build times.
  • Run tests in parallel: Jenkins can run multiple containers for different services or microservices, letting you test them side by side. Declarative Pipelines make it easy to define parallel stages, each on its own agent.
  • Shift left on security: Integrate security checks early in the pipeline. Tools like Docker Scout let you scan images for vulnerabilities, while Jenkins plugins can enforce compliance policies. Don’t wait until production to discover issues.
  • Optimize resource allocation: Properly configure CPU and memory limits for Jenkins and Docker containers to avoid resource hogging. If you’re scaling Jenkins, distribute builds across multiple worker nodes or ephemeral agents for maximum efficiency.
  • Configuration management: Store Jenkins jobs, pipeline definitions, and plugin configurations in source control. Tools like Jenkins Configuration as Code simplify versioning and replicating your setup across multiple Docker servers.

With these strategies — plus a healthy dose of Declarative Pipelines — you’ll have a lean, high-octane CI/CD pipeline that’s easier to maintain and evolve.

Troubleshooting Docker and Jenkins pipelines

Even the best systems hit a snag now and then. Here are a few hurdles I’ve seen (and conquered):

  • Handling environment variability: Keep Docker and Jenkins versions synced across different nodes. If multiple Jenkins nodes are in play, standardize Docker versions to avoid random build failures.
  • Troubleshooting build failures: Use docker logs -f <container-id> to see exactly what happened inside a container. Often, the logs reveal missing dependencies or misconfigured environment variables.
  • Networking challenges: If your containers need to talk to each other — especially across multiple hosts — make sure you configure Docker networks or an orchestration platform properly. Read Docker’s networking documentation for details, and check out the Jenkins diagnosing issues guide for more troubleshooting tips.

Conclusion

Pairing Docker and Jenkins offers a nimble, robust approach to CI/CD. Docker locks down consistent environments and lightning-fast rollouts, while Jenkins automates key tasks like building, testing, and pushing your changes to production. When these two are in harmony, you can expect shorter release cycles, fewer integration headaches, and more time to focus on developing awesome features.

A healthy pipeline also means your team can respond quickly to user feedback and confidently roll out updates — two crucial ingredients for any successful software project. And if you’re concerned about security, there are plenty of tools and best practices to keep your applications safe.

I hope this guide helps you build (and maintain) a high-octane CI/CD pipeline that your team will love. If you have questions or need a hand, feel free to reach out on the community forums, join the conversation on Slack, or open a ticket on GitHub issues. You’ll find plenty of fellow Docker and Jenkins enthusiasts who are happy to help.

Thanks for reading, and happy building!


Protecting the Software Supply Chain: The Art of Continuous Improvement

16 January 2025 at 20:04

Without continuous improvement in software security, you’re not standing still — you’re walking backward into oncoming traffic. Attack vectors multiply, evolve, and look for the weakest link in your software supply chain daily. 

Cybersecurity Ventures forecasts that the global cost of software supply chain attacks will reach nearly $138 billion by 2031, up from $60 billion in 2025 and $46 billion in 2023. A single overlooked vulnerability isn’t just a flaw; it’s an open invitation for compromise, potentially threatening your entire system. The cost of a breach doesn’t stop with your software — it extends to your reputation and customer trust, which are far harder to rebuild. 

Docker’s suite of products offers your team peace of mind. With tools like Docker Scout, you can expose vulnerabilities before they expose you. Continuous image analysis doesn’t just find the cracks; it empowers your teams to seal them from code to production. But Docker Scout is just the beginning. Tools like Docker Hub’s trusted content, Docker Official Images (DOI), Image Access Management (IAM), and Hardened Docker Desktop work together to secure every stage of your software supply chain. 

In this post, we’ll explore how these tools provide built-in security, governance, and visibility, helping your team innovate faster while staying protected. 


Securing the supply chain

Your software supply chain isn’t just an automated sequence of tools and processes. It’s a promise — to your customers, team, and future. Promises are fragile. The cracks can start to show with every dependency, third-party integration, and production push. Tools like Image Access Management help protect your supply chain by providing granular control over who can pull, share, or modify images, ensuring only trusted team members access sensitive assets. Meanwhile, Hardened Docker Desktop ensures developers work in a secure, tamper-proof environment, giving your team confidence that development is aligned with enterprise security standards. The solution isn’t to slow down or second-guess; it’s to continuously improve how you secure your software supply chain, through measures such as automated vulnerability scans and trusted content from Docker Hub.

A breach is more than a line item in the budget. Customers ask themselves, “If they couldn’t protect this, what else can’t they protect?” Downtime halts innovation, fines for compliance failures mount, and engineering effort is rerouted to forensic security analysis. The brand you spent years perfecting could be reduced to a cautionary tale. Regardless of how innovative your product is, it’s not trusted if it’s not secure.

Organizations must stay prepared by regularly updating their security measures and embracing new technologies to outpace evolving threats. As highlighted in the article Rising Tide of Software Supply Chain Attacks: An Urgent Problem, software supply chain attacks are increasingly targeting critical points in development workflows, such as third-party dependencies and build environments. High-profile incidents like the SolarWinds attack have demonstrated how adversaries exploit trust relationships and weaknesses in widely used components to cause widespread damage. 

Preventing security problems from the start

Preventing attacks like the SolarWinds breach requires prioritizing code integrity and adopting secure software development practices. Tools like Docker Scout seamlessly integrate security into developers’ workflows, enabling proactive identification of vulnerabilities in dependencies and ensuring that trusted components form the backbone of your applications.

Docker Hub’s trusted content and Docker Scout’s policy evaluation features help ensure that your organization uses compliant and secure images. Docker Official Images (DOI) provide a robust foundation for deployments, mitigating risks from untrusted components. To extend this security foundation, Image Access Management allows teams to enforce image-sharing policies and restrict access to sensitive components, preventing accidental exposure or misuse. For local development, Hardened Docker Desktop ensures that developers operate in a secure, enterprise-grade environment, minimizing risks from the outset. This combination of tools enables your engineering team to put out fires and, more importantly, prevent them from starting in the first place.

Building guardrails

Governance isn’t a roadblock; it’s the blueprint for progress. The problem is that some companies treat security like a fire extinguisher — something you grab when things go wrong. That is not a viable strategy in the long run. Real innovation happens when security guardrails are so well-designed that they feel like open highways, empowering teams to move fast without compromising safety. 

A structured policy lifecycle loop — mapping connections, planning changes, deploying cleanly, and retiring the dead weight — turns governance into your competitive edge. Automate it, and you’re not just checking boxes; you’re giving your teams the freedom to move fast and trust the road ahead. 

Continuous improvement on security policy management doesn’t have to feel like a bureaucratic chokehold. Docker provides a streamlined workflow to secure your software supply chain effectively. Docker Scout integrates seamlessly into your development lifecycle, delivering vulnerability scans, image analysis, and detailed reports and recommendations to help teams address issues before code reaches production. 
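
As an illustrative sketch of what that can look like as a pipeline gate (flags per the docker scout CLI; the image tag is a placeholder):

# Fail a CI stage when critical or high-severity CVEs are present
docker scout cves --exit-code --only-severity critical,high myorg/myapp:latest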

With the introduction of Docker Health Scores — a security grading system for container images — teams gain a clear and actionable snapshot of their image security posture. These scores empower developers to prioritize remediation efforts and continuously improve their software’s security from code to production.

Keeping up with continuous improvement

Security threats aren’t slowing down. New attack vectors and vulnerabilities emerge every day. With cybercrime costs expected to rise from $9.22 trillion in 2024 to $13.82 trillion by 2028, organizations face a critical choice: adapt to this evolving threat landscape or risk falling behind, exposing themselves to escalating costs and reputational damage. Continuous improvement in software security isn’t a luxury; it’s essential to building and maintaining trust with your customers, so they know that every fresh deployment is better than the one that came before. Otherwise, expect high costs from imminent software supply chain attacks.

Best practices for securing the software supply chain involve integrating vulnerability scans early in the development lifecycle, leveraging verified content from trusted sources, and implementing governance policies to ensure consistent compliance standards without manual intervention. Continuous monitoring of vulnerabilities and enforcing runtime policies help maintain security at scale, adapting to the dynamic nature of modern software ecosystems.

Start today

Securing your software supply chain is a journey of continuous improvement. With Docker’s tools, you can empower your teams to build and deploy software securely, ensuring vulnerabilities are addressed before they become liabilities.

Don’t wait until vulnerabilities turn into liabilities. Explore Docker Hub, Docker Scout, Hardened Docker Desktop, and Image Access Management to embed security into every stage of development. From granular control over image access to tamper-proof local environments, Docker’s suite of tools helps safeguard your innovation, protect your reputation, and empower your organization to thrive in a dynamic ecosystem.

Learn more

  • Docker Scout: Integrates seamlessly into your development lifecycle, delivering vulnerability scans, image analysis, and actionable recommendations to address issues before they reach production.
  • Docker Health Scores: A security grading system for container images, offering teams clear insights into their image security posture.
  • Docker Hub: Access trusted, verified content, including Docker Official Images (DOI), to build secure and compliant software applications.
  • Docker Official Images (DOI): A curated set of high-quality images that provide a secure foundation for your containerized applications.
  • Image Access Management (IAM): Enforce image-sharing policies and restrict access to sensitive components, ensuring only trusted team members access critical assets.
  • Hardened Docker Desktop: A tamper-proof, enterprise-grade development environment that aligns with security standards to minimize risks from local development.

Unlocking Efficiency with Docker for AI and Cloud-Native Development

By: Yiwen Xu
8 January 2025 at 21:22

The need for secure, high-quality software becomes more critical every day as the impact of vulnerabilities increases and related costs continue to rise. For example, flawed software cost the U.S. economy $2.08 trillion in 2020 alone, according to the Consortium for Information and Software Quality (CISQ). And a software defect that might cost $100 to fix if found early in the development process can grow exponentially to $10,000 if discovered later in production.

Docker helps you deliver secure, efficient applications by providing consistent environments and fast, reliable container management, building on best practices that let you discover and resolve issues earlier in the software development life cycle (SDLC).


Shifting left to ensure fewer defects

In a previous blog post, we talked about using the right tools, including Docker’s suite of products to boost developer productivity. Besides having the right tools, you also need to implement the right processes to optimize your software development and improve team productivity. 

The software development process is typically broken into two distinct loops, the inner and the outer loops. At Docker, we believe that investing in the inner loop is crucial. This means shifting security left and identifying problems as soon as you can. This approach improves efficiency and reduces costs by helping teams find and fix software issues earlier.

Using Docker tools to adopt best practices

Docker’s products help you adopt these best practices — we are focused on enhancing the software development lifecycle, especially around refining the inner loop. Products like Docker Desktop allow your dev team in the inner loop to run, test, code, and build everything fast and consistently. This consistency eliminates the “it works on my machine” issue, meaning applications behave the same in both development and production.  

Shifting left lets your dev team identify problems earlier in your software project lifecycle. When you detect issues sooner, you increase efficiency and help ensure secure builds and compliance. By shifting security left with Docker Scout, your dev teams can identify vulnerabilities sooner and help avoid issues down the road. 

Another example of shifting left involves testing — doing testing earlier in the process leads to more robust software and faster release cycles. This is when Testcontainers Cloud comes in handy because it enables developers to run reliable integration tests, with real dependencies defined in code. 

Accelerate development within the hybrid inner loop

We see more and more companies adopting the so-called hybrid inner loop, which combines the best of two worlds — local and cloud. The results provide greater flexibility for your dev teams and encourage better collaboration. For example, Docker Build Cloud uses the power of the cloud to speed up build time without sacrificing the local development experience that developers love. 

By using these Docker products across the software development life cycle, teams get quick feedback loops and faster issue resolution, ensuring a smooth development flow from inception to deployment. 

Simplifying AI application development

When you’re using the right tools and processes to accelerate your application delivery and maximize efficiency throughout your SDLC, processes that were once cumbersome become your new baseline, freeing up time for true innovation. 

Docker also helps accelerate innovation by simplifying AI/ML development. We are continually investing in AI to help your developers deliver AI-backed applications that differentiate your business and enhance competitiveness.

Docker AI tools

Docker’s GenAI Stack accelerates the incorporation of large language models (LLMs) and AI/ML into your code, enabling the delivery of AI-backed applications. All containers work harmoniously and are managed directly from Docker Desktop, allowing your team to monitor and adjust components without leaving their development environment. Deploying the GenAI Stack is quick and easy, and leveraging Docker’s containerization technology helps speed setup and simplify scaling as applications grow.
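
As a rough sketch of getting started, the GenAI Stack is distributed as a Compose project on GitHub; after supplying the environment configuration its README calls for, launching it might look like this:

# Clone and launch the GenAI Stack with Docker Compose
git clone https://github.com/docker/genai-stack.git
cd genai-stack
docker compose up -d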

Earlier this year, we announced the preview of Docker Extension for GitHub Copilot. By standardizing best practices and enabling integrations with tools like GitHub Copilot, Docker empowers developers to focus on innovation, closing the gap from the first line of code to production.

And, more recently, we launched the Docker AI Catalog in Docker Hub. This new feature simplifies the process of integrating AI into applications by providing trusted and ready-to-use content supported by comprehensive documentation. Your dev team will benefit from shorter development cycles, improved productivity, and a more streamlined path to integrating AI into both new and existing applications.

Wrapping up

Docker products help you establish sound processes and practices related to shifting left and discovering issues earlier to avoid headaches down the road. This approach ultimately unlocks developer productivity, giving your dev team more time to code and innovate. Docker also allows you to quickly use AI to close knowledge gaps and offers trusted tools to build AI/ML applications and accelerate time to market. 

To see how Docker continues to empower developers with the latest innovations and tools, check out our Docker 2024 Highlights.

Learn about Docker’s updated subscriptions and find the ideal plan for your team’s needs.


Docker 2024 Highlights: Innovations in AI, Security, and Empowering Development Teams

17 December 2024 at 20:45

In 2024, as developers and engineering teams focused on delivering high-quality, secure software faster, Docker continued to evolve with impactful updates and a streamlined user experience. This commitment to empowering developers was recognized in the annual Stack Overflow Developer Survey, where Docker ranked as one of the most loved and widely used tools for yet another year. Here’s a look back at Docker’s 2024 milestones and how we helped teams build, test, and deploy with greater ease, security, and control than ever.


Streamlining the developer experience

Docker focused heavily on streamlining workflows, creating efficiencies, and reducing the complexities often associated with managing multiple tools. One big announcement in 2024 was our upgraded Docker plans. With the launch of updated Docker subscriptions, developers now have access to the entire suite of Docker products under their existing subscription. 

The all-in-one subscription model enables seamless integration of Docker Desktop, Docker Hub, Docker Build Cloud, Docker Scout, and Testcontainers Cloud, giving developers everything they need to build efficiently. By providing easy access to the suite of products and flexibility to scale, Docker allows developers to focus on what matters most — building and innovating without unnecessary distractions.

For more details on Docker’s all-in-one subscription approach, check out our Docker plans announcement.

Build up to 39x faster with Docker Build Cloud

Docker Build Cloud, introduced in 2024, brings the best of two worlds — local development and the cloud to developers and engineering teams worldwide. It offloads resource-intensive build processes to the cloud, ensuring faster, more consistent builds while freeing up local machines for other tasks.

A standout feature is shared build caches, which dramatically improve efficiency for engineering teams working on large-scale projects. Shared caches allow teams to avoid redundant rebuilds by reusing intermediate layers of images across builds, accelerating iteration cycles and reducing resource consumption. This approach is especially valuable for collaborative teams working on shared codebases, as it minimizes duplicated effort and enhances productivity.

Docker Build Cloud also offers native support for multi-architecture builds, eliminating the need for setting up and maintaining multiple native builders. This support removes the challenges associated with emulation, further improving build efficiency.

We’ve designed Docker Build Cloud to be easy to set up wherever you run your builds, without requiring a massive lift-and-shift effort. Docker Build Cloud also works well with Docker Compose, GitHub Actions, and other CI solutions. This means you can seamlessly incorporate Docker Build Cloud into your existing development tools and services and immediately start reaping the benefits of enhanced speed and efficiency.
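
Here’s a hedged sketch of onboarding: creating a cloud builder and running a multi-architecture build through it. This assumes a Docker Build Cloud builder named default in a myorg organization; the names are placeholders.

# Register the cloud builder locally
docker buildx create --driver cloud myorg/default

# Build for two architectures using the cloud builder and push the result
docker buildx build --builder cloud-myorg-default \
  --platform linux/amd64,linux/arm64 \
  -t myorg/myapp:latest --push .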

Check out our build time savings calculator to estimate your potential savings in hours and dollars. 

Optimizing development workflows with performance enhancements

In 2024, Docker Desktop introduced a series of enterprise-grade performance enhancements designed to streamline development workflows at scale. These updates cater to the unique needs of development teams operating in diverse, high-performance environments.

One notable feature is the Virtual Machine Manager (VMM) in Docker Desktop for Mac, which provides a robust alternative to the Apple Virtualization Framework. Available since Docker Desktop 4.35, VMM significantly boosts performance for native Arm-based images, delivering faster and more efficient workflows for M1 and M2 Mac users. For development teams relying on Apple’s latest hardware, this enhancement translates into reduced build times and a smoother experience when working with containerized applications.

Additionally, Docker Desktop expanded its platform support to include Red Hat Enterprise Linux (RHEL) and Windows on Arm architectures, enabling organizations to maintain a consistent Docker Desktop experience across a wide array of operating systems. This flexibility ensures that development teams can optimize their workflows regardless of the underlying platform, leveraging platform-specific optimizations while maintaining uniformity in their tooling.

These advancements reflect Docker’s unwavering commitment to speed, reliability, and cross-platform support, ensuring that development teams can scale their operations without bottlenecks. By minimizing downtime and enhancing performance, Docker Desktop empowers developers to focus on innovation, improving productivity across even the most demanding enterprise environments.

More options to improve file operations for large projects

We enhanced Docker Desktop with synchronized file shares (Figure 1), a feature that can significantly improve file operation speeds by 2-10x. This enhancement brings fast and flexible host-to-VM file sharing, offering a performance boost for developers dealing with extensive codebases.

Synchronized file sharing is ideal for developers who:

  • Develop on projects that consist of a significant number of files (such as PHP or Node projects).
  • Develop using large repositories or monorepos with more than 100,000 files, totaling significant storage.
  • Utilize virtual file systems (such as VirtioFS, gRPC FUSE, or osxfs) and face scalability issues with their workflows.
  • Encounter performance limitations and want a seamless file-sharing solution without worrying about ownership conflicts.

This integration streamlines workflows, allowing developers to focus more on coding and less on managing file synchronization issues and slow file read times. 

Figure 1: Synchronized file shares.

Enhancing developer productivity with Docker Debug 

Docker Debug enhances the ability of developer teams to debug any container, especially those without a shell (that is, distroless or scratch images). The ability to peek into “secure” images significantly improves the debugging experience for both local and remote containerized applications. 

Docker Debug does this by attaching a dedicated debugging toolkit to any image and allows developers to easily install additional tools for quick issue identification and resolution. Docker Debug not only streamlines debugging for both running and stopped containers but also is accessible directly from both the Docker Desktop CLI and GUI (Figure 2). 

Figure 2: Docker Debug.

Being able to troubleshoot images without modifying them is crucial for maintaining the security and performance of containerized applications, especially those images that traditionally have been hard to debug. Docker Debug offers:

  • Streamlined debugging process: Easily debug local and remote containerized applications, even those not running, directly from Docker Desktop.
  • Cross-device and cloud compatibility: Initiate debugging effortlessly from any device, whether local or in the cloud, enhancing flexibility and productivity.

Docker Debug improves productivity and seamless integration. The docker debug command simplifies attaching a shell to any container or image. This capability reduces the cognitive load on developers, allowing them to focus on solving problems rather than configuring their environment. 
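
For example (the container and image names are placeholders; Docker Debug is available on paid Docker plans):

# Attach a debug shell, with a toolbox of utilities, to a running
# container, even one built from a shell-less distroless image
docker debug my-running-container

# Debug an image directly; a temporary container is created for the session
docker debug nginx:latest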

Ensuring reliable image builds with Docker Build checks

Docker Desktop 4.33 was a big release because, in addition to including the GA release of Docker Debug, it included the GA release of Docker Build checks, a new feature that ensures smoother and more reliable image builds. Build checks automatically validate common issues in your Dockerfiles before the build process begins, catching errors like invalid syntax, unsupported instructions, or missing dependencies. By surfacing these issues upfront, Docker Build checks help developers save time and avoid costly build failures.

You can access Docker Build checks in the CLI and in the Docker Desktop Builds view. The feature also works seamlessly with Docker Build Cloud, both locally and through CI. Whether you’re optimizing your Dockerfiles or troubleshooting build errors, Docker Build checks let you create efficient, high-quality container images with confidence — streamlining your development workflow from start to finish.
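
For instance, with a recent Buildx you can run the checks on their own, without executing the build (a sketch; exact behavior may vary by version):

# Validate the Dockerfile against build checks without building
docker build --check .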

Onboarding and learning resources for developer success  

To further reduce friction, Docker revamped its learning resources and integrated new tools to enhance developer onboarding. By adding beginner-friendly tutorials, Docker’s learning center makes it easier for developers to ramp up and quickly learn to use Docker tools, helping them spend more time coding and less time troubleshooting. 

As Docker continues to rank as a top developer tool globally, we’re dedicated to empowering our community with continuous learning support.

Built-in container security from code to production

In an era where software supply chain security is essential, Docker has raised the bar on container security. With integrated security measures across every phase of the development lifecycle, Docker helps teams build, test, and deploy confidently.

Proactive security insights with Docker Scout Health Scores

Docker Scout, launched in 2023, has become a cornerstone of Docker’s security ecosystem, empowering developer teams to identify and address vulnerabilities in container images early in the development lifecycle. By integrating with Docker Hub, Docker Desktop, and CI/CD workflows, Scout ensures that security is seamlessly embedded into every build. 

Addressing vulnerabilities during the inner loop — the development phase — is estimated to be up to 100 times less costly than fixing them in production. This underscores the critical importance of early risk visibility and remediation for engineering teams striving to deliver secure, production-ready software efficiently.

In 2024, we announced Docker Scout Health Scores (Figure 3), a feature designed to better communicate the security posture of container images development teams use every day. Docker Scout Health Scores provide a clear, alphabetical grading system (A to F) that evaluates common vulnerabilities and exposures (CVEs) for software components within Docker Hub. This feature allows developers to quickly assess and wisely choose trusted content for a secure software supply chain. 

Figure 3: Docker Scout health score.

For a deeper dive, check out our blog post on enhancing container security with Docker Scout and secure repositories.

Air-gapped containers: Enhanced security for isolated environments

Docker introduced support for air-gapped containers in Docker Desktop 4.31, addressing the unique needs of highly secure, offline environments. Air-gapped containers enable developers to build, run, and test containerized applications without requiring an active internet connection. 

This feature is crucial for organizations operating in industries with stringent compliance and security requirements, such as government, healthcare, and finance. By allowing developers to securely transfer container images and dependencies to air-gapped systems, Docker simplifies workflows and ensures that even isolated environments benefit from the power of containerization.

Strengthening trust with SOC 2 Type 2 and ISO 27001 certifications

Docker also achieved two major milestones in its commitment to security and reliability: SOC 2 Type 2 attestation and ISO 27001 certification. These globally recognized standards validate Docker’s dedication to safeguarding customer data, maintaining robust operational controls, and adhering to stringent security practices. SOC 2 Type 2 attestation focuses on the effective implementation of security, availability, and confidentiality controls, while ISO 27001 certification ensures compliance with best practices for managing information security systems.

These certifications provide developers and organizations with increased confidence in Docker’s ability to support secure software supply chains and protect sensitive information. They also demonstrate Docker’s focus on aligning its services with the needs of modern enterprises.

Accelerating success for development teams and organizations

In 2024, Docker introduced a range of features and enhancements designed to empower development teams and streamline operations across organizations. From harnessing the potential of AI to simplifying deployment workflows and improving security, Docker’s advancements are focused on enabling teams to work smarter and build with confidence. By addressing key challenges in development, management, and security, Docker continues to drive meaningful outcomes for developers and businesses alike.

Docker Home: A central hub to access and manage Docker products

Docker introduced Docker Home (Figure 4), a central hub for users to access Docker products, manage subscriptions, adjust settings, and find resources — all in one place. This approach simplifies navigation for developers and admins. Docker Home allows admins to manage organizations, users, and onboarding processes, with access to dashboards for monitoring Docker usage.

Future updates will add personalized features for different roles, and business subscribers will gain access to tools like the Docker Support portal and organization-wide notifications.

Screenshot of Docker Home showing options to explore Docker products, Admin console, and more.
Figure 4: Docker Home.

Empowering AI innovation  

Docker’s ecosystem supports AI/ML workflows, helping developers work with these cutting-edge technologies while staying cloud-native and agile. Read the Docker Labs GenAI series to see how we’re innovating and experimenting in the open.

Through partnerships like those with NVIDIA and GitHub, Docker ensures seamless integration of AI tools, allowing teams to rapidly experiment, deploy, and iterate. This emphasis on enabling advanced tech aligns Docker with organizations looking to leverage AI and ML in containerized environments.

Optimizing AI application development with Docker Desktop and NVIDIA AI Workbench

Docker and NVIDIA partnered to integrate Docker Desktop with NVIDIA AI Workbench, streamlining AI development workflows. This collaboration simplifies setup by automatically installing Docker Desktop when selected as the container runtime in AI Workbench, allowing developers to focus on creating, testing, and deploying AI models without configuration hassles. By combining Docker’s containerization capabilities with NVIDIA’s advanced AI tools, this integration provides a seamless platform for model training and deployment, enhancing productivity and accelerating innovation in AI application development. 

Docker + GitHub Copilot: AI-powered developer productivity

We announced that Docker joined GitHub’s Partner Program and unveiled the Docker extension for GitHub Copilot (@docker). This extension is designed to assist developers in working with Docker directly within their GitHub workflows. This integration extends GitHub Copilot’s technology, enabling developers to generate Docker assets, learn about containerization, and analyze project vulnerabilities using Docker Scout, all from within the GitHub environment.

Accelerating AI development with the Docker AI catalog

Docker launched the AI Catalog, a curated collection of generative AI images and tools designed to simplify and accelerate AI application development. This catalog offers developers access to powerful models like IBM Granite, Llama, Mistral, Phi 2, and SolarLLM, as well as applications such as JupyterHub and H2O.ai. By providing essential tools for machine learning, model deployment, inference optimization, orchestration, ML frameworks, and databases, the AI Catalog enables developers to build and deploy AI solutions more efficiently. 

The Docker AI Catalog addresses common challenges in AI development, such as decision overload from the vast array of tools and frameworks, steep learning curves, and complex configurations. By offering a curated list of trusted content and container images, Docker simplifies the decision-making process, allowing developers to focus on innovation rather than setup. This initiative underscores Docker’s commitment to empowering developers and publishers in the AI space, fostering a more streamlined and productive development environment. 

Streamlining enterprise administration 

Simplified deployment and management with Docker’s MSI and PKG installers

Docker simplifies deploying and managing Docker Desktop with the new MSI Installer for Windows and PKG Installer for macOS. The MSI Installer enables silent installations, automated updates, and login enforcement, streamlining workflows for IT admins. Similarly, the PKG Installer offers macOS users easy deployment and management with standard tools. These installers enhance efficiency, making it easier for organizations to equip teams and maintain secure, compliant environments.

These new installers also align with Docker’s commitment to simplifying the developer experience and improving organizational management. Whether you’re setting up a few machines or deploying Docker Desktop across an entire enterprise, these tools provide a reliable and efficient way to keep teams equipped and ready to build.
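
As a sketch, a scripted rollout might look like the following; ALLOWEDORG is a documented MSI property for sign-in enforcement, and "myorg" is a placeholder organization name:

    # Windows: silent install with verbose logging and sign-in enforcement
    # (run from an elevated prompt or via your endpoint-management tool).
    msiexec /i "DockerDesktop.msi" /L*V install.log /quiet ALLOWEDORG="myorg"

    # macOS: scripted install of the PKG using the standard installer tool.
    sudo installer -pkg Docker.pkg -target /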

New sign-in enforcement options enhance security and help streamline IT administration 

Docker simplifies IT administration and strengthens organizational security with new sign-in enforcement options for Docker Desktop. These features allow organizations to ensure users are signed in while using Docker, aligning local software with modern security standards. With flexible deployment options — including macOS Config Profiles, Windows Registry Keys, and the cross-platform registry.json file — IT administrators can easily enforce policies that prevent tampering and enhance security. These tools empower organizations to manage development environments more effectively, providing a secure foundation for teams to build confidently.
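
For instance, the cross-platform registry.json approach comes down to placing a small file before Docker Desktop starts. A minimal sketch for Linux ("myorg" is a placeholder; Windows and macOS use different file locations, so check the sign-in enforcement docs):

    # Require users to sign in to an account in the "myorg" organization
    # before they can use Docker Desktop.
    sudo mkdir -p /usr/share/docker-desktop/registry
    cat <<'EOF' | sudo tee /usr/share/docker-desktop/registry/registry.json
    {
      "allowedOrgs": ["myorg"]
    }
    EOF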

Desktop Insights: Unlocking performance and usage analytics

Docker introduced Desktop Insights, a powerful feature that provides developers and teams with actionable analytics to optimize their use of Docker Desktop. Accessible through the Docker Dashboard, Desktop Insights offers a detailed view of resource usage, build times, and performance metrics, helping users identify inefficiencies and fine-tune their workflows (Figure 5).

Whether you’re tracking the speed of container builds or understanding how resources like CPU and memory are being utilized, Desktop Insights empowers developers to make data-driven decisions. By bringing transparency to local development environments, this feature aligns with Docker’s mission to streamline container workflows and ensure developers have the tools to build faster and more effectively.

Screenshot of Docker Insights within Admin console, showing data for Total active users, Users with license, Total Builds, Total Containers run, and more
Figure 5: Desktop Insights dashboard.

New usage dashboards in Docker Hub

Docker introduced Usage dashboards in Docker Hub, giving organizations greater visibility into how they consume resources. These dashboards provide detailed insights into storage and image pull activity, helping teams understand their usage patterns at a granular level (Figure 6). 

By breaking down data by repository, tag, and even IP address, the dashboards make it easy to identify high-traffic images or repositories that might require optimization. With this added transparency, teams can better manage their storage, avoid unnecessary image pulls, and optimize workflows to control costs. 

Usage dashboards enhance accountability and empower organizations to fine-tune their Docker Hub usage, ensuring resources are used efficiently and effectively across all projects.

Screenshot of Docker Usage dashboard showing a graph of daily pulls over time.
Figure 6: Usage dashboard.

Enhancing security with organization access tokens

Docker introduced organization access tokens, which let teams manage access to Docker Hub repositories at an organizational level. Unlike personal access tokens tied to individual users, these tokens are associated with the organization itself, allowing for centralized control and reducing reliance on individual accounts. This approach enhances security by enabling fine-grained permissions and simplifying the management of automated processes and CI/CD pipelines. 

Organization access tokens offer several advantages, including the ability to set specific access permissions for each token, such as read or write access to selected repositories. They also support expiration dates, aligning with compliance requirements and bolstering security. By providing visibility into token usage and centralizing management within the Admin Console, these tokens streamline operations and improve governance for organizations of all sizes. 
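
In practice, a CI job might authenticate like this. A sketch: per Docker Hub’s token documentation, the organization name serves as the username and the token as the password; OAT_TOKEN is a placeholder secret injected by the CI system:

    # Log in with the organization access token instead of a personal account.
    echo "$OAT_TOKEN" | docker login -u myorg --password-stdin

    # Subsequent pulls and pushes run under the token's scoped permissions.
    docker pull myorg/private-image:1.2.3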

Docker’s vision for 2025

Docker’s journey doesn’t end here. In 2025, Docker remains committed to expanding its support for cloud-native and AI/ML development, reinforcing its position as the go-to container platform. New integrations and expanded multi-cloud capabilities are on the horizon, promising a more connected and versatile Docker ecosystem.

As Docker continues to build for the future, we’re committed to empowering developers, supporting the open source community, and driving efficiency in software development at scale. 

2024 was a year of transformation for Docker and the developer community. With major advances in our product suite, continued focus on security, and streamlined experiences that deliver value, Docker is ready to help developer teams and organizations succeed in an evolving tech landscape. As we head into 2025, we invite you to explore Docker’s suite of tools and see how Docker can help your team build, innovate, and secure software faster than ever.

Learn more

From Legacy to Cloud-Native: How Docker Simplifies Complexity and Boosts Developer Productivity

By: Yiwen Xu
13 December 2024 at 20:30

Modern application development has evolved dramatically. Gone are the days when a couple of developers, a few machines, and some pizza were enough to launch an app. As the industry grew, DevOps revolutionized collaboration, and Docker popularized containerization, simplifying workflows and accelerating delivery. 

Later, DevSecOps brought security into the mix. Fast forward to today, and the demand for software has never been greater, with more than 750 million cloud-native apps expected by 2025.

This explosion in demand has created a new challenge: complexity. Applications now span multiple programming languages, frameworks, and architectures, integrating both legacy and modern systems. Development workflows must navigate hybrid environments — local, cloud, and everything in between. This complexity makes it harder for companies to deliver innovation on time and stay competitive. 


To overcome these challenges, you need a development platform that’s as reliable and ubiquitous as electricity or Wi-Fi — a platform that works consistently across diverse applications, development tools, and environments. Whether you’re just starting to move toward microservices or fully embracing cloud-native development, Docker meets your team where they are, integrates seamlessly into existing workflows, and scales to meet the needs of individual developers, teams, and entire enterprises.

Docker: Simplifying the complex

The Docker suite of products provides the tools you need to accelerate development, modernize legacy applications, and empower your team to work efficiently and securely. With Docker, you can:

  • Modernize legacy applications: Docker makes it easy to containerize existing systems, bringing them closer to modern technology stacks without disrupting operations.
  • Boost productivity for cloud-native teams: Docker ensures consistent environments, integrates with CI/CD workflows, supports hybrid development environments, and enhances collaboration.

Consistent environments: Build once, run anywhere

Docker ensures consistency across development, testing, and production environments, eliminating the dreaded “works on my machine” problem. With Docker, your team can build applications in unified environments — whether on macOS, Windows, or Linux — for reliable code, better collaboration, and faster time to market.

With Docker Desktop, developers have a powerful GUI and CLI for managing containers locally. Integration with popular IDEs like Visual Studio Code allows developers to code, build, and debug within familiar tools. Built-in Kubernetes support enables teams to test and deploy applications on a local Kubernetes cluster, giving developers confidence that their code will perform in production as expected.
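
As a minimal sketch of that consistency (the base image and app layout here are illustrative), the same definition builds and runs identically on any of those platforms:

    # Write a simple image definition; it behaves the same on every OS.
    cat > Dockerfile <<'EOF'
    FROM node:20-alpine
    WORKDIR /app
    COPY package*.json ./
    RUN npm ci
    COPY . .
    CMD ["node", "server.js"]
    EOF

    # Build once, run anywhere Docker is installed.
    docker build -t myapp:dev .
    docker run --rm -p 3000:3000 myapp:dev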

Integrated workflows for hybrid environments

Development today spans both local and cloud environments. Docker bridges the gap and provides flexibility with solutions like Docker Build Cloud, which speeds up build pipelines by up to 39x using cloud-based, multi-platform builders. This allows developers to focus more on coding and innovation, rather than waiting on builds.

Docker also integrates seamlessly with CI/CD tools like Jenkins, GitLab CI, and GitHub Actions. This automation reduces manual intervention, enabling consistent and reliable deployments. Whether you’re building in the cloud or locally, Docker ensures flexibility and productivity at every stage.
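
A sketch of what adopting Build Cloud looks like ("myorg/default" is a placeholder cloud builder; the same builder can be shared by laptops and CI runners alike):

    # Create (once) a builder backed by Docker Build Cloud.
    docker buildx create --driver cloud myorg/default

    # Run a multi-platform build on the cloud builder and push the result.
    docker buildx build \
      --builder cloud-myorg-default \
      --platform linux/amd64,linux/arm64 \
      -t myorg/myapp:1.0.0 --push .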

Team collaboration: Better together

Collaboration is central to Docker. With integrations like Docker Hub and other registries, teams can easily share container images and work together on builds. Docker Desktop features like Docker Debug and the Builds view dashboards empower developers to troubleshoot issues together, speeding up resolution and boosting team efficiency.
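
For example, Docker Debug can attach a tool-equipped shell to a running container — even a slim or distroless one — without modifying the image. A sketch, where "web" is a placeholder container name:

    # Open a debug shell inside the "web" container; common debugging
    # tools come from the debug image, not from your application image.
    docker debug web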

Docker Scout provides actionable security insights, helping teams identify and resolve vulnerabilities early in the development process. With these tools, Docker fosters a collaborative environment where teams can innovate faster and more securely.

Why Docker?

In today’s fast-paced development landscape, complexity can slow you down. Docker’s unified platform reduces complexity as it simplifies workflows, standardizes environments, and empowers teams to deliver software faster and more securely. Whether you’re modernizing legacy applications, bridging local and cloud environments, or building cutting-edge, cloud-native apps, Docker helps you achieve efficiency and scale at every stage of the development lifecycle.

Docker offers a unified platform that combines industry-leading tools — Docker Desktop, Docker Hub, Docker Build Cloud, Docker Scout, and Testcontainers Cloud — into a seamless experience. Docker’s flexible plans ensure there’s a solution for every developer and every team, from individual contributors to large enterprises.

Get started today

Ready to simplify your development workflows? Start your Docker journey now and equip your team with the tools they need to innovate, collaborate, and deliver with confidence.

Looking for tips and tricks? Subscribe to Docker Navigator for the latest updates and insights delivered straight to your inbox.

Learn more

Learn How to Optimize Docker Hub Costs With Our Usage Dashboards

13 November 2024 at 21:46

Effective infrastructure management is crucial for organizations using Docker Hub. Without a clear understanding of resource consumption, unexpected usage can emerge and skyrocket. This is particularly true if pulls and storage needs are not budgeted and forecasted correctly. By implementing proactive cost controls and monitoring usage patterns, development teams can sustain their Docker Hub usage while keeping expenses under control. 

To support these goals, we’ve introduced new Docker Hub Usage dashboards, offering organizations the ability to access and analyze their usage patterns for storage and pulls. 

Docker Hub’s Usage dashboards put you in control, giving visibility into every pull and image your Docker systems request. Each pull and cached image becomes a deliberate choice — not a random event — so you can make every byte count. With clear insights into what’s happening and why, you can design more efficient, optimized systems.


Reclaim control and manage technical resources by kicking bad habits

Figure 1: Docker Hub Usage dashboards.

The Docker Hub Usage dashboards (Figure 1) provide valuable insights, allowing teams to track peaks and valleys, detect high usage periods, and identify the images and repositories driving the most consumption. This visibility not only aids in managing usage but also strengthens continuous improvement efforts across your software supply chain, helping teams build applications more efficiently and sustainably. 

This information helps development teams to stay on top of challenges, such as: 

  • Redundant pulls and misconfigured repositories: These can quickly and quietly drive up technical expenses while falling out of scope of the most relevant or critical use cases. Docker Hub’s Usage dashboards can help development teams identify patterns and optimize accordingly. They let you view usage trends across IPs and users as well, which helps with pinpointing high consumption areas and ensuring accountability in an organization when it comes to resource management. 
  • Poor caching management: Repository insights and image tagging help teams assess internal usage patterns, such as frequently accessed images, where there might be an opportunity to improve caching. With proper governance models, organizations can also establish policies and processes that reduce the variability of resource usage as a whole. This goes beyond tracking seasonal usage patterns: it helps you design more predictable usage so you can budget accordingly. 
  • Accidental automation: Automated system activities can quietly inflate your usage. Say a CI/CD pipeline or automated script is configured to pull images more often than it should — pulling on every build rather than only when the version changes, for example. 

Usage dashboards can help you identify these inefficiencies by showing detailed pull data associated with automated tooling. This information can help your teams quickly identify and adjust misconfigured systems, fine-tune automations to only pull when needed, and ultimately focus on the most relevant use cases for your organization, avoiding accidental overuse of resources:

Details of Docker Hub Usage Dashboard with columns for Date/hour, Username, Repository library, IPs, Version checks, pulls, and more.
Figure 2: Details from the Usage dashboards.

Docker Hub’s Usage dashboards offer a comprehensive view of your usage data, including downloadable CSV reports that include metrics such as pull counts, repository names, IP addresses, and version checks (Figure 2). This granular approach allows your organization to gain valuable insights and trend data to help optimize your team’s workflows and inform policies. 
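
Because the reports are plain CSV, they slot into ordinary tooling. A quick sketch (the column positions are assumptions based on the layout above — adjust them to match your export):

    # Rank repositories by total pulls, assuming the repository name is in
    # column 3 and the pull count in column 6 of the exported CSV.
    tail -n +2 usage-report.csv \
      | awk -F',' '{pulls[$3] += $6} END {for (r in pulls) print pulls[r], r}' \
      | sort -rn | head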

Integrate robust operational principles into your development pipeline by leveraging these data-driven reports and maintain control over resource consumption and operational efficiency with Docker Hub. 

Learn more

Accelerating AI Development with the Docker AI Catalog

12 November 2024 at 21:38

Developers are increasingly expected to integrate AI capabilities into their applications, but they face many challenges: the steep learning curve, coupled with an overwhelming array of tools and frameworks, can make the process daunting. Docker aims to bridge this gap with the Docker AI Catalog, a curated experience designed to simplify AI development and empower both developers and publishers.


Why Docker for AI?

Docker and container technology have been key tools for developers at the forefront of AI applications over the past few years. Now, Docker is doubling down on that effort with our AI Catalog. Developers using Docker’s suite of products are often responsible for building, deploying, and managing complex applications — and, now, they must also navigate generative AI (GenAI) technologies, such as large language models (LLMs), vector databases, and GPU support.

For developers, the AI Catalog simplifies the process of integrating AI into applications by providing trusted and ready-to-use content supported by comprehensive documentation. This approach removes the hassle of evaluating numerous tools and configurations, allowing developers to focus on building innovative AI applications.

Key benefits for development teams

The Docker AI Catalog is tailored to help users overcome common hurdles in the evolving AI application development landscape, such as:

  • Decision overload: The GenAI ecosystem is crowded with new tools and frameworks. The Docker AI Catalog simplifies the decision-making process by offering a curated list of trusted content and container images, so developers don’t have to wade through endless options.
  • Steep learning curve: With the rise of new technologies like LLMs and retrieval-augmented generation (RAG), the learning curve can be overwhelming. Docker provides an all-in-one resource to help developers quickly get up to speed.
  • Complex configurations preventing production readiness: Running AI applications often requires specialized hardware configurations, especially with GPUs. Docker’s AI stacks make this process more accessible, ensuring that developers can harness the full power of these resources without extensive setup.

The result? Shorter development cycles, improved productivity, and a more streamlined path to integrating AI into both new and existing applications.
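
To make the GPU point concrete: once the host has NVIDIA drivers and the NVIDIA Container Toolkit installed, exposing a GPU to a container is a one-flag affair — the kind of configuration the catalog’s stacks package up for you. A sketch using NVIDIA’s standard smoke test:

    # Verify GPU access from inside a container; the toolkit injects
    # nvidia-smi and the driver libraries at run time.
    docker run --rm --gpus all ubuntu nvidia-smi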

Empowering publishers

For Docker verified publishers, the AI Catalog provides a platform to differentiate themselves in a crowded market. Independent software vendors (ISVs) and open source contributors can promote their content, gain insights into adoption, and improve visibility to a growing community of AI developers.

Key features for publishers include:

  • Increased discoverability: Publishers can highlight their AI content within a trusted ecosystem used by millions of developers worldwide.
  • Metrics and insights: Verified publishers gain valuable insights into the performance of their content, helping them optimize strategies and drive engagement.

Unified experience for AI application development

The AI Catalog is more than just a repository of AI tools. It’s a unified ecosystem designed to foster collaboration between developers and publishers, creating a path forward for more innovative approaches to building applications supported by AI capabilities. Developers get easy access to essential AI tools and content, while publishers gain the visibility and feedback they need to thrive in a competitive marketplace.

With Docker’s trusted platform, development teams can build AI applications confidently, knowing they have access to the most relevant and reliable tools available.

The road ahead: What’s next?

Docker will launch the AI Catalog in preview on November 12, 2024, alongside a joint webinar with MongoDB. This initiative will further Docker’s role as a leader in AI application development, ensuring that developers and publishers alike can take full advantage of the opportunities presented by AI tools.

Stay tuned for more updates and prepare to dive into a world of possibilities with the Docker AI Catalog. Whether you’re an AI developer seeking to streamline your workflows or a publisher looking to grow your audience, Docker has the tools and support you need to succeed.

Ready to simplify your AI development process? Explore the AI Catalog and get access to trusted content that will accelerate your development journey. Start building smarter, faster, and more efficiently.

For publishers, now is the perfect time to join the AI Catalog and gain visibility for your content. Become a trusted source in the AI development space and connect with millions of developers looking for the right tools to power their next breakthrough.

Learn more
