


IT Automation: a key enabler for enterprise AI adoption

4 December 2024 at 07:00
Having spent my career in the technology industry, I've had the opportunity to experience major shifts in the field through my work with customers. Specifically in the last decade, my projects have consistently involved at least one of three trends: advanced data analytics/artificial intelligence, automation and IoT/edge computing. It's fascinating to observe how these areas continue to converge, transforming all industries by enabling smarter, more efficient, real-time decision-making. AI is vital for companies to enhance efficiency, drive innovation and improve customer satisfaction. IT env

Migrating Windows virtual machines to OpenShift Virtualization on Red Hat OpenShift Service on AWS

3 December 2024 at 07:00
Red Hat OpenShift Service on AWS (ROSA) provides customers with a single pane of glass to view and manage their virtual machines (VMs) and containers in the same environment, along with common tooling for observability, governance, and infrastructure as code. The initial release of OpenShift Virtualization on ROSA gave customers a unified way to manage their VMs alongside their containerized workloads, but lacked a way for customers to remain compliant with Windows licensing and to pay AWS and Microsoft for those licenses. We are pleased to announce the public preview of Windows Server li

Achieve better large language model inference with fewer GPUs

3 December 2024 at 07:00
As enterprises increasingly adopt large language models (LLMs) into their mission-critical applications, improving inference run-time performance is becoming essential for operational efficiency and cost reduction. With the MLPerf 4.1 inference submission, Red Hat OpenShift AI, running vLLM, delivers groundbreaking performance results on the Llama-2-70b inference benchmark on a Dell R760xa server with 4x NVIDIA L40S GPUs. The NVIDIA L40S GPU offers competitive inference performance thanks to its support for 8-bit floating point (FP8) precision. Applying FP8
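
The excerpt cuts off just as it reaches the FP8 details, so here is only a rough sketch of the general technique, not the MLPerf configuration itself: vLLM's offline Python API can load a model with FP8 weight quantization spread across several GPUs. The model name, prompt and sampling settings below are illustrative placeholders, and exact options may differ between vLLM releases.

# Minimal sketch: FP8-quantized inference with vLLM's offline Python API.
# Assumptions: vLLM is installed, the GPUs support FP8 (e.g. NVIDIA L40S),
# and the model id and prompt are placeholders, not the MLPerf setup.
from vllm import LLM, SamplingParams

# Load the model with FP8 weight quantization, which roughly halves weight
# memory versus FP16 and leaves more room for KV cache and larger batches.
llm = LLM(
    model="meta-llama/Llama-2-70b-chat-hf",  # placeholder model id
    quantization="fp8",                      # dynamic FP8 weight quantization
    tensor_parallel_size=4,                  # spread the model across 4 GPUs
)

sampling = SamplingParams(temperature=0.0, max_tokens=128)
outputs = llm.generate(["Summarize the benefits of FP8 inference."], sampling)
print(outputs[0].outputs[0].text)

Halving weight precision from 16 to 8 bits roughly halves the memory the weights occupy, which is what frees capacity for bigger batches and longer sequences on the same four GPUs.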

Announcing the Open Container Initiative Referrers API on Quay.io: A step towards enhanced security and compliance

2 December 2024 at 07:00
To help enhance IT security and compliance, we are announcing the availability of the Open Container Initiative (OCI) Referrers API on Quay.io. This new feature aligns with our recent improvements in OCI content discovery compliance, helping Quay.io meet current OCI standards. The introduction of the Referrers API is a significant advancement that enables better artifact management and supports secure software supply chain practices. Why is this relevant? As software supply chain attacks have surged, establishing provenance for container images has become increasingly critical. The OCI Referrers
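
For orientation, and as a sketch under assumptions rather than Quay.io-specific documentation: the OCI 1.1 Referrers API exposes a GET /v2/&lt;name&gt;/referrers/&lt;digest&gt; endpoint that returns an OCI image index listing the artifacts (signatures, SBOMs, attestations) attached to an image. A minimal Python query against a public Quay.io repository might look like the following; the repository name and digest are placeholders.

# Rough sketch: listing OCI referrers for an image on Quay.io.
# The repository and digest below are placeholders; anonymous pull tokens
# work only for public repositories.
import requests

REGISTRY = "quay.io"
REPO = "myorg/myimage"               # placeholder repository
DIGEST = "sha256:<manifest-digest>"  # placeholder manifest digest

# 1. Obtain a pull token via the standard registry token-auth flow.
token = requests.get(
    f"https://{REGISTRY}/v2/auth",
    params={"service": REGISTRY, "scope": f"repository:{REPO}:pull"},
).json()["token"]

# 2. Call the OCI 1.1 referrers endpoint; the response is an OCI image index
#    whose manifests[] entries describe attached artifacts (SBOMs, signatures).
resp = requests.get(
    f"https://{REGISTRY}/v2/{REPO}/referrers/{DIGEST}",
    headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.oci.image.index.v1+json",
    },
)
resp.raise_for_status()
for descriptor in resp.json().get("manifests", []):
    print(descriptor.get("artifactType"), descriptor.get("digest"))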

Red Hat Ansible Automation Platform Service on AWS now available in the AWS Marketplace

2 December 2024 at 07:00
Red Hat has released the latest cloud offering for Red Hat Ansible Automation Platform. Ansible Automation Platform Service on AWS is a Red Hat managed service available in AWS Marketplace. This new offer saves customers time and money by enabling them to focus on innovation through automation instead of managing the platform. The Red Hat Ansible Automation Platform Service on AWS scales to meet the demands of enterprises, simplifies networking, and brings automation closer to workloads. Additionally, when purchased in AWS Marketplace, it applies to committed spend agreements (EDP - AWS Enterp

Friday Five – November 29, 2024

29 November 2024 at 07:00
CRN - Red Hat Updates Present 'Huge' Partner Opportunities in OpenShift, Edge
Improved capabilities around virtualization in Red Hat OpenShift, more model training support in Red Hat OpenShift AI, improved low-latency capabilities in Red Hat Device Edge and new artificial intelligence templates in Red Hat Developer Hub offer partners more ways to do business with customers. Learn more

Cloud Native Now - Red Hat to Donate Podman Along With Other Container Tools to CNCF
Red Hat recently announced its intent to donate multiple tools for creating and managing containers, including Podman and Po

Shaping the future with open source innovation

28 November 2024 at 07:00
This month, I'm celebrating two milestones. First, it's my eighth year at Red Hat, a journey marked by extraordinary growth and collaboration. Second, this month also marks the five-year anniversary of IBM's acquisition of Red Hat - a partnership that has deepened Red Hat's reach while allowing us to remain true to our mission. IBM's Q3 results highlight just how impactful that partnership has been. Double-digit growth in software, with Red Hat revenue accelerating to an impressive 14%, is a testament to open source's growing dominance in enterprise technology and Red Hat's leader

InstructLab tutorial: Installing and fine-tuning your first AI model (part 2)

26 November 2024 at 07:00
In the first part of this article, you learned some key concepts, tested InstructLab and successfully chatted with the out-of-the-box model. In this article, I'll show you how to infuse your knowledge into the model, training it with a sample dataset about Brazilian soccer teams.

Preparing your system for training
If you were running or chatting with your model, make sure to stop both the chat and the server instances. If you are running Red Hat Enterprise Linux AI (RHEL AI), elevate yourself to root:
sudo su -
Now, download the model that we will be using as the teacher (the question generato

Doing more with less: LLM quantization (part 2)

22 November 2024 at 07:00
What if you could get similar results from your large language model (LLM) with 75% less GPU memory? In my previous article, we discussed the benefits of smaller LLMs and some of the techniques for shrinking them. In this article, we'll put this to the test by comparing the results of the smaller and larger versions of the same LLM. As you'll recall, quantization is one of the techniques for reducing the size of an LLM. Quantization achieves this by representing the LLM parameters (e.g. weights) in lower precision formats: from 32-bit floating point (FP32) to 8-bit integer (INT8) or INT4. The
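
As a back-of-the-envelope illustration of where the 75% figure comes from (not code from the article): weight memory scales with bytes per parameter, so moving from FP32 to INT8 cuts it by 4x and INT4 by 8x. The parameter count below is an example value only.

# Rough weight-memory estimate for an LLM at different precisions.
# Only parameter storage is counted; KV cache, activations and runtime
# overhead are ignored.
BYTES_PER_PARAM = {"FP32": 4.0, "FP16": 2.0, "INT8": 1.0, "INT4": 0.5}

def weight_memory_gb(num_params: float, precision: str) -> float:
    """Approximate weight memory in gigabytes for a given precision."""
    return num_params * BYTES_PER_PARAM[precision] / 1e9

params = 70e9  # e.g. a 70B-parameter model
for precision in ("FP32", "FP16", "INT8", "INT4"):
    print(f"{precision}: ~{weight_memory_gb(params, precision):.0f} GB")

# FP32 -> INT8 is a 4x reduction (~75% less memory for the weights);
# FP32 -> INT4 is 8x, which is why quantized models fit on far fewer GPUs.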

Friday Five – November 22, 2024

22 November 2024 at 07:00
Red Hat Enterprise Linux AI Brings Greater Generative AI Choice to Microsoft Azure
RHEL AI expands the ability of organizations to streamline AI model development and deployment on Microsoft Azure to fast-track AI innovation in the cloud. Learn more

Technically Speaking | How open source can help with AI transparency
Explore the challenges of transparency in AI and how open source development processes can help create a more open and accessible future for AI. Learn more

ZDNet - Red Hat's new OpenShift delivers AI, edge and security enhancements
Red Hat introduces new capabilities for Red Hat O

InstructLab tutorial: Installing and fine-tuning your first AI model (part 1)

21 November 2024 at 07:00
After shifting my career from a product security program manager to chief architect, it was time for me to dip my toes into artificial intelligence (AI) – until that moment, I was pretty much a prompt operator. Why train my own models? Sometimes you have confidential, regulated or restricted information that can't be uploaded to other third-party platforms (well, your data might end up training their model). Or, you might want to have tighter control over various aspects of your model. Think of a provider uploading your data into an external platform – there are lots of risks involved, like le

Rationalizing virtualized workloads: Load balancers and reverse proxies

20 November 2024 at 07:00
Many organizations are now migrating virtualized workloads to Red Hat OpenShift Virtualization. In many cases, this migration consists of a lift-and-shift (rehosting) approach, in which a virtual machine (VM) is moved from a source platform to OpenShift Virtualization while retaining the same network identity (MAC addresses, IP addresses, and so on). Essentially, the hypervisor and VM orchestrator change, but everything else remains the same. This is suitable when the objective is to replatform quickly. But once the migration is completed, you might ask yourself whether you can optimize, or even

The EU Cyber Resilience Act - what you need to know

20 November 2024 at 07:00
Today marks a new milestone in European cybersecurity: the Cyber Resilience Act (CRA) has been published in the EU's Official Journal, bringing significant changes for businesses operating in the EU. But what does this mean for companies and users alike, and how is Red Hat positioned to support your needs in the new landscape? The CRA is a robust new legislative framework aimed at enhancing the cybersecurity of (hardware and software) products with digital elements - everything from smart home devices to complex operating systems in critical national infrastructure. The CRA enters into force

Red Hat announces 2024 North America Public Sector Partner Pinnacle Award winners

19 November 2024 at 07:00
The Red Hat 2024 North America Public Sector Partner Pinnacle Awards recognize public sector partners for their continued efforts in developing innovative solutions using Red Hat technologies to meet the U.S. government's needs and improve mission-critical outcomes. In today's rapidly evolving landscape, collaboration between industry and government is essential for developing innovative, customized solutions that enable agencies to better serve constituents. This year's Public Sector Partner Pinnacle Award winners stand out not only for their commitment to reimagining the future of gover

Managed Identity and Workload Identity support in Azure Red Hat OpenShift

19 November 2024 at 07:00
As organizations look to modernize their applications, they are also looking for a more secure and easy-to-use application platform. Along with this move to modernization, there is a noticeable shift away from managing long-lived credentials in favor of short-term, limited-privilege mechanisms that do not require active management. This has led to the rapid adoption of managed identities in Microsoft Azure, and our customers expect the same from their application platforms such as Azure Red Hat OpenShift (ARO) – a fully managed, turnkey application platform that allows organizations to

Landing Zone for Red Hat Enterprise Linux on Azure

19 November 2024 at 07:00
Deploying new infrastructure can be difficult, whether you're moving on-premises deployments to the cloud, deploying a new service into your existing architecture or starting from a completely clean slate. There are a lot of choices to make, a lot of potential pitfalls and a lot of places where you can choose to integrate with the services offered by your cloud provider. On top of that, if you're new to Red Hat Enterprise Linux (RHEL), you may not be familiar with all of the available features and extras. RHEL alone includes a number of products that you might not be aware of, such as Iden

How Red Hat Enterprise Linux powers the world's fastest supercomputer and the future of exascale computing

19 November 2024 at 07:00
Innovation in computing is fueled by a combination of leading hardware and software – and the latest computing revolution is happening at the exascale level. Exascale machines are capable of performing an exaflop, or one quintillion calculations per second. This marks a new frontier, and one of the leading supercomputers at this level, El Capitan, has set a new benchmark for computational power. The underlying software powering this machine? Red Hat Enterprise Linux (RHEL).

Red Hat and supercomputing: A legacy of performance
Red Hat has long been at the forefront of high-performance computing (H

Bringing Red Hat Enterprise Linux to Windows Subsystem for Linux

19 November 2024 at 07:00
The hybrid cloud is an innovation driver, whether pushing the enterprise technology envelope with breakthroughs like generative AI (gen AI) or simply making traditional IT more efficient and responsive through application modernization. Underpinning successful hybrid cloud strategies is choice - of architecture, of cloud provider and of technology stack. While we see this technology stack starting with Linux, many enterprise IT organizations and developer teams have standardized on Windows environments. For developers who need to build Linux applications on Windows desktops, Microsoft provides

Security of LLMs and LLM systems: Key risks and safeguards

18 November 2024 at 07:00
Now that large language models (LLMs) and LLM systems are flourishing, it's important to reflect upon their security, the risks affecting them and the security controls to reduce these risks to acceptable levels. First of all, let's differentiate between LLMs and LLM systems. This difference is key when analyzing the risks and the countermeasures that need to be applied. An LLM is an algorithm designed to analyze data, identify patterns and make predictions based on that data. An LLM system is a piece of software composed of artificial intelligence (AI) components, which includes an LLM along

Celebrating Red Hat's origin story: No one innovates alone

18 November 2024 at 07:00
Do you know the story behind our name? Our co-founder Marc Ewing used to wear his grandfather's red Cornell lacrosse cap in his college computer lab, and people would say, "If you need help, look for the guy in the red hat." The tech industry is full of stories of lone geniuses, exaggerated for the sake of egos and investors. But innovation really happens when people help each other expand their possibilities, together. In open source, collaboration shows you what you need to ask – and who. It's not long before you find people to teach you. Pretty soon, you find yourself teaching others. Wha