July 11, 2025 | 12 min read

The Future of Mainframe Security is Agentless. Legacy Tools Can't Keep Pace

By Lindsay Kleuskens

For CISOs, the mainframe represents a modern paradox. For decades, it has been the secure, reliable bedrock of the business: a digital vault holding decades of critical customer data. Its stability is legendary, but this very stability creates a new, pressing challenge: how do we protect the sensitive data inside it?

Organizations in sectors like financial services and telecommunications rely on this data, yet it frequently resides in cleartext, exposing the business to risk. Securing it is challenging because modifying legacy application code (such as COBOL) or database structures to add modern security is often unfeasible: the process is high-risk, expensive, and depends on a niche, and dwindling, pool of developers with specialized skills.

This article outlines a strategic approach to mainframe security that's focused on protecting the data itself, enabling you to gain complete control over sensitive information both at-rest and in-motion, all without requiring a single, high-risk change to your core applications.

See exactly how a leading financial services firm executed this agentless blueprint by downloading the step-by-step technical case study.

Get the Financial Services Case Study

Executive Summary for Security Leaders

  • The Problem: Vast quantities of sensitive data sit in cleartext on mainframe databases. Modifying legacy applications to protect this data is too risky and costly, while replicating it for modern analytics or accessing it via legacy terminals (TN3270) creates significant new points of exposure.

  • The Solution: An agentless data security platform that operates inline, outside the mainframe. It intercepts data flows to apply modern protection methods like tokenization and real-time dynamic data masking, without requiring agents or code changes.

  • The Benefit: This approach secures sensitive data at its source and during transit, satisfying compliance mandates and reducing the risk of exposure. It eliminates the "security vs. stability" conflict and acts as a secure bridge, enabling modernization initiatives without compromising foundational data.

The Real Mainframe Security Challenge: It’s About the Data

The mainframe itself is an engineering marvel. The true challenge is not the platform's reliability, but the vulnerability of the cleartext data it holds. The legacy tactics for managing this risk are no longer viable in a connected, hybrid world, leading to three core business problems.

Problem 1: The Prohibitive Risk of Changing Legacy Code

The most direct way to secure mainframe data would be to rewrite the COBOL applications or alter the DB2 database schemas. However, the complexity of these decades-old systems, combined with a scarcity of specialized mainframe developers, makes this approach a non-starter. A single code change can have unforeseen consequences, destabilizing the critical applications that power the business. This creates a culture of "if it isn't broken, don't touch it," leaving sensitive data exposed by default.

Problem 2: The Operational Friction of Mainframe Change Control

Even if code changes were feasible, deploying anything new on the mainframe creates a three-way operational standoff between teams that prioritize stability (Mainframe Ops), risk reduction (Security), and feature velocity (Application Owners). The rightly protective stance of mainframe teams, who maintain systems designed for 99.999% uptime, means any proposed change faces a long and arduous cycle of testing and approval, delaying crucial security projects indefinitely.

Problem 3: The Inherent Security Gaps of a Host-Based Approach

The risk is magnified the moment data leaves the mainframe.

  • Replication Risk: When data is replicated to downstream systems for analytics or fraud detection, its security is often diminished. This increases the attack surface and creates compliance challenges across different security zones.

  • Legacy Access Risk: Critical data is still accessed daily through legacy "green screen" terminal sessions using protocols like TN3270. These protocols stream data directly to user screens, often beyond the reach of modern security controls, creating a major blind spot for data leakage.

The New Paradigm: Protecting the Data Itself, Agentlessly

The new paradigm shifts the focus from trying to modify the vault to securing the valuables inside it. Instead of a risky, host-based approach, this strategy places security controls directly in the flow of data as it moves to and from the mainframe.

It’s a data-centric approach premised on the idea that the safest way to protect data is to make it unusable to unauthorized parties. By replacing sensitive data with secure tokens or masking it in real time, you neutralize the risk of exposure without altering the source system.

How Agentless Data Protection Works

This approach is fundamentally non-intrusive. An agentless data security platform is deployed inline, communicating with the mainframe using its native protocols. It requires no software installation or code alteration on the mainframe itself.

  1. For Data-at-Rest and Replication (e.g., DB2): The solution intercepts data flows to apply vaulted tokenization. Original sensitive values are replaced with format-preserving tokens that have no mathematical relationship to the real data, rendering them secure even against quantum computing threats. This tokenized data can be safely stored on the mainframe or replicated to other systems.

  2. For Data-in-Use (e.g., TN3270): For live terminal sessions, the platform applies Dynamic Data Masking (DDM). By integrating with Identity and Access Management (IAM) systems, it identifies users by their roles and dynamically masks sensitive fields on their screen in real-time, ensuring they see only the data they are explicitly authorized to view.
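As a concrete illustration of point 1, here is a minimal sketch of vaulted, format-preserving tokenization in Python. The in-memory dictionaries stand in for a hardened token vault, and the digit-substitution scheme is an assumption for illustration, not DataStealth's actual algorithm:

```python
import secrets

# Illustrative in-memory "vault"; a real deployment uses a hardened,
# access-controlled token vault, not process memory.
_vault = {}    # token -> original value
_reverse = {}  # original value -> token (repeat values map consistently)

def tokenize(value: str) -> str:
    """Replace each digit with a random digit, keeping length and
    non-digit characters (e.g. dashes) so downstream apps still parse it."""
    if value in _reverse:
        return _reverse[value]
    while True:
        token = "".join(
            secrets.choice("0123456789") if ch.isdigit() else ch
            for ch in value
        )
        # Token has no mathematical relationship to the original;
        # retry on the (vanishingly rare) collision.
        if token != value and token not in _vault:
            break
    _vault[token] = value
    _reverse[value] = token
    return token

def detokenize(token: str) -> str:
    """Authorized lookup of the original value from the vault."""
    return _vault[token]

card = "4111-1111-1111-1111"
tok = tokenize(card)
assert len(tok) == len(card) and tok != card
assert detokenize(tok) == card
```

Because only digits are replaced and separators are preserved, downstream applications with rigid field definitions parse the token exactly as they would the original value.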

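The role-based masking in point 2 can be sketched as follows. The roles, field names, and policy table are hypothetical, and a real deployment would resolve roles through the IAM system rather than a hard-coded dictionary:

```python
# Hypothetical policy: which fields each role may see in cleartext.
POLICY = {
    "fraud_analyst": {"ssn", "card_number"},
    "support_agent": set(),  # sees all sensitive fields masked
}

def mask(value: str, keep_last: int = 4) -> str:
    """Mask all but the last few characters, preserving length."""
    return "*" * max(len(value) - keep_last, 0) + value[-keep_last:]

def render_record(record: dict, role: str,
                  sensitive=("ssn", "card_number")) -> dict:
    """Return the record as this role is allowed to see it on screen."""
    allowed = POLICY.get(role, set())
    return {
        field: (value if field not in sensitive or field in allowed
                else mask(value))
        for field, value in record.items()
    }

row = {"name": "J. Doe", "card_number": "4111111111111111"}
print(render_record(row, "support_agent"))
# card_number is masked to "************1111"; name passes through
```

In the agentless model this transformation happens inline on the TN3270 stream, so the terminal user simply sees the masked value with no change to the host application.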
The Strategic Advantage of a Data-Centric Approach

This agentless, data-centric strategy delivers three immediate strategic advantages:

  1. Effective Data Protection Without the Risk: Sensitive data on legacy systems is secured without necessitating risky or complex modifications to COBOL or other mainframe application code. This significantly reduces costs, operational risk, and the demand for niche developers.

  2. Reduced Risk of Data Exposure: The solution minimizes the exposure of cleartext data during replication, transit, and user access. By tokenizing data before it moves and masking it at the point of display, you strengthen your security posture and support compliance requirements.

  3. Facilitate Modernization Initiatives: This approach acts as a crucial bridge for digital transformation. It enables secure data sharing between legacy mainframes and modern analytics platforms, allowing the company to innovate without compromising the security of its most foundational data.

A CISO’s Roadmap: 4 Steps to Agentless Mainframe Security and Data Protection

Adopting an agentless security strategy is not a massive, multi-year overhaul. It’s a logical, phased approach focused on tangible risk reduction that delivers value at every step:

  1. Step 1: Discover and Classify Sensitive Data. You cannot protect what you don't know you have. The first step is to use data discovery tools to build a complete inventory of where sensitive data resides within your mainframe databases.

  2. Step 2: Map Critical Data Flows. Based on the discovery results, map out all the pathways where this sensitive data travels—from replication flows to downstream systems to terminal access sessions.

  3. Step 3: Define Data Protection Policies. With a clear map, define access and protection policies. Determine which data fields should be tokenized at rest and which should be dynamically masked based on the user's role and the context of their access.

  4. Step 4: Deploy and Enforce Inline. Activate the agentless platform to sit in the flow of data and enforce your defined policies automatically, securing your data without any disruption to end-users or mainframe operations.
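Step 1 can be approximated with a pattern-based scan over sampled records. The regular expressions and column names below are illustrative; real discovery tools add validation (for example, Luhn checks on card numbers) and metadata analysis:

```python
import re

# Illustrative patterns only; production discovery tools validate
# matches (e.g. Luhn checks) and score them using column metadata.
PATTERNS = {
    "card_number": re.compile(r"\b\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}\b"),
    "ssn":         re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(rows):
    """Return {column: set of matched sensitive-data categories}."""
    findings = {}
    for row in rows:
        for column, value in row.items():
            for category, pattern in PATTERNS.items():
                if pattern.search(str(value)):
                    findings.setdefault(column, set()).add(category)
    return findings

sample = [
    {"ACCT_NO": "4111-1111-1111-1111", "NAME": "DOE, J"},
    {"ACCT_NO": "5500 0000 0000 0004", "NAME": "ROE, A"},
]
print(classify(sample))  # {'ACCT_NO': {'card_number'}}
```

The resulting inventory of sensitive columns then drives the flow mapping in Step 2 and the field-level policies in Step 3.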

Key Evaluation Criteria for an Agentless Security Solution

When evaluating solutions to execute this strategy, ensure they meet these core requirements:

  • Truly Agentless Architecture: The solution must require no code changes or software installation on the mainframe.

  • Dual Protection Capabilities: It must support both data-at-rest protection (like tokenization) and data-in-use protection (like dynamic data masking).

  • Native Protocol Support: It must handle both database replication flows and legacy terminal access protocols like TN3270.

  • Format-Preserving Technology: Its tokenization or masking must preserve the original data's format to avoid breaking downstream legacy applications that have rigid data structure requirements.

Blueprint to Reality: Seeing Agentless Data Protection in Action

The shift from perimeter controls to a data-centric security strategy is the new imperative for protecting legacy systems. As we've outlined, the blueprint for modern mainframe security involves protecting the data itself, agentlessly and without the immense risk of disruptive code changes.

But for any CISO, a blueprint is only as valuable as the proven success of its execution. The crucial question is no longer just what the strategy is, but how a peer organization has successfully implemented it.

This is where theory meets practice. In the reference case study, you will learn how a leading enterprise in the telecommunication sector implemented this exact strategy using DataStealth. The document provides a technical look into how they used:

  • Tokenization to secure sensitive customer data at-rest within their mainframe IBM DB2 databases.
  • Dynamic Data Masking to protect data in-use during live TN3270 terminal sessions, based on user roles.
  • An agentless, inline architecture to achieve this comprehensive data protection without installing any software on the mainframe or modifying a single line of their legacy COBOL applications.

To see how a peer organization successfully navigated this complex challenge and to understand the tangible outcomes of this data-centric approach, we invite you to read the full case study.

Frequently Asked Questions from CISOs

Can this approach help us meet our PCI DSS, SOX, or HIPAA audit requirements for data access?

Yes. This approach directly supports data-centric compliance mandates like PCI DSS, SOX, HIPAA, and GDPR. By using tokenization, you can remove sensitive data from audit scope, dramatically simplifying the process. And because the platform provides an immutable log of who accessed, viewed, or detokenized data, you give auditors concrete proof that your data access controls are enforced effectively.

Will this approach slow down our critical batch processing or increase MIPS costs?

No. Because the solution is agentless and operates externally to the mainframe, it does not consume mainframe MIPS or interfere with its processing. It is designed to be non-intrusive, removing the performance risk associated with legacy, host-based security tools.

Can it feed data and alerts into our existing SIEM (Splunk, Sentinel, etc.) and SOAR platforms?

Yes. It enriches your SIEM with highly valuable, data-centric audit logs. Instead of just network alerts, it provides specific events on who attempted to access sensitive data, which policies were enforced (e.g., data masked or tokenized), and who was granted access to cleartext values. This gives your security operations team precise intelligence on data-related risks.
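A data-centric audit event of the kind described above might look like the following sketch; the field names and schema are assumptions for illustration, not the platform's actual log format:

```python
import json
from datetime import datetime, timezone

# Illustrative event schema; real field names depend on the platform
# and on your SIEM's ingestion format (CEF, JSON over HTTP, syslog...).
def audit_event(user, role, field, action, policy):
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "role": role,
        "field": field,
        "action": action,   # e.g. "masked", "tokenized", "detokenized"
        "policy": policy,
        "source": "inline-data-protection",
    }

event = audit_event("jdoe", "support_agent", "CARD_NO", "masked",
                    "pci-default")
print(json.dumps(event))  # ship to the SIEM for correlation and alerting
```

Events like this let the SOC alert on patterns such as repeated detokenization attempts by a single user, something raw network telemetry cannot express.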

About the Author:
Lindsay Kleuskens
Account Executive
Lindsay Kleuskens is a data security specialist helping enterprises reduce risk and simplify compliance. At DataStealth, she supports large organizations in protecting sensitive data by default, without interrupting user workflows. Her work focuses on PCI DSS scope reduction, preventing client-side attacks, and enabling secure third-party integrations without the security risk. Lindsay regularly shares practical insights on modern data protection challenges and helps organizations navigate evolving compliance standards with confidence.