Hadoop and Apache Ranger Migration

Migrate and Modernize Apache Ranger Access Control

As data teams migrate Hadoop-based platforms to cloud-native platforms such as Databricks or Snowflake, a common pitfall is overlooking access control policies, which can block adoption or limit new business capabilities. Cloud workloads require modern, granular access controls to enable business capabilities and workloads that were not possible on legacy Hadoop platforms with Ranger.

An independent study showed that Immuta required 75x fewer policy changes than Ranger to implement a common set of cloud data access policies, and that 4 of the 15 scenarios were not possible in Ranger at all, which translates into saying “no” to those business requirements.

Request a Demo

Graph showing the cumulative number of policies that would be needed to accomplish the 11 scenarios

Scale Data Access

Immuta scales policy enforcement using the attribute-based access control (ABAC) model. Unlike open source solutions from the Hadoop era, such as Apache Ranger, this approach uses dynamic user and subject attributes, represented as policy variables. As a result, a single Immuta ABAC policy can reduce the number of required policy changes by 90x versus Ranger policies, saving time, improving understandability, reducing the risk of misconfiguration, and helping teams meet SLAs on business requests for data.
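
As a purely illustrative sketch (not Immuta’s policy syntax), the comparison below shows why one attribute-driven rule can stand in for a growing list of role-specific grants; the attribute names and data structures are assumptions made for the example.

    # Illustrative only: a hypothetical comparison of per-role grants vs. a single
    # attribute-based rule. Names and structures are assumptions, not Immuta syntax.

    # RBAC-style: one explicit grant per (role, dataset) pairing must be maintained.
    rbac_policies = {
        ("analyst_us", "sales_us"): "allow",
        ("analyst_eu", "sales_eu"): "allow",
        ("analyst_apac", "sales_apac"): "allow",
        # ...every new region or role means another entry to add and audit
    }

    # ABAC-style: one rule driven by user attributes acting as policy variables.
    def abac_allows(user_attrs: dict, dataset_attrs: dict) -> bool:
        """Allow access when the user's region attribute matches the dataset's region tag."""
        return user_attrs.get("region") == dataset_attrs.get("region")

    # Example evaluation
    user = {"region": "eu", "department": "analytics"}
    dataset = {"region": "eu", "classification": "sales"}
    print(abac_allows(user, dataset))  # True: the same rule covers every region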

Apply Real-Time Policy Decisions

Immuta makes context-aware decisions at query runtime, using user and subject attributes from existing systems such as IAM, CRM, or HRIS as a single source of policy truth. Unlike Apache Ranger, which grants permissions without context, real-time policy enforcement is secure and automated, without the risk of policies being bypassed because user and subject attributes were manually configured and fell out of sync with those systems.
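
The rough sketch below illustrates the general idea of resolving attributes from an authoritative source at decision time instead of relying on manually synced copies; the lookup function, directory contents, and classification labels are hypothetical.

    # Illustrative sketch: attributes are looked up at query time from an
    # authoritative source (e.g., an IAM or HRIS system). The lookup function,
    # directory contents, and classification labels are hypothetical.

    def fetch_user_attributes(user_id: str) -> dict:
        # In practice this would call the organization's identity provider or HR
        # system; a static dict stands in for that call here.
        directory = {"jdoe": {"department": "finance", "clearance": "confidential"}}
        return directory.get(user_id, {})

    def authorize_query(user_id: str, table_classification: str) -> bool:
        """Decide at query runtime using the user's current attributes."""
        attrs = fetch_user_attributes(user_id)  # no stale, manually synced copy
        readable = {
            "public": {"public"},
            "confidential": {"public", "confidential"},
        }
        clearance = attrs.get("clearance", "public")
        return table_classification in readable.get(clearance, {"public"})

    print(authorize_query("jdoe", "confidential"))     # True while HR lists jdoe as cleared
    print(authorize_query("unknown", "confidential"))  # False: defaults to least privilege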

Expand Cloud Data Use Cases

Immuta’s policy engine was engineered for the cloud to support modern workloads. Unlike Hadoop-era solutions such as Apache Ranger, this approach expands support for new types of workloads and for users building new data products or collaborating on data with more internal and external users. That makes Immuta key to delivering new business capabilities following Hadoop migration and modernization.

Distribute Policy Management

Immuta policies are authored in plain English and are easy to understand. Unlike Hadoop-era interfaces for popular projects such as Apache Ranger and Apache Atlas, this approach empowers both technical and non-technical stakeholders, whether in legal, compliance, or embedded in lines of business, to bring their domain expertise to scaling safe and efficient data use across the organization.

“We operate in a highly complex and sensitive data environment. We participated in Immuta’s customer preview of Databricks SQL Analytics and were able to seamlessly extend Immuta’s controls to the new service. This enables us to continue expansion of our business capabilities in the cloud and better serve our clients, while reducing risk.”
Don Garnica | Global Head of Core Cloud Platforms, Janus Henderson Investors U.S.

Hadoop & Apache Ranger Migration

Frequently Asked Questions

  • What is Apache Ranger access control?

    Apache Ranger is an access control framework that operates within the Hadoop open source data storage and management ecosystem. Created with the goal of enabling and monitoring comprehensive data security, Apache Ranger was one of the first popular data access control solutions, which also makes it somewhat antiquated today. While Ranger has supported the safe operation of Hadoop, its traditional role-based access control model is more rigid and less scalable than modern solutions built for the evolution of data use.
  • What are the limitations of Apache Ranger policies?

    The Apache Ranger framework uses role-based access control with object tagging (OT-RBAC), which restricts data access based on roles assigned to data users. Object tagging allows DataOps and engineering teams to manually tag and sort data within their environment, but it is a static system that limits both adaptability and scalability. RBAC also provides minimal support for attributes and places a high policy management burden on data owners.
  • What are Hadoop migration to cloud best practices?

    When migrating from Hadoop to a cloud-based model, it is important to separate policy from the compute and storage platforms. Adopting a model that abstracts policy creation and enforcement allows policies to be applied universally and avoids manual enforcement in each internal system. Tools like Immuta provide a standard policy layer that facilitates scalable data access using attribute-based access control. This approach also allows for real-time policy decisions, expanded organizational use cases, and distributed policy management, all of which enhance an organization’s data use. A simplified sketch of such an abstracted policy layer follows this FAQ.
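
To make the “separate policy from compute and storage” point concrete, here is a minimal, hypothetical sketch of a shared policy layer that every engine calls into; the interface, rules, and platform names are illustrative assumptions, not any vendor’s actual API.

    # Minimal sketch of a shared policy layer, decoupled from any one platform.
    # All names here are illustrative assumptions, not a real product API.

    POLICIES = [
        # One central rule set, written once...
        {"resource_tag": "pii", "required_attribute": ("role", "privacy_officer")},
    ]

    def authorize(user_attrs: dict, resource_tags: set, action: str) -> bool:
        """Single decision point that every compute or storage engine can call."""
        for policy in POLICIES:
            if policy["resource_tag"] in resource_tags:
                key, value = policy["required_attribute"]
                if user_attrs.get(key) != value:
                    return False
        return True

    # ...and enforced the same way whether the request comes from a warehouse,
    # a lakehouse engine, or a BI tool.
    for platform in ("warehouse", "lakehouse", "bi_tool"):
        decision = authorize({"role": "analyst"}, {"pii"}, "select")
        print(platform, "->", "allow" if decision else "deny")  # deny everywhere, consistently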

Request a demo to start modernizing your data access control strategy and maximizing the value of your cloud data platforms.

During your demo, you can expect to learn:

  • How Immuta’s attribute-based access control model streamlines data access management and helps eliminate data access bottlenecks caused by static access controls
  • Where Immuta fits in your data stack and how its native integrations with leading cloud data platforms and services enable consistent, universal policy enforcement
  • How Immuta helps avoid data access bottlenecks by enabling transparent, distributed policy management without sacrificing security or privacy

Request your demo

Provide us with your contact information and someone will contact you shortly.
