Leading Malaysian Islamic Bank Achieves 90% Faster Performance with DigileEdge

Date: April 8, 2026

Results & Impact

  • 99.95% reduction in table count (from 400,000 tables to roughly 200)
  • 90%+ improvement in query performance (8 minutes → <1 minute)
  • 85% reduction in reporting time
  • 100% elimination of manual reporting workflows
  • 99% platform uptime achieved
  • Linear scalability supporting billion+ record workloads
  • End-to-end audit traceability across pipelines

Challenge

The bank’s legacy data platform, built on a high-cost, premium-licensed infrastructure, had become a critical bottleneck to growth and operational efficiency.

The underlying BDA system had reached end of life, leaving the platform without vendor support, scalability, or a viable upgrade path.

Over time, architectural inefficiencies compounded:

  • A single schema had grown to over 400,000 tables, driven by poor backup and data design practices
  • Query execution times stretched up to 8 minutes, severely impacting analytics and reporting
  • Batch ingestion cycles ran long, delaying access to critical business data
  • Reporting processes were highly manual, relying on CSV and Excel workflows with no real-time visibility

This resulted in limited agility, delayed insights, and increasing operational risk in a highly regulated banking environment.

Solution

Digile implemented a cloud-ready lakehouse architecture built on DigileEdge (powered by Stackable), enabling a complete modernization of the bank’s data platform.

DigileEdge provided a modular, Kubernetes-native data foundation with built-in automation, governance, and reusable pipeline frameworks, accelerating deployment while ensuring scalability and resilience.

The modernized platform simplified data pipelines, reduced metadata complexity, and introduced a standardized integration framework. This shift improved processing speeds at scale, while automated, self-healing features enhanced reliability.

The resulting curated datasets now provide real-time access, accelerating data-driven decision-making while reducing infrastructure and processing costs.

Key transformation outcomes included:

  • Simplified data architecture, significantly reducing metadata complexity
  • Real-time data ingestion and processing, replacing outdated batch cycles
  • Automated, self-healing pipelines, minimizing manual intervention
  • Curated, business-ready datasets, enabling faster and more reliable consumption
  • Unified governance and observability, ensuring auditability and compliance
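The self-healing behaviour described above can be sketched as a retry wrapper around a pipeline step. This is a simplified, pure-Python illustration of the pattern (the function and parameter names are hypothetical); in practice, DigileEdge delegates this to the orchestration layer rather than application code:

```python
import time
from functools import wraps

def self_healing(max_retries=3, backoff_seconds=0.1):
    """Retry a pipeline step with exponential backoff before failing.

    A simplified illustration of the retry-on-transient-failure
    behaviour an orchestrator provides; not the platform's actual API.
    """
    def decorator(step):
        @wraps(step)
        def wrapper(*args, **kwargs):
            delay = backoff_seconds
            for attempt in range(1, max_retries + 1):
                try:
                    return step(*args, **kwargs)
                except Exception:
                    if attempt == max_retries:
                        raise  # retries exhausted: surface the failure
                    time.sleep(delay)
                    delay *= 2  # exponential backoff between attempts
        return wrapper
    return decorator

# Hypothetical ingestion step that fails twice before succeeding.
attempts = {"count": 0}

@self_healing(max_retries=3, backoff_seconds=0.01)
def ingest_batch():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("transient source outage")
    return "loaded"

result = ingest_batch()
print(result)  # the step recovers without manual intervention
```

The same idea generalizes to orchestrated pipelines, where per-task retry counts and backoff are declared as configuration rather than code.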

Solution Highlights

  • Kubernetes-native data platform implementation
  • Integration of Change Data Capture (CDC) using Striim
  • Real-time streaming and processing enablement
  • Centralized data governance and observability framework
  • Infrastructure optimization to reduce operational overhead
  • Policy-based access control enforced at the platform layer
  • Proactive monitoring with real-time alerts and SLA tracking
  • Zero data loss with automated failover and recovery mechanisms
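At its core, the CDC integration streams ordered row-level change events from source systems into the lakehouse. The following pure-Python sketch shows the essential merge logic — applying insert, update, and delete events keyed by primary key. The event schema here is illustrative, not Striim's actual wire format:

```python
# Minimal CDC apply loop: replays ordered change events onto a target
# table held as a dict keyed by primary key. Event shape is illustrative.
def apply_cdc_events(table: dict, events: list) -> dict:
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            # Upsert: the new row image fully replaces the prior one.
            table[key] = event["row"]
        elif op == "delete":
            table.pop(key, None)  # tolerate deletes for unseen keys
        else:
            raise ValueError(f"unknown CDC operation: {op}")
    return table

accounts = {}
events = [
    {"op": "insert", "key": 1, "row": {"name": "Aisha", "balance": 100}},
    {"op": "update", "key": 1, "row": {"name": "Aisha", "balance": 250}},
    {"op": "insert", "key": 2, "row": {"name": "Farid", "balance": 75}},
    {"op": "delete", "key": 2},
]
apply_cdc_events(accounts, events)
print(accounts)  # {1: {'name': 'Aisha', 'balance': 250}}
```

Because events are applied in source-commit order, the target stays a consistent replica of the source table, which is what enables sub-minute freshness without full batch reloads.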

Technology Stack

  • Stackable
          - Trino
          - NiFi
          - Superset
          - Iceberg
          - Airflow
          - Hive
  • DataHub
  • Striim

Check Your AI Readiness

Get a personalized readiness score and actionable next steps for your AI journey.
