Jinesh Ranawat

Staff AI/Data Engineer

AWS · Azure · Palantir · Gen AI · Ollama · LLMs

Pune, Maharashtra, India

Featured on BP's Global Website

My story of innovation and leadership featured on BP's global careers page

Cost Savings

$1.7M

Daily Data Processed

500TB

Team Leadership

2 → 50+

Leadership Awards

17

Leadership Development (BP Advanced FLL) & Industry Influence

My Vision & Aspirations

"To democratize data engineering excellence and create systems that empower everyone to harness the full potential of data, regardless of their technical expertise."

Revolutionizing Pipeline Development

I'm leading BP's data engineering monorepo initiative using Bazel for build optimization. Our one-click pipeline generation system has reduced development time from weeks to hours, enabling engineers to create complex transformation pipelines with minimal manual coding. This system now supports over 10,000 pipelines with 99.9% reliability across multiple business units.
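At its core, one-click pipeline generation of this kind is template expansion over a declarative spec. The sketch below is purely illustrative; the spec keys, step names, and generator are hypothetical stand-ins, not BP's actual system:

```python
# Illustrative sketch: expand a declarative pipeline spec into ordered stages.
# Spec format and operation names are hypothetical, not BP's actual system.

SPEC = {
    "name": "sales_daily",
    "source": "raw.sales",
    "steps": [
        {"op": "filter", "expr": "amount > 0"},
        {"op": "select", "cols": ["id", "amount", "region"]},
    ],
    "sink": "curated.sales_daily",
}

def generate_pipeline(spec):
    """Turn a spec dict into an ordered list of human-readable stages."""
    stages = [f"read {spec['source']}"]
    for step in spec["steps"]:
        if step["op"] == "filter":
            stages.append(f"filter where {step['expr']}")
        elif step["op"] == "select":
            stages.append("select " + ", ".join(step["cols"]))
        else:
            raise ValueError(f"unknown op: {step['op']}")
    stages.append(f"write {spec['sink']}")
    return stages
```

In a real system each generated stage would map to a Spark transformation and a Bazel build target rather than a string, but the shape of the problem, validating a spec and expanding it into boilerplate-free pipeline code, is the same.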

Accelerating Data Engineering

I've architected a suite of data engineering accelerators providing a 360-degree view of technical assets and data products. These accelerators integrate with ServiceNow for seamless workflow management and include pre-built templates for common tasks. The platform has reduced implementation time by 40% while improving quality and consistency, resulting in estimated savings of $2.5M annually.

AI Innovation & Democratization

I'm leading the development of advanced AI systems using Ollama, fine-tuning DeepSeek and Llama models to create domain-specific AI assistants. These systems can be deployed on standard laptops, democratizing access to advanced AI capabilities. My current focus is making data more accessible through conversational interfaces that let non-technical users query complex data systems in natural language. My innovations include AI agents that automate routine data engineering tasks, suggest pipeline optimizations, and lower the expertise barrier for new team members.

Building Knowledge Libraries

I've created a comprehensive 'recipe library' of data engineering patterns and best practices, enabling teams to tackle common challenges with proven solutions. This standardized approach has improved code quality, reduced technical debt, and accelerated onboarding of new team members. The library continues to grow as we discover new efficient patterns.

About Me

I'm a Staff AI/Data Engineer with over 10 years of experience across financial services, fintech, and energy sectors. Currently at BP, I lead innovative data engineering initiatives, building multi-modal conversational AI agents and managing highly scalable, cost-efficient platforms across AWS/Azure.

My expertise spans cloud architecture optimization, real-time data processing, and team leadership. I've successfully built and scaled a data engineering team from 2 to 50+ engineers while maintaining zero escalations and 90% team satisfaction. As a passionate educator, I also work as a freelance consultant and trainer, sharing knowledge through well-crafted technical content and training sessions that have reached over 300 professionals.

Featured Projects

Geospatial Analytics Platform

Real-time vessel tracking with millisecond performance

Python · Spark · Azure · GeoJSON

Developed a real-time geospatial tracking system enhancing safety across BP assets. The platform processes vessel location data with millisecond performance, providing critical insights for offshore operations and safety monitoring.

LOC8 Data Discovery

Pattern-based data discovery system

Java · Spring Boot · Neo4j · React

Engineered a revolutionary pattern-based data discovery system that significantly improved data accessibility and usability across the organization. The system leverages machine learning techniques for intelligent pattern recognition and metadata management.

ADL Lineage with Neo4J

Data lineage and monitoring solution

Neo4j · Python · GraphQL · D3.js

Implemented a comprehensive data lineage and monitoring solution using Neo4j, enabling full visibility into data flows and transformations. The solution helps in regulatory compliance, impact analysis, and data governance.
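The impact-analysis half of a lineage solution boils down to a downstream traversal of the lineage graph. The sketch below uses a toy adjacency dict with made-up dataset names; in the Neo4j solution this walk would be a Cypher query over the stored relationships rather than in-memory Python:

```python
from collections import deque

# Toy lineage graph: dataset -> datasets derived from it.
# Names are hypothetical; in the Neo4j solution this traversal would be a
# Cypher query over the stored lineage relationships instead.
LINEAGE = {
    "raw.orders": ["staging.orders"],
    "staging.orders": ["marts.revenue", "marts.churn"],
    "marts.revenue": ["dashboards.finance"],
}

def downstream_impact(graph, start):
    """Breadth-first walk: every dataset affected if `start` changes."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for child in graph.get(node, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return sorted(seen)
```

Running `downstream_impact(LINEAGE, "raw.orders")` surfaces every table and dashboard that a change to the raw feed would touch, which is exactly the question regulators and impact assessments ask.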

Azure Storage Optimizer

Cost optimization for cloud storage

Python · Azure · PowerShell · Terraform

Developed an innovative solution that saved millions in Azure storage costs by optimizing data lifecycle management, implementing intelligent tiering, and automating cleanup of redundant data while maintaining data integrity and access requirements.
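Intelligent tiering of this sort reduces to an age-based policy over blob access times. A minimal sketch, with made-up thresholds rather than the actual policy; in Azure the chosen tier would then be applied via the Blob Storage SDK or a lifecycle-management rule:

```python
from datetime import datetime, timedelta, timezone

# Illustrative tiering policy (thresholds are made-up, not the real ones):
# blobs untouched for 30 days move to Cool, for 180 days to Archive.
def choose_tier(last_accessed, now=None, cool_days=30, archive_days=180):
    """Return the target Azure blob tier for a blob's last-access time."""
    now = now or datetime.now(timezone.utc)
    age = now - last_accessed
    if age >= timedelta(days=archive_days):
        return "Archive"
    if age >= timedelta(days=cool_days):
        return "Cool"
    return "Hot"
```

The savings come from the price gap between tiers: Hot storage costs several times what Cool does per GB, and Archive far less again, so even a crude age-based policy pays off quickly at multi-terabyte scale.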

TensorFlow Deep Learning

Advanced Deep Learning Implementation

TensorFlow · Deep Learning · Neural Networks · Python

Comprehensive guide to TensorFlow implementation, covering neural network architectures, model optimization, and real-world applications. Includes detailed methodology and best practices for deep learning projects.

Technical Writing

Demystifying Spark Resource Calculation

Aug 13, 2024

Have you ever felt like you're playing a guessing game when allocating resources for your Apache Spark jobs? You're not alone...
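The usual escape from that guessing game is the rule-of-thumb sizing math: roughly 5 cores per executor, one core and one GB per node reserved for the OS, one executor slot for the ApplicationMaster, and ~7% of executor memory set aside for overhead. A back-of-the-envelope sketch (the helper and its defaults are illustrative, not lifted from the article):

```python
# Back-of-the-envelope Spark executor sizing for a YARN cluster, following
# the commonly cited rule of thumb: ~5 cores per executor, 1 core + 1 GB
# per node reserved for the OS, one executor slot for the ApplicationMaster,
# and ~7% of executor memory deducted as overhead.
def size_executors(nodes, cores_per_node, mem_per_node_gb,
                   cores_per_executor=5, overhead_frac=0.07):
    usable_cores = (cores_per_node - 1) * nodes      # 1 core/node for the OS
    total_executors = usable_cores // cores_per_executor
    executors = total_executors - 1                  # leave one slot for the AM
    per_node = total_executors // nodes
    raw_mem = (mem_per_node_gb - 1) / per_node       # 1 GB/node for the OS
    executor_mem_gb = int(raw_mem * (1 - overhead_frac))
    return {"num_executors": executors,
            "executor_cores": cores_per_executor,
            "executor_memory_gb": executor_mem_gb}
```

For the textbook example of 10 nodes with 16 cores and 64 GB each, this yields 29 executors with 5 cores and ~19 GB apiece, the figures most sizing guides arrive at.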

Forward Deployed Engineers: Building AI Systems

Feb 6, 2024

In the rapidly evolving landscape of artificial intelligence, a new role has emerged that bridges the gap between cutting-edge AI...

How 103 Lines of Code Saved Millions on Azure Storage

Apr 22, 2024

As businesses grow and data accumulates, the costs associated with storing and managing that data can quickly escalate...

Professional Experience

Staff AI/Data Engineer

BP | November 2024 - Present

  • Building multi-modal conversational AI agents and managing highly scalable platform services
  • Accelerating adoption of AWS EMR, Iceberg, Glue, Azure Databricks, and Generative AI
  • Leading cloud cost optimization initiatives, achieving $2M in savings for 2024
  • Building a strong Data Engineering developer community across the organization

Data Engineering AI Tech Lead

BP | December 2021 - Present

  • Led a federated team of 25+ data engineers supporting 30+ critical P&O products
  • Developed real-time geospatial systems with millisecond performance
  • Implemented cost optimization strategies reducing cloud costs from $50K to $7K monthly
  • Delivered projects across Asset Intelligence, Production, SIRIUS, Digital Twin, and Geospatial
  • Received 17 leadership awards across technical excellence, leadership, and team development

Technical Skills

Cloud Platforms (AWS, Azure)

95%

Data Engineering

98%

Python & Java

90%

Databricks, Spark

92%

Neo4j, Graph Databases

85%

CI/CD, DevOps (Bazel)

88%

Real-time Processing

94%

Generative AI & LLMs

90%

AI Model Fine-tuning

88%

Certifications

  • Azure Certified Architect

    AZ-303 & AZ-304

  • Databricks Certified Data Engineer

    Associate Level

  • Generative AI Fundamentals

    Databricks Academy

  • Palantir Certified Professional

    Data Integration Specialist

Professional Advisory & Community Engagement

On weekends, I provide professional advisory services to organizations and individuals in the areas of Site Reliability Engineering (SRE), cloud architecture, and data engineering. This focused engagement allows me to contribute to the broader technology community while sharing specialized knowledge across various industries.

Technical Advisory

  • SRE implementation for cloud-native applications
  • Data infrastructure reliability and optimization
  • Cloud architecture for enterprise workloads
  • ML/AI platform engineering
  • Data observability and governance frameworks

Provided strategic direction to multiple organizations implementing cloud-native architectures and SRE practices.

Knowledge Sharing

  • Architecture review sessions for technical teams
  • Technical mentoring for mid-senior engineers
  • Technical content development for organizations
  • Best practices implementation guidance
  • Performance optimization methodologies

Engaged with over 300 professionals through structured knowledge-sharing initiatives focused on practical implementation.

Community Initiatives

  • Mentor for FUEL NGO supporting technology graduates in career development
  • Technical advisor for early-stage startups in data infrastructure
  • Author of technical publications on Medium
  • Contributor to open-source data engineering projects
  • Advocate for cost-efficient cloud architectures and practices

Project Portfolio in Detail

Real-time Geospatial Vessel Tracking

Python · Spark Structured Streaming · Azure Event Hubs · GeoJSON · AIS Data · Power BI

Problem:

BP needed a real-time monitoring system to track vessel movements near offshore platforms to enhance safety and operational awareness. Traditional tracking systems had significant latency issues (5-10 minutes), making them inadequate for safety-critical applications.

Solution:

Designed and implemented a high-performance geospatial tracking platform that ingests, processes, and visualizes AIS (Automatic Identification System) vessel data in milliseconds. The system provides real-time alerts for proximity violations and captures historical vessel trajectories.

Technical Implementation:

  • Built ingestion pipeline using Azure Event Hubs for AIS data from multiple providers
  • Implemented Spark Structured Streaming with Python for real-time data processing
  • Created geospatial functions for proximity calculations and trajectory analytics
  • Developed alert system for vessels approaching restricted zones or platforms
  • Designed dashboard for operational visibility using Power BI and custom visualizations
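The proximity calculations above reduce, at their core, to great-circle distance checks between vessel positions and platform coordinates. A minimal sketch using the haversine formula; the 500 m alert radius is illustrative, not an actual operational threshold:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def proximity_alert(vessel, platform, radius_m=500):
    """True if a (lat, lon) vessel position breaches the platform's radius."""
    return haversine_m(*vessel, *platform) <= radius_m
```

In the streaming pipeline a check like this would run per AIS message inside a Spark Structured Streaming map step, with positive results fanned out to the alerting system.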

Impact:

  • Reduced vessel tracking latency from minutes to milliseconds
  • Improved offshore safety by providing proactive alerts for potential vessel collisions
  • System now monitors over 30,000 vessels daily across BP's global operations
  • Solution extended to Castrol division for supply chain optimization
  • Recognized with "Play to Win" award for technical excellence

Get In Touch