IBM Data Engineer-Data Platforms-AWS in BANGALORE, India

Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your role and responsibilities

  • Design, develop, and manage our data infrastructure on AWS, with a focus on data warehousing solutions.

  • Write efficient, complex SQL queries for data extraction, transformation, and loading.

  • Utilize DBT for data modelling and transformation.

  • Use Python for data engineering tasks, demonstrating strong work experience in this area.

  • Implement scheduling tools like Airflow, Control M, or shell scripting to automate data processes and workflows.

  • Participate in an Agile environment, adapting quickly to changing priorities and requirements.
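The extraction, transformation, and loading work described above can be sketched in plain Python. This is a purely illustrative example with invented function and field names (`extract`, `transform`, `load`, `region`, `amount`); a production pipeline would run on AWS and be orchestrated by a scheduler such as Airflow or Control M.

```python
import csv
import io

def extract(source: str) -> list[dict]:
    """Read CSV text into a list of row dicts (stand-in for a source system read)."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalize types and drop malformed rows."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # skip rows with a missing or non-numeric amount
        out.append({"region": row["region"].strip().upper(), "amount": amount})
    return out

def load(rows: list[dict]) -> dict[str, float]:
    """Aggregate into a region -> total mapping (stand-in for a warehouse write)."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

source = "region,amount\nsouth,10.5\nsouth,4.5\nnorth,7\nnorth,bad\n"
totals = load(transform(extract(source)))
# totals == {"SOUTH": 15.0, "NORTH": 7.0}
```

A scheduler would invoke each stage as a separate task so that failures can be retried independently.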

Required technical and professional expertise

  • Minimum of 5 years of experience as a Data Engineer with extensive expertise in AWS and PySpark.

  • Deep knowledge of SQL and experience with data warehouse design and optimization.

  • Strong understanding of AWS services and how they integrate with Databricks and other data engineering tools.

  • Demonstrated ability to design, build, and maintain end-to-end data pipelines.

  • Excellent problem-solving abilities, with a track record of implementing complex data solutions.
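The SQL and warehouse-design expertise listed above can be illustrated with a minimal star-schema query using Python's standard-library `sqlite3` module. Table and column names here are invented for the sketch; a real warehouse would live in a platform such as Redshift, Snowflake, or Databricks.

```python
import sqlite3

# Tiny star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 1, 12.0), (2, 1, 8.0), (3, 2, 20.0);
""")

# Typical warehouse query: aggregate the fact table by a dimension attribute.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount) AS total
    FROM fact_sales AS f
    JOIN dim_product AS p USING (product_id)
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
# rows == [('books', 20.0), ('games', 20.0)]
```

Keeping descriptive attributes in dimension tables and numeric measures in fact tables is what lets queries like this stay simple as the warehouse grows.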

Preferred technical and professional experience

  • Experience in managing and automating workflows using Apache Airflow.

  • Familiarity with Python, Snowflake, and CI/CD processes using GitHub.

  • Strong communication skills for effective collaboration across technical teams and stakeholders.
