Proactively assess and monitor your cloud environments across operational excellence, security, reliability, performance efficiency, and cost optimization. CloudPlexo provides both developers and management with the insights needed to run an efficient, lean, and reliable cloud environment. CloudPlexo Cloud Management Platform (CCMP) is a Software-as-a-Service (SaaS) cloud management solution that gives businesses a full view of their cloud health, cost, and operations across multiple cloud environments in one platform.
Job Description:
- We are seeking a skilled and experienced Data Engineer (AWS) to join our team. The ideal candidate will have a strong background in designing, building, and maintaining data pipelines on AWS, as well as expertise in data warehousing, ETL processes, and cloud-based data solutions. You will work closely with clients to understand their needs and implement best-in-class data engineering solutions tailored to their business goals.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes using AWS services such as AWS Glue, Lambda, Redshift, S3, EMR, and Kinesis.
- Implement and optimize data storage solutions, ensuring high performance and cost efficiency.
- Collaborate with clients and internal teams to understand business requirements and translate them into technical solutions.
- Work with structured and unstructured data, performing data transformation and integration from multiple sources.
- Develop and maintain data models, data warehouses, and data lakes using AWS technologies.
- Ensure data quality, integrity, and security by implementing best practices in governance and compliance.
- Monitor and troubleshoot data pipelines to maintain optimal performance and reliability.
- Stay up-to-date with the latest AWS technologies and industry trends to provide innovative solutions to clients.
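To illustrate the kind of transform logic the pipeline work above involves, here is a minimal, hypothetical sketch in Python. The function name and record schema (`service`, `cost_usd`) are illustrative assumptions, not part of CloudPlexo's platform; in a real pipeline such a step might run inside an AWS Lambda handler or a Glue job.

```python
import csv
import io

def transform_usage_records(raw_csv: str) -> list[dict]:
    """Parse raw billing CSV rows, drop incomplete records,
    and normalize service names and cost values (illustrative schema)."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    cleaned = []
    for row in rows:
        # Skip records missing a service name or a cost value.
        if not row.get("service") or not row.get("cost_usd"):
            continue
        cleaned.append({
            "service": row["service"].strip().lower(),
            "cost_usd": round(float(row["cost_usd"]), 2),
        })
    return cleaned

sample = "service,cost_usd\nEC2 ,1.2345\nS3,0.10\n,3.00\n"
print(transform_usage_records(sample))
```

In practice the raw CSV would be read from S3 and the cleaned records loaded into Redshift or a data lake; those extract and load steps are omitted here to keep the sketch self-contained.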
Required Skills & Experience:
- Proven experience as a Data Engineer, Cloud Engineer, or similar role with a strong focus on AWS technologies.
- Hands-on expertise with AWS services such as AWS Glue, Redshift, Lambda, S3, Athena, Kinesis, DynamoDB, EMR, and RDS.
- Strong programming skills in Python, SQL, or Scala for data processing and automation.
- Experience with ETL/ELT development and data pipeline orchestration tools such as Apache Airflow, Step Functions, or AWS Glue Workflows.
- Familiarity with data warehousing concepts and experience working with Redshift, Snowflake, or BigQuery.
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation is a plus.
- Strong understanding of data governance, security, and compliance best practices.
- Excellent problem-solving skills, with the ability to analyze and optimize complex data workflows.