Syxlabs is your all-in-one technology partner, providing value-driven AI & Blockchain solutions that stimulate growth.
We are seeking a skilled Data Engineer to join our team.
- The ideal candidate will be responsible for designing, developing, and maintaining data pipelines that support our AI initiatives, specifically focused on enhancing customer service and operational efficiency for our clients.
- This role requires a strong understanding of data architecture, data processing, and real-time data integration.
Key Responsibilities:
Data Collection and Integration:
- Collect, preprocess, and transform raw data from various sources into usable formats.
- Develop and maintain scalable ETL (Extract, Transform, Load) pipelines.
- Ensure seamless integration of real-time data streams from customer interactions and transactional systems.
Data Pipeline Development:
- Design and implement robust data pipelines to support machine learning model development and deployment.
- Automate data processing workflows to ensure continuous data flow and real-time data availability.
- Monitor and optimize data pipelines for performance and scalability.
Data Storage and Management:
- Develop and manage data storage solutions that are efficient and scalable.
- Ensure data is securely stored and accessible for analysis and model training.
- Implement data retention policies and manage data lifecycle.
Data Quality and Governance:
- Implement data quality checks to ensure data integrity and accuracy.
- Develop and enforce data governance policies and best practices.
- Work closely with data scientists and machine learning engineers to understand data requirements and ensure data meets their needs.
Collaboration and Communication:
- Collaborate with cross-functional teams, including ML engineers, software developers, and product managers, to understand data needs and deliver solutions.
- Provide technical guidance and support to other team members.
- Communicate effectively with stakeholders to understand business requirements and translate them into technical specifications.
Qualifications:
- Education: Bachelor’s degree in Computer Science, Engineering, or a related field. Master’s degree preferred.
Experience:
- 3+ years of experience in data engineering or a related field.
- Proven experience with data pipeline development, data integration, and data management.
Technical Skills:
- Proficiency in SQL and NoSQL, with experience working with relational databases.
- Strong programming skills in Python or JavaScript.
- Experience with big data technologies such as Hadoop, Spark, and Kafka.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services.
- Knowledge of data modeling, ETL processes, and data warehousing concepts.
- Experience with real-time data processing and streaming technologies.
Soft Skills:
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Ability to work in a fast-paced, dynamic environment.