Estuary Business Solutions (EBS) is a leading IT and business services consultancy located in Lagos, Nigeria. EBS enables business transformation for our clients through the innovative use of technology, strategic planning and business management. We work closely with our clients to realize their potential, enabling change that increases their efficiency, accelerates growth and manages risk. EBS works with a host of clients in both the private and public sectors.
Responsibilities
- Clarify business requirements for the data pipeline, convert the requirements into data templates, and prepare them as input for data preparation and transformation by data engineers
- Design, develop and test new and/or existing data solutions running on the organization’s Big Data platform.
- Apply agile software development practices, secure coding practices, code reviews, and software architecture.
- Interpret data, analyze results using statistical techniques and provide ongoing reports
- Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality
- Acquire data from primary or secondary data sources and maintain databases/data systems
- Identify, analyze, and interpret trends or patterns in complex data sets
- Build metadata for various data feeds.
- Handle development tasks and scripting related to data services
- Implement data services solutions (Data Ingestion, Data Processing, APIs, Computations)
- Implement data schemas and structure
- Implement and develop data quality control
- Develop business intelligence and report generation solutions
- Develop data set processes
- Locate and define new process improvement opportunities
- Use knowledge of distributed computing techniques to design, develop and test scalable ETL processes that operate on large-volume datasets.
- Handle datasets containing mixes of structured and unstructured data.
- Transform unstructured data into forms suitable for analysis and modeling.
- Perform extract, transform and load (ETL) integrations with a variety of data sources.
- Perform business and system impact analysis.
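As an illustration of the ETL duties listed above, the sketch below shows a minimal single-machine extract/transform/load flow in Python. It is purely indicative of the kind of work involved; the field names (`customer`, `amount`) and the CSV-to-JSON-lines shape are hypothetical, not part of this role's actual stack.

```python
import csv
import io
import json

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop incomplete records, normalise names, cast types."""
    cleaned = []
    for row in rows:
        if not row.get("amount"):
            continue  # data quality control: skip records with no amount
        cleaned.append({
            "customer": row["customer"].strip().lower(),
            "amount": float(row["amount"]),
        })
    return cleaned

def load(rows: list[dict]) -> str:
    """Load: serialise to JSON lines, a shape a downstream store might ingest."""
    return "\n".join(json.dumps(r) for r in rows)

raw = "customer,amount\n Ada ,100.5\nBayo,\nChi,20"
records = transform(extract(raw))
output = load(records)
```

In production this same extract/transform/load split is typically expressed in a framework such as Spark, with the quality checks and schema enforcement the responsibilities above describe.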
Requirements
- Education Qualification: Computer Science or Engineering Degree – must have
- Statistics and Mathematical qualification – plus
- Big data and business intelligence certifications are an advantage.
Experience Criteria:
- At least 3 years hands-on ETL experience
- Hands-on telco experience
- Strong foundation in big data concepts - map-reduce, RDDs, batch and stream processing, data formats, ETL process flow
- Experience in Python programming.
- Proficient in Spark and Spark streaming architecture.
- Experience with agile software development practices, secure coding practices, code reviews, and software architecture.
- Ability to use distributed computing techniques to design, develop and test scalable applications that operate on large-volume datasets.
- Familiarity with handling datasets containing mixes of structured and unstructured data.
- Ability to transform unstructured data into forms suitable for analysis and modeling.
- Ability to write ad-hoc scripts and queries, schedule batch jobs, and develop and monitor real-time streaming applications.
- Experience using Big Data based applications/tooling/languages such as Hadoop, Spark, Kafka, Hive, HBase.
- Experience integrating and implementing AI algorithms and logic into enterprise project delivery.
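The map-reduce pattern named in the criteria above can be sketched on a single machine in plain Python; Hadoop and Spark run the same idea distributed across a cluster. This word-count example is illustrative only, with the sample lines chosen arbitrarily.

```python
from functools import reduce
from itertools import chain

def mapper(line: str) -> list[tuple[str, int]]:
    # Map: emit a (word, 1) pair for every word in the line.
    return [(word.lower(), 1) for word in line.split()]

def reducer(counts: dict, pair: tuple[str, int]) -> dict:
    # Reduce: sum the counts per key (the shuffle step is implicit here).
    word, n = pair
    counts[word] = counts.get(word, 0) + n
    return counts

lines = ["to be or not to be", "to stream or to batch"]
pairs = chain.from_iterable(mapper(line) for line in lines)
word_counts = reduce(reducer, pairs, {})
```

In Spark the equivalent batch job would map lines to pairs and call `reduceByKey` on an RDD, which is the distributed form of the reducer shown here.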
Soft Skills:
- Good communication skills
- Analytical and problem-solving skills
- Presentation and documentation skills
Tools:
- Good knowledge of the following applications:
- MS Office
- MS Visio
- SQL and NoSQL query builders.