Big Data Engineer (Databricks, SQL, Hadoop, Python, Pyspark, Kafka) at Interswitch.
Interswitch is an Africa-focused integrated digital payments and commerce company that facilitates the electronic circulation of money as well as the exchange of value between individuals and organisations on a timely and consistent basis. We started operations in 2002 as a transaction switching and electronic payments processing company, and have progressively evolved into an integrated payment services company, building and managing payment infrastructure as well as delivering innovative payment products and transactional services throughout the African continent. At Interswitch, we offer unique career opportunities for individuals capable of playing key roles and adding value in an innovative and fun environment.
We are recruiting to fill the position below:
Job Title: Big Data Engineer (Databricks, SQL, Hadoop, Python, Pyspark, Kafka)
Department: Indeco – Industry Ecosystem & Platforms
Job type: Permanent
Job Description
- Collecting, storing, processing, and analyzing huge sets of data. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.
- You will also be responsible for integrating these solutions with the architecture used across the company. You will sit at the intersection of data, engineering, and product.
- Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities
- Implementing ETL processes
- Designing, building, maintaining, and testing robust data pipelines for different types of data streams
- Monitoring performance and advising on any necessary infrastructure changes
- Analyzing and defining data retention policies.
Requirements
- B.Sc in Computer Science, Engineering, or a related field
- Big Data-related certifications / qualifications are required
- 3+ years of industry experience in your area of technical expertise.
Application Closing Date
6th December, 2021.
How to Apply
Interested and qualified candidates should:
Click here to apply online