Python Engineer – ETL Development - Tunis, Tunisia - BINITNS

    Posted a month ago

    Binit Nearshore Services (BinitNS) is a Consulting and Services company operating in the ITO and BPO areas. We advise our European customers on their IT projects: from Business Process Digitalization to Infrastructure evolutions and Cloud migrations.

    One of our clients is a technology start-up focused on solving the increasing data challenges created by the continued growth of private markets: private equity, venture capital, real estate, and others. It uses a variety of artificial intelligence and machine learning techniques to automate the extraction of critical data from unstructured and semi-structured documents, simplifying the demanding workflows of private market participants such as institutional investors, funds, and fund administrators.

    For that client, we are looking to hire a Python Engineer based in Tunis, who will collaborate with engineering teams in London and Paris to deliver high-quality software solutions.

    As a Python Engineer, you will play a pivotal role in developing and maintaining ETL (Extract, Transform, Load) solutions that translate data from internal data stores (PostgreSQL, Databricks, S3) into various custom formats and layouts, fulfilling diverse client needs.

    This role demands a blend of technical prowess in Python programming, a deep understanding of data manipulation techniques, and a commitment to producing secure, high-quality code.

    Your Missions:

    • Design, build, and maintain Python ETL jobs that accurately extract data from a variety of sources, including internal APIs (REST, gRPC, GraphQL), S3 (Parquet), Databricks SQL, and PostgreSQL; transform it following specific business logic; and load it into diverse layouts and file formats.
    • Collaborate with the product team to gather requirements and adapt ETL processes and outputs to meet evolving needs.
    • Implement and refine automated testing frameworks to ensure data transformations and outputs maintain high integrity and accuracy.
    • Develop monitoring tools to track ETL job performance, with automated alerting for failures.
    • Leverage PySpark and Databricks to build, modify, and maintain data pipelines that seamlessly integrate with analytics tools, supporting the creation of interactive dashboards following product team requirements.
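
    To give a concrete sense of the first mission, here is a minimal, illustrative Python ETL skeleton. The column names, the business rule, and the in-memory source are hypothetical stand-ins; in the real role the extract step would read from S3 (Parquet), Databricks SQL, PostgreSQL, or internal APIs:

```python
import io

import pandas as pd


def extract(raw_records):
    # Hypothetical source: in production this step would read from S3
    # (Parquet), Databricks SQL, PostgreSQL, or an internal API.
    return pd.DataFrame(raw_records)


def transform(df):
    # Hypothetical business rule: keep funded rows and express the
    # amount in millions of euros.
    out = df[df["amount_eur"] > 0].copy()
    out["amount_eur_m"] = out["amount_eur"] / 1_000_000
    return out[["fund", "amount_eur_m"]]


def load(df):
    # Render to one of the "diverse layouts and file formats";
    # CSV is used here for simplicity.
    buf = io.StringIO()
    df.to_csv(buf, index=False)
    return buf.getvalue()


records = [
    {"fund": "Alpha", "amount_eur": 2_500_000},
    {"fund": "Beta", "amount_eur": 0},
]
csv_out = load(transform(extract(records)))
print(csv_out)
```

    Each stage is a plain function, which keeps the pipeline easy to unit-test and to re-wire when the product team changes an output layout.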


    Required Skills and Qualifications:

    • Engineering or master's degree and at least 2 years of experience in development with the required technologies.
    • Proficiency in Python with a strong grasp of data manipulation and processing.
    • Commitment to type safety and secure coding practices in Python.
    • Skilled in writing unit and integration tests, with experience in CI/CD pipeline integration.
    • Deep knowledge of API integration (REST, gRPC, GraphQL) and complex SQL query construction.
    • Experience with pandas for ETL processes, and Spark for handling large datasets.
    • Familiarity with Git and GitHub for version control.
    • Background in automated testing, CI/CD practices, and an understanding of data security and privacy principles, including authentication methods, encryption, and cryptographic data signing basics.
    • Experience with major cloud platforms (AWS, Azure, Google Cloud) and their data services.
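
    As an illustration of the testing expectations above, a unit test for a data transformation might look like the following sketch. The transformation, its column names, and the test data are all hypothetical:

```python
import pandas as pd


def dedupe_latest(df: pd.DataFrame) -> pd.DataFrame:
    # Hypothetical transformation under test: keep only the most
    # recent row per fund, ordered by report_date.
    return (
        df.sort_values("report_date")
          .drop_duplicates("fund", keep="last")
          .reset_index(drop=True)
    )


def test_dedupe_latest_keeps_most_recent():
    df = pd.DataFrame({
        "fund": ["Alpha", "Alpha", "Beta"],
        "report_date": ["2024-01-01", "2024-06-01", "2024-03-01"],
        "nav": [100, 110, 90],
    })
    result = dedupe_latest(df)
    # One row per fund, and Alpha's row is the June one.
    assert len(result) == 2
    assert result.loc[result["fund"] == "Alpha", "nav"].item() == 110


test_dedupe_latest_keeps_most_recent()
```

    Tests in this style run under pytest and slot directly into a CI/CD pipeline, guarding the integrity of each transformation.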

    Preferred Skills and Qualifications:

    • Experience with NoSQL databases and Databricks.
    • Familiarity with containerization and orchestration tools (Docker, Kubernetes).
    • Knowledge of additional programming languages (Java, Kotlin, Scala, Go) is a plus.

    We offer good working conditions and an attractive package.
    Starting date: as soon as possible.