A perfect role for an experienced Data Engineer with leadership capabilities to shape the future of the company's data infrastructure and processes, driving impactful projects from the ground up.

Key Responsibilities:
- Develop and maintain data pipelines, data warehouses, and cloud-based systems.
- Engage with stakeholders to design and build analytical products and services.
- Participate in daily stand-ups and team meetings to coordinate and collaborate on development efforts.
- Build performant batch and streaming data pipelines capable of handling large volumes of real-time market data and batch data (Apache Airflow, Apache Kafka, Apache Flink).
- Use tools such as Snowflake to construct reliable and scalable data warehousing solutions.
- Use AWS or GCP to build robust and scalable cloud-based systems.
- Improve CI/CD processes to automate development, testing, and deployment using Jenkins and GitLab.
- Oversee the maintenance and monitoring of various data applications, pipelines, and databases.

What you will bring:
- Strong data engineering experience.
- Proficiency in Python and SQL.
- Experience with cloud platforms (AWS or GCP).
- Knowledge of ETL processes and event streaming technologies (such as Kafka and Flink).
- Good communication skills.