Develop and manage secure and compliant data processing pipelines using various tools and techniques.
Resolve unexpected problems efficiently, minimizing data loss and ensuring data integrity.
Ensure data pipelines and storage systems are efficient, well-organized, reliable, and meet business requirements.
Design, implement, monitor, and optimize data platforms to support pipeline needs.
Qualifications
Bachelor’s degree in engineering, science, business, or a related field.
Over 3 years of hands-on experience in data engineering, ETL/ELT development, or data ingestion roles.
Strong experience with Databricks and CosmosDB, and familiarity with Azure Data Lake Storage, Azure Key Vault (or equivalents on GCP/AWS), and secure data ingestion practices.
Excellent team collaboration, prioritization, adaptability, and communication skills in English.