QUALIFICATIONS
• Bachelor’s degree (BS/BE) in Computer Science or a related field,
• Minimum 1–2 years of experience,
• Knowledge of GDPR and data privacy regulations,
• Strong proficiency in SQL,
• Experience with cloud platforms (Azure, GCP, and/or AWS),
• Experience with object-oriented scripting languages, preferably Python,
• Linux and shell scripting experience is a plus,
• No military service obligation for male candidates,
• Fluency in English,
• Experience with Jenkins, Git, Airflow, or Spark is nice to have.
JOB DESCRIPTION
• Set up APIs to ingest data from various sources (including via web scraping),
• Build ETL pipelines for data tables and ensure data accuracy and consistency across all our data sources,
• Integrate internal and external data sources into the main data warehouse environment,
• Implement and manage GDPR and regional data security/privacy rules,
• Develop the code needed to aggregate source data (e.g., CRM ETLs),
• Perform data processing tasks such as migration, preparation, and aggregation,
• Configure alarms for monitoring,
• Schedule report generation,
• Support the analytics platform: administration, monitoring, maintenance, support, and routine upgrade/patching tasks.