Responsibilities
Understand, profile, and assess the quality of raw data from diverse client sources
Design, build, test, and maintain scalable and robust data pipelines for ingestion, transformation, and enrichment
Lead discussions with client stakeholders on data architecture, integration, and delivery strategies
Required Qualifications
BSc, MSc, or PhD in computer science, software engineering, or a related technical field
Expertise in distributed data processing frameworks (e.g., Spark, Hadoop)
Strong coding skills in at least one of Python, Go, or Scala, with Python proficiency preferred
Experience with workflow orchestration tools (e.g., Airflow, Dagster, Prefect)
Familiarity with cloud environments (AWS, GCP, or Azure) and infrastructure-as-code tools