• Bachelor's degree in Computer Science or a related technical field.
• 7+ years of experience in analytics, data development, and data warehousing.
• Experience working with Big Data technologies (Hadoop, Hive, Spark, Kafka, Kinesis).
• Experience with large-scale data processing systems, both batch and streaming (Hadoop, Spark, Kinesis, Flink).
• Experience programming in Python, Java, or Scala.
• Experience with data orchestration tools (Airflow, Oozie, Step Functions).
• Solid understanding of database technologies including NoSQL and SQL.
• Track record of delivering reliable data pipelines with solid test infrastructure, CI/CD, data quality checks, monitoring, and alerting.
• Strong organizational and multitasking skills with the ability to balance competing priorities.
• Excellent verbal and written communication and interpersonal skills, with the ability to communicate effectively with both business and technical teams.
• Ability to work in a fast-paced environment where continuous innovation and ambiguity are the norm.
• Experience with AWS big data technologies (S3, EMR, Kinesis, Redshift, Glue).