• Take ownership of core components of data ingestion & integration processing that intake ~17 billion events per hour, with data volume in excess of 2 TB per hour.
• Design and implement new features and enhancements to our data platform that advance component architecture toward strategic data governance and management objectives
• Work with world-class high-frequency/low-latency design experts to perform in-depth analysis and optimization of data pipeline components, ensuring smooth execution within strict time and resource limitations
• Work closely with product stakeholders and users to understand data and reporting requirements
• Prioritize bug fixes to ensure critical up-time
• Serve as a mentor and guide for other team members
BA/BS/MS degree and 2-20 years of experience; we'll designate the level of the role based on experience. (Degree in Computer Science or a related field preferred)
• Must be an expert coder in at least one 3GL (Java, C#, C++, or C), with a deep understanding of distributed and low-latency design patterns
• Experience with Big Data and distributed systems using technologies such as Hadoop, Spark, Python, Java and Scala
• Experience and achievements in large-scale data deployments and/or high-frequency/low-latency projects
• Familiarity with Enterprise Data Management & Governance practices or Data Architecture methodologies is a plus
Job ID 2013379X Date posted 03/27/2020