Principal Data Engineer
Posted 24 days ago
| Estimated Pay | $22 per hour |
| --- | --- |
| Hours | Full-time, Part-time |
| Location | Metropolitan, Michigan |
Estimated pay: $22.03 per hour based on our data (range: $13.19 to $33.98).
About this job
Job Description
Infomatics has partnered with a large finance company that is hiring a Principal Data Engineer on a direct/FTE basis in Detroit, Michigan. This is an on-site position. All applicants must be willing and eligible to be hired directly on a W2 basis without sponsorship. No third parties need apply.
Responsibilities:
- Design, build, and enhance Data Cloud workflows/pipelines to process billions of records in large-scale data environments, including end-to-end design and development of near-real-time and batch data pipelines.
- Design and build a Data Warehouse based on the Data Vault (DV-2) data model.
- Design and build a Data Lake and big data analytic solutions on AWS using streaming and batch processes.
- Develop test strategies, software testing frameworks, and test automation.
- Champion a modern engineering culture and best-practices-based software development.
- Leverage DevSecOps techniques and have working experience with modern tools such as GitHub, Jira, Jenkins, Crucible, and build automation.
- Engage in application design and data modeling discussions.
- Participate in developing and enforcing data security policies.
- Drive delivery efficiency with automation and reusable components.
Background & Experience Required:
- Bachelor's degree in Computer Science or related field required.
- Minimum 5 years' experience in the field of data engineering, involving analytics-focused data warehouse environments such as AWS, Snowflake, Hadoop, Oracle, etc.
- Minimum 2 years' working experience in AWS utilizing services such as S3, AWS CLI, and DynamoDB.
- Deep working knowledge of NoSQL, RDBMS, SQL, JSON, and XML, along with strong ETL skills.
- Extensive experience in data transformations, cleansing, and de-duplication
- Advanced knowledge of SQL (PSQL or TSQL)
- Experience developing data pipelines for both Cloud and Hybrid Cloud infrastructures
- Knowledge of Python and other scripting languages is highly desirable.
- Experience using modern ETL tools such as InfoSphere Datastage Cloud Pack, Apache NiFi, etc.
- Experience working in an Agile delivery environment
- Hands-on experience building and using DevOps pipelines and automation.