Estimated Pay: $60 per hour
Hours: Full-time, Part-time
Location: Columbus, Ohio

Compare Pay

We estimate that this job pays $60.34 per hour based on our data, within a range of $35.74 to $77.88.


About this job

Job Description

Data Engineer (Java/Spark/AWS/ETL)
Work Location: Columbus, OH (must be local to OH); Hybrid (3 days onsite, 2 days remote)
Experience: Minimum 8+ years required
Position Type: W2 contract

Must-Have Skills:
- Teradata and DBMS knowledge
- Cloud knowledge, preferably AWS
- ETL knowledge
- CI/CD and data warehouse concepts
- Java & Spark

- Candidates with both database and ETL knowledge will receive strong preference
- The work is data pipeline development, sending data to the data warehouse; the pipeline is currently built on Java/Spark (Java/Spark is a MUST HAVE)
- Must have Teradata and DBMS knowledge (SQL)
- Must have solid ETL knowledge (Ab Initio or Informatica is fine)
- Must have solid CI/CD and data warehouse concepts
- Must have good AWS experience
- Postgres DB is a big plus
- Snowflake is a big plus but not a must-have; the team will eventually use some Snowflake
- Python is a big plus; the team will eventually move from Java/Spark to Python/Spark
- Looking for 8-10+ years of total experience
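The pipeline work described above follows the standard extract-transform-load pattern: pull records from a source system, clean and reshape them, then write them to the warehouse. As a minimal, framework-free sketch of that pattern (plain Python standing in for the team's Java/Spark stack; every function and field name here is illustrative, not the employer's code):

```python
# Minimal ETL sketch. Plain Python stands in for Java/Spark;
# all names and fields here are illustrative assumptions.

def extract(source_rows):
    """Pull raw records from an upstream source (e.g., a Teradata query result)."""
    return list(source_rows)

def transform(rows):
    """Clean and reshape records before loading them into the warehouse."""
    out = []
    for row in rows:
        if row.get("amount") is None:  # drop incomplete records
            continue
        out.append({
            "id": row["id"],
            "amount_usd": round(float(row["amount"]), 2),
        })
    return out

def load(rows, warehouse):
    """Append transformed records to the target table; return the count loaded."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
raw = [{"id": 1, "amount": "19.991"}, {"id": 2, "amount": None}]
loaded = load(transform(extract(raw)), warehouse)
```

In the actual role, each stage would map to Spark operations (reads, DataFrame transformations, warehouse writes) orchestrated through a CI/CD pipeline on AWS.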

Job Responsibilities:
- Execute software solutions, design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Write secure, high-quality code and maintain algorithms that run synchronously with appropriate systems.
- Produce architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development.
- Apply knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation.
- Apply technical troubleshooting to break down solutions and solve technical problems of basic complexity.
- Gather, analyze, synthesize, and develop visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
- Proactively identify hidden problems and patterns in data and use these insights to drive improvements to coding hygiene and system architecture.
- Contribute to software engineering communities of practice and events that explore new and emerging technologies.
- Add to the team culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills:
- 8 to 10 years of Spark-on-Cloud development experience
- 9 to 10 years of strong SQL skills; Teradata is preferred, but experience with any other RDBMS is acceptable
- Proven experience understanding requirements related to the extraction, transformation, and loading (ETL) of data using Spark on Cloud
- Formal training or certification in software engineering concepts and 3+ years of applied experience
- Ability to independently design, build, test, and deploy code; should lead by example and guide the team with their technical expertise
- Ability to identify risks/issues for the project and manage them accordingly
- Hands-on development experience and in-depth knowledge of Java/Python, Microservices, Containers/Kubernetes, Spark, and SQL
- Hands-on practical experience in system design, application development, testing, and operational stability
- Experience developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Proficiency in coding in one or more programming languages
- Experience across the whole Software Development Life Cycle
- Proven understanding of agile methodologies and of CI/CD, Application Resiliency, and Security
- Proven knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile)

Preferred qualifications, capabilities, and skills:
- Knowledge of data warehousing concepts
- Experience with Agile-based project methodology
- Knowledge of or experience with ETL technologies such as Informatica or Ab Initio is preferable
- People management skills are given preference but are not mandatory