Hours: Full-time, Part-time
Location: Jenkintown, Pennsylvania

About this job

Data Engineer
Become a Part of the NPT Team

National Philanthropic Trust is a public charity dedicated to providing philanthropic expertise to donors, foundations, and financial institutions, enabling them to realize their philanthropic aspirations. NPT was founded in 1996. Since that time, we have raised more than $57.3 billion in charitable contributions and currently manage $35.1 billion in charitable assets. We have made more than 720,000 grants totaling more than $28.5 billion to charities all over the world. We rank among the largest grantmaking institutions in the United States.

Our mission is to increase philanthropy in society. To that end, our experienced staff of philanthropic professionals are fully prepared to help you establish and administer your donor-advised fund. NPT is led by a Board of Trustees composed of nationally known experts in philanthropy and business.

At National Philanthropic Trust, we foster a welcoming environment for all. People are valued and respected for who they are, with opportunities to bring their entrepreneurial spirit and talents to increase giving around the world. We have an inclusive, supportive, collaborative culture that makes National Philanthropic Trust one of the most rewarding places to work.
What You'll Do
The Data Engineer should have a broad foundation in data architecture principles, experience working with integration technologies, and a demonstrated ability to bring an AI-centric mindset to the team. They will play a key role in driving our data-driven transformation by designing, implementing, and maintaining our data infrastructure.
This role will also be responsible for contributing to the strategic development and maintenance of reports and dashboards for business users, driving scalability and furthering data integrity and accuracy by applying long-term data management best practices. Once established, they will act as a subject matter expert on NPT's data models, identifying key opportunities to improve data integrity, dataset segmentation and modularity, and scalability techniques for larger reports and queries. This work will be executed while strategically iterating toward effective use of emerging AI technologies.
Salary range: $80,000 - $85,000
Duties/Responsibilities:
  • Collaborate with the team to implement data warehousing, ETL solutions, and automated data flows, supporting API development and establishing a foundation for an AI-centric landscape.
  • Continuously improve our data management platform using established software development methodologies.
  • Define development and data integration standards to ensure consistency and efficiency within the team.
  • Implement and maintain the data management backbone of complete solutions as part of the data integration strategy to proactively support rapid business changes and empower our AI initiatives.
  • Integrate new data sources, expanding our external data interactions within partner data models.
  • Develop test cases to validate data integration meets or exceeds expectations.
  • Serve as a resource for assigned data-related projects, ongoing data integration, and maintenance.
  • Ensure data model alignment across APIs, applications, reporting, and AI consumption.
  • Design and build input/output interfaces as part of our expansion into custom application development and evolving reporting landscape.
  • Assist in transitioning ongoing maintenance and development of mature products to support resources.
  • Collaborate on defining internal data entry standards through automation and technical enforcement.
  • Oversee existing technical and user-level data process documentation.
  • Utilize and maintain data flow diagrams and process materials for critical business applications.
  • Stay updated on industry trends to proactively identify areas for growth, improvement, and development.
  • Develop and manage ETL processes using SSIS and MuleSoft as part of end-to-end solutions.
  • Serve as a data architect for database development within both the datamart and internally developed applications.
  • Build and maintain SQL interaction flows through codebase execution.
  • Provide oversight on the maintenance of our SQL Server software configuration, ensuring best practices in security, optimization, and integration.
  • Translate business requirements into technical guidance for data analysts and contractors on Power BI modeling, application integration, and development.
  • Plan future implementations into cloud infrastructure through scalable deployments and management.
The above list is not exhaustive; additional tasks may be assigned as necessary but are not a major function of the position.
What You Bring:
  • Passion for expanding into cloud and AI architecture.
  • Demonstrated experience working with API implementations.
  • Understanding of data preparation techniques, such as cleaning, outlier detection, and dimensionality reduction.
  • Solid foundation in programming languages used for data manipulation (e.g., Python, R, Java, C++).
  • Familiarity with machine learning models and their requirements.
  • Extensive knowledge of implementing ETL processes using tools like MS SSIS and MuleSoft.
  • Strong understanding of basic programming and ETL logic (loops, data types, parsing).
  • Experience managing Power BI infrastructure and supporting code-based integration points (DAX, PowerQuery, TSQL).
  • Experience building data input/output interfaces within application platforms as part of a comprehensive solution.
  • Comfort with Windows Server environments and MS SQL Server infrastructure.
  • Extensive knowledge of Salesforce CRM object, field, and custom metadata structures.
  • Advanced knowledge of Excel (including pivot tables, macros, VLOOKUPs, queries, advanced formulas).
  • Ability to conduct basic troubleshooting.
  • Proven track record of delivering project implementations as a technical leader.
  • Experience working in a small company or team is a plus.
  • Self-motivated, collaborative, and willing to define and drive the direction of the role.
  • Ability to work independently and collaboratively, with a strong sense of urgency in meeting deadlines and resolving critical issues.
  • Excellent written and oral communication skills.
Education and Experience:
  • Bachelor's degree or equivalent experience in a technical, computer science, or math discipline.
  • 7+ years contributing to project-based IT implementations.
  • 5+ years of experience working directly with MS SQL (TSQL).
  • 2+ years of experience developing data processes for integration.
Physical Requirements:
  • Prolonged periods of sitting at a desk and working on a computer.