Estimated Pay: $84 per hour (based on similar jobs in your market)
Hours: Full-time
Location: Washington, District of Columbia

About this job

COSMOTE Global Solutions, a member of the OTE Group of Companies, is an ICT systems integrator delivering a broad range of ICT solutions and services.

CGS provides a broad range of ICT Services focusing on: Cloud, Data Centre operations, Networking, Cybersecurity, BI and Data Warehouse, Big Data, Service Desk, Proactive Monitoring, Operations and Support, Service Management, Project and Programme Management, and Professional Services.

Responsibilities:

  • Design and implement ETL processes to extract, transform, and load structured, semi-structured, and unstructured data.
  • Integrate data into downstream systems, including applications, data warehouses, and data lake storage platforms.
  • Collaborate with data scientists, analysts, and other stakeholders to ensure data availability and usability.
  • Develop and maintain scalable data pipelines using tools such as Python, Kafka, and REST/SOAP APIs.
  • Manage and optimize relational databases such as Sybase, Oracle, and PostgreSQL.
  • Automate schema versioning and migrations using Liquibase.
  • Build and maintain cloud-based data infrastructure on AWS and Azure.
  • Use Anaconda and Python for data manipulation, scripting, and analytics support.
  • Ensure data quality, consistency, and security across all environments.
  • Contribute to the strategy, development, and execution of enterprise-wide data initiatives.
  • Provide ongoing maintenance and support for the data analytics infrastructure.

Requirements

  • Master’s degree in Computer Science, Software Engineering, or a related field.
  • Proficiency in English.
  • Expertise in REST/SOAP APIs, API Gateway, and Kafka.
  • Experience with the JSF, JBoss Seam, and RichFaces frameworks.
  • Proficiency with Liquibase, SonarQube, and GitHub/Nexus/AzureSEA.
  • Strong experience in CI/CD using Jenkins, Maven, Azure DevOps, Terraform, Ansible, Kubernetes, Linux, and Helm.
  • At least 7 years of experience provisioning cloud infrastructure, mainly on AWS and Azure, including Terraform, Ansible, Kubernetes, Linux, and Helm.
  • Database management experience with Sybase, Oracle, and PostgreSQL.
  • Strong analytical and problem-solving skills.
  • At least 3 years of experience with Agile methodologies (Scrum/Kanban).
  • At least 3 years of experience with Anaconda and Python.
  • At least 5 years of experience coordinating second-level production support for an application maintained by a team of at least 20 IT professionals.

Posting ID: 1184335997 Posted: 2025-12-04 Job Title: Senior Data Engineer