Hours: Full-time, Part-time
Location: Oak Ridge, Tennessee

About this job

Job Description

Data Operations Analyst
An East Tennessee research company is seeking a Data Operations Analyst/Engineer to join its growing team. This is a contract-to-hire opportunity; candidates must pass a background check and drug screen and be able to obtain a federal security clearance. The position is onsite and is a great opportunity for someone seeking to work with cutting-edge technologies in an R&D environment.
Major Duties/Responsibilities:
You will work independently and as part of a team on the research, analysis, and operation of backend data flow processes. Responsibilities include:
Analyzing scientific data-related problems and formulating the necessary solutions.
Using information technologies (e.g., Python and Bash scripting) and learning new technologies.
Conducting requirements analysis.
Developing automated workflows using scripting languages (a brief illustrative sketch follows this list).
Working at the Linux command-line interface and with open source tools.
Developing database relationships and queries.
Configuring, deploying, and executing backend processes on Linux-based servers.
Working with distributed and diversified teams on various data flow processes, multitasking as needed.
Creating documentation for developers, other operational team members, and program staff.
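By way of illustration only, a simple data-preparation step of the kind this role automates might resemble the following Python sketch. The directory paths, file extension, and checksum policy are hypothetical assumptions, not details from this posting; the sketch uses only the Python standard library.

    import hashlib
    import logging
    import shutil
    from pathlib import Path

    INCOMING = Path("/data/incoming")  # hypothetical landing directory
    ARCHIVE = Path("/data/archive")    # hypothetical archive root

    logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

    def stage_file(src: Path) -> None:
        """Checksum one incoming file, then move it into the archive tree."""
        digest = hashlib.sha256(src.read_bytes()).hexdigest()
        dest = ARCHIVE / src.name
        shutil.move(str(src), str(dest))
        # Keep the checksum next to the archived file for later verification.
        dest.with_suffix(dest.suffix + ".sha256").write_text(f"{digest}  {dest.name}\n")
        logging.info("archived %s (sha256=%s)", src.name, digest[:12])

    if __name__ == "__main__":
        for path in sorted(INCOMING.glob("*.dat")):  # hypothetical extension
            stage_file(path)

A production workflow would add error handling, scheduling (e.g., cron or a workflow engine), and site-specific archive conventions; the sketch shows only the general shape of the task.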
We are seeking a self-starter willing to develop expertise in data preparation for archiving and distribution, as well as in advanced data tools development and data flow operations. The following skills and experience are desirable:
Qualifications:
An associate's or bachelor's degree in Computer Science, Information Systems, or a related field is preferred.
Two years of progressive Linux systems administration experience, including monitoring and managing data workflows and standards, is required.
Experience working in a DevOps environment is required.
Strong software experience building data analysis tools or automating data workflows using modern technologies.
Experience working with scientific data is a plus.
Knowledge of and experience with programming languages such as Python and other scripting languages is a must.
Knowledge of and experience with database applications such as PostgreSQL and a database query language such as SQL (see the brief query sketch after this list).
Knowledge of any of the following is a plus: web servers (Apache, NGINX, Tomcat, etc.), DNS (BIND), OpenLDAP, databases (PostgreSQL, MySQL, NoSQL), virtualization and container infrastructure (VMware, Docker, Singularity, Kubernetes), storage and communication protocols (ZFS, NFS, FTP, SCP, etc.), HPC/cluster technologies (Slurm), and monitoring (Nagios, Checkmk).
Good communication skills in English, both oral and written.
Desire to learn and adopt new tools and technologies as required by the projects.
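As a brief illustration of the database query development mentioned above, the following Python sketch reads rows from a PostgreSQL table. The connection string, table, and column names are hypothetical, and the psycopg2 driver is an assumption rather than a detail from this posting; any PostgreSQL client library would serve equally well.

    import psycopg2  # third-party PostgreSQL driver (an assumption)

    # Hypothetical DSN; real credentials would come from site configuration.
    conn = psycopg2.connect("dbname=datasets user=analyst host=localhost")
    try:
        with conn.cursor() as cur:
            # Parameterized query: list datasets ingested on or after a cutoff date.
            cur.execute(
                "SELECT name, ingested_on FROM dataset"
                " WHERE ingested_on >= %s ORDER BY ingested_on",
                ("2024-01-01",),
            )
            for name, ingested_on in cur.fetchall():
                print(ingested_on, name)
    finally:
        conn.close()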