Job Reference #
Are you a developer who is passionate about coding and wants to build a career in Java, Big Data, and Cloud technologies? Are you a passionate leader for change? Our ideal candidate will participate in the design and development of a complex strategic Risk Aggregation platform using Core Java, Big Data, grid computing, and caching technologies, and will interact with globally distributed Market Risk Officers, Quants, and the IT Dev/QA team.
IT Java Developer
Country / State
United States - Tennessee
Information Technology (IT)
We're a truly global, collaborative and friendly group of people. Having a diverse, inclusive and respectful workplace is important to us. And we support your career development, internal mobility and work-life balance. If this sounds interesting, apply now.
Disclaimer / Policy Statements
UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
• Design and develop high-quality software solutions for Market Risk VaR (Value-at-Risk) aggregation
• Develop new applications to meet regulatory commitments (e.g. FRTB, GMETH)
• Improve the performance of the Risk Calculation Engine
• Implement scalable solutions to handle ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Hadoop, and Microsoft Azure
• Develop proofs of concept using Big Data, Apache Spark, and Azure
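To give candidates a flavor of the domain: VaR aggregation typically means combining per-position scenario P&L vectors before taking a tail percentile, rather than summing standalone VaR numbers. The sketch below is illustrative only (plain Java, historical-simulation style, invented sample data) and is not the platform's actual engine or API.

```java
import java.util.Arrays;

// Illustrative sketch: aggregating Value-at-Risk by historical simulation.
// Per-position P&L vectors over a common scenario set are summed
// scenario-by-scenario; VaR is then read off the aggregated distribution.
public class VarAggregation {

    // Sum per-position scenario P&L vectors into one portfolio vector.
    static double[] aggregate(double[][] positionPnls) {
        int n = positionPnls[0].length;
        double[] portfolio = new double[n];
        for (double[] pnl : positionPnls) {
            for (int i = 0; i < n; i++) {
                portfolio[i] += pnl[i];
            }
        }
        return portfolio;
    }

    // Historical-simulation VaR: the loss at the given left-tail percentile.
    static double var(double[] pnl, double confidence) {
        double[] sorted = pnl.clone();
        Arrays.sort(sorted);
        int idx = (int) Math.floor((1.0 - confidence) * sorted.length);
        return -sorted[idx]; // a loss is negative P&L, reported as positive VaR
    }

    public static void main(String[] args) {
        // Two positions, five shared scenarios (made-up numbers).
        double[][] pnls = {
            { -5, 2, 1, -3, 4 },
            {  3, -4, 2, -1, 0 }
        };
        double[] portfolio = aggregate(pnls);
        System.out.println("80% VaR over 5 scenarios: " + var(portfolio, 0.8));
    }
}
```

In production such vectors would be distributed across a Spark or grid-computing cluster; the scenario-wise sum is what makes the aggregation embarrassingly parallel.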
• Bachelor's or Master's degree (or the international equivalent) in Computer Science, Information Technology, Math, Physics, Engineering, or a related field
• Strong hands-on experience with Core Java, Python, and Scala
• Experience in a technical role on a team building high-performance, large-scale systems through all project lifecycle stages (design, build, testing)
• 5+ years of experience as a data engineer designing and developing large-scale distributed software systems using open-source tools and big data technologies such as Apache Spark on Hadoop platforms (Hortonworks, Cloudera, or Databricks)
• Experience with clustered/distributed computing systems such as Hadoop and Spark, NoSQL databases, and analytics notebooks such as Jupyter and Zeppelin
• Knowledge of data warehousing best practices, modeling techniques and processes, and complex data integration pipelines
• Experience building data lakes and data pipelines in the cloud using Azure and Databricks is an added advantage
• Experience building data pipelines for structured/unstructured data, real-time/batch, and event-driven/synchronous/asynchronous workloads using Kafka and stream processing
Expert advice. Wealth management. Investment banking. Asset management. Retail banking in Switzerland. And all the support functions. That's what we do. And we do it for private and institutional clients as well as corporations around the world.
We are about 60,000 employees in all major financial centers, in more than 50 countries. Do you want to be one of us?
Posting ID: 559419202
Posted: 2020-07-14