Clearing, Markets & Issuer Services Technology (CMIST)
is responsible for application development and support for critical business systems, including Repo Edge (collateral management), Enterprise Payment Hub (multi-currency payment processing), and Broker Dealer Clearance (securities clearing), along with approximately 350 other applications used by high-priority business services and their clients.

Clearance and Collateral Technology (CCT) within CMIST builds clearance and collateral management platforms to serve BNY Mellon's broker-dealer clients and is the sole provider of Government Securities Clearing Services. Given the core nature of the business and its dominant market share, high performance and resiliency are key pillars of the technology architecture. The group is currently focused on Repo Modernization, which will pave the way to the Future of Collateral for the business, providing strengthened platform resiliency and new trade capabilities within a unified system. All CCT applications are built for real-time operations, aided by a data warehouse for reporting and business insights.

This is a 100% hands-on role and does not involve any team management responsibilities.

Responsibilities:
- Develop, deploy, and maintain scalable, optimized data pipelines and the supporting data platforms and infrastructure
- Standardize all data pipeline processes, develop reusable modules (auditing, logging, traceability), and integrate with the DevOps pipeline
- Recommend improvements to the existing data integration architecture, and implement standard frameworks that automate manual processes, improve overall scalability, increase reliability, and raise quality
- Define the right data models in collaboration with ML engineers, data scientists, and reporting/analytics developers, optimizing data storage, improving data accessibility, and making the models reusable
- Serves as the technical expert in the design, development, implementation, and maintenance of data, reporting, and database technologies and tools
- Consults with businesses to resolve highly complex data issues
- Formulates standards, processes, and procedures aligned with the data architecture/management for major application projects
- Leads the creation and evolution of the strategy and direction of database design, business intelligence, and analytics
- Leads development of complex database designs across multiple parallel projects through in-depth understanding of business needs and functionality
- Consults with database administration and client areas and provides solutions to highly complex issues during translation to a physical database design
- Provides expertise in the most complex processes of integrating data across existing and modified applications
- Provides innovative direction and guidance on reports and ensures recommendations are aligned with user needs and capabilities
- Stays abreast of emerging technologies and identifies potential uses of new and existing technology within lines of business by participating in industry-wide conferences and research and by maintaining in-depth knowledge of various business areas
- Contributes to the achievement of Data Modeling/Warehousing objectives
BNY Mellon is an Equal Employment Opportunity/Affirmative Action Employer. Minorities/Females/Individuals With Disabilities/Protected Veterans. Our ambition is to build the best global team, one that is representative and inclusive of the diverse talent, clients, and communities we work with and serve, and to empower our team to do their best work. We support wellbeing and a balanced life, and offer a range of family-friendly, inclusive employment policies and employee forums.

Qualifications:
- BS degree in Computer Science, Math, Physics, or Engineering; MS/PhD preferred
- 12+ years of experience in software development required
- 5+ years overall hands-on experience building and maintaining production-grade data pipelines, developed in Python or Java, that support critical business functions, reporting, and analytics
- 3+ years overall hands-on experience in Python
- Very strong SQL skills, with multiple years of experience developing and maintaining complex SQL queries across different standard and big data technologies (preferred)
- Extensive experience with big data platforms, tools, and technologies (Cloudera Hadoop, Spark, Hive, Impala, Dremio, Google BigQuery, ECS, etc.), including their architecture, infrastructure requirements, troubleshooting techniques, integration with other platforms, and process automation
- Experience handling data sets that are large in volume (billions of records / >100 TB), wide (>200 columns), and high-velocity (<5-minute micro-batches), with exposure to streaming/event-based architectures
- Hands-on experience with compute and storage on a public cloud platform such as Google Cloud (preferred), AWS, or Azure is highly desirable
- Experience in the securities or financial services industry is a plus.
Primary Location: United States-New York-New York
Internal Jobcode: Clearing Markets ISS Svcs Tech-HR16624
Requisition Number: