The Risk and Finance CIO organization provides technology to several lines of business, including Risk, Compliance, Finance, Audit and Legal, with a combined annual technology spend in excess of $500MM and a global team of 1,100+ people. Risk & Finance Technology is part of the Enterprise Business Technology team within Wells Fargo's Enterprise Information Technology Division.
Risk & Finance Core Services is a horizontal function that supports data & reporting needs across Risk & Finance Technology. It delivers data and reporting as shared services/capabilities for consumption by Risk & Finance applications built by teams aligned to the various business functions across Risk & Finance.
This is a senior technology lead position responsible for developing innovative data & reporting services solutions using big-data, cloud and container-based architectures for Risk & Finance. The role leads a team of developers, analysts and testers. The primary responsibility of this individual is to lead the design and development of the Enterprise Risk data target state technology platform. This lead will deliver data as a service/capability to Enterprise Risk CIO/CTO vertical teams, to be used in various Enterprise Risk applications. This individual will report to a senior technology manager aligned to Enterprise and Credit Risk data & reporting.
- Work closely with the Enterprise Risk CTO vertical teams to understand their data requirements.
- Analyze incoming data requests for Enterprise Risk and determine appropriate target solutions.
- Develop data service design
- Lead the data service build, test and deployment
- Partner with enterprise data teams such as Data Management & Insights and the Enterprise Data Environment (Data Lake) to identify the best place to source the data
- Work with business analysts, development teams and project managers for requirements and business rules.
- Collaborate with source system and approved provisioning point (APP) teams, architects, data analysts and modelers to build scalable, performant data solutions.
- Work effectively in a hybrid environment where legacy ETL and data warehouse applications co-exist with new big-data applications
- Work with Infrastructure Engineers and System Administrators as appropriate in designing the big-data infrastructure.
- Work with DBAs in the Enterprise Database Management group to troubleshoot problems and optimize performance
- Support ongoing data management efforts for Development, QA and Production environments
- Utilize a thorough understanding of available technology, tools, and existing designs.
- Act as an expert technical resource to programming staff in the program development, testing, and implementation process.
- 10+ years of application development and implementation experience
- 5+ years of ETL (Extract, Transform, Load) Programming experience
- 5+ years of Hadoop experience
- 5+ years of reporting experience, analytics experience or a combination of both
- 5+ years of experience creating executive-level dashboards, analytics, and reports using advanced visualization tools
- 5+ years of Java or Python experience
- 3+ years of design and development experience with columnar databases using Parquet or ORC file formats on Hadoop
- 3+ years of Apache Spark design and development experience using Scala, Java, or Python, working with DataFrames and Resilient Distributed Datasets (RDDs)
- 3+ years of Agile experience
- 2+ years of advanced scripting experience using Unix shell scripting, Perl, Python, Java, or PL/SQL
- Excellent verbal, written, and interpersonal communication skills
- Ability to work effectively in a virtual environment where key team members and partners are in various time zones and locations
- Knowledge and understanding of project management methodologies used in waterfall or Agile development projects
- Knowledge and understanding of DevOps principles
- Ability to interact effectively and confidently with senior management
- Experience developing data processing or analytics solutions on Cloud platforms such as AWS
- Experience designing and developing data analytics solutions using object data stores such as S3
- Experience designing and developing data warehouse or analytics applications using Apache Spark
- Experience designing and developing business intelligence or data analytics applications using tools such as Tableau by integrating with Apache Spark
- Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Hive or Apache Storm
- Experience performing proofs of concept
- A BS/BA degree or higher in information technology
- 3+ years of Risk or Finance domain experience in the financial services industry
- A service-oriented mentality and a strong sense of ownership of the assigned scope
- Wells Fargo application development experience
NC-Charlotte: 301 S Tryon St - Charlotte, NC
- All offers for employment with Wells Fargo are contingent upon the candidate having successfully completed a criminal background check. Wells Fargo will consider qualified candidates with criminal histories in a manner consistent with the requirements of applicable local, state and Federal law, including Section 19 of the Federal Deposit Insurance Act.
Relevant military experience is considered for veterans and transitioning service members.
Wells Fargo is an Affirmative Action and Equal Opportunity Employer, Minority/Female/Disabled/Veteran/Gender Identity/Sexual Orientation.