Senior Data Engineer

Baltimore, MD

Job description

The Global Technology team at T. Rowe Price is playing a key role in helping build the future of financial services, working hand-in-hand with business partners to create client experiences that are changing the way people invest. You will work with smart, talented people across our business. 
 
 
 
The Data Domain services team in Baltimore, MD was established to build new capabilities for the firm’s research, data science, machine learning, and business analytics efforts. This is a very young team, enabling you to make an impact from the outset. The team is at the forefront of innovation as we embark on building capabilities to transform large amounts of structured and unstructured data such as text, time series, and events into machine-readable knowledge fueling applications and the investment decision-making process. 
 
 
 
The ideal candidate combines ambition with humility and is happy to let their performance do the talking. The team is responsible for ingesting and processing large, disparate datasets and applying advanced statistical and machine learning methods to produce models and prescriptive data that lead to better decisions by our Portfolio Managers, Analysts, Traders, and Quants, and/or applications. 
 
 
 
Core Capabilities & Attributes:   
  • Exceptional technology skills; recognized by your peers as an expert in your domain 
  • Craftsman-like approach to building software; takes pride in engineering excellence and instills these values in the team and others
    • Participating in the design of effective engineering processes 
    • Product development expertise 
    • Architecture, craftsmanship & engineering discipline 
    • Domain skills & experience 
  • A proponent of strong collaborative software engineering techniques and methods: Agile development, continuous integration, code review or pairing, unit testing, refactoring and related approaches 
  • Excellent problem-solving and critical-thinking skills; demonstrated ability to employ fact-based decision-making to resolve complex problems by applying logical analysis, experience, and business knowledge 
  • Possess a passion for technology and staying sharp in your craft by keeping on top of new technologies, tools and trends 
  • Build and maintain excellent customer relationships 
  • Demonstrable passion for technology (e.g. personal projects, open-source involvement), applying problem-solving skills to deliver solutions with a first-rate engineering approach 
 
 
 
Responsibilities:
  • Engineer world-class products with maximum efficiency and agility 
  • Enable improvement of the engineering team through shaping of tools, processes and standards 
  • Interact with Quants and Analysts to understand their workflows and requirements 
  • Collaborate with your engineering manager to enable a fit-for-purpose application portfolio consistent with the target architecture and operating model 
  • Produce comprehensive, usable dataset documentation and metadata 
  • Evaluate and make decisions around dataset implementations designed and proposed by peer data engineers 
  • Maintain excellent customer relationships 
 
 
 
Minimum qualifications:
  • BS degree in Computer Science, Applied Mathematics, related field, or equivalent practical experience 
  • 5+ years of progressive engineering experience with 3+ years in Data Engineering 
  • Minimum of 3 years designing and building large-scale data loading, manipulation, processing, analysis, blending, and exploration solutions using cutting-edge technologies (e.g. EMR, Spark, Presto/Athena, Kinesis, Lambda, Parquet, ORC, S3, Kafka), NoSQL databases (e.g. HBase, Cassandra), and in-memory data technologies 
  • Strong knowledge of one or more relevant data processing and database technologies, e.g. Spark, Presto/Athena, Google BigQuery, Parquet, ORC, Hive, Redshift/Netezza/Greenplum, PostgreSQL, NoSQL databases (e.g. DynamoDB/HBase/Cassandra), KDB/OneTick 
  • Minimum of 4 years of experience preparing and refining datasets and derived data using both emerging (Python, Scala, Spark) and traditional tools (Trifacta, Alteryx, Informatica, SQL) 
  • Experience in extracting, aggregating, structuring, and optimizing performance of large datasets along different dimensions (e.g. position/instrument/counterparty level) 
  • Strong interpersonal skills; Able to establish and maintain a close working relationship with quantitative researchers, analysts, traders and senior business people alike 
  • Able to thrive in a fast-paced, high energy, demanding and team-oriented environment 
 
 
 
Preferred qualifications:
  • Advanced degree in one or more of the following disciplines: Computer Science, Mathematics, Statistics, Economics, Quantitative Finance, or another quantitative, numerical, or computing discipline 
  • Experience building containerized applications and deploying to public or private clouds, such as Amazon Web Services (AWS), Microsoft Azure, or similar providers 
  • Open source involvement such as a well-curated blog, accepted contribution, or community presence 
  • Proficient with a range of open-source frameworks and development tools – e.g. Angular, Node, Spring Boot, NiFi, Flask, NumPy, SciPy, pandas, Git, Jenkins, Maven, etc. 
  • Proficient on Linux platforms with knowledge of various scripting languages 
  • Working knowledge of data science and data analytics technologies, e.g. Domino Data Lab, Databricks, Jupyter, AWS QuickSight, Qlik, Tableau 
  • A solid understanding of financial markets and instruments 
  • Experience in front-office software development at an asset management firm, hedge fund, or investment bank 
 
 
 
VISA SPONSORSHIP IS NOT AVAILABLE FOR THIS POSITION
