Chicago, London, New York

Data Engineer

Data is the core of the investment process. Data Engineers architect and build the data platforms that drive how we source, enrich, and store the data that feeds the investment process. They own the entire data pipeline: ingesting data from the outside world, transforming that information into actionable insights, and designing the interfaces and APIs that our investment professionals and quantitative researchers use to monetize ideas. Throughout the process, our Data Engineers partner with top investment professionals and data scientists to design systems that solve our most critical problems and answer the most challenging questions in finance.


Your Opportunity:

  • Develop solutions that enable investment professionals to efficiently extract insights from data. This includes owning ingestion (web scrapes, S3/FTP sync, sensor collection), transformation (Spark, SQL, Kafka, Python/C++/Java), and interfaces (API, schema design, events); a minimal sketch of this flow follows this list
  • Partner with the industry’s top investment professionals, quantitative researchers, and data scientists to design, develop, and deploy solutions that answer fundamental questions about financial markets
  • Build tools and automation capabilities for data pipelines that improve the efficiency, quality, and resiliency of our data platform
  • Drive the evolution of our data strategy by challenging the status quo and identifying opportunities to enhance our platform
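
To make the ingest, transform, and interface stages in the first bullet concrete, here is a minimal, hypothetical PySpark sketch of that kind of flow. The bucket, column names, and target table (s3a://example-bucket/..., curated.daily_prices) are illustrative assumptions, not details of the actual platform.

    # Hypothetical sketch only: paths, schema, and table names are assumptions.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

    # Ingest: raw vendor files synced to S3 (illustrative bucket and layout).
    raw = spark.read.option("header", True).csv("s3a://example-bucket/vendor/prices/")

    # Transform: enforce types, deduplicate, and derive a simple daily return.
    w = Window.partitionBy("ticker").orderBy("trade_date")
    prices = (
        raw.withColumn("trade_date", F.to_date("trade_date"))
           .withColumn("close", F.col("close").cast("double"))
           .dropDuplicates(["ticker", "trade_date"])
           .withColumn("daily_return", F.col("close") / F.lag("close").over(w) - 1)
    )

    # Interface: publish a curated table that downstream APIs and notebooks query.
    prices.write.mode("overwrite").saveAsTable("curated.daily_prices")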

Your Skills & Talents:

  • Passion for working with data to accurately model and analyze complex systems such as a publicly traded company, a commodity market, an economy, or a financial instrument
  • Strong interest in financial markets and a desire to work directly with investment professionals
  • Proficiency with one or more programming languages such as Java, C++, or Python
  • Proficiency with RDBMS, NoSQL, and distributed compute platforms such as Spark, Dask, or Hadoop
  • Experience with any of the following systems: Apache Airflow, AWS/GCE/Azure, Jupyter, Kafka, Docker, Kubernetes, or Snowflake
  • Strong written and verbal communication skills
  • Bachelor’s, Master’s, or PhD degree in Computer Science, or equivalent experience


Job Features

Job Category: Software & Engineering
