Develop a scalable data collection, storage, and distribution platform to house data from vendors, research providers, exchanges, prime brokers (PBs), and web scraping.
Make data available to systematic and fundamental PMs and to enterprise functions: Operations, Risk, Trading, and Compliance. Develop internal data products and analytics.
Responsibilities
* Scrape web data using scripts, APIs, and tools
* Help build and maintain a greenfield data platform running on Snowflake and AWS
* Understand existing pipelines and enhance them to meet new requirements
* Onboard new data providers
* Deliver data migration projects
Skills
Must have:
* Streamlit expertise
* Python
* Linux
* Containerization (Docker, Kubernetes)
* Good communication skills
* Strong DevOps skills (Kubernetes, Docker, Jenkins)
Languages
English: B2 (Upper Intermediate)