Develop a scalable data collection, storage, and distribution platform to house data from vendors, research providers, exchanges, prime brokers (PBs), and web scraping.
Make the data available to systematic and fundamental PMs and to enterprise functions (Ops, Risk, Trading, and Compliance), and develop internal data products and analytics.
Responsibilities
1. Web scraping using scripts, APIs, and tools
2. Help build and maintain a greenfield data platform running on Snowflake and AWS
3. Understand the existing pipelines and enhance them for new requirements
4. Onboard new data providers
5. Drive data migration projects
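To illustrate the web-scraping responsibility above, here is a minimal sketch using only the Python standard library. The page structure, class name, and sample data are hypothetical; a production scraper would add HTTP fetching, error handling, and scheduling.

```python
# Minimal scraping sketch using only the standard library.
# The HTML structure and field values here are hypothetical examples.
from html.parser import HTMLParser


class PriceTableParser(HTMLParser):
    """Collects the text of <td> cells into rows, one list per <tr>."""

    def __init__(self):
        super().__init__()
        self.rows = []        # completed table rows
        self._row = None      # row currently being built
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())


# Hypothetical scraped payload; in practice this would come from an HTTP fetch.
sample = "<table><tr><td>ACME</td><td>101.5</td></tr></table>"
parser = PriceTableParser()
parser.feed(sample)
print(parser.rows)  # [['ACME', '101.5']]
```

Parsed rows like these would then be loaded into the platform's staging tables (e.g. on Snowflake) for validation and distribution.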
Skills
Must have:
* Streamlit expertise
* Python
* Linux
* Strong DevOps skills, including containerization (Docker, Kubernetes) and CI/CD (Jenkins)
* Good communication skills
Languages
English: B2 (Upper Intermediate)