Exciting Opportunity for a Java Developer / Data Engineer – Work Remotely from Brazil!

Ready to take your career to the next level with a top-tier US company? If you're passionate about data, thrive in remote work environments, and have the expertise we're looking for, this is your chance! We offer competitive pay, paid time off, and a supportive team culture where you can grow. We need someone with your skills to drive innovation and scale our data systems!

What We're Looking For:
- 5+ years of Java development experience
- Expertise with AWS data processing tools and ETL pipelines
- Proven experience building and optimizing data pipelines
- Ability to work independently and collaboratively in a fully remote role
- Based in Brazil – join our global team!

What We Offer:
- 100% remote work – work from anywhere in Brazil
- Great pay – we value your expertise!
- Paid time off – we support work-life balance
- The opportunity to innovate and impact global projects!

Ready to make a big impact? Apply now and let's build the future of data, together!

About Rocket Financial:
Rocket Financial is an early-stage start-up in the financial technology space. Our mission is to change the way money moves around the planet. Primarily, we are building a business technology platform for companies that want to integrate and embed financial services into their customer applications.

Company Website: https://rocketfncl.com/

RESPONSIBILITIES
- Partner with internal operations teams to identify, collect, and integrate data from various business systems, ensuring comprehensive and accurate data capture
- Design, implement, and maintain robust data pipelines that feed our Data Platform, ensuring high performance, scalability, and reliability
- Ensure data pipelines adhere to best practices and are optimized for performance and scalability
- Conduct thorough testing of data pipelines to validate data accuracy and integrity
- Monitor data pipelines, troubleshoot issues as they arise, and make improvements where applicable
- Establish and track SLAs for data processing and delivery, ensuring timely and reliable access to data for all users
- Mentor less experienced team members, and establish patterns and practices the team can follow to increase the quality, accuracy, and efficiency of its solutions
- Work with other teams to ensure data access aligns with company policies, and that data access, processing, and storage comply with regulatory requirements (e.g., GDPR, CCPA)

QUALIFICATIONS
- 4+ years of experience with AWS data processing tools (storage, processing, etc.)
- 5+ years of Java development experience
- Experience with Kafka and Kafka Connect
- Experience creating ETL pipelines using AWS tools (Lambda, Glue, Redshift, S3)
- Experience with orchestration tools such as Apache Airflow
- Experience working with data visualizations and dashboards
- Experience with JDK 17+ and Spring Boot
- Experience creating and implementing REST APIs in microservice components
- Experience with both relational and non-relational data stores
- Experience with Docker
- Excellent verbal and written communication skills
- Excellent time management and organizational skills
- Ability to keep current and do independent research when needed
- Experience as a Data Engineer working with Java