Job Title: Platform Engineer
Experience: 7+ years
Work Mode: Remote
Work Time: 5:30-6:00 AM CST to 1:30-2:00 PM CST
Location: Anywhere Nearshore (Costa Rica, Brazil, Colombia, Argentina, Uruguay, Peru, Mexico, Chile)

Overview:
The Data Ops Engineer understands that data is vital for information-driven environments and crucial for business success.
They will collaborate with Data Architects, Data Engineers, and Data Scientists to design solutions that deliver consistent, reliable, and efficient data using best practices and standards.
These efforts result in accurate and trusted data assets for the Enterprise Data Platform and the organization.
Additionally, they will support tools to build the data platform for DevOps processes, enhancing the platform and utilities to improve existing data pipelines and engineer new ones.
Key Responsibilities

Technical Platform Operations
Automate administrative tasks against platform APIs and CLIs using Python and SQL to enhance the data platform.
Collaborate with technical staff to optimize their environments.
Manage cloud platforms such as Snowflake, Databricks, Qlik, Event Hubs, Power BI, and related Azure services.
Develop alerting and notification solutions to monitor cloud platform availability.
Install patches and upgrades as required.
Provide issue resolution and escalate when necessary.
Report on business metrics of the platform, including adoption, goal alignment, and product team needs.
Develop custom DevOps capabilities when built-in solutions are unavailable.
Engage with vendors to learn and implement new platform capabilities.
Set platform standards and measure data pipeline compliance and platform uptime.
Participate in MVP and PoC activities to explore new technological capabilities.
Perform root cause analysis for technical issues.
Implement data quality checks and monitoring systems based on source data gaps and business rules.
Optimize compute clusters to reduce costs and ensure workload isolation.
Provide cost monitoring and visibility tools to track and report cloud platform compute expenses.
Product Development Operations
Communicate effectively with technical teams and stakeholders.
Automate release processes for data applications.
Implement frameworks to support varied data ingestion methods.
Identify and resolve technical and process-related issues.
Document processes and facilitate onboarding for new engineers.
Measure and report cloud platform usage and efficiencies.
Required Competencies & Skills
Experience: 5+ years in data operations, platform engineering, or related fields.
Proficiency in modern data platforms, including Snowflake, Databricks, Power BI, and Qlik Data Integration.
Basic technical knowledge and understanding of data processing.
Understanding and application of data quality principles.
Ability to identify potential technical issues and risks.
Strong collaboration skills across all levels of the data platform and analytics teams.
Experience participating in PoC efforts to research and test emerging technologies.
Hands-on experience working closely with technical personnel.
Proficiency in Python and SQL.
Strong technical problem-solving skills.
Enthusiasm for technology and a desire to advance in a tech-focused career.
Experience with CI/CD and Infrastructure as Code (IaC).
Exposure to platform monitoring tools and Airflow (preferred).
Self-starter mentality with strong communication skills and a passion for learning new technologies.
Team & Work Environment:
Work closely within a small, agile team (7 members) in a fast-paced environment.
Focus on identifying and building tools that enhance user experience and improve platform capabilities.
Engage in a "build-focused" role, prioritizing automation, optimization, and efficiency.