
Cloud Engineer (DevOps)

  • Remote
  • IT

Job description

About Trafilea

Trafilea is a Consumer Tech Platform for Transformative Brand Growth. We’re building the AI Growth Engine that powers the next generation of consumer brands.

With $1B+ in cumulative revenue, 12M+ customers, and 500+ talents across 19 countries, we combine technology, growth marketing, and operational excellence to scale purpose-driven, digitally native brands.

We own and operate digitally native brands (we are not an agency), with a presence at Walmart, Nordstrom, and Amazon and a strong global D2C footprint.

Why Trafilea

We’re a tech-led eCommerce group scaling our own globally loved DTC brands, while helping ambitious talent grow just as fast.

🚀 We build and scale our own brands.

🦾 We invest in AI and automation like few others in eCom.

📈 We test fast, grow fast, and help you do the same.

🤝 Be part of a dynamic, diverse, and talented global team.

🌍 100% remote, competitive USD salary, paid time off, and more.

Key Responsibilities

We are seeking a Data Cloud Engineer (DevOps + DataOps) to design, build, and maintain our AWS-based cloud infrastructure and data pipelines. You’ll ensure scalability, security, and efficiency across mission-critical systems while driving automation and modern DevOps practices. This is a hands-on role with the opportunity to own infrastructure projects end-to-end, mentor peers, and introduce best practices that power data-driven decision-making at global scale.

  • Cloud Infrastructure (AWS): Design, deploy, and optimize AWS environments (S3, RDS, Redshift, DynamoDB, Glue, EMR, Lambda, ECS, etc.).

  • Data Pipelines: Build and manage robust ETL workflows and scalable data architectures (an illustrative sketch of this kind of work follows this list).

  • Automation & DevOps: Implement Infrastructure as Code (Terraform/Terragrunt) and integrate CI/CD pipelines.

  • Containerization: Deploy applications with Docker and Kubernetes.

  • Security & Compliance: Apply cloud security best practices across all environments.

  • Collaboration: Partner with BI, data, and development teams for seamless deployments.

  • Troubleshooting: Diagnose and resolve cloud and data-related issues with agility.

  • Knowledge Sharing: Contribute to team learning and mentor junior engineers.
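
To give a flavor of the pipeline and automation work described above, here is a minimal Python/boto3 sketch that checks an S3 landing prefix for new objects and starts an AWS Glue ETL job. It is illustrative only: the bucket, prefix, and job names are hypothetical placeholders, not actual Trafilea resources, and a production version would add logging, retries, and scheduling (for example via Lambda or a CI/CD pipeline).

```python
"""Illustrative sketch: trigger a Glue ETL job when new data lands in S3.

Bucket, prefix, and job names below are hypothetical placeholders.
"""
import boto3

S3_BUCKET = "example-raw-data"   # hypothetical landing bucket
S3_PREFIX = "incoming/"          # hypothetical landing prefix
GLUE_JOB = "example-etl-job"     # hypothetical Glue job name

s3 = boto3.client("s3")
glue = boto3.client("glue")


def new_objects_exist(bucket: str, prefix: str) -> bool:
    """Return True if the landing prefix contains at least one object."""
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix, MaxKeys=1)
    return resp.get("KeyCount", 0) > 0


def trigger_etl(job_name: str) -> str:
    """Start the Glue job and return its run id."""
    run = glue.start_job_run(JobName=job_name)
    return run["JobRunId"]


if __name__ == "__main__":
    if new_objects_exist(S3_BUCKET, S3_PREFIX):
        print("Started Glue run:", trigger_etl(GLUE_JOB))
    else:
        print("No new data to process.")
```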

Job requirements

  • Strong experience with AWS services (S3, Redshift, Glue, EMR, Lambda, etc.).

  • Proficiency in data architecture and ETL tools.

  • Hands-on expertise in Terraform/Terragrunt and CI/CD pipelines.

  • Familiarity with Docker/Kubernetes for containerized deployment.

  • Skilled in Python, Bash, or similar scripting languages.

  • Problem-solver with strong analytical and troubleshooting ability.

  • Excellent communicator and team collaborator.

  • Proactive, ownership-driven, and eager to continuously learn.
