
Data Cloud Engineer (DataOps)

  • Remote
  • IT

Job description

About Trafilea

Trafilea is a Consumer Tech Platform for Transformative Brand Growth. We’re building the AI Growth Engine that powers the next generation of consumer brands.

With $1B+ in cumulative revenue, 12M+ customers, and 500+ talents across 19 countries, we combine technology, growth marketing, and operational excellence to scale purpose-driven, digitally native brands.

We own and operate our own digitally native brands (not an agency), with a presence at Walmart, Nordstrom, and Amazon, and a strong global D2C footprint.

Why Trafilea

We’re a tech-led eCommerce group scaling our own globally loved DTC brands, while helping ambitious talent grow just as fast.

🚀 We build and scale our own brands.

🦾 We invest in AI and automation like few others in eCom.

📈 We test fast, grow fast, and help you do the same.

🤝 Be part of a dynamic, diverse, and talented global team.

🌍 100% Remote, competitive USD salary, paid time off, and more.

Key Responsibilities

The mission of the Data Cloud Engineer is to design, implement, and maintain scalable, secure, and cost-effective cloud infrastructure and data pipelines on AWS. This role combines DevOps and DataOps practices to ensure seamless infrastructure automation, data management, and deployment workflows. The engineer will collaborate cross-functionally with data, BI, and development teams to support critical business needs while proactively identifying improvements in automation, security, and cost efficiency.

  • Design, deploy, and optimize AWS cloud infrastructure (S3, Redshift, Lambda, ECS) for scalability and performance.

  • Build and maintain data architectures and ETL pipelines supporting BI and analytics.

  • Implement Infrastructure as Code (Terraform, Terragrunt) and manage CI/CD pipelines for automation.

  • Use Docker and Kubernetes to enable containerized, reliable deployments.

  • Ensure cloud security, compliance, and cost efficiency across environments.

  • Collaborate with engineering and data teams to resolve issues and enhance workflows.

  • Document best practices and share expertise to strengthen the team’s capabilities.

Job requirements

  • 3+ years of experience as a Cloud, DevOps, or Data Engineer.

  • Deep expertise with AWS services (S3, Redshift, Glue, Lambda, EMR, ECS, RDS, DynamoDB).

  • Strong background in data architecture, ETL design, and automation.

  • Proficiency in Terraform/Terragrunt for IaC and CI/CD pipeline management.

  • Knowledge of Docker, Kubernetes, and container orchestration.

  • Understanding of cloud security frameworks and compliance standards.

  • Solid scripting skills in Python or Bash.

  • Strong communication, collaboration, and problem-solving skills.

  • A mindset of ownership, curiosity, and continuous learning.
