Data Platform Engineer (Remote within US)
Location: United States
Compensation: contact recruiter for details
Equity and benefits included
WHY IS THIS A GREAT OPPORTUNITY?
More people than ever are working from home, and our client's business is booming because their technology powers remote work. They understand how important it is to stay connected, and they walk the walk.
We are looking for an accomplished, enthusiastic, and driven engineer with experience building data processing and storage systems. Our ideal candidate has architected and deployed systems that support multiple small engineering teams with specific needs, and enjoys a large degree of autonomy and ownership over a company's data infrastructure.
Responsibilities include:
- Design and develop data pipelines, ETL, storage solutions, and workflows that are optimized for speed, fault-tolerance, and scalability
- Work with Application, Machine Learning, and Site Reliability/DevOps engineers to create systems that support their varied data needs while allowing for independent manipulation and iteration of data
- Define robust data schemas for the rapid intake and processing of customer data with diverse structures
- Support product-focused engineering teams with data infrastructure, APIs, and scalable deployments
- Architect and author internal libraries for use by fellow engineers
- Help create data analytics tools for software telemetry and business intelligence purposes
- Cultivate a better understanding of data handling best practices across engineering teams
- Collaborate on security efforts for customer data
The requirements include:
- At least 4 years of experience writing code in Ruby, Go, Python, Scala, Elixir, Java, or a similar language at a SaaS company
- Bachelor's or Master's degree
- Strong understanding of relational and non-relational databases such as PostgreSQL, Elasticsearch, and Redis
- Experience creating and deploying container-based software
- Familiarity with asynchronous data processing patterns with an added focus on monitoring and logging
- Prior experience working with AWS or a similar cloud provider
- Ability to communicate ideas to technical and non-technical colleagues
- Ability to organize and model data to support many use cases
- Experience designing, building, and maintaining distributed or event-driven systems
- Experience supporting Machine Learning engineers with data preparation, validation, annotation, and model evaluation
- Previous work with workflow management and/or task scheduling systems
- Prior use of Terraform/Ansible/Infrastructure as Code tools