What is required?
- 5+ years experience in software development
- BS in Computer Science or equivalent practical experience
- Strong SQL knowledge
- Practical experience building data warehouses with Snowflake or other DW-oriented databases (Google BigQuery, Redshift, etc.)
- Practical experience building ETL/ELT data pipelines from scratch or with existing frameworks such as StreamSets Data Collector (SDC), Fivetran, or Apache NiFi
- Practical experience using BI tools like Looker, Tableau, etc.
- Significant practical experience with Java (4+ years)
- AWS knowledge (DevOps-oriented) is a must-have
- Experience with data assurance, including automated data quality validation, cleanup, and optimization
- Strong knowledge of PostgreSQL or other relational database management systems (Oracle, SQL Server, MySQL)
- Experience with NoSQL databases
- Team leadership experience
- Ability to work with Product Management to define priorities
- Excellent analytical and time management skills
- Upper-Intermediate English or higher
What will you do?
- Build data pipelines (ETL flows) to connect different sources of data together
- Build data reports and optimize SQL queries
- Produce high-quality code
- Identify performance bottlenecks
- Communicate with Engineering and Product Management
- Align with existing development teams
- Provide data architecture vision and guidance
- Lead a team
What is the project about?
Our client is a healthcare software company driving a fundamental transformation in the way Pharma companies and Payers conduct business. It is the only tech-enabled market solution that effectively connects health insurance providers ("Payers") and Pharma customers on one unified, standardized, end-to-end platform. Both Pharma and Payer customers save time, reduce errors, and improve financial performance through the product, which is quickly becoming the industry-standard platform.