
Lead Data Engineer

Technologies: SQL
Locations: Buenos Aires, Colombia, Guatemala, Mexico, Mexico City
Departments: Engineering
Hot position?: Hot

On behalf of our client, AgileEngine is looking for a Lead Data Engineer to develop front- and back-end solutions for internal products. The ideal candidate’s prior work experience reflects high engineering standards and proven capabilities, along with the ability to take on important business challenges and execute them with precision and quality.

What is required?

  • 5+ years of experience in software development
  • BS in Computer Science or equivalent practical experience
  • Strong practical experience with SQL and Python (ideally 4+ years) for Data Engineering
  • Experience leading technical teams and collaborating with client-side stakeholders

What will be a plus?

  • Tableau
  • Matillion for automation
  • Experience configuring cron jobs
  • Snowflake
  • AWS
  • VoD (video-on-demand) experience

What you will do:

  • Take ownership of the tasks/tickets to be implemented
  • Lead the team through design and implementation
  • Onboard new developers onto the team
  • Mentor the team and perform code inspections
  • Perform integration testing
  • Be accountable for the quality produced by the team

What we offer:

  • Interesting and challenging tasks
  • Flexible work schedule
  • Zero bureaucracy
  • A friendly, highly skilled team with a great corporate culture and mentorship (visit us and see for yourself)
  • US-style democratic management
  • Opportunities for self-realization and professional and career growth
  • Cool events and team activities
  • Professional workshops and training, a great engineering culture

About the project

The client team has a number of high-priority internal Data Science and Data Engineering tasks and initiatives for its VoD platform that require contractor team support:

  • Take atomic data and transform it into molecular data
  • Apply Data Engineering to transform the data into a usable format for Data Science analysis
  • Update data from tables in the Snowflake environment into aggregated tables on a daily basis (a sketch of this step follows the list)
  • Develop Tableau reports from these aggregated tables
  • Automate existing Python models on SageMaker (AWS) that are currently run manually
  • Either move the Python code written against Snowflake data into SageMaker and automate it, or rewrite it in SQL and automate it on Snowflake using Matillion
  • Potentially develop a web scraping tool that collects publicly available information and stores it in clean Snowflake tables
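
For illustration only, here is a minimal sketch, in Snowflake SQL, of what the daily aggregation step above might look like. All table, column, and warehouse names (raw.playback_events, analytics.daily_title_views, analytics_wh) are hypothetical placeholders rather than details from the client's environment, and the refresh could equally be driven by Matillion or a cron job instead of a native Snowflake task.

    -- Hypothetical example: aggregate atomic playback events into a daily summary table.
    CREATE OR REPLACE TABLE analytics.daily_title_views AS
    SELECT
        event_date,
        title_id,
        COUNT(*)           AS view_count,
        SUM(watch_seconds) AS total_watch_seconds
    FROM raw.playback_events
    GROUP BY event_date, title_id;

    -- Refresh the aggregate once a day with a Snowflake task (cron syntax);
    -- Matillion or an external scheduler could be used instead.
    CREATE OR REPLACE TASK analytics.refresh_daily_title_views
      WAREHOUSE = analytics_wh
      SCHEDULE  = 'USING CRON 0 3 * * * UTC'
    AS
      INSERT OVERWRITE INTO analytics.daily_title_views
      SELECT
          event_date,
          title_id,
          COUNT(*)           AS view_count,
          SUM(watch_seconds) AS total_watch_seconds
      FROM raw.playback_events
      GROUP BY event_date, title_id;

    ALTER TASK analytics.refresh_daily_title_views RESUME;

The same pattern would carry over to the Tableau-facing aggregates: Tableau connects to the aggregated Snowflake tables, so keeping them refreshed on a schedule keeps the reports current.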

Apply Now

Our Geography

  • Washington DC, USA (UTC-5)
  • Miami, USA (UTC-5)
  • Mexico City, Mexico (UTC-6)
  • Bogota, Colombia (UTC-5)
  • Buenos Aires, Argentina (UTC-3)
  • Kyiv, Kharkiv, Odesa, Lviv, and Chernivtsi, Ukraine (UTC+2)
  • Hyderabad, India (UTC+5:30)