Job title: Data Engineer
Job type: Permanent
Emp type: Full-time
Industry: Marketing
Expertise: Data Engineer, Fullstack Developer
Skills: Data Engineer, AWS, GCP, Terraform, Airflow
Salary type: Annual
Salary: AUD $147,825.00
Location: CBD / Work from Home
Job published: 26/11/2020
Job ID: 33195

Job Description


Data Engineer


About us

Our client was born 11 years ago from an unmet need in the marketplace: large IT brands were unable to market effectively to and through their channel partners. Recognising the importance of consistent messaging, thought leadership content, and a simple, effective platform, they set out to fill that gap.

Today, the largest IT brands in the world leverage our client's platform to reach over 3 million IT buyers and decision makers every month. We help bridge the gap between companies like Lenovo, HP, HPE, Dell, Cisco, Samsung and Microsoft, and the reseller community. Through digital marketing, innovative content, promotions, and insights we develop relationships that help keep IT partners informed and engaged with their customers. As a fast-growing technology services company with offices in Sydney, Austin (TX), Manila, and London, we celebrate creative problem solving and an outcomes-focused culture.

Our Values

As a purpose and values-led organisation, we believe in:

Make it win:win - We succeed when our customers do, so we're dedicated to making it happen.

Challenge the norm - There's always a better way to do something. It just takes a fresh approach.

Step up and own it - We are a proactive, empowering business. Our team step up and own their projects.

Stand in others' shoes - Whether clients or their customers, we engage with people by being in tune with their needs.

About the Role

Our client is looking for a Data Engineer to join our development team, innovating on our core B2B marketing platform. Reporting to our CTO, this role offers you the opportunity to make your mark, leading the way as we rebuild our data management platform, mature our analytics capabilities and discover valuable insights about our customers.

In particular, you will:

● Be responsible for building scalable data pipelines from a variety of data sources, turning them into data assets for analytical use by product, sales and operations teams.

● Organise and process data (structured and unstructured) from our data lake and data warehouse.

● Build robust infrastructure and tools to ingest data from multiple sources using Cloud (AWS and GCP) data technologies.

● Automate data pipeline and reporting processes.

● Work closely with Engineering, Product Management and Business Operations to design and implement data analytics solutions that inform our product roadmap and help drive operational efficiencies.

● Identify new ways to tackle data engineering problems with emerging and leading-edge technologies.

● Develop data expertise and own data quality for the awesome pipelines you build.

About You

To be considered for this position, you will have experience with:

● Advanced SQL knowledge and experience working with relational databases (ideally MySQL and/or PostgreSQL); working familiarity with NoSQL DBMSs (DynamoDB preferred); and an understanding of their strengths/limitations when selecting which database solution is fit for purpose.

● Building ETL/ELT data pipelines utilising cloud-based (AWS or GCP) data warehousing technologies.

● Data modelling and data warehouse design.

● Extensive Python experience, specifically with statistical/mathematical libraries, concurrency and parallelism.

● Cloud serverless function development: Java, Python, Node.js, etc.

● Interacting with APIs and webhooks to load data into a database/data warehouse.

● Experience with at least one source code management tool (GitLab preferred).

● Excellent communication skills. You will collaborate with teams across the business, identify the right problems to solve, and articulate your solutions to technical and non-technical peers.

In addition, experience in four or more of the following areas would be highly regarded:

● Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, code reviews, source code management, build processes, testing, and operations.

● AWS and/or GCP certifications.

● Hands-on experience with Apache Airflow.

● Business Intelligence data visualisation tools - we use DOMO.

● Experience with Infrastructure as Code in AWS/GCP (preferably Terraform).

● Performance optimisation.

● Software reliability engineering principles.

● Experience in working on a greenfield project.

● Experience in digital marketing and/or email campaign management.

● Hands-on or theoretical experience in Data Analysis / Data Science.

Benefits

If you are successful in this role, you will join a fun and creative work environment based in the heart of the Sydney CBD and will be offered:

● A competitive remuneration package

● Flexible working conditions, e.g. flexible start/finish times and work-from-home options

● Regular work-sponsored social events