Lead Data Engineer
Successful Startup/SME Environment
Convenient CBD HQ
About Our Client
Join a successful new player in the digital/cloud transformation space. This organisation has come through the high-risk startup phase and consolidated its position as a key player in this competitive market.
Modern offices and flexible work options are consistent with Melbourne's fast-growing start-up culture. Work with industry-leading executives who can offer career growth and up-skilling opportunities.
Job Description
- Design and implement optimal data pipeline architectures
- Assemble large, complex data sets that meet business requirements
- Identify, design, and implement internal process improvements, including process automation and optimised data delivery
- Design optimal ETL infrastructure drawing on a variety of data sources
- Incorporate governance processes and tools into the data landscape
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with executive, LOB, design and IT stakeholders on data-related technical issues and infrastructure needs
- Keep data separated and secure across national boundaries through replication and failover techniques
- Guide and mentor clients to become self-sufficient practitioners
The Successful Applicant
- A development or operations background (or experience in an existing DataOps team)
- Understand what "done" looks like (documentation, testing, operational requirements, continuous improvement of process)
- Experience with the AWS ecosystem (EC2, EBS, S3, ASGs, ALBs/ELBs, VPCs, etc.)
- Experience with big data tools: Hadoop, Spark, Kafka, Kinesis etc.
- Experience with the following toolsets will be highly regarded: Talend on AWS, Snowflake, and the associated DevOps and configuration activities
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Nifi, Airflow, etc.
- Experience with AWS cloud data services: EMR, RDS, Redshift, Kinesis, Glue
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with SQL and with object-oriented/functional scripting languages: Python, PySpark, Scala, etc.
What's on Offer
- Work for a successful start-up
- Lucrative package
- Centrally located
- Plenty of opportunity to upskill, certify, and grow