Remote Job Opportunity: Make a Positive Impact as an Analytics Engineer

Are you an analytics engineer looking to take the next step in your career?

Are you looking for a company where dbt is central to the data workflow?

Are you looking for a company where remote work is built into the company DNA, not coronavirus-induced?

Do you want to work at a company where the leadership is engaged with BI and sees it as a strategic priority?

Do you want a central role in crafting the BI strategy and building a data team?

Do you want to reach millions of people and make a positive impact on their lives?

This job might be for you if you:

  • like being able to set your own hours and work from home
  • like exercising your creativity and experimenting
  • like having responsibility
  • like working collaboratively
  • like having a dependable, reliable stream of work
  • want to make the world a better place
  • are comfortable with a fast-changing environment.

You should NOT take this job if you:

  • have a strong need/desire for in-person social interaction at work
  • dislike asynchronous communication
  • like following instructions and being told what to do
  • don’t like needing to come up with ideas
  • don’t have a real interest or experience in online education.

What’s the company?
FluentU is an online education company that helps people learn languages with real-world videos, including movie trailers, music videos, news and inspiring talks. We have a website, iOS app (usually among the top 50 grossing iOS education apps), and Android app. Founded in 2011, we’re a profitable, stable company with a long-term focus, and we’re proudly self-funded. And we’ve been remote/distributed since day one.

We get 5 million visitors per month on our blogs, have 100,000+ people on our email list, and reach many more through web and mobile notifications.

This is a unique opportunity to play a pivotal role in our business intelligence strategy, which is still in its early stages. Until now we’ve worked with a reputable consultancy, and you’ll be our first dedicated hire focused specifically on BI data engineering, empowered to build the program from the ground up.

What’s the role?
As our first analytics engineer, you will be responsible for:

  • working in an agile/kanban-style methodology while balancing long-term data modeling & infrastructure planning
  • interpreting & executing analytics feature requests from senior stakeholders
  • planning & documenting work to be done, with regular feedback to stakeholders to minimize unnecessary work
  • maintaining & improving the entire data pipeline from start to finish, from extracting data from SaaS APIs to configuring display options and creating charts for end users in Looker, e.g.:
    • using data-extractor-as-a-service tools such as Stitch to extract data on a scheduled basis & overseeing automated loading behavior
    • creating & maintaining proprietary extraction tools, such as Bash scripts on Google Cloud instances
    • managing the dbt ETL pipeline
    • acting as Looker admin (development, security & administration)
  • using software engineering best practices (such as version control, component-based architecture, DRY code, etc.)
  • implementing testing frameworks & subjecting all work to QA

You would work closely with the founder of FluentU and our other leadership.

HOW WE WORK

We’re a 100% distributed/remote team. Here’s a little bit more about how we work:

  • Almost all of our communication is text-based (mostly via Asana), and we place a premium on clear writing (https://app.tettra.co/teams/fluentu/pages/communication-guidelines).
  • Most things are not urgent. We take pride in having a calm work environment.
  • We also have a flat, collaborative environment.
  • We make decisions based on logic/reason.
  • We believe in getting things done and continuous improvement.

QUALIFICATIONS

Our ideal candidate:

  • can work in a fast-paced environment and be responsive to new requirements
  • is terrific at written communication
  • can explain technical concepts to a non-technical audience
  • understands the basic principles of content marketing
  • is comfortable managing their own time and workflow independently
  • can juggle multiple projects & work streams concurrently
  • has most or all of the following technical skills:
    • experience working collaboratively using Git
    • strong SQL skills (we use BigQuery Standard SQL)
    • an understanding of data modelling concepts, à la Kimball dimensional modelling
    • an exceptional understanding of data manipulation (i.e. joins, data granularity, referential integrity, uniqueness/primary/foreign keys, etc.)
    • some knowledge of SQL database performance
    • Looker, dbt & Google Cloud knowledge (easily learned if you possess the above skills)
    • ideally, Linux/cloud architecture & Python skills, though these are not mandatory.
  • has a deep interest in language learning or online education.
  • is able to work at least 25 hours per week.

HOW TO APPLY

Please click here and fill out the form.
