Data Engineer - Disney Streaming - Forest Knolls, CA

Job Summary:
We are seeking a Data Engineer who will partner with business, analytics, and engineering teams to design, build, and maintain easy-to-use data structures that facilitate reporting and monitoring of key performance indicators.
Collaborating across disciplines, you will identify internal and external data sources, design table structures, define ETL strategies and automated QA checks, and implement scalable ETL solutions.
Responsibilities:
Partner with technical and non-technical colleagues to understand data and reporting requirements.
Work with engineering teams to collect required data from internal and external systems.
Design table structures and define ETL strategies to build performant data solutions that are reliable and scalable in a fast-growing data ecosystem.
Develop Data Quality checks for source and target data sets.
Develop UAT plans and conduct QA.
Develop and maintain ETL routines using ETL and orchestration tools such as Airflow, Luigi and Jenkins.
Document and publish Metadata and table designs to facilitate data adoption.
Perform ad hoc analysis as necessary.
Perform SQL and ETL tuning as necessary.
Develop and maintain dashboards/reports using Tableau and Looker.
Coordinate and resolve escalated production support incidents as part of a Tier 2 support rotation.
Create runbooks and actionable alerts as part of the development process.
Basic Qualifications:
2+ years of relevant professional experience.
1+ years of work experience implementing and reporting on business key performance indicators in data warehousing environments.
Strong understanding of data modeling principles, including dimensional modeling and data normalization.
1+ years of experience using analytic SQL, working with traditional relational databases and/or distributed systems such as Hadoop/Hive, BigQuery, or Redshift.
Experience with programming languages (e.g., Python, R, Bash) preferred.
1+ years of experience with workflow management tools (Airflow, Oozie, Azkaban, UC4).
Good understanding of SQL engines and the ability to conduct advanced performance tuning.
Experience with the Hadoop (or similar) ecosystem (MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase).
Familiarity with data exploration/data visualization tools like Tableau, Looker, Chartio, etc.
Ability to think strategically, analyze and interpret market and consumer information.
Strong communication skills, both written and in verbal presentations.
Excellent conceptual and analytical reasoning competencies.
Degree in an analytical field such as economics, mathematics, or computer science is desired.
Comfortable working in a fast-paced and highly collaborative environment.
Process-oriented with great documentation skills.
Experience with Datorama is a plus.
Estimated Salary: $20 to $28 per hour based on qualifications.
