
·      6+ years of experience and demonstrated strength in data modeling, data warehousing, and ETL (Extract, Transform, and Load) development, with a clear understanding of the differences between ETL and ELT and the rationale for each

·      4+ years of experience designing and using databases such as MySQL, MS SQL, MongoDB, or other professional-grade databases

·      3+ years of experience designing for and using MongoDB or another document-based database

·      3+ years of experience using a broad range of AWS technologies (e.g., EC2, EMR, S3, Lake Formation, Redshift, VPC, Glacier, IAM, CloudWatch, SQS, Lambda, CloudTrail, Systems Manager, KMS, Kinesis Streams)

·      Excellent analytical, problem-solving, and troubleshooting skills

·      Expertise in managing the data lifecycle and the lifecycle of technical data solutions

·      Experience in documenting requirements

·      Ability to work cross-organizationally as one team to drive innovation and business results

·      Ability to work in teams to clarify requirements, quickly identify problems, and collaboratively find solutions

·      Experience working in partnership with internal and external vendors

·      Excellent communication skills, effective across varying organizational levels and skill sets, with the ability to translate between technical and non-technical concepts

Preferred additional experience:

·      Prior experience building and operating highly available, distributed systems for the storage, extraction, ingestion, and processing of large, complex data sets

·      Prior experience with vendor-specific solutions such as Confluent, Cloudera, or Snowflake

·      Working knowledge of common data analyst and data science workflows, business intelligence tools, and AI/ML modeling

·      3+ years of experience with data streaming technologies such as Kafka

·      4+ years of experience implementing data-driven solutions using tools such as Tableau, Hadoop, Impala, Hive, NiFi, Prometheus, Spark, Athena, Redshift, Elasticsearch, Bigtable, or Airflow

·      4+ years of experience developing data solutions in Python

·      Strong communication skills