Employer: Deep 6 AI
Background:
Different levels of experience welcome.

Required:
- Strong object-oriented programming skills, with fluency in Java, Scala, or Python.
- Experience in the Hadoop ecosystem, ideally with MapReduce and/or Spark.
- Experience with different types of databases and their applications.
- Comfortable working in a Linux terminal (command-line interface).
- Experience with data modeling, particularly with Avro, Thrift, Protobuf, etc.

Nice-to-haves:
- Experience with HIPAA compliance.
- Experience working with AWS or equivalent.
- Knowledge of natural language processing and/or machine learning.
- HBase or similar non-relational experience.
- Strong understanding of issues in distributed, eventually-consistent environments.
- Experience working with Electronic Health Records.
- Knowledge of medicine, genomics, etc.

Responsibilities:
- Build data pipelines that are robust and fault-tolerant; scalable, handling anywhere from kilobytes to petabytes of data; efficient; and secure and HIPAA-compliant where necessary.
- Work with data scientists to deploy machine learning algorithms in a distributed computing environment.
- Design and deploy standardized data models capable of representing complex medical data from multiple sources.

The above statements describe the general nature and level of work performed in this job function. They're not intended to be an exhaustive list of all duties, and additional responsibilities may be assigned by Deep 6.

At Deep 6, we appreciate the opportunity to benefit from the diverse backgrounds and experiences of others. Because of our deep commitment to respecting every individual, Deep 6 is an equal opportunity employer.