View job on Handshake

Employer: Quantiphi

Expires: 07/31/2021

Responsibilities:

- Build extract, transform, and load (ETL) logic to automate data collection and manage data processes/pipelines, including data quality and monitoring
- Contribute to the development of data frameworks on cloud
- Write and review technical documents, including requirements and design documents for existing and future data systems, as well as data standards and policies
- Architect data pipelines
- Collaborate with analysts, support/system engineers, and business stakeholders to ensure our data infrastructure meets constantly evolving requirements

Requirements:

- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience
- Highly proficient in Java, with good knowledge of its ecosystem and a solid understanding of object-oriented programming
- Hands-on experience with Dataflow, BigQuery, Cloud SQL, BigTable, and Datastore
- Experience with data processing software (such as Hadoop, Spark, Pig, Hive) and with data processing algorithms (MapReduce, Flume)
- Experience writing software in one or more languages such as Java, C++, Python, Go, and/or JavaScript
- Experience managing internal or client-facing projects to completion; experience troubleshooting clients' technical issues; experience working with engineering teams, sales, services, and customers
- Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytics tools and environments
- Experience in technical consulting
- Experience architecting, developing software, or building internet-scale, production-grade data solutions on cloud
- Experience working with big data, information retrieval, data mining, or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies (such as NoSQL, MongoDB, SparkML, TensorFlow)