- BS/MS in Computer Science, Engineering, or a related technical field.
- 7+ years of software development/programming experience with enterprise cloud-based data applications.
- 5 years of experience in data modeling, data design, and persistence (e.g., warehousing, data marts, data lakes).
- Experience supporting big data workloads and Hadoop.
- Exposure to functional, imperative, and object-oriented languages and methodologies.
- Experience with big data approaches and technologies, including Hadoop, Cloudera utilities, Spark, Kafka, Hive, and Oozie.
- Experience with SQL (SQL Server, MySQL, Postgres) and NoSQL (Cosmos DB, MongoDB, HBase) databases is expected.
- Distributed systems experience (4+ years desired).
- Knowledge of design patterns and technologies that enable solving business problems at scale.
- Great communication skills to collaborate cross-group and work effectively within the team.
- Exposure to cloud technology stacks from Microsoft, Amazon, or Google.
- Industry experience as a Data Engineer or related specialty (e.g., Software Engineer, Business Intelligence Engineer, Data Scientist) with a track record of manipulating, processing, and extracting value from large datasets.
- 5+ years of experience building and operating highly available, distributed systems for extracting, ingesting, and processing large datasets.
- Experience developing and testing computer software and/or online services.
- Strong coding, debugging, and problem-solving skills.
- Strong knowledge of object-oriented programming paradigms.