The ideal candidate will be a proponent of innovation, best practices, sound design, development standards and frameworks, and efficient team structures.
- 10+ years of combined experience in analytics development and architecture.
- Strong understanding of emerging trends in application architecture and development, especially regarding analytics.
- Proven track record of effective leadership and positive results.
- Ability to interact directly with business and IT executives.
Required Experience/Expertise:
- Big-data architecture.
- Hadoop toolsets: Spark, Kafka, Hive, Impala, HBase. Preferred: Cloudera.
- Multiple BI reporting tools.
- Data science using R and/or Python. Preferred: machine learning and AI experience using libraries like TensorFlow and Keras.
- Multi-tier web application development.
- Technical project delivery in a Plan, Build, Run environment.
- On-Shore, Off-Shore, Near-Shore project delivery models.
- Agile development. Preferred: DevOps.
- Experience working with Informatica technologies.
- Bachelor’s Degree in Computer Science or Engineering
Preferred:
- At least 5 years of experience working in Big Data Technologies
- At least 5 years of experience in Data warehousing
- At least 4 years of experience in core Java and its ecosystem
- Strong understanding of and hands-on experience with the Big Data stack (HDFS, Sqoop, Hive, Java, etc.)
- Big Data solution design and architecture
- Design, sizing, and implementation of Big Data platforms based on Cloudera
- Deep understanding of the Cloudera stack (Impala, Spark, Navigator, installation and configuration, etc.)
- Experience extracting data from feeds into a Data Lake using Kafka and other open-source components
- Understanding of and experience in Data ingestion patterns
- Experience in configuring AWS components and managing data flows
- Understanding of storage patterns for S3 vs. local HDFS