This position requires a BA/BS in Computer Science, Information Systems, Information Technology, or a related field, with 7+ years of prior experience in software development, Data Engineering, and Business Intelligence, OR equivalent experience.
The following are some of the key skills that you must have:
7+ years of strong programming background with Java/Python/Scala
3+ years of experience working on Data Integration projects using Hadoop MapReduce, Sqoop, Oozie, Hive, Spark, and other related Big Data technologies
2+ years of experience on AWS, preferably leveraging services such as Lambda, S3, Redshift, and Glue
Some working experience building Kafka-based data ingestion/retrieval programs
Experience tuning Hadoop/Spark/Hive parameters for optimal performance
Strong SQL query writing and data analysis skills
Good shell scripting experience
Rigor in high code quality, automated testing, and other engineering best practices; ability to write reusable code components
Nice-to-have skills:
Healthcare experience
Cloudera Developer certification