Job Description:
Work Location: Hyderabad
5+ years of experience (relevant experience in Big Data)
Knowledge of Data Modelling and Data Warehouse concepts
Knowledge of Data Governance and Data Security
Cloud Analytics experience, especially on AWS; Google Cloud and Azure are good to have
Minimum of 3 end-to-end implementations of Big Data projects
Hadoop ecosystem, specifically HBase, Hive, and HDFS
Data ingestion and streaming technologies such as Apache Kafka and Apache NiFi
Orchestration tools such as Apache Airflow
Knowledge of EMR cluster configuration, AWS Glue, Athena, QuickSight, Glue Catalog, Lake Formation, Data Pipeline, Step Functions, Lambda, CloudWatch, CloudTrail, Terraform, CloudFormation, S3, VPC, EC2, Database Migration Service (DMS), Amazon Kinesis Data Streams, Kinesis Data Firehose, and Kinesis Data Analytics
Hands-on experience with a data management framework such as Apache Hudi on an EMR cluster
Data Warehouse knowledge of Redshift and Hive
Hands-on programming experience with Spark using either Python or Scala
Must have experience with Data Warehouse migration from on-premises to the cloud
Must have experience in pre-sales support
Should have provided end-to-end solution architecture for Big Data pipeline projects