HR at D Cube Analytics


D Cube Analytics - Senior Big Data Engineer - Data Pipeline/Distributed Computing (2-5 yrs)

Bangalore | Job Code: 407037

Job Description:


- Designer, builder, and manager of Big Data infrastructures; a solution-focused professional with experience in data engineering, data analysis, and workflow monitoring in distributed computing environments for implementing Big Data solutions.

- Good understanding of the complex processing needs of big data, with experience developing code and modules to address those needs.

- Able to integrate state-of-the-art Cloud/Big Data technologies into the overall architecture and lead a team of developers through construction, testing, and implementation.

- Experience with designing data pipelines

- Experience with Data Quality Management (semantic and syntactic checks); an illustrative sketch follows this list.

- Hands-on experience with SQL.

- Experience building custom JARs with SBT; a minimal build.sbt sketch follows this list.

- AWS Certified Solutions Architect.

- Scaled Agile Framework (SAFe) 4 Certified Practitioner.

- Experience with cloud computing technologies such as AWS (EC2, S3, ECS, Lambda, Redshift, EMR, SQS, SNS, IAM) and Pivotal Cloud Foundry.

- Experience with Big Data technologies such as Hadoop and Python, with knowledge of Spark and Scala.

- Experience with Hadoop components such as HDFS, MapReduce, Hive, Pig, HBase, Impala, Sqoop, Flume, and Oozie.

- Experience with Python scripting, Git, Maven, JUnit, SonarQube, Nexus, Ansible, Docker, microservices, and Jenkins.

- Experience with MapR, Elasticsearch, Fluentd, collectd, Grafana, and Kibana.

- Knowledge of AWS (Amazon Web Services): EC2, Lambda, Auto Scaling, Elastic Load Balancing, Elastic Beanstalk, Virtual Private Cloud, Direct Connect, Route 53, S3, Glacier, Elastic Block Store, Storage Gateway, CloudFront, RDS, DynamoDB, Redshift, Redshift Spectrum, Athena, QuickSight, ElastiCache, CloudWatch, CloudFormation, CloudTrail, EMR, IAM, SNS, SES, SQS.

- Experience implementing Docker.

- Business domain exposure: Telecom; Banking, Financial Services, and Insurance (BFSI); E-Commerce; Manufacturing & Supply Chain; and Healthcare.

- Hands-on experience in design and development using Core Java, Servlets, JSP, Struts, Hibernate, JDBC, WebLogic, WebSphere, DB2, Oracle, and PL/SQL.

- A good team player with excellent analytical and communication skills.
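
Illustrative only (not part of the original role description): a minimal Spark/Scala sketch of the kind of syntactic and semantic data-quality checks mentioned above. The input path and column names (amount, order_date) are assumptions chosen for the example.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DataQualityChecks {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("dq-checks").getOrCreate()

    // Hypothetical input; the path and schema are assumptions for illustration.
    val orders = spark.read.option("header", "true").csv("s3://example-bucket/orders/")

    // Syntactic check: non-null "amount" values must parse as numbers.
    val badAmounts = orders.filter(col("amount").isNotNull && col("amount").cast("double").isNull)

    // Semantic check: order dates must not lie in the future.
    val futureOrders = orders.filter(to_date(col("order_date")) > current_date())

    println(s"rows failing syntactic check: ${badAmounts.count()}")
    println(s"rows failing semantic check: ${futureOrders.count()}")
    spark.stop()
  }
}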
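
Similarly illustrative: a minimal build.sbt sketch for packaging a custom JAR with SBT, as referenced above. The organization, project name, and library versions are assumptions, not requirements of the role.

// build.sbt -- minimal sketch; names and versions are placeholders.
ThisBuild / organization := "com.example"
ThisBuild / scalaVersion := "2.12.18"

lazy val root = (project in file("."))
  .settings(
    name := "custom-etl-lib",
    version := "0.1.0",
    // Spark is typically marked Provided so the cluster supplies it at runtime.
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.2" % Provided
  )

// Running `sbt package` builds the JAR; `sbt publish` pushes it to a configured repository.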

Technical Skills:

- Programming Languages - Python 3, Scala, Core Java

- Cloud Computing - AWS (S3, Redshift, EMR, ECS, EC2, Elastic Load Balancing, Auto Scaling, EBS, VPC, IAM), Pivotal Cloud Foundry

- Automated Build Tools - Jenkins

- DevOps Tools - Git, Nexus, SonarQube, Ansible, Docker

- Big Data Technologies - Hadoop, MapReduce, HDFS, Hive, HBase, Kafka, Sqoop, Spark

- Scripting Languages - Python, Shell

- Frameworks - Hadoop, Log4j, JUnit, MRUnit

- Operating Systems - Linux, Windows

- IDEs - IntelliJ IDEA, Eclipse, PyCharm

- Web Servers - Tomcat

- Application Servers - JBoss, WebLogic, WebSphere

- Databases - Redshift, Teradata, Vertica, Oracle, IBM DB2, MySQL, MSSQL, PostgreSQL

- Version Control Systems - Git, TFS, SVN
