You’ll spend most of your time working with a wide variety of clients, using the latest big data technologies and practices to design, build, and maintain scalable, robust solutions that unify, enrich, and analyze data from multiple sources.
Job Roles and Responsibilities:
- Designing, architecting, and developing solutions that leverage cloud big data technology to ingest, process, and analyze large, disparate data sets to exceed business requirements
- Unifying, enriching, and analyzing customer data to derive insights and opportunities
- Leveraging in-house data platforms as needed and recommending and building new data platforms/solutions as required to exceed business requirements
- Clearly communicating findings, recommendations, and opportunities to improve data systems and solutions
- Demonstrating deep understanding of big data technology, concepts, tools, features, functions and benefits of different approaches
- Seeking out information to learn about emerging methodologies and technologies
- Clarifying problems by driving to understand the true issue
- Looking for opportunities for improving methods and outcomes
- Applying a data-driven approach (KPIs) to tie technology solutions to specific business outcomes
- Collaborating, influencing and building consensus through constructive relationships and effective listening
- Solving problems by incorporating data into decision making
Qualifications and Experience:
- A bachelor’s degree and approximately three years of related work experience, or a master’s degree and approximately two years of related work experience
- At least three years of hands-on experience with various cloud and big data technologies
- At least two years of experience implementing, automating, and integrating big data infrastructure resources such as S3, Redshift, Aurora, Kinesis, Kafka, EMR, Lambda, SNS, Azure Blob Storage, SQL Data Warehouse, Event Hubs, HDInsight, Azure Databricks, Azure Functions, Event Grid, and Data Lake Analytics in an ephemeral/transient and elastic manner
- IaC & config management: tools like Chef, Puppet, CloudFormation, Terraform, Ansible, boto3, and/or their Azure/GCP equivalents
- Hands-on experience with core operating systems such as Linux (RHEL, Ubuntu), including system administration tasks and shell scripting
- Network engineering/administration (VPCs, subnets, security groups, VPC endpoints, NAT/route tables, etc.)
- Experience with container technologies like Docker and Kubernetes
- Security tools/concepts such as encryption at rest and in transit, IAM, and key and certificate management
- CI/CD pipeline management using version control tools like Git/Bitbucket and build/quality tools like Jenkins and SonarQube
- Strong communication skills: the ability to listen, understand the question, and develop and deliver clear insights
- Outstanding team player.
- Independent and able to manage and prioritize workload.
- Ability to quickly and positively adapt to change.
- A valid driver’s license in the US; willingness and ability to travel to meet client needs.
- Bachelor’s degree or above in mathematics, information systems, statistics, computer science, or a related discipline