Designation – Big Data Architect
Location – Mumbai
About employer – Confidential
Description
Qualification and Skills Required
- Bachelor’s Degree in Computer Science, Information Systems, Mathematics, Statistics, Finance, Business, a related field, or equivalent working experience
- 5+ years of working experience with data modeling, SQL, ETL and data warehousing
- Expert in writing SQL scripts
- Proven architecture and infrastructure design experience with big data batch and real-time technologies such as Hadoop, Hive, Pig, HBase, MapReduce, Spark, Storm and Kafka
- Experience with Big Data analytics: ETL, in-stream processing, batch processing, querying, workflows, and workflow/query optimization
- Knowledge of cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce) and considerations for scalable, distributed systems is a plus
- Expert knowledge of an enterprise-class RDBMS
- Experience with enterprise-class Business Intelligence tools such as MicroStrategy, Tableau, Oracle BI, etc.
- Excellent verbal/written communication and data presentation skills, including the ability to succinctly summarize key findings and communicate effectively with both business and technical teams. Ability to balance and prioritize multiple conflicting requirements with high attention to detail.
- Comfortable working in a Linux environment
- Experience with a scripting language such as Python, Perl, Ruby or JavaScript
- Experience with MPP databases such as Redshift
- Knowledge of AWS products and services
- Exposure to predictive/advanced analytics and tools (such as R, SAS, MATLAB)
- Exposure to NoSQL databases (such as DynamoDB, MongoDB)
- Firm understanding of major programming/scripting languages such as Java, Python and R; experience designing solutions for multiple large data warehouses with a good understanding of cluster and parallel architecture
Responsibilities
- Lead architecture, technology selection, and implementation of a data platform built on big data technologies
- Contribute to technical design efforts, ensuring the system design meets scalability and performance requirements
- Provide technical mentorship to junior engineers on new technologies and development techniques
Interested candidates can apply for this job by sending their updated CV to [email protected] with the subject "Big Data Architect – Mumbai" and the following details:
- Total Experience
- Current CTC
- Expected CTC
- Notice Period