
Big Data Architect


Location:

India - Bangalore

Experience:

13 Years - 18 Years

Industry:

Aviation / Consulting

Job Application:

Accepting Applications


Job Description

- Design ETL hubs and ETL architecture for data warehouse/BI implementations.

- Ensure systems meet business requirements and industry best practices.

- Build high-performance algorithms, prototypes, predictive models, and proofs of concept.

- Develop dataset processes for data modeling, mining, and production.

- Collaborate with data architects, modelers and IT team members on project goals.

- Select and integrate the Big Data tools and frameworks required to provide the requested capabilities.

- Implement ETL processes using SQL programming, database design/development, and ETL tools.

- Monitor performance and recommend any necessary infrastructure changes.

- Define data retention policies.

- Work with Hive, Sqoop, Impala and Kudu components of the Hadoop ecosystem.

- Write complex scripts using Python, Linux shell scripting, or Perl.


Ideal Candidate & Qualifications

- Ability to work with huge volumes of data to derive business intelligence.

- Analyze data, uncover information, derive insights, and propose data-driven strategies.

- Solid understanding of database concepts, principles, structures, and best practices.

- Hands-on experience with Hadoop distribution platforms such as Hortonworks, Cloudera, and MapR.

- Full knowledge of Hadoop Architecture and HDFS is a must.

- Good knowledge of data warehousing concepts, business intelligence, data management, and data architecture.

- Comprehensive understanding of Hadoop/MapReduce ecosystem and architecture.

- Experience building stream-processing systems using solutions such as Storm or Spark Streaming.

- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala.

- Experience with Spark and NoSQL databases such as HBase, Cassandra, and MongoDB.

- Knowledge of various ETL techniques and frameworks, such as Flume.

- Experience with various messaging systems, such as Kafka or RabbitMQ.

- Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O.

- Knowledge of Java and web development.

- An analytical bent of mind and the ability to learn, unlearn, and relearn.



