Welcome to Inetz
30 SEATS

COURSE INSTRUCTOR


Bavani

Big Data Expert

A Big Data Masters Program for professionals who seek to deepen their knowledge in the field of Big Data.

BASIC INFORMATION

  • Modules : 4
  • Length : 6 Months
  • Level : Basic
  • Category : Software Training
  • Started : 01-04-2019
  • Shift : 02
  • Class : 120

Big Data Training and Course Description

A Big Data Masters Program for professionals who seek to deepen their knowledge in the field of Big Data. It makes you proficient in the tools and systems used by Big Data experts, with training on Hadoop and Spark, Java essentials, and SQL. The program is aligned with current industry standards and comprises major sub-modules as part of the training process. It is designed by industry experts to provide hands-on training with tools that speed up the learning process.

This program follows a set structure with 4 core courses and makes you an expert in key technologies related to Big Data. At the end of each core course, you will work on a real-time project to gain hands-on expertise. By the end of the program, you will be ready for senior Big Data job roles.

Learning Path Curriculum 

Software professionals working in outdated technologies, as well as Java, Analytics, ETL, Data Warehousing, Testing, and Project Management professionals, can undergo our Hadoop training in Chennai and make a career shift. Our Big Data Masters training gives you the hands-on experience needed to meet industry demands.

Big Data Training and Course Syllabus

Level 1: Apache Hadoop

  • Introduction to Big Data & Hadoop Fundamentals
  • Dimensions of Big Data
  • Types of data generation
  • Apache ecosystem & its projects
  • Hadoop distributors
  • HDFS core concepts
  • Modes of Hadoop deployment
  • HDFS Flow architecture
  • MapReduce MRv1 vs. MRv2 architecture
  • Rack topology
  • HDFS utility commands
  • Minimum hardware requirements for a cluster & property file changes
  • MapReduce Design flow
  • MapReduce Program (Job) execution
  • Types of Input formats & Output Formats
  • MapReduce Datatypes
  • Performance tuning of MapReduce jobs
  • Counters techniques
  • Hive architecture flow
  • Types of Hive tables
  • DML/DDL commands explanation
  • Partitioning logic
  • Bucketing logic
  • Hive script execution in shell & HUE
  • Introduction to Pig concepts
  • Pig modes of execution/storage concepts
  • Pig program logics explanation
  • Pig basic commands
  • Pig script execution in shell/HUE
  • Introduction to HBase concepts
  • Introduction to NoSQL & CAP theorem concepts
  • HBase design/architecture flow
  • HBase table commands
  • Hive + HBase integration module/jars deployment
  • HBase execution in shell/HUE
  • Introduction to Sqoop concepts
  • Sqoop internal design/architecture
  • Sqoop Import statements concepts
  • Sqoop Export Statements concepts
  • Quest Data connectors flow
  • Incremental updating concepts
  • Creating a database in MySQL for importing to HDFS
  • Sqoop commands execution in shell/HUE
  • Introduction to Flume & features
  • Flume topology & core concepts
  • Property file parameters logic
  • Introduction to Hue design
  • Hue architecture flow/UI interface
  • Introduction to ZooKeeper concepts
  • ZooKeeper principles & usage in the Hadoop framework
  • Basics of ZooKeeper
  • Principles of Hadoop administration & its importance
  • Hadoop admin commands explanation
  • Balancer concepts
  • Rolling upgrade mechanism explanation
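
The MapReduce design flow listed above (map → shuffle/sort → reduce) can be sketched in plain Python without a Hadoop cluster. The word-count task and the sample input below are illustrative assumptions, not part of the course material:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def shuffle_phase(pairs):
    # Shuffle/sort: group the emitted pairs by key, as Hadoop does
    # between the map and reduce stages
    for key, group in groupby(sorted(pairs, key=itemgetter(0)),
                              key=itemgetter(0)):
        yield (key, [value for _, value in group])

def reduce_phase(grouped):
    # Reduce: aggregate the values for each key (here, sum the counts)
    for key, values in grouped:
        yield (key, sum(values))

lines = ["big data hadoop", "big data spark"]  # hypothetical input split
counts = dict(reduce_phase(shuffle_phase(map_phase(lines))))
print(counts)  # {'big': 2, 'data': 2, 'hadoop': 1, 'spark': 1}
```

In a real Hadoop job, each phase runs distributed across the cluster and the shuffle moves data between nodes; the structure of the program, however, is the same.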