- Experienced with the Hadoop NameNode, which stores HDFS file-location metadata and tracks file data across the cluster.
- Expert in implementing projects end to end and in providing architectural guidance, with emphasis on requirements analysis, design, coding, testing, and documentation.
- Good experience in data transformation, data mapping from source to target database schemas, and data cleansing.
- Excellent experience in data mining: querying and mining large datasets to discover transition patterns and examine financial data.
- Strong experience using Excel and MS Access to load data and analyze it based on business needs.
- Strong database experience with Oracle, XML, DB2, Teradata, SQL Server, Big Data, and NoSQL.
- Experienced in generating and documenting metadata while designing OLTP and OLAP system environments.
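The source-to-target mapping and cleansing work described above can be illustrated with a minimal sketch; all field names and cleansing rules here are hypothetical examples, not taken from any actual project:

```python
# Minimal sketch of source-to-target mapping with cleansing rules.
# The source/target field names and code mappings are invented for illustration.

def clean_record(src: dict) -> dict:
    """Map a raw source record onto a target schema, cleansing as we go."""
    return {
        # Rename source columns to the target schema's names and cast types.
        "customer_id": int(src["CUST_ID"]),
        # Trim whitespace and normalize casing.
        "full_name": src["NAME"].strip().title(),
        # Standardize inconsistent codes to a controlled vocabulary.
        "status": {"A": "ACTIVE", "I": "INACTIVE"}.get(
            src["STAT"].strip().upper(), "UNKNOWN"
        ),
    }

raw = {"CUST_ID": "1007", "NAME": "  jane DOE ", "STAT": "a"}
print(clean_record(raw))
```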
- Experience analyzing data using the Hadoop ecosystem, including HDFS, Hive, Elasticsearch, HBase, Pig, Sqoop, and Flume.
- Expert-level understanding of using different databases in combination: extracting data from multiple sources, joining the extracts, and loading the result into a target database.
- Excellent experience troubleshooting test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
- Experience working with Agile and Waterfall data modeling methodologies.
- Experience writing and optimizing SQL queries in Oracle, SQL Server, Netezza, Teradata, and Big Data platforms.
- Strong experience in data migration, data cleansing, transformation, integration, data import, and data export using ETL tools such as Ab Initio and Informatica PowerCenter.
- Expertise in developing transactional enterprise data models that strictly follow normalization rules, as well as enterprise data warehouses using the Kimball and Inmon methodologies.
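The Kimball-style dimensional modeling mentioned above centers on a fact table joined to dimension tables. A minimal star-schema sketch using SQLite (all table, column, and product names are invented for the example):

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);

INSERT INTO dim_date    VALUES (1, 2023), (2, 2024);
INSERT INTO dim_product VALUES (10, 'Widget'), (20, 'Gadget');
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (2, 10, 150.0), (2, 20, 75.0);
""")

# Typical star-schema query: join the fact to its dimensions and aggregate.
cur.execute("""
SELECT d.year, p.name, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d    ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year, p.name
ORDER BY d.year, p.name
""")
print(cur.fetchall())
```

Keeping descriptive attributes in the dimensions and only keys and measures in the fact table is what lets queries like this slice the same measures by any dimension.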
- Able to drill down to the lowest levels of system design and construction.
- 12+ years of experience as a Data Architect/Data Modeler and Data Analyst, with emphasis on data mapping and data validation in data warehousing environments.
- Work on background processes in the Oracle architecture.
- Strong hands-on experience with Teradata utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump, Teradata Manager, and Visual Explain.
- Experience importing and exporting data with Sqoop between HDFS and relational database systems (RDBMS) in both directions.
- Well versed in cloud development using MS Azure (Azure Data Lake, Azure Data Factory, Azure PaaS) and AWS services such as EC2, S3, and Redshift.
- Experience across the Big Data Hadoop ecosystem in ingestion, storage, querying, processing, and analysis of big data.
- Experienced in developing entity-relationship (ER) diagrams and modeling transactional databases and data warehouses using tools such as ERwin, ER/Studio, and PowerDesigner.
- Highly proficient in data modeling for RDBMS using Third Normal Form (3NF) and in multidimensional modeling schemas (star schema, snowflake schema, facts and dimensions).
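The Sqoop-style RDBMS-to-HDFS transfer mentioned above follows a simple pattern: read table rows in batches and write them out as delimited files. A minimal stand-in sketch of that pattern, using SQLite and an in-memory CSV buffer in place of a real RDBMS and HDFS (table and column names are hypothetical):

```python
import csv
import io
import sqlite3

# Stand-in source database (a real Sqoop job would read Oracle, Teradata, etc.).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 20.0), (3, 7.25)])

def export_table(conn, table, out, batch_size=2):
    """Dump a table to a delimited stream in batches, Sqoop-export style."""
    writer = csv.writer(out)
    cur = conn.execute(f"SELECT * FROM {table}")
    while True:
        rows = cur.fetchmany(batch_size)  # fetch in chunks, not all at once
        if not rows:
            break
        writer.writerows(rows)

buf = io.StringIO()  # stand-in for an HDFS file handle
export_table(conn, "orders", buf)
print(buf.getvalue())
```

Batched fetching is the point of the sketch: like Sqoop's split-by-mapper reads, it keeps memory flat no matter how large the source table is.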