Abhishek Agrawal

Mentor
Rising Codementor
US$10.00
For every 15 mins
ABOUT ME
Experienced Azure Data Engineer | 8+ years | B.Tech (NIT) | Azure Cloud tools expert.

I'm an Azure Data Engineer with 8+ years of experience and a proven ability to deliver short- and long-term projects in data engineering, data warehousing, machine learning, and business intelligence. My passion is partnering with clients to deliver top-notch, scalable data solutions that provide immediate and lasting value. I completed my engineering degree (B.Tech) at NIT Raipur.

I specialize in the following data solutions:

✔️ Building end-to-end ETL pipelines using Azure Cloud tools.

✔️ Building migration processes from Hadoop clusters to Azure Databricks Spark clusters.

✔️ Building data warehouses using modern cloud platforms and technologies.

✔️ Creating and automating data pipelines & ETL processes.

✔️ Building highly intuitive, interactive dashboards.

✔️ Data cleaning and processing, and building machine learning models.

✔️ Data strategy advisory & technology selection/recommendation.

Technologies I most frequently work with are:

☁️ Cloud: Azure

☁️ Cloud Tools: Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure Data Lake, Azure Analysis Services, Azure DevOps, Azure Key Vault, Azure Active Directory.

💬 Languages: SQL, Python, PySpark, Spark SQL, R, SAS, Dash.

👨‍💻 Databases: SQL Server, Azure Synapse, Azure SQL Database

⚙️ Data Integration/ETL: SAP HANA, Dynamics 365, EPM Onyx, QAD

📊 BI/Visualization: Power BI, Excel

🤖 Machine Learning: Jupyter Notebook, Python, Pandas, NumPy, Statistics, Probability.

Berlin (+02:00)
Joined May 2022
EXPERTISE
8 years experience

REVIEWS FROM CLIENTS

Abhishek's profile has been carefully vetted and approved as a Codementor.
EMPLOYMENTS
Azure Data Engineer
ALDI SUD Germany
2022-02-01 – Present
ETL PIPELINE FOR END-TO-END SPECIALS USE CASE SQUAD
1) Spearheading the migration from a Hadoop cluster to an Azure Databricks Spark cluster.
2) Ensuring optimal code performance in the Spark cluster and continually looking for ways to improve it.
3) Industrialising the code base to facilitate seamless scaling.
4) Leading the set-up of the CI/CD process for both Azure Data Factory and Azure Databricks.
5) Collaborating closely with data scientists to provide them with accurate data for deriving actionable business insights.

RETAIL HUB DASHBOARD
1) Developed a dynamic Spark notebook to pivot the web-crawling data efficiently.
2) Integrated the Azure Synapse warehouse and Data Lake with ADF and created a parameterised pipeline in Data Factory for an end-to-end ETL process.
3) Designed and developed warehouse objects and stored procedures for streamlined data processing.
4) Developed a Logic App to send a daily email.
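The RETAIL HUB work pivots long-format web-crawl records into one wide row per product. A minimal pure-Python sketch of that pivot logic (the real job ran as a Spark notebook; the column names here are illustrative, not taken from the project):

```python
from collections import defaultdict

def pivot_long_to_wide(rows, key="product", attr="attribute", val="value"):
    """Pivot long-format records into one wide record per key.

    rows: iterable of dicts like {"product": ..., "attribute": ..., "value": ...}
    Returns a list of wide dicts, one per distinct key value.
    """
    wide = defaultdict(dict)
    for row in rows:
        # Each (attribute, value) pair becomes a column on the key's wide row.
        wide[row[key]][row[attr]] = row[val]
    return [{key: k, **attrs} for k, attrs in wide.items()]

# Illustrative crawl data: three long rows collapse into two wide rows.
crawl = [
    {"product": "P1", "attribute": "price", "value": "9.99"},
    {"product": "P1", "attribute": "brand", "value": "Acme"},
    {"product": "P2", "attribute": "price", "value": "4.50"},
]
print(pivot_long_to_wide(crawl))
```

In PySpark the equivalent is `df.groupBy("product").pivot("attribute")` with an aggregation over the value column.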
SQL
Apache Spark
Azure SQL
Databricks
Azure Data Factory
Azure Data Engineer
Azure Data Engineer
Smiths Detection
2021-08-01 – 2022-12-01
EPM ONYX BUSINESS REPORTING
1) Gathered requirements from stakeholders for business reporting.
2) Prepared a technical & scope analysis document for the data models, including facts & dimensions mapping.
3) Fetched data from a multidimensional source system (EPM) via API through Azure Data Factory to build the ETL pipeline.
4) Built full-load and delta-load pipelines in Azure Data Factory.
5) Stored the raw data in Azure Data Lake in a date-time folder structure.
6) Wrote transformation logic in PySpark and Spark SQL in Databricks to convert the raw data into facts and dimensions.
7) Did data modelling in Erwin Studio and implemented a semantic data model in Azure Analysis Services for various Power BI reports/dashboards.
8) Created a technical incident/challenge document for effective communication across the team.

MLT O2C DASHBOARD
1) Added a new report to the existing order-to-cash dashboard to track Market Lead Time (MLT).
2) Read the uploaded data from Azure Blob Storage into Databricks through a Logic App.
3) Made the required transformations in Databricks using SQL and Python and wrote the transformed data to the curated layer of Azure Data Lake Gen2.
4) Read the data into Azure Synapse, did data modelling in ER Studio, and implemented the data models in Azure Analysis Services for Power BI visualisation.

DISASTER RECOVERY AND CI/CD PIPELINE
1) Took backups of Azure Databricks, Azure Data Factory, and Azure Analysis Services using the Databricks command line (CLI), ARM templates, SSMS, and PowerShell.
2) Designed a CI/CD pipeline using Azure DevOps to automate the build and release processes across environments for Azure Data Factory (ADF), Azure Synapse (data warehouse), and Azure Analysis Services.
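The full-load/delta-load split above is typically driven by a watermark: a full load copies the whole table, while a delta load copies only rows changed since the last successful run. A hedged sketch of the source-query construction an ADF copy activity would use (table and column names are invented for illustration):

```python
from datetime import datetime

def build_extract_query(table, load_type, watermark_col=None, last_watermark=None):
    """Build the source query for a copy activity.

    load_type "full" extracts the whole table; "delta" extracts only rows
    whose watermark column is newer than the last recorded high-water mark.
    """
    if load_type == "full":
        return f"SELECT * FROM {table}"
    if load_type == "delta":
        if not (watermark_col and last_watermark):
            raise ValueError("delta load needs a watermark column and value")
        return (f"SELECT * FROM {table} "
                f"WHERE {watermark_col} > '{last_watermark.isoformat()}'")
    raise ValueError(f"unknown load type: {load_type}")

# Delta run picking up everything changed after the last successful load.
print(build_extract_query("sales_orders", "delta",
                          "modified_at", datetime(2022, 1, 31)))
```

After a successful delta run, the pipeline records the maximum `modified_at` it copied as the new watermark for the next run.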
SQL
Apache Spark
Azure SQL
Databricks
Azure blob storage
Azure Data Factory
Azure Data Engineer
Syniti
2018-11-01 – 2021-03-01
AUTOMOTIVE MIS DASHBOARD REPORTING USING AZURE
1) Built a data-integration ETL pipeline from the SAP S/4HANA and Dynamics 365 source systems and moved data from on-premise to the cloud with Azure Data Factory.
2) Created a date-hierarchy partition in Azure Data Lake Storage Gen2 to store the data in multiple layers.
3) Transformed data using Azure Synapse (Azure data warehouse): created schemas, facts, and dimensions in MS SQL to organise and populate the data in table structures for the various source systems with stored procedures, and created a semantic layer for data modelling in Azure Analysis Services to build and deploy multiple interactive analytics dashboards and reports in Power BI.
4) Handled the Dynamics 365 system end to end, from source integration to building reports in Power BI.

KPI REPORTING USING AZURE
1) Migrated data from an on-premise server to Azure Data Lake using Azure Data Factory and processed CSV, JSON, and XML files using Scala, PySpark, and Spark SQL in Databricks.
2) Wrote the processed files to Azure SQL and Delta Lake using Databricks and moved them to an archive container.
3) Built a dashboard on top of the Delta table and orchestrated the whole ETL process with Azure Data Factory.

AUTOMATED HYPERANALYTICS PLATFORM
1) Developed an internal data-science platform using Python and ML algorithms that automates various data-science processes, reducing the computational complexity and time required to solve a given problem.
2) Used Apache Airflow, the PyOD library, Facets, the PixieDust library, AutoKeras, H2O, and ML algorithms for overall automation.
3) Used the Flask module to develop the backend machine learning framework.
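The date-hierarchy partition mentioned above lands each day's raw extract under year/month/day folders, which keeps daily reloads idempotent and lets downstream jobs prune by date. A small sketch of the path convention (container and source names are illustrative, not from the project):

```python
from datetime import date

def raw_path(container, source, load_date):
    """Build the date-hierarchy folder path for landing a raw extract,
    e.g. raw/dynamics365/2021/03/15/. Zero-padded month/day keeps
    lexicographic order equal to chronological order."""
    return (f"{container}/{source}/"
            f"{load_date:%Y}/{load_date:%m}/{load_date:%d}/")

print(raw_path("raw", "dynamics365", date(2021, 3, 15)))
```

Re-running the load for a given day simply overwrites that day's folder, leaving all other partitions untouched.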
Azure SQL
Databricks
Azure blob storage
Azure Data Factory
Azure Data Engineer
PROJECTS
Dashboarding using Azure Synapse
2023
1) Orchestrated the secure migration of data from on-premise servers to Azure Data Lake using Synapse Analytics pipelines.
2) Executed data transformations using a Synapse Spark pool, efficiently writing data to Azure Data Lake.
3) Moved data from Azure Data Lake into an Azure Synapse dedicated pool, optimising data organisation through stored procedures and SQL views for streamlined report generation.
4) Integrated the tables into Power BI, enhancing comprehensive reporting capabilities.
Azure
Azure SQL
Databricks
Azure blob storage
Azure Data Factory
Azure Data Engineer
Reporting Using Azure Databricks
2022
1) Moved data from on-premise servers to Azure Data Lake using Azure Data Factory.
2) Processed CSV, JSON, and XML files using PySpark and Spark SQL in Databricks.
3) Wrote the processed files to Azure SQL via the JDBC connector and to Delta Lake within Databricks.
4) Managed the transition of processed data to the archive container in Databricks.
5) Developed a comprehensive dashboard on the Delta table and orchestrated the entire ETL process using Azure Data Factory.
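The archive step above moves files out of the landing folder once they have been loaded, so reruns never reprocess them. A sketch of that move, with a plain dict standing in for blob storage (on Databricks the same rename would go through `dbutils.fs.mv`; the folder names are illustrative):

```python
def archive_processed(store, prefix="landing/", archive_prefix="archive/"):
    """Move every processed file under `prefix` to the archive container.

    store: dict mapping blob paths to contents, standing in for storage.
    Returns a dict of {old_path: new_path} for the files that were moved.
    """
    moved = {}
    for path in list(store):          # list() so we can mutate while iterating
        if path.startswith(prefix):
            new_path = archive_prefix + path[len(prefix):]
            store[new_path] = store.pop(path)
            moved[path] = new_path
    return moved

store = {"landing/orders.csv": b"...", "landing/items.json": b"..."}
archive_processed(store)
print(sorted(store))
```

Because the landing folder is empty after the move, the next pipeline run only sees genuinely new files.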
Azure
Databricks
Azure Data Factory