Ashwini G A

Bangalore, Karnataka

Summary

Big Data Engineer with 5 years of experience in building and optimizing large-scale data processing systems using the Hadoop ecosystem. Experienced in managing complex data pipelines and delivering actionable insights. Strong team player with a focus on efficient, data-driven solutions.

Overview

6 years of professional experience
1 Certification

Work History

Software Engineer 2

Epsilon India
08.2022 - Current

Responsibilities:

  • Defined and enhanced internal processes through data mining, database changes and SQL queries.
  • Optimized existing PL/SQL routines using performance tuning techniques such as indexing, query optimization, and partitioning strategies.
  • Participated in regular code reviews for consistent quality assurance across the team's work output.
  • Enhanced data visualization capabilities by designing and implementing interactive Tableau dashboards.
  • Developed pipelines in Airflow to schedule SQL stored procedures, Spark jobs, Hive jobs, and shell scripts.
  • Enhanced application functionality by developing custom stored procedures, triggers, and functions using SQL or PL/SQL as required.
  • Optimized deployments by automating repetitive tasks using scripting languages like Python or Bash for increased operational efficiency.
  • Reduced project delivery time by efficiently managing tasks using JIRA, prioritizing urgent issues, and maintaining open communication with the team.
  • Improved code maintainability by adhering to coding standards and performing regular refactoring exercises.
  • Enhanced software quality by implementing Agile methodologies and participating in daily stand-up meetings.

Application Consultant

IBM India Private Limited
02.2020 - 07.2022

Project Description:

  • Project ADVAIT mainly deals with transforming data received from various sources into meaningful business insights.
  • Data received from various sources in various formats is loaded into an RDBMS and moved to HDP, where transformations are performed; the transformed data is loaded into Hive tables and exported to generate reports and dashboards that provide meaningful business insights for stakeholders.

Responsibilities:

  • Prepared Sqoop jobs to import data to HDFS and export from HDFS.
  • Processed jobs using Spark, Python, and PySpark.
  • Extensively worked on partitioning tables, bucketing, joins and DDL statements in Hive.
  • Involved in writing shell scripts in Linux for job execution.
  • Developed complex SQL queries on DB2 to generate reports.
  • Developed end-to-end Oozie workflow to facilitate job flows.
  • Documented the functionalities developed in the application.
  • Handled development and support operations on the existing systems.

Project Engineer

Wipro Limited
07.2018 - 05.2019

Project Description:

  • Marsh is a global professional services firm with operations in insurance broking and risk management.
  • This project mainly deals with extracting, transforming, and loading data into tables through Informatica mappings, and then generating reports by writing queries in the Business Objects tool.

Responsibilities:

  • Involved in writing SQL scripts for generating reports from the BO tool.
  • Actively worked on creating and managing DDL and DML statements in SQL using Teradata.
  • Worked with the Business Objects (BO) tool to create, share, and store reports by developing pipelines and mapping queries.

Education

Bachelor of Engineering - Information Science And Engineering

Sapthagiri College of Engineering, Bangalore, Karnataka, India
06.2018

Higher Secondary Education - PCMC

Jindal Girls P.U. College
Bangalore, Karnataka, India
04.2014

Skills

  • Programming Languages: Python, SQL, PL/SQL, Shell Scripting, Spark, Scala, PySpark
  • Hadoop Technologies: HDFS, MapReduce, Hive, Sqoop, Hortonworks, Cloudera, Big Data
  • Scheduling: Airflow, Oozie
  • Databases: MySQL, Teradata, IBM DB2, Greenplum
  • Platforms: Windows, Linux, Unix
  • Visualization tools: Tableau, Power BI
  • IDE: Eclipse, Jupyter Notebook, Visual Studio Code, PyCharm
  • Development tools: Jira, GitHub, Bitbucket
  • Development methodologies: Scrum, Agile Modelling, Project Documentation, Test-Driven Development, Code Review Practices, Clean Coding, Documenting SDLC

Certification

  • Microsoft Certified: Azure Fundamentals
  • Microsoft Certified: Azure Data Fundamentals
  • Neo4j Professional


Languages

  • English: Proficient (C2)
  • Hindi: Proficient (C2)
  • Kannada: Proficient (C2)
  • German: Beginner (A1)
