Top 5%
Abhinay R.

Data Engineer

Nearshore outsourcing
14 years
Guadalajara, Mexico

Why I'm Top 5%

  • Superior technical and people skills
  • English language proficiency
  • 14 years of industry experience

My experience


Pentalog, February 2021 - Present

Data Engineer within a digital services platform dedicated to helping companies access world-class software engineering and product talent.
  • Developing and maintaining a data warehouse in Snowflake, ingesting data in various formats from cloud services such as AWS S3, Azure, and GCP using Snowflake stored procedures and Snowpipes.
  • Cleaning data and exposing it downstream as views and materialized views, consumed by analytics engineers and Looker for data analysis.
  • Created data-quality and ingestion-failure scripts in Python that alert when incoming data is malformed.
  • Worked with orchestration tools such as Dagster and Airflow to run ingestion pipelines.
  • Used Stitch integrations to ingest third-party data as well.
  • Troubleshooting, bug fixing, and adding new features in SQL.
  • Maintaining documentation for all pipelines and work done.
  • Using GitHub for CI/CD.
  • Maintaining security and correct permissions for the various stakeholders accessing the data.
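For illustration, a minimal sketch of the kind of malformed-data alerting described above, assuming newline-delimited JSON input; the required field names and the alert callback are hypothetical placeholders, not the actual production code:

```python
import json

# Hypothetical required schema for incoming records (illustrative only).
REQUIRED_FIELDS = {"id", "event_time", "amount"}

def check_records(lines, alert):
    """Parse newline-delimited JSON records; call `alert` for malformed ones,
    return only the records that pass validation."""
    good = []
    for lineno, line in enumerate(lines, start=1):
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            alert(f"line {lineno}: not valid JSON")
            continue
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            alert(f"line {lineno}: missing fields {sorted(missing)}")
            continue
        good.append(record)
    return good

problems = []
rows = ['{"id": 1, "event_time": "2021-01-01", "amount": 9.5}',
        '{"id": 2}',
        'not json at all']
valid = check_records(rows, problems.append)
```

In practice the alert callback would post to email or a chat channel rather than append to a list.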

Python, Azure, GCP, Snowflake, SQL, AWS S3, GitHub, CI/CD

Globant, July 2019 - January 2021

Data Architect within a digitally native company where innovation, design and engineering meet scale.

Project: Adobe Analytics
Client: Autodesk.
Job Done At: Globant, Mexico (CDMX)
Period: July 2019 onward.
Description: Data modeling, cleansing, and manipulation of data from S3, migrated to new tables for business analysis.
Role: Senior Data Architect
  • Developed a data model and then scripted in PySpark and Spark SQL against S3 to support data cleansing and migration for the business analysts.
  • Worked on another project involving Google BigQuery, AWS EC2, AWS Lambda, and Snowflake.
Solution Environment: ADP
Tools: Qubole notebook, PyCharm
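The actual cleansing work above used PySpark and Spark SQL; as an illustration only, here is the same shape of logic (normalize fields, drop incomplete rows and duplicates) in plain Python, with made-up field names:

```python
def clean(rows):
    """Drop incomplete rows and duplicates, normalize the name field.
    Field names (customer_id, order_id, customer_name) are illustrative."""
    seen = set()
    out = []
    for row in rows:
        key = (row.get("customer_id"), row.get("order_id"))
        if None in key or key in seen:
            continue  # skip rows missing their key, and duplicate keys
        seen.add(key)
        out.append({**row,
                    "customer_name": row.get("customer_name", "").strip().title()})
    return out

raw = [
    {"customer_id": 1, "order_id": "A", "customer_name": "  ada LOVELACE "},
    {"customer_id": 1, "order_id": "A", "customer_name": "Ada Lovelace"},  # duplicate
    {"customer_id": None, "order_id": "B", "customer_name": "x"},          # incomplete
]
cleaned = clean(raw)
```

In Spark the same steps would typically be a `dropna`, `dropDuplicates`, and a column expression over a DataFrame read from S3.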

Cisco, August 2016 - July 2019

Customer Support Engineer for a company that enables people to make powerful connections--whether in business, education, philanthropy, or creativity.


1. Cisco Internal Initiative
Client: Cisco.
Job Done At: CISCO –Mexico (CDMX)
Period: April-2018 to 2019.
Description: Automated report generation for HTOMs; developed several bots for different projects.
Role: Developer (Python + MongoDB)
  • Pulled the essential data and fields required for reports from SR case notes, stored them in MongoDB, and applied report-generation logic per the HTOMs' requirements.
  • Developed a project-specific automated on-call shift schedule for all engineers.
  • Developed a bot to send case alert messages to CSEs via Spark room and email.
  • Wrote a Python script to automate XLSX report data generation for Cisco HTOMs.
Solution Environment: Linux/Windows
Tools: Robo 3T, PyCharm
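A minimal sketch of an automated on-call rotation like the one described above; the engineer names, weekly shift length, and start date are made-up placeholders, not the actual schedule:

```python
from datetime import date, timedelta
from itertools import cycle

def build_rotation(engineers, start, weeks):
    """Assign one engineer per week, round-robin.
    Returns a list of (shift_start, shift_end, engineer) tuples."""
    schedule = []
    rota = cycle(engineers)
    for week in range(weeks):
        begin = start + timedelta(weeks=week)
        schedule.append((begin, begin + timedelta(days=6), next(rota)))
    return schedule

# Hypothetical team and start date, four weekly shifts.
shifts = build_rotation(["alice", "bob", "carol"], date(2018, 4, 2), 4)
```

The real tool would also have written the result into the reporting store (MongoDB in this case) rather than just returning it.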

2. Tetration
Client: Global Clients.
Job Done At: CISCO –Mexico (CDMX)
Period: 1 August 2016 to 15 April 2018.
Description: Managed a Cisco big-data product on customers' internal infrastructure.
Role: Application Support / TL (Tier-1), Customer Support Executive.
  • Worked as a production support executive, supporting the Cisco product on customer infrastructure and providing solutions and troubleshooting.
  • Wrote automation code in Python to parse huge log files for troubleshooting.
Solution Environment: Linux/Windows/Sensors
Tools: Tetration UI explorer for troubleshooting, Grafana
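An illustrative sketch of the log-parsing automation mentioned above; the log line format and the "count ERROR messages" summary are assumptions for the example, not the actual Tetration log schema:

```python
import re
from collections import Counter

# Assumed line format: "<date> <time> <LEVEL> <message>".
LINE_RE = re.compile(r"^(?P<ts>\S+ \S+) (?P<level>[A-Z]+) (?P<msg>.*)$")

def summarize_errors(lines):
    """Count distinct ERROR messages so the noisiest failures surface first."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("msg")] += 1
    return counts.most_common()

log = [
    "2018-01-05 10:00:01 INFO sensor heartbeat ok",
    "2018-01-05 10:00:02 ERROR sensor timeout",
    "2018-01-05 10:00:03 ERROR sensor timeout",
    "2018-01-05 10:00:04 ERROR disk full",
]
top = summarize_errors(log)
```

For genuinely huge files, the same loop would read the file lazily line by line instead of holding it in memory.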

Global HITSS, September 2015 - June 2016

IT professional within a Digital Solutions and IT Services company with 30 years of experience in the market.

Project: HITSS Internal Projects
Client: HITSS
Job Done At: HITSS –Mexico (Guadalajara).
Period: 1-Sep-2015 to 30 June 2016.
Description: Worked on HITSS internal projects and trained local Mexican staff on Hadoop big data.
Role: Hadoop Developer (BI Architect).
Solution Environment: Hadoop, Apache Pig, Hive, SQOOP, Java, UNIX, MySQL.

TATA Consultancy Services Limited, September 2007 - September 2014

Hadoop Developer within a global leader in IT services, consulting & business solutions with a large network of innovation & delivery centers.


1. Target – Web Intelligence.
Client: Target Minneapolis, Minnesota, USA.
Period: 1-May 2013 to 30-Sep-2014.
Description: Re-hosting Target's existing project onto the Hadoop platform. Previously, Target used a MySQL database to store information about competitor retailers (the crawled web data). Initially, Target tracked only 4 competitor retailers.
Role: Hadoop Developer.
  • Moved the crawl-data flat files generated from the various retailers to HDFS for further processing.
  • Wrote Apache Pig scripts to process the HDFS data.
  • Created Hive tables to store the processed results in tabular format.
  • Developed Sqoop scripts to handle the interaction between Pig and the MySQL database.
  • Wrote scripts that integrate all components to process data and load it into HDFS.
  • Developed UNIX shell scripts for creating reports from Hive data.
  • Created external Hive tables on top of the parsed data.
Solution Environment: Hadoop, Apache Pig, Hive, SQOOP, Java, UNIX, MySQL.
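The pipeline above used Apache Pig for the processing step; purely as an illustration, here is the equivalent GROUP BY / COUNT shape in plain Python, with made-up crawl-record fields (the real data and grouping keys are not shown in this profile):

```python
from collections import defaultdict

def group_counts(records):
    """Plain-Python analogue of a Pig 'GROUP records BY (retailer, category)'
    followed by COUNT(*)."""
    counts = defaultdict(int)
    for rec in records:
        counts[(rec["retailer"], rec["category"])] += 1
    return dict(counts)

# Hypothetical crawled competitor records.
crawl = [
    {"retailer": "r1", "category": "toys"},
    {"retailer": "r1", "category": "toys"},
    {"retailer": "r2", "category": "home"},
]
result = group_counts(crawl)
```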

2. Bank Of America
Client: Bank Of America.
Period: 1-Sep-2012 to 30-April-2013.
Description: Developed and managed an application for stock options that BoA manages for its 500 clients; also provided fixes for production issues. Led a nearshore team of 10 people to meet the high expectations of the BoA client.
Role: Application Developer / TL (Tier-3); also handled TCS internal managerial work.
  • Worked as team leader, driving the team to deliver maximum throughput to the client; served as developer and designer on this project.
  • Extensive Tier-3 experience in application development on Oracle.
Solution Environment: Unix, Oracle, HP Quality Center.
Tools: Toad, PL/SQL Developer, Unix, TFS.

3. AT&T
Client: AMDOCS
Period: Sep 2011 – Aug 2012.
Description: Supported a big data warehouse for multiple applications from different vendors. Engaged in Tier-1 production support, monitoring and fixing application failures within a minimal time frame to prevent SLA breaches. Created, managed, and closed tickets using AOTS Remedy.
Role: Production executive (Tier-1/Operation)
  • Worked as team leader, driving the team to deliver maximum throughput to the client; served as a production support executive on this project.
  • Tier-1 experience in production support.
Solution Environment: Unix, Teradata, Tivoli Workload Scheduler, AOTS Remedy (AT&T One Ticketing System)
Tools: Tivoli Workload Scheduler, Informatica, ESP workstation, Teradata.

4. EIP
Client: British Telecommunication
Job Done At: TCS –Kolkata
Period: Oct 2010 to August 2011.
Description: BT is implementing reusable components to avoid rewriting similar code across different LOBs, e.g. XML parsing, table-to-table loads, flat-file loading, logging, and housekeeping.
Role: Developer
  • Led the team to deliver maximum throughput to the client.
  • Worked as a developer on this project.
  • Developed efficient, simple code following the project design.
  • Handled XML parsing, flat-file loading, housekeeping, and data warehousing using OWB.
Solution Environment: Oracle 10g database, PL/SQL, XML
Tools: PL/SQL Developer, SVN, SQL Developer, PUTTY

5. Virtual Data Center Automation
Client: British Telecommunication
Job Done At: TCS –Kolkata
Period: Jan 2009 to Sep 2010
Description: BT has also stepped into cloud computing and virtualization. This project is part of the VDC Automation Program, in which the end customer requests a Virtual Data Center through an online portal. The system processes the request fully and sends implementation details downstream. It is called the "Heart of VDC" because it controls and decides the implementation details for the downstream systems.
Role: Developer
  • Worked as Developer for this project.
  • Developed efficient, simple code following the project design (handling CLOBs across different environments).
  • On my own initiative, automated the entire service-order testing by writing a PL/SQL script for the pre-data setup (large amounts of raw data required for testing).
  • Used QTP to automate service-order (SO) validation and firing without any manual intervention.
Solution Environment: Oracle 10g database, PL/SQL, XML
Tools: PL/SQL Developer, SVN, SQL Developer, QTP

6. BT-Broad Band Customer Relationship (BBCR)
Client: British Telecommunication
Job Done At: TCS –Kolkata
Period: April-2008 to Dec-2008
Description: BT's business is migrating from 20CN to 21CN. The netview migration is part of its integration plan in line with 21CN.
Role: Developer
  • I was associated with the netview migration team as a developer and was involved in many development activities (some migrations were quite complex in nature).
Solution Environment: HP Unix, Oracle 9i database, PL/SQL
Tools: TOAD, PL/SQL Developer

7. K.M.C (Domestic)
Client: Kolkata Municipal Corporation
Job Done At: TCS –Kolkata
Period: Nov-2007 to March-2008
Description: Various types of reports (invoices and bills) were developed per client requirements for different departments in Oracle Report Builder.
Role: Developer
  • Developed various types of invoices per client requirements using Oracle Report Builder.
Solution Environment: Oracle 9i database, PL/SQL 
Tools: Oracle report builder

My education and training

Bachelor of Technology in Information Technology, Cochin University of Science & Technology, 2003 - 2007