ETL Developers are responsible for importing, exporting, and processing data in a way that can be used by their organization. This includes developing new programs to collect data from multiple sources, updating existing programs to work with the latest technology, and developing ways to access stored data. The top skills for this position are business acumen, critical thinking, teamwork, excellent analytical skills, and proficiency with computer software. Most employers require a bachelor’s degree in computer science or a related field.
The Best ETL Resume Samples
These are some examples of accomplishments we have handpicked from real ETL resumes for your reference.
- Led business meetings and converted business rules into technical specifications.
- Involved in requirements gathering, business analysis, design, development, testing, and implementation of business rules.
- Used Informatica PowerCenter 9.1 for extraction, transformation, and loading (ETL) of data in the data warehouse.
- Extensively worked with various Passive transformations in Informatica PowerCenter like Expression Transformation, Sequence Generator, and Lookup Transformation.
- Streamlined internet financial data to improve the company’s ability to meet regulatory reporting and investor relations requirements.
- Good knowledge of Tableau installation and configuration on a single-server system.
- Worked on technical strategy, provided hardware and software configuration, oversaw the client-wide BI implementation project, and designed and developed visualizations.
- Created batch scripts to automate weekly Tableau Server backups and save them to a designated folder.
- Brainstormed ideas for views and dashboards from the data source and documented on-the-fly business requirements.
- Implemented a new technique to recover lost data from crashed drives; worked with the technical support and training departments to develop a comprehensive recovery procedure.
- Worked with external vendors to understand marketing needs and identify the source data for mapping the requirements.
- Set up the SFTP connection between external vendors and the Honeywell security team to transfer reports securely.
- Pulled data for Connected Home reports using both the Informatica and Pentaho ETL tools.
- Trained co-workers on TBS procedures; increased efficiency of workflow management system in the customer service department by eliminating duplicated tasks.
- Diagnosed and defined Reporting ETL improvements that saved the company $25,000+ per month; supported the change management team in the implementation of the solution.
- Extensively used Informatica Client tools – Power Center Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Extracted data from various heterogeneous sources such as Oracle, SQL Server, and flat files.
- Responsible for developing the ETL (Extract, Transform and Load) processes using Informatica Power Center.
- Authored standard work instructions for Data Warehouse automation; devised scripts that improved processing time and reduced overall cost-per-transaction.
- Implemented program logic to produce databases with over 10GB of data.
- Worked closely with DBAs to develop a dimensional model using Erwin and created the physical model using forward engineering.
- Designed, built, and unit-tested parallel jobs to extract data from the Oracle Enterprise stage and flat files.
- Used DataStage stages including Join, Change Data Capture, Lookup, Transformer, Aggregator, Sequential File, Sort, Filter, Data Set, Remove Duplicates, and Surrogate Key.
- Used DataStage Administrator to define environment variables and project-level settings.
- Utilized Job Sequence stages such as User Variable Activity, Notification Activity, Routine Activity, and Terminator Activity.
- Designed and developed SSIS packages to extract data from transactional database systems and load it into the data warehouse using SQL Server Integration Services (SSIS).
- Troubleshot and analyzed ETL process failures, data anomalies, and other ETL or data warehouse issues identified by automated monitoring, other developers, and end-users.
- Developed ad-hoc reports for clients using SQL Server Reporting Services (SSRS).
- Worked with customers to define and document the scope and functional requirements for data and reporting needs.
- Trained fellow team members in software development practices and methodologies.
- Understood and analyzed requirements documents and explained them to the team whenever and wherever necessary.
- Designed and created a detailed technical mapping document with information on the implementation of business logic.
- Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Provided leadership for a team of five developers, facilitating progress across multiple projects.
- Determined needed adjustments to existing software by working in client offices occasionally for one week at a time.
- Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Talend Integration Suite.
- Tested the ETL process both before and after data validation.
- Created and executed test cases for ETL jobs to upload master data to the repository.
- Uploaded master and transactional data from flat files and prepared test cases for subsystem testing.
- Used applicable technologies to develop innovative solutions by researching new products and techniques; provided substantial business value by developing 40 reports utilizing Microsoft SQL Server 2008 R2 that reduced development cycle time by 25%.
- Experienced in full life cycle MDM development including data analysis, database design, data mapping, and data load in batch.
- Estimated the Informatica development effort individually for all new development work.
- Created various MicroStrategy objects such as metrics, filters, prompts, custom groups, and consolidations, and created drill maps to obtain aggregated results.
- Tailored the product for each client, providing customizations to the data warehouse, ETL mappings, OBIEE Answers, dashboards, and BI Publisher.
- Assisted with the development of several database applications, including a data management system, dynamic reporting, and Master Data Management.
- Formulated and defined scope and objectives through data analysis and fact-finding to develop or modify moderately complex Extract, Transform, Load (ETL) streams.
- Extracted data from different source systems (risk systems, CDMS (customer data warehouse), CDR (credit derivatives), and PSE (pre-settlement exposure)) per the SLA and loaded it into the staging table as-is.
- Converted data from different source systems into a common format, applied enrichments and derivations, and derived calculated balances such as unused committed amounts.
- Generated exception reports on source feeds according to the data validation rules provided by risk architecture.
- Functioned as an expert consultant in the development and implementation of an enterprise resource management system suite.
- Collaborated with business analysts and the DBA on requirements gathering, business analysis, and the design of data marts.
- Analyzed the logical model of the databases and normalized it when necessary.
- Involved in the design and development of business requirements in liaison with business users and technical teams by gathering requirement specification documents and identifying data sources and targets.
- Served as an essential part of a team that developed an integrated database system using open-source components.
- Developed a Middle-Tier data connection program used by numerous clients to tie together business applications in multiple locations.
- Designed and developed mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows, and handled backups and recovery.
- Extensively worked with database and Informatica Partitioning for performance tuning.
- Extensively designed and developed reusable transformations & aggregations and created target mappings that contain business rules.
- Implemented Change Data Capture (CDC) to extract information and partitioned sessions for concurrent loading of data into the target tables.
- Researched and implemented additional database enhancements to improve efficiency and security of client data.
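Many of the bullets above describe the core extract-transform-load cycle: pulling rows from flat files or relational sources, applying business rules, and loading a staging table. For readers who want to see the pattern concretely, here is a minimal sketch in Python; the file layout, the business rule, and the `staging` table name are illustrative assumptions, not taken from any of the resumes.

```python
import csv
import io
import sqlite3

# A hypothetical flat-file source, standing in for the heterogeneous
# sources (Oracle, SQL Server, flat files) mentioned above.
RAW_CSV = """id,name,amount
1,alpha,100.5
2,beta,-3
3,gamma,42
"""

def extract(text):
    """Extract: parse rows out of a flat file."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and apply a sample business rule
    (negative amounts are floored at zero)."""
    out = []
    for row in rows:
        out.append({
            "id": int(row["id"]),
            "name": row["name"].upper(),
            "amount": max(0.0, float(row["amount"])),
        })
    return out

def load(rows, conn):
    """Load: write the transformed rows into a staging table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staging (id INTEGER, name TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO staging (id, name, amount) VALUES (:id, :name, :amount)",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM staging").fetchone()
```

In a production pipeline each stage would of course be far richer (connection pooling, incremental loads, logging), but the three-function shape is the same one tools like Informatica and SSIS impose.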
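Several bullets mention generating exception reports from data validation rules. A simple way to structure that is to run each rule against each row and emit one exception record per failure; the rule names and fields below are hypothetical, not drawn from any actual risk-architecture specification.

```python
# Hypothetical validation rules, keyed by a descriptive name.
RULES = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "name_present": lambda r: bool(r["name"].strip()),
}

def exception_report(rows):
    """Return one exception record per (row, failed rule) pair."""
    exceptions = []
    for i, row in enumerate(rows):
        for rule_name, check in RULES.items():
            if not check(row):
                exceptions.append({"row": i, "rule": rule_name, "value": row})
    return exceptions

feed = [
    {"name": "alpha", "amount": 10.0},
    {"name": "", "amount": -5.0},  # fails both rules
]
report = exception_report(feed)
```

Keeping the rules as named predicates makes the report self-describing, so downstream consumers can see exactly which rule a source feed violated.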
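One bullet cites Change Data Capture (CDC). Stripped of any specific tool, CDC amounts to classifying rows as inserts, updates, or deletes by comparing the current extract against a prior snapshot. This sketch keys rows by a hypothetical `id` field and is not tied to Informatica's or DataStage's implementation.

```python
def change_data_capture(previous, current):
    """Compare a prior snapshot to the current extract (both dicts
    keyed by primary key) and classify each row's change type."""
    changes = {"insert": [], "update": [], "delete": []}
    for key, row in current.items():
        if key not in previous:
            changes["insert"].append(row)       # new key
        elif row != previous[key]:
            changes["update"].append(row)       # key exists, row changed
    for key, row in previous.items():
        if key not in current:
            changes["delete"].append(row)       # key vanished
    return changes

prev = {1: {"id": 1, "amount": 10}, 2: {"id": 2, "amount": 20}}
curr = {1: {"id": 1, "amount": 10},
        2: {"id": 2, "amount": 25},
        3: {"id": 3, "amount": 30}}
changes = change_data_capture(prev, curr)
```

Real CDC tools usually read database logs rather than diffing snapshots, but the classification they produce is exactly this insert/update/delete partition.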
Resumes are a crucial aspect of any job search. In order to make a good first impression, it is important that your resume be formatted and written professionally.
We hope these samples gave you an idea of what your resume should look like, along with some tips on how to make sure it stands out from the rest.