A Data Architect solves data problems for organizations. They design the solution, plan how to execute it, and then carry out that plan, choosing the right facilities and tools for the problem at hand.
How to Become a Successful Data Architect?
Data Architects should hold a college degree in computer science, computer engineering, or a related field. A data architect must have knowledge of application design, system development, and information management.
To become a Data Architect, you typically need 3-5 years of experience as a programmer or database programmer. Experience is key when solving problems as a data architect.
A Data Architect should pursue additional certifications to further their learning. For many senior positions in data architecture, a master’s degree in computer science or data science is preferred.
Database developers/programmers make sure that their systems can handle large volumes of data. They work with the software development team, provide solutions when database performance issues arise, and protect the database with appropriate security measures.
The Best Data Architect Resume Samples
These are some examples of job descriptions we have handpicked from real Data Architect resumes for your reference.
- Responsible for Master Data Management (MDM) and Data Lake design and architecture.
- Data Lake is built using Cloudera Hadoop and Talend ETL.
- Informatica MDM is the technology used for master data management.
- Presented architecture, design, and technology impact to management for MDM, the Data Lake, and the broader data architecture landscape.
- Oversee entire data lifecycle from acquisition to reporting through implementation of various ETL techniques and data management methodologies.
- Responsible for implementing data quality constraints to ensure accuracy and completeness.
- Create and maintain user profiles on SQL Server to ensure implementation of various user accesses.
- Create Power BI dashboards and SSRS reports of various types: parameterized, linked, snapshot, cached, ad hoc, click-through, drill-through, and sub-reports.
- Responsible for managing all the production deliverables (deployment document, Work Order, Change Order).
- Follow best practices and RDBMS concepts when writing SQL queries and maintaining the ETL tools.
- Created a C# .NET tool for invoking InRule entities in an SSIS component and passing the rules through the InRule engine.
- Responsible for modeling and building enterprise data marts for different business reporting needs.
- Built an EDW from Nextgen and Athena for clinical and financial reporting needs.
- Responsible for analyzing massive and highly complex data sets, performing ad-hoc analysis and data manipulation.
- Responsible for analyzing data in different systems such as Cerner, Meditech, Epic, Allscripts, PBAR, MedHost, and First Net, and building data marts for the different business reporting needs.
- Supported the analysis of SCRA remediation efforts by providing data profiling during early milestones of the analysis.
- Aided in the creation of critical data sets as inputs to the process.
- Performed analytics on large data sets creating summarized reports for a management audience.
- Contributed to the logical design and data modeling of the architecture that comprised the data structures of the project.
- Work with the management team on the implementation of data warehouse design and data integration.
- Provide 24/7 support for the administration and maintenance of SQL Server, SSRS, and SSIS servers.
- Refresh SQL Server database test and dev environments using database engine.
- Diagnose and resolve database performance issues using system tables, the SQL Server Database Tuning Advisor, DBCC commands, built-in functions, and stored procedures.
- Created enterprise data models for the call center and dispatch for onsite equipment repair.
- Designed data architecture for employee work schedules and appointments.
- Developed data models for business services required by multiple applications.
- Provided a progressive vision for data service modeling that enabled the project team to keep its schedule.
- Designed and developed the process to migrate data from multiple source systems to the Master Data Hub (MDH).
- Identified, compared and resolved data quality problems using T-SQL in MS SQL Server.
- Measured and audited large volumes of varying data for data quality issues.
- Extensively analyzed the data for data quality dimensions and worked with BI team to develop data quality dashboards using Microstrategy.
- Wrote stored procedures in SQL Server db using Visual Studio to analyze the data between source systems and target system (BMC Remedy).
- Conducted various meetings with business users and stakeholders for requirements gathering.
- Extensively involved in analyzing various data formats using industry-standard tools and effectively communicating the findings to business users and SMEs.
- Implemented data visualization techniques and forecast the results with the business to help them make better decisions.
- Identified metadata properties and mapped them back to the true source.
- Designed conceptual, logical, physical data models and proof of design for enterprise data management.
- Worked, in a senior role, as part of the architectural team to design a new system, the Legal Document Repository (LDR).
- The LDR will be the Credit Suisse (CS) repository to store all Legal Agreements between CS and its Counterparties.
- The design involved the accommodation of an interface between CS and a vendor, who will be using their proprietary product.
- This will be a replacement of an existing application, so ensuring that all existing functionality is maintained is a business imperative.
- Extensive data mapping of all data that is being captured and its subsequent migration to downstream consumers was a critical deliverable.
- Converted ad-hoc/manual reporting system into high performance reporting system using SSAS OLAP cubes.
- Worked closely with the Servers/Network/VM ESX/Storage teams to configure high-availability production database servers.
- Scheduled and monitored all maintenance activities of SQL Server 2014/2012/2008R2 including database consistency check, index defragmentation and database backups.
- Responsible for performance tuning, capacity planning, SQL Server clustering, database security configuration.
- Re-defined the current Business Intelligence Platform with clarified data sources and reporting applications.
- Analyzed the Data Lineage from source systems including SAP systems and legacy applications to Data Warehouse.
- Defined the ETL layer as a combination of Informatica, stored procedures, and QlikView extractors.
- Deep-dived into four major QlikView applications for Supply Chain Intelligence and presented the detailed as-is architecture, including data sources, ETL, job scheduling, lineage, data models, and data quality.
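Several of the bullets above mention implementing data quality constraints for accuracy and completeness in T-SQL. As a minimal sketch of what such checks might look like (the `dbo.Customer` and `staging.Customer` tables and their columns are hypothetical, invented purely for illustration):

```sql
-- Completeness check: flag rows missing required attributes
-- (hypothetical dbo.Customer table)
SELECT CustomerID
FROM dbo.Customer
WHERE Email IS NULL
   OR FirstName IS NULL;

-- Uniqueness/accuracy check: flag duplicate business keys in staging
-- before they are loaded into the master data hub
SELECT SourceCustomerKey, COUNT(*) AS DuplicateCount
FROM staging.Customer
GROUP BY SourceCustomerKey
HAVING COUNT(*) > 1;
```

In practice, queries like these are often wrapped in stored procedures and scheduled, with offending rows routed to an exceptions table for review rather than silently dropped.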