
JOB OVERVIEW:

Provide technical expertise in developing and maintaining a data quality framework. This role will participate in the full data quality lifecycle from requirement elicitation through ongoing support. The candidate selected for this role will create technical design specifications from business/functional requirements or from logged data incidents. The position plays a pivotal role in the definition of our data quality program, which includes policy, process, standards, and tools. The role demands an ability to understand the data semantics, business use and relationships.


ROLES AND RESPONSIBILITIES:

-Ability to translate business concerns into technical requirements/specifications

-Ability to elicit requirements both from business users and from Data Incident Management. Ensure data issues detected in the past are not repeated in the future.

-Develop technical specifications/designs that demonstrate how data quality will be preserved/enforced. Maintain as-built documentation for the current rules register.

-Build and maintain a data quality rules code base to monitor data flows and ensure acceptable levels of trust.

-Assist in driving the tool and language selection for the data quality practice.

-Participate in data profiling POCs, evaluating product capabilities and making recommendations.

-Oversee the execution of data profiling, both to ensure acceptable quality as part of ingestion and to understand data drift over time. Adjust these two profiling processes as necessary to meet business objectives.

-Analyze profiling and rule results to ensure business requirements are being met. Fine-tune configurations and rules to optimize both results and performance. Make design/process recommendations as needed.

-Triage data anomalies to determine root cause. Participate in solutioning sessions to determine anomaly resolution and address any data purity issues.

-Work with the BA team to generate data to power quality dashboards, which allow both data providers and data consumers to monitor data quality.

-Contribute to the creation of data quality standards for Darden at both the enterprise and data set level. Serve as an enterprise data steward to ensure quality standards and data principles are being met.

-Contribute to business/technical definitions of data objects within the data catalogue.

-Assist in the creation and maintenance of Data Lineage documentation.

-Serve as an SME for multiple data domains. Assist business users in the selection, understanding and use of data.

-Perform UAT on data sets as part of data ingestion, egress, transformation and rule execution.

-Ensure that all solutions, technical configurations and other work products are thoroughly unit-tested prior to delivery. Participate in system/integration testing as appropriate. Perform reviews and other QA steps as requested.

-Adhere to Darden SDLC and technology architecture requirements. Contribute to architecture design and overall data principles and standards as appropriate.

-Proactively identify and communicate potential problems and issues to project team members/leaders. Proactively identify alternatives and recommend/implement solutions as appropriate.

-Effectively communicate with stakeholders. Ensure issues are analyzed, discussed and resolved in a timely manner.

-Consistently enhance skills and job knowledge by researching techniques, technologies and software products, and by reading professional publications.

-Provide thorough and accurate data quality tool administration to ensure the platform is properly managed.

-Train other team members on the data quality platform and tool suites. Provide leadership and guidance to new team members.


REQUIRED TECHNICAL SKILLS:

-Strong understanding of data structures, data types, and data transformation.

-Familiarity with industry data patterns, normalization rules and data performance tuning.

-Ability to build complex data mappings, workflows and sessions.

-Strong relational database skills and an understanding of columnar data structures.

-Extensive experience with SQL and other data transformation/analytics tools such as Informatica, Talend, or Alteryx.

-Advanced expertise in reading, analyzing and debugging SQL.

-Ability to troubleshoot data processing performance issues.

-Experience or willingness to learn data profiling/quality tools such as Collibra, Ataccama, Informatica or OEDQ.

-Experience or willingness to learn SparkSQL and Databricks.

-Highly developed analytical, problem-solving and debugging skills, with a strong ability to quickly learn and comprehend business processes and problems in order to effectively analyze result sets and triage quality issues.

-Ability to work at all levels of development, from analysis through implementation and support.

-Expertise in working with spreadsheets, strong understanding of financial concepts and data.

-Ability to work independently, take ownership of tasks and follow through to implementation/resolution.

-Resolve end user data problems through collaboration with both technical and functional personnel in a team environment.

-Demonstrated competency in designing, developing and testing complex rule sets.

-Demonstrated competency in accurately identifying the scope of work and preparing thorough, accurate estimates.

-Exceptional verbal and written communications skills, with an ability to express complex technical concepts in business terms.

-Solid teamwork and interpersonal skills.

-Strong analytical, problem-solving and conceptual skills.


REQUIRED EDUCATION:

-Bachelor's degree in an IT-related discipline or equivalent experience (BS/BA in MIS, Computer Science, Business, Mathematics or Engineering)


OTHER KEY QUALIFICATIONS:

-Data modeling experience

-Exposure to cloud infrastructure

-Experience with Cloud Data Warehouse products such as Snowflake or Azure Synapse

-Experience with data integration patterns, data pipelines and tools such as Azure Data Factory

-Experience using reporting tools like Power BI for Data quality visualizations

-Knowledge of data lake and Delta Lake structures

-Experience with Python, Scala or Java

-Knowledge of restaurant or retail business


PREFERRED SKILLS AND EXPERIENCE:

-7+ years’ experience across Business Intelligence/Data Warehouse/Data Lake projects

-2+ years’ experience on Data Quality and Governance initiatives, with at least one successful implementation

-Experience programming in SQL, with the ability to develop complex queries against large, disparate data sets