

The Sr. Programmer Analyst - Enterprise Data Warehouse/Business Intelligence provides technical expertise in designing and implementing data integrations for data warehouse/business intelligence solutions, leveraging Extract, Transform and Load (ETL/ELT) tools such as Informatica. The role creates technical design specifications from business/functional requirements, works with internal stakeholders to clarify data integration and system requirements as appropriate, and configures, programs, and unit-tests ETL solutions to ensure quality, all within Darden's SDLC methodology and in adherence to technical architecture standards.

The role also proactively identifies potential problems, communicates and manages issues to resolution, and works closely with multiple functional areas to understand data and assist in data architecture design. It helps IT and business users effectively understand and leverage data by making it available to the business in a useful, usable format. The analyst will begin to use and explore cloud ELT tools and development, support replication of Darden's on-premises data warehouse to the cloud, and gain expertise in the Snowflake cloud data warehouse.


-Work closely with key stakeholders and other project team members to understand and prioritize functional requirements and information needs. Participate in the development of functional requirements and design specifications as appropriate

-Develop, create and document technical specifications and designs from which applications and/or technical solutions can be developed that satisfy documented business/functional requirements; envision potential future requirements and business needs to ensure solutions are flexible and extensible

-Design and implement robust, efficient extract, transform, and load applications and processes to load data and supporting metadata into the Enterprise Data Warehouse using Informatica and other tools and languages; this includes implementation of parsing rules to assure selection of accurate data, transformation of data into the required format and structure to support the business requirements, loading the transformed data into the target database and the associated metadata, and where applicable, aggregations of data

-Ensure that all code/technical configurations and other work products are thoroughly unit-tested prior to delivery. Participate in system/integration testing as appropriate. Perform code reviews and other QA steps as requested

-Responsible for the ongoing operational stability and maintenance of the ETL/ELT processes, systems, and data, ensuring they are properly monitored and audited to provide data integrity, accuracy and timeliness of delivery. This includes being on call to support production systems

-Work closely with Data Architect/Modeler, Business Analysts and BI Developers to understand requirements and use technology and best practices to develop ETL processes in support of the requirements, validating that they meet business and technical specifications

-Lead small projects, documenting programs and systems according to published standards.

-Adhere to Darden SDLC and technology architecture requirements. Contribute to architecture design principles and standards as appropriate

-Proactively identify and communicate potential problems and issues to project team members/leaders, including identifying alternatives and recommending/implementing solutions as appropriate

-Ensure accurate project status and work estimates (ETCs) are always reported/communicated to project leaders/managers in a timely fashion. Develop detailed project and task estimates as needed

-Effectively communicate with stakeholders throughout the project lifecycle. Ensure issues are analyzed, discussed and resolved in a timely manner

-Consistently enhance skills and job knowledge by researching new relevant technologies and software products and trends; reading professional publications; maintaining personal networks; participating in professional organizations.

-Support replication of the on-premises Oracle data warehouse to the cloud data warehouse (Snowflake)

-Begin efforts to transition ETL/ELT to the cloud, taking data through phases of a data lake and then into the cloud data warehouse, using technologies such as Databricks, Spark SQL, etc.
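The core ETL duties above (apply parsing rules, transform data into the required format, load it into target tables, and build aggregations) can be sketched in miniature. This is an illustrative, stdlib-only Python example, not Darden's actual Informatica or Databricks implementation; all table names, field names, and sample values are hypothetical.

```python
import sqlite3

# Hypothetical extract from an operational source system.
RAW_SALES = [
    {"store_id": "0101", "sale_date": "2023-06-01", "amount": "125.50"},
    {"store_id": "0101", "sale_date": "2023-06-01", "amount": "80.25"},
    {"store_id": "0202", "sale_date": "2023-06-01", "amount": "42.00"},
]

def transform(row):
    """Apply parsing rules: cast types into the required target format."""
    return (row["store_id"], row["sale_date"], float(row["amount"]))

def load(conn, rows):
    """Load transformed rows into a fact table, then build an aggregation."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_sales "
        "(store_id TEXT, sale_date TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", rows)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS agg_daily_sales AS "
        "SELECT store_id, sale_date, SUM(amount) AS total "
        "FROM fact_sales GROUP BY store_id, sale_date"
    )

conn = sqlite3.connect(":memory:")
load(conn, [transform(r) for r in RAW_SALES])
totals = dict(
    conn.execute("SELECT store_id, total FROM agg_daily_sales").fetchall()
)
print(totals)  # {'0101': 205.75, '0202': 42.0}
```

In a production warehouse the same extract/transform/load and aggregation steps would be expressed as Informatica mappings or Spark SQL jobs rather than hand-written Python; the shape of the work is the same.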


-5+ years of programming/analysis experience

-Minimum of 3 years' experience developing complex ETL integrations in a large organization (cloud ETL tools such as IICS/Talend preferred)

-Minimum 3 years’ experience programming in SQL

-Extensive experience in extract, transform and load, data profiling, data quality, metadata management and data intake for large complex data systems

-Demonstrated ability to analyze, verify, and document the accuracy of the developed ETL code through self-directed testing

-Hands-on experience with Informatica and/or cloud ELT technologies, including demonstrated strong SQL coding skills

-A strong familiarity with general DBMS concepts

-Knowledge of best practices gained through relevant experience across data-related disciplines and technologies, particularly for enterprise-wide data architectures and data warehousing/BI

-Ability to prepare design documentation to express necessary details for review and ongoing maintenance

-Ability to resolve end-user reporting problems through collaboration with both technical and functional personnel in a team environment
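The SQL and data-profiling skills listed above can be made concrete with a small sketch: profiling a staging table for row counts, null rates, and distinct values, the kind of check used to assure data quality before a load. This is an illustrative example; the table, columns, and sample data are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_guests (guest_id TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO stg_guests VALUES (?, ?)",
    [
        ("g1", "a@example.com"),
        ("g2", None),
        ("g3", "a@example.com"),
        ("g4", "b@example.com"),
    ],
)

# Profile the email column: total rows, null count, distinct non-null values.
row_count, null_count, distinct_count = conn.execute(
    "SELECT COUNT(*), "
    "       SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END), "
    "       COUNT(DISTINCT email) "
    "FROM stg_guests"
).fetchone()
print(row_count, null_count, distinct_count)  # 4 1 2
```

Note that `COUNT(DISTINCT ...)` ignores NULLs, which is why the null count is tracked separately; catching that distinction is exactly the sort of SQL fluency the requirement describes.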


-Bachelor's degree in IT related discipline or equivalent experience (BS / BA in MIS, Computer Science, Business Analytics, or Mathematics)


-Ability to work independently, take ownership of tasks and follow through to implementation/resolution

-Strong verbal and written communication skills with an ability to express complex technical concepts in business terms and complex business concepts in technical terms

-Ability to prioritize and multi-task across numerous work streams

-Highly developed analytical, problem solving and debugging skills, with strong ability to quickly learn and comprehend business processes and problems in order to effectively develop technical solutions to their requirements


-Experience with PL/SQL Developer

-Working experience with revision control systems, Linux platform, Shell scripting, and big data

-Experience with Microsoft Azure, Databricks, Spark SQL, and other cloud technologies and tools

-Understanding of SDKs and REST API calls

-Experience with customer/guest data and systems; knowledge of restaurant business
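The "SDKs and REST API calls" item above refers to pulling data from external services into pipelines. A minimal stdlib-only Python sketch of a GET call and JSON parse follows; it runs against a local stub server because the endpoint, path, and payload are all invented for illustration.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    """Stands in for a real vendor API returning a JSON status payload."""

    def do_GET(self):
        body = json.dumps({"status": "ok", "rows": 3}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep example output quiet

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client side: issue the GET and decode the JSON response.
url = f"http://127.0.0.1:{server.server_port}/v1/load-status"
with urllib.request.urlopen(url) as resp:
    payload = json.load(resp)
server.shutdown()

print(payload)  # {'status': 'ok', 'rows': 3}
```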