We are seeking an experienced Data Warehouse Architect to lead the evolution of our data infrastructure and oversee end-to-end ETL operations. This role involves working with technologies such as Python, PostgreSQL, AWS storage solutions, and Apache Airflow to ensure high-quality, reliable data delivery across the organization. You will collaborate closely with internal teams, support data requests, and drive performance improvements across our data ecosystem.
Key Responsibilities
Data Warehouse & ETL Ownership
• Manage, enhance, and maintain the data warehouse environment built on relational and non-relational database technologies.
• Develop and maintain Airflow DAGs to orchestrate automated ETL workflows across diverse data sources.
• Ensure the data platform is robust, scalable, and optimized for performance.
• Integrate and validate data from multiple systems while upholding data quality and consistency standards.
Data Delivery & Stakeholder Support
• Provide accurate, timely data to internal systems and business users.
• Partner with stakeholders to understand data needs and deliver effective, reliable solutions.
Performance Tuning
• Optimize SQL queries and database operations to improve API responsiveness and system efficiency.
• Monitor system health, diagnose issues, and implement corrective actions.
Cross-Functional Collaboration
• Work closely with analysts, engineers, and other team members to support data-driven initiatives.
• Address user inquiries and provide ongoing data support.
Documentation & Standards
• Maintain clear documentation of data workflows, processes, and system configurations.
• Advocate for best practices in data management and contribute to continuous improvement efforts.