Data Architect x2

Cincinnati, OH · Computer Software · Posted: 2 weeks ago

Job Description:
**MUST BE A US CITIZEN**

Our customer is seeking a Data Architect (x2) to join their AI Factory team, which designs, builds, and implements the enterprise data systems that power their AI tools and solve some of the company's toughest challenges (e.g., improving the supply chain, consolidating and connecting data from more than 100 factories, and automating multi-million-dollar contracts). The role combines strategic architectural decision-making with hands-on implementation work to ensure enterprise data platforms are scalable and future-ready. This person will define architectural patterns, governance practices, and performance standards while also rolling up their sleeves to build critical components that contribute to the current and future data ecosystem.

Requirements:
• 8–10 years of data architecture experience, including 5+ years designing and building enterprise-level cloud data environments (preferably AWS), not just pipelines.
• 5+ years leading cloud data platform and lakehouse architecture design, using Databricks, Snowflake, and Delta Lake/Iceberg for enterprise-scale medallion architectures.
• Expertise in data governance and security, implementing data cataloging, access control, and lineage frameworks with tools such as Unity Catalog, Collibra, or Atlas.
• Expertise in multi-dimensional data modeling (star/snowflake schemas, ERDs, etc.) and semantic/BI layer design (e.g., LookML, Power BI, Tableau) to create business-facing data layers.
• Ability to build scalable pipelines and real-time integrations using Kafka, Glue, Airflow, or APIs for batch and streaming architectures.
• Experience deploying and managing data platform infrastructure as code (IaC), with automation via Terraform or CloudFormation.
• Expertise in SQL and Python, building and optimizing cloud data solutions, ETL pipelines, and analytics workflows.

Responsibilities:
• Design and implement enterprise-scale cloud data architectures, including data lakehouse and medallion patterns (Bronze/Silver/Gold) for ingestion, transformation, and analytics.
• Lead data platform initiatives using Databricks, Snowflake, Delta Lake/Iceberg, and evaluate emerging technologies through architectural reviews and proof-of-concepts.
• Establish and maintain data governance, security, and lineage frameworks, including catalog administration, metadata management, and compliance enforcement using tools like Unity Catalog, Collibra, or Atlas.
• Develop and optimize enterprise data models, semantic layers, and knowledge graphs to support analytics, reporting, and business understanding.
• Architect and manage data integration and streaming solutions, including multi-source pipelines, APIs, Kafka, Glue, and Airflow for real-time and batch processing.
• Deploy and maintain infrastructure as code (IaC) using Terraform or CloudFormation, ensuring consistent, automated, and scalable platform provisioning.
• Monitor and optimize performance and cost efficiency, implementing strategies such as clustering, partitioning, Z-ordering, query tuning, and cloud cost governance.
• Implement and maintain complex schema evolution, large-scale data migrations, and architectural standards, while collaborating on enterprise data strategy and reference implementations.
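The medallion (Bronze/Silver/Gold) layering called out above can be illustrated with a minimal, stack-agnostic sketch in plain Python. In practice this would run on Databricks/Delta Lake or Snowflake; the field names and cleaning rules here are hypothetical, purely to show how records flow from raw ingestion to a business-facing aggregate:

```python
# Illustrative medallion-pattern sketch (hypothetical fields: factory_id, units).

def bronze_ingest(raw_records):
    """Bronze: land raw records as-is, tagging each with a layer marker."""
    return [{**r, "_layer": "bronze"} for r in raw_records]

def silver_clean(bronze):
    """Silver: drop malformed rows and normalize types."""
    cleaned = []
    for r in bronze:
        if r.get("factory_id") and r.get("units") is not None:
            cleaned.append({"factory_id": str(r["factory_id"]),
                            "units": int(r["units"])})
    return cleaned

def gold_aggregate(silver):
    """Gold: business-facing aggregate, e.g. total units per factory."""
    totals = {}
    for r in silver:
        totals[r["factory_id"]] = totals.get(r["factory_id"], 0) + r["units"]
    return totals

raw = [{"factory_id": 101, "units": "5"},
       {"factory_id": None, "units": 3},   # malformed: dropped at silver
       {"factory_id": 101, "units": 7}]
print(gold_aggregate(silver_clean(bronze_ingest(raw))))  # {'101': 12}
```

Each layer has a single responsibility: Bronze preserves raw inputs for auditability, Silver enforces schema and quality, and Gold serves curated views to analytics and BI consumers.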
