Hey, I'm Jordan 👋🏼

About

I'm a Data Engineer with experience building data infrastructure and improving data accessibility. I've deployed and managed data platforms, using tools like Kubernetes, APIs, and data virtualization solutions to support data-driven projects.

I enjoy collaborating with DevOps, IT, and governance teams to improve deployment processes, performance monitoring, and secure data access. I'm always eager to solve data challenges and find ways to make data more accessible and useful within organizations.

Outside of work, I enjoy hitting the trails on my mountain bike and exploring new adventures!

Education

  • Georgia Institute of Technology
    2024
    M.S. Computer Science
  • Arizona State University
    2019
    B.S. Computer Science

Work Experience

  • RR Donnelley (Remote)
    May 2024 – Present
    Data Infrastructure Engineer II
    • Serving as a versatile, full-stack member of a highly agile team, contributing across many facets of our Data Mesh initiative.
    • Deployed and currently managing Dremio via Helm charts on an on-premises Kubernetes cluster as a proof of concept during product vetting, setting up JMX metrics, Prometheus, and Grafana dashboards for real-time performance monitoring.
    • Implemented single sign-on (SSO) and LDAP role-based authorization for Dremio to enforce secure and automated user access, ensuring permissions align with team-based governance policies.
    • Created MuleSoft API proxies on top of internal minion services with authentication through OneLogin and an internal auth minion service, facilitating secure data sharing across multiple acquired organizations.
    • Enhanced the internal Data Mesh website by integrating MuleSoft APIs with Dremio, enabling users to submit queries and retrieve results directly through the website, improving data accessibility and demonstrating the power of the Data Mesh (see the query sketch after this role's technology list).
    • Collaborating with IT teams to open networking pathways between Dremio and multiple on-premises Hadoop clusters, ensuring seamless data integration and accessibility for end users.
    • Partnering closely with DevOps, IT, and Data Platform teams to optimize deployment, integration, and maintenance of Data Mesh components, coordinating weekly troubleshooting calls with Dremio support to resolve challenges and improve system stability.
    • Implementing data governance practices by working with the governance team to ensure all imported data is properly classified, maintaining compliance and enhancing data discoverability across the organization.
    • Continuing my work and responsibilities listed below as a Data Infrastructure Engineer I.
    • Primary technologies: Collibra, Docker, Dremio, Git CI/CD, Grafana, Hadoop, Helm, JMX, Jenkins, Kubernetes, LDAP, MuleSoft, Next.js, OneLogin, Prometheus, Python, REST APIs, Snowflake, Unix.
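
For a flavor of the query flow behind the Data Mesh website, here's a minimal sketch of submitting SQL through Dremio's v3 REST API from Python. The host, token, and query are hypothetical placeholders, and the auth scheme and endpoint paths should be verified against your Dremio version's documentation.

```python
"""Minimal sketch: run a SQL query against Dremio's v3 REST API.

Host, token, and query are hypothetical; error handling is trimmed.
"""
import time

import requests

DREMIO_URL = "https://dremio.example.internal:9047"  # hypothetical host
HEADERS = {
    "Authorization": "Bearer <personal-access-token>",  # e.g. a PAT issued after SSO login
    "Content-Type": "application/json",
}


def run_query(sql: str, poll_seconds: float = 1.0) -> list[dict]:
    # Submit the query; Dremio answers immediately with a job id.
    job = requests.post(f"{DREMIO_URL}/api/v3/sql", json={"sql": sql}, headers=HEADERS)
    job.raise_for_status()
    job_id = job.json()["id"]

    # Poll the job until it reaches a terminal state.
    while True:
        state = requests.get(f"{DREMIO_URL}/api/v3/job/{job_id}", headers=HEADERS).json()["jobState"]
        if state in ("COMPLETED", "FAILED", "CANCELED"):
            break
        time.sleep(poll_seconds)
    if state != "COMPLETED":
        raise RuntimeError(f"Dremio job {job_id} ended in state {state}")

    # Fetch the first page of results (larger results are paginated).
    results = requests.get(f"{DREMIO_URL}/api/v3/job/{job_id}/results", headers=HEADERS)
    results.raise_for_status()
    return results.json()["rows"]


if __name__ == "__main__":
    for row in run_query("SELECT 1 AS probe"):
        print(row)
```
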
  • RR Donnelley (Remote)
    Aug 2022 – May 2024
    Data Infrastructure Engineer I
    • Creating a centralized data catalog, providing a unified view of company-wide data assets.
    • Developing APIs to ensure seamless data access, contributing to a more responsive and user-friendly data ecosystem.
    • Integrating existing internal websites with the Data Mesh, enhancing user experiences across the organization.
    • Configuring ETL pipelines for consumers within the company, as well as Data Products on the public Snowflake Marketplace (see the load sketch after this role's technology list).
    • Developed and launched an internal Next.js 14 web application serving as a landing page and information hub for the Data Mesh project, featuring real-time API calls that display live counts of data products, data sets, columns, and more.
    • Authoring blog-style Data Mesh Guild posts which effectively communicate new deployments and key updates to stakeholders across the enterprise, enhancing cross-functional collaboration and driving informed decision-making.
    • Assessing technologies with the potential to enhance the Data Mesh, deploying them internally to Kubernetes/OpenStack, and creating compelling demos.
    • Participating in evaluations of various technologies for data virtualization, visualization, event streaming, generative AI, etc.
    • Primary technologies: Collibra, Docker, Git CI/CD, Hadoop, Jenkins, MuleSoft, Next.js, Python, Snowflake, Unix.
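
As an illustration of the consumer-facing ETL work, here's a minimal sketch of staging a file and merging it into a Snowflake table with the snowflake-connector-python package. The account, credentials, file path, and table names are hypothetical placeholders.

```python
"""Minimal sketch of a consumer-facing ETL step into Snowflake.

Connection parameters, file path, and table names are hypothetical.
Requires the snowflake-connector-python package.
"""
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # hypothetical account identifier
    user="ETL_SERVICE",
    password="...",
    warehouse="LOAD_WH",
    database="DATA_PRODUCTS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Upload the raw file to the table's internal stage.
    cur.execute("PUT file:///tmp/orders.csv @%ORDERS_RAW")
    # Load the staged file into the raw table.
    cur.execute(
        "COPY INTO ORDERS_RAW FROM @%ORDERS_RAW "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Merge raw rows into the consumer-facing table.
    cur.execute(
        "MERGE INTO ORDERS t USING ORDERS_RAW s ON t.ORDER_ID = s.ORDER_ID "
        "WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS "
        "WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS) VALUES (s.ORDER_ID, s.STATUS)"
    )
finally:
    conn.close()
```
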
  • USAA (Plano, TX)
    May 2022 – Jul 2022
    Tech Lead - Software Engineer II
    • Leading a globally distributed team of 5 engineers in feature development and code reviews.
    • Creating technical designs and product roadmaps to drive reliable, reusable, and scalable solutions.
    • Continuing my responsibilities listed below as a Software Engineer II.
  • USAA (Plano, TX)
    Aug 2021 – May 2022
    Software Engineer II
    • Developing small-batch data load and model processes using Domino, R, Python, Git CI/CD, and Airflow (see the DAG sketch after this role's technology list).
    • First team in the organization to transition business-developed models to IT-supported infrastructure, yielding a 60% reduction in code and a 95% reduction in monthly hours worked (~$84,000 saved per year per model).
    • Programming ETL jobs using Python, shell scripts, and IBM DataStage.
    • Developing and maintaining data integrity through run-time controls over various data hops.
    • Instructing quarterly course to bring employees up-to-speed on IBM DataStage as well as ETL methodologies.
    • Primary technologies: Airflow, Control-M, DataStage, Domino, Git CI/CD, Python, R, Tableau, Unix.
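
To make the small-batch load work concrete, here's a minimal Airflow 2.x DAG sketch. The DAG id, schedule, and load function are hypothetical stand-ins for the real Domino/R/Python model processes.

```python
"""Minimal sketch of a small-batch load DAG in Airflow 2.x.

The DAG id, schedule, and load function are hypothetical.
"""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_batch(**context):
    # Placeholder for the actual extract/score/load logic
    # (e.g. pulling a small batch and running a model against it).
    print(f"Loading batch for {context['ds']}")


with DAG(
    dag_id="small_batch_model_load",
    start_date=datetime(2021, 8, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_batch", python_callable=load_batch)
```
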
  • USAA (Plano, TX)
    Sep 2019 – Aug 2021
    Software Engineer III
    • Working on systems that supply data to USAA data analysts for reporting.
    • Creating data pipelines from application databases to enterprise data warehouse.
    • Developing and maintaining data integrity through run-time controls over various data hops (see the control sketch after this role's technology list).
    • Programming ETL jobs using Python, shell scripts, and IBM DataStage.
    • Instructing quarterly course to bring employees up-to-speed on IBM DataStage as well as ETL methodologies.
    • Primary technologies: Control-M, DataStage, Oracle, Python, Tableau.
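
To show what a run-time control looks like in practice, here's a minimal sketch that compares row counts on either side of a data hop. sqlite3 stands in for the real source and target systems (e.g. Oracle), and the table name is hypothetical; the same idea extends to checksums or column-level aggregates.

```python
"""Minimal sketch of a run-time control: row counts across a data hop.

sqlite3 stands in for the real source/target systems (e.g. Oracle).
"""
import sqlite3


def row_count(conn: sqlite3.Connection, table: str) -> int:
    # Count rows on one side of the hop.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def check_hop(source: sqlite3.Connection, target: sqlite3.Connection, table: str) -> None:
    src, tgt = row_count(source, table), row_count(target, table)
    if src != tgt:
        # In a real pipeline this fails the job or pages the on-call.
        raise RuntimeError(f"{table}: source has {src} rows, target has {tgt}")
    print(f"{table}: {src} rows on both sides, control passed")


if __name__ == "__main__":
    # Toy demo: both "systems" hold the same two rows, so the control passes.
    source, target = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for conn in (source, target):
        conn.execute("CREATE TABLE claims (id INTEGER)")
        conn.executemany("INSERT INTO claims VALUES (?)", [(1,), (2,)])
    check_hop(source, target, "claims")
```
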