We will discuss using a variety of tools in our CI/CD pipeline to make sure our product works, works well, and is written well. I hope you will find a new one you aren’t using yet, learn how to use it, and integrate it into your own CI/CD pipelines. A general understanding of software development and CI/CD pipelines is required.
Our presentation gives an overview of how a partnership between CS and CAE has been providing JupyterHub notebook instances for instruction over the past several years. We will discuss how we have implemented the notebooks on AWS, structured load balancing, and integrated with Canvas courses. There will be demos of the tools we’ve used, including Terraform, Helm, and an alternating pair of A-B AWS instances for production and testing environments. We will conclude with a brief chat about future work items we are still figuring out, and we invite discussion with the community about best practices.
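The alternating A-B pattern described above can be sketched in a few lines. This is a hypothetical illustration of the role-swap idea only (instance names and the `promote` function are invented for this sketch; the real setup is managed through Terraform):

```python
# Sketch of an alternating A-B instance pair: one AWS instance serves
# production while the other is the testing/staging target. After the
# standby instance has been tested, a "swap" promotes it to production.
# Names and structure here are illustrative, not the actual config.

def promote(state):
    """Swap roles: the tested standby instance becomes production."""
    return {"production": state["staging"], "staging": state["production"]}

state = {"production": "jupyterhub-a", "staging": "jupyterhub-b"}
state = promote(state)  # after testing on B, B now serves production
```

The benefit of the pattern is that every production cutover has already been exercised as a test environment, and rollback is just another swap.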
Familiarity with the concepts of cloud services, AWS especially, would be very helpful. Some familiarity with the concepts of containers and CI/CD pipelines will also help.
Attendees will become (more) familiar with cloud resources, learn about innovation in instructional support, and discuss what happens when projects take off at the university and stop being “pilots”. They will also learn about ways to work across various units of the university and facilitate inter-departmental cooperation.
When specialized IT fields – Development, Security, and Operations – work together, they can solve problems that separate departments may not be able to handle. The process looks different from each perspective. Yet it is teamwork and respect for people with different skill sets that strengthens the UW IT community and keeps technology growing to meet the needs of the University and beyond.
A walk-through of a tested GitHub Actions workflow for publishing Python code to PyPI. Topics covered include: setup.cfg vs. setup.py, using a published open-source setup.cfg builder, rethinking the requirements.txt file, and automated semantic versioning.
Publishing to PyPI doesn’t have to be tedious. Semantic versioning is your friend. Pip-installing packages by GitHub commit URL is a recipe for errors.
My audience needs to know how to use GitHub and have a basic understanding of Python packaging and PyPI (pip).
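The automated semantic versioning mentioned above can be sketched as follows. This is a minimal illustration assuming a Conventional Commits-style convention; the session's actual workflow may use different rules or tooling:

```python
# Hypothetical sketch of automated semantic versioning: decide the next
# MAJOR.MINOR.PATCH version from commit messages. Assumes a Conventional
# Commits-like convention; the real workflow's rules may differ.

def next_version(current, messages):
    """Bump current 'X.Y.Z' based on a list of commit messages."""
    major, minor, patch = (int(part) for part in current.split("."))
    if any("BREAKING CHANGE" in m for m in messages):
        return f"{major + 1}.0.0"          # breaking change -> major bump
    if any(m.startswith("feat") for m in messages):
        return f"{major}.{minor + 1}.0"    # new feature -> minor bump
    return f"{major}.{minor}.{patch + 1}"  # anything else -> patch bump

print(next_version("1.4.2", ["feat: add CLI flag"]))  # 1.5.0
```

Because the version is derived from the commit history, the CI job can tag and publish a release with no human choosing version numbers.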
Our presentation will focus on how our team used Terraform, an Infrastructure as Code tool, and GitLab to fully automate the provisioning and deployment of the Residency App, a serverless application that collects data from prospective college applicants and helps to determine their residency status.
Key points that we will cover:
Bootstrapping a Terraform provisioning plan
Integrating Terraform with GitLab CI/CD
Handling multiple application environments, such as QA and Production
Service account credential handling in Terraform and in the Cloud
We would like our audience to get a basic sense of the work involved in provisioning and deploying a cloud-based application using an Infrastructure as Code strategy and the CI/CD tools available at UW.
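Two of the key points above — multiple environments and credential handling — can be illustrated in plain code. This is a hypothetical sketch only (the names and values are invented); in the actual pipeline these concerns are handled by Terraform variables and GitLab CI/CD settings:

```python
import os

# Illustrative sketch of per-environment configuration selection and
# credential handling. The environment name picks the settings, and the
# service-account credential is injected by the CI/CD runner (e.g. a
# masked GitLab CI/CD variable) rather than living in the repository.
# All names and values here are hypothetical.
ENVIRONMENTS = {
    "qa":         {"instance_count": 1, "log_level": "debug"},
    "production": {"instance_count": 3, "log_level": "warn"},
}

def load_config(env_name):
    """Return the settings for one environment plus injected credentials."""
    settings = dict(ENVIRONMENTS[env_name])
    # Never hard-code credentials; read them from the runner's environment.
    settings["service_account_key"] = os.environ.get("SERVICE_ACCOUNT_KEY", "")
    return settings
```

The same principle applies in Terraform itself: environment-specific variable files select the target, and secrets arrive through protected CI/CD variables.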
API-enabling your DNS server will let you do awesome things! See how having API access to DNS for all necessary parties enables automation on a level we’ve only previously dreamed about. We were manually renewing 1,800+ certificates, but after implementing the Infoblox API + AWS Certificate Manager + WiscWeb we have 1,800+ automatically renewed SSL certificates, FTW. And this pattern can be used by anyone! This session is perfect for people who have or host websites, or who run DNS servers.
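The reason DNS API access unlocks automatic renewal is that DNS-based certificate validation (as in ACME DNS-01 or AWS Certificate Manager's DNS validation) just requires publishing a specific record under the domain — something an API call can do with no human touching the zone. A tiny sketch of the record being constructed (the record naming below follows the ACME DNS-01 convention and is illustrative, not the exact records our integration creates):

```python
# Sketch: build the validation record a DNS API call would create for
# DNS-based certificate validation. Once this record exists, the
# certificate authority can verify domain control and renewals proceed
# with zero manual steps. Names here are illustrative.

def validation_record(domain, token):
    """Return the (name, value) pair for an ACME DNS-01 style record."""
    return (f"_acme-challenge.{domain}", token)

name, value = validation_record("example.wisc.edu", "tok123")
```

Multiply that one API call by 1,800+ certificates and the payoff of the automation becomes obvious.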
Get an update on the Interop Initiative including new infrastructure services and capabilities, and plans for the coming year. Learn about the new tools and approaches that are coming online, and how they impact data access and integration.
Azure DevOps provides a platform for executing IaC pipelines. We will explore using GitOps to automatically trigger Terraform build pipelines in dev, staging, and production environments. We will also explore automating Azure image creation and VM patching. Learn how to take infrastructure as code automatically from a development environment to staging and production through Azure DevOps Pipelines and GitOps. This session is perfect for people in DevOps who are interested in automating their IaC development and deployment cycles.
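The GitOps trigger described above usually reduces to a simple convention: the branch a commit lands on determines which environment's pipeline runs. A minimal sketch of that mapping (branch and environment names are illustrative, not the session's actual configuration):

```python
# Hypothetical sketch of a GitOps branch-to-environment convention:
# pushing to a branch triggers the Terraform pipeline for the matching
# environment. Branch names here are illustrative only.
BRANCH_ENVIRONMENTS = {
    "develop": "dev",
    "staging": "staging",
    "main":    "production",
}

def target_environment(branch):
    """Map a pushed branch to the environment its pipeline deploys."""
    # None means the push triggers no deployment (e.g. feature branches).
    return BRANCH_ENVIRONMENTS.get(branch)
```

With that convention, promotion from dev to staging to production is just a merge, and the pipeline history doubles as a deployment audit trail.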
This session will provide a technical look at UW-Madison’s new institutional data warehouse, built on the Snowflake platform and branded Badger Analytics, presented by the ETL developers within the Office of Data Management and Analytics Services who were integral to bringing it to life. They will share the story of the modernization of the environment, the project’s progress to date, the architecture, the benefits, and how the modernization has revolutionized the work they do and freed up capacity through speed and automation. This practical session, by technical professionals for technical professionals, will provide details about the technologies and how they have been leveraged to provide value and unlock previously unimagined capabilities. Learn about the potential to leverage this modernization for the benefit of other units and divisions within the UW-Madison community.
This session builds on last year’s presentation by Cathy Lloyd, chief data officer, about cloud data warehousing for UW-Madison’s institutional data and the overall analytics strategy.
This presentation discusses the Qualys Container Security scanner integration with GitLab. DevOps is quickly changing the way that organizations build and deploy applications using technologies such as Docker containers, and container-based build workflows demand rapid release cycles and continuous deployment. By integrating automated security testing into the development tool chain, developers can identify security issues in their containers early in the build process. We will discuss the scripts and source code for tools that provide access to the Qualys container vulnerability scanning system through GitLab CI/CD jobs. This integration allows developers to trigger a Docker container image scan pipeline on the image of their choosing from within a GitLab project. Any vulnerabilities found are posted as a GitLab issue in the project from which the scan is executed. The integration uses a pre-configured VM for the GitLab runner that obtains the access it needs via AWS IAM roles, so any developer with a project on the same GitLab instance can incorporate the Qualys scanner job by including a GitLab CI/CD template in their own .gitlab-ci.yml file, without having to set up access to the Qualys API themselves. By the end of the presentation, participants will have learned about the architecture, the scripts and source code, sample reports, setup instruction documentation, and ongoing improvements.
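The reporting step — turning scan findings into a GitLab issue — can be sketched as a small formatting function. This is a hypothetical illustration; the field names below are invented for the sketch and do not reflect the actual Qualys report schema or the integration's real code:

```python
# Hypothetical sketch of the reporting step: filter scan findings by
# severity and format a Markdown issue body, as the GitLab CI/CD job
# might before posting it through the GitLab issues API. Field names
# ("qid", "severity", "title") are illustrative only.

def issue_body(findings, min_severity=3):
    """Build a Markdown issue body from findings at or above a severity."""
    lines = ["## Container scan findings"]
    for f in sorted(findings, key=lambda f: -f["severity"]):
        if f["severity"] >= min_severity:
            lines.append(f"- **{f['qid']}** (severity {f['severity']}): {f['title']}")
    return "\n".join(lines)
```

Filing findings as issues in the developer's own project keeps remediation in the same workflow as the rest of their work, rather than in a separate security console.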
Attendees will take away the architecture, the scripts and source code, sample reports, the benefits of container scanning, setup instruction documentation, and ongoing improvements.