ChatGPT in DevOps - Hype vs. Practical Use

zev-schonberg

I recently came across this tweet from Box CEO Aaron Levie, and I couldn't agree more.

“We’re at the start of a new software paradigm where you can have a single interface to ask AI to run a query across any number of systems, synthesize the responses in a human-readable way, or execute any number of tasks in that system. 🤯”

Unlike a traditional search engine, which matches keywords, ChatGPT can interpret complex natural-language questions and answer them directly. Beyond answering questions, it can also write scripts and code to help you build basic applications or perform common code-related tasks. This opens up a world of possibilities for improving DevOps processes. Let’s look at what’s possible with a generative AI tool like ChatGPT, and at solutions that are already available beyond it.

What is ChatGPT?

ChatGPT is a breakthrough generative AI model based on OpenAI’s GPT (Generative Pre-trained Transformer) family. GPT is a language model that produces natural, human-like text in response to text inputs. Much of ChatGPT’s success is owed to a training methodology known as reinforcement learning from human feedback (RLHF): a machine-learning approach in which the model learns from human ratings of its outputs. RLHF helps ChatGPT produce better answers and continuously improve them.

Its predecessors were GPT-1 - introduced in 2018 - and GPT-2, followed by GPT-3, powered by 175 billion parameters. ChatGPT, built on top of GPT-3.5, launched in November 2022. It is a Large Language Model (LLM) with chatbot-like functionality that converses in natural language.

Although ChatGPT has transformed how humans interact with technology, it is still a computer program trained on a finite dataset. For instance, the GPT-3.5 model is based on data until September 2021.

Some of the limitations of ChatGPT that OpenAI declared are:

  • Incorrect information: It can present incorrect answers as plausible-sounding output, because there is no single source of truth its training data can be checked against.
  • Sensitivity to question phrasing: The same query framed in two different ways can produce two different outputs.
  • Bias: It absorbs biases present in its training data and can overuse certain phrases.
  • Making assumptions: When a prompt is ambiguous, it guesses your intent instead of asking clarifying questions.

Despite these limitations, ChatGPT has the potential to be a game-changer for DevOps engineers. Let’s see how.

How does ChatGPT affect DevOps practitioners?

DevOps teams are overworked and understaffed. They would welcome any help offloading mundane tasks so they can focus on higher-priority work. ChatGPT can add great value as a virtual assistant for DevOps teams, performing tasks like incident detection and response, implementing IT risk policies, running diagnostics, and generating code.

Specifically for code generation, OpenAI released Codex, an LLM that translates natural language into code. It can generate working code from plain-language commands or explain existing code in natural language for debugging. It can handle tasks like auto-populating config files and templates, recommending cleaner code snippets, and speeding up the scaffolding of new assets.
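As a toy illustration of the config auto-population task mentioned above, here is a sketch (the app name, fields, and values are all hypothetical) of the kind of helper a code model might generate to fill a deployment config template:

```python
# Hypothetical example of config-template auto-population, the kind of
# task a code model like Codex can generate from a one-line prompt.
from string import Template

CONFIG_TEMPLATE = Template(
    "app_name: $app\n"
    "environment: $env\n"
    "replicas: $replicas\n"
)

def render_config(app: str, env: str, replicas: int) -> str:
    """Fill the template's placeholders with concrete values."""
    return CONFIG_TEMPLATE.substitute(app=app, env=env, replicas=replicas)
```

A generated helper like this is a starting point; you would still review the field names and values against your actual deployment config format.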

ChatGPT, in essence, is an LLM powered by neural networks trained on an enormous corpus of text. This training allows it to predict word sequences and generate human-like responses. You can fine-tune such LLMs to solve specific programming challenges in your applications, establish best practices and IT policies, and ensure proper QA. The potential use cases for ChatGPT are wide and varied.
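To make “predicting word sequences” concrete, here is a deliberately tiny sketch of next-word prediction using a bigram frequency table. Real LLMs use billions of neural-network parameters rather than raw counts, but the prediction objective is analogous:

```python
# Toy next-word predictor: count which word follows which in a corpus,
# then predict the most frequent successor. This is NOT how an LLM is
# implemented internally; it only illustrates the prediction objective.
from collections import Counter, defaultdict

def train_bigram(corpus: str) -> dict:
    """Build a map: word -> Counter of words that follow it."""
    words = corpus.split()
    follows = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def predict_next(model: dict, word: str) -> str:
    """Return the most likely next word after `word`."""
    return model[word].most_common(1)[0][0]
```

For example, trained on "the cat sat on the mat the cat ran", the model predicts "cat" after "the", since that pair occurs most often.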

ChatGPT Use Cases for DevOps

Here are a few interesting use cases for ChatGPT, some of which may catch you by surprise:

Create programming script templates easily

You can create templates for repetitive tasks or use cases by providing concise prompts about your requirements. 

A sample prompt can be "Create a Python script template to automate the deployment of a web application to a server. The script should pull the latest code from a Git repository, install any required dependencies, and restart the server."

It will respond with template code you can modify into a tailored script. However, you must cross-check the syntax and commands to verify the script works as intended.
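For the sample prompt above, a response might look something like the following sketch. The repository path and service name are placeholders you would replace, and the script should be reviewed before running it against a real server:

```python
"""Deployment script template: pull the latest code, install
dependencies, restart the server. Paths and names are placeholders,
not real infrastructure."""
import subprocess

REPO_DIR = "/srv/myapp"      # placeholder: path to your Git checkout
SERVICE = "myapp.service"    # placeholder: systemd unit for your app

def run(cmd, cwd=None):
    """Run a command, raising CalledProcessError on failure."""
    subprocess.run(cmd, cwd=cwd, check=True)

def deploy():
    run(["git", "pull", "--ff-only"], cwd=REPO_DIR)                  # latest code
    run(["pip", "install", "-r", "requirements.txt"], cwd=REPO_DIR)  # dependencies
    run(["sudo", "systemctl", "restart", SERVICE])                   # restart server

if __name__ == "__main__":
    deploy()
```

The value is the scaffold: error handling via `check=True`, the right command order, and obvious places to customize, rather than a script you can run unmodified.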

Improve your research capabilities

ChatGPT was trained on an enormous body of information that DevOps engineers can draw on to research and brainstorm DevOps-related topics. You can get a detailed response on best practices, tools, and methodologies. Instead of piecing a topic together from search-engine results, you can get a clear, consolidated explanation. It can also save the time you usually spend combing through StackOverflow threads to understand quick functions and routines. All of this is subject to ChatGPT’s limitations, but it remains a fast, encyclopedic tool for conducting research and staying up to date.

Collaborate with the team effectively

You can build a self-serve DevOps platform that can serve as:

  • A knowledge-sharing platform for your team to share best practices and routines.
  • A troubleshooting tool to resolve issues they face with automated diagnosis and generate potential solutions.
  • A project management mechanism to track progress and mitigate bottlenecks by streamlining workflow.
  • A communication platform to share ideas and feedback and ensure healthy collaboration.

Refactor, describe, and translate code

ChatGPT can help you write cleaner code and turn complex logic into simpler, faster scripts. Developers already use ChatGPT to refactor their scripts by feeding in the code with a short prompt. It can also explain bits of code or entire programs: just paste the code and ask, ‘How does this code work?’

Another little-known use case of ChatGPT is translating code into other programming languages. For example, you can generate a script in Bash and then ask ChatGPT to write the same in Python or Java. 
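As a small illustration of such a translation, consider a Bash loop that counts lines in every `.log` file; an equivalent Python version (standard library only, filenames hypothetical) could look like:

```python
# A translation of the kind ChatGPT can produce. The Bash original:
#   for f in *.log; do wc -l "$f"; done
# rewritten as idiomatic Python.
from pathlib import Path

def count_log_lines(directory: str = ".") -> dict:
    """Return a mapping of *.log filename -> line count,
    mirroring the Bash loop above."""
    return {
        p.name: len(p.read_text().splitlines())
        for p in Path(directory).glob("*.log")
    }
```

As with any generated translation, verify edge cases yourself; for example, `wc -l` and `splitlines()` can disagree on files without a trailing newline.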

Debug code 

ChatGPT is great at identifying mistakes, syntax errors, and bugs in your code, provided you ask the right questions. Since it is conversational and can reference up to 3,000 words from your current conversation, you can have a detailed back-and-forth. Check out this example from the OpenAI blog.

Not only can you check your code for bugs, you can also ask ChatGPT to improve your code quality. It can generate alternative algorithms along with explanations for its suggestions.
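Here is a contrived example of the kind of bug ChatGPT typically spots when asked “why is this average wrong?”: an off-by-one in the divisor, shown alongside the corrected version:

```python
# A contrived bug of the sort ChatGPT catches in a debugging chat.

def mean_buggy(values):
    # Bug: divides by len(values) - 1 instead of len(values),
    # so the result is inflated (e.g. [2, 4, 6] -> 6.0, not 4.0).
    return sum(values) / (len(values) - 1)

def mean_fixed(values):
    # Corrected: divide by the actual number of values.
    return sum(values) / len(values)
```

The conversational loop matters here: after the first fix you can follow up with “what happens on an empty list?” and iterate toward handling that edge case too.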

Existing LLMs like ChatGPT are limited to text generation based on human interaction. They are general-purpose chatbots that are useful for generic tasks. However, there is massive potential to leverage generative AI to build new products focused solely on the challenges facing DevOps teams. That’s what we’ve built at Kubiya.

Kubiya - Leveraging generative AI for DevOps self-serve

Kubiya gives DevOps practitioners a powerful toolset that delivers intent- and action-based experiences to kickstart automated workflows. At Kubiya, we have trained various LLMs on our workflow structure, and they can now generate complex workflows in seconds. From creating Lambda functions to deploying K8s resources - or pretty much any resource in AWS - Kubiya can do it all. This helps operations teams create workflows with little to no effort, while ensuring security and customization at scale.

At the other end of the software delivery pipeline, developers can make requests for resources in English and Kubiya’s LLMs (using vector embeddings) are able to determine their intent and deploy the required resources in seconds to minutes. This is incredible when seen in action or experienced firsthand.

Here’s an explanation of how Kubiya leverages generative AI to build a powerful self-service platform for DevOps:

https://youtu.be/O4yRCV6WU_M

Kubiya’s AI platform - which perfectly mixes generative and conversational AI - can empower developers to have self-service access to resources they need to ship their code into production safely and predictably. It minimizes risk and the possibility of errors. In addition, it can also trigger a peer review before committing changes and send real-time alerts for proactive action when something goes wrong.

Kubiya can:

  • Give your development team self-serve access to infrastructure, operational workflows, and knowledge directly from Slack. It does this by leveraging conversational AI to understand the requirements of a developer based on their text input in Slack.
  • Quicken infrastructure management using pre-built workflows and templates that are created using generative AI. These workflows are ready to be deployed, but can also be customized to any extent to ensure they meet any niche use case.
  • Democratize access and permissions to data and common organization-wide knowledge.
  • Manage access and permissions at a granular level while monitoring all user activity.

Kubiya features a bi-directional feedback loop based on the RLHF training method to build domain-specific knowledge. One significant benefit of Kubiya’s generative AI self-service platform is user personalization: its LLMs continuously train on user interactions.

Meet your new DevOps teammate

Think of Kubiya as your DevOps teammate that enables your team to simplify complex operations through human-like conversations. Although it features ChatGPT-like functionality, Kubiya goes beyond that, actually implementing and making changes to production clusters. It simultaneously delivers a self-service experience for developers while reducing the burden on the Ops team - all the while ensuring simple and secure DevOps workflow automation.
Learn more about Kubiya here.
