// Tutorial //

What is Debugging?

Published on February 1, 2021

In computing, debugging is the process of finding and resolving issues that prevent software from running correctly.

A software bug is an error or fault in a codebase that leads to unexpected results or unintended behavior. Because of this naming convention, the process of discovering and fixing bugs is referred to as debugging.
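To make the definition concrete, here is a hypothetical snippet containing a classic off-by-one bug: the function is intended to sum the numbers 1 through n, but the loop bounds are wrong, so it quietly produces a smaller total.

```python
def sum_first_n(n):
    """Intended to return 1 + 2 + ... + n."""
    total = 0
    for i in range(n):  # bug: iterates over 0..n-1 instead of 1..n
        total += i
    return total

# Expected 1 + 2 + 3 = 6 for n=3, but the buggy loop sums 0 + 1 + 2 = 3.
print(sum_first_n(3))
```

The fix is to change the loop to `range(1, n + 1)`. Bugs like this compile and run without complaint, which is why they must be tracked down through debugging rather than caught by the language itself.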

The famous historical precedent for referring to glitches as “bugs” comes from computer pioneer Grace Hopper’s account of a moth being trapped in the Harvard Mark II electromechanical computer, and subsequently being taped into the logbook. The etymology of “bug” in reference to an engineering defect predates this account from the 1940s, however, having been a part of jargon since at least the 1870s.

A number of tools and practices can be used to find bugs in software and resolve them through debugging. Tactics include interactive debugging, unit testing, integration testing, and monitoring. Software development tools, practices, and programming languages themselves offer support for debugging. To learn more, you can read how debugging is approached in Python in our series Debugging Python Programs.
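As a minimal sketch of one tactic mentioned above, unit testing, the example below uses Python's built-in unittest module. The function `divide` is a hypothetical example, not from any particular library; the failing or passing tests surface a bug's presence before users encounter it.

```python
import unittest

def divide(a, b):
    """A hypothetical function under test."""
    return a / b

class TestDivide(unittest.TestCase):
    def test_basic(self):
        # A passing test documents the expected behavior.
        self.assertEqual(divide(10, 2), 5)

    def test_zero_divisor(self):
        # Exercising an edge case: dividing by zero raises an error,
        # and the test records that this is the known behavior.
        with self.assertRaises(ZeroDivisionError):
            divide(1, 0)

# Run the suite programmatically and report the result.
result = unittest.TextTestRunner().run(
    unittest.TestLoader().loadTestsFromTestCase(TestDivide)
)
```

When a test fails, the report points to the exact assertion and inputs involved, which narrows the search for the underlying fault; from there, an interactive debugger such as pdb can be used to step through the failing code.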
