Last week, news broke that Dr. Madhu Gottumukkala, acting director of the Cybersecurity and Infrastructure Security Agency (CISA), uploaded “For Official Use Only” government contracting documents into the public version of ChatGPT.

The upload bypassed DHS-approved AI tools and triggered internal security alerts.

No malware, foreign adversary, or sophisticated hack. Just a smart professional using the fastest tool available.

And to be clear, this wasn’t ignorance. Dr. Gottumukkala holds degrees in engineering, business, and information systems. This wasn’t a lack-of-knowledge problem. It was a speed and judgment problem.

Most data exposure today doesn’t come from bad actors. It comes from convenience, urgency, and the occasional brain freeze.

Public AI tools are incredible, but once sensitive material touches them, control is gone. The data now sits on servers you don’t own, under consumer terms that may allow it to be used for training. No mystery, just how the plumbing works.

In my world, the assumption is that damaging incidents involve someone breaking in. More often, they involve trusted people trying to move faster than good sense would recommend. Once again: the biggest security failures don’t come from malice. They come from confidence.

The takeaway isn’t “don’t use AI,” it’s “know the difference between decision support and data surrender.”

Speed without guardrails heightens risk, and if this can happen at the top of a federal security agency, it can happen anywhere.
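What does a guardrail look like in practice? One simple layer is a pre-submission check that refuses to hand marked material to a public AI endpoint. A minimal sketch in Python, with an illustrative marking list (any real deployment would use the organization’s own marking taxonomy and a proper DLP engine, not this hypothetical `safe_for_public_ai` helper):

```python
import re

# Illustrative sensitivity markings -- not any agency's actual policy list.
SENSITIVE_MARKINGS = [
    r"for official use only",
    r"\bfouo\b",
    r"controlled unclassified information",
    r"\bcui\b",
]

def safe_for_public_ai(text: str) -> bool:
    """Return False if the text carries any known sensitivity marking."""
    lowered = text.lower()
    return not any(re.search(pattern, lowered) for pattern in SENSITIVE_MARKINGS)

# Gate the call to a public AI tool on the check.
doc = "FOR OFFICIAL USE ONLY -- contract pricing schedule"
if not safe_for_public_ai(doc):
    print("Blocked: document carries a sensitivity marking.")
```

A keyword check like this is crude and easy to defeat; its value is as a speed bump that forces a moment of judgment before the data leaves the building.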

The real exposure isn’t technology, it’s the human turning the knobs.