Look, I’m not a developer, and the last time I truly “wrote code” was a good number of years ago (and it was probably Perl, so you may hate me). I am also not an appsec expert (as I often remind people).
Below I am describing my experience “vibe coding” an application. Before I go into the details of my lessons — and before this turns into a complete psychotherapy session — I want to briefly describe what the application is supposed to do.

We have a podcast (Cloud Security Podcast by Google), and I often feel that old episodes containing useful information aren’t being listened to and the insights from them go to waste. At the same time, for many organizations today, the answer to their current security problems may well have been discussed and solved in 2021. This may be strange to some, but for many organizations, the future is in the past. Somebody else’s past!
So I wanted “a machine” that turns old episodes into role-specific insights, without too much work by a human (me). This blog is a reflection on how things went.
First, my app uses public data — namely podcast transcripts and audio — to create other public data (social media posts). Since the inputs and outputs are both public, this made me feel at peace with vibe coding. Naturally, I needed to understand how the app would be coded, where it would live, and what I should do to make it manifest in the real world. So I asked Gemini, and it suggested I use AI Studio by Google, and I did (non-critically) exactly that.
When I started creating the app, the question of storage immediately came up. Jumping a little bit ahead, you will see that authentication / credentials and storage were two security themes I reflected on the most.
You want to read a file from storage, but what storage? More importantly, whose storage? At this point, I had my first brush with anxiety of the “vibe process.” I didn’t want to just vibe code without a full understanding of the data access machinery. I immediately said, “No, I don’t want to store data in my Google Drive using my credentials.” I just didn’t trust it.
In fact, I didn’t trust the app with any credentials for anything — work or personal — at all! Given that I have public data, I decided to store it in a public web folder. AI Studio suggested ways to store data that people might not fully understand, and this is my other reflection: if I’m not a developer, and I don’t know the machinery behind the app, how do I decide? These decisions are risk decisions, and “a citizen vibe coder” is very much not equipped to make them. Well, I sure wasn’t.
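The appeal of the public-web-folder decision is that reading public data needs no tokens or authenticated access at all. A minimal sketch of what that looks like in practice — the folder URL and file layout here are made up for illustration, since the post doesn’t name the real location:

```python
from urllib.parse import urljoin
from urllib.request import urlopen

# Hypothetical public web folder holding the transcripts;
# the actual location is not disclosed in the post.
BASE_URL = "https://example.com/podcast-transcripts/"

def transcript_url(episode_id: str) -> str:
    """Build the URL of one episode's transcript in the public folder."""
    return urljoin(BASE_URL, f"{episode_id}.txt")

def fetch_transcript(episode_id: str) -> str:
    """Fetch a transcript over plain HTTPS.

    No credentials are involved anywhere: because both the inputs and
    the outputs of the app are public, there is nothing to leak even if
    the vibe-coded app misbehaves.
    """
    with urlopen(transcript_url(episode_id)) as resp:
        return resp.read().decode("utf-8")
```

The security property lives entirely in the design choice, not the code: there is simply no secret for the app to mishandle.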
So what are the security implications of the decisions a developer makes — sometimes guided by AI and sometimes on their own? Can I truly follow an AI recommendation that I don’t understand? Should I follow it? If you don’t understand what happens, I can assure you, you certainly do not understand the risks!
As a result, I did not trust the app with any credentials or authenticated access. Of course, a solution may have been to use throwaway storage with throwaway credentials, but I think I do not need this in my life… Anyhow, many actions that you take during vibe coding, whether suggested by AI or not, have security implications.
In addition, the app interacts with the environment. If the app is being built in a corporate environment, it interacts with corporate security “rules and tools,” and some things you may want to do wouldn’t work. I’m not going into details, but I had a couple of examples of that. If you vibe code at work, and you are doing it through, let’s say, shadow AI, there will be things your AI (and you) would want to do, but your employer’s security would not allow. And often with good reasons too! So you ask AI for more ways and hope it won’t say “just disable the firewall.”
The next conundrum, apart from storage, was output quality. What about quality and those hallucinatory mistakes? Now, I know my app uses an LLM to condense a summary of the podcast transcript into brief insights.
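One modest defense against hallucinated output is to constrain the prompt itself. A sketch of what the condensing step might look like — the role list and prompt wording here are my own illustration, not the app’s actual prompt:

```python
# Hypothetical target roles; the post only says "role-specific insights".
ROLES = ("CISO", "cloud architect", "SOC analyst")

def build_insight_prompt(transcript: str, role: str) -> str:
    """Build a prompt asking an LLM to condense a podcast transcript
    into short insights for one role.

    The instruction to use only claims made in the transcript is a
    grounding hint -- it reduces, but does not eliminate, hallucination.
    """
    if role not in ROLES:
        raise ValueError(f"unknown role: {role}")
    return (
        f"You are writing for a {role}. From the podcast transcript "
        "below, extract three short, concrete insights relevant to that "
        "role. Use only claims actually made in the transcript; do not "
        "invent facts or vendors.\n\n"
        f"TRANSCRIPT:\n{transcript}"
    )
```

The prompt string would then be sent to whatever model the app uses; the point is that quality control starts before the model ever answers.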
[…]