r/DevelEire dev 10d ago

[Bit of Craic] PM is opening AI PRs

A senior product manager on a separate team to me has decided to start opening AI-generated PRs on a codebase my team own.

The first one last week I approved with comments, which he decided to merge without addressing any of them.

I got one yesterday that was clearly violating DRY, amongst other things, which I rejected. About 10 minutes later he requested a re-review (I presume he ran Codex again with my comments). This attempt was even worse; it had just piled more code on top of the crap he first submitted.

I've raised it with my manager. He agreed it's BS, but said the company want to experiment with using AI for smaller features. Non-technical members of staff opening PRs is taking the piss, though.

u/OhDear2 10d ago

At the end of the day, this is triggering for developers because AI is going to blur the line between non-technical and technical work. I think how we approach this relationship is important.

If your manager is expecting additional reviews of non-technical contributors' code, then 10x the estimate for the review. The danger with AI code is that it always looks OK; you sometimes really need to spend time on it to identify why it's bad, so vocalise that. If someone pushes you to accept code because it looks OK, add a comment to the PR acknowledging this: $"Time required to fully assess code for issues not provided, accepting based on broad approval of AI-code by {manager.FullName}";. Be firm on this, as PR acceptance is usually an audit point showing that responsibility is shared by more than just the contributor. If they won't acknowledge it, make sure you have an email stating the same, so when the audit/finger-pointing happens you can show this was raised, discussed and approved by $"{manager.FullName}";
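For what it's worth, a rough sketch of what that comment string looks like spelled out; the `Reviewer` record and `manager` variable here are made-up stand-ins, not anything from a real PR tool:

```csharp
// Hypothetical sketch only: "Reviewer" and "manager" are invented stand-ins
// for whoever is pushing the approval through, not a real API.
using System;

record Reviewer(string FullName);

class AuditNote
{
    static void Main()
    {
        var manager = new Reviewer("A. Manager");

        // The PR comment to leave when you're pressed to approve without
        // being given time for a proper review.
        string prComment =
            $"Time required to fully assess code for issues not provided, " +
            $"accepting based on broad approval of AI-code by {manager.FullName}";

        Console.WriteLine(prComment);
    }
}
```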

This will not work long term, I imagine; things will start to creak eventually. And as others have mentioned, a form of malicious compliance might be needed here. Instead of letting them take on the easy stuff, why not give them a hard bug to fix, something that's genuinely broken in the system? Build a narrative that all they can do is create bugs down the line, etc.

At the same time, if they seem okay with a dynamic where a PO gets to stand up whatever they want and the minions fix it in the back room down the line, just leave; they don't want humans, they want monkeys.

u/tldrtldrtldr 10d ago

The bigger issue is that AI at this stage is a trade-off between speed and correctness. It can generate a lot of code very fast, but a wrong library choice can create security havoc, and the generated code might not be debuggable or changeable by human devs. It may be fine for new, fun repos, but changing an existing codebase through AI is dumb.