We're also in an industry with a ton of competitors.
On top of that, the company was founded by some very junior engineers; for most of them, this was their first or second job out of college. Literally every anti-pattern is in our codebase, and they consider many of them best practices. Unit tests were perceived as a cost with little benefit, so none were written. To save money, new hires were almost always new grads.
These facts combined make for an interesting environment.
For starters, leadership is afraid to ship new code, or even refactor existing code: partially because nobody knows how it works, and partially because there are no unit tests to verify that a change is safe. All new code has to be gated behind feature flags (there's an experiment running right now just to switch from try-finally to try-with-resources). If there isn't a business reason to add code, it gets rejected (I had a PR rejected that removed a "synchronized" block from around "return boolValue;"). And it's hard to say they're wrong. If we push out a bad release, there's a very real chance our customers will pack up and migrate to one of our competitors. Why risk it?
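To give a sense of how small the flagged change is: the try-finally to try-with-resources switch is the same cleanup logic, just with the close handled by the compiler. A minimal sketch (the reader/helper names here are my own illustration, not from our codebase):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

class ResourceDemo {
    // Old style: manual cleanup in a finally block.
    static String readFirstLineOld(Reader source) throws IOException {
        BufferedReader reader = new BufferedReader(source);
        try {
            return reader.readLine();
        } finally {
            reader.close();
        }
    }

    // New style: try-with-resources closes the reader automatically,
    // even if readLine() throws.
    static String readFirstLineNew(Reader source) throws IOException {
        try (BufferedReader reader = new BufferedReader(source)) {
            return reader.readLine();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readFirstLineOld(new StringReader("hello\nworld")));
        System.out.println(readFirstLineNew(new StringReader("hello\nworld")));
    }
}
```

Both forms behave identically for a single resource; the second is just harder to get wrong, which is exactly why it's strange that it needs an experiment.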
The team's experience level plays a role too. With so many junior engineers and so much skill inbreeding, our "best practices" have become pretty painful. Code is written without an eye toward future maintainability, and the class design is a gas factory mixed with a god object. It's not uncommon to trace a series of calls through a dozen classes, looping back through classes you've already looked at. Isolating chunks of the code is just as hard: I recently tried to isolate 6 classes and ended up with an interface exposing 67 methods of the god object, covering everything from logging to thread management to HTTP calls to state manipulation.
And because nobody else on the team has significant experience elsewhere, nobody else really sees the value of unit tests. They've all come up in an environment where unit tests are never mentioned, which has ingrained the idea that they're useless.
So the question is how do you fix this and move forward?
Ideally we'd start by refactoring a couple of these classes so they could be isolated and tested. Management doesn't see significant value in unit tests, and while they're not strictly against them, they are against refactoring, so we can't add tests to the risky code. The only places we could add them without pushback are the simplest utility classes, which would benefit from them the least, and testing only those would just prove to management that unit tests aren't valuable. And I mean the SIMPLEST utility classes: most of our utility classes require the god object so that we can log and read feature flags.
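For what it's worth, a first test on one of those dependency-free utilities doesn't even need a framework, which removes one more objection. A sketch, assuming a hypothetical `PathUtils` (names invented for illustration):

```java
// Hypothetical utility: the kind of simple, god-object-free class
// where adding a test would face the least pushback.
final class PathUtils {
    private PathUtils() {}

    // Joins two path segments with exactly one '/' between them,
    // regardless of trailing/leading slashes.
    static String join(String base, String segment) {
        String trimmed = base.endsWith("/") ? base.substring(0, base.length() - 1) : base;
        String suffix = segment.startsWith("/") ? segment.substring(1) : segment;
        return trimmed + "/" + suffix;
    }
}

// A hand-rolled test: no JUnit dependency, runs as a plain main().
class PathUtilsTest {
    public static void main(String[] args) {
        check(PathUtils.join("conf/", "/app.yml").equals("conf/app.yml"), "both slashes");
        check(PathUtils.join("conf", "app.yml").equals("conf/app.yml"), "no slashes");
        System.out.println("all checks passed");
    }

    static void check(boolean ok, String name) {
        if (!ok) throw new AssertionError(name);
    }
}
```

It's not much, but a green check on every build is the cheapest possible demonstration that tests catch regressions; the danger, as noted above, is that trivial tests on trivial code are also the weakest possible demonstration.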
I say we take off and nuke the entire site from orbit (start over from scratch with stronger principles). It's the only way to be sure. But there's no way I'm convincing management to let the entire dev team have the year they'd need to do that with feature parity, and leadership would only see it as a massive number of bugs to fix.
In the meantime developer velocity is slowing, but management seems to see that as a good thing. Slower development translates into more stable code in their minds. And the company makes enough that it pays well and can't figure out what to do with the excess money. So nobody really sees a problem. Our recruiters actually make this a selling point, making fun of other companies that say their code is "well organized".