I once interviewed for a COBOL position at a typical large financial institution. It was a lead position: they wanted a plan for migrating out, and the role would then grow into a team for me. I have a lot of systems integration and information management knowledge, and I have planned multi-year projects and migrated out of large systems like this before. The interview went well, and afterwards they showed me around the place and what they were doing.
I sat down again in the office with the hiring director, and I enquired how long he had been with the company and what he had been working on. Turns out he had been hired five years earlier and had already attempted this migration once before without success. I knew without a doubt why they hadn't succeeded, and it had nothing to do with COBOL.
Financial companies are highly risk averse. COBOL developers know this. COBOL developers know that if the shop isn't COBOL, their job is at risk. So COBOL developers will constantly raise "what about this risk" issues, which then must be considered in committee; thus, the company is eternally paralyzed. The fact that the director hadn't gotten anywhere in five years meant nothing was going to change.
As I recall, I checked in a few years later and they were still in the same place. I suspect it might have been a decent place if all I wanted in life was a paycheck. They were nice people.
I suspect the way forward for COBOL is to modernize the development tools around it, rather than rewrite the code itself.
The financial companies running COBOL are indeed risk averse. Generally it's quite boring, staid retail banking or the like. If you're a developer who sits on a trading desk writing code which goes straight into production for a trader, things get much more risky.
Two things shocked me at the time.
1. The company had a LOT of systems running in COBOL, for which they had lost the source code to the vagaries of time.
2. One of the big projects at the time existed because of that missing source and the devaluation of the Italian Lira: they needed a "splitter"/"unsplitter" to take particularly big bond trades in Lira whose total price overflowed the (e.g.) `PIC 9(9)` data clause, split them up into smaller trades, run those through the existing clearance system, and re-aggregate them into one trade after clearing.
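To make the overflow concrete: a `PIC 9(9)` field holds at most nine decimal digits, i.e. 999,999,999. Here's a minimal Python sketch (purely illustrative, not the bank's actual code; all names are made up) of the split-and-reaggregate idea:

```python
# A COBOL PIC 9(9) field can hold at most nine decimal digits.
PIC_9_9_MAX = 999_999_999

def split_trade(total_lira: int) -> list[int]:
    """Split a trade total into chunks that each fit in PIC 9(9)."""
    chunks = []
    remaining = total_lira
    while remaining > 0:
        chunk = min(remaining, PIC_9_9_MAX)
        chunks.append(chunk)
        remaining -= chunk
    return chunks

def unsplit_trades(chunks: list[int]) -> int:
    """Re-aggregate the cleared sub-trades back into one total."""
    return sum(chunks)

# A bond trade in Lira too big for the legacy field:
big_trade = 2_500_000_000
parts = split_trade(big_trade)          # three sub-trades, each <= PIC_9_9_MAX
assert all(p <= PIC_9_9_MAX for p in parts)
assert unsplit_trades(parts) == big_trade
```

The real system of course had to preserve far more than a total (counterparties, settlement dates, trade IDs for the re-aggregation step), but the arithmetic constraint is exactly this.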
You've then successfully ported your code base from one language for which it's really difficult to find programmers to another language for which it's really difficult to find programmers.
I'd love to help them along because the workload is getting rough. Are there any tips you can share about finding them? Where do you post ads, what's the process like, remote or in person? Anything would help. Thank you!
If your first response to this is "but that would cost a fortune!", you are completely correct. The biggest illusion in software is that a "complete" program is now "done" and will need no more investment. Software is a machine for producing information, fundamentally not much different from a lathe or a textile loom. If your company depends on a piece of software to continue existing, it is now de facto a software company. Many institutions have not yet realized this and will run into existential problems within a few decades.
Disclaimer: Neither a fanboi nor a user; I just read that potential users thought those libs were abandoned, since they haven't been updated in years.
For example, the COBOL might potentially have been on an ICL VME system. This will undoubtedly have a lot of support code written in things like SCL (System Control Language), and potentially even 68k assembly.
There are companies which specialise in migrating COBOL code from legacy environments. This can involve migrating the COBOL code itself to a different variant and rigorously testing it, and then often introducing utilities which replace operating system functionality which was present on the original mainframe.
The alternative is to "salami slice" bits of functionality into modern services written afresh, and slowly work towards decommissioning the old setup, but often this has to be done at the same time as a migration to modern hardware, due to the scale of the systems involved and the time taken for these sorts of projects to complete.
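The "salami slice" approach is essentially the strangler-fig pattern: a thin routing layer sends already-migrated operations to new services while everything else still hits the legacy system. A toy Python sketch (all names hypothetical) of that routing decision:

```python
# Strangler-fig routing sketch: as slices of functionality are rewritten,
# they're added to MIGRATED; everything else falls through to the legacy
# system. Handlers here just return labels for illustration.

def legacy_handler(op: str, payload: dict) -> str:
    return f"legacy:{op}"

def new_payments_handler(op: str, payload: dict) -> str:
    return f"new:{op}"

# Grows one slice at a time until the legacy side can be decommissioned.
MIGRATED = {
    "create_payment": new_payments_handler,
}

def route(op: str, payload: dict) -> str:
    handler = MIGRATED.get(op, legacy_handler)
    return handler(op, payload)

assert route("create_payment", {}) == "new:create_payment"
assert route("post_ledger", {}) == "legacy:post_ledger"
```

The hard part in practice isn't the routing, it's keeping data consistent between the two sides while both are live, which is why these projects take as long as they do.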
Transpiling, to more or less any language, is the easy bit. By a very, very long way. The hard bit is everything else. I mean, do they really think they're the first people to think about using a transpiler over the last 30/40/50 years?
If they had said that they'd used their transpiler on a real production codebase and their transpiled code was now running in production in any volume, I'd be quite impressed. But they didn't say that because it's blatantly obvious from the tone of the article that that will never occur.
Now, I don't have any issue with people being ignorant of a subject that they don't understand. That's the norm for most people for most things. But I have to say that this level of naivety doesn't reflect well on their company. They clearly made no effort to understand the problem that they are purporting to solve. Is that really how they function as a company?
Perhaps we should consider how we can regularly rewrite different parts of our software ecosystems using new languages and approaches while preserving the essential knowledge and security aspects of what we have already built?
The important part isn't the language that the code is written in, it is the full understanding of the problem and thus the solution provided.
Being around for the frontend engineering revolution taught me what it means for an industry to forget how to build, and have to re-learn many already hard learned lessons.
But maybe I'm looking at it wrong, and it was that re-learning process that was the re-building after all? I don't know. All I do know is that you rarely get enough time in software to master your tools before everything changes, and you have to build the same old software requirements in brand new ways.
Every ~6 years they completely rewrite Basecamp from the ground-up and sell it as a new product management tool. They are currently on rewrite #3 and rewrite #4 will be released next year.
https://signalvnoise.com/posts/3856-the-big-rewrite-revisite...
The only difference to your analogy is that they continue to support the old versions of Basecamp, so that if a customer doesn't want to migrate to the current version, they don't have to.
At this point in time, the mistake is to hear "COBOL" and think about a programming language rather than environments where they've struggled to manage technical debt and staffing issues for decades. It's not that COBOL developers can't produce good code, but rather that places which could modernize effectively did so 3+ decades ago. If some place makes the news now because of a COBOL-based system, COBOL is a symptom of a management failure.
I've been unfortunate enough to witness first-hand what ~5,000,000 lines of Natural Adabas code transpiled into ~15,000,000 lines of Java code looks like, and it was not pretty.
It was a municipal management system that grew over time (decades).
At some stage a government standards body raised the legacy risk inherent in the Natural Adabas codebase.
Some (pretty impressive) 3rd party tool was used to transpile the codebase to a more "modern Java web application architecture".
The code wasn't idiomatic Java and used some extension libraries to supply missing capabilities. For example I don't recall ever seeing any other Java system that uses continuations.
I spent an entire day tracing one field from the front-end to the database. I documented the components, flows, logic and db structures and then asked one of the team leads to verify whether I got it right. She confirmed that I did. I then asked what the code in question actually does, functionally.
She replied something to the effect of "Good heavens, I don't know either - you'll need to speak to one of the functional guys and then spend some time with the Natural code from before the transpilation".
She needed about 6 months to get a new Java dev to a level where they could handle minor feature requests or bug fixes. Understandably, her developer retention was quite low; they tended to leave quickly after spending time with the system.
Business didn't seem to get why the velocity for new features was so slow.
They didn't have an architect, and documentation was fragmentary.
Previous modernisation initiatives had already become obsolete as well: IIRC the App server was Glassfish and front-end (web) components were using some JBoss widget library that was already deprecated at the time.
Like I said, I left quickly.
Phase 1: automation of COBOL to (ugly) Java
Phase 2: make the code more idiomatic for (cleaner) Java
Phase 3: move the infrastructure to AWS
In the discussion, someone mentioned "There are commercial COBOL compilers available that compile to Java bytecode."
Offhand, if someone were still on COBOL, I'm not sure they'd trust the comparatively new Elixir, but I can also see leapfrogging being a thing. (And I know it's based on Erlang, which pre-dates Java.)
[0] https://stuff.mit.edu/afs/sipb.mit.edu/project/lisp2cobol/l2...
That answers the "what if" question. The project itself is interesting.
Sorry, no https available.
Also, I haven't seen any mention of the other bits of the COBOL ecosystem here: no DB2, IDMS, IMS DB/DC, VSAM, CICS, REXX, or JCL. These are a big part of why COBOL applications are sticky.