How big is 'large' here?
I built a simple CRUD inventory program to keep track of my gaming backlog and progress, and the dumped JSON of all 500+ game statuses is under 60 kB and imports in under a second on decade-old hardware.
I'm having difficulty picturing a JSON dataset big enough to slow down modern hardware. Maybe Gentoo's Portage tree, if it were JSON-encoded?
I've been in the industry for a while. I've probably left more than one client site muttering "I've seen some things ...".
If it can be done, it will be done. And often in a way that shouldn't have even been considered at all.
Many times, "it works" is all that is needed. Not exactly the pinnacle of software design. But hey, it does indeed "work"!
Here is the Anthem page. The table-of-contents link is 16 GB:
https://www.anthem.com/machine-readable-file/search/
They are complying with the mandate, but not optimizing for the parsers.
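
A file that size pretty much rules out a naive json.load, which would try to materialize the whole document in memory; you'd want a streaming parser instead. A minimal sketch with Python's ijson library, assuming the file follows the CMS table-of-contents schema with a top-level "reporting_structure" array (the filename and key path are assumptions, not verified against Anthem's actual dump):

    # Stream a multi-gigabyte JSON file entry by entry with ijson,
    # keeping memory bounded by the size of one entry rather than
    # the whole 16 GB document.
    import ijson

    with open("anthem_toc.json", "rb") as f:
        count = 0
        # "reporting_structure.item" walks each element of that array;
        # the key name is an assumption about this file's layout.
        for entry in ijson.items(f, "reporting_structure.item"):
            count += 1
        print(f"reporting structures: {count}")

Even streamed, 16 GB takes a while to chew through, but at least it completes instead of exhausting RAM.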