For example, JSON describes a logical structure of nested lists and dictionaries. If you were doing data-level programming, you would just map the JSON into actual nested lists and dictionaries and get on with your business.
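A rough sketch of what I mean, in TypeScript (the payload and field names here are invented):

    // Data-level style: parse the JSON and work with the nested
    // structures directly, touching only the parts you care about.
    const doc = JSON.parse('{"user": {"email": "a@b.c"}, "items": [1, 2, 3]}');
    console.log(doc.user.email);     // "a@b.c"
    console.log(doc.items.length);   // 3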
The alternative, which is more common in static languages like Java, is to transform it all into some set of domain model objects, and probably validate it all up-front, too, even the bits you don't actually need to look at to accomplish the job at hand. IMO, that approach tends to create a lot of unnecessary work for oneself. It also makes it harder to obey Postel's law.
(The corollary to that last bit is that it is also possible for static typing to create bugs.)
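To make the contrast concrete, the domain-object style looks something like this (again a sketch with made-up names): the whole shape gets declared and validated before any business logic runs, whether or not the caller cares about every field.

    // Domain-object style: declare the full shape and validate it up front.
    interface User {
      email: string;
      plan: string;        // checked even when the caller never reads it
      lastLogin: string;
    }

    function parseUser(raw: string): User {
      const data = JSON.parse(raw);
      for (const field of ["email", "plan", "lastLogin"]) {
        if (typeof data[field] !== "string") {
          throw new Error(`missing or invalid field: ${field}`);
        }
      }
      return data as User;
    }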
A personal example of this: httpd used to accept standard headers with spaces instead of dashes, which leads to strange behavior if you accidentally include both. So they decided to stop doing that in a major version. That major version was then pulled into our base images by ops without anyone noticing, which led to a very long day of debugging on our end.
The point is, by being liberal in what you accept, you create ambiguity that you may not fully understand at the time. Once that behavior is out in the wild, you're basically forced to keep the ambiguous, undocumented spec alive, or you will inevitably end up breaking some client.
I'm talking about situations where the JSON is formatted fine; it's just that some field wasn't specified, so the entire input gets rejected, even though there was zero need to read the contents of that field in the first place. It just happened to be included in some domain object that gets reused everywhere, including other places where the field's contents do matter.
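Concretely, with the parseUser sketch from earlier (still invented names): the strict parser throws on input that omits plan, even for a code path that only ever reads email, while a check scoped to the fields actually read accepts the same input.

    // The strict shared parser rejects this, because `plan` is absent,
    // even though nothing below ever reads it:
    //   parseUser('{"email": "a@b.c", "lastLogin": "2021-01-01"}');

    // Checking only what this code path reads accepts the same input fine.
    function welcomeEmailTarget(raw: string): string {
      const data = JSON.parse(raw);
      if (typeof data.email !== "string") {
        throw new Error("missing or invalid field: email");
      }
      return data.email;
    }

    welcomeEmailTarget('{"email": "a@b.c", "lastLogin": "2021-01-01"}');   // "a@b.c"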
Keep in mind that, when we're dealing with anything that might be transmitted as JSON, expecting there to be a published spec, and expecting it to accurately cover all these details, is really optimistic. I've honestly never seen it happen in the wild. Oftentimes, any validation rules you might try to impose are as much guesswork as anything else. So complaining that a piece of data didn't conform to the spec might not even be a valid thing to do; all you can say for sure is that the data didn't meet the needs of some piece of business logic.
It's not perfect, but it's life. This tension, for example, is at the heart of why proto2 got replaced with proto3 (which dropped required fields entirely), and why using proto3 is strongly encouraged if you're looking to build robust infrastructure.
One way is to treat the JSON as a generic JSON structure and traverse it manually. Of course, you then have to be explicit about what should happen when children are of a different type than you expect, though that explicitness could be as simple as throwing an exception or ignoring the value. Haskell's Aeson and Rust's serde_json both support this, as does .NET's JsonElement type.
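In TypeScript terms, that manual traversal looks roughly like this (a sketch; the document shape is invented): the parse result is treated as unknown and narrowed by hand, with the unexpected cases handled explicitly.

    // Walk the parsed JSON explicitly, deciding at each step what to do
    // when the shape isn't what you expected.
    function getEmail(doc: unknown): string | undefined {
      if (typeof doc !== "object" || doc === null || Array.isArray(doc)) {
        return undefined;                    // or throw, if you'd rather be loud
      }
      const value = (doc as Record<string, unknown>)["email"];
      return typeof value === "string" ? value : undefined;
    }

    const parsed: unknown = JSON.parse('{"email": "a@b.c", "extra": {"stuff": [1, 2]}}');
    getEmail(parsed);   // "a@b.c"; the rest of the document is never inspected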
Unfortunately, this means you're passing around a lot of objects called something like "JSON" without any information at the type level about what they contain. As a middle ground between that approach and creating domain objects, there are row-polymorphic records, which let you write functions that accept any record that has certain fields while also specifying that it may contain other fields you don't handle. This lets you program against what you know about the types you're ingesting without having to write a lot of new types.
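TypeScript doesn't have row polymorphism proper, but a generic bound over a structural type gives a similar feel (a sketch with invented names): the function names only the fields it reads, accepts anything that has at least those fields, and the caller's extra fields survive with their types intact.

    // Accepts any record with at least `email`; extra fields pass through
    // unchanged because the parameter and return type are both T.
    function notify<T extends { email: string }>(user: T): T {
      console.log(`notifying ${user.email}`);
      return user;
    }

    const user = { email: "a@b.c", plan: "free", lastLogin: "2021-01-01" };
    const same = notify(user);
    console.log(same.plan);   // still known to the type checker; nothing was widened away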
They're orthogonal concerns. C is statically and weakly typed. Clojure is dynamically and strongly typed. PHP is dynamically and weakly typed. Haskell is statically and strongly typed. Java, as the most design-by-committee language ever, manages to be a mix of all four.