It is not a "small" cost. The cost is proportional to the size of the data exported.
For all intents and purposes, large amounts of data are locked into Snowflake. Is it theoretically possible to export a petabyte out of Snowflake? Sure.
Do I want to spend the money to do it? Not really. That is what I mean by "the data doesn't come out".
"Exporting" a petabyte out of Databricks is a no-op. I can already read Deltalake from other open source tools.