Look up OLTP vs OLAP data stores to get an idea. There are a lot of common patterns for the specifics of implementing this. Usually you run a regularly scheduled job that dumps data representing some time period (e.g. daily jobs). There are some considerations for late arriving data, which is a classic DE interview question, but for the most part, big nightly dumps of the last day’s data/transactions/snapshots to date-partitioned columnar stores using an orchestration engine like Airflow is sufficient for 99% of use cases.
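The nightly-dump pattern above can be sketched in plain Python. This is an illustrative toy, not a real pipeline: sqlite3 stands in for the transactional database, a CSV file stands in for a columnar format like Parquet, and the `transactions` table, column names, and output path are all made up for the example. In production the function body would be a task in an Airflow DAG, with the run date supplied by the scheduler.

```python
import csv
import sqlite3
from datetime import date, timedelta
from pathlib import Path

def dump_daily_partition(conn, run_date, out_root):
    # Write one date-partitioned file (Hive-style dt=YYYY-MM-DD layout)
    # containing all rows from the transactional store for run_date.
    part_dir = Path(out_root) / f"dt={run_date.isoformat()}"
    part_dir.mkdir(parents=True, exist_ok=True)
    rows = conn.execute(
        "SELECT id, amount, created_at FROM transactions WHERE created_at = ?",
        (run_date.isoformat(),),
    ).fetchall()
    with open(part_dir / "part-0000.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["id", "amount", "created_at"])
        w.writerows(rows)
    return len(rows)

# Toy transactional store standing in for the OLTP database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (id INTEGER, amount REAL, created_at TEXT)")
conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)", [
    (1, 9.99, "2024-06-01"),
    (2, 5.00, "2024-06-01"),
    (3, 12.50, "2024-06-02"),
])

# The nightly job processes the *previous* day's data.
run_date = date(2024, 6, 2) - timedelta(days=1)
n = dump_daily_partition(conn, run_date, "/tmp/warehouse/transactions")
print(n)  # 2 rows landed in dt=2024-06-01
```

Note that partitioning by `created_at` is exactly where late-arriving data bites: a row for June 1 that shows up on June 3 misses its partition, which is why real pipelines often reprocess a trailing window of partitions rather than only the most recent day.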
Tangent: I hate OLTP and OLAP as acronyms. They're only one letter/word apart and completely obscure the relevant meaning behind a lot of semantic noise. Just say transactional vs analytical processing. (They're still good search terms, though, because lots of existing literature/resources use them.)