pletnes
1mo ago
Same here. I have, however, seen a few out-of-memory cases in the past when given large input files.
jastr
1mo ago
By default, it tries to take 80% of your memory. I've found that you need to set it to something much smaller in ~/.duckdbrc `set max_memory='1GB';`
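For reference, a minimal `~/.duckdbrc` along those lines might look like this (the CLI reads this file at startup; the `1GB` figure is just the value from the comment above, so tune it to your machine):

```sql
-- ~/.duckdbrc: executed by the DuckDB CLI on startup.
-- Cap the memory limit instead of the default ~80% of system RAM.
SET max_memory='1GB';
```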
skeeter2020
1mo ago
It's not the focus and not very performant, but you can have it spill to disk if you run out of memory. I wouldn't suggest building a solution around this approach, though; the sweet spot is workloads that fit in memory.
maxldn
1mo ago
Really? How large? I've only managed to crash it with hundreds or thousands of files so far, but I haven't had many huge files to deal with.