Show HN: Sink 5000 rows/sec from Kafka -> Postgres using Python and 250MiB of memory

sql-flow.com


Hello! I've been building a Flink alternative on top of DuckDB.

Building on DuckDB makes it easy to leverage all of the integrations DuckDB already supports.
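
For context, here's a minimal sketch of the kind of integration I mean: DuckDB's postgres extension lets plain SQL write straight into Postgres. The connection string and table name below are illustrative, not SQLFlow's actual config:

    import duckdb

    con = duckdb.connect()
    con.execute("INSTALL postgres; LOAD postgres;")
    # Attach a local Postgres database through DuckDB's postgres extension
    con.execute("ATTACH 'dbname=mydb user=postgres host=127.0.0.1' AS pg (TYPE POSTGRES)")
    # Any DuckDB query can now write directly into Postgres tables
    con.execute("INSERT INTO pg.public.events SELECT * FROM read_json('events.json')")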

With batching, it's trivial to insert 5000 rows/second into a small Postgres instance running locally in Docker!
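
To make the batching idea concrete, here's a rough sketch (topic, table, and batch size are made up; this is not SQLFlow's actual implementation): buffer messages off Kafka, flush them to Postgres in one insert per batch, and commit offsets only after the flush:

    import json

    import duckdb
    from confluent_kafka import Consumer

    BATCH_SIZE = 5000  # illustrative; tune for your throughput/latency tradeoff

    con = duckdb.connect()
    con.execute("INSTALL postgres; LOAD postgres;")
    con.execute("ATTACH 'dbname=mydb user=postgres host=127.0.0.1' AS pg (TYPE POSTGRES)")

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "sink-demo",
        "auto.offset.reset": "earliest",
        "enable.auto.commit": False,
    })
    consumer.subscribe(["events"])

    batch = []
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        batch.append(json.loads(msg.value()))
        if len(batch) >= BATCH_SIZE:
            # One multi-row insert amortizes the round trip to Postgres
            con.executemany(
                "INSERT INTO pg.public.events VALUES (?, ?)",
                [(r["id"], r["payload"]) for r in batch],
            )
            # Commit offsets only after a successful flush: at-least-once delivery
            consumer.commit(asynchronous=False)
            batch.clear()

Committing after the flush means a crash mid-batch replays messages rather than losing them, and the batch size trades memory for per-row insert overhead.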

Would love your thoughts and feedback, thank you!

What do your data stacks look like for this kind of non-differentiating data work, like connecting streams and sources together?