Somehow I have a different experience: anything beyond a simple `select few fields from t where k = ?` usually causes all kinds of errors and stability issues on a good-sized dataset.
You can be both right. I would say that ClickHouse is a focused system - it is really really good at the things it's good at, and it is not good at the rest.
Anecdote: I tested ClickHouse as a possible replacement for Trino+DeltaLake+S3 for some of our use cases.
When querying precomputed flat tables, it was easily 10x to 100x faster. When running the complex ETL to prepare those tables, I gave up: a CTAS with six CTEs that Trino computes on the fly in 30 seconds turned into six intermediate tables that took 40 minutes to compute, and I wasn't even halfway done.
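To make that rewrite concrete, here's a toy sketch of the two shapes of the pipeline, using SQLite as a stand-in for both engines (table and column names are invented for illustration): one CTAS with chained CTEs versus the same steps manually materialized as intermediate tables.

```python
# Toy illustration (SQLite, not ClickHouse/Trino): the same result computed
# two ways -- once as a single CTAS with chained CTEs, once with each CTE
# materialized as its own intermediate table. All names here are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, amount REAL);
    INSERT INTO events VALUES (1, 10.0), (1, 5.0), (2, 7.0);
""")

# One-shot CTAS with chained CTEs (the "30 seconds in Trino" shape).
conn.executescript("""
    CREATE TABLE result_cte AS
    WITH totals AS (
        SELECT user_id, SUM(amount) AS total FROM events GROUP BY user_id
    ),
    big_spenders AS (
        SELECT user_id, total FROM totals WHERE total > 8
    )
    SELECT * FROM big_spenders;
""")

# The same pipeline with every CTE turned into an intermediate table
# (the shape the ClickHouse attempt forced).
conn.executescript("""
    CREATE TABLE totals AS
        SELECT user_id, SUM(amount) AS total FROM events GROUP BY user_id;
    CREATE TABLE big_spenders AS
        SELECT user_id, total FROM totals WHERE total > 8;
    CREATE TABLE result_tables AS SELECT * FROM big_spenders;
""")

a = conn.execute("SELECT * FROM result_cte ORDER BY user_id").fetchall()
b = conn.execute("SELECT * FROM result_tables ORDER BY user_id").fetchall()
print(a == b)  # both forms produce the same rows
```

Semantically the two are equivalent; the pain is operational, since each intermediate table has to be written out, managed, and cleaned up instead of streaming through a single query plan.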
The tricky part is: how do you know whether your use case fits? But you have to ask that question about all the "specialized tools", including Pinot.