```
proton :) select count() from test_r settings max_events=10000

SELECT count()
FROM test_r
SETTINGS max_events = 10000

Query id: 87eb84db-d07b-44ae-a354-cea470f55c02

┌─count()─┐
│    2000 │
└─────────┘
┌─count()─┐
│    4000 │
└─────────┘
┌─count()─┐
│    6000 │
└─────────┘
┌─count()─┐
│    8000 │
└─────────┘

4 rows in set. Elapsed: 10.001 sec. Processed 10.00 thousand rows, 40.00 KB (999.94 rows/s., 4.00 KB/s.)
```
It actually generates `10000` rows of data, but the aggregation result is not right; shouldn't it end up with 10000?

Another problem: a random stream with the `date32` type always seems to produce `2299-12-31`, but the `date` type is OK.
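The count behavior above is consistent with the final emit window being skipped. A minimal sketch of the *suspected* behavior (this is an assumption for illustration, not Proton's actual aggregation code): a streaming `count()` that emits partial results every fixed number of rows misses the last batch if the source stops at `max_events` before the final emit fires.

```python
def streaming_count(total_rows, batch_size, emit_every):
    """Yield the running count each time `emit_every` rows have arrived.

    Hypothetical model of the suspected bug: the final window is never
    emitted because the stream terminates at `total_rows` first.
    """
    seen = 0
    for _ in range(total_rows // batch_size):
        seen += batch_size
        if seen % emit_every == 0 and seen < total_rows:
            yield seen

emits = list(streaming_count(10000, 2000, 2000))
print(emits)  # [2000, 4000, 6000, 8000] -- matches the observed output
```

Under this model, all 10000 rows are processed (matching the "Processed 10.00 thousand rows" line), yet the last emitted count is 8000 instead of 10000.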
```
proton :) show create test_r

SHOW CREATE STREAM test_r

Query id: ff26d25f-a835-427d-870d-c0836ab05240

┌─statement──────────────────────────────────────────────────────────────────────────────────────────────┐
│ CREATE RANDOM STREAM default.test_r ( `value1` date32, `value2` date, `_tp_time` datetime64(3, 'UTC') DEFAULT now64(3, 'UTC') CODEC(DoubleDelta, LZ4), INDEX _tp_time_index _tp_time TYPE minmax GRANULARITY 256 ) │
└────────────────────────────────────────────────────────────────────────────────────────────────────────┘

1 row in set. Elapsed: 0.004 sec.

proton :) select value1, value2 from test_r settings max_events=20

SELECT value1, value2
FROM test_r
SETTINGS max_events = 20

Query id: bf02a09f-4830-4a86-981f-03e665c5627f

┌─────value1─┬─────value2─┐
│ 2299-12-31 │ 2069-08-07 │
│ 2299-12-31 │ 2096-06-21 │
│ 2299-12-31 │ 2116-10-28 │
│ 2299-12-31 │ 2063-01-11 │
│ 2299-12-31 │ 1974-02-26 │
└────────────┴────────────┘
┌─────value1─┬─────value2─┐
│ 2299-12-31 │ 2034-10-26 │
│ 2299-12-31 │ 2011-07-22 │
│ 2299-12-31 │ 2099-04-12 │
│ 2299-12-31 │ 2036-10-13 │
│ 2299-12-31 │ 2143-10-17 │
└────────────┴────────────┘
┌─────value1─┬─────value2─┐
│ 2299-12-31 │ 2057-07-17 │
│ 2299-12-31 │ 1979-04-21 │
│ 2299-12-31 │ 2107-12-12 │
│ 2299-12-31 │ 2128-02-19 │
│ 2299-12-31 │ 1988-02-14 │
└────────────┴────────────┘
┌─────value1─┬─────value2─┐
│ 2299-12-31 │ 2046-12-09 │
│ 2299-12-31 │ 2138-06-11 │
│ 2299-12-31 │ 2141-03-22 │
│ 2299-12-31 │ 2117-05-21 │
│ 2299-12-31 │ 1983-04-23 │
└────────────┴────────────┘

20 rows in set. Elapsed: 0.025 sec.
```
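One plausible explanation for the constant `2299-12-31` (an assumption, not confirmed against Proton's source): `date32` tops out at 2299-12-31, and if the random generator draws a day offset from a range much wider than the `date32` domain and then clamps it, nearly every draw lands above the maximum and collapses to the same value. A hypothetical sketch:

```python
from datetime import date, timedelta

# Hypothetical model of the suspected bug, not Proton's actual generator.
# date32 in Proton/ClickHouse tops out at 2299-12-31.
DATE32_MAX = date(2299, 12, 31)
EPOCH = date(1970, 1, 1)
MAX_DAYS = (DATE32_MAX - EPOCH).days  # 120529 days

def clamp_to_date32(day_offset):
    """Clamp a day offset into the date32 domain (assumed buggy behavior)."""
    return EPOCH + timedelta(days=min(day_offset, MAX_DAYS))

# If offsets were drawn from a full 32-bit range, the chance of staying in
# range is about MAX_DAYS / 2**31, roughly 0.006% -- so in practice every
# generated row would show the clamped maximum, as observed.
for offset in (1000, MAX_DAYS, 2**20, 2**30):
    print(offset, clamp_to_date32(offset))
```

The in-range `date` column behaving correctly while `date32` is stuck at its maximum is what such a range/clamp mismatch would look like.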