Committed by Venkatesh Raghavan
When the size of the array is more than `100`, our translator does not expand the arrays. This makes GPORCA's cardinality estimation model treat the predicate as unsupported, which in turn makes the cardinality estimate wrong:

```
create table foo(a int, b int);
insert into foo select i, i from generate_series(1,100) i;
```

Next, force GPORCA to not expand the array:

```
vraghavan=# set optimizer_array_expansion_threshold = 1;
SET
vraghavan=# explain select * from foo where b in (1, 2, 3);
                                   QUERY PLAN
-------------------------------------------------------------------------------
 Gather Motion 3:1  (slice1; segments: 3)  (cost=0.00..431.00 rows=40 width=8)
   ->  Table Scan on foo  (cost=0.00..431.00 rows=14 width=8)
       Filter: b = ANY ('{1,2,3}'::integer[])
 Settings:  optimizer_array_expansion_threshold=1
 Optimizer status: PQO version 2.75.0
(5 rows)
```

In this example:

* The table has 100 rows.
* Column b has a unique value in every row.
* The array in the IN clause has 3 values, so the cardinality must be at most 3.
* Since the array is not expanded, we get a wrong cardinality estimate.

In this change, we pass the size of the array constant so that GPORCA can try to do a better job estimating cardinality.

Associated GPORCA PR: https://github.com/greenplum-db/gporca/pull/405
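For contrast, here is a minimal sketch of how one would verify the estimate once expansion is allowed again. It assumes the threshold's default of `100` (consistent with the limit described above) and that the same `foo` table is still populated; the exact plan shape and cost figures will vary by GPDB/GPORCA version.

```
-- Sketch: restore the assumed default so the 3-element array is expanded again.
set optimizer_array_expansion_threshold = 100;

-- With expansion, GPORCA can treat b IN (1, 2, 3) as individual
-- equality disjuncts and estimate each one separately; since b has a
-- unique value in every row, the row estimate should come out close
-- to 3 rather than the rows=14 seen in the plan above.
explain select * from foo where b in (1, 2, 3);
```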
eadc5276