[Bug](function) fix json_object function check null nums error as return bool (#44321)

### What problem does this PR solve?

Related PR: https://github.com/apache/doris/pull/34591/files

Problem Summary:
Before this fix, the SQL reported an error. The return value of `simd::count_zero_num` is an int, but it was stored in a `const bool`, so any non-zero count collapsed to 1. With more than one input row, the `if (not_null_num < size)` check then fired even when no value was actually null.


After the fix:
```
mysql [(none)]>select json_object ( CONCAT('k',t.number%30926%3000 + 0),CONCAT('k',t.number%30926%3000 + 0,t.number%1000000) ) from numbers("number" = "2") t order by 1;
+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| json_object(concat('k', cast((((number % 30926) % 3000) + 0) as VARCHAR(65533))), concat('k', cast((((number % 30926) % 3000) + 0) as VARCHAR(65533)), cast((number % 1000000) as VARCHAR(65533))), '66') |
+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| {"k0":"k00"}                                                                                                                                                                                              |
| {"k1":"k11"}                                                                                                                                                                                              |
+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
2 rows in set (0.04 sec)
```


Before the fix:
```
mysql [(none)]>select json_object
    -> (
    -> CONCAT('k',t.number%30926%3000 + 0),CONCAT('k',t.number%30926%3000 + 0,t.number%1000000)
    -> )
    -> from numbers("number" = "2") t;
ERROR 1105 (HY000): errCode = 2, detailMessage = (10.16.10.8)[INTERNAL_ERROR]function json_object can not input null value , JSON documents may not contain NULL member names.
mysql [(none)]>

```
zhangstar333 authored and Your Name committed Nov 21, 2024
1 parent 9664b50 commit 43cf6d4
Showing 3 changed files with 9 additions and 3 deletions.
6 changes: 3 additions & 3 deletions be/src/vec/functions/function_json.cpp

```diff
@@ -784,13 +784,13 @@ class FunctionJsonAlwaysNotNullable : public IFunction {
     for (int i = 0; i < args; i += 2) {
         const auto* null_map = nullmaps[i];
         if (null_map) {
-            const bool not_null_num =
+            auto not_null_num =
                     simd::count_zero_num((int8_t*)null_map->get_data().data(), size);
             if (not_null_num < size) {
                 return Status::InternalError(
                         "function {} can not input null value , JSON documents may not contain "
-                        "NULL member names.",
-                        name);
+                        "NULL member names. input size is {}:{}",
+                        name, size, not_null_num);
             }
         }
     }
```
```diff
@@ -6,3 +6,7 @@
 {"k0":4,"k1":null,"k2":null,"k3":"test","k4":"2022-01-01 11:11:11","k5":null,"k6":"k6"}
 {"k0":5,"k1":1,"k2":true,"k3":"test","k4":"2022-01-01 11:11:11","k5":null,"k6":"k6"}
 
+-- !sql2 --
+{"k0":"k00"}
+{"k1":"k11"}
+
```
```diff
@@ -45,5 +45,7 @@ suite("test_query_json_object", "query") {
         sql """select k0,json_object(k3,123) from ${tableName} order by k0;"""
         exception "function json_object can not input null value , JSON documents may not contain NULL member names."
     }
+
+    qt_sql2 """select json_object ( CONCAT('k',t.number%30926%3000 + 0),CONCAT('k',t.number%30926%3000 + 0,t.number%1000000) ) from numbers("number" = "2") t order by 1;"""
     sql "DROP TABLE ${tableName};"
 }
```
