Spark Java sum gives an incorrect value
Here is my Java sample code:
List<Double> points = Arrays.asList(-6221.4, 6380.46);
Dataset<Row> dt = spark.createDataset(points, Encoders.DOUBLE()).toDF("double_vals");
dt.createOrReplaceTempView("dual_table");
spark.sql("select sum(double_vals) from dual_table").show(false);
The expected result is 159.06, but I get the following:
+-----------------+
|sum(double_vals) |
+-----------------+
|159.0600000000004|
+-----------------+
Am I doing something wrong?
Use the round function to round the result to two decimal places:
spark.sql("select round(sum(double_vals), 2) as sum_value from dual_table").show(false)
+---------+
|sum_value|
+---------+
|159.06 |
+---------+
Alternatively, cast the sum to a decimal type:
cast(sum(<column name>) AS decimal(10, 2))
spark.sql("select cast(sum(double_vals) as decimal(10, 2)) as sum_value from dual_table").show(false)
+---------+
|sum_value|
+---------+
|159.06 |
+---------+
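Note that this behavior is not a Spark bug: it is ordinary IEEE 754 double arithmetic. Decimal fractions like 6380.46 have no exact binary representation, so summing them carries a tiny error that shows up in the last digits. A minimal plain-Java sketch (no Spark involved) reproducing the same effect, and showing how java.math.BigDecimal avoids it:

```java
import java.math.BigDecimal;

public class DoubleSumDemo {
    public static void main(String[] args) {
        // Plain double addition exhibits the same small error Spark reports.
        double doubleSum = -6221.4 + 6380.46;
        System.out.println(doubleSum); // not exactly 159.06

        // BigDecimal built from strings keeps the exact decimal values,
        // so the sum is exact decimal arithmetic.
        BigDecimal exactSum = new BigDecimal("-6221.4")
                .add(new BigDecimal("6380.46"));
        System.out.println(exactSum); // 159.06
    }
}
```

The same idea applies on the Spark side: if exact decimal arithmetic matters (e.g. for monetary values), store the column as a decimal type from the start rather than double, so the aggregation runs in decimal instead of binary floating point; round and cast only clean up the displayed result after the fact.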