Scala: Round Down Double in Spark
Disclaimer: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/34888419/
Round Down Double in Spark
Asked by mithrix
I have some Cassandra data of type double that I need to round down to 1 decimal place in Spark.
The problem is how to extract it from Cassandra, convert it to a decimal, round down to 1 decimal place, and then write it back to a table in Cassandra. My rounding code is as follows:
BigDecimal(number).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble
This works great if the number going in is already a decimal, but I don't know how to convert the double to a decimal before rounding. My Double needs to be divided by 1000000 prior to rounding.
For example, 510999000 would become 510.999 before being rounded down to 510.9.
EDIT: I was able to get it to do what I wanted with the following command.
BigDecimal(number.toDouble / 1000000).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble
Not sure how good this is, but it works.
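As a sanity check, here is a minimal sketch using the sample value from the question (510999000 is illustrative input, not data read from Cassandra):
val number = 510999000L
val scaled = BigDecimal(number.toDouble / 1000000)        // 510.999
scaled.setScale(1, BigDecimal.RoundingMode.DOWN).toDouble // 510.9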
Accepted answer by mithrix
The answer I was able to work with was:
BigDecimal(number.toDouble / 1000000).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble
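To apply this to every row of a Spark DataFrame before writing back to Cassandra, one option is to wrap it in a UDF. A sketch, assuming a DataFrame x with a double column named col1 (the DataFrame and column names here are hypothetical):
import org.apache.spark.sql.functions.{col, udf}
// divide by 1000000 and round down to 1 decimal place, as in the accepted answer
val roundDown = udf((n: Double) => BigDecimal(n / 1000000).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble)
val y = x.withColumn("col1", roundDown(col("col1")))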
Answered by Pramit
Great answers, guys. Just chiming in with other ways to do the same:
1. If using a Spark DataFrame (x and y are DataFrames):
import org.apache.spark.sql.functions.round
val y = x.withColumn("col1", round($"col1", 3))
2. Via the underlying RDD[Row] (note that x(0) returns Any, so a typed getter is needed):
val y = x.rdd.map(r => (r.getDouble(0) * 1000).round / 1000.0)
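One caveat: both of these alternatives round to the nearest value rather than down (Spark's round uses HALF_UP rounding), so for a strict round-down the BigDecimal approach above is still needed. A quick illustration with the question's sample value:
BigDecimal(510.999).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble    // 510.9 (round down)
BigDecimal(510.999).setScale(1, BigDecimal.RoundingMode.HALF_UP).toDouble // 511.0 (what HALF_UP gives at 1 decimal place)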

