Round values of a varying quantity of columns on Databricks Scala

longnaxieye Registered member
2023-01-25 14:51

You can get the data type of each column from the DataFrame's schema, then build a select expression that rounds only the float columns:

import org.apache.spark.sql.functions.{col, round}
import org.apache.spark.sql.types.FloatType

// Collect the names of all FloatType columns from the schema
val floatColumns = df.schema.fields.filter(_.dataType == FloatType).map(_.name)

// Round float columns to 0 decimal places; pass other columns through unchanged
val selectExpr = df.columns.map(c =>
  if (floatColumns.contains(c))
    round(col(c), 0).as(c)
  else col(c)
)

val df1 = df.select(selectExpr: _*)
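
As a quick sketch of how this behaves, here is a hypothetical DataFrame with one FloatType and one StringType column (the column names "price" and "name" are made up for illustration). Only the float column is rounded; the string column passes through unchanged:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("round-floats").getOrCreate()
import spark.implicits._

// Float literals (1.23f) produce a FloatType column; strings produce StringType
val df = Seq((1.23f, "a"), (4.56f, "b")).toDF("price", "name")

// Applying the select expression above, df1 would show:
// +-----+----+
// |price|name|
// +-----+----+
// |  1.0|   a|
// |  5.0|   b|
// +-----+----+

Note that columns created from Scala Double literals are DoubleType rather than FloatType, so if your rounding should also cover those, adjust the filter (e.g. check for DoubleType as well).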