I am using Spark SQL 2.4.1. I have a scenario like the one below:
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types.{DoubleType, IntegerType}
import spark.implicits._

val df = Seq(
  (2010, "2018-11-24", 71285, "USA", "0.9192019", "0.1992019", "0.9955999"),
  (2010, "2017-08-24", 71286, "USA", "0.9292018", "0.2992019", "0.99662018"),
  (2010, "2019-02-24", 71287, "USA", "0.9392017", "0.3992019", "0.99772000"))
  .toDF("seq_id", "load_date", "company_id", "country_code", "item1_value", "item2_value", "item3_value")
  .withColumn("item1_value", $"item1_value".cast(DoubleType))
  .withColumn("item2_value", $"item2_value".cast(DoubleType))
  .withColumn("item3_value", $"item3_value".cast(DoubleType))
  .withColumn("fiscal_year", year(col("load_date")).cast(IntegerType))
  .withColumn("fiscal_quarter", quarter(col("load_date")).cast(IntegerType))

df.show()
val aggregateColumns = Seq("item1_value", "item2_value", "item3_value")

val aggDFs = aggregateColumns.map { c =>
  df.groupBy("country_code").agg(lit(c).as("col_name"), sum(c).as("sum_of_column"))
}

val combinedDF = aggDFs.reduce(_ union _)
combinedDF.show()
The output I am getting looks like:
|country_code| col_name| sum_of_column|
| USA|item1_value| 2.7876054|
| USA|item2_value| 0.8976057|
| USA|item3_value|2.9899400800000002|
I need to get the other columns in the output as well, i.e. "seq_id", "load_date" and "company_id". How can I get them after the aggregation operation on the dataframe?
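For context, one possible direction (a sketch, not necessarily the intended answer): columns that are not part of the `groupBy` key must either be added to the key or be aggregated themselves, e.g. with `first()`. The snippet below assumes that taking an arbitrary row's value per group is acceptable for these columns; if each group must keep every row's values, `collect_list` or adding the columns to the grouping key would be needed instead.

```scala
// Sketch: carry seq_id, load_date and company_id through the aggregation
// by aggregating them with first(). first() picks one (arbitrary) value
// per group, which may or may not be acceptable for your data.
val aggDFsWithExtras = aggregateColumns.map { c =>
  df.groupBy("country_code")
    .agg(
      lit(c).as("col_name"),
      sum(c).as("sum_of_column"),
      first("seq_id").as("seq_id"),
      first("load_date").as("load_date"),
      first("company_id").as("company_id")
    )
}
aggDFsWithExtras.reduce(_ union _).show()
```

Alternatively, `df.groupBy("country_code", "seq_id", "load_date", "company_id")` keeps the columns exactly, at the cost of changing the grouping granularity.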