Convert string to BigInt in a Spark Scala dataframe


For large integers you should use LongType:

import org.apache.spark.sql.types.LongType

cabArticleGold.withColumn("CAB", 'CAB.cast(LongType))

or

cabArticleGold.withColumn("CAB", 'CAB.cast("long"))

You can also use DecimalType:

import org.apache.spark.sql.types.DecimalType

cabArticleGold.withColumn("CAB", 'CAB.cast(DecimalType(38, 0)))

or

cabArticleGold.withColumn("CAB", 'CAB.cast("decimal(38, 0)"))
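As a quick sanity check outside Spark, the ranges line up: a PostgreSQL bigint is a 64-bit signed integer, which is exactly what LongType (Scala's Long) holds, and DecimalType(38, 0) is backed by BigDecimal and covers up to 38 decimal digits. A minimal plain-Scala sketch (no Spark session needed):

```scala
// PostgreSQL bigint and Spark's LongType are both 64-bit signed integers.
val maxBigint: Long = Long.MaxValue    // 9223372036854775807, bigint's upper bound

// DecimalType(38, 0) maps to BigDecimal and holds up to 38 integer digits.
val wide = BigDecimal("9" * 38)        // a 38-digit number, far beyond Long range

println(maxBigint)                     // prints 9223372036854775807
println(wide.precision)                // prints 38
```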
Author: Maher HTB

Updated on June 27, 2022

Comments

  • Maher HTB, almost 2 years ago

    I am trying to insert values from a dataframe whose fields are string type into a PostgreSQL database whose fields are big int type.

    I couldn't find how to cast them to big int. I used IntegerType before with no problem, but with this dataframe the cast gives me negative integers:

    val sparkSession = SparkSession.builder.master("local").appName("spark session example").getOrCreate()

    val cabArticleGold = sparkSession.sqlContext.load("jdbc", Map("url" -> "jdbc:oracle:thin:System/maher@//localhost:1521/XE", "dbtable" -> "IPTECH.TMP_ARTCAB")).select("CODEART", "CAB").limit(10)
    import sparkSession.sqlContext.implicits._
    cabArticleGold.show()
    cabArticleGold.withColumn("CAB", 'CAB.cast(IntegerType)).foreach(row => println(row(1)))
    
    232524399
    -1613725482
    232524423
    -1613725465
    232524437
    -1191331072
    3486
    -1639094853
    232524461
    1564177573
    

    Any help to use big int would be appreciated. I know that Scala supports BigInt, but how can I do it?
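    The negative values in the output above are the classic signature of 32-bit overflow: any source value above Int.MaxValue (2147483647) wraps around when forced into an Int, which is why LongType or DecimalType is needed. A plain-Scala illustration, assuming 2681241814 as a hypothetical original value (the 64-bit number that wraps to -1613725482):

```scala
// 2681241814 exceeds Int.MaxValue (2147483647), so it cannot fit in 32 bits.
val original = 2681241814L
println(original.toInt)   // wraps modulo 2^32, printing -1613725482
println(original)         // stored losslessly as a Long
```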