How to pass arguments dynamically to filter function in Apache Spark?
Solution 1
This can be done in Scala; you can adapt it to Python:

import org.apache.spark.sql.functions.col

val emp_name = "Sam"
val employee_data = employee_df.filter(col("Name") === emp_name)
Hope this helps!
Solution 2
Try the following:
emp_Name='Sam'
employee_data = employee_df.filter(employee_df["Name"] == emp_Name).collect()
Author: YRK
Updated on June 30, 2022

Comments
-
YRK almost 2 years
I have an employees file with data as below:

Name:   Age:
David   25
Jag     32
Paul    33
Sam     18
I loaded this into a dataframe in Apache Spark and I am filtering the values as below:

employee_rdd = sc.textFile("employee.txt")
employee_df = employee_rdd.toDF()
employee_data = employee_df.filter("Name = 'David'").collect()
+-----+-----+
|Name:| Age:|
+-----+-----+
|David|   25|
+-----+-----+
But when I try something like this:

emp_Name='Sam'

and pass this name to the filter like below:

employee_data = employee_df.filter("Name = 'emp_Name'").collect

it returns an empty list.
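The empty result happens because "Name = 'emp_Name'" is a plain string: Spark compares the Name column against the literal text emp_Name, not against the Python variable's value. A quick plain-Python check of the two predicate strings (no Spark needed) makes the difference visible:

```python
emp_Name = 'Sam'

# The predicate as written in the question: the variable name sits
# inside the quotes, so its value is never substituted.
literal = "Name = 'emp_Name'"

# Interpolating the variable's value produces the intended predicate.
interpolated = f"Name = '{emp_Name}'"

print(literal)       # Name = 'emp_Name'
print(interpolated)  # Name = 'Sam'
```

(Note also that .collect is missing its parentheses; without () it returns the method object instead of calling it.)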
-
Mohan Kumar Kannan about 6 years: relevant answer here: stackoverflow.com/questions/45813272/…
-
AJm about 6 years: @Mohan Kumar Kannan This does not work. Can you show an example with a where clause?
-
Erkan Şirin over 3 years: The question is about how to use a variable in the filter condition.