I am trying to do an anti join in PySpark. Both DataFrames share a common key column, and I need to extract all the rows that are not common to both DataFrames, i.e. rows whose id in one DataFrame does not match any id in the other.
df1 = df1.join(df2, df1.id != df2.id, how='inner')  # fixed: the join condition must come before the how= keyword, otherwise Python raises a SyntaxError
But with this code, I am still getting rows whose ids are the same in both DataFrames.
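To make the expected output concrete, here is the same logic expressed with plain Python sets (the ids below are made-up examples, not my real data):

```python
# Hypothetical example ids standing in for the id columns of df1 and df2.
ids1 = {1, 2, 3, 4}  # ids present in df1
ids2 = {3, 4, 5, 6}  # ids present in df2

# What I want to end up with: ids that appear in one DataFrame but not the other.
only_in_df1 = ids1 - ids2   # ids in df1 with no match in df2
only_in_df2 = ids2 - ids1   # ids in df2 with no match in df1
not_common = ids1 ^ ids2    # symmetric difference: all non-matching ids

print(only_in_df1)  # {1, 2}
print(only_in_df2)  # {5, 6}
print(not_common)   # {1, 2, 5, 6}
```

In other words, for the sets above I would expect the result to contain only ids 1, 2, 5 and 6, never 3 or 4.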
Thanks in advance for your help.