foreach function not working in Spark DataFrame
You can convert it to a JavaRDD in order to use a lambda as you wish:
df.toJavaRDD().foreach(x -> System.out.println(x));
First, extend scala.runtime.AbstractFunction1
and implement java.io.Serializable, as below:
public abstract class SerializableFunction1<T,R> extends AbstractFunction1<T, R> implements Serializable {}
Now use this SerializableFunction1 class as follows:
df.foreach(new SerializableFunction1<Row, BoxedUnit>() {
    @Override
    public BoxedUnit apply(Row row) {
        System.out.println(row.get(0));
        return BoxedUnit.UNIT; // Scala's Unit return value for the Java side
    }
});
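The reason the wrapper must implement Serializable is that Spark serializes the function object on the driver and ships it to the executors. A minimal stand-alone sketch of that requirement (no Spark needed; the class and method names here are illustrative, not part of any Spark API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class SerializableClosureDemo {

    // Same idea as SerializableFunction1 above: a function type that is also
    // Serializable, so instances survive the serialization round trip Spark
    // performs when shipping closures from driver to executors.
    interface SerFunction<T, R> extends Function<T, R>, Serializable {}

    // Serialize a function, deserialize the copy, and apply it -- a local
    // stand-in for what happens between the Spark driver and an executor.
    static int roundTrip(String input) {
        SerFunction<String, Integer> f = s -> s.length();
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
                out.writeObject(f); // fails with NotSerializableException if f is not Serializable
            }
            try (ObjectInputStream in =
                    new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
                @SuppressWarnings("unchecked")
                SerFunction<String, Integer> copy = (SerFunction<String, Integer>) in.readObject();
                return copy.apply(input);
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTrip("hello")); // prints 5
    }
}
```

If the function type did not extend Serializable, the writeObject call would throw NotSerializableException, which is the same failure Spark reports as a "Task not serializable" error.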
Try this code. Note that VoidFunction belongs to the Java RDD API, so convert the DataFrame first, and the element type is Row:
df.toJavaRDD().foreach(new VoidFunction<Row>() {
    public void call(Row row) {
        // your function code here
    }
});
If you just want to display the DataFrame's contents, it is much easier to call:
df.show();