With a local Spark instance built with Hive support (-Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver), the following script/sequence runs in PySpark without any error in 1.6.x, but fails in 2.x.
The error produced is:
The error goes away if sqlContext2 is replaced with sqlContext in the last (failing) line. Since the SQLContext class is kept for backward compatibility, the changes in 2.x break scripts/notebooks that follow the above pattern of calls and used to run fine on 1.6.x.
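The actual failing script is not reproduced above. Purely as an illustration of the described pattern (a temp table registered through the original sqlContext, then queried through a second SQLContext created over the same SparkContext), a hypothetical minimal sequence might look like the following; the table name, data, and query here are invented, not taken from the original report:

```python
# Hypothetical reconstruction -- the original script is not shown in the report.
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext("local", "sqlcontext-repro")
sqlContext = SQLContext(sc)

# Register a temp table through the first context (1.6.x-style API).
df = sqlContext.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
df.registerTempTable("t")

# Create a second SQLContext over the same SparkContext and query
# the table through it -- the pattern the report says worked in 1.6.x.
sqlContext2 = SQLContext(sc)
sqlContext2.sql("SELECT * FROM t").show()
```

Per the report, replacing sqlContext2 with sqlContext in the final line avoids the error on 2.x.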