Description
I want to process images with HIPI on Spark, so I use hadoopFile to create an RDD.
Here is my code:
val conf = new SparkConf().setAppName("BundleTest")
val sc = new SparkContext(conf)
val bundle0 = sc.hadoopFile[HipiImageHeader,HipiImage,HibInputFormat]("hdfs://192.168.199.11:8020/Hdfs/Image/image.hib.dat",1000)
But I get a compile error about the type arguments:
Error:(39, 22) type arguments [org.hipi.image.HipiImageHeader,org.hipi.image.HipiImage,org.hipi.imagebundle.mapreduce.HibInputFormat] conform to the bounds of none of the overloaded alternatives of
value hadoopFile: [K, V, F <: org.apache.hadoop.mapred.InputFormat[K,V]](path: String)(implicit km: scala.reflect.ClassTag[K], implicit vm: scala.reflect.ClassTag[V], implicit fm: scala.reflect.ClassTag[F])org.apache.spark.rdd.RDD[(K, V)] <and> [K, V, F <: org.apache.hadoop.mapred.InputFormat[K,V]](path: String, minPartitions: Int)(implicit km: scala.reflect.ClassTag[K], implicit vm: scala.reflect.ClassTag[V], implicit fm: scala.reflect.ClassTag[F])org.apache.spark.rdd.RDD[(K, V)]
val bundle0 = sc.hadoopFile[HipiImageHeader,HipiImage,HibInputFormat]("hdfs://192.168.199.11:8020/Hdfs/Image/image.hib.dat",1000)
^
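From the error it looks like hadoopFile only accepts the old org.apache.hadoop.mapred.InputFormat API, while HibInputFormat is in a mapreduce package, so I suspect it implements the newer org.apache.hadoop.mapreduce API. I was thinking of trying newAPIHadoopFile instead, roughly like the sketch below (I am not sure this is the right approach; I also dropped the 1000 minPartitions argument because the overload I found only takes a path):

import org.apache.spark.{SparkConf, SparkContext}
import org.hipi.image.{HipiImage, HipiImageHeader}
import org.hipi.imagebundle.mapreduce.HibInputFormat

val conf = new SparkConf().setAppName("BundleTest")
val sc = new SparkContext(conf)

// newAPIHadoopFile expects an org.apache.hadoop.mapreduce.InputFormat,
// which I assume is what HibInputFormat implements.
val bundle = sc.newAPIHadoopFile[HipiImageHeader, HipiImage, HibInputFormat](
  "hdfs://192.168.199.11:8020/Hdfs/Image/image.hib.dat")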
Please give me some advice on how to solve this, or tell me whether there is another way to read an image bundle.
Thanks for your help.