[ How can I assign different return types to a function in Scala? ]
I am trying to write a function which should return different pairs depending on the input. I have overridden "+", "-", "*" and "/" in Scala for my specific use. Each one (+, -, *, /) has three implementations based on the input: my inputs are RDD and Float, so an operation can be between RDD and RDD, Float and RDD, Float and Float, and so on.
I now have a parser which reads an expression from the input, such as RDD+1, parses it, and produces a postfix form to make the calculation easier, e.g. RDD1+. Then I want to do the calculation using my implemented +. With the help of this algorithm I am trying to change it so that it performs the calculation based on my input expression. For instance, it contains:
var lastOp: (Float, Float) => Float = add
How can I change this: (Float, Float) => Float
to something that will accept (RDD, Float) | (RDD, RDD) | (Float, Float) => RDD = add
// my implementation of add ???
Edit:
I added this part with the help of the two answers below. OK, I wrote this:
def lastop:(Either[RDD[(Int,Array[Float])], Float], Either[RDD[(Int,Array[Float])], Float]) => RDD[(Int,Array[Float])] = sv.+
in which sv is an instance of my other class, where I have overridden + in two different ways. Now I am getting an error, which I guess is because the compiler gets confused about which implementation to use. This is the
error: type mismatch;
[error] found : (that: org.apache.spark.rdd.RDD[(Int, Array[Float])])org.apache.spark.rdd.RDD[(Int, Array[Float])] <and> (that: Float)org.apache.spark.rdd.RDD[(Int, Array[Float])]
[error] required: (Either[org.apache.spark.rdd.RDD[(Int, Array[Float])],Float], Either[org.apache.spark.rdd.RDD[(Int, Array[Float])],Float]) => org.apache.spark.rdd.RDD[(Int, Array[Float])]
Note: what it says it found are the two different implementations of "+".
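If I understand the error correctly, the compiler cannot turn the overloaded sv.+ into a function value with the Either parameter types I declared, so I probably need an explicit function that pattern matches on the Eithers first and then calls the right overload. Something like this sketch (the case bodies are placeholders, since I am not sure yet how each combination should call into sv):
val lastop: (Either[RDD[(Int, Array[Float])], Float], Either[RDD[(Int, Array[Float])], Float]) => RDD[(Int, Array[Float])] = {
  case (Left(rdd), Left(otherRdd)) => ???  // RDD + RDD
  case (Left(rdd), Right(f))       => ???  // RDD + Float
  case (Right(f), Left(rdd))       => ???  // Float + RDD
  case (Right(f), Right(g))        => ???  // Float + Float, still has to produce an RDD here
}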
Answer 1
Well, I'm not sure this is the best way to do it, but it is ONE way to do it and should result in the usage you described (or at least close to it):
import scala.language.implicitConversions
// implicit conversions
implicit def float2Either(in: Float): Either[Float, RDD[(Int,Array[Float])]] = Left(in)
implicit def rdd2Either(in: RDD[(Int,Array[Float])]): Either[Float, RDD[(Int,Array[Float])]] = Right(in)
def add(left: Either[Float, RDD[(Int,Array[Float])]], right: Either[Float, RDD[(Int,Array[Float])]]): Float = {
  (left, right) match {
    case (Left(someFloat), Left(anotherFloat)) => ???
    case (Left(someFloat), Right(someRdd))     => ???
    case (Right(someRdd), Left(someFloat))     => ???
    case (Right(someRdd), Right(anotherRdd))   => ???
  }
}
val lastOp: (Either[Float, RDD[(Int,Array[Float])]], Either[Float, RDD[(Int,Array[Float])]]) => Float = add
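With those conversions in scope, a call site could pass either type directly and let the compiler lift the arguments; roughly like this (hypothetical values, assuming org.apache.spark.rdd.RDD is imported):
val f: Float = 1.5f
val someRdd: RDD[(Int, Array[Float])] = ???  // obtained elsewhere, e.g. from a SparkContext

lastOp(f, someRdd)        // the Float is lifted to Left, the RDD to Right
lastOp(someRdd, someRdd)
lastOp(f, f)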
Another way, and probably the better one, would be the pimp my library pattern.
However, you would not be able to decide yourself what (float + float) yields, which in most sane cases should not be a problem.
You could write implicit wrapper classes for Float and RDD, much like RichFloat, RichInt and the like, implementing operators on each that accept the other as input:
implicit class RichRdd(val underlying: RDD[(Int, Array[Float])]) extends AnyVal {
  def +(in: Float): Float = ???
  def +(in: RDD[(Int, Array[Float])]): Float = ???
}

implicit class RicherFloat(val underlying: Float) extends AnyVal {
  def +(in: RDD[(Int, Array[Float])]): Float = ???
}
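With these wrappers in scope, the arithmetic reads naturally at the call site; for example (hypothetical values, assuming the wrapper sketch above):
val rdd: RDD[(Int, Array[Float])] = ???  // obtained elsewhere
val f: Float = 2.0f

rdd + f   // resolves to RichRdd.+(Float)
f + rdd   // resolves to RicherFloat.+(RDD[(Int, Array[Float])])
f + f     // plain Float addition, as noted above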
Answer 2
I think pattern matching is the right way to go, but you might need to do more research on operator overloading.
As for RDD, it is a collection of elements in Spark, so I am not sure what you are trying to achieve by adding a collection to a number: should it be added to only one element in the RDD? etc.
Without knowing exactly what you want, here is an example showing how you can handle different combinations of types using pattern matching:
object test {
  def myadd(x: Any, y: Any) = (x, y) match {
    case (x: String, y: String) => x.toInt + y.toInt
    case (x: String, y: Int)    => x.toInt + y
    case (x: Int, y: String)    => x + y.toInt
    case (x: Int, y: Int)       => x + y
    case _                      => // unmatched combinations (e.g. Double) fall through and return Unit
  }                                              //> myadd: (x: Any, y: Any)AnyVal

  var result = myadd(1, 2)                       //> result : AnyVal = 3
  println(result)                                //> 3
  println(result.getClass())                     //> class java.lang.Integer

  result = myadd(1, "2")
  println(result)                                //> 3
  println(result.getClass())                     //> class java.lang.Integer

  result = myadd(1.0, 2)
  println(result)                                //> ()
  println(result.getClass())                     //> class scala.runtime.BoxedUnit
}
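Adapted to the types in the question, the same shape might look roughly like this (a sketch only: the RDD element type (Int, Array[Float]) is taken from the question, the bodies are placeholders, and org.apache.spark.rdd.RDD is assumed to be imported):
def myAdd(x: Any, y: Any): Any = (x, y) match {
  case (a: Float, b: Float)   => a + b
  case (a: Float, b: RDD[_])  => ???  // e.g. add a to every value of b
  case (a: RDD[_], b: Float)  => ???
  case (a: RDD[_], b: RDD[_]) => ???  // e.g. combine the two RDDs element-wise
  case _                      => sys.error(s"unsupported operand types: $x, $y")
}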