Why does the Scala compiler disallow overloaded methods with default arguments?


Solution 1

I'd like to cite Lukas Rytz (from here):

The reason is that we wanted a deterministic naming-scheme for the generated methods which return default arguments. If you write

def f(a: Int = 1)

the compiler generates

def f$default$1 = 1

If you have two overloads with defaults on the same parameter position, we would need a different naming scheme. But we want to keep the generated byte-code stable over multiple compiler runs.

A solution for a future Scala version could be to incorporate the type names of the non-default arguments (those at the beginning of a method, which disambiguate the overloaded versions) into the naming scheme, e.g. in this case:

def foo(a: String)(b: Int = 42) = a + b
def foo(a: Int)   (b: Int = 42) = a + b

it would be something like:

def foo$String$default$2 = 42
def foo$Int$default$2 = 42

Someone willing to write a SIP proposal?
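
To make the clash concrete (this sketch is not part of the quoted answer): under the current scheme, both of the following overloads would need a default getter for their second parameter, and both getters would have to be called foo$default$2.

def foo(a: Int,    b: Int = 0) = a + b        // would generate foo$default$2
def foo(a: String, b: Int = 0) = a.length + b // would also need foo$default$2
// The compiler rejects this pair with an error along the lines of
// "multiple overloaded alternatives of method foo define default arguments".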

Solution 2

It would be very hard to get a readable and precise spec for the interactions of overloading resolution with default arguments. Of course, for many individual cases, like the one presented here, it's easy to say what should happen. But that is not enough. We'd need a spec that decides all possible corner cases. Overloading resolution is already very hard to specify. Adding default arguments in the mix would make it harder still. That's why we have opted to separate the two.
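
As a hypothetical illustration of such a corner case (not taken from the answer): if overloads with defaults at the same position were allowed, a call that leaves those arguments out could match several of them, and the spec would have to decide which one wins, or whether the call is ambiguous.

def g(a: Int = 0) = a
def g(a: Int = 0, b: Int = 0) = a + b

g()   // both overloads are applicable once the defaults are filled in
g(1)  // same problem: g(1) or g(1, 0)?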

Solution 3

I can't answer your question, but here is a workaround:

import scala.language.implicitConversions

// Wrap either kind of argument in an Either, so a single method with a
// default argument can replace the two overloads.
implicit def left2Either[A, B](a: A): Either[A, B] = Left(a)
implicit def right2Either[A, B](b: B): Either[A, B] = Right(b)

def foo(a: Either[Int, String], b: Int = 42) = a match {
  case Left(i)  => i + b
  case Right(s) => s + b
}

If you have two very long arg lists which differ in only one arg, it might be worth the trouble...
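
A quick usage sketch, assuming the definitions above are in scope (expected results noted in the comments):

foo(1)            // converted to Left(1); evaluates to 43
foo("bar")        // converted to Right("bar"); evaluates to "bar42"
foo(Left(7), 10)  // passing an explicit Either still works; evaluates to 17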

Solution 4

What worked for me is to redefine the overloaded methods Java-style, spelling out the variants instead of using default arguments:

def foo(a: Int, b: Int) = a + b
def foo(a: Int, b: String) = a + b
def foo(a: Int) = a + "42"
def foo(a: String) = a + "42"

This makes it clear to the compiler which overload you want, based on the parameters that are actually present.
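
For reference, a rough sketch of how calls resolve against these overloads (results noted in the comments):

foo(1, 2)    // foo(Int, Int)    -> 3
foo(1, "x")  // foo(Int, String) -> "1x"
foo(1)       // foo(Int)         -> "142"
foo("a")     // foo(String)      -> "a42"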

Solution 5

Here is a generalization of @Landei's answer:

What you really want:

def pretty(tree: Tree, showFields: Boolean = false): String = // ...
def pretty(tree: List[Tree], showFields: Boolean = false): String = // ...
def pretty(tree: Option[Tree], showFields: Boolean = false): String = // ...

Workaround:

def pretty(input: CanPretty, showFields: Boolean = false): String = {
  input match {
    case TreeCanPretty(tree)       => prettyTree(tree, showFields)
    case ListTreeCanPretty(tree)   => prettyList(tree, showFields)
    case OptionTreeCanPretty(tree) => prettyOption(tree, showFields)
  }
}

sealed trait CanPretty
case class TreeCanPretty(tree: Tree) extends CanPretty
case class ListTreeCanPretty(tree: List[Tree]) extends CanPretty
case class OptionTreeCanPretty(tree: Option[Tree]) extends CanPretty

import scala.language.implicitConversions
implicit def treeCanPretty(tree: Tree): CanPretty = TreeCanPretty(tree)
implicit def listTreeCanPretty(tree: List[Tree]): CanPretty = ListTreeCanPretty(tree)
implicit def optionTreeCanPretty(tree: Option[Tree]): CanPretty = OptionTreeCanPretty(tree)

private def prettyTree(tree: Tree, showFields: Boolean): String = "fun ..."
private def prettyList(tree: List[Tree], showFields: Boolean): String = "fun ..."
private def prettyOption(tree: Option[Tree], showFields: Boolean): String = "fun ..."
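
A hypothetical usage sketch, assuming a Tree type and a value someTree: Tree exist in the surrounding code (neither is defined in the snippet above); the implicit conversions pick the right wrapper at each call site:

pretty(someTree)                          // via treeCanPretty
pretty(List(someTree), showFields = true) // via listTreeCanPretty
pretty(Option(someTree))                  // via optionTreeCanPretty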

Comments

  • soc
    soc almost 4 years

    While there might be valid cases where such method overloadings could become ambiguous, why does the compiler disallow code which is neither ambiguous at compile time nor at run time?

    Example:

    // This fails:
    def foo(a: String)(b: Int = 42) = a + b
    def foo(a: Int)   (b: Int = 42) = a + b
    
    // This fails, too. Even if there is no position in the argument list,
    // where the types are the same.
    def foo(a: Int)   (b: Int = 42) = a + b
    def foo(a: String)(b: String = "Foo") = a + b
    
    // This is OK:
    def foo(a: String)(b: Int) = a + b
    def foo(a: Int)   (b: Int = 42) = a + b    
    
    // Even this is OK.
    def foo(a: Int)(b: Int) = a + b
    def foo(a: Int)(b: String = "Foo") = a + b
    
    val bar = foo(42)_ // This complains obviously ...
    

    Are there any reasons why these restrictions can't be loosened a bit?

    Especially when converting heavily overloaded Java code to Scala, default arguments are very important, and it isn't nice to find out, after replacing plenty of Java methods with a single Scala method, that the spec/compiler imposes arbitrary restrictions.

    • KajMagnus
      KajMagnus about 11 years
      "arbitrary restrictions" :-)
    • user1609012
      user1609012 over 7 years
      It looks like you can get around the issue using type arguments. This compiles: object Test { def a[A](b: Int, c: Int, d: Int = 7): Unit = {}; def a[A](a:String, b: String = ""): Unit = {}; a(2,3,4); a("a");}
    • Landlocked Surfer
      Landlocked Surfer about 7 years
      @user1609012: Your trick did not work for me. I tried it out using Scala 2.12.0 and Scala 2.11.8.
    • cubic lettuce
      cubic lettuce over 6 years
      IMHO this is one of the strongest pain-points in Scala. Whenever I try to provide a flexible API, I often run into this issue, in particular when overloading the companion object's apply(). Although I slightly prefer Scala over Kotlin, in Kotlin you can do this kind of overloading...
    • Seth Tisue
      Seth Tisue over 3 years
      The ticket of record on this is github.com/scala/bug/issues/8161
  • soc
    soc over 13 years
    Well, I tried to use default arguments to make my code more concise and readable ... actually I added an implicit conversion to the class in one case which did just convert the alternative type to the type accepted. It just feels ugly. And the approach with the default args should just work!
  • soc
    soc over 13 years
    Thanks for your answer. The thing which probably confused me was that basically everywhere else the compiler only complains if there actually is some ambiguity. But here the compiler complains because there might be similar cases where ambiguity could arise. So in the first case the compiler only complains if there is a proven problem, but in the second case the compiler behavior is much less precise and triggers errors for "seemingly valid" code. Seeing this with the principle of the least astonishment, this is a bit unfortunate.
  • soc
    soc over 13 years
    Does "It would be very hard to get a readable and precise spec [...]" mean that there is an actual chance that the current situation might be improved if someone steps up with a good specification and/or implementation? The current situation imho limits the usability of named/default parameters quite a bit ...
  • James Iry
    James Iry about 13 years
    There is a process for proposing changes to the spec. scala-lang.org/node/233
  • Seth Tisue
    Seth Tisue almost 13 years
    Even if someone stepped up with a good specification and implementation, I have a feeling it wouldn't get in, because the new specification and implementation would be larger than the old, and largeness itself has continuing costs. The issue just isn't that important IMO.
  • Blaisorblade
    Blaisorblade over 12 years
    You should be careful with such conversions, since they apply to all uses of Either and not just to foo - this way, whenever an Either[A, B] value is requested, both A and B are accepted. If you want to go in this direction, one should instead define a type which is only accepted by functions having default arguments (like foo here); of course, it then becomes even less clear whether this is a convenient solution.
  • Shelby Moore III
    Shelby Moore III over 10 years
    A disjunction can be used to work around this limitation, yet it has a runtime cost. I am curious to learn why overload resolution is difficult to specify. I wish someone linked to the relevant section(s) of the spec, so readers don't have to dig. I found this.
  • Shelby Moore III
    Shelby Moore III over 10 years
    I have some comments (see my comments below the linked answer) about Scala making overloading frowned upon and a second-class citizen. If we continue to purposely weaken overloading in Scala, we are replacing typing with names, which IMO is a regressive direction.
  • Richard Gomes
    Richard Gomes about 10 years
    If Python can do it, I don't see any good reason why Scala couldn't. The complexity argument is a good one: implementing this feature would make Scala less complex from the user's perspective. Read the other answers and you will see people inventing very complex things just to solve a problem which should not even exist from the user's perspective.
  • Mark
    Mark over 9 years
    I think your proposal here makes a lot of sense, and I don't see what would be so complex about specifying/implementing it. Essentially, the parameter types are part of the function's ID. What does the compiler currently do with foo(String) and foo(Int) (i.e., overloaded methods WITHOUT a default)?
  • blast_hardcheese
    blast_hardcheese over 9 years
    Wouldn't this effectively introduce mandatory Hungarian Notation when accessing Scala methods from Java? It seems like it would make the interfaces extremely fragile, forcing the user to take care when type parameters to functions change.
  • blast_hardcheese
    blast_hardcheese over 9 years
    Also, what about complex types? A with B, for instance?
  • ayvango
    ayvango over 8 years
    Could such a spec be borrowed from implicits? The common solution for such questions is to introduce some trait representing the arguments, plus implicit conversions from various tuples to that trait. So the rules for implicit ambiguity might be applied to default arguments as well.