enableIf.scala is a library that switches Scala code at compile-time, like #if in C/C++.
Suppose you want to create a library for both Scala 2.10 and Scala 2.11. When you implement the library, you may want to call the `flatMap` method on `TailRec`. However, the method does not exist in Scala 2.10.
With the help of this library, you can create your own implementation of `flatMap` for the Scala 2.10 target, while the Scala 2.11 target still uses the `flatMap` method implemented by the Scala standard library.
```sbt
// Enable macro annotations via scalac flags for Scala 2.13+
scalacOptions ++= {
  import Ordering.Implicits._
  if (VersionNumber(scalaVersion.value).numbers >= Seq(2L, 13L)) {
    Seq("-Ymacro-annotations")
  } else {
    Nil
  }
}

// Enable macro annotations via the Macro Paradise compiler plugin for Scala 2.12 and earlier
libraryDependencies ++= {
  import Ordering.Implicits._
  if (VersionNumber(scalaVersion.value).numbers >= Seq(2L, 13L)) {
    Nil
  } else {
    Seq(compilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full))
  }
}

libraryDependencies += "com.thoughtworks.enableIf" %% "enableif" % "latest.release"
```

```scala
import com.thoughtworks.enableIf
import scala.util.control.TailCalls._

@enableIf(scala.util.Properties.versionNumberString.startsWith("2.10."))
implicit class FlatMapForTailRec[A](underlying: TailRec[A]) {
  final def flatMap[B](f: A => TailRec[B]): TailRec[B] = {
    tailcall(f(underlying.result))
  }
}
```

The `@enableIf` annotation accepts a Boolean expression that indicates whether the `FlatMapForTailRec` definition should be compiled. The Boolean expression is evaluated at compile time instead of at run time.
```scala
import scala.util.control.TailCalls._
def ten = done(10)
def tenPlusOne = ten.flatMap(i => done(i + 1))
assert(tenPlusOne.result == 11)
```

For Scala 2.10, the expression `scala.util.Properties.versionNumberString.startsWith("2.10.")` evaluates to `true`, so the `FlatMapForTailRec` definition is enabled. As a result, `ten.flatMap` calls the `flatMap` method of the implicit class `FlatMapForTailRec`.

For Scala 2.11, the expression evaluates to `false`, so the `FlatMapForTailRec` definition is disabled. As a result, `ten.flatMap` calls the native `TailRec.flatMap`.
- The `enableIf` annotation does not work for top-level traits, classes, and objects.
- The Boolean condition being evaluated must refer to classes or objects from dependency libraries via their fully qualified names (see the sketch below).
- The Boolean condition being evaluated must not refer to other classes or objects from the same library.
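For example, the following condition is acceptable because `scala.util.Properties` is referred to by its fully qualified name and lives in the standard library rather than in the library being compiled. This is a minimal sketch; `Compat` and `onScala213` are hypothetical names.

```scala
import com.thoughtworks.enableIf

object Compat {
  // The condition refers only to scala.util.Properties, a standard-library object,
  // via its fully qualified name, and the annotated member is not top level.
  @enableIf(scala.util.Properties.versionNumberString.startsWith("2.13."))
  def onScala213: String = "compiled only when building with Scala 2.13"
}
```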
Suppose you want to create a Buffer-like collection. You may want to create an `ArrayBuffer` for the JVM target and a native `js.Array` for the Scala.js target.
```scala
/**
 * Enable members in `Jvm` if no Scala.js plugin is found (i.e. Normal JVM target)
 */
@enableMembersIf(c => !c.compilerSettings.exists(_.matches("""^-Xplugin:.*scalajs-compiler_[0-9\.\-]*\.jar$""")))
private object Jvm {

  def newBuffer[A] = collection.mutable.ArrayBuffer.empty[A]

}

/**
 * Enable members in `Js` if a Scala.js plugin is found (i.e. Scala.js target)
 */
@enableMembersIf(c => c.compilerSettings.exists(_.matches("""^-Xplugin:.*scalajs-compiler_[0-9\.\-]*\.jar$""")))
private object Js {

  @inline def newBuffer[A] = new scalajs.js.Array[A]

  @inline implicit final class ReduceToSizeOps[A] @inline()(array: scalajs.js.Array[A]) {
    @inline def reduceToSize(newSize: Int) = array.length = newSize
  }

}

import Js._
import Jvm._
val optimizedBuffer = newBuffer[Int]
optimizedBuffer += 1
optimizedBuffer += 2
optimizedBuffer += 3
// resolved to native ArrayBuffer.reduceToSize for JVM, implicitly converted to ReduceToSizeOps for Scala.js
optimizedBuffer.reduceToSize(1)
```

You can define a `c` parameter because the `enableIf` annotation accepts either a Boolean expression or a `scala.reflect.macros.Context => Boolean` function. You can extract information from the macro context `c`.
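For instance, here is a minimal sketch that enables a debug helper only when scalac is invoked with `-Xmacro-settings:enable-debug-log`, reading the flag from the macro context's `c.settings`; the setting key and the `Logging` object are hypothetical names used for illustration.

```scala
import com.thoughtworks.enableIf

object Logging {
  // Compiled only when scalac receives -Xmacro-settings:enable-debug-log
  @enableIf(c => c.settings.contains("enable-debug-log"))
  def debug(message: String): Unit = println(s"[debug] $message")

  // No-op fallback compiled when the setting is absent
  @enableIf(c => !c.settings.contains("enable-debug-log"))
  def debug(message: String): Unit = ()
}
```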
For breaking API changes in third-party libraries, simply annotate each version-specific implementation of the target method with the artifactId and version it supports.
To distinguish Apache Spark 3.1.x and 3.2.x:
```scala
object XYZ {

  @enableIf(classpathMatches(".*spark-catalyst_2\\.\\d+-3\\.2\\..*".r))
  private def getFuncName(f: UnresolvedFunction): String = {
    // For Spark 3.2.x
    f.nameParts.last
  }

  @enableIf(classpathMatches(".*spark-catalyst_2\\.\\d+-3\\.1\\..*".r))
  private def getFuncName(f: UnresolvedFunction): String = {
    // For Spark 3.1.x
    f.name.funcName
  }
}
```

For specific Apache Spark versions:
```scala
@enableIf(classpathMatchesArtifact(crossScalaBinaryVersion("spark-catalyst"), "3.2.1"))
@enableIf(classpathMatchesArtifact(crossScalaBinaryVersion("spark-catalyst"), "3.1.2"))
```

NOTICE:
`classpathMatchesArtifact` is for classpaths without classifiers. For a classpath entry with a classifier, such as `ffmpeg-5.0-1.5.7-android-arm-gpl.jar`, please use `classpathMatches` or `classpathContains`.
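For example, a condition for the classifier jar above could look like the following sketch, assuming `classpathContains` matches a plain substring of a classpath entry; the surrounding object and method are hypothetical.

```scala
object FfmpegCompat {
  // Enabled only when the android-arm-gpl classifier build of ffmpeg is on the classpath
  @enableIf(classpathContains("ffmpeg-5.0-1.5.7-android-arm-gpl"))
  def useAndroidArmGplDecoder(): Unit = ???
}
```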
Hints to show the full classpath:
sbt "show Compile / fullClasspath"
mill show foo.compileClasspath