Version: 1.0, 2012
If you want to redistribute this work, please share this information with me so I can link to it too.
The Groovy developer even said in a blog post that if he had known Scala existed and was about to come, he would never have developed Groovy. This statement alone says a lot about Scala.
Scala is a powerful language, and that power can cause problems for people who don't know Scala well enough. Hence this tutorial: to make Scala more friendly for programmers, and to present Scala in a compact way.
javap
- disassembles a class file; prints the contents of classes and packages (values, fields, methods)
scalap
- similar to javap, but for Scala classes
-optimise
- compiler option: generates faster bytecode by applying optimisations to the program
-unchecked
- option to scala or scalac: enables detailed unchecked (erasure) warnings; useful for testing / debugging
// First way:
( expression 1;
expression 2;
expression 3;
)
// Second way:
{ expression 1 // don't need to use a ';'
expression 2
expression 3
}
For a method call with exactly one argument you can use curly braces to surround the argument instead of parentheses.
The purpose of this ability to substitute curly braces for parentheses is to enable client programmers to write function literals between curly braces. This can make a method call feel more like a control abstraction.
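A minimal runnable sketch of this, using the standard List.map (the value names are illustrative):

```scala
val xs = List(1, 2, 3)
val a = xs.map(x => x * 2)    // parentheses around the single argument
val b = xs.map { x => x * 2 } // curly braces: reads like a control abstraction
```

Both calls are identical; the braces only change how the call reads.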
class X
val X = 1
new X // creates a new object of class X (X taken from the type namespace)
X // evaluates to 1 (X taken from the value namespace)
Identifier kind:
var x : String Pair Int = (1, "aa") // var x: Pair[String, Int] = ...
type X = M1 + M2 // for some type M1, M2, +[T, T2]
type +[A,B] = Pair[A,B]
Quantity[M + M2] // goes to Quantity[+[M, M2]]
head :: tail // apply method in class ::[T](head: T, tail: List[T])
When an operator name ends with ":", the operator binds to the right: it is invoked on the right operand, and the left side is passed as the argument (head :: tail desugars to tail.::(head)).
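For example, list construction with `::` (a sketch; both lines build the same list):

```scala
// '::' ends with ':', so it is invoked on the right operand:
val a = 1 :: 2 :: Nil   // parsed as 1 :: (2 :: Nil)
val b = Nil.::(2).::(1) // the explicit method-call form
```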
Method arguments have the modifier val by default (they cannot be reassigned).
The return keyword is not required.
The return type of a method which doesn't return a value is Unit. To make a method return Unit, omit the '=' before the body:
def f(){...}
0xcafebabe < 0 // true: an Int literal, the value overflows to negative
0xcafebabeL > 0 // true: a Long literal, the value fits
"""Hi "Robert", what's up?"""
stripMargin
- strips the margin from each line of a multi-line string: all leading whitespace up to and including the margin character (default "|") is removed.
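A small sketch of stripMargin on a multi-line raw string:

```scala
val text = """Hi "Robert",
              |what's up?""".stripMargin
// leading whitespace up to and including '|' is removed from each line
```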
final val x=5
type Action = () => Unit
Action is an alias to the function type ()=>Unit
                scala.Any
               /         \
          AnyVal          AnyRef
         /  |   \        /  |   \
   Double Int ... Unit String Seq List ...
         \  |   /        \   |   /
          \ |  /           Null
           \| /             |
            Nothing --------+
AnyRef == Object in Java.
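A few one-liners illustrating the hierarchy (the value names are illustrative):

```scala
val a: Any = 1        // every value is an Any
val r: AnyRef = "abc" // reference types live under AnyRef (java.lang.Object)
val n: String = null  // Null sits below every AnyRef type
def boom: Nothing = throw new RuntimeException // Nothing sits below every type
```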
Scala is functional, so every function should return something; if there is nothing to return, it returns Unit. This distinguishes null from () - the function still returns something (Unit), which has no connection to the null reference.
Option has a lot of advantages over null:
- A None value unambiguously means the optional value is missing. A null value may mean a missing value or it may mean the variable was not initialized.
- Having to unwrap the actual value (isEmpty, isDefined, get, getOrElse, map, filter) makes you think about the possibility that it might not be there and how to handle that situation, so you are less likely to write code that mistakenly assumes there is a value when there is not.
- A NoSuchElementException is raised when an empty Option is used. It is more specific than a NullPointerException and so should be easier to interpret, track down, and fix.
def getSystemProperty(s: String): Option[String] = ...
def loadPropertyFile(s: String): Option[Map[String, String]] = ...
val x = getSystemProperty("PROPFILE") flatMap loadPropertyFile // very nice implementation without boilerplate null checking
x flatMap (_.get("TIMEOUT")) map (_.toInt) getOrElse 60 // using monad-method chaining
implicit functions.
if (e1) e2 [else e3]
// if the else part is missing, the expression is equivalent to
if (e1) e2 else ()
while (e1) e2
do e1 [';'] while (e2)
scala.util.control.Breaks
- provides breakable / break for loops
Methods of a class and of its companion object have access to each other's private fields.
class X(a: Int){
def unary_* = a * 2 // compiles, but '*' is not one of the four prefix operators (+ - ! ~)
def unary_- = a-1
}
var x = new X(2)
-x // OK
*x // Error, * not available as unary operator
x.unary_* // ok, method call
== is predefined for every object as an alias to equals. If we want to change the behavior of ==, we should do it by overriding the equals method.
== is not type aware, since it is inherited from the Any class, where the signature of equals is equals(a: Any). So == takes an argument of any type, and there is no compilation error when comparing unrelated classes, e.g. Fish("dolphin") == Office("desk"). The result will always be false. The only help from the compiler is a warning when you write such a comparison.
To escape from this behavior you can use the === method from the Scalaz library, which is discussed in the Tools and libraries section.
An exception to the default == behavior are case classes, for which MyCase(pa1, pa2, ...) == MyCase(pb1, pb2, ...) corresponds to pa1 == pb1 && pa2 == pb2 && ...
eq and ne compare reference identity instead of calling equals.
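A quick demonstration of the difference between == and eq:

```scala
val s1 = new String("abc")
val s2 = new String("abc")
val sameContent = s1 == s2 // true: == delegates to equals, which compares content
val sameRef = s1 eq s2     // false: eq compares reference identity
```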
When we think about the implementation of a constructor and the apply method of the companion object, we need to remember that the companion object's apply method doesn't take part in inheritance and polymorphism.
Constructor parameters become public fields when prefixed with val or var (e.g. class X(var x: Int)).
class A (arg: Int){
def f(a: A) = {
println(arg) // OK, arg will be stored as a object member
println(a.arg) // Error, arg is private[this]
}
}
class X private(...)... // the primary constructor is private; instances can be created e.g. via the companion object
class A (arg: Int, private[this] var xt: Int){
//body of primary constructor is here:
xt = some_function() // xt is used only during initialization
var x1 = 1 // mutable variable
private var x2 = 2
private[this] val x3 = 3 // private constant - used only by this object, not accessible from outside!
val (x4, x5) = compute(arg) // in this case, compiler will create hidden object field for tuple.
val x6 = { // initialization using block code. All variables from block code are temporary
val t = 3
arg*t - xt*t
}
def this(arg: Int) = this(arg, 0) // auxiliary constructor, calls primary constructor
}
object A {
def apply(arg: Int) = new A (arg, 1)
}
abstract class Expectation[T] extends BooleanStatement {
val expected: Seq[T]
…
}
object Expectation {
def apply[T](expd: T ): Expectation[T] = new Expectation[T] {val expected = List(expd)}
def apply[T](expd: Seq[T]): Expectation[T] = new Expectation[T] {val expected = expd }
}
Here we explicitly define each apply to return Expectation[T]; otherwise it would return the structural subtype Expectation[T]{val expected: List[T]}.
Objects are instantiated with new ClassName(constructor args...).
Furthermore we can use helper methods from the companion object which can perform initialization for us.
In general you can't instantiate an inner class without specifying an outer class instance.
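A minimal sketch (Outer/Inner are illustrative names):

```scala
class Outer {
  class Inner {
    def whoAmI = "Inner"
  }
}
val o = new Outer
val i = new o.Inner // the inner class is instantiated via a concrete outer instance
// 'new Inner' alone, outside Outer, would not compile: no outer instance specified
```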
class X {
  var celsius = 1
  val v = 3
  def fahrenheit = celsius * 9 / 5 + 32 // getter without an associated field
  def fahrenheit_= (f: Float) { // setter without an associated field
    celsius = ((f - 32) * 5 / 9).toInt
  }
}
// goes to (conceptually)
class X {
  private[this] var v1 = 1 // hidden field behind celsius
  private[this] val v2 = 3 // hidden field behind v
  def celsius: Int = v1 // default getter
  def celsius_=(x: Int) { v1 = x } // default setter
  def v: Int = v2 // default getter; no setter, since v is a val
  def fahrenheit = .... // the same as original
}
The defs celsius and v are the implicitly created accessors; v has no setter method since it is a val. A var can be initialized with "= _", the default initializer, which sets the variable to the type's default value (numbers - 0, boolean - false, reference types including String - null), e.g. var s: String = _.
As we see in the example, we can define a setter and a getter without creating a field.
o.isInstanceOf[T]
- checks if o is of type T
o.asInstanceOf[T]
- casts o to type T; raises java.lang.ClassCastException when the cast cannot be performed
apply
- any object with an apply method can be called with function syntax:
val x = new SomeClass()
x(4, 'a') // compiles to x.apply(4, 'a')
To inherit we use the extends keyword, which goes just after the primary constructor declaration.
The primary constructor must call some base class constructor by applying arguments to the base class name (or leaving them out when the base class has a no-argument constructor).
class A1(x: Int) {
val param = init(x) // the base class calls init to initialize param
def init(x: Int) = x*2
}
class A2(x: Int) extends A1(f(x)) {
override def init(x: Int) = x*4
}
object A2 {
def f(x:Int) = x+3
}
val b = new A2(1) // the base class primary constructor calls the overridden init method
println(b.param) // prints 16: f(1) = 4, then A2.init(4) = 16
class A(name: String, vals: String*)
class B(name: String, vals: String*) extends A(name, vals: _*) // look how we apply vals!
From Scala Ref. §6.5:
The expression this can appear in the statement part of a template or compound type. It stands for the object being defined by the innermost template or compound type enclosing the reference. If this is a compound type, the type of this is that compound type. If it is a template of a class or object definition with simple name C, the type of this is the same as the type of C.this.
class O {
selfO =>
val name = "O"
trait I {
selfI =>
val name = "I"
def test() {
this.name // refers to "I"
selfI.name // refers to "I"
O.this.name // refers to "O"
selfO.name // refers to "O"
}
}
}
For example, we want to create a class Foo whose instances must also be of types Bar1 and Bar2:
trait Bar1
trait Bar2
class Foo {
self : Bar1 with Bar2 => // the alias name ("self" here) is arbitrary, but cannot be a keyword
...
}
val x = new Foo // error: class Foo cannot be instantiated because it does not conform to its self-type Foo with Bar1 and Bar2
val x = new Foo with Bar1 with Bar2 // OK
More info about self type annotations on scala pages
// we want to create base parametric type which requires that in subtypes the type parameter will be the subtype itself:
// So the constraint is that the subtypes must be the form of:
// S extends Base[S]
abstract class Base[Sub] {
self:Sub =>
def add(s:Sub) : Sub
}
case class Vec2(x:Double,y:Double) extends Base[Vec2] { // Ok
def add(that:Vec2) = Vec2(this.x+that.x, this.y+that.y)
}
// attempts to cheat with inheritance won't work:
case class Foo() extends Base[Vec2] // error: illegal inheritance;
If Base had some field, let's say base_field, and we referred to that field in the add method, we would get an error that class Sub doesn't have such a field. Class Base has it, but inside the definition of Base (because of the self-type annotation) we can refer only to members of Sub.
Consider a class Child with a method roomWith(aChild), which asserts that self and aChild are roommates. If there are two subclasses Boy and Girl, you will want to specialize this method in both, with signatures roomWith(aBoy) and roomWith(aGirl) respectively, while doing the actual work (which is presumably the same in both cases) in the Child method. The type system should check the correctness of every call.
abstract class Child[C <: Child[C]] {
self : C =>
var roomie : Option[C] = None
def roomWith(aChild : C)= {
roomie = Some(aChild)
aChild.roomie = Some(this)
}
}
class Boy extends Child[Boy]
class Girl extends Child[Girl]
val b1 = new Boy
val b2 = new Boy
val c1 = new Girl
b1.roomWith(b2)
b1.roomWith(c1) // error: type mismatch, a Girl is not a Boy
1 to 5 // Range(1, 2, 3, 4, 5)
for (seq) yield expr
for (seq) expr
seq is a sequence of generators, definitions, filters with semicolons between successive elements.
for ((a, b) <- range // generator
     if x > 10 // filter
     if ...; // needs ';' before the nested expression
     y <- range if ... // nested generator + filter expression
     CaseCl(tmp1, tmp2) = y.some_function; // definition
     if (predicate(tmp1)) // another filter
) [yield] {
  block code // returns something
}
We can use {} instead of () around the sequence to avoid putting ';' after each element.
Caution: the definition part is recomputed every time a new value is taken from the generators.
If the definition part doesn't depend on variables bound by some generator, it is better to put it outside the for expression:
for (x <- 1 to 1000; y = expensiveComputationNotInvolvingX) // BAD to put y here
yield x * y
// Better solution
val y = expensiveComputationNotInvolvingX
for (x <- 1 to 1000) yield x * y
A for ... yield ... expression is translated to some composition of the map, flatMap and withFilter functions. A for loop is translated to some composition of the withFilter and foreach functions.
map, flatMap and withFilter expect a function as their first argument, which can be a partial function. This is used to convert a for expression to higher-order functions:
for ((x1 , ..., xn ) <- expr1 ) yield expr2
// is translates to:
expr1 .map { case (x1 , ..., xn ) => expr2 }
for (pat <- expr1 ) yield expr2
// pat is general pattern, thus translation is a bit complicated:
expr1 withFilter {
case pat => true
case _ => false // this guarantees that match never throw a MatchError
} map {
case pat => expr2
}
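A concrete instance of this translation (the map form is what the compiler generates, modulo a withFilter step for refutable patterns):

```scala
val pairs = List((1, 2), (3, 4))
val viaFor = for ((x, y) <- pairs) yield x + y
val viaMap = pairs.map { case (x, y) => x + y } // hand-written translation
```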
Furthermore, you can characterize every monad by map, flatMap, and withFilter, plus a "unit" constructor that produces a monad from an element value. In an object-oriented language, this "unit" constructor is simply an instance constructor or a factory method.
try-catch-finally returns a value. The value comes only from the try block (if no exception occurs) or from the catch block (if an exception is thrown and caught). The value computed in the finally clause is dropped. A finally clause should not normally change the value computed in the main body or a catch clause of the try.
If a finally clause includes an explicit return statement, or throws an exception, that return value or exception will "overrule" any previous one that originated in the try block or one of its catch clauses. See more in Programming in Scala e2 (page 172).
val f = new FileReader("input.txt")
try {
...
} catch {
case ex: FileNotFoundException => // handle missing file
case ex: IOException => ...
} finally {
f.close()
}
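Two tiny methods illustrating both rules (a sketch; the compiler warns about both shapes precisely because they are surprising):

```scala
def dropped(): Int = try { 1 } finally { 2 }          // value of the finally block is dropped: returns 1
def overruled(): Int = try { 1 } finally { return 2 } // explicit return in finally overrules: returns 2
```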
Function values can be kept in vars and vals. A function value taking n arguments is an instance of FunctionN, e.g. (Int, Int) => Double is Function2[Int, Int, Double]. Function values have methods: apply to call them, curried to transform them to curried form (each argument passed separately), plus toString, andThen, compose, ...
println(("abc".length _)()) // prints 3
println("abc".length _) // prints something like <function0>: the unapplied function value
Forgetting to apply a function value prints the function object instead of the result you expected.
Quite good article about this can be found here
(_: Int) + (_: Double) is an instance of Function2[Int, Double, Double] (each '_' introduces a new parameter), and val f = () => 1 is a constant function, an instance of Function0[Int].
Scala have syntactic sugar for function types:
() => R
stands for Function0[R]
(A, B) => R
stands for Function2[A, B, R]
and so on ...
"someMethod _" wraps someMethod into a function value, whose apply method is exactly someMethod.
val nums = List(1,2,3)
nums.foreach(println _) // ok, a function value is required; println is converted to a function value
nums.foreach(println) // ok, the compiler expects a function, so the conversion is automatic
def succ1(i: Int) = i+1
val succ2 = { i: Int => i+1 }
var p = succ1 // ERROR: succ1 is a method, not a first-class value
var p = succ1 _ // ok, succ1 is converted to a function value by '_'
var p: Int=>Int = succ1 // ok, the compiler expects a function, so the conversion is automatic
// succ2, p and succ1 _ are all equal by type and semantics
def method_g[T](x:T) = x.toString.substring(0,4)
val fun_g = [T](x:T) => x.toString.substring(0,4) // ERROR: functions can't be generic
//** but we can work around the restriction: **
class Cfun_g[T] extends Function1[T,String] {
def apply(x:T) = x.toString.substring(0,4)
}
val fun_g = new Cfun_g[String]
fun_g("this is a string")
someList.reduceLeft(_ + _) // each '_' stands for the next argument in turn
require
- raises IllegalArgumentException; use it to validate function / constructor arguments
assert
- raises AssertionError
Partially applied functions - applying to none of the arguments:
var f = someFunction _
applying to some of the arguments, e.g. a function with 3 arguments of types String, Int, Int:
var f = someFunction("hej", _: Int, 3)
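A runnable sketch with an illustrative add3:

```scala
def add3(a: Int, b: Int, c: Int): Int = a + b + c
val whole = add3 _               // applied to none of the arguments: (Int, Int, Int) => Int
val partial = add3(1, _: Int, 3) // first and last argument fixed: Int => Int
val r1 = whole(1, 2, 3)
val r2 = partial(2)
```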
val c=1
is implemented as a pair:
private final int c;
public int c(); // getter
var v=1
is implemented as a triple:
private int v;
public int v(); // getter
public void v_$eq(int); // setter
val f = {x:Int => x+1}
is implemented as a pair:
private final scala.Function1 f; // field holding the function value as a constant (f is a val)
public scala.Function1 f(); // getter returning the function
var f = {x:Int => x+1}
is implemented as a triple:
private scala.Function1 f; // field holding the function value as a variable (f is a var)
public scala.Function1 f(); // getter returning the function
public void f_$eq(scala.Function1); // setter
def echo(args: String*) = for(a <- args) println(a)
def pass_to_echo(args: String*) = echo(args: _*)
var arr = Array("jeden", "dwa")
echo("jeden", "dwa")
echo(arr) // Error
echo(arr: _*) // OK
def f(arg1: Int, arg2: String) = ...
f(arg2="hej", arg1=1)
It is also possible to mix positional and named arguments. In that case, the
positional arguments come first.
def f(arg: Int = 2)
def f(a: => Unit) = { a; a } // a is an argument passed by name, evaluated on each use
f(println("hej")) // prints two lines of "hej"
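The re-evaluation is observable with a side effect (an illustrative sketch):

```scala
var count = 0
def twice(a: => Unit): Unit = { a; a } // 'a' is re-evaluated at every use
twice { count += 1 }
// count is now 2: the block was passed unevaluated and run twice
```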
import scala.collection.mutable.ListBuffer
val actions: ListBuffer[() => Unit] = ListBuffer() // list of operations to perform later
def insert(condition: Boolean)(operation: => Unit) = {
if (condition) actions.append(() => operation)
}
In the previous example insert puts an unevaluated expression of type Unit into the action list, wrapped in a function which evaluates the expression each time it is called.
var count = 0
val e = { count += 1 } // wrong approach: the block is evaluated immediately, so count == 1 and e == ()
insert(true){
println("incrementing")
count += 1
} // nothing is printed, count is not incremented
insert(true)(println("hej"))
actions(0) // returns a function which evaluates {println("incrementing"); count+=1}
actions(0)() // count == 2
actions(0)() // count == 3
actions(1)() // output "hej" to console
&& is a method of the Boolean class and works like in other languages: the second argument is evaluated only when the first is true. This is done with an argument passed by name; the implementation looks similar to:
class Boolean(left: Boolean) ...
def &&(right: => Boolean): Boolean =
if (left) right
else false
def foo(a: List[Int]): Unit = ...
def foo(a: Seq[Int]): Unit = ...
foo(1::2::Nil) // the compiler will choose foo: List[Int] => Unit
Due to type erasure, the foo methods below have the same type:
def foo(x: List[Int])
def foo(x: List[Boolean])
So at runtime the client code can't figure out which one to use.
We can get around this limitation using union types, described in the tips section. Another possibility is to use a family of case classes + pattern matching + implicit conversions from the desired generic types to the case classes. More: http://jim-mcbeath.blogspot.com/2008/10/polymorphism-using-implicit-conversions.html
// assuming above foo definition
val my_fun = foo _ // what is the type of my_fun? which foo to use?
// assuming above foo definition
// and we have some implicit conversion from String to Int and Boolean
foo("") // which implicit conversion use: String => Int or String => Boolean?
def f(s: String) = ...
def f(s: String, cond: Boolean = true) = ... // ambiguous with the first overload when calling f("x")
A trait can have a super call on a method declared abstract. Such calls are illegal for normal classes.
Since super calls in a trait are dynamically bound, the super call for an abstract method in a trait will work as long as the trait is mixed in after another trait or class that gives a concrete definition to the method. This arrangement is frequently needed with traits that implement stackable modifications. To tell the compiler you are doing this on purpose, you must mark such methods as abstract override.
The order of mixins is significant.
The method call order is determined by linearization. Roughly speaking, when you call a method on a class with mixins, the method in the trait furthest to the right is called first. If that method calls super, it invokes the method in the next trait to its left (possibly a parent trait, not a class!), and so on.
The linearization of a class is computed from back (of the declaration order) to front; the last part of the linearization is the superclass.
Traits can extend other traits. In that case the overriding version of a method is called first (e.g. with trait T2 extends T1; class X with T2 with T1; var x = new X, where all define a method f: if T2 overrides f, then T2.f is called first in the expression x.f).
abstract class IntQueue {
def get(): Int
def put(x: Int)
}
class BasicIntQueue extends IntQueue {
import scala.collection.mutable.ArrayBuffer
private val buf = new ArrayBuffer[Int]
def get() = buf.remove(0)
def put(x: Int) { buf += x }
}
trait Doubling extends IntQueue {
abstract override def put(x: Int) { super.put(2 * x) }
}
trait Incrementing extends IntQueue {
abstract override def put(x: Int) { super.put(x + 1) }
}
class MyQueue extends BasicIntQueue with Doubling
val queue = new MyQueue with Incrementing
val queue = new BasicIntQueue with Doubling with Incrementing
// example with parameter constructor
class WithParameter(arg: Int) extends MyQueue with Doubling
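Tracing one call through the linearization above (a runnable sketch restating the queue example with explicit `: Unit =` syntax so it compiles on both Scala 2 and 3):

```scala
import scala.collection.mutable.ArrayBuffer

abstract class IntQueue {
  def get(): Int
  def put(x: Int): Unit
}
class BasicIntQueue extends IntQueue {
  private val buf = new ArrayBuffer[Int]
  def get(): Int = buf.remove(0)
  def put(x: Int): Unit = { buf += x }
}
trait Doubling extends IntQueue {
  abstract override def put(x: Int): Unit = super.put(2 * x)
}
trait Incrementing extends IntQueue {
  abstract override def put(x: Int): Unit = super.put(x + 1)
}

val queue = new BasicIntQueue with Doubling with Incrementing
queue.put(10)            // Incrementing (rightmost mixin) runs first: super.put(10 + 1)
val result = queue.get() // Doubling then stored 2 * 11 = 22
```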
Ordered
- has one abstract method which needs to be implemented: compare(that: X): Int
Application
- doesn't work in multi-threaded applications, has no access to command-line (startup) arguments, and is not well optimized
App
- takes care of every downside of the Application trait: works in multi-threaded applications, has access to command-line arguments, and is optimized. Available in Scala >= 2.9. More in the Scala API docs
DelayedInit
- in a class which implements it, the code from the main constructor (variables and expressions evaluated at initialization time) is moved to the delayedInit() method. This gives us control over the initialization time of an object; to correctly use such an object we need to call delayedInit first
Dynamic
- a trait that gives the object which mixes it in a dynamic nature (member selections are resolved at run time). More in the Dynamic proposal
package bobsrockets.navigation
package bobsrockets.navigation {
class Navigator
}
The binary file structure depends on packages, not on the source file structure. Every definition is compiled to exactly one binary .class file, placed according to the package in which the definition lives.
All the classes, traits and companion objects from package Pack1.Pack1_1 in some module (file) are compiled to the Pack1/Pack1_1 subdirectory of the output directory.
In order to run a class Main which is in package Pack1.Pack1_1 you need to run: scala Pack1.Pack1_1.Main
You need to ensure that the directory containing Pack1 is on your CLASSPATH: either run from that directory, or add its path through the -cp option or the CLASSPATH environment variable.
Exception in thread "main" java.lang.RuntimeException: Cannot figure out how to run target: Main
(e.g. after running scala Main instead of scala -cp . test.Main)
Exception in thread "main" java.lang.RuntimeException: Cannot figure out how to run target: test.Main
(e.g. when the package root is not on the CLASSPATH; fixed by scala -cp .. test.Main from inside the test directory)
The exception is raised because the class (object) doesn't have the main method:
def main(args: Array[String]) { ... }
or because you tried to run the wrong class - forgot to specify the package. Run scala X.Y.Main, ensuring that the directory containing X is on your CLASSPATH.
If there are two packages with the same name, e.g. launch: one in the global scope and a second inside package bob, then a method inside package bob that accesses package launch refers to bob.launch.
If you want to access the global launch you need to write:
_root_.launch
Put another way, every top-level package you can write is treated as a member of package _root_.
import bobsdelights.Fruit // easy access to Fruit
import bobsdelights._ // easy access to all members of bobsdelights
// import a simple name x. This includes x in the set of imported names
import x
def showFruit(fruit: Fruit) {
import fruit._
println(name +"s are "+ color) // the same as fruit.name, fruit.color
}
// imports the objects Apple and Orange from Fruits, renaming Apple to McIntosh
import Fruits.{Apple => McIntosh, Orange}
// imports all names from Fruits and renames Apple
import Fruits.{Apple => McIntosh, _}
// imports all members of Fruits except Pear
import Fruits.{Pear => _, _}
A class labeled private[bobsrockets] is visible in all classes and objects contained in package bobsrockets, but no code outside package bobsrockets can access it.
A definition labeled private[this] allows access only from the same object - any access must be made from the very same instance.
An interesting field type is private[this] val - nobody can modify it, and only the object itself has access to its value. There is a proposal for optimization: such a field would not take any memory space in the object, and in the places where it is used the value would be compiled in (as a temporary value).
Modifier protected[X] in a class C allows access to the labeled definition in all subclasses of C and also within the enclosing package, class, or object X.
One exception concerns protected static members. A protected static member of a Java class C can be accessed in all subclasses of C. By contrast, a protected member in a companion object makes no sense, as singleton objects don't have any subclasses.
trait Queue[T] {
def head: T // clients interface
...
}
object Queue {
def apply[T](xs: T*): Queue[T] = // clients factory method hiding actual constructor complexity
new QueueImpl[T](xs.toList, Nil)
private class QueueImpl[T]( // True class inaccessible from outside
private val p1: List[T], // private parameters, inaccessible even from Queue companion object
private val p2: List[T]
) extends Queue[T] {
... // Queue implementation
}
}
So when we create a QueueImpl object through the factory method from the Queue companion object, the result is visible outside only as a Queue, and clients have access only to the members of trait Queue.
This technique is used when we want to hide the whole implementation class.
We make package object by writing: package object package_name {... }
The contents of the curly braces can include any definitions you like.
// file gardening/fruits/Fruit.scala
package gardening.fruits
case class Fruit(name: String, color: String)
object apple extends Fruit("Apple", "green")
// in file gardening/fruits/package.scala
package gardening
package object fruits {
val planted = List(apple, apple)
def showFruit(fruit: Fruit) {
println(fruit.name +"s are "+ fruit.color)
}
implicit def fruit2string(f: Fruit):String = f.name + " " + f.color
}
Package objects are compiled to files named package.class, located in the directory of the package that they augment. So the package object fruits would be compiled to a class with the fully qualified name gardening.fruits.package. (Note that, even though package is a reserved word in Java and Scala, it is still allowed as part of a class name on the JVM level.)
Assertions can be turned on/off (so the AssertionErrors are or are not thrown by assert / ensuring) using the JVM command-line flags -ea, -da.
Predef.assert(assertion: Boolean, [message: => Any]): Unit
If the optional message is given, assert will call toString on it to get a string explanation.
Ensuring
.ensuring(cond: (A) => Boolean, [msg: => Any]): A
.ensuring(cond: Boolean, [msg: => Any]): A
On failure it throws an AssertionError with the optional msg as an argument.
Example:
var x = 2
if(x<2)
x
else {
val y: Int = 2
y+x
} ensuring ( _ >= 4, "x is >= 2 so y+x is >=4")
The block after else returns (y+x): Int, which is implicitly converted to Ensuring with itself as the wrapped value. Then the cond function is applied (y+x >= 4), and if the result is false an exception is thrown.
Case classes don't need the new keyword when instantiating; they have factory methods, e.g. val v = Num(1).
The compiler adds toString, hashCode, and equals to your class. They will print, hash, and compare a whole tree consisting of the class and (recursively) all its arguments. Since == in Scala always delegates to equals, this means that elements of case classes are always compared structurally.
Pattern matching has the form: selector match { (case pattern => expression)* }.
When we want an existing variable's value to be used in the match (instead of introducing a fresh pattern variable), it must be uppercase or, if lowercase, written in backquotes:
var x = 2
(1+1) match {
case `x` => true // refers to the existing variable x
case x => true // always TRUE! x is taken as a free variable and bound to the match expression (1+1)
case X => ... // only accepts a value equal to the value X (upper case makes a difference)
case z if z == x => true // z is a free variable bound to the match expression (1+1); matches when the "if" guard is true
}
Infix operators in patterns are also an exception (e.g. x :: tail): the pattern does not mean the call tail.::(x) but the constructor pattern ::(x, tail).
def simplify(expr: Expr): Expr = expr match {
case UnOp("-", UnOp("-", e)) => e // Double negation
case BinOp("+", e, Number(0)) => e // Adding zero
case BinOp("*", e, Number(1)) => e // Multiplying by one
case List(0, _*) => println("found it") // a variable-length sequence beginning with 0
case UnOp("abs", e @ UnOp("abs", _)) => e // when the match succeeds, e is bound to UnOp("abs", _)
case BinOp("+", x, x) => BinOp("*", x, Number(2)) // this fails, patterns must be linear.
// pattern variable may only appear once in a pattern.
case BinOp("+", x, y) if x == y => ... // pattern guard. Reformulation of upper
case s:String if s(0) == 'a' => ... // pattern guard. Reformulation of upper
case BinOp(op, l, r) => BinOp(op, simplifyAll(l), simplifyAll(r)) // recursive match further
// other matchers:
case x: String => x.length // any String
case x: Map[_, _] => x.size // any Map; we can't be more precise (e.g. Map[_, Int]) because of type erasure
case (x, y, ..., z) => ... // only accepts a tuple of the same arity
case Extr() => ... // only accept if Extr.unapply(expr) returns Some(Seq()) - some of something/empty sequence
case Extr(x) => ... // only accept if Extr.unapply(expr) returns Some(Seq(x)) or Some(Tuple1(x))
case Extr(x, y, ..,z) => ... // only accept if Extr.unapply(expr) returns Some(Seq(x,y,...,z)) or Some(TupleN(x,y,...z)) - the same arity
case x Extr y => .. // only accept if Extr.unapply(expr) returns Some(Seq(x,y)) or Some((x,y))
case x | y | ... | z => ... // accepts if any of the patterns is accepted (patterns may not contain assignable identifiers)
case _ => expr
}
A parametrized type pattern T [a(1), . . . , a(n)], where the a(i) are type variable patterns or wildcards _. This type pattern matches all values which match T for some arbitrary instantiation of the type variables and wildcards. The bounds or alias type of these type variable are determined as described in (§8.3).
...
A type variable pattern is a simple identifier which starts with a lower case letter. However, the predefined primitive type aliases unit, boolean, byte, short, char, int, long, float, and double are not classified as type variable patterns.
case x: Seq[a] => ... // matches any Seq; the element type is bound to the type variable a
The conclusion is that we can't specify type parameters with fully qualified names (like java.lang.Integer).
If we need to use a type from some package, we must make a type alias starting with an upper-case letter:
type JavaInt = java.lang.Integer
def describe(e: Expr): String = (e: @unchecked) match {
case Number(_) => "a number"
case Var(_) => "a variable"
// case BinOp(... // known from context, that never be available
}
The patterns in for
expressions can be used to extract values from an object providing map/flatMap/filter/withFilter/foreach functions.
val exp = new BinOp("*", Number(5), Number(1))
val BinOp(op, left, right) = exp // extract from val, throws exception if not succeeded
// filters for pattern, but pattern cannot be "identifier: Type", though that can be replaced by "id1 @ (id2: Type)"
for (pattern <- object providing map/flatMap/filter/withFilter/foreach) ...
val withDefault: Option[Int] => Int = {
case Some(x) => x
case None => 0
}
A block of case clauses is a partial function, PartialFunction[A, R]. If you apply such a function to a value it does not support, it generates a run-time exception (scala.MatchError). For example, a partial function that returns the second element of a list of integers:
val second: List[Int] => Int = {
case x :: y :: _ => y
}
To get rid of the compiler warnings you need to declare that you know you are working with a partial function, by giving it the proper type. An (A1, A2, ..., An) => A function is a total function from A1 x A2 x ... x An to A. PartialFunction[A, R] is a partial function from A to R (for several arguments, A is a tuple type).
val second: PartialFunction[List[Int],Int] = {
case x :: y :: _ => y
}
The PartialFunction companion object contains a couple of interesting functions which take partial functions as arguments, to be used for matching in a functional style:
import PartialFunction._
cond("abc") { case "def" => true } // result: false
condOpt("abc") { case x if x.length == 3 => x + x } // result: Option[java.lang.String] = Some("abcabc")
condOpt("abc") { case x if x.length == 4 => x + x } // result: Option[java.lang.String] = None
Before applying a partial function you can use isDefinedAt to check whether it can handle a given value:
second.isDefinedAt(List(5,6,7)) // returns true
second.isDefinedAt(List()) // returns false
Another interesting method is lift, which turns a PartialFunction[T, R] into a Function[T, Option[R]], so that non-matching values result in None instead of throwing an exception.
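For instance, lifting the second partial function from above:

```scala
val second: PartialFunction[List[Int], Int] = { case _ :: y :: _ => y }
val lifted: List[Int] => Option[Int] = second.lift // no more MatchError
val some = lifted(List(5, 6, 7)) // Some(6)
val none = lifted(List())        // None
```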
Extractors work with match, assignment and for comprehension expressions.
Scala defines two kinds of extractor methods. For case C(...), if C has an unapply method then it is called with the matched object. The return type of unapply depends on the case clause, and should be chosen as follows:
C() - then it is just a test; return a Boolean
C(param: T) - then return Option[T]
C(param_1: T1, .., param_n: Tn) - then return Some((param_1, .., param_n)): Option[(T1, ..., Tn)] (the Tuple Option)
C(param: T*) - then return Option[Seq[T]]
unapplySeq is used instead of unapply to match a variable number of parameters, with the first ones specified: C(param_1: T, .., param_n: T, param: T*) is matched using unapplySeq(match_obj): Option[Seq[T]]. If the extractor returns None then the match does not succeed.
More about extractors on daily-scala
case class Food(food:String)
case class Name(name:String)
object Eats {
def unapply(desc:String):Option[(Name,Food)] = {
val i=desc.indexOf(" eats ")
if (i> -1)
Some((Name(desc.substring(0,i)), Food(desc.substring(i+6))))
else None
}
}
val x= "Brutus eats meat" match { case Eats(f,n) => (f,n) } // x=(Name(Brutus),Food(meat))
val Eats(f,n) = "Brutus eats meat" // f=Name(Brutus), n=Food(meat)
val eats_l = List("A eats B", "B ate C", "C not D", "E eats F")
for (Eats(f, n) <- eats_l) yield f // returns List(Name("A"), Name("E"));
// the other strings don't match Eats(f, n) and are filtered out
def msort[T](comp: (T, T) => Boolean)(l: List[T]): List[T] =
... // the body of the sort method
// List has a built-in method sortWith
// using the methods:
1. val l = List(4,2,3)
2. l sortWith (_ > _) // OK
3. msort(_ > _)(l) // ERROR
4. msort[Int](_ > _)(l) // OK
5. msort((a1: Int, a2: Int)=> a1 > a2)(l) // OK
The problem is that the compiler needs to know the method's type parameters to instantiate it. In line 2 it knows how to instantiate sortWith, because sortWith is a member of the object and has no generic parameters other than the object's. In line 4 it knows the concrete type of msort, because we set the type parameter explicitly. In line 5 it can look at the arguments to infer the type parameter. However, in line 3 the compiler can't look at the parameter l, because msort is a curried function and is instantiated step by step: there are two calls, and the earlier call needs to be instantiated separately. If msort were rewritten so that its parameter lists were swapped, the code would run without error.
This inference scheme suggests the following library design principle: when designing a polymorphic method that takes some non-function arguments and a function argument, place the function argument last, in a curried parameter list of its own. That way, the method's correct instance type can be inferred from the non-function arguments.
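The principle can be illustrated with a hypothetical msort2 whose parameter lists are swapped relative to msort above, so that the type parameter is inferred from the list:

```scala
object InferenceDemo extends App {
  // hypothetical msort2: the list comes first, the function argument
  // last, in a curried parameter list of its own
  def msort2[T](l: List[T])(comp: (T, T) => Boolean): List[T] =
    l.sortWith(comp)

  val l = List(4, 2, 3)
  // T is inferred from l, so the function literal needs no annotations
  assert(msort2(l)(_ > _) == List(4, 3, 2))
}
```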
Example:
val xss : List[A] = ...
(xss :\ List[B]()) ( op ) // fold right operation
The type of op is (A, B) => B. op is not the only argument related to the type parameters: the compiler first instantiates the method :\ of List[A] (of type [B](z: B)((A, B) => B): B) from its first argument list, and only then checks op.
We write type parameters just after type name in squared braces.
def foo[T1,T2](arg1: T1, arg2: List[T2]) = ...
class Foo[T1,T2](arg1: T1) { ...
// specifying the type parameters explicitly
val specified_foo = foo[Int, String] _
To specify the type parameters of a parametrized (generic) type we put them in square brackets. When we use a generic method, in most cases the compiler can infer the type parameters for us, so we don't need to specify them: foo(1, "1"::Nil)
val p1: String Pair Int = ("1", 1)
val p2: Pair[String, Int] = ("1", 1)
But this syntax is quite odd for most type names.
The situation is different if the type name looks like an operator.
So it can be useful when we create a type alias:
type ##[A,B] = Pair[A,B]
val p3: String##Int = ("1", 1)
def f(q: Queue) // Error: we need to pass type parameters to Queue
A[] is covariant <=> for each pair of types P1, P2: if P1 <: P2 then A[P1] <: A[P2]. Covariance is declared with A[+P].
A[] is contravariant <=> for each pair of types P1, P2: if P1 <: P2 then A[P1] >: A[P2]. Contravariance is declared with A[-P].
This is very important as it explains why covariance can cause some issues. Contravariance is literally the opposite of covariance: parameters vary upward with subtyping. It is a lot less common partially because it is so counter-intuitive, though it does have one very important application: functions.
trait Output[-T] {
def write(x: T)
}
Let's have two outputs: one of Seq and one of List. Since List <: Seq, contravariance gives Output[Seq] <: Output[List].
All a function taking an Output[List] does is write Lists to it. The same operation can be done with an Output[Seq]. So it's safe to use Output[Seq] in place of Output[List].
On the other hand, if a function expects Output[Seq] but gets Output[List], it can't perform write(some_seq) (because write expects a List).
trait Function1[-P, +R] {
def apply(p: P): R
}
This declaration as a whole means that Function1 is contravariant in P and covariant in R. Thus, we can derive the following axioms:
T1' <: T1
T2 <: T2'
---------------------------------------- S-Fun
Function1[T1, T2] <: Function1[T1', T2']
Example:
def f: Seq => String
def g: List => AnyRef
def F: (List => AnyRef) => AnyRef
def F2: (Seq => String) => String
So: f <: g
and we can pass f
and g
to F
, but only f
fits to F2
.
What would happen if we pass g
to F2
?
F2 would call g(some_seq), where g expects a List and performs List-specific operations on its argument - we get an error! As an exercise one can consider the return types.
Types of mutable fields are classified as non-variant, since a mutable field has a corresponding setter method (whose argument position is contravariant) and a getter method (whose return position is covariant).
private[this] variables (vars and vals) do not affect variance and don't cause problems. The intuitive explanation is that, in order to construct a case where variance would lead to type errors, you need to have a reference to a containing object that has a statically weaker type than the type the object was defined with. For accesses to object-private values this is impossible.
class N[T] // non-variant
class C[+T] // covariant
class Cr[-T] // contravariant
def f1(a: N[AnyRef])
def f2(a: C[AnyRef]) // f2 accepts covariant parameters
def f3(a: Cr[Null]) // f3 accepts contravariant parameters
f1(new N[String]) // Error
f2(new C[String]) // OK
f3(new Cr[String]) // OK
//hypothetical code - it doesn't compile, due to the covariance violation below
class Cell[T](init: T) {
var current:T = init // error: covariant type T occurs in contravariant position
}
val c1 = new Cell[String]("abc")
val c2: Cell[Any] = c1
c2.current = 1 // so far so good
val s: String = c1.current // oops! statically this type-checks (c1 has type Cell[String]), but current now holds an Int
abstract class GenList[+T] { ...
def prepend(x: T): GenList[T] = // illegal! T in contravariant position
new Cons(x, this)
}
With your new-found knowledge of co- and contravariance, you should be able to see why this example will not compile: T is covariant, while the prepend method uses it as a parameter type, which is a contravariant position. Thus, T is varying in the wrong direction. Interestingly enough, we could solve this problem by making GenList contravariant in T, but then the return type GenList[T] would be invalid, as method return types are covariant positions. The solution is a polymorphic prepend method which takes T as a lower bound:
abstract class GenList[+T] { ...
def prepend[S>:T](x: S): GenList[S] = // now is OK :)
new Cons(x, this)
}
As an example, suppose there is a class Fruit with two subclasses, Apple and Orange. With the new definition of class GenList, it is possible to prepend an Orange to a GenList[Apple]. The result will be a GenList[Fruit].
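A runnable sketch of the Fruit example, assuming a minimal GenList with hypothetical Cons and GenNil implementations:

```scala
object LowerBoundDemo extends App {
  class Fruit
  class Apple extends Fruit
  class Orange extends Fruit

  // minimal GenList; Cons and GenNil are illustrative assumptions
  abstract class GenList[+T] {
    def prepend[S >: T](x: S): GenList[S] = new Cons(x, this)
  }
  case object GenNil extends GenList[Nothing]
  class Cons[T](val head: T, val tail: GenList[T]) extends GenList[T]

  val apples: GenList[Apple] = new Cons(new Apple, GenNil)
  // prepending an Orange widens the element type to Fruit
  val fruits: GenList[Fruit] = apples.prepend(new Orange)
  assert(fruits.isInstanceOf[Cons[_]])
}
```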
sort[T]: ((T, T) => Int, List[T]) => List[T] // pass a comparison function explicitly
Alternatively we can put an upper bound on the type parameter: T <: Ordered[T]. Then we know the type supports the < > ... methods: sort[T <: Ordered[T]]: List[T] => List[T]
trait Abstract {
type T // abstract type
def transform(x: T): T
val initial: T
var current: T
}
Any implementation of an abstract val must be a val definition (not a var or def).
Abstract vals sometimes play a role analogous to superclass parameters. This is particularly important for traits, because traits don’t have a constructor to which you could pass parameters. So parametrizing a trait works via abstract vals that are implemented in subclasses.
CAUTION! A class parameter argument is evaluated before it is passed to the class constructor (unless the parameter is by-name). An implementing val definition in a subclass, by contrast, is evaluated only after the superclass has been initialized. So values depending on the definition of an abstract val should also be initialized in the subclass, or by using pre-initialized fields or lazy vals. A concrete implementation of an abstract var can be a var or a pair of corresponding def getter/setter methods.
Traits can be instantiated by anonymous class that mixes in the trait.
To instantiate a trait, you need to implement the abstract definitions. Here is an example:
trait T{
val arg: Int
val t = 2* arg
}
var x = new T { val arg = expr } // instantiation of trait T
// CAUTION!!! x.t has an unexpected value - see below
However there is a subtle difference between class and trait initialization. The expressions which define abstract members are evaluated as part of the initialization of the anonymous class, but the anonymous class is initialized after the trait. So the concrete values are not available during initialization of the trait - instead selecting them yields the default value (like 0, "", null). E.g. for expr = 2 we get x.t == 0, which could be quite erroneous.
In a pre-initialized field definition such as val pre = this.x, this does not denote the object being constructed. Consequently, if such an initializer refers to this, the reference goes to the object containing the class or object that is being constructed, not to the constructed object itself.
To make pre-initialized fields simply place the anonymous class definition in braces before the superclass constructor call.
Pre-initialized fields can be used in traits, objects or named subclasses.
trait T {
val arg :Int
val t=2*arg
}
val x1= new T { val arg=2} // constructs anonymous class with body "val arg=2"
val x2= new { val arg=2} with T // inherits from anonymous class which body is "val arg=2" which act the PRE-INITIALIZED FIELD
object x3 extends T { val arg=2}
object x4 extends { val arg=2} with T // object inherits from anonymous class which body is "val arg=2"
class X5 (x: Int) extends T { val arg=x }
val x5 = new X5(2)
class X6 (val arg: Int) extends T
val x6 = new X6(2)
class X7 extends T { val arg=2 } // here we implicitly inherit from AnyRef
val x7 = new X7
class X8 extends { val arg=2 } with T // this is general rule. We inherit from anonymous class
val x8 = new X8
x1.t // 0
x2.t // 4
x3.t // 0
x4.t // 4
x5.t // 0
x6.t // 4
x7.t // 0
x8.t // 4
trait T {
val arg :Int
lazy val t=arg*g
lazy val t2=arg/g
lazy val g = { // initialized on demand, before the initialization of t and t2 completes
require(arg>0)
1000/arg
}
}
val x = new T { val arg=2 } // now x.t yields a consistent value: arg*g == 1000
As we can see, the initialization order doesn't matter as long as it doesn't produce side effects nor depend on them. g is initialized before the initialization of t and t2 completes, because they need it.
object X {
println("hej");
}
// so far we don't get the message "hej"
X // this is the time the message "hej" appears
class Food
abstract class Animal {
def eat(food: Food)
}
class Grass extends Food
class Cow extends Animal {
override def eat(food: Grass) {} // This won’t compile
}
We've got: error: class Cow needs to be abstract, since method eat in class Animal of type (Food)Unit is not defined
error: method eat overrides nothing...
What happened is that the eat
method in class Cow
does not override the eat
method in class Animal
, because its parameter type is different - it’s Grass in class Cow vs. Food in class Animal.
This behavior is justified. To see why, consider the case where the previous example were type-correct. Then, given another class Fish <: Food,
we could call:
val bessy: Animal = new Cow
bessy eat (new Fish) // disappointment - you could feed fish to cows.
class Food
abstract class Animal {
type SuitableFood <: Food
def eat(food: SuitableFood)
}
class Grass extends Food
class Cow extends Animal {
type SuitableFood = Grass
override def eat(food: Grass) {}
}
val x = new Animal { type SuitableFood = Food; override def eat(food: SuitableFood) {} }
The type of x will be a refinement of Animal (an anonymous subtype of Animal).
We can describe the objective of structural subtyping as follows:
Suppose you want to collect all animals which eat grass in a list. There are two simple solutions:
1. define a trait GrassEaters and mix it into every Animal class whose SuitableFood is >: Grass: val l: List[GrassEaters] = List(...)
2. use structural subtyping: val l: List[Animal { type SuitableFood = Grass }] = List(...)
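The structural-subtyping variant can be sketched as follows (Cow, Sheep and the empty eat body are illustrative assumptions):

```scala
object GrassDemo extends App {
  class Food
  class Grass extends Food
  abstract class Animal {
    type SuitableFood <: Food
    def eat(food: SuitableFood): Unit = () // body is an illustrative assumption
  }
  class Cow extends Animal { type SuitableFood = Grass }
  class Sheep extends Animal { type SuitableFood = Grass }

  // refinement type: only animals whose SuitableFood is exactly Grass fit here
  val grassEaters: List[Animal { type SuitableFood = Grass }] =
    List(new Cow, new Sheep)

  val grass = new Grass
  grassEaters.foreach(_.eat(grass)) // type-safe: every element accepts Grass
  assert(grassEaters.size == 2)
}
```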
We want to implement the loan pattern - a function that takes an object, performs some operation using it, and cleans up afterwards. We need to ensure somehow that the object has some method to perform the clean-up:
def using[T <: { def close(): Unit }, S]
(obj: T) (operation: T => S) = { // curried function
val result = operation(obj) // performing operation
obj.close() // cleaning up
result // return the operation result
}
//use case (requires java.io.PrintWriter and java.util.Date):
using(new PrintWriter("date.txt")) { writer =>
writer.println(new Date)
}
Remark. If no base type is specified, Scala uses AnyRef automatically: T is then a structural subtype of AnyRef.
trait Tr1 { def method_tr1 ... }
trait Tr2 { def method_tr2 ... }
def makeTr1_Tr2(arg: Tr1 with Tr2) {
arg.method_tr1()
arg.method_tr2()
}
In similar way we can also create variable with type "on the fly" supporting Tr1 and Tr2:
var x = new SomeClass with Tr1 with Tr2
An example of a path-dependent type is mypackage.bessy.SuitableFood. Let c.T be an instance of a path-dependent type. In general, such a type has the form x1. ... .xn.t, where n > 0, x1, ..., xn denote immutable values and t is a type member of xn.
Path-dependent types are a novel concept of Scala.
class Outer {
class Inner
}
// Java access:
Outer.Inner
// SCALA access:
Outer#Inner // The ‘.’ syntax is reserved for objects only.
val o1 = new Outer
val o2 = new Outer
o1.Inner and o2.Inner are two different path-dependent types. Outer#Inner is a general type, which represents the Inner class with an arbitrary outer object of type Outer. By contrast, o1.Inner refers to the Inner class with a specific outer object (the one referenced from o1).
To instantiate the inner class we name a specific outer object: new o1.Inner. The resulting inner object will contain a reference to its outer object, the object referenced from o1. Because the type Outer#Inner does not name any specific instance of Outer, you can't create an instance of it: new Outer#Inner // Error
abstract class AbsCell {
type T
val init: T
private var value: T = init
def get: T = value
def set(x: T): Unit = { value = x }
}
var flip = false
def f(): AbsCell = {
flip = !flip
if (flip) new AbsCell { type T = Int; val init = 1 }
else new AbsCell { type T = String; val init = "" }
}
f().set(f().get) // illegal!
Successive calls of f() return cells where the value type alternates between Int and String. The last statement above attempts to set an Int cell to a String value. The type system does not admit this statement, because the computed type of f().get would be f().T. This type is not well-formed, since the method call f() is not a path (it doesn't denote an immutable value).
trait Functor[F[_]]
- some container type supporting fmap operation.
How we could create Functor type parametrized by type having three type parameters, eg Function2[A, B, R]
?
The solution is to bind all by one type parameters in a structural type:
implicit def Function2Functor[A, B] = new Functor[({type λ[R]=(A, B) => R})#λ] {
// definition of abstract method
}
Here we create an implicit function which gives us an implicit Functor instance for (A, B) => R. The trick is the structural type {type λ[R] = (A, B) => R}, which has one type member parametrized by one type parameter; the #λ projection extracts it as a one-parameter type constructor.
We couldn't write new Functor[(A, B) => R], because (A, B) => R is not a type constructor - it doesn't take any type parameter; the type is already constructed by Function2[_, _, _].
Function2[_, _, _] is a generic type, but it takes 3 parameters. Functor expects a type which is parametrized by only one type (a one-parameter type constructor).
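A runnable sketch of the idea (the function2Functor name and the fmap signature are assumptions, not a standard API; the type lambda fixes X and Y, leaving the result type R as the Functor's hole):

```scala
object TypeLambdaDemo extends App {
  trait Functor[F[_]] {
    def fmap[A, B](fa: F[A])(f: A => B): F[B]
  }

  // instance for Function2 with its first two type parameters fixed
  def function2Functor[X, Y]: Functor[({ type λ[R] = (X, Y) => R })#λ] =
    new Functor[({ type λ[R] = (X, Y) => R })#λ] {
      def fmap[A, B](fa: (X, Y) => A)(f: A => B): (X, Y) => B =
        (x, y) => f(fa(x, y))
    }

  val plus: (Int, Int) => Int = _ + _
  val plusStr = function2Functor[Int, Int].fmap(plus)(_.toString)
  assert(plusStr(1, 2) == "3")
}
```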
To create a new enumeration, you define an object that extends the scala.Enumeration class, as in the following examples:
object Color extends Enumeration {
val Red, Green, Blue = Value
}
object Direction extends Enumeration {
val North = Value("North") // the overloaded Value method which takes a name argument
}
Enumeration defines an inner class named Value
, and the same-named parameterless Value
method returns a fresh instance of that class. This means that a value such as Color.Red is of type Color.Value which is a path-dependent type, with Color being the path and Value being the dependent type.
Values of an enumeration are numbered from 0, and you can find a value's number via its id method:
Direction.North.id
It’s also possible to go the other way, from a non-negative integer number to the value that has this number as id in an enumeration:
Direction(0)
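Putting the pieces together in a runnable sketch (South is added here for illustration):

```scala
object EnumDemo extends App {
  object Direction extends Enumeration {
    val North = Value("North")
    val South = Value("South") // added here for illustration
  }

  assert(Direction.North.id == 0)          // values are numbered from 0
  assert(Direction(1) == Direction.South)  // lookup by id
  assert(Direction.North.toString == "North")
  assert(Direction.values.size == 2)       // the set of all values
}
```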
An implicit conversion is a function marked with the implicit keyword, e.g.:
implicit def intToX(i: Int) = new X(i)
When such a function is directly in scope (accessible without any preceding identifier), the compiler can implicitly convert an Int to an X (for example when a function expects an X, but we pass an Int).
Implicit definitions are definitions that the compiler is allowed to insert into a program in order to fix any of its type errors.
For example, if x + y
does not type check, then the compiler might change it to convert(x) + y
, where convert is some available implicit function. If convert changes x into something that has a + method, then this change might fix a program so that it type checks and runs correctly. If convert really is just a simple conversion function, then leaving it out of the source code can be a clarification.
If several conversions are applicable, the most specific one is chosen. E.g. given f1: List[Int] => String; f2: Seq[Int] => String, if the compiler expects a String but gets a List[Int], then it will choose the f1 function. For "abc".reverse the compiler will choose the conversion from String to StringOps <: SeqLike[Char] instead of the one to WrappedString <: SeqLike[Char] (the old one from Scala 2.7), because the former is declared in scala.Predef, the latter in scala.LowPriorityImplicits, and Predef <: LowPriorityImplicits.
scala.Predef contains numerous helpful implicit conversions.
Run scala(c) with -Xprint:typer: it will show you what your code looks like after all implicit conversions have been added by the type checker.
Suppose we write obj.doIt, and obj does not have a member named doIt. The compiler will try to insert conversions before giving up. In this case, the conversion needs to apply to the receiver, obj. The compiler will act as if the expected "type" of obj were "has a member named doIt". This "has a doIt" type is not a normal Scala type, but it is there conceptually, and that is why the compiler may insert an implicit conversion in this case.
class Rational(n: Int, d: Int) {
...
def + (that: Rational): Rational = ...
def + (that: Int): Rational = ...
}
// so we can add either two rational numbers or rational and int
val r = new Rational(1,2)
1 + r // error, Int doesn't have method +(x: Rational)
implicit def intToRational(x: Int) = new Rational(x, 1)
1 + r // now we can add int to rational
The -> in 1 -> "one" is not syntax! Instead, -> is a method of the class ArrowAssoc, a class defined in the standard Scala library, which also contains an implicit conversion from Any to ArrowAssoc.
For implicit parameters, the compiler can replace someCall(a) with someCall(a)(b), or new SomeClass(a) with new SomeClass(a)(b), adding a missing last parameter list to complete the call.
Implicit parameter can be the entire last curried parameter list that’s supplied, not just the last parameter.
To let the compiler supply the parameter implicitly, you must first define a variable of the expected type, which is marked with implicit keyword.
implicit keyword applies to an entire parameter list, not to individual parameters.
class PreferredPrompt(val preference: String)
object Greeter {
def greet(name: String)(implicit prompt: PreferredPrompt, drink: String) {
println("Welcome, "+ name +". The system is ready.")
println("why not enjoy a cup of "+ drink +"?")
println(prompt.preference)
}
}
implicit val bobsPrompt = new PreferredPrompt("relax> ") // it must be marked implicit if compiler might use it
//implicit object BobsPrompt extends PreferredPrompt("relax> ") // Other way to achieve the same as the previous statement
implicit val bobsDrink = "Coca-Cola"
Greeter.greet("Bob")(bobsPrompt) // error: not enough arguments...
Greeter.greet("Bob") // the compiler converts it to Greeter.greet("Bob")(bobsPrompt, bobsDrink)
You need to be careful which variables you make implicit. It is wise to not select popular types for implicit, instead choose some rare types (as PreferredPrompt). As a result, it is unlikely that implicit variables of these types will be in scope if they are not intended to be used as implicit parameters.
class <:< in Predef - an instance of A <:< B witnesses that A is a subtype of B; A <:< B encodes the constraint A <: B.
def <:< in Manifest - checks whether the type represented by this manifest is a subtype of the type represented by the manifest argument.
class =:= - T1 =:= T2 asserts that T1 and T2 are the same type (unlike <:<, it does not admit subclassing).
def implicitly[T](implicit e: T): T = e - a function which takes no explicit arguments and returns the implicit value/function of type T - the one the compiler would pick if it needed it.
def context[C[_], T](implicit e: C[T]) = e - this allows us to get an implicit value of type C[T] by simply writing context[C, T].some_ct_method.
In maxListElem below we don't use the orderer function explicitly, so the code remains correct if we change the argument name from orderer to anything else. View bounds are made by putting <% in the type parameter declaration:
def maxList[T <% Ordered[T]](elements: List[T]): T = ...
// which compiles to:
def maxList[T](elements: List[T])(implicit ev: T=>Ordered[T]): T = ...
T <% Ordered[T] means "I can use T, as long as T can be treated as an Ordered[T]". E.g. Int is not a subtype of Ordered[Int], but you can pass a List[Int] to maxList because an implicit conversion from Int to Ordered[Int] is available. Moreover, if T happens to already be an Ordered[T], you can still pass a List[T] to maxList: the compiler will use an implicit identity[T] function, declared in Predef.
This works much like the implicit conversion to StringOps: the added methods come from StringOps but operate on the original String.
def f[A <% Ordered[A]](xs: A*): Seq[A] = xs.toSeq.sorted
// even if the type is only used as a type parameter of the return type
def f[A <: Ordered[A]](xs: A*): Seq[A] = xs.toSeq.sorted // oops, not every type supported
def f[A](xs: Ordered[A]*): Seq[A] = xs.toSeq // implicit conversion to an expected type occurs,
// return type is Seq[Ordered[A]]
This example won't work without view bounds. However, if we were to return another type, then we wouldn't need a view bound any more:
def f[A](a: Ordered[A], b: A): Boolean = a < b
Eg3. Handling String and Array, which are Java classes, like they were Scala collections:
def handle_collection[CC <% Traversable[_]](a: CC, b: CC): CC =
if (a.size < b.size) a else b
If one tried to make handle_collection
without view bounds, the return type of a String would be a WrappedString (Scala 2.8), and similarly for Array.
def context_fun[T : P](a: T) = ...
// which compiles to:
def context_fun[T](a: T)(implicit v: P[T]) = ...
Context bounds are used to ensure that there exists some implicit value of the parametrized type P[T].
The common example of usage in Scala is this:
def f_context_ord[A : Ordering](a: A, b: A) = implicitly[Ordering[A]].compare(a, b)
f_context_ord requires some implicit Ordering[A] instance to compare a and b.
def f1[T <% String](t: T) = 0
// equivalent with context bound
trait To[T] { type From[F] = F => T }
def f2[T : To[String]#From](t: T) = 0
A context bound must be used with a type constructor of kind * => * (a `function` to get implicit value gets type T and results implicit value of context type C[T]). However the type constructor for Function1 is of kind (*, *) => * (type constructor gets two types: function argument type, and function result type; and results implicit function). The use of the type alias partially applies second type parameter with the type String, yielding a type constructor of the correct kind for use as a context bound.
There is a proposal to allow you to directly express partially applied types in Scala, without the use of the type alias inside a trait. You could then write:
def f3[T : [X](X => String)](t: T) = 0
def f1[A](n: Int) = new Array[A](n) // error: cannot find class manifest for element type A
def f1[A : ClassManifest](n: Int) = new Array[A](n)
def **[T : Numeric](xs: Iterable[T], ys: Iterable[T]) =
xs zip ys map { t => implicitly[Numeric[T]].times(t._1, t._2) } // We get implicit context value through implicitly function
def **[T : Numeric](xs: Iterable[T], ys: Iterable[T]) =
xs zip ys map { t => context[Numeric, T].times(t._1, t._2) } // the same as above, but using the context function defined earlier
The classic example is Scala 2.8's Ordering, which replaced Ordered throughout Scala's library and takes advantage of some implicit conversions inside Ordering that enable the traditional operator style. Another example in Scala 2.8 is Numeric:
def f[A](a: A, b: A)(implicit ord: Ordering[A]) = {
import ord._ // import members from implicit value ord
if (a < b) a else b // you can call explicitly ord.lt(a,b)
}
def f[A : Numeric](a: A, b: A) = implicitly[Numeric[A]].plus(a, b)
A more complex example is the new collection usage of CanBuildFrom
. And, as mentioned before, there's the ClassManifest
usage, which is required to initialize new arrays without concrete types.
The context bound with the typeclass pattern is much more likely to be used by your own classes, as they enable separation of concerns, whereas view bounds can be avoided in your own code by good design (it is used mostly to get around someone else design).
Though it has been possible for a long time, the use of context bounds has really taken off in 2010, and is now found to some degree in most of Scala's most important libraries and frameworks. The most extreme example of its usage, though, is the Scalaz library, which brings a lot of the power of Haskell to Scala. I recommend reading up on typeclass patterns to get more acquainted with all the ways in which it can be used.
Scala functions (the Function1..FunctionN classes) have 2 methods, andThen and compose, which perform composition: (f andThen g)(x) = g(f(x)), (f compose g)(x) = f(g(x)). Some libraries (e.g. Scalaz arrows) additionally provide the &&& operator.
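A quick sketch of both composition methods on plain Function1 values:

```scala
object ComposeDemo extends App {
  val f: Int => Int = _ + 1
  val g: Int => Int = _ * 2

  // (f andThen g)(x) == g(f(x)); (f compose g)(x) == f(g(x))
  assert((f andThen g)(3) == 8) // g(f(3)) = g(4) = 8
  assert((f compose g)(3) == 7) // f(g(3)) = f(6) = 7
}
```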
def maxListElem[T <: Ordered[T]](elements: List[T]): T =
... // the body of the method
Everything would go fine, but many popular built-in types (like Int, String, ...) are not subtypes of Ordered[], and we cannot make them extend Ordered. Instead, for an arbitrary T, we pass an implicit conversion from T to the type Ordered[T]:
def maxListElem[T](elements: List[T])
(implicit orderer: T => Ordered[T]): T =
elements match {
case List() =>
throw new IllegalArgumentException("empty list!")
case List(x) => x
case x :: rest =>
val maxRest = maxListElem(rest)(orderer) // (orderer) could be left implicit: maxListElem(rest)
if (x > maxRest) x // implicit call orderer(x) when the type of x doesn't have a > method
else maxRest
}
implicit def myTypeToOrdered(x: MyType) = new Ordered[MyType] {
def compare(that: MyType) = x.some_field - that.some_field;
}
Now we can use function maxListElem
with List[MyType]
// with view -- implicit function
abstract class MyType (implicit cmp : MyType => Ordered[MyType]) {
...
}
implicit def MyTypeToOrdered(x: MyType) = new Ordered[MyType] {
def compare ...
}
// or with context bound:
abstract class MyType (implicit ev : Ordering[MyType]) {
...
}
implicit val MyTypeToOrdering = new Ordering[MyType]{
def compare(a: MyType, b: MyType) = ...
}
// Instead of:
abstract class MyType extends Ordered[MyType] {
// other use case for the class parameter
abstract class RedBlackTree[K, V] (implicit cmp: K => Ordered[K]) ... // equivalent to [K <% Ordered[K], V], but we can use the cmp method directly
The benefits of such solution are described in other parts of this subsection:
abstract class A{
type B
}
// We want to be able to compare instances of classes C1 <: A and C2 <: A only when C1#B = C2#B
type AA[T] = A { type B = T } // type alias for A in generic form
implicit def aIsOrdered[T](a : AA[T]) = new Ordered[AA[T]] {
def compare(that : AA[T]) = ...
}
Now we can pass a list of A subtypes which share the same type member B to anything that requires an Ordered view, or an Ordering context.
In the example below we implicitly convert String to the Str type - a simple regexp matching exactly the given string.
abstract class RegExp {
def nullable: Boolean
def derive(c: Char): RegExp
def matches(s: String): Boolean = // `match` is a reserved word, so the method is named matches
if (s.isEmpty) nullable
else derive(s.head).matches(s.tail)
}
case object Empty extends RegExp {
def nullable = false
def derive(c: Char) = Empty
}
case object Eps extends RegExp {
def nullable = true
def derive(c: Char) = Empty
}
case class Str(s: String) extends RegExp {
def nullable = s.isEmpty
def derive(c: Char) =
if (s.isEmpty || s.head != c) Empty
else Str(s.tail)
}
case class Cat(r: RegExp, s: RegExp) extends RegExp {
def nullable = r.nullable && s.nullable
def derive(c: Char) =
if (r.nullable) Or(Cat(r.derive(c), s), s.derive(c))
else Cat(r.derive(c), s)
}
case class Star(r: RegExp) extends RegExp {
def nullable = true
def derive(c: Char) = Cat(r.derive(c), this)
}
case class Or(r: RegExp, s: RegExp) extends RegExp {
def nullable = r.nullable || s.nullable
def derive(c: Char) = Or(r.derive(c), s.derive(c))
}
case class And(r: RegExp, s: RegExp) extends RegExp {
def nullable = r.nullable && s.nullable
def derive(c: Char) = And(r.derive(c), s.derive(c))
}
// repetitions, eg "Rep("a", 4)" matches "aaaa"
case class Rep(r: RegExp, n: Int) extends RegExp {
def nullable = r.nullable
def derive(c: Char) = repr.derive(c)
def repr = {
def aux(i: Int): RegExp =
if (i <= 1) r
else Cat(r, aux(i - 1))
aux(n)
}
// Another possibility is to move the repr function to some helper object
}
case class Not(r: RegExp) extends RegExp {
def nullable = !r.nullable
def derive(c: Char) = Not(r.derive(c))
}
We can construct regular expressions (for example to match the simple string "start" or "end") using:
Or(Str("start"), Str("end"))
// or, with the implicit operators defined below:
Str("start") | Str("end")
object RegExpPimps {
implicit def string2RegExp(s: String) = Str(s)
implicit def regExpOps(r: RegExp) = new {
def | (s: RegExp) = Or(r, s)
def & (s: RegExp) = And(r, s)
def % = Star(r)
def %(n: Int) = Rep(r, n)
def ? = Or(Eps, r)
def ! = Not(r)
def ++ (s: RegExp) = Cat(r, s)
def ~ (s: String) = Matcher.matches(r, s)
}
implicit def stringOps(s: String) = new {
def | (r: RegExp) = Or(s, r)
def | (r: String) = Or(s, r)
def & (r: RegExp) = And(s, r)
def & (r: String) = And(s, r)
def % = Star(s)
def % (n: Int) = Rep(Str(s), n)
def ? = Or(Eps, s)
def ! = Not(s)
def ++ (r: RegExp) = Cat(s, r)
def ++ (r: String) = Cat(s, r)
def ~ (t: String) = string2RegExp(s).matches(t)
}
} // closes object RegExpPimps
object Test {
def main(args: Array[String]) {
// we start from opening RegExpPimps object content, to get access to implicit functions
// and to implicit conversion from String to RegExp
import RegExpPimps._
// here we construct some regular expressions
val digit = "0" | "1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9"
val int = ("+" | "-").? ++ digit.%(1)
val real = ("+" | "-").? ++ digit.%(1) ++ ("." ++ digit.%(1)).? ++ (("e" | "E") ++ ("+" | "-").? ++ digit.%(1)).?
// Some strings to test regular expressions
val ints = List("0", "-4534", "+049", "99")
val reals = List("0.9", "-12.8", "+91.0", "9e12", "+9.21E-12", "-512E+01")
val errs = List("", "-", "+", "+-1", "-+2", "2-")
// testing
// ~ calls match function
ints.foreach(s => assert(int ~ s))
reals.foreach(s => assert(!(int ~ s)))
errs.foreach(s => assert(!(int ~ s)))
ints.foreach(s => assert(real ~ s))
reals.foreach(s => assert(real ~ s))
errs.foreach(s => assert(!(real ~ s)))
}
} // closes object Test
The difference between rich methods and base-class support is that we have access to the methods only when we import the implicit functions. When using typeclasses we benefit from multiple feature definitions: we just import the typeclass instance we want, and other feature interpretations remain available elsewhere.
We can achieve the same with traits by structural subtyping, but it doesn't look nice when we need to mix in some other functionality and overwrite the default one in some context.
// mixin trait way
trait FeatureX { def makeX: Int }
class A extends FeatureX {
def makeX = { // default makeX interpretation
0
}
}
val a = new A
val a2 = new A {
override def makeX = { // other makeX interpretation
1
}
}
val a3 = new A with SomeTraitWithFeatureX // a3 with an arbitrary makeX interpretation from SomeTraitWithFeatureX
// typeclass way
class A
trait FeatureX[T] { def makeX(t: T): Int }
implicit val a1_WithFeatureX = new FeatureX[A] {
def makeX(t: A) = { // first makeX interpretation
0
}
}
// a second interpretation would live in another scope, otherwise the implicits are ambiguous:
implicit val a2_WithFeatureX = new FeatureX[A] {
def makeX(t: A) = { // second makeX interpretation
1
}
}
import SomeImplicits.a3_WithFeatureX // another implicit object with an arbitrary makeX-for-A interpretation
case class X(x: Int) extends Ordered[X] {
def compare(other: X) = x - other.x
}
def binSearch[A <% Ordered[A]](a: Array[A], v: A): Option[Int] = {
def recurse(low: Int, high: Int): Option[Int] = (low + high) / 2 match {
case _ if high < low => None
case mid if a(mid) > v => recurse(low, mid - 1)
case mid if a(mid) < v => recurse(mid + 1, high)
case mid => Some(mid)
}
recurse(0, a.size - 1)
}
binSearch(Array(X(1), X(2)), X(2))
binSearch(Array(X(1), X(2)), 2) // Type error
We would like to see X as both Ordered[X] and Ordered[Int]. But we don't want to see Int as X.
case class X(x: Int) extends Ordered[X] with Ordered[Int] ... // Illegal!
But this is impossible due to type erasure: after erasure both Ordered[Int] and Ordered[X] are seen as the same parametrized class, which makes them ambiguous, so the compiler doesn't allow it.
def binarySearch[B, A <% Ordered[B]](a: Array[A], v: B) = ...
// implicit conversion from X to Ordered[X] exists - it is identity
// we need to make implicit conversion from X to Ordered[Int] to search for int
implicit val compXToInt: X => Ordered[Int] =
  x => new Ordered[Int] {
    def compare(that: Int): Int = x.x - that
  }
Another solution might be to use typeclasses.
Ordered[T] is a trait which we use to compare T values by mixing it into the type T.
// The BAD!!! code - it hacks the type system and can cause problems in the future (read the subsection above)
case class Thing(val n:Int) extends Ordered[Any] {
def compare(that: Any): Int = that match {
case i:Int => this.n - i
case x:Thing => this.n - x.n
case _ => throw new IllegalArgumentException("bad type")
}
}
// Better solution is to use something from the upper subsection
// - the normal compare method (which takes Thing argument) and add additional implicit methods
Otherwise we need to implement dozens of compare-like methods (>, >=, ...).
Ordering[T] is a trait - a typeclass - which we use to compare T values by creating (or importing) an implicit object of type Ordering[T] with appropriate method definitions.
Ordering has more functions to work with, e.g. reverse, which returns the reverse ordering of some type. Working with reverse requires explicitly applying the Ordering instance to the method call:
// TreeMap requires Ordering context
// assuming we have some Ordering[Foo]
new TreeMap[Foo, Bar]()(implicitly[Ordering[Foo]].reverse)
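As a runnable sketch of the call above (using Int keys instead of the assumed Foo/Bar types, so no extra instances are needed):

```scala
import scala.collection.immutable.TreeMap

// a TreeMap that sorts its keys by the reversed Int ordering
val m = TreeMap(3 -> "c", 1 -> "a", 2 -> "b")(implicitly[Ordering[Int]].reverse)
assert(m.keys.toList == List(3, 2, 1)) // keys come out in descending order
```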
Ordered is being deprecated in favor of Ordering. Ordering is strictly more powerful, because you can have several Orderings on a class Foo, whereas Foo can implement Ordered only once. (You can fake it by having Foo not implement Ordered, making several implicit conversions to Ordered[Foo] and controlling the scope of those implicits, but this is a kind of hack and also performs very poorly.)
Thanks to the implicit definitions in scala.math.LowPriorityOrderingImplicits,
Ordered types, or types which have an implicit conversion to an Ordered type, are sufficient to provide us with the corresponding Ordering typeclass instances.
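A small sketch of that bridge: an Ordered case class gets a free Ordering, which the standard sorted method requires:

```scala
case class X(x: Int) extends Ordered[X] {
  def compare(other: X) = x - other.x
}
// `sorted` needs an implicit Ordering[X]; LowPriorityOrderingImplicits
// derives one from the Ordered[X] mixin
val xs = List(X(3), X(1), X(2)).sorted
assert(xs == List(X(1), X(2), X(3)))
```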
Motivation: we want to make a parametrized method whose type parameter is restricted to an arbitrary set of types, e.g. Int and Long.
We do this using the typeclass Acceptable - an abstract class which acts as a type guardian: its implicit objects serve as evidence that a type may be used with the desired method.
abstract class Acceptable[T]
object Acceptable {
implicit object IntOk extends Acceptable[Int]
implicit object LongOk extends Acceptable[Long]
}
// Our method:
def f[T: Acceptable](t: T) = ...
import Acceptable._
// now we can use f only with argument types which are in the Acceptable context.
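Putting it together as a runnable sketch (the body of f and its result string are made up for illustration):

```scala
abstract class Acceptable[T]
object Acceptable {
  implicit object IntOk extends Acceptable[Int]
  implicit object LongOk extends Acceptable[Long]
}
import Acceptable._

// f compiles only for types with an Acceptable instance in scope
def f[T: Acceptable](t: T): String = "accepted: " + t

assert(f(1) == "accepted: 1")
assert(f(5L) == "accepted: 5")
// f("a") // does not compile: could not find implicit value for Acceptable[String]
```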
val x = some_val_with_Integral_support
val y = some_val_with_Integral_support
x % y // calls Integral[Int].rem(x, y) thanks to an internal implicit conversion to IntegralOps
implicitly[Integral[Int]].rem(x, y) // explicit call
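A concrete sketch of the typeclass calls above, showing both quot (truncated division) and rem (the remainder):

```scala
val x = 17
val y = 5
val integral = implicitly[Integral[Int]] // resolved from scala.math.Numeric
assert(integral.quot(x, y) == 3) // truncated division
assert(integral.rem(x, y) == 2)  // remainder, what % maps to
```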
The example is about implementing a Scala version of the Haskell Read typeclass, which has one method: read :: (Read a) => String -> a - the function requires a type a which implements the class Read. The function takes a string representation and returns the instance of type a corresponding to that string.
// typeclass
trait Read[T] {
def read(s: String): T
}
// implementing instances for Read
implicit object IntRead extends Read[Int] {
def read(s: String) = s.toInt
}
// Our Object
case class Name(first: String, last: String)
object NameDescription {
def unapply(s: String) = {
val a = s.split("/")
    Some((a(0), a(1)))
}
}
implicit object NameRead extends Read[Name] {
import NameDescription._
def read(s: String) = s match {
    case NameDescription(f, l) => Name(f, l)
case _ => sys.error("invalid")
}
}
// we can also set up a generic context for higher level constructs:
// here we define an implicit "object generator" in the Read context
// the compiler can automatically make a new instance of Read for the generic Seq[T] type,
// - but only if the type parameter T implements the Read typeclass
implicit def SeqRead[T : Read] = new Read[Seq[T]] {
def read(s: String) =
s.split(" ").toSeq map (implicitly[Read[T]] read _)
}
// using
def foo[T: Read](s: String) = implicitly[Read[T]] read s
foo[Int]("123") // returns 123 : Int
foo[Name]("Robert/Zaremba") // returns Name("Robert", "Zaremba")
foo[Seq[Int]]("1 2 3") // returns Seq(1, 2, 3)
The presented API becomes hugely expressive under the control of the static type system. All these constraints are checked at compile time.
All of this lets Scala achieve polymorphism in multiple ways, depending on the needs.
Moreover, all of them are orthogonal semantic concepts, which makes Scala an easy and powerful tool for domain modelling - in Debasish's words: Great languages are those that offer orthogonality in design. Stated simply it means that the language core offers a minimal set of non-overlapping ways to compose abstractions.
Seq with Function1
T1 with T2 ...
Usually we use compound types when mixing traits or simply with structural subtyping.
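A minimal sketch of a compound type used as a parameter type (Named, Aged and Person are made-up names):

```scala
trait Named { def name: String }
trait Aged { def age: Int }

// the argument must be both Named and Aged
def describe(x: Named with Aged): String = x.name + " is " + x.age

class Person extends Named with Aged {
  def name = "Ann"
  def age = 30
}
assert(describe(new Person) == "Ann is 30")
```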
class C[T]
there is no way to directly instantiate a type parameter at runtime using new, because of type erasure. A workaround is to pass a factory function:
class BalanceActor[T <: Actor](val fac: () => T) extends Actor {
val workers: Int = 10
private lazy val actors = new Array[T](workers)
override def start() = {
for (i <- 0 until workers) {
actors(i) = fac() //use the factory method to instantiate a T
actors(i).start
}
super.start()
}
}
// using BalanceActor:
val ba = new BalanceActor[CalcActor]( { () => new CalcActor } )
ba.start
Scala is such a flexible language that you can achieve Dependency Injection in multiple ways: using mixins, the Cake Pattern, and higher-order functions.
Functional programming gurus also claim that in the FP world there is no need for a special DI framework, as higher-order functions are enough.
A very good explanation of Scala's flexibility is Martin Odersky's paper: Scalable Component Abstractions.
The key idea is that we wrap every class/functionality in an abstract component which has the instance of the functionality as a field (a val, or a parameterless method which gives us the instance). This field is used by other components which depend on it, and we should delay coupling to any initialization of it until the time we absolutely need it - that is, when we make the assembly with this component.
When some component (A) depends on another component (B), we use a self type annotation (self: B =>
) to express this. So if we want to use component A, we need to mix in component B. In the end the functionality in class A has access to the instance of functionality B, which is specified at creation time.
To get the full functionality we assemble those components using mixins and construct an object or instance of it. It is similar to putting together different layers of a cake to form the final shape.
If the construction of the object depends on initialization order, then the object holding the functionality should be a lazy val. E.g.: component A depends on component B and the initialization of A requires some information from the initialization of B.
Since every trait can be mixed in only once, there is a limitation on using mixins. This is described in the Constructor based DI section.
General template:
// here we can make a trait to assure that every implementation of the component will look the same:
trait Functionality1 {
val / lazy val / def functionality1 : Functionality1Impl
abstract class / trait Functionality1Impl { ... }
}
trait Functionality1Component_V1 extends Functionality1 {
self : Dependency1 with Dependency2 => // eg self: Functionality2 =>
class Functionality1Impl { .. }
}
trait Functionality1Component_V2 extends Functionality1 {
self : DependencyComponent1 with DependencyComponent2 =>
val _cached_functionality1 : Functionality1Impl // still we delay with the instantiation
val / lazy val / def functionality1 = _cached_functionality1
class Functionality1Impl { .. } // other implementation
}
class User(val name: String) {
override def toString() = name
val username = name;
}
/****************************
* Functionality 1: retrieving users
* here functionality is not surrounded by the class
*/
trait UserRepo{
val repo_address :String
println("> creating UserRepo instance to \""+repo_address+"\" repository") // this will be printed each time...
// ...when UserRepoMock is called from "def instance"
def find(name: String): Option[User]
}
//*** implementations of UserRepo ***
class UserRepo1(val repo_address: String) extends UserRepo{
def find (name: String) =
if(name startsWith "r") Some(new User(name))
else None
}
class UserRepoMock(val repo_address: String) extends UserRepo{
def find (name: String) = {
println("mock find")
Some(new User(name))
}
}
//**** wrapping into component ****
trait UserRepoComponent[T <: UserRepo] {
def userRepo : T // here val or lazy val can be used as well
}
//*** examples about how to use User Repo Component ***
object UserService extends UserRepoComponent[UserRepo1] {
  def userRepo = new UserRepo1("main") // because of def, a new instance of UserRepo1 is created
} // with every access to the userRepo field
//*** or as the val ***
object StartUserRepo {
val test_env = new UserRepoComponent[UserRepoMock] {
val _defaultUserRepo = new UserRepoMock("mock_repo")
    def userRepo = _defaultUserRepo // here we return the cached instance of UserRepo to avoid creating new instances on each call
  } // because we use def, we can still attach some work (eg logging) to each userRepo access
}
/****************************
* Functionality 2: User authorization
* usually we put the functionality inside an abstract component
* this puts the implementation in coherent namespace:
*/
trait LoggerComponent {
val logger : Logger // here we are using val
trait Logger {
def log(ms: String)
}
}
trait LoggerComponentStd extends LoggerComponent {
  val logger = new LoggerImpl // val is specified in the trait definition, so we can't simply change it at runtime,
                              // only override it at creation time.
class LoggerImpl extends Logger {
def log(ms:String) = println(ms)
}
}
trait LoggerComponentFile extends LoggerComponent{
// we postpone the creation of val logger till creation of whole Application class time
class LoggerImpl extends Logger {
def log(ms: String) = println("logging to file: \""+ms+"\"")
}
class LoggerMockImpl extends Logger { // Mock class Version!
def log(ms: String) = println("logging to file mock: \""+ms+"\"")
}
}
/****************************
* Functionality 3: User Service which has Authorization and User Update
* this functionality depends on other components: UserRepoComponent and LoggerComponent
*/
trait UserServiceComponent {
this : UserRepoComponent[_] with LoggerComponent => // here we express dependencies
def authorizator : Authorizator
val userUpdater = new UserUpdater // we set default instance here
trait Authorizator {
def authorize(usr: String, passwd: String) : Option[User]
def change_passwd(u: User, new_pass: String)
}
class UserUpdater { // here is the default implementation for UserUpdater
def update(u:User) = println("User " + u.name + " updated" + u)
}
}
trait UserServiceComponentStd extends UserServiceComponent {
this : UserRepoComponent[_ <: UserRepo] with LoggerComponent =>
  val company: String // traits can't have constructors or parameters, so abstract fields play the role of trait parameters
val authorizator = new AuthorizatorStd(company)
class AuthorizatorStd(company: String) extends Authorizator {
def authorize(usr: String, passwd: String) = {
logger.log("trying to authorize "+usr + " from "+ company)
if(passwd == "ok") userRepo.find(usr)
else None
}
def change_passwd(u:User, new_pass: String) = println("Password changed")
}
}
/****************************
 * Assembling everything together
*/
object StartCake extends App {
val service = new UserServiceComponentStd with LoggerComponentFile with UserRepoComponent[UserRepo1] {
    lazy val company = "super company" // needs to be lazy, since authorizator relies on this and we are not aware of the initialization order;
                                       // otherwise authorizator could bind company as a null value.
val logger = new LoggerImpl // logger instance was postponed
def userRepo = new UserRepo1("main")
override val userUpdater = new UserUpdater // we can override the default value (actually for simplicity is the same).
}
println(service.userRepo.find("marta"))
service.userRepo.find("robert") match {
case Some(u) => println("found user robert")
case None => println("user robert not found")
}
println(service.authorizator.authorize("robert", "ok"))
}
A very good discussion about the cake pattern: http://www.warski.org/blog/2011/04/di-in-scala-cake-pattern-pros-cons/.
A full example showing constructor based DI,
which comes from: http://jboner.github.com/2008/10/06/real-world-scala-dependency-injection-di.html
// UserRepo an UserRepo1 as previous
// other service
trait Logger {
def log(ms: String)
}
class LoggerStdOut extends Logger {
def log(ms:String) = println(ms)
}
// =======================
// a service declaring two dependencies that it wants injected,
// expressed as constructor parameters
class UserService(val logger: Logger, val userRepo: UserRepo) {
def authorize(username :String) = {
logger.log("trying to authorize "+username)
userRepo.find(username)
}
}
class Client(us: UserService ) {
us.authorize("robert")
}
// =======================
// instantiate the services in a configuration module
object Config {
lazy val logger = new LoggerStdOut
lazy val userRepo = new UserRepo1("main")
lazy val userService = new UserService(logger, userRepo) // this is where injection happens
}
new Client(Config.userService) // running the client code
// UserRepo and Logger as previous
trait UserService {
  def authorize(ur: UserRepo, logger: Logger) : String => Option[User] = s => {
logger.log("trying to authorize: " + s)
ur.find(s)
}
def addUser(ur: UserRepo, logger: Logger) : String => Option[User] = s => {
logger.log("adding user: " + s)
    if ((ur find s).isDefined) None
else Some(new User(s))
}
def sayHello(logger: Logger) : User => Unit = u => logger.log(u.name + " said hello")
// some test:
  def test(ur: UserRepo) : String => Unit =
    s => addUser(ur, new LoggerMockImpl)(s) foreach sayHello(new LoggerMockImpl)
  // we can do this more cryptically using scalaz:
val test2 = for {
au_part <- addUser(_, new LoggerMockImpl) // make a partial function which will be extracted when applied
sh_part <- sayHello // normal function which will be extracted when applied
} yield (au_part map sh_part)
}
// assembling through partial application
object UserService1 extends UserService {
val logger = new LoggerStdOut
  val authorize1 = authorize(new UserRepo1("main"), logger)
  val addUser1 = addUser(new UserRepo1("main"), logger)
test(new UserRepo1("main"))("test_user")
test2(new UserRepo1("main"))("test_user")
}
trait Y[T]
class X extends Y[A] with Y[B] // illegal - Y is inherited twice with different type arguments
Scala 2.10 is going to have some partial solution.
"a b c".split(" ").toSeq map ("L" + _) // split returns Array[String] which doesn't have a toSeq method, but WrappedArray has.
// This requires implicit conversion from Array to ArrayWrapper toSeq
// but the problem is with `map` function
"a b c".split(" ").toSeq.map("L" + _) // other way to write this expression
trait Handles[-A, -E <: Event]
class Event
class Inventory
class CreationEvent extends Event
def f(arg: Inventory Handles CreationEvent) // the same as Handles[Inventory, CreationEvent]
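A self-contained sketch of this infix notation; the handler instances are anonymous, and contravariance in both parameters lets a broader Handles[Any, Event] stand in:

```scala
class Event
class CreationEvent extends Event
class Inventory
trait Handles[-A, -E <: Event]

// Inventory Handles CreationEvent is infix notation for Handles[Inventory, CreationEvent]
def f(arg: Inventory Handles CreationEvent): String = "handled"

assert(f(new Handles[Inventory, CreationEvent] {}) == "handled")
assert(f(new Handles[Any, Event] {}) == "handled") // allowed by contravariance
```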
List.:+
= List.append. Not efficient - a normal list is an immutable stack.sys.error
- throws a new RuntimeException with the supplied message. It is preferred over Predef.error (which is deprecated).Container[T]
- contains objects of type T. We want T <: Element.Element[T]
- we want to store in each element the enclosing container, so T is the type of the container which holds the element.
abstract class Container[E <: Element[_]] {
def contains( e: E ): Boolean
def addNewElement(): Unit
}
abstract class Element[C <: Container[_]] {
def enclosingContainer(): C
}
class MyContainer extends Container[MyElement] {
private var elements = List[MyElement]()
override def contains( elem: MyElement ) = elements.contains( elem )
override def addNewElement() { elements ::= new MyElement(this) }
}
class MyElement( container: MyContainer ) extends Element[MyContainer] {
override val enclosingContainer = container
}
Suppose we have a polymorphic function f[T]: T => T
and we want to make a function z
which takes f as an argument and operates on two parametrized versions of f, eg calls f[Int] and f[Double].
def z[T](f: T => T) = f[Int](1) + f[Double](2.2) // error!
This doesn't work, because we use f here as if it had two types (Int => Int and Double => Double).
trait ForAll {
def wrapper[X](x : X) : X
}
def z(wop : ForAll) = wop.wrapper[Int](1) + wop.wrapper[Double](2.2)
// using:
def f[T](x: T) = x
z(new ForAll{def wrapper[X](x : X) = f(x)})
Sometimes we want a method parameter whose type is either T1
or T2
, and we call it a Union Type.
This can also be useful for method overloading using generic types.
case class OrType[A,B](val a: Option[A], val b: Option[B])
object OrType {
  type or[A,B] = OrType[A,B] // to type "or" instead of "OrType"
  private def da[A,B](a: A): or[A,B] = { OrType(Some(a),None) }
  private def db[A,B](b: B): or[A,B] = { OrType(None,Some(b)) }
// implicit defs - stuttering-or
implicit def aToOrType2[A,B](a: A): or[A,B] =
{ da(a) }
implicit def bToOrType2[A,B](b: B): or[A,B] =
{ db(b) }
implicit def aToOrType3[A,B,C](a: A): or[or[A,B],C] =
{ da(da(a)) }
implicit def bToOrType3[A,B,C](b: B): or[or[A,B],C] =
{ da(db(b)) }
}
// using:
import OrType._
class Foo {
def erasureMethod[T <% String or Int](lt: List[T]) = {
for (x <- lt) x match {
case x: String => println("String list item: " + x)
case x: Int => println("Int list item: " + x)
}
}
}
The drawback of this solution is that we have a new class type with two "subtypes" (two fields).
Below is a better solution using sophisticated type system constructions.
type ¬[A] = A => Nothing
type v[T, U] = ¬[¬[T] with ¬[U]] // DeMorgan law
type ¬¬[A] = ¬[¬[A]]
type |v|[T, U] = { type λ[X] = ¬¬[X] <:< (T v U) }
// Using
def size[T: (Int |v| String)#λ](t: T) = t match {
case i: Int => i
case s: String => s.length
}
size(3) // returns 3: Int
size("hej there") // returns 9: Int
size(4.2) // error: Cannot prove that ((Double) => Nothing) => Nothing >: Nothing with (java.lang.String) => Nothing) => Nothing.
Why the additional |v|
? Because implicitly[Int <:< (Int v String)]
(asking the compiler whether it can prove that Int is a subtype of Int v String) simply does not hold. The left hand side of <:<
is Int, while the right hand side is a function (because ¬ is a function type). We need to transform the right hand side of <:<
into some other type. That's why we have the ¬¬ and |v| types.
More elaboration about union type construction on Miles Sabin blog
Since type level calculations in Scala are Turing complete it should be possible to find type construction corresponding to any recursive function. This means that – in theory at least – Scala's type system is powerful enough to express any type whose set of values is recursive.
To find the construction for any recursive function we can generalize our |v|
type constructor to the concept of an Acceptor:
type Acceptor[T, U] = { type λ[X] = ... }
and for any function try to construct the corresponding type level Acceptor.
We can attach an interpretation to a value by boxing it into a higher type, or by refining its type.
The classic approach is to box Int into two classes which represent day seconds and epoch seconds. This leads to new types in the class hierarchy and extra memory usage.
Quite good code snippets about unboxed types: https://gist.github.com/89c9b47a91017973a35f.
Unboxed Tagged Types are part of scalaz 7:
type Tagged[U] = { type Tag = U } // type refinement using type alias
type @@[T, U] = T with Tagged[U] // type constructor
trait Day
trait Epoch
type Epochtime = Long @@ Epoch // type aliases for Long type with tag refinement
type Daytime = Long @@ Day
// conversion functions:
def daytime(i: Long): Daytime = i.asInstanceOf[Daytime]
def epochtime(i: Long): Epochtime = i.asInstanceOf[Epochtime]
// we can use pimp my library pattern to add extra functionality:
val hhmmFormat = new SimpleDateFormat("hh:mm")
case class EpochtimeDisplay(time: Epochtime) {
// here new Date expects a Long, but this is ok because Epochtime *is* a Long
def hhmm = hhmmFormat.format(new Date(time))
}
implicit def toEpochtimeDisplay(t: Epochtime) = new EpochtimeDisplay(t)
// using:
def calculateDay(e: Epochtime) = ...
val e = epochtime(10231231)
val d = daytime(2231)
e.hhmm
calculateDay(e) // OK
calculateDay(d) // Error
new Array[Int](size), Array("mama", "tata")
- the last one is an apply method call on the companion object.
Indexing is done through an apply method call, eg: tab(i)
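For instance:

```scala
val tab = Array("mama", "tata")   // Array.apply on the companion object
val zeros = new Array[Int](3)     // constructor taking the size
assert(tab(1) == "tata")          // tab(1) desugars to tab.apply(1)
assert(zeros(0) == 0)             // Int cells are initialized to 0
```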
scala.collection.immutable.List (default)
- a functional type. Most operations are recursive. Operations which can't be tail recursive (map) are implemented imperatively using a ListBuffer, converting to a List at the end (which is efficient - see below).ListBuffer
- a "normal", mutable list. Created by new
. Has toList
to convert itself to an immutable list. This operation is efficient and independent of the list size. If toList
is called on a ListBuffer, then every further mutable operation on that buffer (eg append) requires a copy of the whole list! Read more in Programming in Scala, 22.3scala.collection.mutable.MutableList
List is covariant in its type parameter A, so List[Nothing]
is a subtype of List[A]
.
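For example, the empty list Nil has type List[Nothing] and can be used wherever any List[A] is expected:

```scala
val ints: List[Int] = Nil                      // List[Nothing] <: List[Int]
val strings: List[String] = List.empty[Nothing] // same conformance, different A
assert(ints.isEmpty && strings.isEmpty)
```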
var x = (1, "22", 'a')
. Access: x._1
. It is impossible to write an apply method which gets a particular field, because such a method would need to return different types (depending on which field it returns).
scala.collection.(mutable | immutable).Map
, eg: var x = Map(1->"one", 2->"two")
scala.collection.(mutable | immutable).Set
, eg: var x = Set("one", "two")
A mutable collection held in a var
doesn't make much sense - use either a var with an immutable collection or a val with a mutable one.
var y = x+(3->"three") // builds a new immutable map
x += (3->"three") // reassigns the var to a new immutable map
Semigroup: a set of objects O
and operators G
(functions) on these objects. The operators are closed over O
(every call of an operator gives an element of O
).
Monoid: a semigroup with a neutral element e
in O, for the given semigroup binary operator.
Group: a monoid with an Inv
operator for every element in O, such that Inv(g)+g=e
, where e is the neutral element and + is the binary operator given by the semigroup.
Functor: a structure with a map
operator, which maps a function over every element in the structure.
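These definitions can be sketched in plain Scala (the trait and value names here are ours, not scalaz's):

```scala
trait Semigroup[T] { def append(a: T, b: T): T }
trait Monoid[T] extends Semigroup[T] { def zero: T } // zero is the neutral element

val intAddition = new Monoid[Int] {
  def zero = 0
  def append(a: Int, b: Int) = a + b
}
// folding with a monoid: start from the neutral element, combine with append
assert(List(1, 2, 3).foldLeft(intAddition.zero)(intAddition.append) == 6)
```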
A ~> Identity[A]
- any non-generic type A is a kind of Identity[_] - there exists an implicit conversion from A to Identity[A] (like Int, String, ...)M[A] ~> MA[M,A]
- any generic type with one type parameter is a kind of MA[_,_] - there exists an implicit conversion from M[A] to MA[M, A] (like Set, List, Option...)M[A, B] ~> MAB[M,A,B]
- any generic type with two type parameters is a kind of MAB[_,_,_] - there exists an implicit conversion from M[A,B] to MAB[M, A, B]Identity, MA, MAB
kinds are accepted to be a Monoid.|+|
is the default operation on a monoid (eg: `or` for Boolean)
With monoid we can perform very useful operations. For example:
trait TradingPosition {
  def sym: Ticker
  def qty: Int
}
val f_london = (_ : TradingPosition).sym.id endsWith ".L"
val f_ny = (_ : TradingPosition).sym.id endsWith ".O"
val positions: Seq[TradingPosition] = get_positions_from_db("mydb")
positions filter (f_london |+| f_ny) // returns positions from London or NY
var pos_map: Map[Ticker, Int] = get_positions... // another view of the positions: a map from ticker to quantity
// with a new trade we want to increase the quantity in pos_map
def newTrade(trd : Trade): Unit =
  pos_map += (trd.sym -> ((pos_map.get(trd.sym) getOrElse 0) + trd.qty))
// the previous statement can be simplified to:
pos_map += (trd.sym -> (~pos_map.get(trd.sym) |+| trd.qty)) // this has another advantage: the Int type doesn't appear here,
// so we can safely change the representation from Int to a pair of Ints
Function0, Function1, Function2
structures: Function0W,Function1W, Function2W
Function1W, and Function2W define some interesting methods:
f.lift[Container_type] apply container
means container map f
val g = (_:Int)+1
g.lift[List] apply List(2,3) assert_=== List(3,4)
g.lift[List].second apply (1, List(2,3)) assert_=== (1, List(3,4)) // see second definition in Array type below
val f = (a:Int) => (a, List(a+1,a+2))
f andThen g.lift[List].second apply 1 assert_=== (1, List(3,4))
// here we compose f and g to second element of f result
f(1):-> g.lift[List] assert_=== (1, List(3,4)) // the same effect as above, see :-> definition below
first, second
which expects a pair as an argument and apply the function to first, or second element of the pair. There exists implicit conversion from Function1 to Arrow.
((_:Int) + 1).first apply (7, "abc") assert_=== (8, "abc")
((_:String) + 1).second apply (7, "abc") assert_=== (7, "abc1")
((_: Int)+ 2) >>> (_*3) apply 2 assert_=== 12
((_: Int)+ 2) <<< ((_:Int)*3) apply 2 assert_=== 8
((_: Int)+ 2) &&& (_*3) apply 2 == (4,6)
((_: List[Int]):+ 3) *** ((_:Int) + 10) apply (List(1,2), 7) == (List(1,2,3), 17)
((_:Int) + 1).product apply (9, 99) assert_=== (10, 100)
(1,2) :-> (_ * 2) assert_=== (1, 4)
((_:Int) * 2) <-: (1,2) assert_=== (2, 2)
(Left(2): Either[Int, Int]):->(_*2) assert_=== Left(2)
((_:Int)*2) <-: (Left(2): Either[Int, Int]) assert_=== Left(4)
(Right(2): Either[Int, Int]):->(_*2) assert_=== Right(4)
Scalaz often prefers to work with List[Char]
instead of String
.
To make use of String in Scalaz instead of converting List[Char]
to String
, we need to explicitly pass some implicit objects to the method call.
Nice articles and tutorials about scalaz:
But for those preferring simplicity without special emacs/vim abilities I would recommend IntelliJ + SBT + sbt-idea (a plugin for sbt) + the sbt plugin for IntelliJ. I prefer the IntelliJ configuration to Eclipse because of its nice environment configuration. It doesn't depend on an internal Scala environment, so I can use my general Scala environment, or better - easily manage sbt as a build tool. Then all dependencies and the build process are managed by SBT, while coding and interface development are managed by IntelliJ.
If you want to use IntelliJ you can track the IntelliJ Scala plugin blog, which publishes nice update information.
When I tried to configure Eclipse it was like a war with hell - I couldn't figure out how to change my Scala version or easily make a run / build configuration (eg: to use sbt for building, and specify a run configuration for the output of sbt).
To speed up the compilation process we can use fsc, which is bundled with standard Scala. On the first run it starts the standard Scala compiler, does the whole warmup and loading, and stays detached as a daemon process in the background. On following calls, fsc reuses the same compiler instance without spending the time to load and JIT it again.
fsc is usually supported by newer IDEs (IntelliJ).
SBT only recompiles sources that are out of date.
Unfortunately (as of version 0.11) sbt doesn't use fsc to speed up compilation, but it uses the same JVM for each compilation process. So it avoids the JVM startup overhead, but still needs time for the Scala compiler warmup and JIT-ing the libraries.
As of versions 0.10.x and 0.11.x, SBT doesn't have a built-in task to set up a new project. To start SBT just go to your project directory, run sbt and type the task to perform.
The most basic tasks are:
compile
- looks for the source files and compiles them. sbt will compile only those with new changes.run
- looks for a class with a main function and runs it.SBT looks into the /; src/main/scala; src/main/java
subdirectories to find source files (eg: for the compile task) and uses target/<scala-version>/...
for .class files.
All of this, and more (setting the project version, scala-compiler version, compiler options, dependencies, classpath, owner/organization, copyright, packaging options ...) can be specified in build.sbt and project/<.scala or .sbt> files. Further specification is on Wikipedia.
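A minimal build.sbt sketch (the project name, version numbers and the dependency line are placeholders; sbt 0.11 requires a blank line between settings):

```scala
name := "my-project"

version := "0.1.0"

scalaVersion := "2.9.1"

// dependencies are fetched automatically by sbt's dependency management
libraryDependencies += "org.scalatest" %% "scalatest" % "1.7.1" % "test"
```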
SBT performs dependency management as well as automated downloading of missing dependencies.
SBT performs excellently as a build tool and for running Scala applications and scripts! With SBT you don't need any Scala build - just SBT. It will download and manage everything else you need.
$ sbt
$ np name:my-sub-project dir:sub-project-dir
This will create a new sbt project source tree for a project named my-sub-project under the directory named sub-project-dir, relative to your project's base directory.
ensime generate
- an sbt task which generates the .ensime project file. The similar functionality built into ensime is really restricted and doesn't parse sbt build files (it just detects them).gen-idea [with-classifiers | no-classifiers | no-sbt-classifiers]
- an sbt task to create IDEA project files. By default, classifiers (i.e. sources and javadocs) of sbt and library dependencies are loaded if found and references are added to the IDEA project files. If you don't want to download/reference them, use the command gen-idea no-classifiers no-sbt-classifiers.
Some interesting tools are built on top of SBT.
After running g8
it will print its usage. It is as simple as g8 -l
to list templates and g8 repo_name/template_name
to download a template.
The simplest .ensime configuration file, to work with a single Scala file (no directory structure):
( :source-roots (".") )
There is also minimal ensime sbt project.
Below I present some tips on configuring ensime.
By default Emacs uses syntactic highlighting for sources. Because a syntax highlighter cannot tell whether a symbol is a val, a var, or a method call, it is recommended to enable the semantic highlighter by adding the snippet below to your .emacs file.
(setq ensime-sem-high-faces
  '((var . (:foreground "#ff2222"))
    (val . (:foreground "#111111"))
    (varField . (:foreground "#ff6666"))
    (valField . (:foreground "#666666"))
    (class . font-lock-type-face)
    (trait . (:foreground "#084EA8"))
    (object . (:foreground "#026DF7"))
    (package . font-lock-preprocessor-face)
    (param . (:foreground "#111111"))
    (functionCall . (:foreground "#84BEE3"))))
To link external sources (eg to extend the type inferencer or the symbol autocompleter) we need to extend the :source-roots
variable. We can do it in the .ensime file or load it manually in an external script.
key(":source-roots"), sexp(
  "/opt/scala/proj/play/src/scala",
  "/opt/scala/proj/scalaz/src/scala"
)
We load external documentation by writing an extractor function (which extracts the appropriate files) and appending it to the ensime-doc-lookup-map
list:
(defun make-play-doc-url (type &optional member)
(ensime-make-java-doc-url-helper
"file:///opt/scala/proj/play2/doc/api/scala/" type member))
(add-to-list 'ensime-doc-lookup-map '("^play\\.api\\." . make-play-doc-url))
JRebel is an alternative solution to updating classes, introduced in JVM 1.4 as a hot swapping feature that allows developers to update the code, limited to existing method bodies only, on-the-fly during debugging. JRebel does not require a debugging session to be started. Instead it monitors the file system for changes and updates the classes in-memory. This means that only classes compiled to ".class" files will be updated and changes to classes in JAR files will be ignored. JRebel imposes a performance overhead on the application and should not be used in production or performance tests. It is meant to be a development tool only.
To use JRebel in your sbt project add the following options to java in sbt build file:
-noverify -javaagent:/path/to/jrebel/jrebel.jar
> jetty-run
> ~ prepare-webapp
jetty-run
- starts Jetty and monitors the directories given by scanDirectories, redeploying on changes. By default, the entire temporary web application directory is monitored. You might want to change scanDirectories in some cases. For example, set scanDirectories to Nil if you do not want to redeploy on any changes.~ prepare-webapp
- recompiles and recreates the web application whenever source files change