Can ScalaCheck/Specs warnings safely be ignored when using SBT with ScalaTest?
- by pdbartlett
I have a simple FunSuite-based ScalaTest suite:
package pdbartlett.hello_sbt

import org.scalatest.FunSuite

class SanityTest extends FunSuite {
  test("a simple test") {
    assert(true)
  }
  test("a very slightly more complicated test - purposely fails") {
    assert(42 === (6 * 9))
  }
}
Which I'm running with the following SBT project config:
import sbt._

class HelloSbtProject(info: ProjectInfo) extends DefaultProject(info) {
  // Dummy action, just to show config working OK.
  lazy val solveQ = task { println("42"); None }

  // Managed dependencies
  val scalatest = "org.scalatest" % "scalatest" % "1.0" % "test"
}
However, when I run sbt test I get the following warnings:
...
[info] == test-compile ==
[info] Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling test sources...
[info] Nothing to compile.
[warn] Could not load superclass 'org.scalacheck.Properties' : java.lang.ClassNotFoundException: org.scalacheck.Properties
[warn] Could not load superclass 'org.specs.Specification' : java.lang.ClassNotFoundException: org.specs.Specification
[warn] Could not load superclass 'org.specs.Specification' : java.lang.ClassNotFoundException: org.specs.Specification
[info] Post-analysis: 3 classes.
[info] == test-compile ==
For the moment I'm assuming these are just "noise" (caused by the unified test interface?) and that I can safely ignore them. But it is slightly annoying to some inner OCD part of me (though not so annoying that I'm prepared to add dependencies for the other frameworks).
Is this a correct assumption, or are there subtle errors in my test/config code? If it is safe to ignore, is there any other way to suppress these warnings, or do people routinely include all three frameworks so they can pick and choose the best approach for different tests?
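For context, "including all three frameworks" would just mean adding two more test-scoped dependency lines to the project definition above. A sketch of what I mean (the org.scala-tools.testing coordinates and version numbers here are my guesses for a Scala 2.7.7 / sbt 0.7-era build, so they'd need checking against the repository):

```scala
// Hypothetical additions to HelloSbtProject: test-scoped dependencies for
// the other two frameworks that sbt's unified test interface probes for.
// Artifact coordinates/versions are assumptions, not verified.
val scalacheck = "org.scala-tools.testing" % "scalacheck" % "1.6" % "test"
val specs      = "org.scala-tools.testing" % "specs" % "1.6.1" % "test"
```

With those on the test classpath, sbt should be able to load org.scalacheck.Properties and org.specs.Specification during its post-compile analysis, which is presumably what the warnings are about.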
TIA,
Paul.
(ADDED: scala v2.7.7 and sbt v0.7.4)