development
Scala
→ metals LSP
It is fully compatible with Scala 2.12.10; the 2.11 versions only support a limited set of features.
The setup is well explained in the metals documentation. It involves both the metals-emacs
binary and a snippet to put in the init.el
file.
It can take advantage of the magit
library and integrates well with it.
→ Logging
scala-logging
is a great tool. It is a wrapper over slf4j, which supports both log4j and logback backends. To make it work with logback, add the logback-classic
and log4j-over-slf4j
dependencies.
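A minimal sketch of the corresponding sbt settings; the version numbers below are illustrative assumptions, check the latest releases:

```scala
// build.sbt -- version numbers are assumptions, check for current releases
libraryDependencies ++= Seq(
  "com.typesafe.scala-logging" %% "scala-logging"    % "3.9.2",
  "ch.qos.logback"              % "logback-classic"  % "1.2.3",
  "org.slf4j"                   % "log4j-over-slf4j" % "1.7.30"
)
```

With these in place, mixing the LazyLogging trait into a class exposes a ready-to-use logger field.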
→ logback
To configure logback, put a logback.xml file in the src/main/resources
folder. For tests, the src/test/resources/logback.xml
file overrides the main behavior.
The configuration below sets different levels for individual libraries:
<configuration>
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%date{ISO8601} %highlight([%-5level]): %cyan(%logger{15}) %msg%n</pattern>
</encoder>
</appender>
<root level="WARN">
<appender-ref ref="CONSOLE"/>
</root>
<logger name="org.apache.hadoop" level="WARN"/>
<logger name="org.apache.hadoop.hive" level="WARN"/>
<logger name="org.apache.spark" level="WARN"/>
<logger name="org.apache.spark.repl.SparkILoop$SparkILoopInterpreter" level="WARN"/>
<logger name="org.apache.spark.repl.SparkIMain$exprTyper" level="WARN"/>
<logger name="org.eclipse.jetty" level="WARN"/>
<logger name="org.eclipse.jetty.util.component.AbstractLifeCycle" level="WARN"/>
<logger name="io.frama.parisni.spark.postgres" level="INFO"/>
<logger name="com.opentable.db.postgres.embedded.EmbeddedPostgres" level="WARN"/>
</configuration>
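To see how these per-library levels apply, a small sketch (assuming slf4j-api and logback-classic on the classpath; the logger name is illustrative). Logger names follow the package hierarchy, so any logger under org.apache.spark inherits the WARN level set above:

```scala
import org.slf4j.LoggerFactory

// This logger sits under "org.apache.spark", so it inherits the WARN
// level configured above unless a more specific <logger> overrides it.
val sparkLogger = LoggerFactory.getLogger("org.apache.spark.sql.Example")
sparkLogger.debug("suppressed: DEBUG is below WARN")
sparkLogger.warn("goes to the CONSOLE appender")
```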
→ log4j
Log4j 1.x has been unmaintained since 2015, yet Apache Spark still relies on it. The excellent introduction explains how logger inheritance works and how a library can be given a dedicated logging level, distinct from the application's own.
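For comparison, the same per-library idea in log4j 1.x syntax, as a hedged sketch (a log4j.properties file on the classpath; the raised package is illustrative):

```properties
# log4j.properties -- root at WARN, one library raised to INFO
log4j.rootLogger=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d [%p] %c{1} %m%n
# child loggers inherit from org.apache.spark unless set explicitly
log4j.logger.org.apache.spark=INFO
```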
→ Read a file into a string
import scala.io.Source
val source = Source.fromFile("example.txt", "UTF-8")
try source.mkString finally source.close()
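On Scala 2.13 and later, scala.util.Using closes the handle automatically; a minimal sketch:

```scala
import scala.io.Source
import scala.util.Using

object ReadFile {
  // Using.resource closes the Source even if mkString throws
  // (scala.util.Using is available from Scala 2.13 onwards).
  def readAll(path: String): String =
    Using.resource(Source.fromFile(path, "UTF-8"))(_.mkString)
}
```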