Update: the experiment described in this post is now available in the Cats 2.2.0 pre-releases,
starting with 2.2.0-M1, which was
published in March 2020.
This post is an attempt to provide some additional context and justification for
an experiment that I've been working on as a proposal for a
future version of Cats (probably 3.0). The argument is that by
moving Cats's type class instances for standard library types into implicit scope, we can provide a
better user experience along a couple of dimensions (fewer imports to think about, faster compile
times), while also making the library better aligned with future changes in the language and compiler.
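To make the implicit-scope idea concrete, here is a minimal self-contained sketch (a toy Show type class of my own, not Cats itself): an instance defined in the type class's companion object is in implicit scope, so the compiler finds it with no import at the use site.

```scala
trait Show[A] {
  def show(a: A): String
}

object Show {
  // Defined in the companion object, so it's in implicit scope for Show:
  // always visible to implicit search, no import required.
  implicit val intShow: Show[Int] = new Show[Int] {
    def show(a: Int): String = s"Int($a)"
  }
}

def render[A](a: A)(implicit s: Show[A]): String = s.show(a)

// Compiles with no instance imports at all:
val rendered = render(42)
```

This is the experience the experiment aims for with Cats's instances for standard library types: today they typically arrive via wildcard imports into lexical scope; in the companion-object encoding they are simply always there.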
Continue reading
I've written a couple of blog
posts
about how the Parallel
type class has changed in
Cats 2.0,
but those posts don't really say much about why someone using Cats should care about Parallel
in the first
place. The name suggests that it has something to do with running computations at the same time, and
while that's one of the things you can do with it (via the instance for IO
in
cats-effect, for example), it has a much, much wider
range of applications. This post will focus on a real-world use case for Parallel
that at a glance
might not seem to have much in common with running things in parallel: accumulating errors while validating form inputs.
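As a rough self-contained sketch of the idea (the names map2 and parMap2 are mine, not Cats's API; the full post uses Parallel's parMapN), compare fail-fast composition of Eithers with an error-accumulating variant:

```scala
type Errors = List[String]

// Sequential (monadic) composition: stops at the first Left.
def map2[A, B, C](fa: Either[Errors, A], fb: Either[Errors, B])(f: (A, B) => C): Either[Errors, C] =
  fa.flatMap(a => fb.map(b => f(a, b)))

// "Parallel" (applicative) composition: accumulates errors from both sides.
def parMap2[A, B, C](fa: Either[Errors, A], fb: Either[Errors, B])(f: (A, B) => C): Either[Errors, C] =
  (fa, fb) match {
    case (Right(a), Right(b)) => Right(f(a, b))
    case (Left(ea), Left(eb)) => Left(ea ++ eb) // keep both sides' errors
    case (Left(ea), _)        => Left(ea)
    case (_, Left(eb))        => Left(eb)
  }

case class User(name: String, age: Int)

def validName(s: String): Either[Errors, String] =
  if (s.nonEmpty) Right(s) else Left(List("name is empty"))

def validAge(s: String): Either[Errors, Int] =
  try Right(s.toInt)
  catch { case _: NumberFormatException => Left(List(s"'$s' is not a number")) }
```

Given two bad form inputs, map2 reports only the first problem, while parMap2 reports both, which is exactly what you want when validating a form.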
Continue reading
Typelevel has just published Cats 2.0.0,
and while the core modules are guaranteed to be binary compatible with 1.x, there are some changes
that break source compatibility. Most of these changes are
unlikely to affect users, but a few will, and the goal of this post is to point out which those are and what you can do
about them.
Note that while some of the stuff below is pretty intense, it's unlikely to apply to you. In fact if
you're not using Parallel, there's like a 99% chance you can close this tab right now and go
change your Cats version and everything will be fine. There are also always people in the Cats
Gitter channel who are happy to help. In any case please don't be intimidated
and put off updating to 2.0.0—the community is healthier if adopters invest in staying up to date.
Continue reading
(Apologies for the title—after a lot of time on Twitter this week I've been feeling nostalgic for things like Tumblr c. 2010.)
This post is an attempt to answer a question Baccata64 asked on Reddit yesterday afternoon:
how does the Parallel change not break bincompat ? Is it that type parameters and type members are encoded the same way at the bytecode level ?
The context is that Cats 2.0.0-RC2 includes a recent change in which the Parallel
and NonEmptyParallel type classes went from having two type parameters each:

```scala
trait NonEmptyParallel[M[_], F[_]] {
  // ...
}
```
…to one, with the parallel context (the F parameter) changed to a type member:

```scala
trait NonEmptyParallel[M[_]] {
  type F[_]
  // ...
}
```
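One detail worth knowing up front: a type member can be re-exposed as a type parameter via an "Aux" type alias, which is how call sites that need to name the parallel context can still do so. Cats ships such an alias for these type classes; the sketch below is a simplified stand-in, not the real definitions.

```scala
trait NonEmptyParallel[M[_]] {
  type F[_]
}

object NonEmptyParallel {
  // The Aux pattern: re-expose the type member F as a type parameter,
  // so signatures can constrain or name it when they need to.
  type Aux[M[_], F0[_]] = NonEmptyParallel[M] { type F[A] = F0[A] }
}

// A purely illustrative instance declaring List as Option's "parallel" context.
implicit val optionInstance: NonEmptyParallel.Aux[Option, List] =
  new NonEmptyParallel[Option] { type F[A] = List[A] }

// A call site that needs to name both types uses Aux...
def both[M[_], F0[_]](implicit p: NonEmptyParallel.Aux[M, F0]): NonEmptyParallel[M] = p

// ...while one that only cares about M uses the single-parameter form.
def justM[M[_]](implicit p: NonEmptyParallel[M]): NonEmptyParallel[M] = p
```

Both signatures resolve the same underlying instance; the second simply never mentions F.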
This post will give some background about the context and motivation for this change, and then will try to answer Baccata64's question.
Continue reading
I'll start with the story of how I got saved, since it's kind of relevant. Back when I was an
English Ph.D. student, I worked on a number of projects that involved natural language
processing, which meant doing a lot of counting trigrams or whatever in tens of thousands of text
files in giant messy directory trees. I was working primarily in Ruby at the time, after years
of Java, and at least back in 2008 it was a pain in the ass to do this kind of thing in
either Ruby or Java. You really want a library that provides the following features:
- Resource management: you don't want to have to worry about running out of file handles.
- Streaming: you shouldn't ever have to have all of the data in memory at once.
- Fusion: two successive mapping operations shouldn't need to traverse the data twice.
- Graceful error recovery: these tasks are all off-line, but you still don't want to have to
restart a computation that's been running for ten minutes just because the formatting in one file
is wrong.
Maybe there was such a library for Ruby or Java back then, but if there was I didn't know about it.
I did have some experience with Haskell, though, and at some point in 2010 I heard about
iteratees, and they were exactly what I'd always wanted. I didn't really
understand how they worked at first, but with iteratee (and later
John Millikin's enumerator) I was able to write code that did what I wanted
and didn't make me think about stuff I didn't want to think about. I started picking Haskell
instead of Ruby for new projects, and that's how I accepted statically-typed functional programming
into my life.
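For readers who haven't met iteratees: here's a deliberately minimal sketch of the idea (my own toy encoding, not the API of the iteratee or enumerator packages). A consumer is a state machine that is fed one element at a time and can declare itself done early, which is where the streaming and early-termination properties come from.

```scala
sealed trait Iteratee[E, A]
// The consumer has produced a result and needs no more input.
case class Done[E, A](result: A) extends Iteratee[E, A]
// The consumer wants more: Some(e) is the next element, None is end-of-input.
case class Cont[E, A](k: Option[E] => Iteratee[E, A]) extends Iteratee[E, A]

// The "enumerator" side: feed a source to a consumer, stopping as soon as
// it's Done, then signal end-of-input if the source runs dry first.
def feedAll[E, A](it: Iteratee[E, A], input: Iterator[E]): Iteratee[E, A] = {
  var cur = it
  while (input.hasNext) cur match {
    case Done(_) => return cur
    case Cont(k) => cur = k(Some(input.next()))
  }
  cur match {
    case Done(_) => cur
    case Cont(k) => k(None)
  }
}

// A consumer that counts its input.
def counter[E]: Iteratee[E, Int] = {
  def go(n: Int): Iteratee[E, Int] =
    Cont {
      case Some(_) => go(n + 1)
      case None    => Done(n)
    }
  go(0)
}
```

Because the consumer never sees the source, the source can manage its own resources (open and close file handles, recover from a bad file) without the consumer knowing or caring, which is the separation the feature list above is asking for.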
Continue reading
I've always really liked this passage from On the Genealogy of Morals:
[T]here is a world of difference between the reason for something coming into
existence in the first place and the ultimate use to which it is put, its
actual application and integration into a system of goals… anything which
exists, once it has come into being, can be reinterpreted in the service of
new intentions, repossessed, repeatedly modified to a new use by a power
superior to it.
A couple of months ago at LambdaConf I had a few conversations
with different people about why we like (or at least put up with) Scala when
there are so many better languages out there. Most of the answers were the usual
ones: the JVM, the ecosystem, the job market, the fact that you don't have to
deal with Cabal, etc.
For me it's a little more complicated than that. I like Scala in part because
it's a mess. It's not a "fully" dependently typed language, but you can get
pretty close with singleton types and path dependent types. It provides
higher-kinded types, but you have to work around lots of bugs and gaps and
underspecified behaviors to do anything very interesting with them. And so
on—it's a mix of really good ideas and a few really bad ideas and you can put
them together in ways that the language designers didn't anticipate and probably
don't care about at all.
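For example, here's the classic path-dependent-types illustration (a toy of my own, not from the post): each Graph instance gets its own Node type, so the compiler rejects mixing nodes from different graphs.

```scala
class Graph {
  class Node {
    val owner: Graph = Graph.this
  }
  def newNode(): Node = new Node
  // Both arguments must be *this* graph's nodes, so owners always match.
  def connect(n1: Node, n2: Node): Boolean = n1.owner eq n2.owner
}

val g1 = new Graph
val g2 = new Graph
val ok = g1.connect(g1.newNode(), g1.newNode())
// g1.connect(g1.newNode(), g2.newNode()) // rejected: g2.Node is not a g1.Node
```

The type g1.Node depends on the value g1, which is a (limited but real) form of dependent typing.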
The rest of this blog post will be a long story about one example of this kind of
thing involving Scalaz's UnapplyProduct.
Continue reading