
Since Java 14, the Java switch and instanceof statements have been enhanced, in multiple phases, to support pattern matching and a “data-oriented” programming style. In this article, I explore when this programming style is beneficial, and why. I look at the sweet spot of perfect pattern usage, absolute antipatterns where it should not be used, no matter how many examples you see in blogs and conference presentations, and corner cases where the switch syntax clashes with legacy behavior.
<h2>Sealed Hierarchies and Records are Nice</h2>
Pattern matching is one of the shiny new objects in the Java language. Maybe not that new anymore—it started with Java 14. And maybe not that shiny. How many times have you used it in your code?
There may be a reason. Pattern matching works best with a sealed hierarchy of interfaces and record types. (If you want to show off, you can call them “algebraic data types”.) Everywhere that you have such a hierarchy, pattern matching is a natural tool.
How many such hierarchies do you have in your code base? Well, that may explain why you are not often reaching for that tool.
Some people argue that you should actively organize your data into such a form. This is sometimes called data-oriented programming, and it can be a good idea when it fits the problem domain. For examples with a business context, I can recommend this book by Chris Kiehl, currently in early access. He discusses real-life scenarios, such as this lifecycle type:
<pre>public sealed interface Lifecycle {
    record Pending() implements Lifecycle {}
    record Billed(String invoiceId) implements Lifecycle {}
    record Rejected(Reason reason) implements Lifecycle {}
    record InReview(ApprovalId approvalId) implements Lifecycle {}
}</pre>
Since these scenarios require a fair amount of domain knowledge, let me use a simple and familiar example: JSON values. There are four kinds of primitive values, and arrays and objects.

Make the leaves of the inheritance tree into records, or, if they only have finitely many instances, into enums. And all other types into sealed interfaces:
<pre>sealed interface JSONValue {}
sealed interface JSONPrimitive extends JSONValue {}
enum JSONBoolean implements JSONPrimitive { FALSE, TRUE; }
enum JSONNull implements JSONPrimitive { INSTANCE; }
record JSONNumber(double value) implements JSONPrimitive {}
record JSONString(String value) implements JSONPrimitive {}
record JSONArray(List<JSONValue> values) implements JSONValue {}
record JSONObject(Map<String, JSONValue> entries) implements JSONValue {}
Now we can use pattern matching:
<pre>static String quote(String s) {
    return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
}

static String stringify(JSONValue j) {
    return switch (j) {
        case JSONNumber(var v) -> "" + v;
        case JSONString(var s) -> quote(s);
        case JSONBoolean.TRUE -> "true";
        case JSONBoolean.FALSE -> "false";
        case JSONNull.INSTANCE -> "null";
        case JSONArray(var values) ->
            values.stream()
                .map(v -> stringify(v)) // stringify is static, so no this::stringify
                .collect(Collectors.joining(",", "[", "]"));
        case JSONObject(var entries) ->
            entries.entrySet()
                .stream()
                .map(e -> quote(e.getKey()) + ":" + stringify(e.getValue()))
                .collect(Collectors.joining(",", "{", "}"));
    };
}</pre>
This is a switch expression. Each case yields a value (after the -> token). The expression switch (j) { ... } yields the value of the matching case. The return statement returns that value.
The value in parentheses in switch (j) is called the selector. In our case, the type of the selector j is the JSONValue interface.
Note the record patterns, such as:
<pre>case JSONNumber(<b>var v</b>) -> "" + v;</pre>
If j is a JSONNumber, the variable v is set to the record component. The type of v is the component type, in this case double.
Also note that some cases are enum instances, such as case JSONBoolean.TRUE -> .... These are called constant patterns.
Finally, note that the switch is exhaustive. It covers all possible values of the selector. All switch expressions must be exhaustive, because no matter what the selector is, the expression must yield a value.
Ok, not all values are covered. What if the selector j is null? Then a NullPointerException is thrown. If j is new JSONString(null), then s is null, also causing an NPE. That is just to be expected. Generally, null is exempted from exhaustiveness checking because it would be too exhausting to check for it, particularly in nested positions.
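To see all the pieces working together, here is a self-contained, compilable version of the hierarchy and the stringify method. The enclosing class JsonDemo is my own scaffolding, not part of the article's API:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class JsonDemo {
    sealed interface JSONValue {}
    sealed interface JSONPrimitive extends JSONValue {}
    enum JSONBoolean implements JSONPrimitive { FALSE, TRUE }
    enum JSONNull implements JSONPrimitive { INSTANCE }
    record JSONNumber(double value) implements JSONPrimitive {}
    record JSONString(String value) implements JSONPrimitive {}
    record JSONArray(List<JSONValue> values) implements JSONValue {}
    record JSONObject(Map<String, JSONValue> entries) implements JSONValue {}

    static String quote(String s) {
        return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }

    static String stringify(JSONValue j) {
        return switch (j) {
            case JSONNumber(var v) -> "" + v;
            case JSONString(var s) -> quote(s);
            case JSONBoolean.TRUE -> "true";
            case JSONBoolean.FALSE -> "false";
            case JSONNull.INSTANCE -> "null";
            case JSONArray(var values) -> values.stream()
                .map(JsonDemo::stringify)
                .collect(Collectors.joining(",", "[", "]"));
            case JSONObject(var entries) -> entries.entrySet().stream()
                .map(e -> quote(e.getKey()) + ":" + stringify(e.getValue()))
                .collect(Collectors.joining(",", "{", "}"));
        };
    }

    public static void main(String[] args) {
        JSONValue v = new JSONArray(List.of(
            new JSONNumber(42), JSONBoolean.TRUE, JSONNull.INSTANCE));
        System.out.println(stringify(v)); // [42.0,true,null]
    }
}
```

Note that the compiler verifies exhaustiveness here: the two enum constants of JSONBoolean together cover that type, so no default branch is needed.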
Why is pattern matching nice? The object-oriented alternative would have been to add a stringify method to all levels of the hierarchy:
<ul><li>as an abstract or default method in each interface</li>
<li>as a concrete method in each record or enum</li>
</ul>
That is easy enough to do—after all, it is a sealed hierarchy. But it has two drawbacks. First, only the owner of the hierarchy can add methods. Second, the logic of a single action (here, stringification) is sprinkled over multiple classes.
By using an external method and pattern matching, the logic is all in one place. And anyone, not just the hierarchy owner, can go forth and pattern match. Without the need of a visitor pattern. This is good.
<h2>Future Niceness</h2>
Optional could have been declared as a sealed interface whose subtypes are a record Optional.Of and an enum Optional.Empty. Then you could use code like this:
<pre>var result = switch (stream.max(comparator)) { // Not actually
    case Optional.Of(x) -> x;
    case Optional.Empty.INSTANCE -> someDefault;
};</pre>
Of course, that is not how Optional actually works. But there are plans to make deconstruction work with arbitrary classes. Then you will be able to write something like this:
<pre>var result = switch (stream.max(comparator)) { // Maybe soon
    case Optional.of(x) -> x;
    case Optional.empty() -> someDefault;
};</pre>
The details are in flux, so I won’t belabor them. Once available, such “member patterns” (or whatever they will end up being called) will make pattern matching practical for a wider set of classes.
<h2>Naughty Fallthrough</h2>
The classic switch statement, which came to Java via C and C++, has a single raison d’être: to allow the compiler to construct a jump table.
If the labels fall in a compact range, the jump addresses can be in an array. Otherwise, the table is an array of pairs (label, address), sorted by label. Binary search finds the matching case.
As of Java 7, labels can also be strings. Then the jump table contains the hashes, and each jump target checks whether the string actually matches.
The jump table also explains the fallthrough behavior. After jumping to the code of the case, the program keeps running, until a break causes a jump to the end of the statement. Or, if there is no break, it keeps running with the instructions of the next case. Which is almost always unintended, and a common error. Stay away from it.
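To see why fallthrough is so error-prone, here is a minimal sketch of my own: the first case is missing its break, so the second case's code runs as well.

```java
public class FallthroughDemo {
    // Classic colon-form switch. The first case lacks a break,
    // so execution falls through into the next case.
    static String classify(int n) {
        String result = "";
        switch (n) {
            case 0:
                result += "zero ";
                // no break here: falls through!
            case 1:
                result += "small";
                break;
            default:
                result = "other";
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(classify(0)); // prints "zero small", not "zero "
    }
}
```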
Java 14 gave us four forms of switch: the classic statement; a lovely new switch expression; a switch statement without fallthrough; and, very naughty, a switch expression with fallthrough. That last form was only added in an effort to make the language more regular.
My advice: If you want a jump table, use the new switch statement without fallthrough. Simply use -> tokens instead of :, and drop the break statements.
<pre>switch (c) {
    case '0', '1', '2', '3', '4', '5', '6', '7', '8', '9' -> ndigit[c - '0']++;
    case ' ', '\n', '\t' -> nwhite++;
    default -> nother++;
}</pre>
If you want pattern matching, use a switch expression. For sure without fallthrough.
<h2>Type Patterns</h2>
You have seen record patterns (matching a record and extracting its components), as well as the more general member patterns of the future.
Another pattern, called a type pattern, checks whether the selector expression has a particular type. If so, it binds a variable to the cast value:
<pre>Object x = ...;
Object doubled = switch (x) {
    case String s -> s + s; // s is a String
    case Number n -> n.doubleValue() * 2; // n is a Number
    default -> List.of(x, x);
};</pre>
Of course, this example is completely artificial. When is the last time you had business logic that made type tests like this?
The blogosphere is full of examples like this, in order to illustrate the finer points of switch. Or to gleefully present puzzles that explore cruel conflicts between classic and modern syntax and semantics. Not that I would ever do such a thing.
It is not common to write code that starts with an Object and then narrows down the type. If that is important for you, go ahead and learn more about type patterns. But if you feel the need for the occasional instanceof, just stick with it.
In fact, instanceof has gotten better. A classic code snippet such as
<pre>if (x instanceof String) {
    String s = (String) x;
    <var>Do something with</var> s
}</pre>
is more easily expressed in modern Java as
<pre>if (x instanceof String s) {
    <var>Do something with</var> s
}</pre>
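As a concrete (made-up) example of the modern form, here is a compilable method. The pattern variable is only in scope where the test has succeeded:

```java
public class InstanceofDemo {
    // If x is a String, the pattern variable s is bound to it,
    // already typed as String, with no explicit cast.
    static int stringLengthOrDefault(Object x) {
        if (x instanceof String s) {
            return s.length();
        }
        return -1;
    }

    public static void main(String[] args) {
        System.out.println(stringLengthOrDefault("hello")); // 5
        System.out.println(stringLengthOrDefault(42));      // -1
    }
}
```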
<h2>Primitive Patterns</h2>
The classic switch statement in Java 1.0 permitted selector types int, short, char, and byte. Why not long, float, double, or boolean? They aren’t all that useful with jump tables.
As of Java 25, the selector type can be any type except those four. A proposal, now in its fourth preview, aims to remedy this anomaly and make the language more regular.
If you just use constant case labels, this is unsurprising.
<pre>double x = ...;
String result = switch (x) { // JEP 530 allows selector of type double
    case 3.141592653589793 -> "π";
    case 1.4142135623730951 -> "√2";
    default -> "something else";
};</pre>
But there are also primitive patterns:
<pre>result = switch (x) {
    case <b>int n</b> when n % 2 == 0 -> "an even integer";
    case <b>float _</b> -> "a float";
    default -> "something else";
};</pre>
The selector is a 64-bit double. The first case tests whether it actually represents a 32-bit int. The second case checks whether it fits into a 32-bit float without losing any bits of information. To fully understand the latter, you need to be familiar with the internals of the IEEE 754 floating-point standard.
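If you cannot use the preview feature yet, the first test can be approximated by hand. This is my own sketch of what "exact conversion" to int means; the authoritative rules are in the JEP and the JLS:

```java
public class ExactConversionDemo {
    // A double "is" an int if converting to int and back recovers
    // the same value. The extra term excludes -0.0, which has no
    // exact int equivalent (it would round-trip to +0.0).
    static boolean representsInt(double x) {
        return (double) (int) x == x && !(x == 0.0 && 1.0 / x < 0);
    }

    public static void main(String[] args) {
        System.out.println(representsInt(3.0));  // true
        System.out.println(representsInt(3.5));  // false
        System.out.println(representsInt(1e18)); // false: too large for int
        System.out.println(representsInt(-0.0)); // false
    }
}
```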
What if you also toss in some wrapper types? Of course, I would never do this. Just kidding, I certainly would in the interest of creating yet another naughty puzzler. But Simon Ritter beat me to it:
<pre>int x = ...;
switch (x) {
    case Integer i -> System.out.println("int");
    case byte b -> System.out.println("byte");
}</pre>
This should not compile. After all, the second case can never happen, and pattern matching is generally good about flagging such dominance.
But it does compile. In this instance, poor switch is getting overwhelmed. There is so much historical baggage that must be respected. And sometimes the results are counterintuitive.
Do not mix primitive patterns and type patterns in the same switch. They do completely different things. A primitive pattern checks whether a value can be converted to a different type. A type pattern checks whether a value belongs to a different type.
The conversion tests can be useful, but in many practical situations, they work better with instanceof:
<pre>int x = ...;
if (x instanceof byte b) {
    out.write(b);
} else { // x is not between -128 and 127
    ...
}</pre>
Ok, maybe it’s not that useful. Normally you have bytes between 0 and 255. But that’s another story.
Project Valhalla promises to let us define our own types that act like primitive types, such as long double, short float, unsigned byte. It is not yet clear how pattern matching will work with those types, but I would not be surprised if it was complex and a fertile ground for nasty puzzlers.
Right now, there is a lot of noise about primitive patterns, because it is a new feature. But it is unlikely to impact many programmers. Except as pitfalls. Consider this:
<pre>JSONValue o = ...;
var result = switch (o) {
    case JSONNumber(int x) -> x;
    case JSONNull.INSTANCE -> 0;
    default -> throw new IllegalArgumentException();
};</pre>
Did the programmer really mean case JSONNumber(int x)? It is an easy mistake to accidentally write int instead of double. Before primitive patterns, the compiler rejected this. Now it has an exciting new meaning: Is o an instance of JSONNumber whose value component is actually an integer? This can be useful, of course, if intended. But what if it isn’t?
Tip: Get into the habit of always using var with record patterns. Then you don’t run into this issue.
<h2>Constant Patterns</h2>
In our sealed JSON hierarchy, we had a mixture of record patterns and enum constant patterns:
<pre>case JSONNumber(var v) -> "" + v;
case <b>JSONBoolean.TRUE</b> -> "true"; // a constant pattern</pre>
And that’s fine.
A classic jump table switch only has constant patterns. That's fine too.
For now, constant patterns have an unfortunate limitation: they don’t nest.
<pre>record Point(int x, int y) {}
Point p = ...;
var result = switch (p) {
    case Point(0, _) -> "on the y-axis"; // ERROR, can't nest constant pattern
    ...
}</pre>
You have to write:
<pre>var result = switch (p) {
    case Point(var x, _) when x == 0 -> "on the y-axis";
    ...
}</pre>
This limitation may get fixed at some point in the future.
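Here is the guarded workaround as a compilable sketch (the class and labels are my own; points with x == 0 lie on the y-axis, points with y == 0 on the x-axis):

```java
public class PointDemo {
    record Point(int x, int y) {}

    // Guards (when clauses) stand in for the nested constant
    // patterns that the language does not yet allow.
    static String locate(Point p) {
        return switch (p) {
            case Point(var x, var y) when x == 0 && y == 0 -> "origin";
            case Point(var x, var y) when x == 0 -> "on the y-axis";
            case Point(var x, var y) when y == 0 -> "on the x-axis";
            default -> "elsewhere";
        };
    }

    public static void main(String[] args) {
        System.out.println(locate(new Point(0, 3))); // on the y-axis
    }
}
```

The default branch is required: guarded patterns do not count toward exhaustiveness, since the compiler cannot reason about arbitrary boolean conditions.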
Even with top-level constant patterns, the rules can get pretty arcane. For example, what is wrong with this?
<pre>Object x = ...;
String result = switch (x) {
    case "" -> "empty";
    case 0 -> "zero";
    case JSONNull.INSTANCE -> "null";
    default -> "something else";
};</pre>
You can only use string cases when the selector type is String, and integer cases when the selector type is int or Integer, or short, char, or byte, but not long, double, or float. Here the selector type is Object, so those cases are not allowed. Yet, with an Object selector, enum constant cases are ok.
You are unlikely to run into real-life switch expressions with Object or Integer selectors, so this too is more of an issue for puzzlers than real-life scenarios.
<h2>Conclusion</h2>
There is a lot going on with modern switch, and some usages are nicer than others.
The sweet spot is the “sealed hierarchies of records and enums” use case. For bragging rights, call it “algebraic data types”.
In the future, that convenience will be extended to other classes such as Optional.
For other type tests, prefer the modern form of instanceof over a switch with type patterns.
Also, if you want to convert between primitive types, try the new instanceof form first.
If you use jump tables, that’s totally fine. But refactor without fallthrough.
Blogs and puzzler presentations will delight in exploring the interactions between classic and “enhanced” switches, which can get arcane and complex. (Nobody could have predicted that.)
Don’t let that scare you away from using pattern matching. It is a truly useful feature, and it is well worth organizing appropriate parts of your code with pattern matching in mind.
steinhauer.software