
I benchmarked all .NET serializers for .NET 10

Transcript

.NET 10 dropped some serious performance improvements for serialization. Today, I'm going to walk you through the results from my open-source benchmark suite, comparing both JSON and binary serializers. In this update, three libraries shipped major versions: MessagePack, MongoDB.Bson, and ServiceStack.Text. The rest of the ecosystem got minor and patch updates, with healthy maintenance across the board. Even Newtonsoft.Json got a patch after years of stability. However, Jil is gone. It hasn't been updated since 2019 and throws an AmbiguousMatchException on .NET 7 and above. So, if you're still using Jil, it's time to migrate.

Here's a quick rundown of the test environment. All tests were run on my Ryzen 9 7950X3D with 64 GB of RAM, on the latest versions of .NET 9 and .NET 10. BenchmarkDotNet version 0.15.8 handles the heavy lifting, using its default settings. For the data, we use the same three data sets as in previous videos: a small data set with 200 users, a medium data set with 5,000 users, and a large data set with 20,000 users.
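For readers who want to reproduce something similar, here's a minimal sketch of the kind of BenchmarkDotNet harness described above. The `User` record and its fields are hypothetical stand-ins for the actual benchmark models; the `[Params]` values mirror the three data-set sizes from the video.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

// Hypothetical model standing in for the real benchmark data.
public record User(int Id, string Name, string Email);

[MemoryDiagnoser]
public class SerializationBenchmarks
{
    private List<User> _users = new();

    // The three data-set sizes used in the video.
    [Params(200, 5_000, 20_000)]
    public int Count;

    [GlobalSetup]
    public void Setup() =>
        _users = Enumerable.Range(0, Count)
            .Select(i => new User(i, $"User {i}", $"user{i}@example.com"))
            .ToList();

    [Benchmark]
    public string SystemTextJson() => JsonSerializer.Serialize(_users);
}

public class Program
{
    public static void Main() => BenchmarkRunner.Run<SerializationBenchmarks>();
}
```

Run it in Release mode (`dotnet run -c Release`) or BenchmarkDotNet will refuse to produce meaningful numbers.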

All tests were performed after a full system reboot with minimal applications running in the background to reduce interference. Let's see the results.

Let's start with JSON since that's what most of you are probably using in your APIs. For large data set serialization, System.Text.Json went from 44 milliseconds down to 27 milliseconds. ServiceStack.Text leads the way with a 40% improvement on the large data set. Even the older Newtonsoft.Json saw a 36% improvement. For small data set serialization, SpanJson saw a modest 5% improvement, but it was already a heavily optimized library. System.Text.Json improved 24%. ServiceStack and Newtonsoft.Json continued their gains with 19% and 24% respectively. If you want to look at the benchmarks in more detail or want to support the channel, head over to codewithstu.tv.
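As a point of reference, the System.Text.Json path being measured is just a call to `JsonSerializer.Serialize`. A minimal example, with a hypothetical `User` record in place of the benchmark models:

```csharp
using System;
using System.Text.Json;

// Hypothetical model; the real benchmarks use richer user objects.
public record User(int Id, string Name);

public class Demo
{
    public static void Main()
    {
        var users = new[] { new User(1, "Ada"), new User(2, "Grace") };

        // Web defaults give camelCase property names, as an ASP.NET Core API would.
        string json = JsonSerializer.Serialize(
            users, new JsonSerializerOptions(JsonSerializerDefaults.Web));

        Console.WriteLine(json);
    }
}
```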

Deserialization follows a similar pattern. For large data set deserialization, System.Text.Json had a 29% improvement, going from 91 milliseconds down to 65 milliseconds. Utf8Json improved 19% on the large data set. ServiceStack and Newtonsoft.Json each saw 10% gains. For small data set deserialization, System.Text.Json had a 34% improvement, dropping from 631 microseconds down to 417 microseconds. SpanJson improved 22%. ServiceStack and Newtonsoft.Json continued their gains with 25% and 24% respectively.
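The deserialization side is the mirror image: `JsonSerializer.Deserialize<T>` turns the JSON text back into strongly typed objects. Again, the `User` record here is a hypothetical stand-in:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Hypothetical model matching the JSON payload below.
public record User(int Id, string Name);

public class Demo
{
    public static void Main()
    {
        const string json = """[{"Id":1,"Name":"Ada"},{"Id":2,"Name":"Grace"}]""";

        // Deserialize the array back into a typed list.
        List<User>? users = JsonSerializer.Deserialize<List<User>>(json);

        Console.WriteLine(users?.Count); // 2
    }
}
```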

Now let's talk binary formats. These are typically used when you need maximum performance or compact payloads. For large data set serialization, MemoryPack leads at 4.4 milliseconds with a 12% improvement. GroBuf and MessagePack each improved 15%. Hyperion saw the biggest gain at 27%. For context, that's 3 to 7 times faster than the fastest JSON serializers. For small data set serialization, MessagePack had a 25% improvement, dropping from 104 to 78 microseconds. MemoryPack improved 22%. protobuf-net saw an 18% gain. Across the board, binary serializers saw 10 to 27% improvements. For the full benchmark source code, you can grab it from GitHub via codewithstu.tv.

Deserialization follows a similar pattern, though the gains are more modest. For large data set deserialization, MemoryPack leads at 28 milliseconds with a 4% improvement. GroBuf and Bebop are close behind at 31 milliseconds with 5% and 2% respectively. For small data set deserialization, MessagePack had a 20% improvement. MemoryPack and Hyperion improved 11% and 15% respectively.
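To make the binary trade-off concrete, here's a round-trip sketch using the MessagePack-CSharp library, which requires annotating your types with `[MessagePackObject]` and `[Key]` attributes (or using a contractless resolver). The `User` type and its members are hypothetical:

```csharp
using System;
using MessagePack;

// MessagePack needs explicit key annotations for its compact format.
[MessagePackObject]
public record User
{
    [Key(0)] public int Id { get; init; }
    [Key(1)] public string Name { get; init; } = "";
}

public class Demo
{
    public static void Main()
    {
        var user = new User { Id = 1, Name = "Ada" };

        // Round-trip through a compact binary payload instead of text.
        byte[] bytes = MessagePackSerializer.Serialize(user);
        var back = MessagePackSerializer.Deserialize<User>(bytes);

        Console.WriteLine($"{bytes.Length} bytes, Name = {back.Name}");
    }
}
```

Note the payload isn't human-readable, which is exactly the debuggability trade-off discussed below.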

So, which one should you use? Start with JSON. It's universal, debuggable, and fast enough for most workloads. Reserve binary for when you've measured a real bottleneck or when your protocol requires it. For JSON, System.Text.Json is a great default. It's built-in, it's fast, and most developers will know how to use it. If you do need binary, MemoryPack is the fastest when you control both ends. MessagePack is your best bet for cross-platform interop. The real cost of binary is the maintenance and debuggability. You're trading human readability for throughput. Make sure that trade-off is worth it. Across the board, things got a lot faster. System.Text.Json improvements are substantial and the runtime improvements benefit everyone.

If you enjoyed this video, consider subscribing to the YouTube channel for more content like this.
