Advanced Serialization and Deserialization in C#

If you're building APIs, caches, or distributed systems, you'll use serialization a lot. At a high level, serialization turns objects into text or bytes, and deserialization rebuilds them. Simple cases are easy, but real systems have cycles, polymorphism, and performance needs. This guide walks through practical solutions and things to watch out for using System.Text.Json, Newtonsoft.Json, and binary formats.

Optimizing System.Text.Json and Newtonsoft.Json

The two JSON libraries you'll see most are System.Text.Json and Newtonsoft.Json (Json.NET). Pick System.Text.Json for speed and smaller deployments. Use Newtonsoft.Json if you depend on features such as TypeNameHandling or other conveniences that System.Text.Json doesn't cover as directly.

Here's a simple example showing how to configure System.Text.Json for smaller, faster JSON. These options are small wins that add up in production.

using System.Text.Json;
using System.Text.Json.Serialization;

// Define a sample class
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

// Optimized serialization
var product = new Product { Id = 1, Name = "Laptop", Price = 1299.99m };
var options = new JsonSerializerOptions
{
    WriteIndented = false,
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
    DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
};

string json = JsonSerializer.Serialize(product, options);
Console.WriteLine(json);

Quick notes: camel casing is the common convention for APIs, ignoring nulls reduces payload size, and disabling indentation trims bandwidth. For heavy workloads, source-generated (precompiled) serializers and reusing a single options instance help a lot; caching comes up again below.
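
One way to cut that reflection cost on .NET 6 and later is System.Text.Json source generation: you declare a partial context class, the compiler generates the serialization metadata, and you pass the generated type info instead of relying on runtime reflection. A minimal sketch for the Product type above (the context class name is just illustrative):

using System.Text.Json;
using System.Text.Json.Serialization;

// The source generator fills in this partial class at compile time.
[JsonSourceGenerationOptions(PropertyNamingPolicy = JsonKnownNamingPolicy.CamelCase,
                             DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull)]
[JsonSerializable(typeof(Product))]
public partial class ProductJsonContext : JsonSerializerContext
{
}

// No reflection on the hot path: the precompiled metadata is used directly.
string json = JsonSerializer.Serialize(product, ProductJsonContext.Default.Product);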

Newtonsoft.Json has similar knobs to the options shown above. If you're stuck on it for compatibility reasons, tweak these settings to get better throughput and smaller payloads.

using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

var settings = new JsonSerializerSettings
{
    Formatting = Formatting.None,
    NullValueHandling = NullValueHandling.Ignore,
    ContractResolver = new CamelCasePropertyNamesContractResolver()
};

string jsonNet = JsonConvert.SerializeObject(product, settings);
Console.WriteLine(jsonNet);

TL;DR: cache options, reuse converters, and avoid repeated reflection-based work when serializing hot types.
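
Concretely, the cheapest win is creating JsonSerializerOptions once and reusing it everywhere; every new instance has to rebuild and re-cache per-type metadata. A small sketch, continuing from the Product example above with the same usings (the holder class name is arbitrary):

public static class JsonDefaults
{
    // Built once; System.Text.Json caches type metadata inside this instance.
    public static readonly JsonSerializerOptions Options = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
    };
}

// Hot path: no per-call options allocation, no metadata rebuild.
string json = JsonSerializer.Serialize(product, JsonDefaults.Options);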

Handling Circular References

Circular references are a common headache. For example, two employees might reference each other as manager and direct report. Naive serialization recurses endlessly into the cycle, so you need a strategy to either preserve references or break them.

public class Employee
{
    public string Name { get; set; }
    public Employee Manager { get; set; }
}

var alice = new Employee { Name = "Alice" };
var bob = new Employee { Name = "Bob", Manager = alice };
alice.Manager = bob; // Circular reference

If you try to serialize a loop without handling it, you'll get a runtime failure: Newtonsoft.Json throws a JsonSerializationException ("Self referencing loop detected"), and System.Text.Json throws a JsonException once the cycle exceeds its depth limit. Newtonsoft offers settings to ignore loops or serialize references; choose based on how you want the JSON to look and who consumes it.

var settings = new JsonSerializerSettings
{
    ReferenceLoopHandling = ReferenceLoopHandling.Ignore
};

string json = JsonConvert.SerializeObject(alice, settings);
Console.WriteLine(json);

Alternatively, PreserveReferencesHandling.Objects writes $id/$ref metadata so the cycle is preserved and the full object graph can be round-tripped:

var settings = new JsonSerializerSettings
{
    PreserveReferencesHandling = PreserveReferencesHandling.Objects,
    Formatting = Formatting.Indented
};

string json = JsonConvert.SerializeObject(alice, settings);
Console.WriteLine(json);

In System.Text.Json, circular references are handled with ReferenceHandler.Preserve:

var options = new JsonSerializerOptions
{
    ReferenceHandler = ReferenceHandler.Preserve,
    WriteIndented = true
};

string json = JsonSerializer.Serialize(alice, options);
Console.WriteLine(json);

Both approaches avoid runtime crashes. If your consumers expect plain JSON, you might ignore loops. If you need to reconstruct the exact object graph, preserve references (but be aware the JSON will include metadata tokens).
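
To make those metadata tokens concrete, the System.Text.Json output above looks roughly like {"$id":"1","Name":"Alice","Manager":{"$id":"2","Name":"Bob","Manager":{"$ref":"1"}}} (plus indentation), and deserializing with the same options stitches the cycle back together:

// Round-trip: the same ReferenceHandler.Preserve options must be used on the way back in.
var restored = JsonSerializer.Deserialize<Employee>(json, options);

// Alice's manager's manager is Alice again: the cycle survived the round trip.
Console.WriteLine(ReferenceEquals(restored, restored.Manager.Manager)); // True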

Polymorphism in Serialization

Polymorphism shows up when a collection holds different concrete types behind a base type. By default JSON doesn't record which concrete type was used, so deserialization can't rebuild the right object unless you provide a hint — typically a discriminator or a custom converter.

Using Newtonsoft.Json, you can serialize polymorphic lists with type metadata:

public abstract class Shape { public string Name { get; set; } }
public class Circle : Shape { public double Radius { get; set; } }
public class Rectangle : Shape { public double Width { get; set; } }

var shapes = new List<Shape> { new Circle { Name = "C1", Radius = 5 }, new Rectangle { Name = "R1", Width = 10 } };

var settings = new JsonSerializerSettings
{
    TypeNameHandling = TypeNameHandling.Auto,
    Formatting = Formatting.Indented
};

string json = JsonConvert.SerializeObject(shapes, settings);
Console.WriteLine(json);

Newtonsoft can include type metadata automatically (be careful with security; more on that later). For System.Text.Json, the portable approach is a small custom converter that reads a discriminator field and dispatches to the right concrete type (on .NET 7+ the built-in [JsonPolymorphic] and [JsonDerivedType] attributes are another option).

using System;
using System.Text.Json;
using System.Text.Json.Serialization;

public class ShapeConverter : JsonConverter<Shape>
{
    public override Shape Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        using var doc = JsonDocument.ParseValue(ref reader);
        var type = doc.RootElement.GetProperty("ShapeType").GetString();

        return type switch
        {
            "Circle" => JsonSerializer.Deserialize<Circle>(doc.RootElement.GetRawText(), options),
            "Rectangle" => JsonSerializer.Deserialize<Rectangle>(doc.RootElement.GetRawText(), options),
            _ => throw new NotSupportedException($"Unknown shape type: {type}")
        };
    }

    public override void Write(Utf8JsonWriter writer, Shape value, JsonSerializerOptions options)
    {
        using var doc = JsonDocument.Parse(JsonSerializer.Serialize(value, value.GetType(), options));
        writer.WriteStartObject();
        foreach (var prop in doc.RootElement.EnumerateObject())
        {
            prop.WriteTo(writer);
        }
        writer.WriteString("ShapeType", value.GetType().Name);
        writer.WriteEndObject();
    }
}

Register the converter with your JsonSerializerOptions so the serializer knows how to handle the polymorphic shape.
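
For example, reusing the shapes list from the Newtonsoft snippet:

var options = new JsonSerializerOptions { WriteIndented = true };
options.Converters.Add(new ShapeConverter());

string json = JsonSerializer.Serialize(shapes, options);
var roundTripped = JsonSerializer.Deserialize<List<Shape>>(json, options);
Console.WriteLine(roundTripped[0] is Circle); // True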

Binary and Custom Serialization Formats

JSON is great, but when you need really small messages or speed (think RPC across services), binary formats like MessagePack or Protobuf are worth considering — they reduce size and CPU time.

Important: avoid BinaryFormatter — it's insecure. Instead use modern libraries like MessagePack or protobuf-net. They require schemas or attributes, but they're fast and safe when used correctly.

using MessagePack;

[MessagePackObject]
public class Employee
{
    [Key(0)]
    public int Id { get; set; }
    [Key(1)]
    public string Name { get; set; }
}

var employee = new Employee { Id = 1, Name = "Alice" };
byte[] binaryData = MessagePackSerializer.Serialize(employee);

Employee deserialized = MessagePackSerializer.Deserialize<Employee>(binaryData);
Console.WriteLine(deserialized.Name);
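
protobuf-net, mentioned above, works along the same lines: attribute the type, give each member a stable field number, and serialize to a stream. A rough sketch (the class is named EmployeeProto here only to avoid clashing with the MessagePack example):

using System.IO;
using ProtoBuf;

[ProtoContract]
public class EmployeeProto
{
    [ProtoMember(1)]
    public int Id { get; set; }
    [ProtoMember(2)]
    public string Name { get; set; }
}

using var stream = new MemoryStream();
Serializer.Serialize(stream, new EmployeeProto { Id = 1, Name = "Alice" });
byte[] payload = stream.ToArray();

stream.Position = 0;
EmployeeProto restored = Serializer.Deserialize<EmployeeProto>(stream);
Console.WriteLine(restored.Name); // Alice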

Custom serialization is handy when you need to skip sensitive fields or support versioning. For the runtime serialization infrastructure, implement ISerializable; for JSON, use ignore attributes or custom converters to control exactly what gets written and read.

using System;
using System.Runtime.Serialization;

[Serializable]
public class SecureUser : ISerializable
{
    public string Username { get; set; }
    [NonSerialized]
    public string Password;

    public SecureUser() { }

    protected SecureUser(SerializationInfo info, StreamingContext context)
    {
        Username = info.GetString("Username");
        // Avoid serializing sensitive fields
    }

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("Username", Username);
    }
}
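
Note that ISerializable and [NonSerialized] only affect the runtime serialization infrastructure; JSON serializers ignore them. The JSON-side equivalent is an ignore attribute or a custom converter. A minimal System.Text.Json sketch (the DTO name is illustrative):

using System.Text.Json.Serialization;

public class SecureUserDto
{
    public string Username { get; set; }

    [JsonIgnore] // never written to, or populated from, JSON
    public string Password { get; set; }
}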

Advanced Scenarios

For big datasets, stream the JSON to a file or network stream instead of building it all in memory — this avoids out-of-memory issues and is better for APIs that return large results.

await using var fs = File.Create("products.json");
await JsonSerializer.SerializeAsync(fs, largeProductList, options);
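
Reading works the same way in reverse. On .NET 6 and later you can also stream items out of a large JSON array one at a time instead of materializing the whole list; a sketch, assuming products.json holds an array of Product objects:

// Stream products back without loading the entire array into memory.
await using var readStream = File.OpenRead("products.json");
await foreach (Product? item in JsonSerializer.DeserializeAsyncEnumerable<Product>(readStream, options))
{
    Console.WriteLine(item?.Name);
}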

Use async serialization for web APIs to keep threads free, and design your formats to tolerate version changes so old clients don't break when you add fields.
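
For instance, System.Text.Json's defaults already give you some tolerance: unknown properties in the payload are skipped, and properties missing from the payload keep their defaults. A quick sketch against the Product type from earlier (the extra Category field is hypothetical):

// JSON written by a newer service version that added a "Category" field.
string newerJson = "{\"Id\":2,\"Name\":\"Mouse\",\"Price\":19.99,\"Category\":\"Peripherals\"}";

// The older Product model still deserializes; the unknown property is simply skipped.
var olderView = JsonSerializer.Deserialize<Product>(newerJson);
Console.WriteLine(olderView?.Name); // Mouse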

Debugging and Testing

Always write tests for your serialization paths. A missing converter or a property rename can silently break deserialization and cause hard-to-debug runtime errors. Test round-trips (serialize then deserialize) for the types that matter.

var serialized = JsonSerializer.Serialize(product, options);
var deserialized = JsonSerializer.Deserialize<Product>(serialized, options);
Debug.Assert(product.Id == deserialized.Id);
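
If you use a test framework, the same round trip fits naturally into a unit test. A sketch with xUnit, reusing the Product type (class and method names are arbitrary):

using System.Text.Json;
using Xunit;

public class ProductSerializationTests
{
    private static readonly JsonSerializerOptions Options = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase
    };

    [Fact]
    public void Product_round_trips_through_json()
    {
        var original = new Product { Id = 1, Name = "Laptop", Price = 1299.99m };

        var json = JsonSerializer.Serialize(original, Options);
        var restored = JsonSerializer.Deserialize<Product>(json, Options);

        Assert.NotNull(restored);
        Assert.Equal(original.Id, restored.Id);
        Assert.Equal(original.Name, restored.Name);
        Assert.Equal(original.Price, restored.Price);
    }
}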

Security Considerations

Security note: never trust external JSON blindly. Features like TypeNameHandling in Newtonsoft let the incoming JSON dictate which CLR types get instantiated, which attackers have abused for remote code execution. Prefer safer defaults (System.Text.Json), and when you do accept polymorphic data, validate or restrict the allowed types.
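
If you must accept type metadata with Newtonsoft, one common mitigation is an allow-list ISerializationBinder so only expected types can ever be resolved. A sketch reusing the Shape hierarchy from earlier (the binder name and the name-to-type mapping are illustrative):

using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

public class AllowListBinder : ISerializationBinder
{
    // Only these exact names may be bound back to CLR types.
    private static readonly Dictionary<string, Type> Allowed = new()
    {
        ["Circle"] = typeof(Circle),
        ["Rectangle"] = typeof(Rectangle)
    };

    public Type BindToType(string assemblyName, string typeName) =>
        Allowed.TryGetValue(typeName, out var type)
            ? type
            : throw new JsonSerializationException($"Type '{typeName}' is not allowed.");

    public void BindToName(Type serializedType, out string assemblyName, out string typeName)
    {
        assemblyName = null;
        typeName = serializedType.Name;
    }
}

var settings = new JsonSerializerSettings
{
    TypeNameHandling = TypeNameHandling.Auto,
    SerializationBinder = new AllowListBinder()
};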

Best Practices

  • Choose the right library: System.Text.Json for performance, Newtonsoft.Json for features.
  • Use reference handling for circular graphs.
  • Implement custom converters for polymorphic types.
  • Prefer binary formats for high-performance scenarios.
  • Stream large objects asynchronously.
  • Secure sensitive data using NonSerialized or custom converters.
  • Write unit tests for all serialization/deserialization paths.

Summary

In short: serialization is simple in small apps but gets tricky in real systems. Tweak your serializer settings, write converters for special types, handle cycles and polymorphism intentionally, and add tests. When speed or size matters, use binary formats and streaming. And above all — treat external input as untrusted.