This post is part of the series 'Vulnerabilities'. Be sure to check out the rest of the posts in the series!
If your website allows users to upload zip files that are extracted on the server, you should be careful. Websites are usually protected against large uploads: by default, Kestrel and IIS limit the size of a request to 30MB, and Apache and nginx have similar limits. However, this limit applies only to the uploaded file, not to its decompressed size. In the case of a zip file, a 10MB file can decompress to 281TB (source). Imagine what a 30MB upload could expand to. If you decompress such a file in memory or on disk, you may run out of memory or disk space very quickly. This is a denial-of-service (DoS) attack.
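You don't need an exotic zip bomb to see extreme ratios; even the Deflate algorithm used by standard zip tools compresses repetitive data dramatically. Here's a small sketch (the entry name `zeros.bin` is arbitrary) that compresses 10MB of zeros into an in-memory zip archive and prints both sizes:

```csharp
using System;
using System.IO;
using System.IO.Compression;

// Compress 10MB of zeros into a single zip entry. Highly repetitive
// data compresses extremely well, which is exactly what a zip bomb
// exploits: a tiny upload that expands to a huge payload.
using var ms = new MemoryStream();
using (var archive = new ZipArchive(ms, ZipArchiveMode.Create, leaveOpen: true))
{
    var entry = archive.CreateEntry("zeros.bin", CompressionLevel.Optimal);
    using var entryStream = entry.Open();
    var zeros = new byte[10 * 1024 * 1024]; // 10MB of zeros
    entryStream.Write(zeros, 0, zeros.Length);
}

Console.WriteLine($"Uncompressed: {10 * 1024 * 1024:N0} bytes");
Console.WriteLine($"Compressed:   {ms.Length:N0} bytes"); // a few kilobytes
```

The compressed archive ends up orders of magnitude smaller than its content, so a request-size limit alone tells you nothing about the cost of extraction.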

You need to check the contents of the zip file before processing it. Each entry in a zip file has a header that declares its uncompressed size. However, you cannot fully trust this value: it is only an informational field set by the tool that created the archive. A legitimate application sets it correctly, but an attacker will not. So you should first check the declared sizes from the headers to reject suspiciously large archives, and then throw an exception if the actual extracted size exceeds the declared one.
C#
const long MaxLength = 10 * 1024 * 1024; // 10MB

using var zipFile = ZipFile.OpenRead("sample.zip");

// Quickly check the declared sizes from the zip headers
var declaredSize = zipFile.Entries.Sum(entry => entry.Length);
if (declaredSize > MaxLength)
    throw new InvalidOperationException("Archive is too big");

foreach (var entry in zipFile.Entries)
{
    using var entryStream = entry.Open();

    // Use MaxLengthStream to ensure we don't read more than the declared length
    using var maxLengthStream = new MaxLengthStream(entryStream, entry.Length);

    // Be sure to read the content of the entry from maxLengthStream, not entryStream
    using var ms = new MemoryStream();
    maxLengthStream.CopyTo(ms);
}
Here's the code for the MaxLengthStream class. It is a read-only stream that throws an exception when you read more than the specified maximum length.
C#
internal sealed class MaxLengthStream : Stream
{
    private readonly Stream _stream;
    private long _length = 0L;

    public MaxLengthStream(Stream stream, long maxLength)
    {
        _stream = stream ?? throw new ArgumentNullException(nameof(stream));
        MaxLength = maxLength;
    }

    public long MaxLength { get; }

    public override bool CanRead => _stream.CanRead;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => _stream.Length;

    public override long Position
    {
        get => _stream.Position;
        set => throw new NotSupportedException();
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        var result = _stream.Read(buffer, offset, count);
        _length += result;
        if (_length > MaxLength)
            throw new InvalidOperationException("Stream is larger than the maximum allowed size");

        return result;
    }

    public override async Task<int> ReadAsync(byte[] buffer, int offset, int count, CancellationToken cancellationToken)
    {
        var result = await _stream.ReadAsync(buffer, offset, count, cancellationToken).ConfigureAwait(false);
        _length += result;
        if (_length > MaxLength)
            throw new InvalidOperationException("Stream is larger than the maximum allowed size");

        return result;
    }

    public override void Flush() => throw new NotSupportedException();
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            _stream.Dispose();
        }

        base.Dispose(disposing);
    }
}
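As a quick sanity check, here is a small sketch (the 100-byte and 50-byte values are arbitrary) showing that copying a stream larger than the declared limit throws:

```csharp
using System;
using System.IO;

// Wrap a 100-byte stream but declare a maximum of 50 bytes:
// copying it must throw before the full content is read.
var source = new MemoryStream(new byte[100]);
using var limited = new MaxLengthStream(source, maxLength: 50);
try
{
    limited.CopyTo(new MemoryStream());
    Console.WriteLine("No exception thrown (unexpected)");
}
catch (Exception)
{
    Console.WriteLine("Oversized stream rejected as expected");
}
```

Because the check happens inside `Read`, the copy stops as soon as the limit is crossed, so a lying zip entry never gets to fill your memory or disk.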
You are now safe when extracting archive files on your server.
Do you have a question or a suggestion about this post? Contact me!