storescu crash with large JPEG Baseline coded dataset

Questions regarding other OFFIS DICOM tools

Moderator: Moderator Team

andreasb
Posts: 24
Joined: Thu, 2011-02-24, 11:27
Location: Bremen

storescu crash with large JPEG Baseline coded dataset

#1 Post by andreasb » Tue, 2015-08-11, 16:38

Hi,
we are using storescu 3.6.0, compiled with ON_THE_FLY_COMPRESSION, on 64-bit Windows. With certain large datasets (UltrasoundMultiframeImageStorage) encoded as JPEGBaseline (800x600 with more than 3000 frames), storescu crashes during image decompression - it looks like an access violation in memcpy(). Is this a known issue? Are the images simply too large for storescu, is there a workaround to prevent the crash, or, better yet, has it already been fixed on the main branch?

Thanks,
- Andreas

andreasb
Posts: 24
Joined: Thu, 2011-02-24, 11:27
Location: Bremen

Re: storescu crash with large JPEG Baseline coded dataset

#2 Post by andreasb » Wed, 2015-08-12, 07:35

OK, I just looked into it a little more: the uncompressed size of the dataset is more than 4 GB, which causes an integer overflow in the uncompressed size calculation in DJCodecDecoder::decode(). The memory allocated for the decompressed data is therefore too small, and the crash happens later in memcpy().
I checked the current version: there, the Uint32 used for the length has been replaced by size_t (in a large check-in that fixed the respective warnings). This may fix the crash on 64-bit platforms, but since the decompressed image cannot be written as DICOM anyway, it would still be a problem for storescu. I will check whether I can add a respective check.

andreasb
Posts: 24
Joined: Thu, 2011-02-24, 11:27
Location: Bremen

Re: storescu crash with large JPEG Baseline coded dataset

#3 Post by andreasb » Wed, 2015-08-12, 08:42

The following prevents the crash (this is for the 3.6.0 version):

Code: Select all

Index: djcodecd.cc
===================================================================
--- djcodecd.cc	(revision 5186)
+++ djcodecd.cc	(working copy)
@@ -152,21 +152,26 @@
               {
                 Uint32 frameSize = ((precision > 8) ? sizeof(Uint16) : sizeof(Uint8)) * imageRows * imageColumns * imageSamplesPerPixel;
                 Uint32 totalSize = frameSize * imageFrames;
+                if (totalSize/frameSize != imageFrames)  // check for integer overflow
+                  result = EC_MemoryExhausted;
                 if (totalSize & 1) totalSize++; // align on 16-bit word boundary
                 Uint16 *imageData16 = NULL;
                 Sint32 currentFrame = 0;
                 Uint32 currentItem = 1; // ignore offset table
 
-                if (isYBR && (imageBitsStored < imageBitsAllocated)) // check for a special case that is currently not handled properly
+                if (result.good())
                 {
-                  if (djcp->getDecompressionColorSpaceConversion() != EDC_never)
+                  if (isYBR && (imageBitsStored < imageBitsAllocated)) // check for a special case that is currently not handled properly
                   {
-                    DCMJPEG_WARN("BitsStored < BitsAllocated for JPEG compressed image with YCbCr color model, color space conversion will probably not work properly");
-                    DCMJPEG_DEBUG("workaround: use option --conv-never (for command line tools) or EDC_never (for the DJDecoderRegistration::registerCodecs() call)");
+                    if (djcp->getDecompressionColorSpaceConversion() != EDC_never)
+                    {
+                      DCMJPEG_WARN("BitsStored < BitsAllocated for JPEG compressed image with YCbCr color model, color space conversion will probably not work properly");
+                      DCMJPEG_DEBUG("workaround: use option --conv-never (for command line tools) or EDC_never (for the DJDecoderRegistration::registerCodecs() call)");
+                    }
                   }
+
+                  result = uncompressedPixelData.createUint16Array(totalSize / sizeof(Uint16), imageData16);
                 }
-
-                result = uncompressedPixelData.createUint16Array(totalSize / sizeof(Uint16), imageData16);
                 if (result.good())
                 {
                   Uint8 *imageData8 = (Uint8 *)imageData16;

Michael Onken
DCMTK Developer
Posts: 1805
Joined: Fri, 2004-11-05, 13:47
Location: Oldenburg, Germany
Contact:

Re: storescu crash with large JPEG Baseline coded dataset

#4 Post by Michael Onken » Wed, 2015-08-12, 11:23

Hi,

my first impression is that it makes sense to perform the decompression in the codec using the size_t approach, since otherwise you would not be able to uncompress and (e.g.) display such images at all. Only when you try to transfer the data via some DICOM service (DICOM-conformant network or media-based transfer) should the library give an error - and of course it should not crash.

What do you think?

Best,
Michael

andreasb
Posts: 24
Joined: Thu, 2011-02-24, 11:27
Location: Bremen

Re: storescu crash with large JPEG Baseline coded dataset

#5 Post by andreasb » Wed, 2015-08-12, 12:17

I agree - I just didn't want to integrate the size_t changes into our 3.6.0 version at the moment.
The check will be needed anyway on 32-bit platforms, where size_t is still 32 bits wide and the crash would occur regardless. On 64-bit platforms the check will always succeed with the size_t approach, and the actual problem (a dataset larger than 4 GB) would then be handled on the writing side, as you propose.

Regards,
Andreas

