DICOM @ OFFIS

Discussion Forum for OFFIS DICOM Tools

PostPosted: Tue, 2015-08-11, 16:38 
Joined: Thu, 2011-02-24, 11:27
Posts: 24
Location: Bremen
Hi,
we use storescu 3.6.0 compiled with ON_THE_FLY_COMPRESSION on 64-bit Windows. With some specific large data sets (UltrasoundMultiframeImageStorage) encoded as JPEG Baseline (800x600 with over 3000 frames), storescu crashes during image decoding - it looks like an access violation in memcpy(). Is this a known issue? Are the images simply too large for storescu? Is there a workaround or some idea to prevent the crash, or, better yet, has this already been fixed in the main branch?
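(Back-of-the-envelope, assuming 8-bit data with 3 samples per pixel: 800 x 600 x 3 bytes per frame times 3000 frames is roughly 4.3 GB of uncompressed pixel data, so the raw size alone would already exceed what fits into 32 bits.)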

Thanks,
- Andreas


PostPosted: Wed, 2015-08-12, 07:35 
Joined: Thu, 2011-02-24, 11:27
Posts: 24
Location: Bremen
Ok, just looked into it a little more - the uncompressed size of the data set is more than 4 GB, so it causes an integer overflow in the uncompressed size calculation in DJCodecDecoder::decode(). Thus the memory allocated for the decompressed data is too small, and the crash happens later in memcpy().
I checked the current version, and there the Uint32 used for the length has been replaced by size_t (in a large check-in to fix the respective warnings) - this may fix the crash on 64-bit platforms, but since the decompressed image still cannot be written as DICOM, it would remain a problem for storescu anyway. I will check whether I can add a corresponding check.
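
For illustration, a minimal standalone sketch (not DCMTK code; the frame dimensions and sample count are assumed from the data set described above) showing how the Uint32 multiplication wraps around, while a size_t computation on a 64-bit platform does not:

Code:
#include <cstddef>
#include <cstdint>
#include <cstdio>

int main()
{
    // assumed values: 800x600, 3 samples per pixel, 8 bits, 3000 frames
    uint32_t frameSize   = 800u * 600u * 3u;                 // 1,440,000 bytes per frame
    uint32_t imageFrames = 3000u;

    uint32_t totalSize32 = frameSize * imageFrames;          // wraps: 4,320,000,000 mod 2^32
    size_t   totalSize64 = (size_t)frameSize * imageFrames;  // correct on a 64-bit platform

    printf("Uint32 total: %u bytes\n", totalSize32);         // 25,032,704 - buffer far too small
    printf("size_t total: %zu bytes\n", totalSize64);        // 4,320,000,000

    // the same division test used in the patch below detects the overflow
    if (totalSize32 / frameSize != imageFrames)
        printf("overflow detected\n");
    return 0;
}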


PostPosted: Wed, 2015-08-12, 08:42 
Joined: Thu, 2011-02-24, 11:27
Posts: 24
Location: Bremen
The following prevents the crash (this is for the 3.6.0 version):

Code:
Index: djcodecd.cc
===================================================================
--- djcodecd.cc   (revision 5186)
+++ djcodecd.cc   (working copy)
@@ -152,21 +152,26 @@
               {
                 Uint32 frameSize = ((precision > 8) ? sizeof(Uint16) : sizeof(Uint8)) * imageRows * imageColumns * imageSamplesPerPixel;
                 Uint32 totalSize = frameSize * imageFrames;
+                if (totalSize/frameSize != imageFrames)  // check for integer overflow
+                  result = EC_MemoryExhausted;
                 if (totalSize & 1) totalSize++; // align on 16-bit word boundary
                 Uint16 *imageData16 = NULL;
                 Sint32 currentFrame = 0;
                 Uint32 currentItem = 1; // ignore offset table
 
-                if (isYBR && (imageBitsStored < imageBitsAllocated)) // check for a special case that is currently not handled properly
+                if (result.good())
                 {
-                  if (djcp->getDecompressionColorSpaceConversion() != EDC_never)
+                  if (isYBR && (imageBitsStored < imageBitsAllocated)) // check for a special case that is currently not handled properly
                   {
-                    DCMJPEG_WARN("BitsStored < BitsAllocated for JPEG compressed image with YCbCr color model, color space conversion will probably not work properly");
-                    DCMJPEG_DEBUG("workaround: use option --conv-never (for command line tools) or EDC_never (for the DJDecoderRegistration::registerCodecs() call)");
+                    if (djcp->getDecompressionColorSpaceConversion() != EDC_never)
+                    {
+                      DCMJPEG_WARN("BitsStored < BitsAllocated for JPEG compressed image with YCbCr color model, color space conversion will probably not work properly");
+                      DCMJPEG_DEBUG("workaround: use option --conv-never (for command line tools) or EDC_never (for the DJDecoderRegistration::registerCodecs() call)");
+                    }
                   }
+
+                  result = uncompressedPixelData.createUint16Array(totalSize / sizeof(Uint16), imageData16);
                 }
-
-                result = uncompressedPixelData.createUint16Array(totalSize / sizeof(Uint16), imageData16);
                 if (result.good())
                 {
                   Uint8 *imageData8 = (Uint8 *)imageData16;


PostPosted: Wed, 2015-08-12, 11:23 
DCMTK Developer

Joined: Fri, 2004-11-05, 13:47
Posts: 1622
Location: Oldenburg, Germany
Hi,

my first impression is that it makes sense to perform the decompression in the codec using the size_t approach, since otherwise you are not able to uncompress and (e.g.) display those images at all. Only if you try to transfer the result via some form of DICOM service (DICOM-conformant network or media-based transfer) should the library give an error. Of course, it should not crash in that case.
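
A minimal sketch of that idea (a hypothetical helper, not an existing DCMTK interface): the writing side would have to refuse any element value whose length no longer fits into the 32-bit length field of a DICOM element.

Code:
#include <cstddef>
#include <cstdint>

// Hypothetical check, not part of dcmtk: the largest explicit element length
// is 0xFFFFFFFE, because 0xFFFFFFFF is reserved as the "undefined length" marker.
bool fitsInDicomLengthField(size_t valueLength)
{
    return valueLength <= 0xFFFFFFFEu;
}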

What do you think?

Best,
Michael


PostPosted: Wed, 2015-08-12, 12:17 
Joined: Thu, 2011-02-24, 11:27
Posts: 24
Location: Bremen
I agree - I just didn't want to integrate the size_t changes into our 3.6.0 version at the moment.
The check will still be needed for 32-bit platforms, because size_t is only 32 bits wide there and the crash would occur anyway. On 64-bit platforms, the check will always succeed with the size_t approach, and the given problem (data set larger than 4 GB) would be handled on the writing side, as you propose.

Regards,
Andreas

