ImageMagick C++ API Output 16-bit grayscale png?

I'm linking to ImageMagick via the Magick++ API. I'm attempting to take uint16 data and output it as a 1024x768 1-channel 16-bit grayscale PNG. The output I get from the following is an RGB8 PNG. The image contents are correct aside from the pixel format.

// Boost.GIL 16-bit grayscale view holding the source pixels.
gray16_view_t u16view = ...;
uint16_t* data = interleaved_view_get_raw_data(u16view);
size_t length = sizeof(uint16_t) * u16view.width() * u16view.height();

// Hand the raw buffer to Magick++ as 16-bit "GRAY" data and write it out.
Magick::Blob u16Blob(data, length);
Magick::Geometry size(u16view.width(), u16view.height());
Magick::Image u16MagickImg(u16Blob, size, 16, "GRAY");
u16MagickImg.write("test-16bit.png");

Is there any way to specify more about the output format?

Some discussion of ImageMagick's PNG handling is here: http://www.imagemagick.org/Usage/formats/#png_formats. It lists PNG8, PNG24, and PNG32 as available formats, but the following section implies that

-define png:bit-depth=16
-define png:color-type=0

on the command line would produce the desired output. Trying the equivalent through the API:


    u16MagickImg.quality(00);
    u16MagickImg.defineSet("png:color-type", "0");
    u16MagickImg.defineSet("png:bit-depth", "16");


I tried defineSet and it didn't work for me, but the following worked:

image.defineValue("png", "format", "png24");

My situation is a bit different, so I use a different PNG key and value; in your case it should be:

u16MagickImg.defineValue("png", "color-type", "0");
u16MagickImg.defineValue("png", "bit-depth", "16");

See the format specifiers list here: http://www.imagemagick.org/script/command-line-options.php#define

See information on the meaning of the defineValue and defineSet methods of the Image class here: http://www.imagemagick.org/Magick++/Image.html

Quotes from there:

defineValue: "Set or obtain a definition string to applied when encoding or decoding the specified format. The meanings of the definitions are format specific. The format is designated by the magick_ argument, the format-specific key is designated by key_, and the associated value is specified by value_. See the defineSet() method if the key must be removed entirely." defineSet: "Set or obtain a definition flag to applied when encoding or decoding the specified format.. Similar to the defineValue() method except that passing the flag_ value 'true' creates a value-less define with that format and key. Passing the flag_ value 'false' removes any existing matching definition. The method returns 'true' if a matching key exists, and 'false' if no matching key exists."

Also, some important information from the png.c source file: "If the image cannot be written without loss with the requested bit-depth and color-type, a PNG file will not be written, a warning will be issued, and the encoder will return MagickFalse."

I'm not an expert and not sure it will work in your particular case, but the defineValue calls above are something that actually worked for me with ImageMagick on OS X 10.6.9.

Hope this helps.
