I’ve read a GeoTIFF file into a std::vector<std::uint8_t> which gets passed around as a const reference to various functions. In one of those functions I need to open the image as a TIFF* so I can read some of the tags. As these files can be very large (tens or hundreds of megabytes), I don’t want to copy the image in memory. I’ve found existing answers that take a similar approach and looked promising, but TIFFStreamOpen then fails with the following code:
#include <cstdint>
#include <streambuf>
#include <vector>
#include <tiffio.hxx> // declares TIFFStreamOpen

namespace detail
{
    class vectorbuf : public std::streambuf
    {
    public:
        vectorbuf( std::vector<std::uint8_t>& v )
        {
            setg( (char*)v.data(), (char*)v.data(), (char*)(v.data() + v.size()) );
        }
        ~vectorbuf() {}
    };
}
void GeotiffUtils::ProcessImage( const std::vector<std::uint8_t>& imgData )
{
    detail::vectorbuf strmBuf( const_cast<std::vector<std::uint8_t>&>( imgData ) );
    std::istream memStrm( &strmBuf );
    TIFF* memTiff = TIFFStreamOpen( "MemTiff", &memStrm ); // fails: returns nullptr
    std::uint16_t orientation = 1;
    auto success = TIFFGetField( memTiff, TIFFTAG_ORIENTATION, &orientation );
    //... (other processing)
    TIFFClose( memTiff );
    //... (even more processing)
}
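While narrowing this down I noticed one difference: explicitly seeking on a stream backed by this kind of setg-only streambuf fails, whereas it succeeds on a std::istringstream. My assumption is that TIFFStreamOpen needs to seek within the stream, though I haven’t confirmed that against the libtiff source. Here is a minimal standalone check (rawbuf is just a hypothetical stand-in for the vectorbuf above):

#include <cstdint>
#include <iostream>
#include <sstream>
#include <streambuf>
#include <string>
#include <vector>

// Same setg-only idea as detail::vectorbuf above.
class rawbuf : public std::streambuf
{
public:
    rawbuf( std::vector<std::uint8_t>& v )
    {
        setg( (char*)v.data(), (char*)v.data(), (char*)(v.data() + v.size()) );
    }
};

int main()
{
    std::vector<std::uint8_t> data( 16, 0 );

    rawbuf buf( data );
    std::istream s1( &buf );
    s1.seekg( 0, std::ios::end );   // std::streambuf::seekoff() returns -1 by default
    std::cout << std::boolalpha << s1.fail() << '\n'; // true: the seek failed

    std::istringstream s2( std::string( data.begin(), data.end() ) );
    s2.seekg( 0, std::ios::end );   // std::stringbuf implements seekoff()
    std::cout << s2.fail() << '\n'; // false: the seek succeeded
}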
I have been able to use TIFFStreamOpen by copying the data into a std::string wrapped in a std::istringstream, like so:
void GeotiffUtils::ProcessImage( const std::vector<std::uint8_t>& imgData )
{
    std::string imgString( imgData.begin(), imgData.end() );                 // first copy
    std::istringstream memStrm( imgString, std::ios::binary | std::ios::in ); // second copy
    TIFF* memTiff = TIFFStreamOpen( "MemTiff", &memStrm ); // succeeds
    std::uint16_t orientation = 1;
    auto success = TIFFGetField( memTiff, TIFFTAG_ORIENTATION, &orientation );
    //... (other processing)
    TIFFClose( memTiff );
    //... (even more processing)
}
This is obviously less than ideal for large files: the image data is copied once into the string and again into the istringstream’s internal buffer.
Is there a reason why the copied data can be read successfully but not the original vector?
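For reference, if the missing seek support does turn out to be the cause, I assume the fix is to override seekoff and seekpos so the get pointer can be repositioned within the vector. This is only a sketch of what I have in mind; I haven’t validated it against libtiff:

class seekable_vectorbuf : public std::streambuf
{
public:
    seekable_vectorbuf( std::vector<std::uint8_t>& v )
    {
        setg( (char*)v.data(), (char*)v.data(), (char*)(v.data() + v.size()) );
    }

protected:
    // Called by istream::seekg( off, dir ); the default implementation in
    // std::streambuf always returns -1, i.e. "seeking unsupported".
    pos_type seekoff( off_type off, std::ios_base::seekdir dir,
                      std::ios_base::openmode which ) override
    {
        if ( !( which & std::ios_base::in ) )
            return pos_type( off_type( -1 ) );

        char* base = eback();
        char* end  = egptr();
        char* target = nullptr;
        if ( dir == std::ios_base::beg )
            target = base + off;
        else if ( dir == std::ios_base::cur )
            target = gptr() + off;
        else if ( dir == std::ios_base::end )
            target = end + off;
        else
            return pos_type( off_type( -1 ) );

        if ( target < base || target > end )
            return pos_type( off_type( -1 ) );

        setg( base, target, end ); // move the get pointer, keep the same buffer
        return pos_type( target - base );
    }

    // Called by istream::seekg( pos ); delegate to seekoff().
    pos_type seekpos( pos_type pos, std::ios_base::openmode which ) override
    {
        return seekoff( off_type( pos ), std::ios_base::beg, which );
    }
};

Even if that works, I’d still like to understand exactly why the istringstream copy succeeds where the plain streambuf fails.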