By default, INTGeoServer doesn't compress the data sent by the seismictraces and enumeratedtraces web services. Both services tend to send large chunks of data at once. When bandwidth is low, data transfer becomes a bottleneck and INTGeoServer clients experience slow access. To improve performance in such cases, INTGeoServer supports compressing these traces.
To find out which compression algorithms INTGeoServer supports, use the serverinfo web service. The "acceptableSeismicCompression" JSON attribute lists the names of all available compression algorithms. This attribute only appears if the INTGeoCompressionService.jar file is present in your WEB-INF/lib directory. By default, two compression algorithms are plugged in:
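For illustration only, a serverinfo response with the compression service installed might include an attribute such as the following (other response fields are omitted, and the exact names listed are an assumption based on the two algorithms described below):

```json
{
    "acceptableSeismicCompression": ["NoCompress", "HaarWavelet"]
}
```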
RawSeismicCompressor: doesn't perform any compression (this is the default)
HaarSeismicCompressor: performs a compression based on Haar wavelet transforms
The raw compression is plugged in the layer.xml file of the INTGeoSeismicService.jar file as:
<folder name="com.interactive.intgeoapi.server.seismic.requesthandlers.seismiccompressors.AbstractSeismicCompressor">
    <file name="com.interactive.intgeo.server.seismic.requesthandlers.seismiccompressors.RawSeismicCompressor.instance">
        <attr name="position" intvalue="100000"/>
    </file>
</folder>
The Haar compression is a lossy compression that is plugged in the layer.xml file of the INTGeoCompressionService.jar file as:
<folder name="com.interactive.intgeoapi.server.seismic.requesthandlers.seismiccompressors.AbstractSeismicCompressor">
    <file name="com.interactive.intgeo.server.seismic.requesthandlers.seismiccompressors.HaarSeismicCompressor.instance">
        <attr name="position" intvalue="20"/>
    </file>
</folder>
Testing shows that the Haar compression starts being useful at bandwidths below 100 Mbps. Its benefit is highly dependent on the data being transferred, as the Haar compression takes advantage of trace similarities to compress the data.
The AbstractSeismicCompressor class defines a simple contract:
public abstract class AbstractSeismicCompressor {
    public abstract String getName();
    public abstract AbstractSeismicCompressor getInstance();
    public abstract void printTraces(JSONObject jsonObject, ISeismicData seismicData,
            Collection<Integer> traceIndexes, ISeismicReader seismicReader) throws Exception;
}
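A custom compressor extends this class. The sketch below mirrors the contract with minimal stand-in types so it is self-contained; the stand-ins and the PassThroughSeismicCompressor name are hypothetical, and a real implementation would use the interfaces shipped with INTGeoServer and the server's JSON library instead.

```java
import java.util.Collection;

// Minimal stand-ins for the INTGeoServer types, declared here only so the
// sketch compiles on its own.
class JSONObject { }
interface ISeismicData { }
interface ISeismicReader { }

abstract class AbstractSeismicCompressor {
    public abstract String getName();
    public abstract AbstractSeismicCompressor getInstance();
    public abstract void printTraces(JSONObject jsonObject, ISeismicData seismicData,
            Collection<Integer> traceIndexes, ISeismicReader seismicReader) throws Exception;
}

// Hypothetical pass-through compressor: serializes the requested traces
// without compressing them.
class PassThroughSeismicCompressor extends AbstractSeismicCompressor {

    static final PassThroughSeismicCompressor instance = new PassThroughSeismicCompressor();

    @Override
    public String getName() {
        // The name clients would pass as the typeTransform JSON parameter.
        return "PassThrough";
    }

    @Override
    public AbstractSeismicCompressor getInstance() {
        return instance;
    }

    @Override
    public void printTraces(JSONObject jsonObject, ISeismicData seismicData,
            Collection<Integer> traceIndexes, ISeismicReader seismicReader) throws Exception {
        // A real implementation would first parse its own JSON parameters from
        // jsonObject, then use seismicReader to write the header and sample
        // bytes of each requested trace to the response.
        for (Integer index : traceIndexes) {
            // serialize trace 'index' here
        }
    }
}
```

Such a compressor would then be registered through a layer.xml entry, using its fully qualified class name, in the same way the raw and Haar compressors are registered above.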
The compressor is picked based upon the typeTransform JSON parameter. Valid options are "HaarWavelet" and "NoCompress"; the default is "NoCompress".
JSONObject dataObject = jsonObject.getJSONObject("data");
String sTypeTransform = JSONUtil.getStringFromJSON(dataObject, "typeTransform");
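The exact lookup logic is internal to the server, but conceptually it amounts to matching typeTransform against the names reported by the registered compressors, falling back to the raw compressor. A rough sketch of that idea (the class name and signature are hypothetical):

```java
import java.util.List;

class CompressorSelector {
    // Hypothetical sketch: return the requested typeTransform when a
    // registered compressor reports that name; otherwise fall back to the
    // raw (no-op) compressor, which is the documented default.
    static String pick(String typeTransform, List<String> registeredNames) {
        if (typeTransform != null && registeredNames.contains(typeTransform)) {
            return typeTransform;
        }
        return "NoCompress";
    }
}
```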
The printTraces method serializes the headers and sample values of the specified trace indexes. Implementations typically extract their own parameters from the jsonObject.
For example, the raw compressor makes these calls to parse input parameters:
JSONObject dataObject = jsonObject.getJSONObject("data");
String byteOrderName = JSONUtil.getStringFromJSON(dataObject, "byteOrder");
String formatName = JSONUtil.getStringFromJSON(dataObject, "sampleFormat");
String sSamples = JSONUtil.getStringFromJSON(dataObject, "samples");
The Haar compressor makes these calls to parse input parameters:
JSONObject dataObject = jsonObject.getJSONObject("data");
String sError = JSONUtil.getStringFromJSON(dataObject, "error");
String sEnabled = JSONUtil.getStringFromJSON(dataObject, "enabled");
String sSamples = JSONUtil.getStringFromJSON(dataObject, "samples");
String byteOrderName = JSONUtil.getStringFromJSON(dataObject, "byteOrder");
By plugging in your own compressor, you control every byte that is sent, as well as which JSON parameters your compressor requires.
byteOrder represents the endianness of the streamed sample bytes. Valid options are "BIG_ENDIAN" or "LITTLE_ENDIAN". This parameter is optional; when not specified, the endianness of the OS hosting INTGeoServer is used.
sampleFormat represents the format of the samples. Valid options are "Raw", "Byte", "Short", "Float" and "Integer".
samples indicates whether sample values should be included. Valid options are "true" or "false". When samples is "false", only header values are transferred. When not specified, sample values are included.
For the Haar compression only:
error represents the maximum error percentage allowed during the quantization phase of the Haar compression.
enabled indicates whether an AGC (automatic gain control) should be applied prior to the compression. Valid options are "true" or "false". When enabled is "false", no AGC is performed.
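Putting these parameters together, a request selecting the Haar compression might carry a data object like the following (the values are illustrative, and any fields beyond those documented here are omitted):

```json
{
    "data": {
        "typeTransform": "HaarWavelet",
        "byteOrder": "LITTLE_ENDIAN",
        "samples": "true",
        "error": "1",
        "enabled": "true"
    }
}
```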