Commits on Source (4)

New upstream version 1.14 · 49b91497 · Emmanuel Bourg authored Jun 19, 2017
New upstream version 1.15 · be0316e0 · Emmanuel Bourg authored Jul 16, 2018
New upstream version 1.16 · 46a0d1f1 · Emmanuel Bourg authored Jul 16, 2018
New upstream version 1.17 · e215b78d · Emmanuel Bourg authored Jul 16, 2018
NOTICE.txt @ e215b78d
 Apache Commons Compress
-Copyright 2002-2016 The Apache Software Foundation
+Copyright 2002-2018 The Apache Software Foundation

 This product includes software developed at
-The Apache Software Foundation (http://www.apache.org/).
+The Apache Software Foundation (https://www.apache.org/).
The files in the package org.apache.commons.compress.archivers.sevenz
were derived from the LZMA SDK, version 9.20 (C/ and CPP/7zip/),
...
...
README.txt @ e215b78d
...
...
@@ -4,7 +4,7 @@ Apache Commons Compress
Commons Compress is a Java library for working with various
compression and archiving formats.
-For full documentation see http://commons.apache.org/proper/commons-compress/
+For full documentation see https://commons.apache.org/proper/commons-compress/
## Apache Commons Compress was derived from various sources, including:
...
...
RELEASE-NOTES.txt @ e215b78d
...
...
@@ -2,8 +2,241 @@
Apache Commons Compress software defines an API for working with
compression and archive formats. These include: bzip2, gzip, pack200,
-lzma, xz, Snappy, traditional Unix Compress, DEFLATE and ar, cpio,
-jar, tar, zip, dump, 7z, arj.
+lzma, xz, Snappy, traditional Unix Compress, DEFLATE, DEFLATE64, LZ4,
+Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj.
Release 1.17
------------
New features:
o Added a unit test that is supposed to fail if we break the
OSGi manifest entries again.
Issue: COMPRESS-443.
o Add a new SkipShieldingInputStream class that can be used with
streams that throw an IOException when skip is invoked.
Issue: COMPRESS-449.
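The skip-shielding idea can be sketched with plain java.io. This is a hypothetical illustration, not the library's class: the class name and buffer size are my choices. The point is that skip() is implemented by reading and discarding bytes, so a wrapped stream whose own skip() throws still works.

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch only: shield callers from a wrapped stream's broken skip()
// by consuming bytes via read() instead of delegating to in.skip().
public class SkipShieldSketch extends FilterInputStream {
    private final byte[] buf = new byte[8192];

    public SkipShieldSketch(InputStream in) {
        super(in);
    }

    @Override
    public long skip(long n) throws IOException {
        if (n <= 0) {
            return 0;
        }
        // never call in.skip(); read and discard instead. Returning
        // fewer bytes than requested is allowed by skip's contract.
        int read = in.read(buf, 0, (int) Math.min(n, buf.length));
        return read < 0 ? 0 : read;
    }
}
```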
o New constructors have been added to SevenZFile that accept
char[]s rather than byte[]s in order to avoid a common error
of using the wrong encoding when creating the byte[]. This
change may break source compatibility for client code that
uses one of the constructors expecting a password and passes
in null as password. We recommend changing the code to use a
constructor without a password argument.
Issue: COMPRESS-452.
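The encoding error this avoids can be illustrated with a small sketch (a hypothetical helper, not library code): 7z expects the password bytes in UTF-16LE, so a byte[] derived with any other charset silently produces the wrong key material, while a char[]-accepting API can perform one well-defined conversion internally.

```java
import java.nio.charset.StandardCharsets;

// Sketch of the conversion a char[]-accepting constructor can do
// internally; the class and method names are illustrative.
public class PasswordBytes {
    static byte[] utf16LeBytes(char[] password) {
        // 7z uses UTF-16LE password bytes; pinning the charset here
        // removes the chance of a platform-default-charset mistake.
        return new String(password).getBytes(StandardCharsets.UTF_16LE);
    }
}
```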
Fixed Bugs:
o Removed the objenesis dependency from the pom as it is not
needed at all.
o Fixed resource leak in ParallelScatterZipCreator#writeTo.
Issue: COMPRESS-446.
o Certain errors when parsing ZIP extra fields in corrupt
archives are now turned into ZipException; they used to
manifest as ArrayIndexOutOfBoundsException before.
Issue: COMPRESS-447.
o IOUtils.copy now verifies the buffer size is bigger than 0.
Issue: COMPRESS-451.
o ZipArchiveInputStream failed to read some files with stored
entries using a data descriptor.
Issue: COMPRESS-454.
Changes:
o Fixed some code examples.
Github Pull Request #63.
Thanks to Marchenko Sergey.
o The streams returned by ZipFile and most other decompressing
streams now provide information about the number of compressed
and uncompressed bytes read so far. This may be used to detect
a ZipBomb if the compression ratio exceeds a certain
threshold, for example.
For SevenZFile a new method returns the statistics for the
current entry.
Issue: COMPRESS-445.
Thanks to Andreas Beeker.
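Those per-entry byte counts make a ratio check straightforward. A minimal sketch follows; the method name and threshold value are assumptions for illustration, not part of the Commons Compress API:

```java
import java.io.IOException;

// Sketch: flag a possible zip bomb when the expansion ratio of the
// bytes read so far exceeds a chosen threshold.
public class RatioCheck {
    static final double MAX_RATIO = 100.0; // arbitrary example threshold

    static void checkRatio(long compressedRead, long uncompressedRead) throws IOException {
        if (compressedRead > 0
                && (double) uncompressedRead / compressedRead > MAX_RATIO) {
            throw new IOException("suspicious compression ratio, possible zip bomb");
        }
    }
}
```

A caller would invoke this periodically while reading, feeding it the statistics the streams now expose.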
o Added a workaround for a bug in AdoptOpenJDK for S/390 to
BZip2CompressorInputStream.
Issue: COMPRESS-453.
Release 1.16.1
--------------
Fixed Bug:
o Fixed the OSGi manifest entry for imports that has been broken
in 1.16.
Issue: COMPRESS-442.
Release 1.16
------------
New features:
o Add read-only support for Zstandard compression based on the
Zstd-jni project.
Issue: COMPRESS-423. Thanks to Andre F de Miranda.
o Added auto-detection for Zstandard compressed streams.
Issue: COMPRESS-425.
o Added write-support for Zstandard compression.
Issue: COMPRESS-426.
o Added read-only DEFLATE64 support to ZIP archives and as
stand-alone CompressorInputStream.
Issue: COMPRESS-380. Thanks to Christian Marquez Grabia.
o Added read-only DEFLATE64 support to 7z archives.
Issue: COMPRESS-437.
Fixed Bugs:
o Synchronized iteration over a synchronizedList in
ParallelScatterZipCreator.
Issue: COMPRESS-430. Thanks to Bruno P. Kinoshita.
o ZipFile could get stuck in an infinite loop when parsing ZIP
archives with certain strong encryption headers.
Issue: COMPRESS-432.
o Added improved checks to detect corrupted bzip2 streams and
throw the expected IOException rather than obscure
RuntimeExceptions.
Issue: COMPRESS-424.
Changes:
o Replaces instanceof checks with a type marker in LZ77 support code.
Issue: COMPRESS-435. Thanks to BELUGA BEHR.
o Updated XZ for Java dependency to 1.8 in order to pick up bug fix
to LZMA2InputStream's available method.
o ZipArchiveEntry now exposes how the name or comment have been
determined when the entry was read.
Issue: COMPRESS-429. Thanks to Damiano Albani.
o ZipFile.getInputStream will now always buffer the stream
internally in order to improve read performance.
Issue: COMPRESS-438.
o Speed improvement for DEFLATE64 decompression.
Issue: COMPRESS-440. Thanks to Dawid Weiss.
o Added a few extra sanity checks for the rarer compression
methods used in ZIP archives.
Issue: COMPRESS-436.
o Simplified the special handling for the dummy byte required by
zlib when using java.util.zip.Inflater.
Issue: COMPRESS-441.
o Various code cleanups.
Github Pull Request #61. Thanks to Shahab Kondri.
o TarArchiveEntry's preserveLeadingSlashes constructor argument
has been renamed and can now also be used to preserve the
drive letter on Windows.
Release 1.15
------------
New features:
o Added magic MANIFEST entry Automatic-Module-Name so the module
name will be org.apache.commons.compress when the jar is used
as an automatic module in Java9.
Issue: COMPRESS-397.
o Added a new utility class FixedLengthBlockOutputStream that
can be used to ensure writing always happens in blocks of a
given size.
Issue: COMPRESS-405. Thanks to Simon Spero.
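The fixed-block idea can be sketched in a few lines of java.io. This is a simplified stand-in, not the real FixedLengthBlockOutputStream (which buffers and writes whole blocks); it only shows the padding aspect: zero-fill on close so the total output is a multiple of the block size, as tar's 512-byte records require.

```java
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Sketch: count bytes written and pad with zeros on close so the
// stream's total length is a multiple of blockSize.
public class BlockPadSketch extends FilterOutputStream {
    private final int blockSize;
    private long written;

    public BlockPadSketch(OutputStream out, int blockSize) {
        super(out);
        this.blockSize = blockSize;
    }

    @Override
    public void write(int b) throws IOException {
        out.write(b);
        written++;
    }

    @Override
    public void close() throws IOException {
        long remainder = written % blockSize;
        if (remainder != 0) {
            out.write(new byte[(int) (blockSize - remainder)]); // zero padding
        }
        super.close();
    }
}
```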
o It is now possible to specify/read custom PAX headers when
writing/reading tar archives.
Issue: COMPRESS-400. Thanks to Simon Spero.
Fixed Bugs:
o Make sure "version needed to extract" in local file header and
central directory of a ZIP archive agree with each other.
Also ensure the version is set to 2.0 if DEFLATE is used.
Issue: COMPRESS-394.
o Don't use a data descriptor in ZIP archives when copying a raw
entry that already knows its size and CRC information.
Issue: COMPRESS-395.
o Travis build redundantly repeated compilation and tests.
GitHub Pull Request #43. Thanks to Simon Spero.
Issue: COMPRESS-413.
o The MANIFEST of 1.14 lacks an OSGi Import-Package for XZ for
Java.
Issue: COMPRESS-396.
o BUILDING.md now passes the RAT check.
Issue: COMPRESS-406. Thanks to Simon Spero.
o Made sure ChecksumCalculatingInputStream receives valid
checksum and input stream instances via the constructor.
Issue: COMPRESS-412. Thanks to Michael Hausegger.
o TarArchiveOutputStream now verifies the block and record sizes
specified at construction time are compatible with the tar
specification. In particular 512 is the only record size
accepted and the block size must be a multiple of 512.
Issue: COMPRESS-407. Thanks to Simon Spero.
o Fixed class names of CpioArchiveEntry and
CpioArchiveInputStream in various Javadocs.
Issue: COMPRESS-415.
o The code of the extended timestamp zip extra field incorrectly
assumed the time was stored as unsigned 32-bit int and thus
created incorrect results for years after 2037.
Issue: COMPRESS-416. Thanks to Simon Spero.
o Removed ZipEncoding code that became obsolete when we started
to require Java 5 as baseline long ago.
Issue: COMPRESS-410. Thanks to Simon Spero.
o The tar package will no longer try to parse the major and
minor device numbers unless the entry represents a character
or block special file.
Issue: COMPRESS-417.
o When reading tar headers with name fields containing embedded
NULs, the name will now be terminated at the first NUL byte.
Issue: COMPRESS-421. Thanks to Roel Spilker.
o Simplified TarArchiveOutputStream by replacing the internal
buffering with new class FixedLengthBlockOutputStream.
Issue: COMPRESS-409.
Release 1.14
------------
New features:
o Added write support for Snappy.
Issue: COMPRESS-246.
o Added support for LZ4 (block and frame format).
Issue: COMPRESS-271.
o Add static detect(InputStream in) to CompressorStreamFactory
and ArchiveStreamFactory
Issue: COMPRESS-385.
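Signature-based detection of this kind can be sketched as follows. Only two magic numbers are shown and the names are illustrative; the real factories recognize many more formats and have their own exception types. The stream must support mark/reset so the peeked bytes stay available to the caller.

```java
import java.io.IOException;
import java.io.InputStream;

// Sketch: peek at the leading magic bytes with mark/reset and map
// them to a format name.
public class DetectSketch {
    static String detect(InputStream in) throws IOException {
        if (!in.markSupported()) {
            throw new IllegalArgumentException("mark must be supported");
        }
        byte[] sig = new byte[2];
        in.mark(sig.length);
        int n = in.read(sig);
        in.reset(); // leave the stream positioned at the start
        if (n >= 2 && (sig[0] & 0xFF) == 0x1F && (sig[1] & 0xFF) == 0x8B) {
            return "gz";    // gzip magic 1F 8B
        }
        if (n >= 2 && sig[0] == 'B' && sig[1] == 'Z') {
            return "bzip2"; // bzip2 magic "BZ"
        }
        return "unknown";
    }
}
```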
o Added a way to limit amount of memory ZCompressorStream may
use.
Issue: COMPRESS-382. Thanks to Tim Allison.
o Added a way to limit amount of memory ZCompressorStream may
use.
Issue: COMPRESS-386. Thanks to Tim Allison.
o Added a way to limit amount of memory LZMACompressorStream and
XZCompressorInputStream may use.
Issue: COMPRESS-382. Thanks to Tim Allison.
o Add Brotli decoder based on the Google Brotli library.
Issue: COMPRESS-392. Thanks to Philippe Mouawad.
o ZipEntry now exposes its data offset.
Issue: COMPRESS-390. Thanks to Zbynek Vyskovsky.
o Using ZipArchiveEntry's setAlignment it is now possible to
ensure the data offset of an entry starts at a file position
that is aligned to word or page boundaries.
A new extra field has been added for this purpose.
Issue: COMPRESS-391. Thanks to Zbynek Vyskovsky.
Fixed Bugs:
o SnappyCompressorInputStream slides the window too early
leading to ArrayIndexOutOfBoundsExceptions for some streams.
Issue: COMPRESS-378.
o ZipArchiveEntry#isUnixSymlink now only returns true if the
corresponding link flag is the only file-type flag set.
Issue: COMPRESS-379. Thanks to Guillaume Boué.
o Fixed an integer overflow in CPIO's CRC calculation.
Pull Request #17. Thanks to Daniel Collin.
o Make unit tests work on Windows paths with spaces in their names.
Issue: COMPRESS-387.
o Internal location pointer in ZipFile could get incremented
even if nothing had been read.
Issue: COMPRESS-389.
o LZMACompressorOutputStream#flush would throw an exception
rather than be the NOP it promised to be.
Issue: COMPRESS-393.
Changes:
o The blocksize for FramedSnappyCompressorInputStream can now be
configured as some IWA files seem to be using blocks larger
than the default 32k.
Issue: COMPRESS-358.
o BZip2CompressorInputStream now uses BitInputStream internally.
Pull Request #13. Thanks to Thomas Meyer.
o Improved performance for concurrent reads from ZipFile when
reading from a file.
Issue: COMPRESS-388. Thanks to Zbynek Vyskovsky.
Release 1.13
------------
...
...
@@ -285,7 +518,7 @@ For complete information on Apache Commons Compress, including instructions
on how to submit bug reports, patches, or suggestions for improvement,
see the Apache Commons Compress website:
-http://commons.apache.org/compress/
+https://commons.apache.org/compress/
Old Release Notes
=================
...
...
@@ -603,7 +836,7 @@ Release 1.4.1
-------------
This is a security bugfix release, see
-http://commons.apache.org/proper/commons-compress/security.html#Fixed_in_Apache_Commons_Compress_1.4.1
+https://commons.apache.org/proper/commons-compress/security.html#Fixed_in_Apache_Commons_Compress_1.4.1
Fixed Bugs:
...
...
findbugs-exclude-filter.xml @ e215b78d
...
...
@@ -49,6 +49,11 @@
    <Method name="parse"/>
    <Bug pattern="SF_SWITCH_FALLTHROUGH"/>
  </Match>
  <Match>
    <Class name="org.apache.commons.compress.compressors.lz4.BlockLZ4CompressorInputStream"/>
    <Method name="read"/>
    <Bug pattern="SF_SWITCH_FALLTHROUGH"/>
  </Match>
  <!-- Reason: fields unused as documented -->
  <Match>
...
...
@@ -186,4 +191,17 @@
    <Bug pattern="EI_EXPOSE_REP2"/>
  </Match>
  <!-- the array is exposed deliberately to improve performance and it
       is documented that way -->
  <Match>
    <Class name="org.apache.commons.compress.compressors.lz77support.LZ77Compressor$LiteralBlock"/>
    <Method name="getData"/>
    <Bug pattern="EI_EXPOSE_REP"/>
  </Match>
  <Match>
    <Class name="org.apache.commons.compress.compressors.lz77support.LZ77Compressor$LiteralBlock"/>
    <Method name="&lt;init&gt;"/>
    <Bug pattern="EI_EXPOSE_REP2"/>
  </Match>
</FindBugsFilter>
pom.xml @ e215b78d
...
...
@@ -20,42 +20,50 @@
  <parent>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-parent</artifactId>
-    <version>41</version>
+    <version>46</version>
  </parent>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-compress</artifactId>
-  <version>1.13</version>
+  <version>1.17</version>
  <name>Apache Commons Compress</name>
-  <url>http://commons.apache.org/proper/commons-compress/</url>
+  <url>https://commons.apache.org/proper/commons-compress/</url>
  <!-- The description is not indented to make it look better in the release notes -->
  <description>
Apache Commons Compress software defines an API for working with
compression and archive formats. These include: bzip2, gzip, pack200,
-lzma, xz, Snappy, traditional Unix Compress, DEFLATE and ar, cpio,
-jar, tar, zip, dump, 7z, arj.
+lzma, xz, Snappy, traditional Unix Compress, DEFLATE, DEFLATE64, LZ4,
+Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj.
  </description>
  <properties>
    <maven.compiler.source>1.7</maven.compiler.source>
    <maven.compiler.target>1.7</maven.compiler.target>
    <commons.componentid>compress</commons.componentid>
    <commons.module.name>org.apache.commons.compress</commons.module.name>
    <commons.jira.id>COMPRESS</commons.jira.id>
    <commons.jira.pid>12310904</commons.jira.pid>
    <!-- configuration bits for cutting a release candidate -->
    <commons.release.version>${project.version}</commons.release.version>
    <commons.rc.version>RC1</commons.rc.version>
-    <powermock.version>1.6.4</powermock.version>
-    <commons.pmd-plugin.version>3.7</commons.pmd-plugin.version>
-    <commons.japicmp.version>0.9.3</commons.japicmp.version>
+    <powermock.version>1.7.3</powermock.version>
+    <commons.pmd-plugin.version>3.8</commons.pmd-plugin.version>
    <commons.manifestlocation>${project.build.outputDirectory}/META-INF</commons.manifestlocation>
    <commons.manifestfile>${commons.manifestlocation}/MANIFEST.MF</commons.manifestfile>
    <!-- only show issues of the current version -->
    <commons.changes.onlyCurrentVersion>true</commons.changes.onlyCurrentVersion>
    <!-- generate report even if there are binary incompatible changes -->
    <commons.japicmp.breakBuildOnBinaryIncompatibleModifications>false</commons.japicmp.breakBuildOnBinaryIncompatibleModifications>
+    <pax.exam.version>4.11.0</pax.exam.version>
+    <slf4j.version>1.7.21</slf4j.version>
  </properties>
  <issueManagement>
    <system>jira</system>
-    <url>http://issues.apache.org/jira/browse/COMPRESS</url>
+    <url>https://issues.apache.org/jira/browse/COMPRESS</url>
  </issueManagement>
  <dependencies>
<dependencies>
...
...
@@ -65,10 +73,22 @@ jar, tar, zip, dump, 7z, arj.
      <version>4.12</version>
      <scope>test</scope>
    </dependency>
+    <dependency>
+      <groupId>com.github.luben</groupId>
+      <artifactId>zstd-jni</artifactId>
+      <version>1.3.3-3</version>
+      <optional>true</optional>
+    </dependency>
+    <dependency>
+      <groupId>org.brotli</groupId>
+      <artifactId>dec</artifactId>
+      <version>0.1.2</version>
+      <optional>true</optional>
+    </dependency>
    <dependency>
      <groupId>org.tukaani</groupId>
      <artifactId>xz</artifactId>
-      <version>1.6</version>
+      <version>1.8</version>
      <optional>true</optional>
    </dependency>
    <dependency>
...
...
@@ -83,6 +103,57 @@ jar, tar, zip, dump, 7z, arj.
      <version>${powermock.version}</version>
      <scope>test</scope>
    </dependency>
    <!-- integration test verifying OSGi bundle works -->
    <dependency>
      <groupId>org.ops4j.pax.exam</groupId>
      <artifactId>pax-exam-container-native</artifactId>
      <version>${pax.exam.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.ops4j.pax.exam</groupId>
      <artifactId>pax-exam-junit4</artifactId>
      <version>${pax.exam.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.ops4j.pax.exam</groupId>
      <artifactId>pax-exam-cm</artifactId>
      <version>${pax.exam.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.ops4j.pax.exam</groupId>
      <artifactId>pax-exam-link-mvn</artifactId>
      <version>${pax.exam.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.felix</groupId>
      <artifactId>org.apache.felix.framework</artifactId>
      <version>5.6.10</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>javax.inject</groupId>
      <artifactId>javax.inject</artifactId>
      <version>1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>${slf4j.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.osgi</groupId>
      <artifactId>org.osgi.core</artifactId>
      <version>6.0.0</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
  <developers>
...
...
@@ -126,6 +197,11 @@ jar, tar, zip, dump, 7z, arj.
      <id>ggregory</id>
      <email>ggregory at apache.org</email>
    </developer>
    <developer>
      <name>Rob Tompkins</name>
      <id>chtompki</id>
      <email>chtompki at apache.org</email>
    </developer>
  </developers>
  <contributors>
...
...
@@ -155,10 +231,18 @@ jar, tar, zip, dump, 7z, arj.
    <contributor>
      <name>BELUGA BEHR</name>
    </contributor>
    <contributor>
      <name>Simon Spero</name>
      <email>sesuncedu@gmail.com</email>
    </contributor>
    <contributor>
      <name>Michael Hausegger</name>
      <email>hausegger.michael@googlemail.com</email>
    </contributor>
  </contributors>
  <scm>
-    <connection>scm:git:http://git-wip-us.apache.org/repos/asf/commons-compress.git</connection>
+    <connection>scm:git:https://git-wip-us.apache.org/repos/asf/commons-compress.git</connection>
    <developerConnection>scm:git:https://git-wip-us.apache.org/repos/asf/commons-compress.git</developerConnection>
    <url>https://git-wip-us.apache.org/repos/asf?p=commons-compress.git</url>
  </scm>
...
...
@@ -170,11 +254,12 @@ jar, tar, zip, dump, 7z, arj.
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-javadoc-plugin</artifactId>
        <version>${commons.javadoc.version}</version>
        <configuration>
          <quiet>true</quiet>
          <source>${maven.compiler.source}</source>
          <encoding>${commons.encoding}</encoding>
-          <docEncoding>${commons.docEncoding}</docEncoding>
+          <docencoding>${commons.docEncoding}</docencoding>
          <linksource>true</linksource>
          <links>
            <link>${commons.javadoc.java.link}</link>
...
...
@@ -209,20 +294,22 @@ jar, tar, zip, dump, 7z, arj.
            <exclude>src/test/resources/**</exclude>
+            <exclude>.pmd</exclude>
+            <exclude>.projectile</exclude>
+            <exclude>.mvn/**</exclude>
          </excludes>
        </configuration>
      </plugin>
      <plugin>
-        <groupId>com.github.siom79.japicmp</groupId>
-        <artifactId>japicmp-maven-plugin</artifactId>
-        <version>${commons.japicmp.version}</version>
+        <groupId>org.eluder.coveralls</groupId>
+        <artifactId>coveralls-maven-plugin</artifactId>
        <configuration>
-          <parameter>
-            <onlyModified>true</onlyModified>
-            <breakBuildOnBinaryIncompatibleModifications>false</breakBuildOnBinaryIncompatibleModifications>
-          </parameter>
+          <failOnServiceError>false</failOnServiceError>
        </configuration>
      </plugin>
+      <plugin>
+        <groupId>org.apache.felix</groupId>
+        <artifactId>maven-bundle-plugin</artifactId>
+        <version>${commons.felix.version}</version>
+      </plugin>
    </plugins>
  </pluginManagement>
  <plugins>
<plugins>
...
...
@@ -244,6 +331,7 @@ jar, tar, zip, dump, 7z, arj.
          <manifestEntries>
            <Main-Class>org.apache.commons.compress.archivers.Lister</Main-Class>
            <Extension-Name>org.apache.commons.compress</Extension-Name>
+            <Automatic-Module-Name>${commons.module.name}</Automatic-Module-Name>
          </manifestEntries>
        </archive>
      </configuration>
...
...
@@ -252,8 +340,9 @@ jar, tar, zip, dump, 7z, arj.
        <groupId>org.apache.felix</groupId>
        <artifactId>maven-bundle-plugin</artifactId>
        <configuration>
          <manifestLocation>${commons.manifestlocation}</manifestLocation>
          <instructions>
-            <Import-Package>org.tukaani.xz;resolution:=optional</Import-Package>
+            <Import-Package>org.tukaani.xz;resolution:=optional,org.brotli.dec;resolution:=optional,com.github.luben.zstd;resolution:=optional</Import-Package>
          </instructions>
        </configuration>
      </plugin>
...
...
@@ -271,6 +360,34 @@ jar, tar, zip, dump, 7z, arj.
        <artifactId>maven-pmd-plugin</artifactId>
        <version>${commons.pmd-plugin.version}</version>
      </plugin>
+      <plugin>
+        <groupId>org.apache.maven.plugins</groupId>
+        <artifactId>maven-antrun-plugin</artifactId>
+        <executions>
+          <execution>
+            <phase>process-test-resources</phase>
+            <configuration>
+              <target>
+                <untar src="${basedir}/src/test/resources/zstd-tests.tar"
+                       dest="${project.build.testOutputDirectory}"/>
+              </target>
+            </configuration>
+            <goals>
+              <goal>run</goal>
+            </goals>
+          </execution>
+        </executions>
+      </plugin>
+      <plugin>
+        <artifactId>maven-surefire-plugin</artifactId>
+        <configuration>
+          <systemPropertyVariables>
+            <pax.exam.karaf.version>${karaf.version}</pax.exam.karaf.version>
+            <commons-compress.version>${project.version}</commons-compress.version>
+          </systemPropertyVariables>
+        </configuration>
+      </plugin>
    </plugins>
  </build>
...
...
@@ -297,7 +414,7 @@ jar, tar, zip, dump, 7z, arj.
          <quiet>true</quiet>
          <source>${maven.compiler.source}</source>
          <encoding>${commons.encoding}</encoding>
-          <docEncoding>${commons.docEncoding}</docEncoding>
+          <docencoding>${commons.docEncoding}</docencoding>
          <linksource>true</linksource>
          <links>
            <link>${commons.javadoc.java.link}</link>
...
...
@@ -325,7 +442,7 @@ jar, tar, zip, dump, 7z, arj.
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>findbugs-maven-plugin</artifactId>
-        <version>3.0.4</version>
+        <version>3.0.5</version>
        <configuration>
          <threshold>Normal</threshold>
          <effort>Default</effort>
...
...
@@ -386,34 +503,20 @@ jar, tar, zip, dump, 7z, arj.
        </plugins>
      </build>
    </profile>
-    <!-- can be removed in favor of commons-parent's travis-jacoco
-         profile once parent 42 has been released -->
    <profile>
-      <id>travis</id>
+      <id>java9</id>
      <activation>
-        <property>
-          <name>env.TRAVIS</name>
-          <value>true</value>
-        </property>
+        <jdk>9</jdk>
      </activation>
-      <build>
-        <plugins>
-          <plugin>
-            <groupId>org.jacoco</groupId>
-            <artifactId>jacoco-maven-plugin</artifactId>
-            <version>${commons.jacoco.version}</version>
-          </plugin>
-          <plugin>
-            <groupId>org.eluder.coveralls</groupId>
-            <artifactId>coveralls-maven-plugin</artifactId>
-            <version>4.3.0</version>
-            <configuration>
-              <timestampFormat>EpochMillis</timestampFormat>
-            </configuration>
-          </plugin>
-        </plugins>
-      </build>
+      <properties>
+        <maven.compiler.release>9</maven.compiler.release>
+        <commons.jacoco.version>0.7.9</commons.jacoco.version>
+        <animal.sniffer.skip>true</animal.sniffer.skip>
+        <!-- coverall version 4.3.0 does not work with java 9, see https://github.com/trautonen/coveralls-maven-plugin/issues/112 -->
+        <coveralls.skip>true</coveralls.skip>
+      </properties>
    </profile>
  </profiles>
</project>
src/changes/changes.xml @ e215b78d
...
...
@@ -42,7 +42,313 @@ The <action> type attribute can be add,update,fix,remove.
    <title>commons-compress</title>
  </properties>
  <body>
-    <release version="1.13" date="not released, yet"
+    <release version="1.17" date="not released, yet"
+             description="Release 1.17">
      <action type="fix" date="2018-02-06">
        Removed the objenesis dependency from the pom as it is not
        needed at all.
      </action>
      <action issue="COMPRESS-446" type="fix" date="2018-03-29">
        Fixed resource leak in ParallelScatterZipCreator#writeTo.
      </action>
      <action type="update" date="2018-04-01" due-to="Marchenko Sergey">
        Fixed some code examples.
        Github Pull Request #63.
      </action>
      <action issue="COMPRESS-447" type="fix" date="2018-04-22">
        Certain errors when parsing ZIP extra fields in corrupt
        archives are now turned into ZipException; they used to
        manifest as ArrayIndexOutOfBoundsException before.
      </action>
      <action issue="COMPRESS-445" type="update" date="2018-04-22" due-to="Andreas Beeker">
        The streams returned by ZipFile and most other decompressing
        streams now provide information about the number of compressed
        and uncompressed bytes read so far. This may be used to detect
        a ZipBomb if the compression ratio exceeds a certain
        threshold, for example.
        For SevenZFile a new method returns the statistics for the
        current entry.
      </action>
      <action issue="COMPRESS-443" type="add" date="2018-04-25">
        Added a unit test that is supposed to fail if we break the
        OSGi manifest entries again.
      </action>
      <action issue="COMPRESS-449" type="add" date="2018-05-02">
        Add a new SkipShieldingInputStream class that can be used with
        streams that throw an IOException when skip is invoked.
      </action>
      <action issue="COMPRESS-451" type="fix" date="2018-05-04">
        IOUtils.copy now verifies the buffer size is bigger than 0.
      </action>
      <action issue="COMPRESS-452" type="add" date="2018-05-09">
        New constructors have been added to SevenZFile that accept
        char[]s rather than byte[]s in order to avoid a common error
        of using the wrong encoding when creating the byte[]. This
        change may break source compatibility for client code that
        uses one of the constructors expecting a password and passes
        in null as password. We recommend changing the code to use a
        constructor without a password argument.
      </action>
      <action issue="COMPRESS-453" type="update" date="2018-05-24">
        Added a workaround for a bug in AdoptOpenJDK for S/390 to
        BZip2CompressorInputStream.
      </action>
      <action issue="COMPRESS-454" type="fix" date="2018-05-30">
        ZipArchiveInputStream failed to read some files with stored
        entries using a data descriptor.
      </action>
    </release>
    <release version="1.16.1" date="2018-02-10" description="Release 1.16.1">
      <action issue="COMPRESS-442" type="fix" date="2018-02-06">
        Fixed the OSGi manifest entry for imports that has been broken
        in 1.16.
      </action>
    </release>
    <release version="1.16" date="2018-02-05" description="Release 1.16">
      <action issue="COMPRESS-423" type="add" date="2017-10-17" due-to="Andre F de Miranda">
        Add read-only support for Zstandard compression based on the
        Zstd-jni project.
      </action>
      <action issue="COMPRESS-425" type="add" date="2017-10-22">
        Added auto-detection for Zstandard compressed streams.
      </action>
      <action issue="COMPRESS-430" type="fix" date="2017-11-25" due-to="Bruno P. Kinoshita">
        Synchronized iteration over a synchronizedList in ParallelScatterZipCreator.
      </action>
      <action issue="COMPRESS-432" type="fix" date="2017-12-22">
        ZipFile could get stuck in an infinite loop when parsing ZIP
        archives with certain strong encryption headers.
      </action>
      <action issue="COMPRESS-435" type="update" date="2017-12-27" due-to="BELUGA BEHR">
        Replaces instanceof checks with a type marker in LZ77 support code.
      </action>
      <action issue="COMPRESS-426" type="add" date="2017-12-28">
        Added write-support for Zstandard compression.
      </action>
      <action issue="COMPRESS-424" type="fix" date="2017-12-30">
        Added improved checks to detect corrupted bzip2 streams and
        throw the expected IOException rather than obscure
        RuntimeExceptions.
      </action>
      <action type="update" date="2018-01-04">
        Updated XZ for Java dependency to 1.8 in order to pick up bug
        fix to LZMA2InputStream's available method.
      </action>
      <action type="update" date="2018-01-05" issue="COMPRESS-429" due-to="Damiano Albani">
        ZipArchiveEntry now exposes how the name or comment have been
        determined when the entry was read.
      </action>
      <action issue="COMPRESS-380" type="add" date="2018-01-09" due-to="Christian Marquez Grabia">
        Added read-only DEFLATE64 support to ZIP archives and as
        stand-alone CompressorInputStream.
      </action>
      <action issue="COMPRESS-438" type="update" date="2018-01-10">
        ZipFile.getInputStream will now always buffer the stream
        internally in order to improve read performance.
      </action>
      <action issue="COMPRESS-440" type="update" date="2018-01-12" due-to="Dawid Weiss">
        Speed improvement for DEFLATE64 decompression.
      </action>
      <action issue="COMPRESS-437" type="add" date="2018-01-13">
        Added read-only DEFLATE64 support to 7z archives.
      </action>
      <action issue="COMPRESS-436" type="update" date="2018-01-14">
        Added a few extra sanity checks for the rarer compression
        methods used in ZIP archives.
      </action>
      <action issue="COMPRESS-441" type="update" date="2018-01-14">
        Simplified the special handling for the dummy byte required by
        zlib when using java.util.zip.Inflater.
      </action>
      <action type="update" date="2018-01-18" due-to="Shahab Kondri">
        Various code cleanups.
        Github Pull Request #61.
      </action>
      <action type="update" date="2018-01-29">
        TarArchiveEntry's preserveLeadingSlashes constructor argument
        has been renamed and can now also be used to preserve the
        drive letter on Windows.
      </action>
    </release>
    <release version="1.15" date="2017-10-17"
             description="Release 1.15
----------------------------------------
TarArchiveOutputStream now ensures record size is 512 and block size is
a multiple of 512 as any other value would create invalid tar
archives. This may break compatibility for code that deliberately
wanted to create such files.">
      <action issue="COMPRESS-394" type="fix" date="2017-05-22">
        Make sure "version needed to extract" in local file header and
        central directory of a ZIP archive agree with each other.
        Also ensure the version is set to 2.0 if DEFLATE is used.
      </action>
      <action issue="COMPRESS-395" type="fix" date="2017-05-22">
        Don't use a data descriptor in ZIP archives when copying a raw
        entry that already knows its size and CRC information.
      </action>
      <action issue="COMPRESS-413" type="fix" date="2017-05-22" due-to="Simon Spero">
        Travis build redundantly repeated compilation and tests.
        GitHub Pull Request #43.
      </action>
      <action issue="COMPRESS-397" type="add" date="2017-05-22">
        Added magic MANIFEST entry Automatic-Module-Name so the module
        name will be org.apache.commons.compress when the jar is used
        as an automatic module in Java9.
      </action>
      <action issue="COMPRESS-396" type="fix" date="2017-05-23">
        The MANIFEST of 1.14 lacks an OSGi Import-Package for XZ for
        Java.
      </action>
      <action issue="COMPRESS-406" type="fix" date="2017-06-12" due-to="Simon Spero">
        BUILDING.md now passes the RAT check.
      </action>
      <action issue="COMPRESS-405" type="add" date="2017-06-15" due-to="Simon Spero">
        Added a new utility class FixedLengthBlockOutputStream that
        can be used to ensure writing always happens in blocks of a
        given size.
      </action>
      <action issue="COMPRESS-412" type="fix" date="2017-06-17" due-to="Michael Hausegger">
        Made sure ChecksumCalculatingInputStream receives valid
        checksum and input stream instances via the constructor.
      </action>
      <action issue="COMPRESS-407" type="fix" date="2017-06-24" due-to="Simon Spero">
        TarArchiveOutputStream now verifies the block and record sizes
        specified at construction time are compatible with the tar
        specification. In particular 512 is the only record size
        accepted and the block size must be a multiple of 512.
        At the same time the default block size in
        TarArchiveOutputStream has been changed from 10240 to 512
        bytes.
      </action>
      <action issue="COMPRESS-400" type="add" date="2017-06-26" due-to="Simon Spero">
        It is now possible to specify/read custom PAX headers when
        writing/reading tar archives.
      </action>
      <action issue="COMPRESS-415" type="fix" date="2017-06-27">
        Fixed class names of CpioArchiveEntry and
        CpioArchiveInputStream in various Javadocs.
      </action>
      <action issue="COMPRESS-416" type="fix" date="2017-07-04" due-to="Simon Spero">
        The code of the extended timestamp zip extra field incorrectly
        assumed the time was stored as unsigned 32-bit int and thus
        created incorrect results for years after 2037.
      </action>
      <action issue="COMPRESS-410" type="fix" date="2017-07-05" due-to="Simon Spero">
        Removed ZipEncoding code that became obsolete when we started
        to require Java 5 as baseline long ago.
      </action>
      <action issue="COMPRESS-417" type="fix" date="2017-07-19">
        The tar package will no longer try to parse the major and
        minor device numbers unless the entry represents a character
        or block special file.
      </action>
      <action issue="COMPRESS-421" type="fix" date="2017-10-06" due-to="Roel Spilker">
        When reading tar headers with name fields containing embedded
        NULs, the name will now be terminated at the first NUL byte.
      </action>
      <action issue="COMPRESS-409" type="fix" date="2017-10-08">
        Simplified TarArchiveOutputStream by replacing the internal
        buffering with new class FixedLengthBlockOutputStream.
      </action>
    </release>
<release version="1.14" date="2017-05-14" description="Release 1.14">
<action issue="COMPRESS-378" type="fix" date="2017-01-09">
  SnappyCompressorInputStream slides the window too early
  leading to ArrayIndexOutOfBoundsExceptions for some streams.
</action>
<action issue="COMPRESS-246" type="add" date="2017-01-10">
  Added write support for Snappy.
</action>
<action issue="COMPRESS-358" type="update" date="2017-01-10">
  The blocksize for FramedSnappyCompressorInputStream can now be
  configured as some IWA files seem to be using blocks larger
  than the default 32k.
</action>
<action issue="COMPRESS-379" type="fix" date="2017-01-15" due-to="Guillaume Boué">
  ZipArchiveEntry#isUnixSymlink now only returns true if the
  corresponding link flag is the only file-type flag set.
</action>
<action issue="COMPRESS-271" type="add" date="2017-02-07">
  Added support for LZ4 (block and frame format).
</action>
<action type="update" date="2017-02-15" due-to="Thomas Meyer">
  BZip2CompressorInputstream now uses BitInputStream internally.
  Pull Request #13.
</action>
<action type="fix" date="2017-03-29" due-to="Daniel Collin">
  Fixed an integer overflow in CPIO's CRC calculation.
  Pull Request #17.
</action>
<action issue="COMPRESS-385" type="add" date="2017-04-18">
  Add static detect(InputStream in) to CompressorStreamFactory
  and ArchiveStreamFactory.
</action>
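The static detect methods work by peeking at leading magic bytes. Below is a self-contained sketch of that mark/read/reset pattern; the method name is hypothetical and only two signatures are shown (zip's PK\003\004 and ar's "!&lt;arch&gt;\n"), whereas the real factories check many more formats and require a mark-supporting stream:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class MagicSniffer {
    // Reads up to 8 leading bytes, resets the stream so the caller can
    // still consume it from the start, and matches the bytes against
    // two well-known archive signatures.
    static String sniff(InputStream in) throws IOException {
        byte[] sig = new byte[8];
        in.mark(sig.length);
        int n = in.read(sig);
        in.reset();
        if (n >= 4 && sig[0] == 'P' && sig[1] == 'K'
                && sig[2] == 3 && sig[3] == 4) {
            return "zip";
        }
        if (n >= 8 && "!<arch>\n".equals(
                new String(sig, 0, 8, StandardCharsets.US_ASCII))) {
            return "ar";
        }
        return "unknown";
    }

    public static void main(String[] args) throws IOException {
        InputStream zip = new ByteArrayInputStream(
            new byte[] { 'P', 'K', 3, 4, 0, 0, 0, 0 });
        System.out.println(sniff(zip)); // prints zip
    }
}
```

The ar signature used here is the same 8-byte sequence (0x21 0x3c 0x61 0x72 0x63 0x68 0x3e 0x0a) that ArArchiveInputStream.matches checks.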
<action issue="COMPRESS-387" type="fix" date="2017-04-18">
  Make unit tests work on Windows paths with spaces in their names.
</action>
<action issue="COMPRESS-388" type="update" date="2017-04-25" due-to="Zbynek Vyskovsky">
  Improved performance for concurrent reads from ZipFile when
  reading from a file.
</action>
<action issue="COMPRESS-382" type="add" date="2017-04-25" due-to="Tim Allison">
  Added a way to limit amount of memory ZCompressorStream may
  use.
</action>
<action issue="COMPRESS-386" type="add" date="2017-04-25" due-to="Tim Allison">
  Added a way to limit amount of memory ZCompressorStream may
  use.
</action>
<action issue="COMPRESS-382" type="add" date="2017-04-25" due-to="Tim Allison">
  Added a way to limit amount of memory LZMACompressorStream and
  XZCompressorInputStream may use.
</action>
<action issue="COMPRESS-389" type="fix" date="2017-04-26">
  Internal location pointer in ZipFile could get incremented
  even if nothing had been read.
</action>
<action issue="COMPRESS-392" type="add" date="2017-05-02" due-to="Philippe Mouawad">
  Add Brotli decoder based on the Google Brotli library.
</action>
<action issue="COMPRESS-390" type="add" date="2017-05-04" due-to="Zbynek Vyskovsky">
  ZipEntry now exposes its data offset.
</action>
<action issue="COMPRESS-393" type="fix" date="2017-05-07">
  LZMACompressorOutputStream#flush would throw an exception
  rather than be the NOP it promised to be.
</action>
<action issue="COMPRESS-391" type="add" date="2017-05-11" due-to="Zbynek Vyskovsky">
  Using ZipArchiveEntry's setAlignment it is now possible to
  ensure the data offset of an entry starts at a file position
  that is at word or page boundaries.
  A new extra field has been added for this purpose.
</action>
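Alignment support boils down to inserting enough padding that an entry's data offset becomes a multiple of the requested alignment. The arithmetic, as a plain-Java sketch (the real implementation stores this padding in the new extra field mentioned above; the helper name here is illustrative):

```java
public class AlignmentDemo {
    // Number of padding bytes needed so that (offset + padding) is a
    // multiple of alignment, for any positive alignment.
    static long padding(long offset, int alignment) {
        long remainder = offset % alignment;
        return remainder == 0 ? 0 : alignment - remainder;
    }

    public static void main(String[] args) {
        System.out.println(padding(1000, 512));  // prints 24
        System.out.println(padding(1024, 512));  // prints 0
        System.out.println(padding(4097, 4096)); // prints 4095
    }
}
```

An entry whose data would start at offset 1000 thus needs 24 bytes of padding to land on the next 512-byte boundary at 1024.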
</release>
<release version="1.13" date="2016-12-29"
         description="Release 1.13 - API compatible to 1.12 but requires Java 7 at runtime.">
<action issue="COMPRESS-360" type="update" date="2016-06-25" dev="ggregory">
  Update Java requirement from 6 to 7.
...
...
src/changes/release-notes.vm
View file @
e215b78d
...
...
@@ -474,7 +474,7 @@ Release 1.4.1
-------------
This is a security bugfix release, see
-http://commons.apache.org/proper/commons-compress/security.html#Fixed_in_Apache_Commons_Compress_1.4.1
+https://commons.apache.org/proper/commons-compress/security.html#Fixed_in_Apache_Commons_Compress_1.4.1
Fixed Bugs:
...
...
src/main/java/org/apache/commons/compress/MemoryLimitException.java
0 → 100644
View file @
e215b78d
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.apache.commons.compress;

import java.io.IOException;

/**
 * If a stream checks for estimated memory allocation, and the estimate
 * goes above the memory limit, this is thrown. This can also be thrown
 * if a stream tries to allocate a byte array that is larger than
 * the allowable limit.
 *
 * @since 1.14
 */
public class MemoryLimitException extends IOException {

    private static final long serialVersionUID = 1L;

    //long instead of int to account for overflow for corrupt files
    private final long memoryNeededInKb;
    private final int memoryLimitInKb;

    public MemoryLimitException(long memoryNeededInKb, int memoryLimitInKb) {
        super(buildMessage(memoryNeededInKb, memoryLimitInKb));
        this.memoryNeededInKb = memoryNeededInKb;
        this.memoryLimitInKb = memoryLimitInKb;
    }

    public MemoryLimitException(long memoryNeededInKb, int memoryLimitInKb, Exception e) {
        super(buildMessage(memoryNeededInKb, memoryLimitInKb), e);
        this.memoryNeededInKb = memoryNeededInKb;
        this.memoryLimitInKb = memoryLimitInKb;
    }

    public long getMemoryNeededInKb() {
        return memoryNeededInKb;
    }

    public int getMemoryLimitInKb() {
        return memoryLimitInKb;
    }

    private static String buildMessage(long memoryNeededInKb, int memoryLimitInKb) {
        return memoryNeededInKb + " kb of memory would be needed; limit was "
                + memoryLimitInKb + " kb. "
                + "If the file is not corrupt, consider increasing the memory limit.";
    }
}
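A hypothetical usage sketch of the guard pattern this exception supports: estimate the allocation first, compare it against a caller-supplied cap, and fail fast instead of exhausting the heap. The helper name is illustrative, and a plain IOException stands in for MemoryLimitException so the sketch compiles without the library:

```java
import java.io.IOException;

public class MemoryGuardDemo {
    // Illustrative guard in the style a decompressor could use: before
    // allocating a decode buffer, compare the estimated need (in KiB)
    // against a limit; a negative limit means "no limit".
    static void checkLimit(long neededKb, int limitKb) throws IOException {
        if (limitKb >= 0 && neededKb > limitKb) {
            throw new IOException(neededKb + " kb of memory would be needed; limit was "
                + limitKb + " kb.");
        }
    }

    public static void main(String[] args) {
        try {
            checkLimit(65536, 1024);  // 64 MiB needed, 1 MiB allowed
        } catch (IOException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```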
src/main/java/org/apache/commons/compress/archivers/ArchiveEntry.java
View file @
e215b78d
...
...
@@ -28,6 +28,8 @@ public interface ArchiveEntry {
/**
* Gets the name of the entry in this archive. May refer to a file or directory or other item.
*
* <p>This method returns the raw name as it is stored inside of the archive.</p>
*
* @return The name of this entry in the archive.
*/
String getName();
...
...
src/main/java/org/apache/commons/compress/archivers/ArchiveException.java
View file @
e215b78d
src/main/java/org/apache/commons/compress/archivers/ArchiveInputStream.java
View file @
e215b78d
src/main/java/org/apache/commons/compress/archivers/ArchiveOutputStream.java
View file @
e215b78d
src/main/java/org/apache/commons/compress/archivers/ArchiveStreamFactory.java
View file @
e215b78d
...
...
@@ -59,15 +59,15 @@ import org.apache.commons.compress.utils.Sets;
* Compressing a ZIP-File:
*
* <pre>
- * final OutputStream out = new FileOutputStream(output);
+ * final OutputStream out = Files.newOutputStream(output.toPath());
 * ArchiveOutputStream os = new ArchiveStreamFactory().createArchiveOutputStream(ArchiveStreamFactory.ZIP, out);
 *
 * os.putArchiveEntry(new ZipArchiveEntry("testdata/test1.xml"));
- * IOUtils.copy(new FileInputStream(file1), os);
+ * IOUtils.copy(Files.newInputStream(file1.toPath()), os);
 * os.closeArchiveEntry();
 *
 * os.putArchiveEntry(new ZipArchiveEntry("testdata/test2.xml"));
- * IOUtils.copy(new FileInputStream(file2), os);
+ * IOUtils.copy(Files.newInputStream(file2.toPath()), os);
 * os.closeArchiveEntry();
 * os.close();
 * </pre>
...
...
@@ -75,10 +75,10 @@ import org.apache.commons.compress.utils.Sets;
* Decompressing a ZIP-File:
*
* <pre>
- * final InputStream is = new FileInputStream(input);
+ * final InputStream is = Files.newInputStream(input.toPath());
 * ArchiveInputStream in = new ArchiveStreamFactory().createArchiveInputStream(ArchiveStreamFactory.ZIP, is);
 * ZipArchiveEntry entry = (ZipArchiveEntry)in.getNextEntry();
- * OutputStream out = new FileOutputStream(new File(dir, entry.getName()));
+ * OutputStream out = Files.newOutputStream(dir.toPath().resolve(entry.getName()));
* IOUtils.copy(in, out);
* out.close();
* in.close();
...
...
@@ -473,6 +473,17 @@ public class ArchiveStreamFactory implements ArchiveStreamProvider {
*/
    public ArchiveInputStream createArchiveInputStream(final InputStream in)
            throws ArchiveException {
        return createArchiveInputStream(detect(in), in);
    }

    /**
     * Try to determine the type of Archiver
     * @param in input stream
     * @return type of archiver if found
     * @throws ArchiveException if an archiver cannot be detected in the stream
     * @since 1.14
     */
    public static String detect(InputStream in) throws ArchiveException {
        if (in == null) {
            throw new IllegalArgumentException("Stream must not be null.");
        }
...
...
@@ -483,40 +494,54 @@ public class ArchiveStreamFactory implements ArchiveStreamProvider {
        final byte[] signature = new byte[SIGNATURE_SIZE];
        in.mark(signature.length);
+        int signatureLength = -1;
        try {
-            int signatureLength = IOUtils.readFully(in, signature);
+            signatureLength = IOUtils.readFully(in, signature);
            in.reset();
        } catch (IOException e) {
            throw new ArchiveException("IOException while reading signature.", e);
        }

        if (ZipArchiveInputStream.matches(signature, signatureLength)) {
-            return createArchiveInputStream(ZIP, in);
+            return ZIP;
        } else if (JarArchiveInputStream.matches(signature, signatureLength)) {
-            return createArchiveInputStream(JAR, in);
+            return JAR;
        } else if (ArArchiveInputStream.matches(signature, signatureLength)) {
-            return createArchiveInputStream(AR, in);
+            return AR;
        } else if (CpioArchiveInputStream.matches(signature, signatureLength)) {
-            return createArchiveInputStream(CPIO, in);
+            return CPIO;
        } else if (ArjArchiveInputStream.matches(signature, signatureLength)) {
-            return createArchiveInputStream(ARJ, in);
+            return ARJ;
        } else if (SevenZFile.matches(signature, signatureLength)) {
-            throw new StreamingNotSupportedException(SEVEN_Z);
+            return SEVEN_Z;
        }

        // Dump needs a bigger buffer to check the signature;
        final byte[] dumpsig = new byte[DUMP_SIGNATURE_SIZE];
        in.mark(dumpsig.length);
        try {
            signatureLength = IOUtils.readFully(in, dumpsig);
            in.reset();
        } catch (IOException e) {
            throw new ArchiveException("IOException while reading dump signature", e);
        }
        if (DumpArchiveInputStream.matches(dumpsig, signatureLength)) {
-            return createArchiveInputStream(DUMP, in);
+            return DUMP;
        }

        // Tar needs an even bigger buffer to check the signature; read the first block
        final byte[] tarHeader = new byte[TAR_HEADER_SIZE];
        in.mark(tarHeader.length);
        try {
            signatureLength = IOUtils.readFully(in, tarHeader);
            in.reset();
        } catch (IOException e) {
            throw new ArchiveException("IOException while reading tar signature", e);
        }
        if (TarArchiveInputStream.matches(tarHeader, signatureLength)) {
-            return createArchiveInputStream(TAR, in);
+            return TAR;
        }

        // COMPRESS-117 - improve auto-recognition
        if (signatureLength >= TAR_HEADER_SIZE) {
            TarArchiveInputStream tais = null;
...
...
@@ -524,9 +549,9 @@ public class ArchiveStreamFactory implements ArchiveStreamProvider {
                tais = new TarArchiveInputStream(new ByteArrayInputStream(tarHeader));
                // COMPRESS-191 - verify the header checksum
                if (tais.getNextTarEntry().isCheckSumOK()) {
-                    return createArchiveInputStream(TAR, in);
+                    return TAR;
                }
-            } catch (final Exception e) { // NOPMD
+            } catch (final Exception e) { // NOPMD NOSONAR
                // can generate IllegalArgumentException as well
                // as IOException
                // autodetection, simply not a TAR
...
...
@@ -535,10 +560,6 @@ public class ArchiveStreamFactory implements ArchiveStreamProvider {
                IOUtils.closeQuietly(tais);
            }
        }
-        } catch (final IOException e) {
-            throw new ArchiveException("Could not use reset and mark operations.", e);
-        }
        throw new ArchiveException("No Archiver found for the stream signature");
    }
...
...
src/main/java/org/apache/commons/compress/archivers/ArchiveStreamProvider.java
View file @
e215b78d
src/main/java/org/apache/commons/compress/archivers/EntryStreamOffsets.java
0 → 100644
View file @
e215b78d
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.apache.commons.compress.archivers;

/**
 * Provides information about ArchiveEntry stream offsets.
 */
public interface EntryStreamOffsets {

    /** Special value indicating that the offset is unknown. */
    long OFFSET_UNKNOWN = -1;

    /**
     * Gets the offset of data stream within the archive file,
     *
     * @return
     *      the offset of entry data stream, {@code OFFSET_UNKNOWN} if not known.
     */
    long getDataOffset();

    /**
     * Indicates whether the stream is contiguous, i.e. not split among
     * several archive parts, interspersed with control blocks, etc.
     *
     * @return
     *      true if stream is contiguous, false otherwise.
     */
    boolean isStreamContiguous();
}
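A minimal, self-contained sketch of how a caller might consume these offsets. The interface is re-declared locally so the sketch compiles without the library, and FixedEntry is a hypothetical implementation, not a Commons Compress class:

```java
public class OffsetsDemo {
    // Local copy of the interface above, for a self-contained example.
    interface EntryStreamOffsets {
        long OFFSET_UNKNOWN = -1;
        long getDataOffset();
        boolean isStreamContiguous();
    }

    // A contiguous entry whose payload starts 4242 bytes into the archive.
    static final class FixedEntry implements EntryStreamOffsets {
        public long getDataOffset() { return 4242; }
        public boolean isStreamContiguous() { return true; }
    }

    public static void main(String[] args) {
        EntryStreamOffsets e = new FixedEntry();
        // Random access is only safe when the offset is known and the
        // payload is not interleaved with control blocks:
        boolean seekable = e.isStreamContiguous()
            && e.getDataOffset() != EntryStreamOffsets.OFFSET_UNKNOWN;
        System.out.println(seekable
            ? "seek to " + e.getDataOffset()
            : "stream sequentially"); // prints: seek to 4242
    }
}
```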
src/main/java/org/apache/commons/compress/archivers/Lister.java
View file @
e215b78d
...
...
@@ -20,8 +20,10 @@ package org.apache.commons.compress.archivers;
import java.io.BufferedInputStream;
import java.io.File;
-import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
+import java.nio.file.Files;
+
+import org.apache.commons.compress.archivers.sevenz.SevenZFile;
/**
* Simple command line application that lists the contents of an archive.
...
...
@@ -44,7 +46,16 @@ public final class Lister {
        if (!f.isFile()) {
            System.err.println(f + " doesn't exist or is a directory");
        }
-        try (final InputStream fis = new BufferedInputStream(new FileInputStream(f));
+        String format = args.length > 1 ? args[1] : detectFormat(f);
+        if (ArchiveStreamFactory.SEVEN_Z.equalsIgnoreCase(format)) {
+            list7z(f);
+        } else {
+            listStream(f, args);
+        }
+    }
+
+    private static void listStream(File f, String[] args) throws ArchiveException, IOException {
+        try (final InputStream fis = new BufferedInputStream(Files.newInputStream(f.toPath()));
             final ArchiveInputStream ais = createArchiveInputStream(args, fis)) {
            System.out.println("Created " + ais.toString());
            ArchiveEntry ae;
...
...
@@ -62,6 +73,22 @@ public final class Lister {
        return factory.createArchiveInputStream(fis);
    }

+    private static String detectFormat(File f) throws ArchiveException, IOException {
+        try (final InputStream fis = new BufferedInputStream(Files.newInputStream(f.toPath()))) {
+            return factory.detect(fis);
+        }
+    }
+
+    private static void list7z(File f) throws ArchiveException, IOException {
+        try (SevenZFile z = new SevenZFile(f)) {
+            System.out.println("Created " + z.toString());
+            ArchiveEntry ae;
+            while ((ae = z.getNextEntry()) != null) {
+                System.out.println(ae.getName());
+            }
+        }
+    }
+
    private static void usage() {
        System.out.println("Parameters: archive-name [archive-type]");
    }
...
...
src/main/java/org/apache/commons/compress/archivers/StreamingNotSupportedException.java
View file @
e215b78d
src/main/java/org/apache/commons/compress/archivers/ar/ArArchiveEntry.java
View file @
e215b78d
...
...
@@ -47,7 +47,7 @@ import org.apache.commons.compress.archivers.ArchiveEntry;
* Compress can read but not write the GNU variant. It fully supports
* the BSD variant.
*
- * @see <a href="http://www.freebsd.org/cgi/man.cgi?query=ar&sektion=5">ar man page</a>
+ * @see <a href="https://www.freebsd.org/cgi/man.cgi?query=ar&sektion=5">ar man page</a>
*
* @Immutable
*/
...
...
@@ -179,12 +179,9 @@ public class ArArchiveEntry implements ArchiveEntry {
        }
        final ArArchiveEntry other = (ArArchiveEntry) obj;
        if (name == null) {
-            if (other.name != null) {
-                return false;
-            }
-        } else if (!name.equals(other.name)) {
-            return false;
+            return other.name == null;
+        } else {
+            return name.equals(other.name);
        }
-        return true;
    }
}
src/main/java/org/apache/commons/compress/archivers/ar/ArArchiveInputStream.java
View file @
e215b78d
...
...
@@ -271,35 +271,11 @@ public class ArArchiveInputStream extends ArchiveInputStream {
    public static boolean matches(final byte[] signature, final int length) {
        // 3c21 7261 6863 0a3e
-        if (length < 8) {
-            return false;
-        }
-        if (signature[0] != 0x21) {
-            return false;
-        }
-        if (signature[1] != 0x3c) {
-            return false;
-        }
-        if (signature[2] != 0x61) {
-            return false;
-        }
-        if (signature[3] != 0x72) {
-            return false;
-        }
-        if (signature[4] != 0x63) {
-            return false;
-        }
-        if (signature[5] != 0x68) {
-            return false;
-        }
-        if (signature[6] != 0x3e) {
-            return false;
-        }
-        if (signature[7] != 0x0a) {
-            return false;
-        }
-        return true;
+        return length >= 8 &&
+            signature[0] == 0x21 && signature[1] == 0x3c &&
+            signature[2] == 0x61 && signature[3] == 0x72 &&
+            signature[4] == 0x63 && signature[5] == 0x68 &&
+            signature[6] == 0x3e && signature[7] == 0x0a;
    }

    static final String BSD_LONGNAME_PREFIX = "#1/";
...
...
src/main/java/org/apache/commons/compress/archivers/ar/ArArchiveOutputStream.java
View file @
e215b78d