## Java ZipFile - ZipException: error in opening zip file

### java.util.zip.ZipException: error in opening zip file

I have a Jar file, which contains other nested Jars. When I invoke the new JarFile() constructor on this file, I get an exception which says:
java.util.zip.ZipException: error in opening zip file
When I manually unzip the contents of this Jar file and zip it up again, it works fine.
I only see this exception on WebSphere 6.1.0.7 and higher versions. The same thing works fine on Tomcat and WebLogic.
When I use JarInputStream instead of JarFile, I am able to read the contents of the Jar file without any exceptions.

Make sure your jar file is not corrupted. If it is corrupted or cannot be unzipped, this error will occur.

I faced the same problem. I had a zip archive which java.util.zip.ZipFile was not able to handle, but WinRAR unpacked it just fine. I found an article on SDN about compressing and decompressing options in Java and slightly modified one of the example codes to produce a method that was finally capable of handling the archive. The trick is to use ZipInputStream instead of ZipFile and to read the zip archive sequentially. The method can also handle an empty zip archive. I believe you can adjust it to suit your needs, as all the zip classes have equivalent subclasses for .jar archives.
public void unzipFileIntoDirectory(File archive, File destinationDir)
        throws Exception {
    final int BUFFER_SIZE = 1024;
    FileInputStream fis = new FileInputStream(archive);
    ZipInputStream zis = new ZipInputStream(new BufferedInputStream(fis));
    ZipEntry entry;
    while ((entry = zis.getNextEntry()) != null) {
        File destFile = new File(destinationDir, entry.getName());
        if (entry.isDirectory()) {
            destFile.mkdirs();
            continue;
        }
        int count;
        byte[] data = new byte[BUFFER_SIZE];
        destFile.getParentFile().mkdirs();
        FileOutputStream fos = new FileOutputStream(destFile);
        BufferedOutputStream dest = new BufferedOutputStream(fos, BUFFER_SIZE);
        while ((count = zis.read(data, 0, BUFFER_SIZE)) != -1) {
            dest.write(data, 0, count);
        }
        dest.flush();
        dest.close();
        fos.close();
    }
    zis.close();
    fis.close();
}
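A self-contained round trip of the same sequential-read approach (the temp-file paths and entry name are made up for illustration): build a one-entry zip with ZipOutputStream, then extract it with ZipInputStream.

```java
import java.io.*;
import java.nio.file.*;
import java.util.zip.*;

public class Main {
    public static void main(String[] args) throws Exception {
        // Build a tiny zip with one entry in a temp directory (hypothetical names).
        Path dir = Files.createTempDirectory("ziptest");
        File archive = dir.resolve("sample.zip").toFile();
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(archive))) {
            zos.putNextEntry(new ZipEntry("hello.txt"));
            zos.write("hello zip".getBytes("UTF-8"));
            zos.closeEntry();
        }

        // Extract it sequentially with ZipInputStream, as in the method above.
        File destDir = dir.resolve("out").toFile();
        try (ZipInputStream zis = new ZipInputStream(
                new BufferedInputStream(new FileInputStream(archive)))) {
            ZipEntry entry;
            while ((entry = zis.getNextEntry()) != null) {
                File destFile = new File(destDir, entry.getName());
                destFile.getParentFile().mkdirs();
                try (FileOutputStream fos = new FileOutputStream(destFile)) {
                    byte[] data = new byte[1024];
                    int count;
                    while ((count = zis.read(data)) != -1) {
                        fos.write(data, 0, count);
                    }
                }
            }
        }
        System.out.println(new String(
                Files.readAllBytes(destDir.toPath().resolve("hello.txt")), "UTF-8"));
    }
}
```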

It could be related to log4j.
Do you have a log4j.jar file in the WebSphere Java classpath (as defined in the startup file) as well as the application classpath? If you do, make sure that log4j.jar is in the Java classpath and that it is NOT in the WEB-INF/lib directory of your webapp.
It can also be related to the Ant version (maybe not your case, but I put it here for reference):
You have a .class file in your classpath (i.e. not a directory or a .jar file). Starting with Ant 1.6, Ant opens the files in the classpath checking for manifest entries. This attempted opening fails with "java.util.zip.ZipException".
The problem does not exist with Ant 1.5, as it does not try to open the files. So make sure that your classpaths do not contain .class files.
On a side note, did you consider having separate jars? In the manifest of your main jar, you could refer to the other jars with this attribute:
Class-Path: one.jar two.jar three.jar
Then place all of your jars in the same folder.
Again, maybe not valid for your case, but it's here for reference.
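A small sketch of how that attribute behaves (using the placeholder jar names above): java.util.jar.Manifest can write a Class-Path attribute and read it back.

```java
import java.io.*;
import java.util.jar.*;

public class Main {
    public static void main(String[] args) throws Exception {
        // Build a manifest declaring a Class-Path, as suggested above.
        Manifest mf = new Manifest();
        Attributes attrs = mf.getMainAttributes();
        attrs.put(Attributes.Name.MANIFEST_VERSION, "1.0"); // required for write()
        attrs.put(Attributes.Name.CLASS_PATH, "one.jar two.jar three.jar");

        // Round-trip it through a byte stream and read the attribute back.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        mf.write(out);
        Manifest read = new Manifest(new ByteArrayInputStream(out.toByteArray()));
        System.out.println(read.getMainAttributes().getValue("Class-Path"));
    }
}
```

The referenced jars are resolved relative to the jar containing the manifest, which is why they all need to sit in the same folder.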

I've seen this exception before when the directory the JVM uses for temporary files is missing or not writable.

I solved this by clearing the jboss-x.y.z/server/[config]/tmp and jboss-x.y.z/server/[config]/work directories.

I also see this error when I am out of disk space on the filesystem being written to, so either give it more space or free some up (clean up log files, etc.).

I saw this with a specific zip file on Java 6, but it went away when I upgraded to Java 8 (I did not test Java 7), so it seems newer versions of ZipFile support more compression algorithms and can therefore read files that fail with earlier versions.

I faced this issue because of a corrupted ZIP file.

Liquibase was getting this error for me. I resolved it after I debugged, watched Liquibase try to load the libraries, and found that it was erroring on the manifest file of commons-codec-1.6.jar. Essentially, there is either a corrupt zip file somewhere in your path or an incompatible version being used. When I explored the Maven repository for this library, I found there were newer versions, added a newer version to the pom.xml, and was able to proceed.

Maybe the zip file is damaged, or was corrupted during download.

I was getting this exception
java.util.zip.ZipException: invalid entry CRC (expected 0x0 but got 0xdeadface)
at java.util.zip.ZipInputStream.closeEntry(ZipInputStream.java:140)
at java.util.zip.ZipInputStream.getNextEntry(ZipInputStream.java:118)
...
when unzipping an archive in Java. The archive itself didn't seem corrupted as 7zip (and others) opened it without any problems or complaints about invalid CRC.
I switched to Apache Commons Compress for reading the zip-entries and that resolved the problem.
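A sketch of that switch, assuming commons-compress is on the classpath; here a tiny archive is written with java.util.zip and read back with Commons Compress's ZipArchiveInputStream (the temp-file and entry names are made up):

```java
import java.io.*;
import java.nio.file.*;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import org.apache.commons.compress.archivers.zip.ZipArchiveEntry;
import org.apache.commons.compress.archivers.zip.ZipArchiveInputStream;

public class Main {
    public static void main(String[] args) throws Exception {
        // Create a small zip with java.util.zip (hypothetical names).
        Path zip = Files.createTempDirectory("cc").resolve("sample.zip");
        try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(zip))) {
            zos.putNextEntry(new ZipEntry("a.txt"));
            zos.write("data".getBytes("UTF-8"));
            zos.closeEntry();
        }
        // Read it back with Commons Compress instead of java.util.zip.
        try (ZipArchiveInputStream zis = new ZipArchiveInputStream(
                new BufferedInputStream(Files.newInputStream(zip)))) {
            ZipArchiveEntry entry;
            while ((entry = zis.getNextZipEntry()) != null) {
                System.out.println(entry.getName());
            }
        }
    }
}
```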

To get around these ZipExceptions, I have used jarchivelib, a wrapper around commons-compress 1.14 written by thrau, which makes it easy to extract or compress to and from File objects.
Example:
public static void main(String[] args) throws IOException {
    String zipfilePath =
            "E:/Selenium_Server/geckodriver-v0.19.0-linux64.tar.gz";
            //"E:/Selenium_Server/geckodriver-v0.19.0-win32.zip";
    String outdir = "E:/Selenium_Server/";
    extractFileList(zipfilePath, outdir);
}

public static void extractFileList(String zipfilePath, String outdir) throws IOException {
    File archive = new File(zipfilePath);
    File destinationDir = new File(outdir);
    Archiver archiver;
    if (zipfilePath.endsWith(".zip")) {
        archiver = ArchiverFactory.createArchiver(ArchiveFormat.ZIP);
    } else if (zipfilePath.endsWith(".tar.gz")) {
        archiver = ArchiverFactory.createArchiver(ArchiveFormat.TAR, CompressionType.GZIP);
    } else {
        throw new IllegalArgumentException("Unsupported archive type: " + zipfilePath);
    }
    archiver.extract(archive, destinationDir);

    ArchiveStream stream = archiver.stream(archive);
    ArchiveEntry entry;
    while ((entry = stream.getNextEntry()) != null) {
        System.out.println("Entry name: " + entry.getName());
    }
    stream.close();
}
Maven dependency (you can also download the jars from the Sonatype Maven Repository at org/rauschig/jarchivelib/):
<dependency>
    <groupId>org.rauschig</groupId>
    <artifactId>jarchivelib</artifactId>
    <version>0.7.1</version>
</dependency>
See also: Archivers and Compressors; Compressing and Decompressing Data Using Java APIs.

On Windows 7 I had this problem over a Samba network connection with a Java 8 jar file more than 80 MB in size. Copying the file to a local drive fixed the issue.

### Android ZipInputStream: only DEFLATED entries can have EXT descriptor

On my Android device, I need to extract a file (an XAPK, which is a plain zip archive as far as I know) that I get from a content URI.
I'm creating the ZipInputStream using this line of code:
ZipInputStream zis = new ZipInputStream(getContentResolver().openInputStream(zipUri));
And then I try to read the first entry of the archive with:
ZipEntry entry = zis.getNextEntry()
The problem is that I get this exception:
java.util.zip.ZipException: only DEFLATED entries can have EXT
descriptor
I'm 100% sure that there are no 0-byte files in the archive, and I can extract the same archive with other utilities (RAR, unzip, etc.) on my device.
If I use a ZipFile with a hard-coded path (so no content URI involved), I can extract the same archive without problems, so the issue is related to using ZipInputStream with a URI. On the other hand, I can't use a ZipFile here because it doesn't support content URIs.

### How to unzip file zipped by PKZIP in mainframe by Java?

I am trying to write a Java program to unzip files zipped by the PKZIP tool on a mainframe. However, I have tried the three approaches below, and none of them solves my problem.
By external tools:
I have tried to open it with WinRAR, 7-Zip, and the Linux unzip command.
All fail with this error message:
The archive is either in unknown format or damaged
By the JDK API, java.util.zip.ZipFile:
I also have tried to unzip it with the JDK API, as this website described.
However, it fails with this error message:
IO Error: java.util.zip.ZipException: error in opening zip file
By Zip4j:
I also have tried Zip4j. It failed too, with this error message:
Caused by: java.io.IOException: Negative seek offset
    at java.io.RandomAccessFile.seek(Native Method)
    ... 5 more
May I ask if there is any Java library or Linux command that can extract a zip file zipped by PKZIP on a mainframe? Thanks a lot!

I have successfully read files that were compressed with PKZip on z/OS and transferred to Linux. I was able to read them with the java.util.zip.* classes:
ZipFile ifile = new ZipFile(inFileName);
// faster to loop through entries than open the zip file as a stream
Enumeration<? extends ZipEntry> entries = ifile.entries();
while (entries.hasMoreElements()) {
    ZipEntry entry = entries.nextElement();
    if (!entry.isDirectory()) { // skip directories
        String entryName = entry.getName();
        // code to determine whether to process omitted
        InputStream zis = ifile.getInputStream(entry);
        // process the stream
    }
}
The jar file format is just a zip file, so the "jar" command can also read such files.
Like the others, I suspect that maybe the file was not transferred in binary and so was corrupted. On Linux you can use the xxd utility (piped through head) to dump the first few bytes to see if it looks like a zip file:
# xxd myfile.zip | head
0000000: 504b 0304 2d00 0000 0800 2c66 a348 eb5e PK..-.....,f.H.^
The first four bytes should be as shown; see also the Wikipedia entry on the zip file format.
Even if the first four bytes are correct, a file truncated during transmission can also produce the corrupt-file message.
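The same first-bytes check can be done in Java. A minimal sketch (the temp-file name is made up) that writes a zip and then verifies its PK\x03\x04 local-file-header signature:

```java
import java.io.*;
import java.nio.file.*;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class Main {
    public static void main(String[] args) throws Exception {
        // Write a tiny zip so there is something to inspect (hypothetical name).
        Path zip = Files.createTempDirectory("magic").resolve("sample.zip");
        try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(zip))) {
            zos.putNextEntry(new ZipEntry("a.txt"));
            zos.write("x".getBytes("UTF-8"));
            zos.closeEntry();
        }

        // A zip local-file header starts with the bytes PK\x03\x04.
        byte[] head = new byte[4];
        try (InputStream in = Files.newInputStream(zip)) {
            if (in.read(head) != 4) throw new IOException("file too short");
        }
        boolean looksLikeZip = head[0] == 0x50 && head[1] == 0x4B
                && head[2] == 0x03 && head[3] == 0x04;
        System.out.println(looksLikeZip ? "looks like a zip" : "not a zip");
    }
}
```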

### Move directory content from local file system to HDFS using Java

I have a source directory (/home/src) in the local filesystem containing two files, file1.txt and file2.txt.
I want to copy them through code to a destination directory (/user/dest) in HDFS.
When I use the FileUtil API to move the content from the local src to the HDFS dest, it moves the src directory as well.
FileUtil.copy("/home/src", fs, "/user/dest", true, conf);
Is there a way, where I can move only the directory contents from src to dest using Java API?

Hadoop has built-in APIs that can be used to copy from local to HDFS; you just need to import the necessary libraries and call them. Calling copyFromLocalFile with the directory path copies the whole src folder from local to /user/dest/ on HDFS.
If individual files are to be copied, list the files and copy them one by one (we can filter files too if we want):
FileSystem fs = FileSystem.get(new Configuration());
File[] sourceFiles = new File("/home/src").listFiles();
if (sourceFiles != null) {
    for (File f : sourceFiles) {
        // we can filter files if needed here
        fs.copyFromLocalFile(true, true, new Path(f.getPath()), new Path("/user/dest"));
    }
}
I hope this is helpful

### how to check if a jar file is valid?

My webapp allows a user to upload a jar file. However, after the jar file is uploaded, it is corrupted. I have verified this by comparing the MD5 checksums (WinMD5Free).
The uploaded jar file looks "normal" and "right":
the file size compared to the original looks right (at the KB level)
I can open the uploaded jar file using 7z and view its content (resources and class files), and everything is the same compared to the original
When I open up the uploaded jar file (using Notepad++), I did notice that the binary contents differ from the original. Also, when I used JarInputStream to read the jar entries, there were no entries:
JarInputStream is = new JarInputStream(new FileInputStream(new File("uploaded.jar")));
JarEntry entry = null;
while (null != (entry = is.getNextJarEntry())) {
    System.out.println(entry.getName());
}
Furthermore, when I double-click on the jar (Windows), I get the following message:
Error: Invalid or corrupt jarfile
My questions are:
Is there a way to programmatically check whether a jar file is valid? I would have expected JarInputStream to detect this right away, but it shows no problems at all.
When I double-click on the jar file in Windows, is it java.exe that gives me the invalid-or-corrupt-jarfile message?
How come no error/exception is thrown when an invalid jar file is passed in as part of the classpath, e.g. java -cp uploaded.jar;libs* com.some.class.Test?
Please note that this has nothing to do with jar signing and/or checking the signing of a jar. It is simply about checking whether a file (uploaded or not) is a valid jar file (not necessarily whether the jar's class files are valid, as there is another SO post on that issue already).
Results of running the jar tool on the uploaded file:
java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(ZipFile.java:127)
    at java.util.zip.ZipFile.<init>(ZipFile.java:88)
    at sun.tools.jar.Main.list(Main.java:977)
    at sun.tools.jar.Main.run(Main.java:222)
    at sun.tools.jar.Main.main(Main.java:1147)

A way to programmatically detect an invalid jar file is to use java.util.zip.ZipFile:
public static void main(String[] args) {
    if (args.length < 1) {
        System.err.println("need jar file");
        return;
    }
    String pathname = args[0];
    try {
        ZipFile file = new ZipFile(new File(pathname));
        Enumeration<? extends ZipEntry> e = file.entries();
        while (e.hasMoreElements()) {
            ZipEntry entry = e.nextElement();
            System.out.println(entry.getName());
        }
        file.close();
    } catch (Exception ex) {
        ex.printStackTrace();
    }
}
If the jar file is invalid, a ZipException is thrown when the ZipFile is instantiated.
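A quick self-contained demonstration of that behavior (the file name is made up): handing ZipFile a file that is not a zip archive throws a ZipException immediately. The exact exception message varies between JDK versions, so only the exception type should be relied on.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipException;
import java.util.zip.ZipFile;

public class Main {
    public static void main(String[] args) throws Exception {
        // Write a file that is definitely not a zip archive.
        Path bogus = Files.createTempDirectory("check").resolve("uploaded.jar");
        Files.write(bogus, "not a zip at all".getBytes("UTF-8"));
        try {
            new ZipFile(bogus.toFile()).close();
            System.out.println("opened fine");
        } catch (ZipException ex) {
            System.out.println("invalid jar: " + ex.getMessage());
        }
    }
}
```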