Calling this code:
Path p = FileSystems.getDefault().getPath(dir, file);
byte[] b = Files.readAllBytes(p);
is quite inefficient. Files.readAllBytes() calls Files.read(), which allocates a byte array sized to the file and reads the entire file into it in one shot (so far, so good). When that read succeeds -- because the buffer is now full -- the read() method resizes the buffer (an expensive array copy) and attempts to read some additional data, just in case the file grew. When that extra read hits end-of-file, the method returns -- but first it must make yet another copy of the buffer at the correct size and copy all the data back.
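A simplified sketch of the read loop just described (my reconstruction of the behavior, not the actual JDK source -- the method name readFully and the doubling growth factor are assumptions). Note that even in the common case, where the file has not changed size, the loop performs one grow-copy and one trim-copy:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;

public class ReadSketch {
    // Hypothetical sketch of the described loop: start with a buffer sized
    // to the reported file size, grow it whenever a read fills it, and trim
    // it with a final copy before returning.
    static byte[] readFully(InputStream in, long expectedSize) throws IOException {
        byte[] buf = new byte[Math.max((int) expectedSize, 1)];
        int total = 0;
        while (true) {
            if (total == buf.length) {
                // Buffer full: grow-copy, even when end-of-file is next.
                buf = Arrays.copyOf(buf, buf.length * 2);
            }
            int n = in.read(buf, total, buf.length - total);
            if (n < 0) break; // end of stream
            total += n;
        }
        // Second copy: trim the buffer back down to the bytes actually read.
        return (total == buf.length) ? buf : Arrays.copyOf(buf, total);
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello, world".getBytes();
        byte[] out = readFully(new ByteArrayInputStream(data), data.length);
        System.out.println(Arrays.equals(data, out)); // prints "true"
    }
}
```

In the common case the first read fills the buffer exactly, so the loop grows it (copy #1), the next read returns -1, and the trim (copy #2) runs because the grown buffer is larger than the data.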
I understand the need to handle the case where the file has grown (well, sort of -- there's an inherent race condition there, with no locking and no guarantee it will work in any case). But the common case should be optimized here, not penalized.
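One way the common case could be optimized, as a hypothetical alternative rather than anything the JDK does: allocate exactly file-size bytes up front and read them through a FileChannel, so a file that hasn't changed size incurs no grow-copy and no trim-copy. The class and method names here (ExactRead, readExact) are made up for illustration:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.Arrays;

public class ExactRead {
    // Allocate exactly ch.size() bytes and fill the buffer in place.
    // If the file shrank mid-read (the race the text mentions), trim to
    // what we actually got; if it grew, we return the original size.
    static byte[] readExact(Path p) throws IOException {
        try (FileChannel ch = FileChannel.open(p, StandardOpenOption.READ)) {
            ByteBuffer buf = ByteBuffer.allocate((int) ch.size());
            while (buf.hasRemaining() && ch.read(buf) >= 0) {
                // keep reading until the buffer is full or EOF
            }
            return buf.hasRemaining()
                    ? Arrays.copyOf(buf.array(), buf.position()) // file shrank: one trim-copy
                    : buf.array();                               // common case: zero extra copies
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("exact", ".bin");
        Files.write(tmp, "hello, world".getBytes());
        System.out.println(new String(readExact(tmp))); // prints "hello, world"
        Files.delete(tmp);
    }
}
```

The trade-off is explicit: a concurrently growing file is silently truncated at the size observed at open time, which seems no worse than the unlocked grow-and-retry behavior criticized above.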