JDK-8368290 : java/nio/channels/AsyncCloseAndInterrupt.java fails with timeout
  • Type: Bug
  • Component: core-libs
  • Sub-Component: java.nio
  • Affected Version: 11,17,21,25,26
  • Priority: P3
  • Status: In Progress
  • Resolution: Unresolved
  • OS: os_x
  • CPU: aarch64
  • Submitted: 2025-09-22
  • Updated: 2025-10-24
  • Fix Version: Other/tbd (Unresolved)
Related Reports
Relates :  
Relates :  
Sub Tasks
JDK-8370514 :  
Description
java/nio/channels/AsyncCloseAndInterrupt.java is failing on macOS 26 aarch64 with the failure below:

Is it reproducible on JDK 26: Yes
Is it reproducible on macOS 15.6.1 aarch64: No
Is it reproducible on Linux/Ubuntu: No
Is it a platform-specific issue: Yes, the issue is seen only on macOS 26

----------messages:(141/9122)----------
command: main --enable-native-access=ALL-UNNAMED AsyncCloseAndInterrupt
reason: User specified action: run main/othervm --enable-native-access=ALL-UNNAMED AsyncCloseAndInterrupt 
started: Mon Sep 15 19:38:41 IST 2025
Mode: othervm [/othervm specified]
Process id: 31762
Timeout information:
Running jstack on process 31762
2025-09-15 19:54:43
Full thread dump Java HotSpot(TM) 64-Bit Server VM (25.0.1+7-LTS-24 mixed mode, sharing):

Threads class SMR info:
_java_thread_list=0x0000000c312671c0, length=14, elements={
0x00000001011d17c0, 0x00000001011e9e90, 0x0000000c31150000, 0x0000000c31150800,
0x0000000c31151000, 0x0000000c31151800, 0x0000000c3105e800, 0x0000000c3105f200,
0x0000000c31152000, 0x0000000c31152800, 0x0000000c31153800, 0x0000000c311a9000,
0x0000000c311aa800, 0x0000000c3126c800
}

"main" #3 [9731] prio=5 os_prio=31 cpu=20.08ms elapsed=961.40s tid=0x00000001011d17c0 nid=9731 in Object.wait()  [0x000000016f836000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait0(java.base@25.0.1/Native Method)
	- waiting on <0x000000070fe95300> (a java.lang.Thread)
	at java.lang.Object.wait(java.base@25.0.1/Object.java:389)
	at java.lang.Thread.join(java.base@25.0.1/Thread.java:1887)
	- locked <0x000000070fe95300> (a java.lang.Thread)
	at java.lang.Thread.join(java.base@25.0.1/Thread.java:1963)
	at com.sun.javatest.regtest.agent.MainWrapper.main(MainWrapper.java:85)

"Reference Handler" #14 [28419] daemon prio=10 os_prio=31 cpu=0.04ms elapsed=961.39s tid=0x00000001011e9e90 nid=28419 waiting on condition  [0x00000001709ae000]
   java.lang.Thread.State: RUNNABLE
	at java.lang.ref.Reference.waitForReferencePendingList(java.base@25.0.1/Native Method)
	at java.lang.ref.Reference.processPendingReferences(java.base@25.0.1/Reference.java:246)
	at java.lang.ref.Reference$ReferenceHandler.run(java.base@25.0.1/Reference.java:208)

"Finalizer" #15 [27907] daemon prio=8 os_prio=31 cpu=0.03ms elapsed=961.39s tid=0x0000000c31150000 nid=27907 in Object.wait()  [0x0000000170bba000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait0(java.base@25.0.1/Native Method)
	- waiting on <0x000000070fe02368> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.Object.wait(java.base@25.0.1/Object.java:389)
	at java.lang.Object.wait(java.base@25.0.1/Object.java:351)
	at java.lang.ref.ReferenceQueue.remove0(java.base@25.0.1/ReferenceQueue.java:137)
	at java.lang.ref.ReferenceQueue.remove(java.base@25.0.1/ReferenceQueue.java:215)
	- locked <0x000000070fe02368> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.ref.Finalizer$FinalizerThread.run(java.base@25.0.1/Finalizer.java:165)

"Signal Dispatcher" #16 [27651] daemon prio=9 os_prio=31 cpu=0.09ms elapsed=961.39s tid=0x0000000c31150800 nid=27651 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Service Thread" #17 [27395] daemon prio=9 os_prio=31 cpu=29.36ms elapsed=961.39s tid=0x0000000c31151000 nid=27395 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Monitor Deflation Thread" #18 [26883] daemon prio=9 os_prio=31 cpu=102.42ms elapsed=961.39s tid=0x0000000c31151800 nid=26883 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread0" #19 [24323] daemon prio=9 os_prio=31 cpu=39.78ms elapsed=961.39s tid=0x0000000c3105e800 nid=24323 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE
   No compile task

"C1 CompilerThread0" #22 [24579] daemon prio=9 os_prio=31 cpu=86.89ms elapsed=961.39s tid=0x0000000c3105f200 nid=24579 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE
   No compile task

"Notification Thread" #23 [26115] daemon prio=9 os_prio=31 cpu=0.02ms elapsed=961.39s tid=0x0000000c31152000 nid=26115 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Common-Cleaner" #24 [25859] daemon prio=8 os_prio=31 cpu=1.78ms elapsed=961.39s tid=0x0000000c31152800 nid=25859 in Object.wait()  [0x0000000171a0e000]
   java.lang.Thread.State: TIMED_WAITING (on object monitor)
	at java.lang.Object.wait0(java.base@25.0.1/Native Method)
	- waiting on <0x000000070fe380c0> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.Object.wait(java.base@25.0.1/Object.java:389)
	at java.lang.ref.ReferenceQueue.remove0(java.base@25.0.1/ReferenceQueue.java:123)
	at java.lang.ref.ReferenceQueue.remove(java.base@25.0.1/ReferenceQueue.java:201)
	- locked <0x000000070fe380c0> (a java.lang.ref.ReferenceQueue$Lock)
	at jdk.internal.ref.CleanerImpl.run(java.base@25.0.1/CleanerImpl.java:146)
	at java.lang.Thread.runWith(java.base@25.0.1/Thread.java:1487)
	at java.lang.Thread.run(java.base@25.0.1/Thread.java:1474)
	at jdk.internal.misc.InnocuousThread.run(java.base@25.0.1/InnocuousThread.java:148)

"MainThread" #25 [43267] prio=5 os_prio=31 cpu=290.20ms elapsed=961.38s tid=0x0000000c31153800 nid=43267 waiting on condition  [0x0000000171e26000]
   java.lang.Thread.State: TIMED_WAITING (sleeping)
	at java.lang.Thread.sleepNanos0(java.base@25.0.1/Native Method)
	at java.lang.Thread.sleepNanos(java.base@25.0.1/Thread.java:509)
	at java.lang.Thread.sleep(java.base@25.0.1/Thread.java:540)
	at AsyncCloseAndInterrupt.sleep(AsyncCloseAndInterrupt.java:61)
	at AsyncCloseAndInterrupt.waitPump(AsyncCloseAndInterrupt.java:539)
	at AsyncCloseAndInterrupt.main(AsyncCloseAndInterrupt.java:767)
	at java.lang.invoke.LambdaForm$DMH/0x00000f8001044000.invokeStatic(java.base@25.0.1/LambdaForm$DMH)
	at java.lang.invoke.LambdaForm$MH/0x00000f8001045400.invoke(java.base@25.0.1/LambdaForm$MH)
	at java.lang.invoke.Invokers$Holder.invokeExact_MT(java.base@25.0.1/Invokers$Holder)
	at jdk.internal.reflect.DirectMethodHandleAccessor.invokeImpl(java.base@25.0.1/DirectMethodHandleAccessor.java:155)
	at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(java.base@25.0.1/DirectMethodHandleAccessor.java:104)
	at java.lang.reflect.Method.invoke(java.base@25.0.1/Method.java:565)
	at com.sun.javatest.regtest.agent.MainWrapper$MainTask.run(MainWrapper.java:138)
	at java.lang.Thread.runWith(java.base@25.0.1/Thread.java:1487)
	at java.lang.Thread.run(java.base@25.0.1/Thread.java:1474)

"Acceptor" #26 [43011] daemon prio=5 os_prio=31 cpu=2.86ms elapsed=961.36s tid=0x0000000c311a9000 nid=43011 runnable  [0x0000000172032000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.Net.accept(java.base@25.0.1/Native Method)
	at sun.nio.ch.ServerSocketChannelImpl.implAccept(java.base@25.0.1/ServerSocketChannelImpl.java:424)
	at sun.nio.ch.ServerSocketChannelImpl.accept(java.base@25.0.1/ServerSocketChannelImpl.java:391)
	at AsyncCloseAndInterrupt$1.run(AsyncCloseAndInterrupt.java:101)

"Pumper" #59 [33923] daemon prio=5 os_prio=31 cpu=658.23ms elapsed=956.18s tid=0x0000000c311aa800 nid=33923 waiting on condition  [0x0000000171c1a000]
   java.lang.Thread.State: WAITING (parking)
	at jdk.internal.misc.Unsafe.park(java.base@25.0.1/Native Method)
	- parking to wait for  <0x000000070f95bad0> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
	at java.util.concurrent.locks.LockSupport.park(java.base@25.0.1/LockSupport.java:369)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionNode.block(java.base@25.0.1/AbstractQueuedSynchronizer.java:520)
	at java.util.concurrent.ForkJoinPool.unmanagedBlock(java.base@25.0.1/ForkJoinPool.java:4364)
	at java.util.concurrent.ForkJoinPool.managedBlock(java.base@25.0.1/ForkJoinPool.java:4310)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(java.base@25.0.1/AbstractQueuedSynchronizer.java:1752)
	at java.util.concurrent.LinkedBlockingQueue.take(java.base@25.0.1/LinkedBlockingQueue.java:435)
	at java.util.concurrent.ThreadPoolExecutor.getTask(java.base@25.0.1/ThreadPoolExecutor.java:1016)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(java.base@25.0.1/ThreadPoolExecutor.java:1076)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(java.base@25.0.1/ThreadPoolExecutor.java:614)
	at java.lang.Thread.runWith(java.base@25.0.1/Thread.java:1487)
	at java.lang.Thread.run(java.base@25.0.1/Thread.java:1474)

"Attach Listener" #60 [23331] daemon prio=9 os_prio=31 cpu=0.42ms elapsed=0.11s tid=0x0000000c3126c800 nid=23331 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"VM Thread" os_prio=31 cpu=0.63ms elapsed=961.40s tid=0x0000000c30ccd800 nid=18947 runnable  

"VM Periodic Task Thread" os_prio=31 cpu=6.42ms elapsed=961.40s tid=0x0000000c30ccd400 nid=20743 waiting on condition  

"G1 Service" os_prio=31 cpu=47.42ms elapsed=961.40s tid=0x0000000c31085400 nid=16643 runnable  

"G1 Refine#0" os_prio=31 cpu=0.01ms elapsed=961.40s tid=0x0000000c31084f00 nid=21251 runnable  

"G1 Conc#0" os_prio=31 cpu=0.01ms elapsed=961.40s tid=0x0000000c31084a00 nid=16387 runnable  

"G1 Main Marker" os_prio=31 cpu=0.01ms elapsed=961.40s tid=0x0000000c31084500 nid=14595 runnable  

"GC Thread#0" os_prio=31 cpu=0.01ms elapsed=961.40s tid=0x0000000c31084000 nid=13315 runnable  

JNI global refs: 16, weak refs: 0

--- Timeout information end.
finished: Mon Sep 15 19:54:43 IST 2025
elapsed time (seconds): 961.419
----------configuration:(0/0)----------
----------System.out:(1/36)----------
Timeout signalled after 960 seconds
----------System.err:(107/3084)----------

FileChannel/transferTo/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/transferTo/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException
WARNING: transferTo/close not tested

FileChannel/transferFrom/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/transferFrom/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException
WARNING: transferFrom/close not tested

FileChannel/read/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/read/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/read/close
Thrown as expected: java.nio.channels.AsynchronousCloseException

FileChannel/readv/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/readv/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/readv/close
Thrown as expected: java.nio.channels.AsynchronousCloseException

FileChannel/write/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/write/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/write/close
Wrote 8192 bytes
Channel closed

FileChannel/writev/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/writev/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

FileChannel/writev/close
Wrote 8192 bytes
Channel closed

SocketChannel/read/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

SocketChannel/read/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

SocketChannel/read/close
Thrown as expected: java.nio.channels.AsynchronousCloseException

SocketChannel/read/shutdown-input
Read returned -1
Channel open, input shutdown

SocketChannel/readv/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

SocketChannel/readv/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

SocketChannel/readv/close
Thrown as expected: java.nio.channels.AsynchronousCloseException

SocketChannel/readv/shutdown-input
Read returned -1
Channel open, input shutdown

SocketChannel/write/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

SocketChannel/write/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

SocketChannel/write/close
Thrown as expected: java.nio.channels.AsynchronousCloseException

SocketChannel/write/shutdown-output
Wrote 580020 bytes
Channel open, output shutdown

SocketChannel/writev/interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

SocketChannel/writev/pre-interrupt
Thrown as expected: java.nio.channels.ClosedByInterruptException

SocketChannel/writev/close
Thrown as expected: java.nio.channels.AsynchronousCloseException

SocketChannel/writev/shutdown-output
Wrote 1301904 bytes
Channel open, output shutdown

Wait for initial Pump
Start pumping refuser ...
----------rerun:(25/2276)*----------
result: Error. Program `jdk-25.0.1.jdk/Contents/Home/bin/java' timed out (timeout set to 960000ms, elapsed time including timeout handling was 961416ms).


test result: Error. Program `jdk-25.0.1.jdk/Contents/Home/bin/java' timed out (timeout set to 960000ms, elapsed time including timeout handling was 961416ms).
Comments
This test logic is causing a problem on macOS 26:

if (TestUtil.onWindows()) {
    log.println("WARNING Cannot reliably test connect/finishConnect"
                + " operations on this platform");
} else {
    // Only the following tests need refuser's connection backlog
    // to be saturated
    ExecutorService pumperExecutor =
        Executors.newSingleThreadExecutor(new ThreadFactory() {
            @Override
            public Thread newThread(Runnable r) {
                Thread t = new Thread(r);
                t.setDaemon(true);
                t.setName("Pumper");
                return t;
            }
        });
    pumpDone = false;
    try {
        Future<Integer> pumpFuture = pumpRefuser(pumperExecutor);
        waitPump("\nWait for initial Pump");
        test(socketChannelFactory, CONNECT, false);
        test(socketChannelFactory, FINISH_CONNECT, false);
        pumpDone = true;
        Integer newConn = pumpFuture.get(30, TimeUnit.SECONDS);
        log.println("Pump " + newConn + " connections.");
    } finally {
        pumperExecutor.shutdown();
    }
}

. . . and the pumpRefuser preparation. NB: I added the try { Thread.sleep(10); } catch (Exception ignore) {} below to stop the connect/finishConnect while loop from spinning at an excessive rate.

private static Future<Integer> pumpRefuser(ExecutorService pumperExecutor) {
    Callable<Integer> pumpTask = new Callable<Integer>() {
        @Override
        public Integer call() throws IOException {
            // Can't reliably saturate connection backlog on Windows Server editions
            assert !TestUtil.onWindows();
            log.println("Start pumping refuser ...");
            List<SocketChannel> refuserClients = new ArrayList<>();

            // Saturate the refuser's connection backlog so that further connection
            // attempts will be blocked
            pumpReady = false;
            while (!pumpDone) {
                SocketChannel sc = SocketChannel.open();
                sc.configureBlocking(false);
                boolean connected = sc.connect(refuser.socket().getLocalSocketAddress());
                if (!pumpReady) {
                    log.println("pumpRefuser: sc " + sc + " connected == " + connected);
                }

                // Assume that the connection backlog is saturated if a
                // client cannot connect to the refuser within 50 milliseconds
                long start = System.currentTimeMillis();
                while (!pumpReady && !connected
                        && (System.currentTimeMillis() - start < 50)) {
                    log.println("pumpRefuser not connected finish connect ");
                    connected = sc.finishConnect();
                    try {
                        Thread.sleep(10);
                    } catch (Exception ignore) {
                        // ...
                    }
                }

                if (connected) {
                    // Retain so that finalizer doesn't close
                    log.println("pumpRefuser adding refuser client " + sc.getLocalAddress());
                    refuserClients.add(sc);
                } else {
                    sc.close();
                    pumpReady = true;
                    log.println("pumpRefuser refuser backlog full pumpReady");
                }
            }

            for (SocketChannel sc : refuserClients) {
                sc.close();
            }
            refuser.close();
            log.println("Stop pumping refuser ...");
            return refuserClients.size();
        }
    };

The essence of the test setup is to "clog" up the refuser's connection backlog queue. The backlog is set to 1, so the assumption is that two connect requests should fill up the TCP connect queues. However, the kernel may use two queues to handle connection requests (typically a SYN/incomplete-connection queue and an accept/completed-connection queue), and the influence of the backlog setting depends on the OS TCP implementation. It is generally perceived that macOS (or OS X) has a single connection queue, as per the BSD implementation. For the first connection request the connection is established. The next connection request should go on the backlog queue and would normally have to wait for the first extant connection to be completed before its own connection setup can complete; this should set pumpReady. This is not happening on macOS 26.
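As a rough way to observe what the kernel is doing, here is a minimal standalone probe, illustrative only and not part of the test (the BacklogProbe class name and the loop limit of 16 are made up): it binds a listener with backlog 1, never accepts, and counts how many non-blocking connects still complete within the test's 50 ms window.

import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;
import java.util.ArrayList;
import java.util.List;

// Illustrative probe, not the jtreg test: saturate a never-accepting listener's
// backlog and report when a non-blocking connect stops completing within 50 ms.
public class BacklogProbe {
    public static void main(String[] args) throws Exception {
        ServerSocketChannel refuser = ServerSocketChannel.open();
        // Backlog of 1; the listener never calls accept(), mirroring the test's refuser.
        refuser.bind(new InetSocketAddress(InetAddress.getLocalHost(), 0), 1);

        List<SocketChannel> clients = new ArrayList<>();
        int completed = 0;
        for (int i = 0; i < 16; i++) {
            SocketChannel sc = SocketChannel.open();
            sc.configureBlocking(false);
            boolean connected = sc.connect(refuser.getLocalAddress());
            long start = System.currentTimeMillis();
            while (!connected && System.currentTimeMillis() - start < 50) {
                connected = sc.finishConnect();
                Thread.sleep(10);
            }
            if (connected) {
                clients.add(sc);   // keep the connection open so the backlog stays occupied
                completed++;
            } else {
                sc.close();
                System.out.println("Backlog saturated after " + completed + " completed connections");
                break;
            }
        }
        if (completed == 16) {
            System.out.println("Backlog never saturated; " + completed + " connections completed");
        }
        for (SocketChannel sc : clients) {
            sc.close();
        }
        refuser.close();
    }
}

Under the test's assumption the probe should report saturation after only a couple of connections; if macOS 26 keeps completing handshakes, the test's waitPump() loop never sees pumpReady set and the run times out.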
There seems to be a change in behaviour in the macOS handling of connection requests and backlog, which requires investigation; that is a bit of an effort when using MACH5 nodes. I'll take this for now, do some investigation over the next few days, and hand it back to bpb when I'm gone.
24-10-2025

Refactor: rename wildcardAddress to localhostAddress. The following is misleading:

    wildcardAddress = new InetSocketAddress(InetAddress.getLocalHost(), 0);
    // This is a localhost address with an ephemeral port, NOT a wildcard address

initAcceptor() will bind the ServerSocketChannel to the local host address and an ephemeral port (one chosen by the OS). initRefuser() will also bind its ServerSocketChannel to the local host address and an ephemeral port (one chosen by the OS, but different to the acceptor's). The reference to a wildcard address is a misnomer. Maybe create an issue to refactor, renaming wildcardAddress to localhostEphemeralPortAddress or localhostAddress.
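For contrast, a small standalone sketch (illustrative only; the BindContrast class name is made up) showing the difference between a genuine wildcard bind and the localhost/ephemeral-port bind that the misnamed field actually holds:

import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.nio.channels.ServerSocketChannel;

// Illustrative only: contrast a true wildcard bind with what the test's
// "wildcardAddress" really is (local host address + ephemeral port).
public class BindContrast {
    public static void main(String[] args) throws Exception {
        // A true wildcard address: all local interfaces, ephemeral port.
        try (ServerSocketChannel wildcard = ServerSocketChannel.open()) {
            wildcard.bind(new InetSocketAddress(0));
            System.out.println("wildcard bind : " + wildcard.getLocalAddress());
        }

        // What the test actually binds: the local host address with an ephemeral port.
        try (ServerSocketChannel localhost = ServerSocketChannel.open()) {
            localhost.bind(new InetSocketAddress(InetAddress.getLocalHost(), 0));
            System.out.println("localhost bind: " + localhost.getLocalAddress());
        }
    }
}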
23-09-2025