Docker Container Unresponsive After 1-2 Days with ShinyProxy Deployment

Hello ShinyProxy Community,

I am experiencing a recurring issue with ShinyProxy where Docker containers used to serve Shiny apps become unresponsive after running continuously for one or two days. Below are some details about my setup and the errors I am seeing:

ShinyProxy Configuration:

  • Using Docker Swarm as the backend
  • Containers are set to use 4 CPU cores and 6GB memory each
  • ShinyProxy configuration includes several apps deployed with container-network: sp-example-net
  • Container pre-initialization is enabled for some apps (via minimum-seats-available), but container re-use is disabled (allow-container-re-use: false)

My full configuration is included at the end of this post.

Problem Details:

  • After one or two days, I can still reach the ShinyProxy menu and log in via OpenID, but clicking on an app link results in the app failing to start, sometimes showing a completely blank page. The ShinyProxy navigation bar and user information are displayed, while the app interface itself remains blank.
  • Restarting the Docker Swarm service running ShinyProxy temporarily restores functionality, but the issue reoccurs every one or two days.

Relevant Logs (ShinyProxy):

The log excerpt at the end of this post shows the issue occurring repeatedly before the containers become unresponsive.

I noticed that the logs repeatedly mention that the Docker Swarm service is “not ready yet” and that the container is “unresponsive.” This seems to indicate a timeout or scaling issue. The images are already available locally, so they should not need to be pulled again, yet ShinyProxy keeps retrying.

My Observations:

  • The problem seems periodic, usually after the container has been running for a couple of days.
  • Even though my server has 16 cores and 64GB of RAM, and the Docker instances are limited to 4 cores and 6GB each, I face these issues even when there are only a few users.
  • It also seems that when one Shiny container becomes unresponsive or gets stuck, it impacts new containers as well, with ShinyProxy struggling to start new ones.
  • I have another server with the same apps and an identical configuration, except that authentication is set to none instead of openid. That server never shows the blank-screen problem. Additionally, the apps on the problematic server are larger and have more features, which may result in longer loading times.

Questions:

  1. Could this problem be related to a Shiny app running a long or infinite process, leading to Docker becoming unresponsive or causing a resource bottleneck?
  2. Given the configurations I provided, are there any best practices I might be missing to ensure better container management and stability?
  3. Could the use of openid authentication be contributing to this issue, especially since the server without authentication (none) does not face the same problem?

Any help or suggestions would be greatly appreciated.

Thank you in advance for your support!

Configuration Summary:

  • Docker Backend: Docker Swarm
  • Memory Limit: 6GB per container
  • CPU Limit: 4 cores per container
  • Server Specs: 16 cores, 64GB RAM
  • Issue Frequency: After 1-2 days of container uptime
  • Authentication Type: OpenID (on problematic server), None (on stable server)

Configuration:

proxy:
  stop-proxies-on-shutdown: true
  recover-running-proxies: false  
  container-wait-time: 200000
  hide-navbar: false
  usage-stats-url: micrometer
  title: My APP
  heartbeat-rate: 10000
  heartbeat-timeout: 180000
  port: 8080
  authentication: openid
  admin-groups: scientists
  openid:
    auth-url: http://auth0.b-ht.com:8135/oauth2/authorize
    token-url: http://auth0.b-ht.com:8135/oauth2/token
    jwks-url: http://auth0.b-ht.com:8135/oauth2/jwks
    userinfo-url: http://auth0.b-ht.com:8135/user
    logout-url: http://auth0.b-ht.com:8135/delToken
    client-id: messaging-client2
    client-secret: 4536
    username-attribute: sub
    roles-claim: authorities
    SHINYPROXY_USERGROUPS: scientists
    scopes: [profile]
  
  container-backend: docker-swarm
  docker:
    internal-networking: true
  specs:
  - id: mstatavip
    display-name: app1
    description: app1
    container-cmd: ["R", "-e", "shiny::runApp('/home/rstudio/mstatavip')"]
    container-image: zhaohongxin0/mstatavip2
    access-groups: [admins, Vipgroup]
    container-network: sp-example-net
    minimum-seats-available: 3
    allow-container-re-use: false
    container-memory-limit: 6g
    container-cpu-limit: 4
    
  - id: tutor
    display-name: app2
    description: app2
    container-cmd: ["R", "-e", "shiny::runApp('/home/rstudio/tutor')"]
    container-image: zhaohongxin0/mstatavip2
    access-groups: [admins, Vipgroup]
    container-network: sp-example-net
    container-memory-limit: 6g
    container-cpu-limit: 4    
  - id: journal
    display-name: app3
    description: app3
    container-cmd: ["R", "-e", "shiny::runApp('/home/rstudio/journal')"]
    container-image: zhaohongxin0/mstatavip2
    access-groups: [admins, Vipgroup]
    container-network: sp-example-net
    container-memory-limit: 6g
    container-cpu-limit: 4
  - id: predictapp
    display-name: app4
    description: app4
    container-cmd: ["R", "-e", "shiny::runApp('/home/rstudio/predictapp')"]
    container-image: zhaohongxin0/mstatavip2
    access-groups: [admins, Vipgroup]
    container-network: sp-example-net
    container-memory-limit: 6g
    container-cpu-limit: 4    
  - id: chatstata
    display-name: app5
    description: app5
    container-cmd: ["R", "-e", "RTutor::run_app()"]
    container-image: zhaohongxin0/mstatavip2
    access-groups: [admins, Vipgroup]
    container-network: sp-example-net
    minimum-seats-available: 1
    allow-container-re-use: false
    container-memory-limit: 6g
    container-cpu-limit: 4

logging:
  file:
    shinyproxy.log

server:
  forward-headers-strategy: native  
  servlet.session.timeout: -1
management:
  metrics:
    export:
      prometheus:
        enabled: true

ShinyProxy log right before the issue occurs:

{"log":"2024-10-06T04:03:03.817Z  INFO 1 --- [haringScaler-13] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=aa413422-b449-4e9d-b76a-414d49af2776] Started DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:03:03.817159379Z"}
{"log":"2024-10-06T04:03:27.260Z  INFO 1 --- [GlobalEventLoop] e.o.c.s.hearbeat.ActiveProxiesService    : [user=4fbdf587-817e-4e7c-a2ce-36f347515db4 proxyId=03cf5fc5-7bfe-40f6-95a4-5d3999933146 specId=freetrial] Releasing inactive proxy [silence: 64300ms]\n","stream":"stdout","time":"2024-10-06T04:03:27.261105001Z"}
{"log":"2024-10-06T04:03:27.261Z  INFO 1 --- [ProxyService-15] e.o.c.b.d.p.ProxySharingDispatcher       : [user=4fbdf587-817e-4e7c-a2ce-36f347515db4 proxyId=03cf5fc5-7bfe-40f6-95a4-5d3999933146 specId=freetrial delegateProxyId=95d6fd0c-2f7e-44ce-86de-97a512817e2a seatId=af1cf523-cabd-4475-bd4f-7d162a1be9e2] Seat released\n","stream":"stdout","time":"2024-10-06T04:03:27.261214881Z"}
{"log":"2024-10-06T04:03:27.261Z  INFO 1 --- [ProxyService-15] e.o.containerproxy.service.ProxyService  : [user=4fbdf587-817e-4e7c-a2ce-36f347515db4 proxyId=03cf5fc5-7bfe-40f6-95a4-5d3999933146 specId=freetrial] Proxy released\n","stream":"stdout","time":"2024-10-06T04:03:27.261264152Z"}
{"log":"2024-10-06T04:03:27.261Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=95d6fd0c-2f7e-44ce-86de-97a512817e2a] DelegateProxy cannot be re-used, marking for removal\n","stream":"stdout","time":"2024-10-06T04:03:27.2612702Z"}
{"log":"2024-10-06T04:03:27.261Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=95d6fd0c-2f7e-44ce-86de-97a512817e2a seatId=af1cf523-cabd-4475-bd4f-7d162a1be9e2] Removed seat\n","stream":"stdout","time":"2024-10-06T04:03:27.26134874Z"}
{"log":"2024-10-06T04:03:29.689Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=95d6fd0c-2f7e-44ce-86de-97a512817e2a] Stopping DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:03:29.689272555Z"}
{"log":"2024-10-06T04:05:47.260Z  INFO 1 --- [GlobalEventLoop] e.o.c.s.hearbeat.ActiveProxiesService    : [user=1bf5a1aa-ce15-4075-bc10-69963d50a873 proxyId=08a30413-e119-4434-8a3b-c21629520f0d specId=freetrial] Releasing inactive proxy [silence: 75161ms]\n","stream":"stdout","time":"2024-10-06T04:05:47.260985775Z"}
{"log":"2024-10-06T04:05:47.261Z  INFO 1 --- [ProxyService-15] e.o.c.b.d.p.ProxySharingDispatcher       : [user=1bf5a1aa-ce15-4075-bc10-69963d50a873 proxyId=08a30413-e119-4434-8a3b-c21629520f0d specId=freetrial delegateProxyId=5b2a9281-a3fa-465a-947e-5e23cca0c4b0 seatId=fe4976fb-aa20-412f-95ac-d2698dd4f2ed] Seat released\n","stream":"stdout","time":"2024-10-06T04:05:47.261128341Z"}
{"log":"2024-10-06T04:05:47.261Z  INFO 1 --- [ProxyService-15] e.o.containerproxy.service.ProxyService  : [user=1bf5a1aa-ce15-4075-bc10-69963d50a873 proxyId=08a30413-e119-4434-8a3b-c21629520f0d specId=freetrial] Proxy released\n","stream":"stdout","time":"2024-10-06T04:05:47.261163642Z"}
{"log":"2024-10-06T04:05:47.261Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=5b2a9281-a3fa-465a-947e-5e23cca0c4b0] DelegateProxy cannot be re-used, marking for removal\n","stream":"stdout","time":"2024-10-06T04:05:47.261181778Z"}
{"log":"2024-10-06T04:05:47.261Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=5b2a9281-a3fa-465a-947e-5e23cca0c4b0 seatId=fe4976fb-aa20-412f-95ac-d2698dd4f2ed] Removed seat\n","stream":"stdout","time":"2024-10-06T04:05:47.261237379Z"}
{"log":"2024-10-06T04:05:49.690Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=5b2a9281-a3fa-465a-947e-5e23cca0c4b0] Stopping DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:05:49.690445679Z"}
{"log":"2024-10-06T04:07:51.869Z  INFO 1 --- [  XNIO-1 task-4] e.o.containerproxy.service.UserService   : Anonymous user logged out [user: 87157836-4621-4970-a998-e0bf536db7c8]\n","stream":"stdout","time":"2024-10-06T04:07:51.869632868Z"}
{"log":"2024-10-06T04:09:59.905Z  INFO 1 --- [  XNIO-1 task-4] e.o.containerproxy.service.UserService   : User logged in [user: f33e64a8-bd7e-4d9c-813a-09a7f3368e61]\n","stream":"stdout","time":"2024-10-06T04:09:59.905949408Z"}
{"log":"2024-10-06T04:10:00.669Z  INFO 1 --- [ProxyService-15] e.o.containerproxy.service.ProxyService  : [user=f33e64a8-bd7e-4d9c-813a-09a7f3368e61 proxyId=8d394eba-db00-4148-b9ab-6f0dbadf6177 specId=freetrial] Starting proxy\n","stream":"stdout","time":"2024-10-06T04:10:00.669930667Z"}
{"log":"2024-10-06T04:10:00.669Z  INFO 1 --- [ProxyService-15] e.o.c.b.d.p.ProxySharingDispatcher       : [user=f33e64a8-bd7e-4d9c-813a-09a7f3368e61 proxyId=8d394eba-db00-4148-b9ab-6f0dbadf6177 specId=freetrial delegateProxyId=aa413422-b449-4e9d-b76a-414d49af2776 seatId=cc96158c-a66a-42b6-991a-74f9ca003f13] Seat claimed\n","stream":"stdout","time":"2024-10-06T04:10:00.669963106Z"}
{"log":"2024-10-06T04:10:00.669Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial] Scale up required, trying to create 1 DelegateProxies\n","stream":"stdout","time":"2024-10-06T04:10:00.670014624Z"}
{"log":"2024-10-06T04:10:00.669Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=62e27db3-1115-4c9c-a792-b09071fe6275] Creating DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:10:00.670025854Z"}
{"log":"2024-10-06T04:10:00.669Z  INFO 1 --- [ProxyService-15] e.o.containerproxy.service.ProxyService  : [user=f33e64a8-bd7e-4d9c-813a-09a7f3368e61 proxyId=8d394eba-db00-4148-b9ab-6f0dbadf6177 specId=freetrial] Proxy activated\n","stream":"stdout","time":"2024-10-06T04:10:00.670030214Z"}
{"log":"2024-10-06T04:10:00.670Z  INFO 1 --- [haringScaler-13] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=62e27db3-1115-4c9c-a792-b09071fe6275] Preparing DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:10:00.670157325Z"}
{"log":"2024-10-06T04:10:00.670Z  INFO 1 --- [haringScaler-13] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=62e27db3-1115-4c9c-a792-b09071fe6275] Starting DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:10:00.670502457Z"}
{"log":"2024-10-06T04:10:05.796Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (11/40)\n","stream":"stdout","time":"2024-10-06T04:10:05.796672574Z"}
{"log":"2024-10-06T04:10:07.808Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (12/40)\n","stream":"stdout","time":"2024-10-06T04:10:07.808585954Z"}
{"log":"2024-10-06T04:10:09.821Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (13/40)\n","stream":"stdout","time":"2024-10-06T04:10:09.821196166Z"}
{"log":"2024-10-06T04:10:11.832Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (14/40)\n","stream":"stdout","time":"2024-10-06T04:10:11.83261815Z"}
{"log":"2024-10-06T04:10:13.844Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (15/40)\n","stream":"stdout","time":"2024-10-06T04:10:13.844532187Z"}
{"log":"2024-10-06T04:10:15.855Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (16/40)\n","stream":"stdout","time":"2024-10-06T04:10:15.85544685Z"}
{"log":"2024-10-06T04:10:17.867Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (17/40)\n","stream":"stdout","time":"2024-10-06T04:10:17.867414942Z"}
{"log":"2024-10-06T04:10:19.877Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (18/40)\n","stream":"stdout","time":"2024-10-06T04:10:19.878065361Z"}
{"log":"2024-10-06T04:10:21.888Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (19/40)\n","stream":"stdout","time":"2024-10-06T04:10:21.888175727Z"}
{"log":"2024-10-06T04:10:23.897Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (20/40)\n","stream":"stdout","time":"2024-10-06T04:10:23.898005669Z"}
{"log":"2024-10-06T04:10:25.908Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (21/40)\n","stream":"stdout","time":"2024-10-06T04:10:25.908391266Z"}
{"log":"2024-10-06T04:10:27.919Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (22/40)\n","stream":"stdout","time":"2024-10-06T04:10:27.919119086Z"}
{"log":"2024-10-06T04:10:29.929Z  INFO 1 --- [haringScaler-13] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Docker Swarm Service not ready yet, trying again (23/40)\n","stream":"stdout","time":"2024-10-06T04:10:29.929358497Z"}
{"log":"2024-10-06T04:10:36.946Z  WARN 1 --- [haringScaler-13] e.o.shinyproxy.ShinyProxyTestStrategy    : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Container unresponsive, trying again (11/25): http://2dfab14cc920:3838\n","stream":"stdout","time":"2024-10-06T04:10:36.946598614Z"}
{"log":"2024-10-06T04:10:38.947Z  WARN 1 --- [haringScaler-13] e.o.shinyproxy.ShinyProxyTestStrategy    : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Container unresponsive, trying again (12/25): http://2dfab14cc920:3838\n","stream":"stdout","time":"2024-10-06T04:10:38.947169372Z"}
{"log":"2024-10-06T04:10:45.022Z  INFO 1 --- [haringScaler-13] e.o.shinyproxy.ShinyProxyTestStrategy    : [user=null proxyId=62e27db3-1115-4c9c-a792-b09071fe6275 specId=freetrial] Container responsive\n","stream":"stdout","time":"2024-10-06T04:10:45.022391159Z"}
{"log":"2024-10-06T04:10:45.022Z  INFO 1 --- [haringScaler-13] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=62e27db3-1115-4c9c-a792-b09071fe6275 seatId=3339b410-eea8-4b61-80e3-c7c0fff13287] Created Seat\n","stream":"stdout","time":"2024-10-06T04:10:45.022425658Z"}
{"log":"2024-10-06T04:10:45.022Z  INFO 1 --- [haringScaler-13] e.o.c.b.d.p.ProxySharingScaler           : [specId=freetrial delegateProxyId=62e27db3-1115-4c9c-a792-b09071fe6275] Started DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:10:45.022430983Z"}
{"log":"2024-10-06T04:10:51.503Z  INFO 1 --- [ProxyService-15] e.o.containerproxy.service.ProxyService  : [user=de799849-6f00-4317-b5a2-4eb47b180701 proxyId=98bec556-3197-48be-a560-80dd7182375c specId=chatstata] Starting proxy\n","stream":"stdout","time":"2024-10-06T04:10:51.503994625Z"}
{"log":"2024-10-06T04:10:51.503Z  INFO 1 --- [ProxyService-15] e.o.c.b.d.p.ProxySharingDispatcher       : [user=de799849-6f00-4317-b5a2-4eb47b180701 proxyId=98bec556-3197-48be-a560-80dd7182375c specId=chatstata delegateProxyId=19799eed-74db-4911-a296-5218d33b517a seatId=f133e9a7-a70b-499e-8c9b-c56ce3069406] Seat claimed\n","stream":"stdout","time":"2024-10-06T04:10:51.50403379Z"}
{"log":"2024-10-06T04:10:51.504Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata] Scale up required, trying to create 1 DelegateProxies\n","stream":"stdout","time":"2024-10-06T04:10:51.504154506Z"}
{"log":"2024-10-06T04:10:51.504Z  INFO 1 --- [ProxyService-15] e.o.containerproxy.service.ProxyService  : [user=de799849-6f00-4317-b5a2-4eb47b180701 proxyId=98bec556-3197-48be-a560-80dd7182375c specId=chatstata] Proxy activated\n","stream":"stdout","time":"2024-10-06T04:10:51.504171952Z"}
{"log":"2024-10-06T04:10:51.504Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=3d95455c-03c8-472d-902d-16481d27a6a8] Creating DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:10:51.504175786Z"}
{"log":"2024-10-06T04:10:51.504Z  INFO 1 --- [haringScaler-16] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=3d95455c-03c8-472d-902d-16481d27a6a8] Preparing DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:10:51.50426105Z"}
{"log":"2024-10-06T04:10:51.504Z  INFO 1 --- [haringScaler-16] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=3d95455c-03c8-472d-902d-16481d27a6a8] Starting DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:10:51.504477365Z"}
{"log":"2024-10-06T04:10:56.625Z  INFO 1 --- [haringScaler-16] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=3d95455c-03c8-472d-902d-16481d27a6a8 specId=chatstata] Docker Swarm Service not ready yet, trying again (11/40)\n","stream":"stdout","time":"2024-10-06T04:10:56.62592566Z"}
{"log":"2024-10-06T04:10:58.637Z  INFO 1 --- [haringScaler-16] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=3d95455c-03c8-472d-902d-16481d27a6a8 specId=chatstata] Docker Swarm Service not ready yet, trying again (12/40)\n","stream":"stdout","time":"2024-10-06T04:10:58.637479991Z"}
{"log":"2024-10-06T04:11:00.648Z  INFO 1 --- [haringScaler-16] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=3d95455c-03c8-472d-902d-16481d27a6a8 specId=chatstata] Docker Swarm Service not ready yet, trying again (13/40)\n","stream":"stdout","time":"2024-10-06T04:11:00.648545415Z"}
{"log":"2024-10-06T04:11:02.659Z  INFO 1 --- [haringScaler-16] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=3d95455c-03c8-472d-902d-16481d27a6a8 specId=chatstata] Docker Swarm Service not ready yet, trying again (14/40)\n","stream":"stdout","time":"2024-10-06T04:11:02.659547066Z"}
{"log":"2024-10-06T04:11:04.670Z  INFO 1 --- [haringScaler-16] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=3d95455c-03c8-472d-902d-16481d27a6a8 specId=chatstata] Docker Swarm Service not ready yet, trying again (15/40)\n","stream":"stdout","time":"2024-10-06T04:11:04.670217371Z"}
{"log":"2024-10-06T04:11:05.180Z  INFO 1 --- [  XNIO-1 task-6] e.o.containerproxy.service.UserService   : User logged in [user: b28f0354-3cc5-4d5c-838f-10e15f5fd8aa]\n","stream":"stdout","time":"2024-10-06T04:11:05.18025247Z"}
{"log":"2024-10-06T04:11:05.347Z  INFO 1 --- [  XNIO-1 task-4] e.o.containerproxy.service.UserService   : User logged in [user: 6225df14-9fee-4065-aa8a-89fd2030ebc9]\n","stream":"stdout","time":"2024-10-06T04:11:05.347739713Z"}
{"log":"2024-10-06T04:11:05.384Z  INFO 1 --- [ProxyService-15] e.o.containerproxy.service.ProxyService  : [user=b28f0354-3cc5-4d5c-838f-10e15f5fd8aa proxyId=6e3c8f4d-226b-4ea4-979b-c62eff111412 specId=chatstata] Starting proxy\n","stream":"stdout","time":"2024-10-06T04:11:05.387521134Z"}
{"log":"2024-10-06T04:11:05.384Z  INFO 1 --- [ProxyService-15] e.o.c.b.d.p.ProxySharingDispatcher       : [user=b28f0354-3cc5-4d5c-838f-10e15f5fd8aa proxyId=6e3c8f4d-226b-4ea4-979b-c62eff111412 specId=chatstata] Seat not immediately available\n","stream":"stdout","time":"2024-10-06T04:11:05.387543741Z"}
{"log":"2024-10-06T04:11:05.384Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata] Scale up required, trying to create 1 DelegateProxies\n","stream":"stdout","time":"2024-10-06T04:11:05.387565313Z"}
{"log":"2024-10-06T04:11:05.384Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=7fe105d2-57eb-4139-920f-ea2d02ce4f2f] Creating DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:11:05.387569181Z"}
{"log":"2024-10-06T04:11:05.384Z  INFO 1 --- [haringScaler-15] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=7fe105d2-57eb-4139-920f-ea2d02ce4f2f] Preparing DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:11:05.387572729Z"}
{"log":"2024-10-06T04:11:05.384Z  INFO 1 --- [haringScaler-15] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=7fe105d2-57eb-4139-920f-ea2d02ce4f2f] Starting DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:11:05.387576044Z"}
{"log":"2024-10-06T04:11:05.511Z  INFO 1 --- [ProxyService-14] e.o.containerproxy.service.ProxyService  : [user=6225df14-9fee-4065-aa8a-89fd2030ebc9 proxyId=bc147863-4493-4d0f-8818-3596cdff3b49 specId=chatstata] Starting proxy\n","stream":"stdout","time":"2024-10-06T04:11:05.511733155Z"}
{"log":"2024-10-06T04:11:05.511Z  INFO 1 --- [ProxyService-14] e.o.c.b.d.p.ProxySharingDispatcher       : [user=6225df14-9fee-4065-aa8a-89fd2030ebc9 proxyId=bc147863-4493-4d0f-8818-3596cdff3b49 specId=chatstata] Seat not immediately available\n","stream":"stdout","time":"2024-10-06T04:11:05.511764657Z"}
{"log":"2024-10-06T04:11:05.511Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata] Scale up required, trying to create 1 DelegateProxies\n","stream":"stdout","time":"2024-10-06T04:11:05.515448228Z"}
{"log":"2024-10-06T04:11:05.511Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=dde145cf-53ce-4c6d-9a1e-c3bcb07dcb50] Creating DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:11:05.515468885Z"}
{"log":"2024-10-06T04:11:05.512Z  INFO 1 --- [haringScaler-14] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=dde145cf-53ce-4c6d-9a1e-c3bcb07dcb50] Preparing DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:11:05.515471826Z"}
{"log":"2024-10-06T04:11:05.512Z  INFO 1 --- [haringScaler-14] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=dde145cf-53ce-4c6d-9a1e-c3bcb07dcb50] Starting DelegateProxy\n","stream":"stdout","time":"2024-10-06T04:11:05.515474718Z"}
{"log":"2024-10-06T04:11:06.679Z  INFO 1 --- [haringScaler-16] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=3d95455c-03c8-472d-902d-16481d27a6a8 specId=chatstata] Docker Swarm Service not ready yet, trying again (16/40)\n","stream":"stdout","time":"2024-10-06T04:11:06.679512587Z"}
{"log":"2024-10-06T04:11:08.687Z  INFO 1 --- [haringScaler-16] e.o.c.backend.docker.DockerSwarmBackend  : [user=null proxyId=3d95455c-03c8-472d-902d-16481d27a6a8 specId=chatstata] Docker Swarm Service not ready yet, trying again (17/40)\n","stream":"stdout","time":"2024-10-06T04:11:08.688065216Z"}

Hi, thank you for your detailed questions!

Could this problem be related to a Shiny app running a long or infinite process, leading to Docker becoming unresponsive or causing a resource bottleneck?

ShinyProxy is able to run apps for multiple days or weeks; this should not be a problem.

Given the configurations I provided, are there any best practices I might be missing to ensure better container management and stability?

You mention that you are using Docker Swarm, but it seems you are using only a single node. In that case I strongly advise using the regular Docker backend and disabling swarm mode on the Docker daemon. In general, the Docker backend/engine is more stable than Docker Swarm.
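
For reference, a minimal sketch of that change in application.yml (only the relevant keys shown):

proxy:
  container-backend: docker   # instead of docker-swarm
  docker:
    internal-networking: true

And, if the host no longer needs to be part of a swarm, swarm mode can be disabled on the daemon with:

sudo docker swarm leave --force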

Could the use of openid authentication be contributing to this issue, especially since the server without authentication (none ) does not face the same problem?

No, it should not cause the issue. Usually, when a blank page is shown, ShinyProxy cannot communicate with the container, either because the container disappeared or because of some network issue. When this happens, you could check the admin panel, find the IP of the broken app, and check whether the container associated with this IP still exists.
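
For example, a quick way to match the IP shown in the admin panel against the containers on the host (a sketch; sp-example-net is the network from your configuration):

# list running containers together with their network IPs
sudo docker ps -q | xargs sudo docker inspect --format '{{.Name}} {{range .NetworkSettings.Networks}}{{.IPAddress}} {{end}}'

# or inspect the app network directly
sudo docker network inspect sp-example-net --format '{{range .Containers}}{{.Name}} {{.IPv4Address}}{{"\n"}}{{end}}'

If the IP of the broken app does not appear in this output, the container is gone and ShinyProxy can no longer reach it.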

Thank you for your detailed reply!

I have switched from Docker Swarm to the regular Docker backend, as recommended. However, the issue continues to appear every day or two. When it happens, I can log in and the navbar is visible, but after clicking the “Admin” button to open the admin page, everything below the navbar remains blank, including the section where app IP addresses should be displayed.

Additionally, I have another server running the same Shiny app setup, and this server doesn’t encounter this issue. Key differences on the stable server include setting the login method to “none,” limiting maximum session data to 20, restricting file uploads to 1MB, and having fewer app functionalities. These missing functionalities in the stable server may involve extensive R calculations, which makes me wonder: could heavy R computations cause the entire Docker setup to crash?

However, I would have thought that a single Docker container crash wouldn’t affect other containers or cause all user containers to disappear. Once the issue happens, only the ShinyProxy container and the pre-initialized containers remain visible in the Docker backend, as shown in the logs I previously shared.

After each failure, I can restore the service by running the following commands to restart the ShinyProxy container:

# force-remove all containers, including ShinyProxy and any stuck app containers
sudo docker rm -f $(sudo docker ps -aq)
# start a fresh ShinyProxy container with read-only access to the Docker socket, on the sp-example-net network
sudo docker run -d -v /var/run/docker.sock:/var/run/docker.sock:ro --group-add $(getent group docker | cut -d: -f3) --net sp-example-net -p 8080:8080 zhaohongxin0/mstatavip_shinyproxy

Any insights or recommendations you might have would be very helpful. Thank you again!

By the way, here is the ShinyProxy log output from just before the issue occurred:

"log":"\n","stream":"stdout","time":"2024-10-30T16:11:18.528224025Z"}
{"log":"2024-10-30T16:11:22.181Z ERROR 1 --- [   XNIO-1 I/O-1] io.undertow.request                      : UT005071: Undertow request failed HttpServerExchange{ CONNECT dnspod.qcloud.com:443}\n","stream":"stdout","time":"2024-10-30T16:11:22.181612259Z"}
{"log":"\n","stream":"stdout","time":"2024-10-30T16:11:22.181641083Z"}
{"log":"java.lang.IllegalArgumentException: UT000068: Servlet path match failed\n","stream":"stdout","time":"2024-10-30T16:11:22.181644299Z"}
{"log":"\u0009at io.undertow.servlet.handlers.ServletPathMatchesData.getServletHandlerByPath(ServletPathMatchesData.java:83) ~[undertow-servlet-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181647294Z"}
{"log":"\u0009at io.undertow.servlet.handlers.ServletPathMatches.getServletHandlerByPath(ServletPathMatches.java:133) ~[undertow-servlet-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.18165043Z"}
{"log":"\u0009at io.undertow.servlet.handlers.ServletInitialHandler.handleRequest(ServletInitialHandler.java:148) ~[undertow-servlet-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181653185Z"}
{"log":"\u0009at io.undertow.server.handlers.HttpContinueReadHandler.handleRequest(HttpContinueReadHandler.java:69) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181655911Z"}
{"log":"\u0009at org.springframework.boot.web.embedded.undertow.DeploymentManagerHttpHandlerFactory$DeploymentManagerHandler.handleRequest(DeploymentManagerHttpHandlerFactory.java:74) ~[spring-boot-3.2.6.jar!/:3.2.6]\n","stream":"stdout","time":"2024-10-30T16:11:22.181658636Z"}
{"log":"\u0009at io.undertow.server.handlers.ProxyPeerAddressHandler.handleRequest(ProxyPeerAddressHandler.java:129) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181661361Z"}
{"log":"\u0009at io.undertow.server.Connectors.executeRootHandler(Connectors.java:393) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181675357Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpReadListener.handleEventWithNoRunningRequest(HttpReadListener.java:265) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181678152Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpReadListener.handleEvent(HttpReadListener.java:136) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181680887Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpOpenListener.handleEvent(HttpOpenListener.java:162) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181711184Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpOpenListener.handleEvent(HttpOpenListener.java:100) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.18171419Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpOpenListener.handleEvent(HttpOpenListener.java:57) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181717777Z"}
{"log":"\u0009at org.xnio.ChannelListeners.invokeChannelListener(ChannelListeners.java:92) ~[xnio-api-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181720452Z"}
{"log":"\u0009at org.xnio.ChannelListeners$10.handleEvent(ChannelListeners.java:291) ~[xnio-api-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181723066Z"}
{"log":"\u0009at org.xnio.ChannelListeners$10.handleEvent(ChannelListeners.java:286) ~[xnio-api-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181725671Z"}
{"log":"\u0009at org.xnio.ChannelListeners.invokeChannelListener(ChannelListeners.java:92) ~[xnio-api-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181728256Z"}
{"log":"\u0009at org.xnio.nio.QueuedNioTcpServer2.acceptTask(QueuedNioTcpServer2.java:178) ~[xnio-nio-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181731061Z"}
{"log":"\u0009at org.xnio.nio.WorkerThread.safeRun(WorkerThread.java:612) ~[xnio-nio-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181733656Z"}
{"log":"\u0009at org.xnio.nio.WorkerThread.run(WorkerThread.java:479) ~[xnio-nio-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.181736251Z"}
{"log":"\n","stream":"stdout","time":"2024-10-30T16:11:22.181738836Z"}
{"log":"2024-10-30T16:11:22.537Z ERROR 1 --- [   XNIO-1 I/O-2] io.undertow.request                      : UT005071: Undertow request failed HttpServerExchange{ CONNECT example.com:443}\n","stream":"stdout","time":"2024-10-30T16:11:22.537964857Z"}
{"log":"\n","stream":"stdout","time":"2024-10-30T16:11:22.53798762Z"}
{"log":"java.lang.IllegalArgumentException: UT000068: Servlet path match failed\n","stream":"stdout","time":"2024-10-30T16:11:22.537991657Z"}
{"log":"\u0009at io.undertow.servlet.handlers.ServletPathMatchesData.getServletHandlerByPath(ServletPathMatchesData.java:83) ~[undertow-servlet-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.537995154Z"}
{"log":"\u0009at io.undertow.servlet.handlers.ServletPathMatches.getServletHandlerByPath(ServletPathMatches.java:133) ~[undertow-servlet-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.537999352Z"}
{"log":"\u0009at io.undertow.servlet.handlers.ServletInitialHandler.handleRequest(ServletInitialHandler.java:148) ~[undertow-servlet-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538003179Z"}
{"log":"\u0009at io.undertow.server.handlers.HttpContinueReadHandler.handleRequest(HttpContinueReadHandler.java:69) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538006816Z"}
{"log":"\u0009at org.springframework.boot.web.embedded.undertow.DeploymentManagerHttpHandlerFactory$DeploymentManagerHandler.handleRequest(DeploymentManagerHttpHandlerFactory.java:74) ~[spring-boot-3.2.6.jar!/:3.2.6]\n","stream":"stdout","time":"2024-10-30T16:11:22.538021513Z"}
{"log":"\u0009at io.undertow.server.handlers.ProxyPeerAddressHandler.handleRequest(ProxyPeerAddressHandler.java:129) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538026212Z"}
{"log":"\u0009at io.undertow.server.Connectors.executeRootHandler(Connectors.java:393) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538029509Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpReadListener.handleEventWithNoRunningRequest(HttpReadListener.java:265) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538032885Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpReadListener.handleEvent(HttpReadListener.java:136) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538036232Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpOpenListener.handleEvent(HttpOpenListener.java:162) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538039648Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpOpenListener.handleEvent(HttpOpenListener.java:100) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538043185Z"}
{"log":"\u0009at io.undertow.server.protocol.http.HttpOpenListener.handleEvent(HttpOpenListener.java:57) ~[undertow-core-2.3.13.Final.jar!/:2.3.13.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538047543Z"}
{"log":"\u0009at org.xnio.ChannelListeners.invokeChannelListener(ChannelListeners.java:92) ~[xnio-api-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538050929Z"}
{"log":"\u0009at org.xnio.ChannelListeners$10.handleEvent(ChannelListeners.java:291) ~[xnio-api-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538054225Z"}
{"log":"\u0009at org.xnio.ChannelListeners$10.handleEvent(ChannelListeners.java:286) ~[xnio-api-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538057461Z"}
{"log":"\u0009at org.xnio.ChannelListeners.invokeChannelListener(ChannelListeners.java:92) ~[xnio-api-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538060708Z"}
{"log":"\u0009at org.xnio.nio.QueuedNioTcpServer2.acceptTask(QueuedNioTcpServer2.java:178) ~[xnio-nio-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538064284Z"}
{"log":"\u0009at org.xnio.nio.WorkerThread.safeRun(WorkerThread.java:612) ~[xnio-nio-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.5380675Z"}
{"log":"\u0009at org.xnio.nio.WorkerThread.run(WorkerThread.java:479) ~[xnio-nio-3.8.8.Final.jar!/:3.8.8.Final]\n","stream":"stdout","time":"2024-10-30T16:11:22.538070807Z"}
{"log":"\n","stream":"stdout","time":"2024-10-30T16:11:22.538074153Z"}

A simple crash of the process inside a container should not make the other containers crash. However, if the server is running out of memory, the kernel may start killing processes, and therefore other containers may crash (OOM kill).
You can prevent this by setting container-memory-request to the same value as container-memory-limit; this way the system reserves the memory for the container and you are sure that you don’t over-commit your memory on the server.
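
For example, applied to one of the app specs above (only the relevant keys shown):

  - id: mstatavip
    container-image: zhaohongxin0/mstatavip2
    container-memory-request: 6g   # reserved up front, equal to the limit
    container-memory-limit: 6g
    container-cpu-limit: 4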

Regarding the error message: it seems your ShinyProxy server tries to connect to dnspod.qcloud.com, which appears to be a DNS server of your cloud provider. I’m not sure why ShinyProxy would connect to this; it is expected to connect to your OpenID provider. The error seems related to UT000068: Servlet path match failed, which could be caused by using a wrong domain/path to access ShinyProxy.

Are you using any load balancer, proxy, WAF, or something like a CDN in front of ShinyProxy?

Frequent ShinyProxy Issues: High CPU Usage and Blank App Screens

Over the past few weeks, I’ve observed a recurring issue with my ShinyProxy setup. These problems occur almost daily, at irregular times. They tend to arise when someone accesses the app but don’t occur during idle hours, such as early mornings when no one is using the system.

The problem starts when a user logs into the system. Although they can successfully log in and launch the app, the app itself fails to load. Instead, after the “Launching App…” message appears below the navigation bar, the screen turns completely white and remains blank.

During these incidents, I’ve checked the server and sometimes noticed that the CPU usage for the Docker container running ShinyProxy spikes to 100%, or even up to 500% (the server has 16 cores). When this happens, the app becomes entirely inaccessible.
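
The checks look roughly like this (the container name shinyproxy is a placeholder for the actual ShinyProxy container):

# per-container CPU and memory usage on the host
sudo docker stats --no-stream

# processes running inside the ShinyProxy container
sudo docker exec -it shinyproxy top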

When I inspect the container and run the top command, I see that the high CPU usage (500%) is caused by the Java process, which is ShinyProxy itself. However, upon reviewing the ShinyProxy logs, I don’t find anything unusual. The last few lines in the logs during these incidents typically show messages like “Stopping DelegateProxy”:

og":"2024-12-02T15:08:18.635Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip] Scale up required, trying to create 1 DelegateProxies\n","stream":"stdout","time":"2024-12-02T15:08:18.635160502Z"}
{"log":"2024-12-02T15:08:18.635Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=b97f1a62-675a-4db8-a5ce-5bb893ecc601] Creating DelegateProxy\n","stream":"stdout","time":"2024-12-02T15:08:18.635168207Z"}
{"log":"2024-12-02T15:08:18.635Z  INFO 1 --- [ProxyService-17] e.o.containerproxy.service.ProxyService  : [user=18080947637 proxyId=1f15ce76-fbed-4545-8d95-4de6a182d13d specId=mstatavip] Proxy activated\n","stream":"stdout","time":"2024-12-02T15:08:18.635171483Z"}
{"log":"2024-12-02T15:08:18.635Z  INFO 1 --- [haringScaler-16] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=b97f1a62-675a-4db8-a5ce-5bb893ecc601] Preparing DelegateProxy\n","stream":"stdout","time":"2024-12-02T15:08:18.635243408Z"}
{"log":"2024-12-02T15:08:18.635Z  INFO 1 --- [haringScaler-16] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=b97f1a62-675a-4db8-a5ce-5bb893ecc601] Starting DelegateProxy\n","stream":"stdout","time":"2024-12-02T15:08:18.635392679Z"}
{"log":"2024-12-02T15:08:18.661Z  INFO 1 --- [haringScaler-16] o.m.docker.client.DefaultDockerClient    : Starting container with Id: 26faa0c1786db36e32d6cef818a6469f51549e4e373db992812ea9fdfc6bcc4a\n","stream":"stdout","time":"2024-12-02T15:08:18.661855548Z"}
{"log":"2024-12-02T15:08:23.853Z  WARN 1 --- [haringScaler-16] e.o.shinyproxy.ShinyProxyTestStrategy    : [user=null proxyId=b97f1a62-675a-4db8-a5ce-5bb893ecc601 specId=mstatavip] Container unresponsive, trying again (11/210): http://26faa0c1786d:3838\n","stream":"stdout","time":"2024-12-02T15:08:23.85319562Z"}
{"log":"2024-12-02T15:08:25.853Z  WARN 1 --- [haringScaler-16] e.o.shinyproxy.ShinyProxyTestStrategy    : [user=null proxyId=b97f1a62-675a-4db8-a5ce-5bb893ecc601 specId=mstatavip] Container unresponsive, trying again (12/210): http://26faa0c1786d:3838\n","stream":"stdout","time":"2024-12-02T15:08:25.853742403Z"}
{"log":"2024-12-02T15:08:32.042Z  INFO 1 --- [haringScaler-16] e.o.shinyproxy.ShinyProxyTestStrategy    : [user=null proxyId=b97f1a62-675a-4db8-a5ce-5bb893ecc601 specId=mstatavip] Container responsive\n","stream":"stdout","time":"2024-12-02T15:08:32.042688286Z"}
{"log":"2024-12-02T15:08:32.042Z  INFO 1 --- [haringScaler-16] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=b97f1a62-675a-4db8-a5ce-5bb893ecc601 seatId=df0eb44d-068f-48d0-9d67-8a3a549737e5] Created Seat\n","stream":"stdout","time":"2024-12-02T15:08:32.042746696Z"}
{"log":"2024-12-02T15:08:32.042Z  INFO 1 --- [haringScaler-16] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=b97f1a62-675a-4db8-a5ce-5bb893ecc601] Started DelegateProxy\n","stream":"stdout","time":"2024-12-02T15:08:32.042787943Z"}
{"log":"2024-12-02T15:08:47.630Z  INFO 1 --- [GlobalEventLoop] e.o.c.s.hearbeat.ActiveProxiesService    : [user=15802853874 proxyId=1f8a0247-1d41-46ca-99f9-2f4ea9b0de94 specId=mstatavip] Releasing inactive proxy [silence: 193596ms]\n","stream":"stdout","time":"2024-12-02T15:08:47.630548299Z"}
{"log":"2024-12-02T15:08:47.630Z  INFO 1 --- [ProxyService-17] e.o.c.b.d.p.ProxySharingDispatcher       : [user=15802853874 proxyId=1f8a0247-1d41-46ca-99f9-2f4ea9b0de94 specId=mstatavip delegateProxyId=e3ecc487-d7a4-40ed-a36c-2bd40415c312 seatId=efa8d68a-2d6b-4f74-baa1-92c46dbf9417] Seat released\n","stream":"stdout","time":"2024-12-02T15:08:47.630715924Z"}
{"log":"2024-12-02T15:08:47.630Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=e3ecc487-d7a4-40ed-a36c-2bd40415c312] DelegateProxy cannot be re-used, marking for removal\n","stream":"stdout","time":"2024-12-02T15:08:47.630826571Z"}
{"log":"2024-12-02T15:08:47.630Z  INFO 1 --- [ProxyService-17] e.o.containerproxy.service.ProxyService  : [user=15802853874 proxyId=1f8a0247-1d41-46ca-99f9-2f4ea9b0de94 specId=mstatavip] Proxy released\n","stream":"stdout","time":"2024-12-02T15:08:47.630833064Z"}
{"log":"2024-12-02T15:08:47.630Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=e3ecc487-d7a4-40ed-a36c-2bd40415c312 seatId=efa8d68a-2d6b-4f74-baa1-92c46dbf9417] Removed seat\n","stream":"stdout","time":"2024-12-02T15:08:47.63083635Z"}
{"log":"2024-12-02T15:08:48.009Z  INFO 1 --- [ProxyService-18] e.o.c.b.d.p.ProxySharingDispatcher       : [user=18080947637 proxyId=2e4b21f1-e370-411f-9ef6-42b477b1ac90 specId=chatstata delegateProxyId=bd393ecf-a8f3-4d59-b2ac-c38de8878ebc seatId=4f9d4eba-dbb4-4448-bf8d-324b6ef30ebc] Seat released\n","stream":"stdout","time":"2024-12-02T15:08:48.009362101Z"}
{"log":"2024-12-02T15:08:48.009Z  INFO 1 --- [  XNIO-1 task-6] e.o.containerproxy.service.UserService   : User logged out [user: 18080947637]\n","stream":"stdout","time":"2024-12-02T15:08:48.009388901Z"}
{"log":"2024-12-02T15:08:48.009Z  INFO 1 --- [ProxyService-19] e.o.c.b.d.p.ProxySharingDispatcher       : [user=18080947637 proxyId=1f15ce76-fbed-4545-8d95-4de6a182d13d specId=mstatavip delegateProxyId=e61ca3d8-7bdf-4486-b6c3-799cdbaef857 seatId=90de97e7-6b13-4877-9728-387523cf92ed] Seat released\n","stream":"stdout","time":"2024-12-02T15:08:48.009398068Z"}
{"log":"2024-12-02T15:08:48.009Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=bd393ecf-a8f3-4d59-b2ac-c38de8878ebc] DelegateProxy cannot be re-used, marking for removal\n","stream":"stdout","time":"2024-12-02T15:08:48.009401825Z"}
{"log":"2024-12-02T15:08:48.009Z  INFO 1 --- [ProxyService-18] e.o.containerproxy.service.ProxyService  : [user=18080947637 proxyId=2e4b21f1-e370-411f-9ef6-42b477b1ac90 specId=chatstata] Proxy released\n","stream":"stdout","time":"2024-12-02T15:08:48.009428415Z"}
{"log":"2024-12-02T15:08:48.009Z  INFO 1 --- [ProxyService-19] e.o.containerproxy.service.ProxyService  : [user=18080947637 proxyId=1f15ce76-fbed-4545-8d95-4de6a182d13d specId=mstatavip] Proxy released\n","stream":"stdout","time":"2024-12-02T15:08:48.009433835Z"}
{"log":"2024-12-02T15:08:48.009Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=bd393ecf-a8f3-4d59-b2ac-c38de8878ebc seatId=4f9d4eba-dbb4-4448-bf8d-324b6ef30ebc] Removed seat\n","stream":"stdout","time":"2024-12-02T15:08:48.009437542Z"}
{"log":"2024-12-02T15:08:48.009Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=e61ca3d8-7bdf-4486-b6c3-799cdbaef857] DelegateProxy cannot be re-used, marking for removal\n","stream":"stdout","time":"2024-12-02T15:08:48.009473289Z"}
{"log":"2024-12-02T15:08:48.009Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=e61ca3d8-7bdf-4486-b6c3-799cdbaef857 seatId=90de97e7-6b13-4877-9728-387523cf92ed] Removed seat\n","stream":"stdout","time":"2024-12-02T15:08:48.009478018Z"}
{"log":"2024-12-02T15:08:49.570Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=69b2b0bd-c7d8-42f5-9c5d-e6b6f96f1d92] Stopping DelegateProxy\n","stream":"stdout","time":"2024-12-02T15:08:49.571106597Z"}
{"log":"2024-12-02T15:08:49.801Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=e61ca3d8-7bdf-4486-b6c3-799cdbaef857] Stopping DelegateProxy\n","stream":"stdout","time":"2024-12-02T15:08:49.801352111Z"}
{"log":"2024-12-02T15:08:50.015Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=mstatavip delegateProxyId=e3ecc487-d7a4-40ed-a36c-2bd40415c312] Stopping DelegateProxy\n","stream":"stdout","time":"2024-12-02T15:08:50.016101227Z"}
{"log":"2024-12-02T15:08:50.351Z  INFO 1 --- [GlobalEventLoop] e.o.c.b.d.p.ProxySharingScaler           : [specId=chatstata delegateProxyId=bd393ecf-a8f3-4d59-b2ac-c38de8878ebc] Stopping DelegateProxy\n","stream":"stdout","time":"2024-12-02T15:08:50.352008571Z"}

Attempts to Troubleshoot

I’ve tried numerous approaches to resolve this issue:

  1. Configuration Changes: I’ve adjusted various ShinyProxy settings.
  2. Server Upgrade: I migrated to a high-performance server with 16 cores and 64GB RAM.
  3. Authentication Changes: I switched the login mechanism from OpenID to none. While this seemed to help sometimes, the issue still persists intermittently.
  4. App Structure: My app is relatively large and includes multiple modules, which might be a factor.

Interestingly, I didn’t experience these problems when I used Docker Swarm to distribute the app across multiple servers. However, after consolidating everything onto a single high-performance server (whether using Docker Swarm or a standard Docker setup), the issue began to occur.

Questions

What could be causing these issues? Could the size or complexity of the app be overloading ShinyProxy, even with a high-spec server? Or is there something specific about my current setup that’s triggering this behavior?

Any insights or suggestions would be greatly appreciated.

Hi, I have never seen this issue before, but it seems one or more threads are stuck in an infinite loop. Therefore, it would be useful to get a Java thread dump.

Could you execute the command jstack 1 in the ShinyProxy container as soon as the issue occurs and then send the output? If you prefer, you can also send the output as a private message.
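
For example, from the host (assuming the ShinyProxy container is named shinyproxy):

sudo docker exec shinyproxy jstack 1 > shinyproxy-threaddump.txt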

shinyproxy@820ce0ae7a41:/opt/shinyproxy$ jstack 1
2024-12-05 14:19:25
Full thread dump OpenJDK 64-Bit Server VM (17.0.11+9 mixed mode, sharing):

Threads class SMR info:
_java_thread_list=0x0000726834001fb0, length=139, elements={
0x0000726bac1e63b0, 0x0000726bac1e77a0, 0x0000726bac1ecc40, 0x0000726bac1ee000,
0x0000726bac1ef420, 0x0000726bac1f0e60, 0x0000726bac1f23a0, 0x0000726bac1f3820,
0x0000726bac1fb610, 0x0000726bac1fe3e0, 0x0000726bac276d60, 0x0000726bade84890,
0x0000726bade7fa30, 0x0000726bade80780, 0x0000726bade816d0, 0x0000726bade91970,
0x0000726bade92a50, 0x0000726bade93a60, 0x0000726bade94a70, 0x0000726bade95a80,
0x0000726bade96a90, 0x0000726bade97990, 0x0000726bade98890, 0x0000726bade999b0,
0x0000726bade9aad0, 0x0000726bade9bc00, 0x0000726bade9cd30, 0x0000726badee2b70,
0x0000726badf5b020, 0x0000726badf5ce30, 0x0000726badf5df00, 0x0000726badf5ef00,
0x0000726badf60060, 0x0000726badf611d0, 0x0000726badf62340, 0x0000726badf65e20,
0x0000726badf66bc0, 0x0000726badf67b60, 0x0000726badf68a10, 0x0000726badf69820,
0x0000726badf6a7d0, 0x0000726badf6b6d0, 0x0000726badf6c610, 0x0000726badf6d710,
0x0000726baceff690, 0x0000726bacf00e20, 0x0000726bacf01960, 0x0000726bacf02960,
0x0000726bacf03b40, 0x0000726bacf04b40, 0x0000726bacf05d30, 0x0000726bacf40bd0,
0x0000726bacf41d50, 0x0000726bacf42cd0, 0x0000726bacf43cd0, 0x0000726bacf44f50,
0x0000726bacf46170, 0x0000726bacf47170, 0x0000726bacf075b0, 0x0000726bacf08120,
0x0000726bacf090a0, 0x0000726bacf0a180, 0x0000726bae8918f0, 0x0000726bae94f480,
0x0000726bae950130, 0x0000726bae951130, 0x0000726bae9523b0, 0x0000726bae953630,
0x0000726bae954630, 0x0000726bae955630, 0x0000726bae956630, 0x0000726bae957630,
0x0000726bae958630, 0x0000726bae959610, 0x0000726bae95a6a0, 0x0000726bae95b8e0,
0x0000726bae95c8e0, 0x0000726bae95d8e0, 0x0000726bae95e8e0, 0x0000726bae95fbc0,
0x0000726898020100, 0x0000726bae992420, 0x0000726bae9932e0, 0x0000726bae99df30,
0x0000726bae99ee10, 0x0000726bae9e0910, 0x0000726bae9e1720, 0x0000726bae9e2530,
0x0000726bae9e33b0, 0x0000726bae9e42f0, 0x0000726bae9e5230, 0x0000726bae9e6170,
0x0000726bae9e70b0, 0x0000726bae9e7ff0, 0x0000726bae9e8fb0, 0x0000726bae9e9f50,
0x0000726bae9eb2c0, 0x0000726bae9ec630, 0x0000726bae9aab40, 0x0000726bac025210,
0x00007269d00e3520, 0x0000726abc1116c0, 0x00007268ec005540, 0x00007268c4005190,
0x000072689800ff20, 0x00007268c80052c0, 0x0000726ab01c70d0, 0x00007268c401f8f0,
0x00007268b000a070, 0x000072687c0028d0, 0x000072687c003730, 0x0000726880001e20,
0x000072688812d110, 0x000072687c004520, 0x0000726880002c00, 0x000072688c003a00,
0x00007269cc003ea0, 0x00007268c80035d0, 0x00007268a4016db0, 0x00007268d0009460,
0x00007268c4009230, 0x00007268c8004bd0, 0x00007268dc00cae0, 0x00007268c800ca20,
0x00007268e819e8d0, 0x0000726894006740, 0x00007268e0012640, 0x00007268e81a07b0,
0x00007268e81a1150, 0x0000726af006d4e0, 0x0000726a14001a70, 0x0000726a9c0671b0,
0x00007268b001c8c0, 0x00007268bc004720, 0x00007268e0018560, 0x00007268a400cdc0,
0x00007269cc003600, 0x00007269cc0a8640, 0x00007269cc0a8bc0
}

"Reference Handler" #2 daemon prio=10 os_prio=0 cpu=13.17ms elapsed=65963.35s tid=0x0000726bac1e63b0 nid=0xe waiting on condition  [0x0000726b6cdee000]
   java.lang.Thread.State: RUNNABLE
        at java.lang.ref.Reference.waitForReferencePendingList(java.base@17.0.11/Native Method)
        at java.lang.ref.Reference.processPendingReferences(java.base@17.0.11/Reference.java:253)
        at java.lang.ref.Reference$ReferenceHandler.run(java.base@17.0.11/Reference.java:215)

"Finalizer" #3 daemon prio=8 os_prio=0 cpu=31.16ms elapsed=65963.35s tid=0x0000726bac1e77a0 nid=0xf in Object.wait()  [0x0000726b6ccee000]
   java.lang.Thread.State: WAITING (on object monitor)
        at java.lang.Object.wait(java.base@17.0.11/Native Method)
        - waiting on <no object reference available>
        at java.lang.ref.ReferenceQueue.remove(java.base@17.0.11/ReferenceQueue.java:155)
        - locked <0x000000041fabf798> (a java.lang.ref.ReferenceQueue$Lock)
        at java.lang.ref.ReferenceQueue.remove(java.base@17.0.11/ReferenceQueue.java:176)
        at java.lang.ref.Finalizer$FinalizerThread.run(java.base@17.0.11/Finalizer.java:172)

"Signal Dispatcher" #4 daemon prio=9 os_prio=0 cpu=0.36ms elapsed=65963.34s tid=0x0000726bac1ecc40 nid=0x10 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Service Thread" #5 daemon prio=9 os_prio=0 cpu=52.72ms elapsed=65963.34s tid=0x0000726bac1ee000 nid=0x11 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Monitor Deflation Thread" #6 daemon prio=9 os_prio=0 cpu=4164.55ms elapsed=65963.34s tid=0x0000726bac1ef420 nid=0x12 runnable  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread0" #7 daemon prio=9 os_prio=0 cpu=43593.17ms elapsed=65963.34s tid=0x0000726bac1f0e60 nid=0x13 waiting on condition  [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE
   No compile task



......
......
......

<<< A large section of the log has been removed here due to length constraints >>>
......
......
......

"G1 Conc#2" os_prio=0 cpu=127.11ms elapsed=65960.95s tid=0x0000726b64001650 nid=0x33 runnable

"G1 Refine#0" os_prio=0 cpu=3.39ms elapsed=65963.37s tid=0x0000726bac1af580 nid=0xb runnable

"G1 Service" os_prio=0 cpu=6607.96ms elapsed=65963.37s tid=0x0000726bac1b0480 nid=0xc runnable

"VM Periodic Task Thread" os_prio=0 cpu=19580.03ms elapsed=65963.35s tid=0x0000726bac1fcf60 nid=0x17 waiting on condition

JNI global refs: 24, weak refs: 0


Found one Java-level deadlock:
=============================
"AsyncProxyService-1":
  waiting to lock monitor 0x0000726a9c6f3d20 (object 0x00000004201eac50, a com.google.common.collect.Synchronized$SynchronizedListMultimap),
  which is held by "AsyncProxyService-15"

"AsyncProxyService-15":
  waiting to lock monitor 0x000072690c014620 (object 0x00000004201eabc8, a com.google.common.collect.Synchronized$SynchronizedListMultimap),
  which is held by "XNIO-1 task-4"

"XNIO-1 task-4":
  waiting to lock monitor 0x0000726a9c6f3d20 (object 0x00000004201eac50, a com.google.common.collect.Synchronized$SynchronizedListMultimap),
  which is held by "AsyncProxyService-15"

Java stack information for the threads listed above:
===================================================
"AsyncProxyService-1":
        at com.google.common.collect.Synchronized$SynchronizedListMultimap.get(Synchronized.java:803)
        - waiting to lock <0x00000004201eac50> (a com.google.common.collect.Synchronized$SynchronizedListMultimap)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService.onProxyStoppedEvent(HeartbeatService.java:134)
        at jdk.internal.reflect.GeneratedMethodAccessor304.invoke(Unknown Source)
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(java.base@17.0.11/DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(java.base@17.0.11/Method.java:568)
        at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:354)
        at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:716)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$$SpringCGLIB$$1.onProxyStoppedEvent(<generated>)
        at jdk.internal.reflect.GeneratedMethodAccessor304.invoke(Unknown Source)
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(java.base@17.0.11/DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(java.base@17.0.11/Method.java:568)
        at org.springframework.context.event.ApplicationListenerMethodAdapter.doInvoke(ApplicationListenerMethodAdapter.java:365)
        at org.springframework.context.event.ApplicationListenerMethodAdapter.processEvent(ApplicationListenerMethodAdapter.java:237)
        at org.springframework.context.event.ApplicationListenerMethodAdapter.onApplicationEvent(ApplicationListenerMethodAdapter.java:168)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:185)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:178)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:156)
        at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:451)
        at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:384)
        at eu.openanalytics.containerproxy.service.ProxyService.lambda$stopProxy$7(ProxyService.java:353)
        at eu.openanalytics.containerproxy.service.ProxyService$$Lambda$1731/0x0000726b109d1e38.run(Unknown Source)
        at eu.openanalytics.containerproxy.service.ProxyService.lambda$action$12(ProxyService.java:638)
        at eu.openanalytics.containerproxy.service.ProxyService$$Lambda$1685/0x0000726b109af160.run(Unknown Source)
        at java.util.concurrent.Executors$RunnableAdapter.call(java.base@17.0.11/Executors.java:539)
        at java.util.concurrent.FutureTask.run(java.base@17.0.11/FutureTask.java:264)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(java.base@17.0.11/ThreadPoolExecutor.java:1136)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(java.base@17.0.11/ThreadPoolExecutor.java:635)
        at java.lang.Thread.run(java.base@17.0.11/Thread.java:840)
"AsyncProxyService-15":
        at com.google.common.collect.Synchronized$SynchronizedMultimap.remove(Synchronized.java:684)
        - waiting to lock <0x00000004201eabc8> (a com.google.common.collect.Synchronized$SynchronizedListMultimap)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService.onConnectionClosed(HeartbeatService.java:148)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$HeartbeatConnector.lambda$new$0(HeartbeatService.java:187)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$HeartbeatConnector$$Lambda$1709/0x0000726b109c62d8.handleEvent(Unknown Source)
        at org.xnio.StreamConnection.lambda$mergeListeners$1(StreamConnection.java:82)
        at org.xnio.StreamConnection$$Lambda$1710/0x0000726b109c6500.handleEvent(Unknown Source)
        at org.xnio.ChannelListeners.invokeChannelListener(ChannelListeners.java:92)
        at org.xnio.StreamConnection.invokeCloseListener(StreamConnection.java:132)
        at org.xnio.Connection.close(Connection.java:142)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$HeartbeatConnector.closeConnection(HeartbeatService.java:247)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$$Lambda$1735/0x0000726b109d2b20.accept(Unknown Source)
        at java.lang.Iterable.forEach(java.base@17.0.11/Iterable.java:75)
        at com.google.common.collect.Synchronized$SynchronizedCollection.forEach(Synchronized.java:206)
        - locked <0x00000004201eac50> (a com.google.common.collect.Synchronized$SynchronizedListMultimap)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService.onProxyStoppedEvent(HeartbeatService.java:134)
        at jdk.internal.reflect.GeneratedMethodAccessor304.invoke(Unknown Source)
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(java.base@17.0.11/DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(java.base@17.0.11/Method.java:568)
        at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:354)
        at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:716)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$$SpringCGLIB$$1.onProxyStoppedEvent(<generated>)
        at jdk.internal.reflect.GeneratedMethodAccessor304.invoke(Unknown Source)
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(java.base@17.0.11/DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(java.base@17.0.11/Method.java:568)
        at org.springframework.context.event.ApplicationListenerMethodAdapter.doInvoke(ApplicationListenerMethodAdapter.java:365)
        at org.springframework.context.event.ApplicationListenerMethodAdapter.processEvent(ApplicationListenerMethodAdapter.java:237)
        at org.springframework.context.event.ApplicationListenerMethodAdapter.onApplicationEvent(ApplicationListenerMethodAdapter.java:168)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:185)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:178)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:156)
        at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:451)
        at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:384)
        at eu.openanalytics.containerproxy.service.ProxyService.lambda$stopProxy$7(ProxyService.java:353)
        at eu.openanalytics.containerproxy.service.ProxyService$$Lambda$1731/0x0000726b109d1e38.run(Unknown Source)
        at eu.openanalytics.containerproxy.service.ProxyService.lambda$action$12(ProxyService.java:638)
        at eu.openanalytics.containerproxy.service.ProxyService$$Lambda$1685/0x0000726b109af160.run(Unknown Source)
        at java.util.concurrent.Executors$RunnableAdapter.call(java.base@17.0.11/Executors.java:539)
        at java.util.concurrent.FutureTask.run(java.base@17.0.11/FutureTask.java:264)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(java.base@17.0.11/ThreadPoolExecutor.java:1136)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(java.base@17.0.11/ThreadPoolExecutor.java:635)
        at java.lang.Thread.run(java.base@17.0.11/Thread.java:840)
"XNIO-1 task-4":
        at com.google.common.collect.Synchronized$SynchronizedMultimap.remove(Synchronized.java:684)
        - waiting to lock <0x00000004201eac50> (a com.google.common.collect.Synchronized$SynchronizedListMultimap)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService.onConnectionClosed(HeartbeatService.java:151)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$HeartbeatConnector.lambda$new$0(HeartbeatService.java:187)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$HeartbeatConnector$$Lambda$1709/0x0000726b109c62d8.handleEvent(Unknown Source)
        at org.xnio.StreamConnection.lambda$mergeListeners$1(StreamConnection.java:82)
        at org.xnio.StreamConnection$$Lambda$1710/0x0000726b109c6500.handleEvent(Unknown Source)
        at org.xnio.ChannelListeners.invokeChannelListener(ChannelListeners.java:92)
        at org.xnio.StreamConnection.invokeCloseListener(StreamConnection.java:132)
        at org.xnio.Connection.close(Connection.java:142)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$HeartbeatConnector.closeConnection(HeartbeatService.java:247)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$$Lambda$1733/0x0000726b109d26c8.accept(Unknown Source)
        at java.lang.Iterable.forEach(java.base@17.0.11/Iterable.java:75)
        at com.google.common.collect.Synchronized$SynchronizedCollection.forEach(Synchronized.java:206)
        - locked <0x00000004201eabc8> (a com.google.common.collect.Synchronized$SynchronizedListMultimap)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService.onSessionDestroyedEvent(HeartbeatService.java:123)
        at jdk.internal.reflect.GeneratedMethodAccessor311.invoke(Unknown Source)
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(java.base@17.0.11/DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(java.base@17.0.11/Method.java:568)
        at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:354)
        at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:716)
        at eu.openanalytics.containerproxy.service.hearbeat.HeartbeatService$$SpringCGLIB$$1.onSessionDestroyedEvent(<generated>)
        at jdk.internal.reflect.GeneratedMethodAccessor311.invoke(Unknown Source)
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(java.base@17.0.11/DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(java.base@17.0.11/Method.java:568)
        at org.springframework.context.event.ApplicationListenerMethodAdapter.doInvoke(ApplicationListenerMethodAdapter.java:365)
        at org.springframework.context.event.ApplicationListenerMethodAdapter.processEvent(ApplicationListenerMethodAdapter.java:237)
        at org.springframework.context.event.ApplicationListenerMethodAdapter.onApplicationEvent(ApplicationListenerMethodAdapter.java:168)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:185)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:178)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:156)
        at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:451)
        at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:384)
        at org.springframework.security.web.session.HttpSessionEventPublisher.extracted(HttpSessionEventPublisher.java:85)
        at org.springframework.security.web.session.HttpSessionEventPublisher.sessionDestroyed(HttpSessionEventPublisher.java:74)
        at io.undertow.servlet.core.ApplicationListeners.sessionDestroyed(ApplicationListeners.java:366)
        at io.undertow.servlet.core.SessionListenerBridge.doDestroy(SessionListenerBridge.java:98)
        at io.undertow.servlet.core.SessionListenerBridge.sessionDestroyed(SessionListenerBridge.java:78)
        at io.undertow.server.session.SessionListeners.sessionDestroyed(SessionListeners.java:61)
        at io.undertow.server.session.InMemorySessionManager$SessionImpl.invalidate(InMemorySessionManager.java:625)
        at io.undertow.server.session.InMemorySessionManager$SessionImpl.invalidate(InMemorySessionManager.java:599)
        at io.undertow.servlet.spec.HttpSessionImpl.invalidate(HttpSessionImpl.java:163)
        at eu.openanalytics.containerproxy.auth.impl.oidc.OpenIdReAuthorizeFilter.invalidateSession(OpenIdReAuthorizeFilter.java:150)
        at eu.openanalytics.containerproxy.auth.impl.oidc.OpenIdReAuthorizeFilter.doFilterInternal(OpenIdReAuthorizeFilter.java:119)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:227)
        at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:221)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:227)
        at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:221)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.oauth2.client.web.OAuth2AuthorizationRequestRedirectFilter.doFilterInternal(OAuth2AuthorizationRequestRedirectFilter.java:181)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:107)
        at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:93)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.web.csrf.CsrfFilter.doFilterInternal(CsrfFilter.java:117)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.web.filter.CorsFilter.doFilterInternal(CorsFilter.java:91)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.web.header.HeaderWriterFilter.doHeadersAfter(HeaderWriterFilter.java:90)
        at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:75)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.web.context.SecurityContextHolderFilter.doFilter(SecurityContextHolderFilter.java:82)
        at org.springframework.security.web.context.SecurityContextHolderFilter.doFilter(SecurityContextHolderFilter.java:69)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:62)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.web.session.ForceEagerSessionCreationFilter.doFilterInternal(ForceEagerSessionCreationFilter.java:45)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.doFilter(ObservationFilterChainDecorator.java:227)
        at org.springframework.security.web.ObservationFilterChainDecorator$VirtualFilterChain.doFilter(ObservationFilterChainDecorator.java:137)
        at org.springframework.security.web.session.DisableEncodeUrlFilter.doFilterInternal(DisableEncodeUrlFilter.java:42)
        at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter.wrapFilter(ObservationFilterChainDecorator.java:240)
        at org.springframework.security.web.ObservationFilterChainDecorator$ObservationFilter$$Lambda$1327/0x0000726b107b2ed8.doFilter(Unknown Source)
        at io.undertow.server.handlers.PredicateHandler.handleRequest(PredicateHandler.java:43)
        at io.undertow.servlet.handlers.ServletInitialHandler.handleFirstRequest(ServletInitialHandler.java:276)
        at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:135)
        at io.undertow.servlet.handlers.ServletInitialHandler$2.call(ServletInitialHandler.java:132)
        at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:48)
        at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
        at io.undertow.servlet.handlers.ServletInitialHandler.dispatchRequest(ServletInitialHandler.java:256)
        at io.undertow.servlet.handlers.ServletInitialHandler$1.handleRequest(ServletInitialHandler.java:101)
        at io.undertow.server.Connectors.executeRootHandler(Connectors.java:393)
        at io.undertow.server.HttpServerExchange$1.run(HttpServerExchange.java:859)
        at org.jboss.threads.ContextHandler$1.runWith(ContextHandler.java:18)
        at org.jboss.threads.EnhancedQueueExecutor$Task.run(EnhancedQueueExecutor.java:2513)
        at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1538)
        at org.xnio.XnioWorker$WorkerThreadFactory$1$1.run(XnioWorker.java:1282)
        at java.lang.Thread.run(java.base@17.0.11/Thread.java:840)

Found 1 deadlock.

Although the CPU was not at 100% utilization, we still could not access the apps. To investigate, I logged into the ShinyProxy Docker container and ran jstack 1; the output is shown above. It appears that there is a deadlock involving the following threads:

  • AsyncProxyService-1
  • AsyncProxyService-15
  • XNIO-1 task-4
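
For reference, this is roughly how I captured the dump; the name filter and container ID below are placeholders, and the actual names in my Swarm setup differ:

docker ps --filter "name=shinyproxy" --format "{{.ID}}  {{.Names}}"            # locate the ShinyProxy container
docker exec <shinyproxy-container-id> jstack 1 > shinyproxy-threaddump.txt     # thread dump of the JVM (PID 1 inside the container)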

From the stack trace, it seems the issue might be related to com.google.common.collect.Synchronized$SynchronizedListMultimap. Since my server is in an environment where access to Google services is restricted (blocked), I’m wondering if this could be a contributing factor.

Could you please confirm if the issue is indeed caused by a deadlock, and whether it is related to Google services? If not, could it be an issue with the usage of Guava (Google’s library) in a multi-threaded context? Let me know if further logs or information are needed.

I plan to wait until the CPU reaches 100% utilization and run another test to see whether this issue occurs consistently under high load.

Hi, this is really useful!

So we are indeed using a Google library for Java (Guava), but it is a general-purpose library; it does not interact with Google services at all.

There is indeed a deadlock, and from the code it makes sense that this can happen. I’m just surprised it occurs so frequently in your setup. The good news is that we recently fixed a very similar issue in the same part of the code, which I believe should also fix this one (see Error during the logout action · Issue #535 · openanalytics/shinyproxy · GitHub).

Could you please try using this image: openanalytics/shinyproxy-snapshot:3.2.0-SNAPSHOT-20241112.090228 and let us know whether it fixes the issue?

Thanks a lot for your help!

After upgrading to ShinyProxy 3.2.0, the previous deadlock issue seems to have been resolved. However, we are now encountering a different problem.

The Docker container running ShinyProxy is consuming 600% of the server’s CPU. While I can log in to ShinyProxy, none of the Shiny apps are opening—they only display a blank screen.

To investigate, I accessed the ShinyProxy container and ran jstack 1. The output is attached in the file for your reference. Upon reviewing the stack trace, I did not find any obvious issues or deadlocks.

The output is too long to paste inline, so I have put it here:

The thread "AsyncProxyService-1" is in a BLOCKED state and is waiting to lock an object (<0x00000004201eac50>), which is a com.google.common.collect.Synchronized$SynchronizedListMultimap.

Below is the current ShinyProxy log for your analysis:

Could you please advise on what might be causing this issue and how it can be resolved? Let me know if you need any additional information or logs.

Thank you for your assistance.

I looked into the thread dump and tried to reproduce a few things, and it seems this behavior can occur when your app returns a very large response to the initial HTML request. In my tests, my browser takes about two minutes to render an HTML response of 30MB, and while it is loading, the ShinyProxy server goes to 100% CPU. I see in your thread dump that you have six threads in this state, which would explain the 600% CPU usage.

Note that this only happens when the browser tries to render the response; simply downloading the file or fetching it with curl does not cause an issue.

In order to check this, could you try the following whenever the issue occurs (a combined sketch of these commands is included after the list):

  1. list the Docker containers with docker ps and copy the ID of a container running the app
  2. get the IP address of that container using docker inspect
  3. run curl http://<IP>:3838 | wc -c to send a request to the app and get the size of the response
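
Putting those steps together, something like this should work (the container ID, IP, and the 3838 port are placeholders based on your setup; adjust as needed):

docker ps --format "{{.ID}}  {{.Image}}  {{.Names}}"                                            # pick the container running the app
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}} {{end}}' <container-id>    # its IP address on the container network
curl -s http://<IP>:3838 | wc -c                                                                # size in bytes of the initial HTML response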

Alternatively, it would also be useful if you could try accessing the app directly, e.g. using port forwarding over SSH, and check whether the app is still working.
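
For example, a minimal port-forwarding sketch (the user, host, and container IP are placeholders):

ssh -L 3838:<container-IP>:3838 <user>@<your-server>    # forward local port 3838 to the app container
# then open http://localhost:3838 in your browser and check whether the app responds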

Thank you for your previous suggestions regarding the ShinyProxy CPU usage issue. Following your advice, I moved the large HTML files used in my Shiny app to an external COS server and now access them via their URLs, which significantly improved loading speed. Additionally, I moved the iframe calls entirely into the Shiny UI to further reduce server-side rendering. Below is an example of how I embed the iframe directly in the UI:

fluidRow(
    column(
        width = 12,
        # Embed the externally hosted HTML report; the browser fetches it
        # directly from the CDN/COS server, so the Shiny app no longer has
        # to serve or render the large file itself.
        tags$iframe(
            seamless = "seamless",
            src = "https://your-cdn.com/html-files/tableone.html",
            width = "100%",
            height = 2000,
            scrolling = "auto"
        )
    )
)

After these changes, everything worked well initially, with fast loading and no noticeable performance issues. However, tonight I encountered a similar problem where ShinyProxy’s CPU usage spiked to 600%, and all Shiny apps became unresponsive, displaying white screens.

Following your previous guidance, I inspected one of the Shiny app containers and executed the curl test. Here are the results:

root@VM-24-4-ubuntu:~# curl http://172.18.0.4:3838 | wc -c  
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current  
                                 Dload  Upload   Total   Spent    Left  Speed  
100 1117k  100 1117k    0     0   345k      0  0:00:03  0:00:03 --:--:--  345k  
1144135  

At the time, there were only around 15 active users, and server memory usage was well below half.

Given that the iframe files are now hosted externally and accessed directly via the user’s browser, I’m struggling to understand why ShinyProxy’s CPU would still spike in this way.
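
The next time the CPU spikes, I plan to map the busiest threads back to a fresh stack dump to see what they are doing; a rough sketch, assuming top and jstack are available inside the ShinyProxy container:

top -b -H -n 1 -p 1 | head -40        # list the busiest threads of the JVM (PID 1)
printf '%x\n' <thread-id>             # convert a hot thread's TID to hex
jstack 1 | grep -A 20 "nid=0x<hex>"   # locate that thread's stack trace in the dump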

Based on my observations and your previous suggestions, I have a few hypotheses:

  1. Large Response Payloads: The curl result shows that each request to the Shiny app returns approximately 1.1 MB (1117k) of data, even though the iframe content itself is now served externally. Could this large response size still be contributing to ShinyProxy’s thread blocking or processing bottlenecks?
  2. CSS or Resource Overhead: It’s possible that additional resources (e.g., CSS, JavaScript, or other dynamically loaded assets) required for rendering the iframe content are being fetched and processed repeatedly on the Shiny app side, even though the iframe is external. Could these requests still be taxing the Shiny app containers or ShinyProxy’s thread pool?
  3. Thread Blocking or Queue Overload: If multiple users access the Shiny apps simultaneously, and each request still involves a large payload or processing time, could this be saturating ShinyProxy’s thread pool and causing the CPU to spike?
  4. Residual Server-Side Dependencies: Even though the iframe is now hosted externally, is it possible that ShinyProxy or the Shiny app itself still performs server-side tasks (e.g., checking iframe status, initializing app-related assets) that could lead to these spikes under certain load conditions?

I would greatly appreciate your insights into these possibilities, especially regarding the implications of the 1.1 MB response size and any specific optimizations or configurations that could mitigate this issue. For example, would reducing the size of app responses or adjusting thread pool settings in ShinyProxy help address this problem?

Thank you again for your time and guidance!