Docker: Can't connect to local MySQL server through socket


This is more of a Docker issue than a ShinyProxy one. I have created a Shiny app that collects data from a local MySQL database also running on my PC (the setup will be similar once deployed on a server).
When I run the app as a package, it works fine: the connection to the database is established.
But once I create a Docker image and run it, the app launches but is unable to connect to my local MySQL database, returning this error:

Warning: Error in connection_create: Failed to connect: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2 "No such file or directory")

I have tried multiple solutions, including setting up a docker-compose file, using different Docker images, and exposing different ports, but I am still hitting the same dead end.

Thanks for any help!

Could you please provide us with the Dockerfile as well as the part of your code in which the SQL connection is defined? Thanks

The most recent Dockerfile I’ve used was based on r-db. The Shiny app is packaged in K2dashboard_*.tar.gz. The app itself runs OK on the port, but it is not reactive because it cannot connect to the MySQL database.
Dockerfile looks like this:

FROM colinfay/r-db:3.6.1
RUN R -e 'install.packages("remotes")'
RUN R -e 'remotes::install_github("r-lib/remotes", ref = "97bbf81")'
RUN R -e 'remotes::install_cran("shiny")'
RUN R -e 'remotes::install_github("Thinkr-open/golem")'
RUN R -e 'remotes::install_cran("processx")'
RUN R -e 'remotes::install_cran("attempt")'
RUN R -e 'remotes::install_cran("DT")'
RUN R -e 'remotes::install_cran("glue")'
RUN R -e 'remotes::install_cran("htmltools")'
RUN R -e 'remotes::install_cran("shinydashboard")'
RUN R -e 'remotes::install_cran("shinydashboardPlus")'
RUN R -e 'remotes::install_cran("lubridate")'
RUN R -e 'remotes::install_cran("dplyr")'
RUN R -e 'remotes::install_cran("purrr")'
RUN R -e 'remotes::install_cran("plotly")'
RUN R -e 'remotes::install_cran("DBI")'
RUN R -e 'remotes::install_cran("tibbletime")'
RUN R -e 'remotes::install_cran("tsibble")'
RUN R -e 'remotes::install_cran("shinyWidgets")'
RUN R -e 'remotes::install_cran("leaflet")'
RUN R -e 'remotes::install_cran("pool")'
RUN R -e 'remotes::install_cran("RMariaDB")'
RUN R -e 'remotes::install_cran("roxygen2")'
COPY K2dashboard_*.tar.gz /app.tar.gz
RUN R -e 'remotes::install_local("/app.tar.gz")'
CMD  ["R", "-e", "options('shiny.port'=3838,''); K2dashboard::run_app()"]

And database connection:

con = dbConnect(RMariaDB::MariaDB(), dbname = "training_dash_db", user = "root", password = "", host = '')

Thanks for your time!

First of all, you might want to consider installing the packages with one command, for a cleaner Dockerfile:
R -e "install.packages(c('package1', 'package2', ...), dependencies = TRUE)"
or, in your case, install_cran() instead of install.packages().
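As a sketch, the repeated RUN lines from the Dockerfile above could be collapsed into far fewer layers (package list shortened here for illustration):

```dockerfile
FROM colinfay/r-db:3.6.1
RUN R -e 'install.packages("remotes")'
# one layer for all CRAN packages (list shortened from the full Dockerfile above)
RUN R -e 'remotes::install_cran(c("shiny", "DT", "glue", "dplyr", "DBI", "pool", "RMariaDB"))'
RUN R -e 'remotes::install_github("Thinkr-open/golem")'
COPY K2dashboard_*.tar.gz /app.tar.gz
RUN R -e 'remotes::install_local("/app.tar.gz")'
```

One trade-off to be aware of: with a single install layer, changing any one package invalidates the Docker build cache for the whole layer, whereas one RUN per package caches each install separately.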
My guess is that your database connection is not closed automatically inside a Docker container. You can check this by looking at the server logs to see if the server throws an error. One solution is to manage server connections via the pool package.

Another problem might be that, from inside your Docker container, the host of the DB is not your localhost but a host outside of the Docker container. One possibility is to share the whole network of the host with the Docker container. Alternatively, set host = host.docker.internal
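As a minimal sketch of the host.docker.internal route (image name k2 and port 3838 are assumptions taken from later in this thread): on Linux engines, host.docker.internal does not resolve by default and has to be added explicitly with --add-host.

```shell
# Assemble the `docker run` flags a container needs so it can reach a MySQL
# server running on the Docker host via TCP (sketch; image name is assumed).
host_db_run_flags() {
  # host-gateway maps host.docker.internal to the host on Linux engines
  echo "--add-host=host.docker.internal:host-gateway -p 3838:3838"
}

flags=$(host_db_run_flags)
echo "docker run $flags k2"
```

The app would then connect with host = "host.docker.internal" instead of localhost.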

Hi Tim. Thank you for the advice. I am using golem for package creation, so the install.packages calls are automatically added to the Dockerfile; I don’t really mind it this way. I think my problem is connected to your last point: the database host cannot be reached from inside Docker, where it will point to the container’s own localhost.
It is a noob question, but where exactly do I specify host.docker.internal? Inside the app, or when running the Dockerfile?

Hi, you specify it in your DBI connection as the host.


Hi, I did that and got the same error. I tried to build the image with --network host. No change. I might try to change the IP address for the Docker container to the actual IP address of my computer, but that’s it. I really don’t know where to go from here. A frustrating experience, to say the least :slight_smile: . Thanks for your help.

Did you bind the Unix sockets of the host and the container together on container start?

Well, I am seeing this for the first time, so no :slight_smile: . Should I bind my mysqld.sock with it? To be frank, after reading about a dozen SO topics on mysqld.sock, I still don’t have a clear idea of what it is. Thanks for the lead, I will investigate it.

Hi - I just had to solve a similar problem. Do you pass the socket to the Docker container?

Try to start the container with the socket mounted, using docker run -v /var/lib/mysql/mysql.sock:/mysql.sock -it YourDockerImage bash. From inside the container, use mysql -u root -p -S /mysql.sock to check whether the socket connection works (if you have the mysql client installed in the image).

To check the connection from R, use con = RMariaDB::dbConnect(RMariaDB::MariaDB(), dbname = "...", username = "...", password = "...", unix.socket = "/mysql.sock").

If all that works, tell ShinyProxy to pass the socket to the container (in the specs section of application.yml): container-volumes: [ "/var/lib/mysql/mysql.sock:/mysql.sock" ]
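In context, a minimal application.yml excerpt could look like this (the app id and image name are placeholders, not taken from the thread):

```yaml
proxy:
  specs:
    - id: k2dashboard        # placeholder app id
      container-image: k2    # placeholder image name
      container-volumes: ["/var/lib/mysql/mysql.sock:/mysql.sock"]
```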

Hi Kurt. I think you might be on the right track here! Running the mysql command in my Docker container results in bash: mysql: command not found. So my Docker image does not come with mysql, even though I thought it did. The solution is to add the mysql client installation to my Dockerfile, right?

As far as I know, the mysql client is not required for R to work properly. Although, for debugging, I found it easier to also install the client in my Docker image. My image is based on CentOS 7 and I use MariaDB, so installing the mysql client was done with RUN yum install -y MariaDB-client in the Dockerfile.

Without the mysql client, you can test the connection using the R command listed above. Just start the container with the socket mounted: docker run -v /var/lib/mysql/mysql.sock:/mysql.sock -it YourDockerImage bash (you need to set the proper path to your socket and use your own image name, of course). Then you can start R from within the container and connect to the DB using con = RMariaDB::dbConnect(RMariaDB::MariaDB(), dbname = "...", username = "...", password = "...", unix.socket = "/mysql.sock").
If this does not work, try connecting to the DB via the socket directly from the computer that hosts the DB, using something like mysql -u root -pXXX -S /var/lib/mysql/mysql.sock (again, set the proper path and password).

Thanks for the detailed reply, Kurt. I will be deploying on CentOS 7 as well, but for the time being, I am testing on the localhost.

One problem which I should have realized way, way sooner is the fact that Windows does not create socket files but a named pipe. In my defence, I tested the dockerized app on Linux early on with the same error, so I defaulted back to my Windows computer. I didn’t know whether the mysql.sock addressed in the error message points to an internal socket of the Docker container or the external socket of my database.

I will rerun all the settings tomorrow using Linux. Thanks for pointing me in the right direction! Hopefully I will have an easier time debugging.

Hi. I have been able to make some progress, thanks to your advice. Currently, I am able to log into the Docker container on Linux using docker run --net='host' -v /var/lib/mysql/mysql.sock:/mysql.sock -it k2 bash, start R, and connect to MySQL on the localhost using a simple con = DBI::dbConnect(RMariaDB::MariaDB(), dbname = "training_dash_db", user = "root", password = "", host = '', unix.socket = "/mysql.sock").
Sadly, running the app, which contains the exact same database connection code, from Docker results in the same connection problem described above: Failed to connect: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2 "No such file or directory").

Not sure I understand you correctly. You log into the container, start R, and connect to the DB. Are you sure you are connecting via TCP to localhost? Does the connection still work without the unix.socket=..., and does it fail without host=...? If the connection works via TCP, why do you want to use sockets?

So whatever the connection is, be it over TCP or via socket, if you can connect from R manually, it should also work in your app. From the above error message, I suspect that your app still uses the old path to the socket, since there is no socket at /var/run/mysqld/mysqld.sock in your Docker container. In your container, the socket is at /mysql.sock, or at whatever path you have specified when starting the container or in the ShinyProxy settings. In both cases, /var/run/mysqld/mysqld.sock:/mysql.sock means that the socket from the host computer at /var/run/mysqld/mysqld.sock will be mapped to /mysql.sock in the Docker container. So your app needs to use /mysql.sock.
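A quick way to confirm which path the app should use is to check for the socket inside the container before starting the app. A minimal sketch (the /mysql.sock default is the mount target from the docker run example above):

```shell
# Report whether a Unix socket exists at the given path inside the container.
# Prints "present" or "missing" so an entrypoint script can branch on it.
check_socket() {
  if [ -S "$1" ]; then
    echo present
  else
    echo missing
  fi
}

check_socket "${MYSQL_SOCKET:-/mysql.sock}"
```

If this prints "missing", either the -v mount (or container-volumes entry) is absent, or the app is pointed at the wrong path.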

The other reason why your app does not work via TCP is probably the --net='host' flag when running the container. This turns on host network mode, which maps the container's network onto the host's. Per default, bridge mode is active, so you have to specify container-network in application.yml.

Thanks for all the help, Kurt. In the end, adding two lines to my.cnf inside the container's /etc/mysql solved my problem:



After that, Shiny uses TCP instead of the MySQL socket.


I have experienced the same problem. However, it works if I separate the MySQL start-up commands:

sudo docker build -t MyDB_img -f
sudo docker run --name SomeDB -e MYSQL_ROOT_PASSWORD="WhatEver" -p 3306:3306 -v $(pwd):/app -d MyDB_img

and then sleep for 20 seconds before running the MySQL scripts:

sudo docker exec -it SomeDB sh -c

I can only presume that the MySQL server takes a few seconds to start up before it can accept incoming connections and scripts. Even though the container is up and running, the DB takes a few more seconds to become available.
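Instead of a fixed 20-second sleep, a readiness loop avoids both waiting too long and racing the server. A sketch (the function just retries an arbitrary check command; the commented mysqladmin line is what the real check would look like, assuming the mysql client is installed on the host):

```shell
# Retry a check command once per second until it succeeds or we give up.
# Returns 0 on success, 1 if the check never passed within max_tries.
wait_until_ready() {
  check="$1"
  max_tries="${2:-30}"
  i=0
  until eval "$check"; do
    i=$((i + 1))
    if [ "$i" -ge "$max_tries" ]; then
      return 1
    fi
    sleep 1
  done
  return 0
}

# Real usage against the SomeDB container from the post (hypothetical check;
# the password comes from MYSQL_ROOT_PASSWORD set in the docker run above):
# wait_until_ready "mysqladmin ping -h -P 3306 -u root -p\"\$MYSQL_ROOT_PASSWORD\" --silent" 30
```

Note that eval runs the check string through the shell, so it should only be used with trusted commands; it is used here to keep the sketch POSIX-portable.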