I've been having a heck of a time with this one cPanel server and the open files limit. At first using open_files_limit did not work, so I changed it to open-files-limit; that seemed to work, but now it rejects the value and sets it down to 65535.
The system open files limit is 500000. If I try to set it to any value above 65535 in my.cnf, here is the usual error:
090630 9:32:07 [Warning] option 'open_files_limit': unsigned value 120510 adjusted to 65535
090630 9:32:07 [Warning] option 'open_files_limit': unsigned value 120510 adjusted to 65535
When I run something like the tuning-primer it shows:
Current open_files_limit = 120510 files
The open_files_limit should typically be set to at least 2x-3x
that of table_cache if you have heavy MyISAM usage.
Your open_files_limit value seems to be fine
But I'm not sure if it is just reading my.cnf or something. I am still getting complaints from users about lost connections, and I see the errors in the error log. I've looked everywhere and can't seem to find a solution to this.
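For reference, this is the shape of what I have in my.cnf right now; I'm also experimenting with putting the value under [mysqld_safe], since mysqld_safe is supposed to raise the ulimit before launching mysqld (the number is just my target value, nothing below is verified to work on this box):

[mysqld]
open_files_limit = 120510

[mysqld_safe]
# mysqld_safe should translate this into a "ulimit -n" before starting mysqld
open-files-limit = 120510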
How do I increase the open file descriptor limit in Apache? In earlier versions of cPanel we had the option Raise FD Size Limit to 16384, but that option no longer appears while rebuilding Apache. What is the way to do it and make the change permanent?
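The only manual approach I've come across so far (untested, and the path is an assumption for a cPanel-style build) is raising the limit in the Apache start script itself, and making sure the system-wide limits are at least as high:

# near the top of /usr/local/apache/bin/apachectl (or the httpd init script)
ulimit -n 16384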
I am trying to increase my open file limit on my CentOS release 4.6 VPS.
I have tried modifying my /etc/security/limits.conf with the following:
@mysql soft nofile 4096
@mysql hard nofile 4096
@mysql soft sigpending 4096
@mysql hard sigpending 4096
Even if I manually set it with "ulimit -n 4096 mysql" it will not stick.
Any ideas what I can do to get this limit increased permanently?
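For what it's worth, this is the shape of what I've been trying. As I understand it, @mysql in limits.conf targets the group mysql rather than the user, the limits only apply to new login sessions going through pam_limits, and ulimit takes no user name - it only changes the current shell and its children (untested sketch, values are placeholders):

# /etc/security/limits.conf - "mysql" without @ targets the user, "@mysql" the group
mysql  soft  nofile  4096
mysql  hard  nofile  4096

# in the shell that will start mysqld (affects only this shell and its children)
ulimit -n 4096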
Basically MySQL is behaving very erratically. Crashes were every 4 hours; I've brought them down to once every 8 or so hours, but MySQL keeps dying.
The error log shows the same routine each time.
On MySQL start:
Quote:
091101 21:58:03 [Warning] option 'open_files_limit': unsigned value 120000 adjusted to 65535
091101 21:58:03 [Warning] Could not increase number of max_open_files to more than 65535 (request: 200110)
091101 21:58:03 [Note] /usr/sbin/mysqld: ready for connections.
Then we'll see errors due to crashed databases:
Quote:
091102 0:33:07 [ERROR] /usr/sbin/mysqld: Incorrect information in file: './<nameofdatabase.frm>'
Following this, a heap of:
Quote:
091102 0:36:35 [ERROR] /usr/sbin/mysqld: Can't open file: '<another database here.frm>'
091102 0:36:36 [ERROR] /usr/sbin/mysqld: Sort aborted
091102 0:36:52 [ERROR] /usr/sbin/mysqld: Sort aborted
091102 0:43:00 [ERROR] Error in accept: Too many open files
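To see what the running server has actually ended up with (as opposed to what my.cnf asks for), I believe these can be checked at runtime:

mysql -e "SHOW GLOBAL VARIABLES LIKE 'open_files_limit';"
mysql -e "SHOW GLOBAL STATUS LIKE 'Open_files';"
mysql -e "SHOW GLOBAL VARIABLES LIKE 'table_cache';"

(table_cache is called table_open_cache on newer MySQL versions.)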
We have three virtual hosts on our Apache 2.2 installation on Windows Server 2003. For some reason, I'm unable to open the log files (error.log and each virtual host's specific log), even though I have full administrator rights. (The log folder gives full access to admins.) Every time I try to open a file, or even copy it to another location, it just says "Access Denied." I temporarily solved the issue for one of the logs by adding BufferedLogs On.
I have made a VPS on my own dedicated server, which I use to run TorrentFlux for personal use. I am facing a few problems and don't know where to ask for help.
When I start more than about 12 transfers, I get errors in SSH (if I log in), or Apache restarts, killing all the transfers.
I have 2 GB RAM and a dual-core CPU.
The error via SSH is: sh: pipe error: Too many open files in system
and I have attached an error log from Apache.
I am a noob with servers, so I have the Lxadmin control panel installed, and the log is generated by it.
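From what I've read, "Too many open files in system" points at the system-wide limit (fs.file-max) rather than a per-process one, and on a VPS the host node's limit may also apply. A sketch of what I'm planning to try (the number is a placeholder):

# check the current system-wide limit and usage
cat /proc/sys/fs/file-max
cat /proc/sys/fs/file-nr

# raise it for the running system (placeholder value)
sysctl -w fs.file-max=100000

# make it permanent
echo "fs.file-max = 100000" >> /etc/sysctl.conf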
I have a plain RHEL 4 server and I'm using the vsftpd server. I cannot find an option to specify the maximum size for uploaded files... does anybody know something about this?
I have a Linux server running CentOS 5.2 with the CSF firewall, and I have a question.
How do I limit download threads (i.e. limit the number of connections while downloading files), for example to 4, 8, or 16 connections? (My guests use software such as Internet Download Manager.)
For example, my website is [url] and the direct links are [url]. How do I limit the threads (connections) when guests download using software such as Internet Download Manager or FlashGet?
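From what I can find, CSF has a CONNLIMIT option in csf.conf that limits concurrent connections per source IP to a given port; a sketch of what I'm thinking of trying (the limit of 8 is just an example):

# /etc/csf/csf.conf - format is "port;limit", multiple entries comma-separated
CONNLIMIT = "80;8"

# then reload the firewall
csf -r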
The domain has its PHP settings in Plesk set to 2G, and I get this error when uploading a 48 MB file using WordPress. I assume I need to modify this manually in a conf file somewhere to allow uploading large files?
Requested content-length of 48443338 is larger than the configured limit of 10240000..
mod_fcgid: error reading data, FastCGI server closed connection...
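The 10240000 in that message looks like mod_fcgid's request-size cap rather than a PHP limit, so I'm guessing something like this needs raising in the Apache/Plesk configuration as well (the directive name depends on the mod_fcgid version, and the 64 MB value below is just a placeholder):

<IfModule mod_fcgid.c>
    # value is in bytes; older mod_fcgid versions call this MaxRequestLen instead
    FcgidMaxRequestLen 67108864
</IfModule>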
I don't want clients pushing the server's I/O and server load over 4.00 when they run major UPDATE etc. queries on SQL. Is there a way to limit the amount they can do?
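The closest built-in thing I'm aware of is MySQL's per-account resource limits, which throttle how much an account can run rather than capping the load directly; a sketch (the account name and numbers are placeholders):

mysql -e "GRANT USAGE ON *.* TO 'clientuser'@'localhost'
          WITH MAX_QUERIES_PER_HOUR 5000
               MAX_UPDATES_PER_HOUR 1000
               MAX_USER_CONNECTIONS 10;"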
I recently had a hard drive failure, and luckily I can still access certain directories on the failed drive. I can still access the /var/lib/mysql/ directory, which holds all the users' databases, and I have backed all of these up separately using tar.
Now what I need to know is: how do you restore these database files to another server? I tried simply untarring one of them into the new server's /var/lib/mysql/ directory and it stuffed MySQL up - it went offline. I had to get a cPanel tech to bring MySQL back online.
How can I get these database files to fully work on a new server?
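From what I've gathered, the safe way to do a file-level restore like this is to copy the individual database directories only (never the mysql/ system database or the ib* InnoDB files from the old box) and then fix ownership. A rough sketch, assuming MyISAM tables, a default datadir, and placeholder names:

# unpack the backup somewhere neutral first (layout inside the tarball may differ)
mkdir -p /root/restore && tar -xzf mysqlbackup.tar.gz -C /root/restore/

# copy one database directory at a time into the datadir
cp -a /root/restore/var/lib/mysql/somedatabase /var/lib/mysql/
chown -R mysql:mysql /var/lib/mysql/somedatabase

# restart MySQL (on cPanel: /scripts/restartsrv_mysql) and check the tables
mysqlcheck --check --auto-repair somedatabase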
I have searched but not found a solution for limiting how many concurrent connections a database user can make to MySQL at the same time, for example:
Limit user root to a maximum of 5 simultaneous connections to MySQL.
There is a max_connections setting, but it limits the total number of connections to MySQL across all users.
I did find that in version 4.1.x there is a max_connections column in the user table (users and global privileges) of the mysql database, but that setting limits connections per 60 minutes.
I would like to know if there is any setting or code modification that achieves what I want.
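For what it's worth, what I've found so far: MySQL 5.0.3 and later have a per-account simultaneous-connection limit, while on 4.1 I believe only the global max_user_connections variable exists (and it applies the same cap to every account). A sketch, with the account and number as placeholders:

# per-account limit (MySQL 5.0.3 or later)
mysql -e "GRANT USAGE ON *.* TO 'root'@'localhost' WITH MAX_USER_CONNECTIONS 5;"

# global cap, same for every account (works on 4.1 too) - in my.cnf under [mysqld]:
# max_user_connections = 5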
I did the absolute stupidest thing yesterday: I emptied the wrong MySQL table via phpMyAdmin. Fortunately my host managed to grab the table in question from a few days ago.
So I have:
auctions.frm, auctions.MYD, and auctions.MYI
My assumption is that I can just overwrite the current files of the same name in the MySQL folder, and everyone will be happy. Is there more to this process?
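My rough plan, in case anyone wants to sanity-check it (the database name mydb is a placeholder, and I'd stop MySQL or at least flush tables before swapping the files):

mysql -e "FLUSH TABLES;"     # or stop MySQL entirely while the files are copied

cp auctions.frm auctions.MYD auctions.MYI /var/lib/mysql/mydb/
chown mysql:mysql /var/lib/mysql/mydb/auctions.*

mysql mydb -e "CHECK TABLE auctions; REPAIR TABLE auctions;"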
I have the following problem: I have over 20 sites on the server and each site has its own database. Is there a way to speed up backing them up and transferring them to another server?
The method I use right now is this: first I archive an entire directory using the command tar -pczf name.tar.gz public_html, and then I repeat that for each directory. I could simply archive all the required directories in one go, but that would take too much time, and if I drop the connection during archiving, nothing gets archived at all. So I think the best solution would be to create some kind of batch command that runs in the background, so the command won't stop if the client loses the connection.
So let's say I have 2 sites and two directories located in different places.
One is at /home/site1 and the other at /home/site2, so I think I would need to put the commands into a batch file: tar -pczf site1.tar.gz /home/site1 and tar -pczf site2.tar.gz /home/site2.
Will that work? Also, the second part: MySQL databases. I found that if I log into phpMyAdmin as root, I can see all the databases. I managed to export all the databases, but the question is whether importing them again through phpMyAdmin will work. I think phpMyAdmin creates a statement for each database along the lines of "if there is no db sitename_mysqlbase, create it", but as far as I know phpMyAdmin has a limit on the import size of a MySQL database. Could this be done with the mysql import/export commands instead?
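To keep the archives running even if the SSH session drops, and to avoid phpMyAdmin's import size limit, this is roughly what I have in mind (paths and file names are placeholders, untested):

# background the archives so they survive the SSH session closing
nohup tar -pczf /backup/site1.tar.gz /home/site1 > /backup/site1.log 2>&1 &
nohup tar -pczf /backup/site2.tar.gz /home/site2 > /backup/site2.log 2>&1 &

# dump every database from the command line instead of phpMyAdmin
mysqldump -u root -p --all-databases > /backup/alldatabases.sql

# and on the new server, import again (the dump recreates each database)
mysql -u root -p < /backup/alldatabases.sql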
I couldn't keep my mouth shut (technically, fingers). A customer wanted to upgrade servers and needed a way to move the data across. Since I don't allow hard drives to be swapped, they have to do it manually, all by themselves. I generally allow up to 4 days for them to transfer data and make DNS changes, etc. But this time, I offered help! I agreed to move the data (darn me) and it just came out of me, involuntarily.
God knows what just happened... but on the positive side, the customer is extremely happy!
So...
Both servers are on cPanel - with root access (duh)
200-odd files which total 25 GB
1 database about 100 MB in size (no biggie)
I was planning on using one of my Windows 2003 servers (via remote desktop) to download the 25 GB and upload the 25 GB, but that sounds like a waste of resources and time.
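Rather than bouncing 25 GB through a Windows box, a direct server-to-server copy over SSH is what I'm leaning towards; a sketch with placeholder host names, paths, and database name:

# run from the old server, pushing straight to the new one
rsync -avz -e ssh /home/username/ newserver.example.com:/home/username/

# the database goes separately
mysqldump -u root -p customerdb > /root/customerdb.sql
scp /root/customerdb.sql newserver.example.com:/root/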