Something weird is happening here. I have tried every command string possible...
There are a number of folders I want to remove from my server, so I tried the good old and simple...
rm -r /folder/
And then I ended up with a string of prompts as long as my screen. No matter what I do, as it recurses into the directory it asks me whether I want to remove each file individually. Whatever options or approach I try, it insists on asking me before it deletes each file.
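For reference, a common cause of this behaviour is rm being aliased to the interactive form (rm -i) for the root account on many distributions. A minimal sketch of how to check and bypass that, with /folder/ standing in for the actual directory:

# See whether rm is aliased to the interactive form
type rm            # may print: rm is aliased to `rm -i'

# Bypass the alias for this one command and force removal without prompting
\rm -rf /folder/
# or equivalently:
command rm -rf /folder/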
I cannot seem to find a conclusive answer to this. I use subdomains to build client websites, and once the sites move to their own domains the subdomains still remain. Can I delete only the subdomain and leave all the files, or at least leave the database, since it is still connected to the new domain? Example below.
subdomain.example.com uses database1
newsite.com still uses database1
Is there a way to copy the database into the new domain?
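For reference, copying a MySQL database can be done with a dump piped into a newly created database. A minimal sketch, assuming shell access and a MySQL account with the necessary privileges; database1 is the name from the example above and newsite_db is a placeholder:

# Create the target database and copy database1 into it
mysqladmin -u root -p create newsite_db
mysqldump -u root -p database1 | mysql -u root -p newsite_db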
I'm currently using (amongst other backup systems) rsync to an offsite space (I'm using BQBackup at the moment).
I'm just wondering - apart from backing up all of /home/, /var/lib/mysql/ and the important config files (httpd.conf, php.conf, etc etc) is there anything else that *needs* to be backed up?
Obviously, in a worst-case scenario a new machine would be deployed with a fresh OS install (and a fresh WHM/cPanel install), so I wouldn't worry about backing up OS files or cPanel core files. I'm just wondering whether there is anything apart from the /home/ directory and the MySQL databases that would be lost (and so need backing up) in the event of a crash.
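For reference, a minimal sketch of an offsite run covering the paths mentioned above; the remote host and paths are placeholders for the BQBackup target, and the exact config locations depend on the distribution:

# Sync /home/ and the web server config offsite
rsync -az --delete /home/ user@backup.example.com:/backup/home/
rsync -az --delete /etc/httpd/conf/ user@backup.example.com:/backup/httpd-conf/

# Copying /var/lib/mysql while mysqld is running can leave inconsistent files,
# so dumping the databases first is generally safer:
mysqldump -u root -p --all-databases | gzip > /home/backups/all-databases.sql.gz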
We've been having issues with reaching or exceeding our disk quotas. I've checked carefully, and while I've cleared our mail queues, I don't think that's the issue.
So what I'm really looking for is a way to figure out why we have so many files (our quota is 220000, and I'm pretty confident that we aren't intentionally doing anything to create so many files). There's lots of information on finding the folders with the largest file sizes, but locating the folders with the greatest number of files isn't quite so simple--or at least, it doesn't seem very clear to me.
Does anyone know a way via the command line to figure out this information, short of going through every single folder and figuring out how many files are in the specific folder?
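For reference, one way to do this from the command line, assuming GNU find and running from the account's home directory (the first form counts per top-level folder, the second per individual directory):

# Count files under each top-level directory, largest count first
for d in */ ; do
    printf '%s\t%s\n' "$(find "$d" -type f | wc -l)" "$d"
done | sort -rn | head -20

# Or count files per directory across the whole tree
find . -type f | sed 's|/[^/]*$||' | sort | uniq -c | sort -rn | head -20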
Linux Fedora 6, Apache 2 with Mod Security, MySQL.
Our mod_sec logs get incredibly large very quickly. In the configuration for mod_security, we have specified the logging options as:

SecAuditEngine RelevantOnly
SecAuditLogRelevantStatus "^"
But the mod_sec.log still gets to almost 10 GB (in a matter of 5-6 days) before it is rotated to mod_sec.log.1 and a new one is created.
Is there a way we can specify that the maximum size of one log file is 1 GB, for example? And another question: how come it gets so huge so quickly? We thought that logging "RelevantOnly" would only record errors/requests that are deemed security risks.
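For reference, SecAuditLogRelevantStatus takes a regular expression matched against the response status, and "^" matches every status, so with RelevantOnly everything still gets audited. A sketch of a narrower pattern, plus a size-based logrotate rule, since ModSecurity itself does not (as far as I know) offer a max-size directive; the log path is a placeholder:

# ModSecurity: only audit 5xx and 4xx responses, excluding 404
SecAuditEngine RelevantOnly
SecAuditLogRelevantStatus "^(?:5|4(?!04))"

# /etc/logrotate.d/mod_sec -- rotate once the file reaches ~1 GB
/var/log/httpd/mod_sec.log {
    size 1000M
    rotate 5
    compress
    copytruncate
    missingok
    notifempty
}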
I'm working on a web site which will basically be a Flash games portal. I have a dedicated server running Apache 2 on a 100 Mbit dedicated line, but my download speed for large files (Flash files of over 5 MB) is really slow. I suspect this is because of Apache, but I don't know much about this. I've read that I should switch to a lighter HTTP server for serving static files.

The way my server is set up, I have two virtual machines running: one doing the PHP processing and the other serving static files, both running Apache. So if I have to change the HTTP server for the static files it would be very easy, although I'm not sure whether that is necessary or whether I can tune Apache to push files faster than this.
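For reference, a few Apache directives worth checking on the static-file VM before swapping servers. A sketch only; the MPM numbers are illustrative, not recommendations:

# httpd.conf on the static-file VM
EnableSendfile On           # let the kernel send files directly from disk
KeepAlive On
KeepAliveTimeout 3
MaxKeepAliveRequests 200

<IfModule mpm_worker_module>
    StartServers          4
    ServerLimit          16
    ThreadsPerChild      25
    MaxClients          400   # must not exceed ServerLimit x ThreadsPerChild
</IfModule>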
I'm facing a very strange FTP issue with one of my shared-hosting accounts. All of my other servers are having no problems; only this one is affected. When I try to upload a file (any file) larger than 500 KB from my local PCs, in most cases the upload stops partway through and hangs there until it times out.
There are two interesting things though. First, the transfer typically hangs when approximately 248 KB of the file has been transferred; please see the attached screenshot for an example.
If you look at the attached screenshot, you will notice the upload hanging at exactly that point. What I mean is this: if I randomly pick a file and try to upload it to my host 10 times, 5 times it will hang when 248 KB of the total size has been transferred, 3 times it will hang at another point *near* 248 KB (typically 224 KB or 280 KB), 1 time it will hang at some other random point, and 1 time it might upload successfully (yes, there is still a tiny chance for the file to go through).
Second, my normal upload speed is 80-100 KB/s. Lately I found that when I limit the upload speed in my FTP client (e.g. to a maximum of 30 KB/s), everything WILL WORK without any problem: no hangs, no interruptions. Whereas when I remove the speed limit and let it upload at my regular speed, the problem appears again.
It seems to me that the FTP upload hangs only when the upload speed is higher than 60 KB/s. However, my host provider told me that they have customers uploading without any problem at over 400 KB/s, and they said "there are no problems or limitations on the server at all".
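For reference, the capped-speed workaround can also be reproduced from the command line, which rules out the GUI clients. A sketch, assuming lftp is installed; the host, credentials and filename are placeholders:

# Cap the upload rate to ~30 KB/s and retry the same transfer
lftp -u USER,PASS ftp.example.com -e "set net:limit-rate 0:30000; put bigfile.zip; bye"
# net:limit-rate takes download:upload limits in bytes per second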
Up until now, I have done the following things to troubleshoot the issue, with no luck:
Contacted my host. Disabled/enabled PASV mode in my FTP client. Tried different FTP clients on different computers (FlashFXP and FileZilla). Rebooted my router and reset everything to the factory default settings. Contacted my ISP about the issue; they "did something" but nothing helped. Rebooted all my PCs. Disabled both firewalls, on my PC and on the router.
Furthermore, I asked a friend of mine in another city, on another ISP, to test the FTP upload, and unfortunately he got exactly the same problem. I've also searched the internet for hours, but no one seems to have had the same problem.
I've been using Lypha for the past 4 years, but this was the last straw: gigabytes of backups went missing and they won't reply to emails as to why.
I'm looking for a web hosting package for under $10/month with enough disk space and bandwidth to let me back up large audio/video files to it, alongside normal site operation (I use it for a portfolio website, as well as for hosting additional domains).
I am developing a web application for a private investigative firm. They do surveillance work and therefore have surveillance videos. I would like to be able to upload the videos online and allow the client to log in and view their surveillance video online.
Currently, we get the video from the PI, put it on a DVD and then mail it to the client.
This takes too long. We want the client to be able to view the video online.
Some of these videos can be up to 2 hours long.
First, is this even possible?
Second, how much bandwidth would a website like this take? Is there a host that can hold hundreds of GB of video?
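As a rough, back-of-envelope estimate (the bitrate is an assumption): a 2-hour video encoded at around 500 kbit/s works out to 500,000 bits/s x 7,200 s, which is about 3.6 Gbit, or roughly 450 MB per full viewing. A hundred full viewings a month would therefore be in the region of 45 GB of transfer, before storage is counted.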
I want to convert it to Flash to save file size and also so I can stream it.
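For reference, a common way to do that conversion is ffmpeg writing an FLV file. A sketch only; the filename, resolution and bitrates are placeholders, and it assumes an ffmpeg build with libmp3lame:

# Convert a source video to FLV at a modest bitrate
ffmpeg -i surveillance_raw.avi \
    -c:v flv -b:v 500k -s 640x480 \
    -c:a libmp3lame -ar 44100 -b:a 96k \
    surveillance.flv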
I have four servers, each with a quad-core Xeon, 4 GB RAM, and 2x300 GB 15K SAS drives in RAID 0, pushing a total of 1.6 Gbit/s. They serve a lot of zip files with an average file size of 180 MB. My question is: how can I optimize lighttpd 1.4.19 to push its maximum with very low I/O wait? I've looked around and only found options that deal with lighttpd 1.5 and use Linux AIO for the network backend. Currently I use writev with 16 workers and a read/write idle timeout of 10 s. Logging is off, too.
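For reference, on the 1.4 branch the closest alternatives to the 1.5 AIO backend are the sendfile network backend and the epoll event handler. A sketch of the relevant lighttpd.conf directives; the values simply mirror the setup described above and are not tuned recommendations:

# lighttpd.conf (1.4.x)
server.network-backend     = "linux-sendfile"   # kernel sendfile instead of writev
server.event-handler       = "linux-sysepoll"
server.max-worker          = 16
server.max-read-idle       = 10
server.max-write-idle      = 10
server.max-keep-alive-idle = 10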
I just want to know: is it safe to do a remote daily backup of about 70,000 files?
The files are about 200 KB each, and every day I get about 1,000 new files. rsync first has to check the old files, because I also delete about 30-50 of them daily, and then back up the 1,000 new files. So how long will it take each time to compare those 70,000 files?
I have 2 options now:
1 - use a second HDD and RAID 1; 2 - use rsync and back up to my second server, which would save me about $70 each month.
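For reference, rsync's default comparison only checks file size and modification time (not content), so scanning roughly 70,000 files is mostly metadata work and normally finishes in seconds to a few minutes over a decent link, as long as --checksum is not used. A minimal sketch of a nightly run; the host and paths are placeholders:

# Nightly sync of the data directory to the second server
rsync -az --delete /var/data/files/ backup@backup.example.com:/backups/files/
# --delete removes files on the backup that were deleted locally,
# matching the 30-50 daily deletions mentioned above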
The domain has its PHP settings in Plesk set to 2G, yet I get this error when uploading a 48 MB file using WordPress. I assume I need to modify this manually in a conf file somewhere to allow uploading large files?
Requested content-length of 48443338 is larger than the configured limit of 10240000..
mod_fcgid: error reading data, FastCGI server closed connection...
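For reference, the 10240000-byte limit in that message comes from mod_fcgid's request-size cap rather than from PHP, so raising the PHP settings in Plesk alone won't help. A sketch of the usual fix; the value and config location are illustrative, and the directive is spelled MaxRequestLen on mod_fcgid versions older than 2.3.6:

# In the Apache config that Plesk uses for this domain
<IfModule mod_fcgid.c>
    # Allow request bodies up to ~128 MB
    FcgidMaxRequestLen 134217728
</IfModule>

Then restart Apache for the change to take effect.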