I wasn't sure if this would be better posted in the resellers forum, but that seemed to generally be reviews of companies.
Anyway, I work for a company that is trying to move a large number of sites from one server to a new one. I was looking for the best way to handle this, since you run into issues with MySQL databases, file permissions getting lost over FTP, etc.
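For what it's worth, if you have SSH on both machines, rsync preserves ownership and permissions, and mysqldump handles the databases cleanly. A minimal sketch, where the hostnames, paths, and credentials are all placeholders:

    rsync -az /var/www/ newserver.example.com:/var/www/
    mysqldump --all-databases -u root -p > all.sql
    scp all.sql newserver.example.com:
    ssh newserver.example.com 'mysql -u root -p < all.sql'

rsync's -a flag keeps permissions, ownership, and timestamps intact, which is exactly what plain FTP loses.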
Linux Fedora 6, Apache 2 with Mod Security, MySQL.
Our mod_sec logs get incredibly large very quickly. In the mod_security configuration we have specified the logging options:

    SecAuditEngine RelevantOnly
    SecAuditLogRelevantStatus "^[45]"
but the mod_sec.log gets to almost 10 GB (in a matter of 5-6 days) before it is rotated to mod_sec.log.1 and a new one is created.
Is there a way to cap a single log file at, say, 1 GB? And why does it grow so quickly in the first place? We thought logging "RelevantOnly" would record only the requests that are deemed security risks.
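A size cap is normally handled outside mod_security with logrotate. A minimal sketch, assuming the log lives at /var/log/httpd/mod_sec.log (the path and numbers are placeholders to adapt):

    /var/log/httpd/mod_sec.log {
        size 1000M      # rotate once the file passes ~1 GB
        rotate 5        # keep five old copies
        compress
        copytruncate    # truncate in place so Apache needn't be restarted
    }

As for the growth: if you are on mod_security 2.x, check SecAuditLogParts. When the audit log captures full request and response bodies, even "relevant only" entries can be huge on a busy server.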
I have a customer who wants to sell access to videos of conferences he runs.
Each FLV video is approx 1 to 1.5 hours long and approx 380 MB, and there will be about 12 videos per conference,
with approx 4-8 conferences per year.
My customer expects 10-20 people to buy access to watch each video.
Access to watch the videos will be through a password protected webpage.
The issue: the current site's hosting company only allows uploads of up to 150 MB per file.
Can I host the Flash videos elsewhere and deliver them through the password-protected web page, without anyone else being able to see them via the server they are hosted on?
This would also reduce the bandwidth going through his current site server.
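For rough sizing, taking the numbers above at face value: one conference is about 12 × 380 MB ≈ 4.5 GB of storage, and if 20 buyers each watch all 12 videos once, that is 20 × 12 × 380 MB ≈ 91 GB of transfer per conference, or roughly 365-730 GB per year at 4-8 conferences. Worth checking against any bandwidth cap on whichever host ends up serving the files.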
I'm working on a web site which will basically be a Flash games portal. I have a dedicated server running Apache 2 on a 100 Mbit dedicated line, but my download speed for large files (Flash files over 5 MB) is really slow. I suspect Apache, but I don't know much about this. I've read that I should switch to a lighter HTTP server for serving static files. My server is set up with 2 virtual machines, one doing the PHP processing and the other serving static files, both running Apache, so if I have to change the HTTP server for static files it would be very easy. I'm just not sure whether that's necessary, or whether Apache can be tuned to push files faster than this.
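Before swapping servers, a few stock Apache 2 directives are worth checking; this is only a sketch, and whether they help depends on your build and MPM:

    EnableSendfile On          # let the kernel stream static files directly
    KeepAlive On               # reuse connections for clients fetching several files
    MaxKeepAliveRequests 500

It is also worth ruling out the VM's virtual network layer: benchmark a large file fetched from the static VM and from the host machine itself (e.g. with wget) and compare the two numbers before blaming Apache.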
I'm facing a very strange FTP issue with one of my shared-hosting accounts. None of my other servers have problems, but on this one, when I try to upload any file larger than 500 KB from my local PCs, in most cases the upload stalls partway through and hangs until it times out.
There are 2 interesting things though: the transfer typically hangs when approximately 248 KB of the file has been transferred; please see the attached screenshot for an example.
What I mean is that if I pick a file at random and attempt to upload it 10 times: 5 times it will hang at 248 KB transferred, 3 times it will hang at other points near 248 KB (typically 224 KB or 280 KB), 1 time it will hang at some other random point, and 1 time it might upload successfully (yes, there is still a tiny chance of success).
My normal upload speed is 80-100 KB/s. Lately I found that when I cap the upload speed in my FTP client (e.g. at max. 30 KB/s), everything WILL WORK without any problem: no hangs, no interruptions. When I remove the cap and upload at my regular speed, the problem reappears.
It seems the FTP upload hangs only when the speed is above roughly 60 KB/s. My host, however, told me they have customers uploading at over 400 KB/s without any problem, and that "there's no problem or limitations on the server at all".
Up until now, I have done the following to troubleshoot, with no luck:
Contacted my host. Disabled/enabled PASV mode in my FTP client. Tried different FTP clients (FlashFXP and FileZilla) on different computers. Rebooted my router and reset everything to factory defaults. Contacted my ISP; they "did something" but nothing helped. Rebooted all my PCs. Disabled both firewalls, on my PC and on the router.
Furthermore, I asked a friend in another city, on another ISP, to test the FTP upload, and unfortunately he got exactly the same problem. I've searched the internet for hours, but no one seems to have the same problem.
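The pattern described (hangs at a consistent offset, but only above a certain speed) is one I would associate with a path-MTU / fragmentation problem somewhere between you and that particular server, rather than with the FTP client. That is only a guess, but it is cheap to test from a Windows command prompt (the hostname is a placeholder):

    ping -f -l 1472 ftp.your-host.example.com

-f sets the don't-fragment flag and -l sets the payload size; 1472 bytes of payload plus 28 bytes of headers makes a full 1500-byte packet. If that fails with "Packet needs to be fragmented but DF set" while smaller payloads get through, try lowering the MTU on your router and see whether uncapped uploads start working.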
Just noticed quite a few large core.* files within one of our websites (in a subfolder of public_html). Anyone know what these are and how they got there?
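Those are most likely core dumps: when a process crashes (e.g. segfaults), Linux writes its memory image to a file named core or core.<pid> in the process's working directory. Assuming that's what these are, two useful commands (the filename here is a placeholder):

    file core.12345     # reports which program produced the dump
    ulimit -c 0         # disables core dumps for the current shell and its children

They are safe to delete once you have noted which binary keeps crashing, since that crash is the real problem to chase.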
I've been using Lypha for the past 4 years, but this was the last straw (gigabytes of backups went missing and they won't reply to emails as to why).
Looking for a web hosting package under $10/month with enough disk space/bandwidth to let me back up large audio/video files to it, in addition to normal site operation (I use it for a portfolio website, as well as hosting additional domains).
I am developing a web application for a private investigative firm. They do surveillance work and therefore have surveillance videos. I would like to be able to upload the videos and allow the client to log in and view their surveillance video online.
Currently, we get the video from the PI, put it on a DVD and then mail it to the client.
This takes too long. We want the client to be able to view the video online.
Some of these videos can be up to 2 hours long.
First, is this even possible?
Second, how much bandwidth would a website like this take? And is there a host that can hold hundreds of GB of video?
I want to convert it to flash to save file size and also so I can stream it.
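Rough numbers, assuming a 2-hour video encoded as FLV at around 500 kbps: 2 × 3600 s × 500 kbps ÷ 8 ≈ 450 MB per video. So storage runs into hundreds of GB after a few hundred cases, and each client viewing costs roughly that much transfer again. It is certainly possible; the constraint is mostly picking a host whose storage and bandwidth pricing fits those numbers.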
I have a few directories on my server, each of which has something in the region of 200+ subdirectories containing 1 or 2 files. I'm wondering if anyone knows how to move all the files from the subdirectories into the main directory without me going into each folder and doing it by hand.
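One way, assuming GNU find and that /path/to/main is the parent directory (a placeholder path):

    find /path/to/main -mindepth 2 -type f -exec mv {} /path/to/main/ \;

-mindepth 2 restricts the match to files inside the subdirectories, so nothing already sitting in the main directory is touched. Watch out for duplicate filenames across subdirectories, though: mv will silently overwrite, so if unsure, do a dry run first by putting echo in front of mv.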
I have four servers, each with a quad Xeon, 4 GB RAM, and 2x300 GB SAS 15K RAID0 hard drives, pushing a total of 1.6 Gbit. They serve a lot of zip files with an average file size of 180 MB. My question is: how can I optimize lighttpd 1.4.19 to push its max with very low IO-wait? I've looked up some stuff and only found options that deal with lighttpd 1.5 and use Linux-AIO for the network backend. Currently I use writev with 16 workers and a read/write idle timeout of 10s. Logging is off, too.
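For 1.4.x the main lever I know of is the network backend; a sketch, to be checked against your build's documentation:

    server.network-backend = "linux-sendfile"   # kernel sendfile instead of writev
    server.max-worker      = 16
    server.max-read-idle   = 10
    server.max-write-idle  = 10

sendfile avoids copying file data through userspace, which tends to shave both CPU and IO-wait when the working set is large files like 180 MB zips. Beyond that, with 600 GB of RAID0 and only 4 GB of RAM the disks will dominate for a cold working set, so spreading the popular files across the four servers matters as much as any config option.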
Something weird happening here. I have tried every string possible...
There are a number of folders I want to remove from my server, so I tried the good old simple...
rm -r /folder/
And then ended up with output as long as my screen. As it recurses into the directory, it asks me whether I want to remove each file individually, and no matter what command or option I try, it insists on asking for every single file.
For those of you who are techies this will probably be a laugh, but to me it is all Greek and scary.
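If I had to guess, rm on that box is aliased to rm -i (interactive mode), which many distros set up by default for root. Two ways around it, assuming that is the cause:

    \rm -r /folder/     # the backslash bypasses the shell alias for this one command
    rm -rf /folder/     # -f forces deletion with no prompts; triple-check the path first

You can confirm with "type rm", which prints the alias if there is one.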
The deal is that I blog for PayPerPost, and after they implemented Real Rank only my main blog could be monitored for RR. I have 4 other blogs in the system that I can't take any assignments on because they have no Real Rank. The reason they don't is that their URLs are all the same as the first blog's, except that the other 4 have "blog1", "blog2", etc. at the end of the address.
So I went into my Yahoo hosting control panel and created subdomains for each of the four, turning the address from internetmarketingreview.org/blog1 into blog1.internetmarketingreview.org. (I thought I was pretty smart to do even that!)
Now you can see my blogs at those addresses. I was really happy, that is, until I realized that the permalinks don't work! Of course they don't; nothing has changed but cosmetics. I now understand that I am missing a step somewhere. I have been researching this, and I believe I need to move some files, but I am afraid of screwing things up.
I went to the WordPress forum and found a permalink plugin from Yahoo, but I don't know if installing it will solve my problem. I have a pretty good idea that it won't; that too is about cosmetics, right?
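In case it helps: pointing a subdomain at a folder only changes how the files are reached. WordPress still builds every permalink from the URL stored in its own settings, which is why the links break. The usual fix is to update both URL settings for each blog, either in the admin panel (the WordPress address and blog address fields under the general settings) or directly in the database, and then re-save the permalink structure. A sketch of the database route, where the database name, user, and table prefix are all assumptions to verify in phpMyAdmin:

    mysql -u dbuser -p wpdb -e "UPDATE wp_options SET option_value = 'http://blog1.internetmarketingreview.org' WHERE option_name IN ('siteurl', 'home');"

No files need to move for this; moving files only becomes necessary if you want each blog's directory relocated as well.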
I just want to know: is it safe to do a remote daily backup of about 70,000 files?
The files are about 200 KB each, and every day I get about 1,000 new files. rsync would first have to check the old files, because I also delete about 30-50 of them daily, and then back up the 1,000 new ones. So how long will it take each time to compare those 70,000 files?
I have 2 options now:
1. Use a second HDD with RAID 1.
2. Use rsync to back up to my second server, which saves me about $70 each month.
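rsync only transfers what changed: by default it compares file size and modification time rather than re-reading the data, so scanning 70,000 small files is typically a matter of seconds to a few minutes, depending on disk speed and link latency. A minimal sketch, with placeholder paths and hostname:

    rsync -a --delete /var/files/ backup2.example.com:/backup/files/

-a preserves permissions and timestamps, and --delete removes from the backup the 30-50 files you deleted locally, keeping the two sides in step.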
We have three virtual hosts in our Apache 2.2 installation on Windows Server 2003. For some reason, I'm unable to open the log files (error.log and each virtual host's specific log), even though I have full administrator rights and the log folder grants admins full access. Every time I try to open a file, or even copy it to another location, I get "Access Denied." I temporarily solved the issue for one of the logs by adding BufferedLogs On.
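One way to sidestep the lock Apache holds on a live log file is piped logging through rotatelogs, which ships in Apache's bin directory; once a file has been rotated out, nothing holds it open and it reads normally. A sketch for httpd.conf, with placeholder paths:

    ErrorLog "|bin/rotatelogs.exe logs/error.%Y-%m-%d.log 86400"
    CustomLog "|bin/rotatelogs.exe logs/access.%Y-%m-%d.log 86400" common

86400 rotates daily, and the strftime pattern in the filename keeps each day's log separate.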
The domain's PHP settings in Plesk are set to 2G, yet I get this error when uploading a 48 MB file through WordPress. I assume I need to modify something manually in a conf file somewhere to allow uploading large files?
Requested content-length of 48443338 is larger than the configured limit of 10240000..
mod_fcgid: error reading data, FastCGI server closed connection...
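That first message is Apache's own request-body check, not PHP's: 10240000 bytes is a LimitRequestBody of roughly 10 MB, which is why raising the PHP limits in Plesk changed nothing. Assuming a stock Plesk layout, the override goes in the domain's vhost.conf (the path and size here are placeholders to adapt):

    <Directory /var/www/vhosts/example.com/httpdocs>
        LimitRequestBody 134217728    # allow request bodies up to 128 MB
    </Directory>

After editing, have Plesk rebuild and reload the web server configuration so the change takes effect.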
I just moved a website I took over to a new host. The programmer hard-coded the DB connection. The old code refers to the Media Temple address:
mysql_connect("internal-db.s5363.gridserver.com", "db5363", "-removed-") or die ('I cannot connect to the database because: ' . mysql_error()); mysql_select_db("db5363_bb", $link)
How can I configure this for the new (cPanel) IGX Host server?
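On a typical cPanel server the database host is localhost, and both the database and its user carry your account's prefix. A sketch in the same style as the old code, where every name and password is a placeholder to look up on cPanel's MySQL Databases page:

    $link = mysql_connect("localhost", "cpuser_bb", "your-password")
        or die('I cannot connect to the database because: ' . mysql_error());
    mysql_select_db("cpuser_bb", $link);

Remember to add the user to the database in cPanel and grant it privileges, or the connection will be refused.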
I have to move some large websites from one host to another. The websites contain about 1-3 GB of data, and my internet connection here is not that fast, so downloading and uploading would take many hours. Unfortunately I have no SSH access to either account (old or new), so I have to do it over FTP.
So I tried to move a tar file from the old to the new host. I made a tar archive of the whole web folder (using PHP to execute the shell commands) and moved it by FTP to the new host. But now I'm having trouble after extracting the archive on the new host: the extracted files were obviously created by the wrong user, so I can't delete or access them over FTP.
So I'm trying to find another way around these problems. Maybe someone else has had the same trouble and can advise on how to move large sites? Or do you know of PHP applications that can connect one host account directly to another via FTP?
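The ownership problem follows from the method: a PHP-spawned shell runs as the web-server user, so everything tar creates belongs to that user, which your FTP account can't touch. One workaround, if the host permits it, is to do the cleanup the same way the mess was made, from a small PHP script run in the browser (a sketch; the path is a placeholder):

    <?php
    // Runs as the same web-server user that owns the extracted files,
    // so it can change or remove what FTP cannot.
    system('chmod -R u+rwX,go+rX /path/to/httpdocs/extracted');
    // or, to get rid of the files outright:
    // system('rm -rf /path/to/httpdocs/extracted');
    ?>

The capital X in the chmod adds execute permission only on directories (and files already executable), keeping folders traversable without marking every file executable.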
Compared to others my sites may not be that big, but I have one site that is 5 GB and another that is about 9 GB. I was wondering what the best or most recommended way is to transfer these sites to a new host.
I tried downloading the whole site using FireFTP, but I always seem to get about a third of the way done before something messes up the connection. Are there any better tools or methods for this?
I also have pretty large DBs that I'll need to transfer as well.
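One resumable option is lftp's mirror mode, which picks up a dropped transfer where it left off instead of starting over (hostname and credentials are placeholders):

    lftp -u user,password -e 'mirror --continue /public_html ./site-copy; quit' ftp.oldhost.example.com

For the databases, export with mysqldump (over SSH if you have it, or via phpMyAdmin's export) and import on the new host; dragging raw database files over FTP is the part most likely to go wrong.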
I have a couple of sites on a shared host, and now I want to move them to another shared host. Is there a program that can move the sites effortlessly, with the click of a button or two? I also have a couple of MySQL databases that I want recreated at the second hosting company.