In reference to my previous post, I want to transfer across 7GB of data, approximately 80,000 files I believe (due to a gallery script).
It's currently on another host (a shared web hosting account) whose control panel has no options beyond managing databases, so the only way I can see to do this is via FTP, but that would take days. I've tried compression and backup scripts, but the execution time limit on that host's server is too low to allow the files to be zipped. Are there any other ways? Can I log in to my VPS via SSH and somehow pull the files off the other host's server?
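If the old host allows normal FTP logins, one approach worth trying is to pull everything from the VPS end instead of downloading to your PC and re-uploading. A rough sketch using lftp's mirror mode (the hostname, username, and paths are placeholders, and this assumes lftp is installed on the VPS):

# run on the VPS, inside screen so it survives a dropped SSH session
screen -S sitecopy
lftp -u ftpuser ftp.oldhost.example.com -e "mirror --continue --verbose /public_html /home/me/site; quit"

Because mirror --continue can be re-run, a broken transfer picks up where it left off instead of starting over.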
Is there any software for Windows that can transfer files from one server to another? I have shell access on both servers but don't know how to do that via the shell. Is there any GUI software for Windows where I can see the files on one server in one window and the files on the second server in another?
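Since there is shell access on both servers, the copy doesn't have to go through Windows at all; the files can move directly between the two machines. A minimal sketch, assuming SSH is running on both boxes and using made-up hostnames and paths:

# run on server A, pushing straight to server B over SSH
scp -r /home/me/public_html user@serverb.example.com:/home/me/

# or, easier to restart if the connection drops:
rsync -avz /home/me/public_html/ user@serverb.example.com:/home/me/public_html/

A Windows client would only be needed to open the SSH session and start the command.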
Is there any good software that will back up the current server's files and restore them on a new server?
I'm looking for a full backup that would cover all the files in less time, instead of going into each account's cPanel, downloading the backup, and then restoring it on the new server.
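If both machines are cPanel/WHM servers and there is root access on each, one hedged option is the stock account-packaging scripts rather than per-account cPanel backups. This is only a sketch with placeholder hostnames:

# on the old server: package every account into /home/cpmove-<user>.tar.gz
for user in $(ls /var/cpanel/users); do
    /scripts/pkgacct "$user"
done
rsync -av /home/cpmove-*.tar.gz root@newserver.example.com:/home/

# on the new server: restore each package
for f in /home/cpmove-*.tar.gz; do
    /scripts/restorepkg "$f"
done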
How can I transfer files between my Windows server and my PC?
I have tried many programs, but all of them depend on my PC's speed, so the transfer is very slow. I also tried connecting to the Windows server through RDC with "Disk Drives" checked under Local Resources, but that transfers files slowly too.
I want something fast that copies files as if they were on the same hard drive.
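Nothing will make a network copy as fast as a local disk-to-disk copy, but one thing worth trying on Windows is robocopy over a normal file share instead of RDP drive redirection; it restarts interrupted files and, on Windows 7 / Server 2008 R2 and later, can copy several files in parallel. The share and paths below are placeholders:

robocopy C:\data \\myserver\d$\data /E /Z /R:2 /W:5 /MT:16
:: /E  copy all subfolders, /Z restartable mode
:: /MT:16 use 16 threads (leave it off on older versions that lack the switch)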
Compared to others my sites may not be that big, but I have one site that is 5 gigs and another that is about 9 gigs. I was wondering what is the best or most recommended way to transfer these sites to a new host.
I tried downloading the whole site using FireFTP, but I always seem to get about a third of the way done before something messes up the connection. Are there any better tools or methods for this?
I also have some pretty large databases that I'll need to transfer.
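If both hosts allow SSH, a hedged alternative to FTP clients is rsync for the files (it can simply be re-run after a dropped connection and only copies what is still missing) and mysqldump piped over SSH for the databases. Hostnames, usernames, and database names below are placeholders:

# files: re-runnable, picks up where it left off
rsync -avz --partial --progress olduser@oldhost.example.com:/home/olduser/public_html/ /home/newuser/public_html/

# database: dump, compress, and ship in one pass
mysqldump -u dbuser -p mydb | gzip | ssh newuser@newhost.example.com "cat > /home/newuser/mydb.sql.gz"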
Recently I changed server providers, so now I'm looking for a way to transfer all the data to my new server. I have a total of 420GB of files on my secondary HDD that need to be transferred.
The old server is on a 10Mbps line, the new one on a 100Mbps line. Less than half of the old server's pipe is being actively used, so theoretically I should be able to transfer it all in about a week.
I tried: 1) SCP. That was way too unreliable, and I couldn't get it to resume from where it left off whenever the transfer stopped (like when the servers were restarted).
2) Transfer using a web script. Way too slow; it got to about 35GB, and the total would have taken around two months.
Is there any other, reliable way of transferring data from server to server?
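rsync over SSH is the usual answer here: unlike plain SCP it can be restarted and will skip whatever has already arrived intact. A rough sketch (paths and hostname are placeholders; run it inside screen on the old server so it survives disconnects and reboots):

until rsync -avz --partial --timeout=120 /mnt/data/ user@newserver.example.com:/mnt/data/; do
    echo "transfer interrupted, retrying in 60s" >&2
    sleep 60
done

The until loop simply re-invokes rsync after every failure until a run completes cleanly.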
A few days ago, a friend of mine studying in America recommended a popular new transfer tool, Qoodaa. He told me it was quite a good piece of software for downloading files and movies. At first I was skeptical, but after using it I found Qoodaa is a good choice. I have summarized some of its features:
1. Its speed is faster than any other software I have used before to upload movies.
2. It can download files quickly through download links; in a word, it is a time-saver and highly efficient.
3. No space limit. No matter where you are, it can download fast.
4. Qoodaa is green software with high security and is easy to use.
It really can give you an unexpected surprise.
I am a person who likes to share with others, so if you have something good, please share it with me.
Linux Fedora 6, Apache 2 with Mod Security, MySQL.
Our mod_sec logs get incredibly large very quickly. In the mod_security configuration, we have specified the logging options as:
SecAuditEngine RelevantOnly
SecAuditLogRelevantStatus "^[45]"
but mod_sec.log grows to almost 10 GB (in a matter of 5-6 days) before it is rotated to mod_sec.log.1 and a new one is created.
Is there a way we can specify that the maximum size of one log file is 1 GB, for example? And another question: how come it gets so huge so quickly? We thought that logging "RelevantOnly" would only record requests that are deemed security risks.
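On the second question, note that "^[45]" matches every 4xx and 5xx response, so every ordinary 404 produces a full audit entry; on a busy site that alone can account for the growth. mod_security itself doesn't cap the audit log size, but logrotate can rotate the file once it passes a threshold. A sketch of a drop-in rule, assuming the log lives at /var/log/httpd/mod_sec.log and Apache is reloaded with apachectl (adjust the paths to your layout):

# /etc/logrotate.d/mod_security
/var/log/httpd/mod_sec.log {
    size 1000M
    rotate 5
    compress
    missingok
    notifempty
    postrotate
        /usr/sbin/apachectl graceful > /dev/null 2>&1 || true
    endscript
}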
I have a customer who wants to sell access to videos of conferences he runs.
Each FLV video is approximately 1 to 1.5 hours long and about 380MB, and there will be about 12 videos per conference.
There are approximately 4 to 8 conferences per year.
My customer estimates 10 to 20 people will buy access to watch each video.
Access to watch the videos will be through a password protected webpage.
The issue: the current site's hosting company only allows uploads of up to 150MB per file.
Can I host the Flash videos elsewhere and deliver them through the password-protected web page, without anyone else being able to see them via the server they are hosted on?
This would also reduce the bandwidth going through his current site server.
I'm working on a web site which will basically be a Flash games portal. I have a dedicated server running Apache 2 on a 100Mbit dedicated line, but my download speed for large files (Flash files of over 5MB) is really slow. I suspect this is because of Apache, but I don't know much about this. I've read that I should switch to a lighter HTTP server for serving static files. My server is set up with 2 virtual machines: one doing the PHP processing and the other serving static files, both running Apache, so if I have to change the HTTP server for the static files it would be very easy. I'm just not sure whether that is necessary, or whether I can tune Apache to push files faster than this.
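Before swapping servers, it may be worth confirming the static-file Apache instance is actually tuned for large downloads. The snippet below is only a sketch of the kind of directives to review (Apache 2.2 naming; the worker-MPM numbers are guesses you would size to the VM's RAM), not a drop-in recommendation:

# httpd.conf excerpts on the static-file VM
EnableSendfile On              # let the kernel stream files instead of Apache buffering them
KeepAlive On
MaxKeepAliveRequests 500
Timeout 300
<IfModule mpm_worker_module>
    ServerLimit      16
    ThreadsPerChild  25
    MaxClients       400
</IfModule>

If Apache still can't keep up after tuning, lighttpd or nginx on the static VM is the usual next step.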
I'm facing a very strange FTP issue with one of my shared-hosting accounts. All of my other servers have no problems; only on this one, when I try to upload a file (any file) larger than 500KB from my local PCs, in most cases the file stops uploading partway through and hangs there until it times out.
There are two interesting things though. First, the transfer typically hangs when approximately 248KB of the file has been transferred; please see the attached screenshot for an example.
To give a concrete example: I randomly pick a file and attempt to upload it to my host 10 times. Five times it will hang when 248KB of the total size has been transferred, 3 times it will hang at other points near 248KB (typically 224KB or 280KB), 1 time it will hang at some other random point, and 1 time it might upload successfully (yes, there is still a tiny chance for the file to go through).
Second, my normal upload speed is 80-100KB/s, and I recently found that when I limit the upload speed in my FTP client (e.g. to a maximum of 30KB/s), everything WILL WORK without any problem: no hangs, no interruptions. When I remove the speed limit and let it upload at my regular speed, the problem appears again.
It seems to me that the FTP upload hangs only when the upload speed is higher than 60KB/s. However, my host told me that they have customers uploading without any problem at over 400KB/s, and they said "there's no problem or limitation on the server at all".
Up until now, I have done the following things to troubleshoot the issue, but with no luck:
- Contacted my host.
- Disabled/enabled PASV mode in my FTP client.
- Tried different FTP clients on different computers (FlashFXP and FileZilla).
- Rebooted my router and reset everything to the factory default settings.
- Contacted my ISP about the issue; they "did something", but nothing helped.
- Rebooted all my PCs.
- Disabled both firewalls, on my PC and on the router.
Furthermore, I asked a friend of mine in another city, on a different ISP, to test the FTP upload, and unfortunately he got exactly the same problem. I've also searched the internet for hours, but no one seems to have had the same issue.
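For what it's worth, symptoms like this (hangs only above a certain speed, at roughly the same byte count, across FTP clients and across ISPs) sometimes point to an MTU/fragmentation problem somewhere between the clients and that particular server rather than to FTP itself. One hedged thing to check from a Windows PC is the largest packet that gets through unfragmented (the hostname is a placeholder):

ping -f -l 1472 ftp.myhost.example.com
:: -f sets "don't fragment"; -l 1472 is a 1500-byte MTU minus 28 bytes of headers
:: if Windows reports "Packet needs to be fragmented but DF set", try smaller
:: values (1464, 1400, ...) and lower the router's MTU to match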
Just noticed quite a few large core files within one of our websites (in a subfolder of public_html). Does anyone know what these are and how they got there?
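Those are almost certainly core dumps: when a process run under that account (often a CGI binary or PHP process) crashes, the kernel writes an image of its memory to a file named core or core.<pid> in the process's working directory, which is why they turn up under public_html. A quick sketch for identifying and cleaning them up (paths are examples):

file /home/*/public_html/*/core.*                 # 'file' reports which program dumped core
find /home -name 'core.[0-9]*' -size +10M -ls     # list the large ones
find /home -name 'core.[0-9]*' -delete            # remove them once you know the cause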
I've been using Lypha for the past 4 years, but this was the last straw (gigabytes of backups went missing and they won't reply to emails asking why).
I'm looking for a web hosting package for under $10/month with enough disk space and bandwidth to let me back up large audio/video files to it, as well as handle normal site operation (I use it for a portfolio website, plus hosting additional domains).
I am developing a web application for a private investigative firm. They do surveillance work and therefore have surveillance videos. I would like the capabilities of uploading the videos online and allowing the client to login and view their surveillance video online.
Currently, we get the video from the PI, put it on a DVD and then mail it to the client.
This takes too long. We want the client to be able to view the video online.
Some of these videos can be up to 2 hours long.
First, is this even possible?
Second, how much bandwidth would a website like this take? And is there a host that can hold hundreds of GB of video?
I want to convert it to Flash to save file size and also so I can stream it.
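It is certainly possible; plenty of sites deliver hour-long video behind a login this way. For the conversion step, a hedged example of re-encoding a source file to a smaller FLV with ffmpeg (the bitrates and frame size are guesses to be tuned for acceptable quality, and flag spellings vary a little between ffmpeg versions):

ffmpeg -i surveillance_raw.avi -s 640x480 -b 600k -ar 22050 -ab 64k surveillance.flv
# -s frame size, -b video bitrate, -ar audio sample rate, -ab audio bitrate

At roughly 600 kbit/s of video, a two-hour recording comes out somewhere around 600MB, so bandwidth is mostly a question of how many clients actually watch each video.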
We had some issues with our old server, so we migrated some websites from it to a new server.
We did a backup of all the existing web files and databases on the old server and transferred them to the new server manually a few weeks ago.
However, we were not able to back up our AWStats data for these domains. Can someone explain how to transfer AWStats from the old server to the new one?
We cannot perform an automated transfer from the old server to the new server now. Is there some way we can migrate our AWStats data for these domains?
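AWStats keeps its accumulated statistics in plain-text data files (one awstatsMMYYYY.<domain>.txt file per month) in its data directory, so migrating usually just means copying those files into the same location on the new server and keeping the same config. A sketch assuming a cPanel-style layout where the data lives under each account's ~/tmp/awstats (adjust the paths if your setup stores them elsewhere):

# run on the old server, once per account
rsync -av /home/username/tmp/awstats/ root@newserver.example.com:/home/username/tmp/awstats/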
I have a VPS (cPanel). I would like incoming emails for a certain cPanel account to be passed on to another external server (after coming through the VPS).
I store my emails on the external server and have more space there.
The reason behind this is:
I have SpamAssassin on my VPS and would like to run email through it before it is delivered to the external server. I do not have the ability to install SpamAssassin on the external server.
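On a cPanel box this is normally just a forwarder: the VPS accepts the mail, Exim hands it to SpamAssassin as configured, and the message is then forwarded on. It can be set up in cPanel under Forwarders, or, as a sketch, directly in the valiases file for the domain (addresses are placeholders; whether the SpamAssassin verdict is applied before forwarding depends on how Exim is configured on the VPS):

# /etc/valiases/example.com
user@example.com: user@external-server.example.net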
I have four servers, each with a quad-core Xeon, 4GB RAM, and 2x300GB 15K SAS drives in RAID 0, pushing a total of 1.6Gbit. They serve a lot of zip files with an average file size of 180MB. My question is: how can I optimize lighttpd 1.4.19 to push its maximum with very low I/O wait? I've looked around and only found options that deal with lighttpd 1.5 and its Linux AIO network backend. Currently I use writev with 16 workers and a read/write idle timeout of 10s. Logging is off, too.
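A few lighttpd 1.4 settings worth reviewing for large static downloads; the values below are only starting points, not recommendations. 1.4 doesn't have the AIO backends from 1.5, but linux-sendfile generally keeps I/O wait lower than writev for big files:

# lighttpd.conf excerpts
server.network-backend     = "linux-sendfile"   # zero-copy sends instead of writev
server.event-handler       = "linux-sysepoll"
server.max-worker          = 8
server.max-keep-alive-idle = 10
server.max-read-idle       = 10
server.max-write-idle      = 30                 # give slow clients longer on 180MB files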
Something weird is happening here. I have tried every command I can think of...
There are a number of folders I want to remove from my server. I tried the good old and simple...
rm -r /folder/
And ended up with output as long as my screen. No matter what I do, as it recurses into the directory it asks me whether I want to remove each file individually. Whatever flags I use, it insists on asking me before deleting each file.
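That per-file prompt is usually not rm itself but a shell alias: on Fedora/Red Hat systems, root's shell commonly ships with alias rm='rm -i'. A quick sketch of how to check and get around it:

alias rm          # shows whether rm is aliased to 'rm -i'
\rm -r /folder/   # the backslash bypasses the alias for this one command
rm -rf /folder/   # or add -f to suppress the prompts entirely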
I just want to know: is it safe to do a remote daily backup of about 70,000 files?
The files are about 200KB each, and every day I get about 1,000 new ones. rsync would first have to check the old files, because I delete about 30-50 of them daily, and then back up the 1,000 new files. So how long will it take each time to compare those 70,000 files?
I have 2 options now:
1- use a second HDD and RAID 1
2- use rsync and back up to my second server, which would save me about $70 each month.
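For option 2, a hedged sketch of what the nightly job could look like (hostnames and paths are placeholders). The comparison pass over 70,000 files is cheap because rsync only compares file size and modification time by default, so most of the run is spent transferring the ~1,000 new files; --delete takes care of the 30-50 files removed each day:

# run from cron on the primary server
rsync -az --delete /home/me/files/ backup@backupserver.example.com:/backups/files/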