How To Transfer A Large File
Mar 28, 2008
I have a situation here.
I wanted to transfer a large file (about 10 GB) from one server to another. Both servers are on the same LAN.
On server1, I archived a folder as file.tar.gz.
From server2, I run wget http://server1.com/file.tar.gz and it says "404 Not Found".
I suspect it won't let me download the file because it is too big.
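A 404 normally means the web server simply can't find the file at that URL (for example, it sits outside the document root), rather than the file being too large. Since both servers are on the same LAN and you have shell access, a rough sketch using scp or rsync instead of HTTP (the hostname and destination path here are placeholders):
scp file.tar.gz user@server2.example.com:/path/to/destination/
# or, resumable if the connection drops:
rsync -av --progress file.tar.gz user@server2.example.com:/path/to/destination/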
View 7 Replies
Jun 10, 2008
I came across Qoodaa quite by chance.
A few days ago a friend of mine studying in America recommended a popular new transfer tool, Qoodaa. He told me it was quite a good piece of software for downloading files and movies. At first I was skeptical, but after using it I found Qoodaa a good choice. I have summarized some of its features:
1. It uploads movies faster than any other software I have used before.
2. It downloads files quickly through download links; in short, it saves time and is very efficient.
3. No limit on space. No matter where you are, it downloads fast.
4. Qoodaa is lightweight software with good security and is easy to use.
It really can give you an unexpected surprise.
I like to share with others, so if you have something good, please share it with me.
View 0 Replies
View Related
Jan 31, 2007
I need to transfer a MySQL database with well over 1 million rows to a different server.
What is the best way to do this without losing any data? I've tried using the export feature in phpMyAdmin, but it doesn't seem to handle it very well.
There has to be a feature in WHM or cPanel for this.
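If you have SSH access to both servers, the usual approach is mysqldump rather than the phpMyAdmin export; a minimal sketch (database and user names are placeholders):
# on the old server
mysqldump -u dbuser -p dbname > dbname.sql
# copy dbname.sql across (scp/rsync), then on the new server
mysql -u dbuser -p dbname < dbname.sql
cPanel's full account backup also includes databases, so a WHM account transfer is another option.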
View 2 Replies
View Related
Jun 15, 2008
I have a Debian box and have archived a gallery into a .tar file, 5.77 GB.
I have a CentOS box and have used wget to bring the data file over to the new server.
However, it only detects the file as 1.8 GB when it starts downloading.
I have terminal access to both servers; I'm just trying to bring my files over from one server to the other.
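If I recall correctly, older wget builds (before roughly version 1.10) could not handle files larger than 2 GB, which would explain the odd size. Since you have terminal access to both servers, a sketch that avoids HTTP entirely (hostname and paths are placeholders):
rsync -av --progress user@debianbox.example.com:/path/to/gallery.tar /backup/
# or
scp user@debianbox.example.com:/path/to/gallery.tar /backup/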
View 4 Replies
View Related
Mar 5, 2007
Compared to others, my sites may not be that big, but I have one site that is 5 GB and another that is about 9 GB. I was wondering what the best or most recommended way is to transfer these sites to a new host.
I tried downloading the whole site using FireFTP, but I always seem to get about a third of the way done before something messes up the connection. Are there any better tools or methods for this?
I also have pretty large databases that I'll need to transfer as well.
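If both hosts give you SSH access, rsync handles interrupted transfers much better than FTP, since re-running it only copies what is missing or changed; a rough sketch (paths and hostname are placeholders):
rsync -avz --progress /home/site1/public_html/ user@newhost.example.com:/home/site1/public_html/
For the databases, mysqldump on the old host and mysql on the new one is the usual route.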
View 11 Replies
View Related
Jul 29, 2008
My client's website needs to hold files that are around 60 or 70 MB. The host only allows files up to 10 MB. Is that typical?
Right now I'm linking to a file-storage service, but I would rather make the files available from my own site without sending visitors to a third-party site. He doesn't want his files zipped either; they should just be a straight download.
View 7 Replies
View Related
Apr 14, 2009
So I've recently ordered a Supermicro 4U server with 24x 1 TB HDs and 64 GB of RAM, set up in RAID 10. I'm running Debian 5.0 and have installed lighttpd. All the content I serve is video files (AVI, MP4, MKV, OGM), and each file is about 100-500 MB in size. I'm wondering how I can optimize lighttpd to get the best performance out of it. I look forward to your replies.
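A few lighttpd settings that are commonly tuned for serving many large static files; the values below are only starting points, not recommendations for your exact hardware:
server.network-backend = "linux-sendfile"   # let the kernel stream file data directly
server.max-worker = 4                       # a handful of worker processes
server.max-fds = 4096                       # enough file descriptors for many connections
server.max-keep-alive-idle = 10             # drop idle keep-alive connections sooner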
View 14 Replies
View Related
Mar 26, 2008
Can anyone tell me the format of the scp command for transferring an 18 GB tar.gz file from one server to another?
Say I'm logged in to the old host via SSH and the backup is located at /home/public_html/cpbackup/blahblah.tar.gz.
What command would I run to transfer it to the new server?
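A minimal sketch, run from the old host while logged in over SSH (the new server's hostname and destination directory are placeholders):
scp /home/public_html/cpbackup/blahblah.tar.gz root@newserver.example.com:/home/
# if SSH on the new server listens on a non-standard port, add -P <port>
The file is already compressed, so there is no need for scp's -C option.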
View 3 Replies
View Related
Apr 6, 2015
I have been trying, quite unsuccessfully, to import a large SQL database file via phpMyAdmin for one of my clients. Since the file is about 250 MB, I get a server timeout error. How can I do this via SSH? I have a 64-bit CentOS 6.5 server running Plesk 12.0.18.
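If the dump file is already on the server (or you can copy it up), importing from the shell avoids phpMyAdmin's upload and timeout limits entirely; a sketch with placeholder names:
mysql -u dbuser -p dbname < /path/to/dump.sql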
View 4 Replies
View Related
Aug 21, 2007
I'm looking for advice on good hosting setups for getting large amounts of disk space.
I would like to be able to offer up to 2 GB of storage space each to hundreds, maybe up to a few thousand, users; any solution should scale well. The files would be static files that might be up to 400 MB in size.
It would be nice to be able to give users FTP access to their disk space, although it's not a core requirement.
View 4 Replies
View Related
Jul 24, 2007
I'm currently running on a VPS. My site allows large file uploads and downloads, with files over 600 MB in size.
The server has issues when the site gets three or more requests for large file downloads. I'm trying to grow this site to thousands of users, and that is hard to do when it can't even handle three.
I've been told by my host that I need to upgrade to a dedicated server. My VPS only has 512 MB of RAM, and a single large file download eats it up. This is causing the issue.
I'm a newbie, and while I knew I was taking a bit of a risk by going with a VPS, I find it a bit annoying that these guys advertise 1 TB of bandwidth per month yet I can't even support 1 GB of simultaneous downloads... maybe it's just me.
Anyway, I am now looking into moving the large files and the upload/download traffic over to Amazon S3. If I do this, I expect my RAM usage on the VPS to decrease greatly. Is this correct? If my PHP code runs on the VPS but the actual file download over HTTP comes from S3, that should not be a heavy load on my box, correct?
Any opinions on S3?
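If the file bytes are served from S3, the download no longer passes through your VPS at all, so its RAM is only used for generating the pages and links. One common way to push files up is s3cmd (bucket and file names here are placeholders):
s3cmd put --acl-public bigvideo.zip s3://my-bucket/bigvideo.zip
# then link users straight to the resulting S3 URL instead of a local file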
View 2 Replies
View Related
May 21, 2007
How do I transfer files from my old hosting provider to a new hosting provider with zero downtime?
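A common pattern, sketched with placeholder paths and hostname: copy everything ahead of time, then do a quick final sync at cutover so the switch itself takes minutes rather than hours:
rsync -avz /home/site/public_html/ user@newhost.example.com:/home/site/public_html/
# lower the domain's DNS TTL in advance, re-run the same rsync at cutover to pick up
# only the recent changes, then point DNS at the new host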
View 5 Replies
View Related
Jul 8, 2009
I just moved to a new server, and of course 10 GB doesn't seem that large for a server, but for some reason wget is not able to handle the transfer of that backup for me... it transfers about 1 MB and then tells me the transfer was successful.
The old server is running cPanel, and the new server is just a plain server that I haven't loaded up yet.
How can I get this full backup over to the new server?
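A truncated wget transfer that reports success often means the server handed back a short error page instead of the backup itself. Since you control both machines, pulling the file over SSH sidesteps that; a sketch with a placeholder hostname and filename:
scp root@oldserver.example.com:/home/cpmove-account.tar.gz /home/
# or, resumable if the link drops partway through:
rsync -av --partial --progress root@oldserver.example.com:/home/cpmove-account.tar.gz /home/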
View 11 Replies
View Related
Jan 21, 2008
I run a large business, and sometimes I have to transfer large and very important data files to my business partner. I worry about my data because many of my competitors would certainly try to steal it, so there is a huge amount of risk involved in sharing important data over the Internet. I recently heard about a secure file transfer technique from a friend who works at a well-established software company. Does anyone have any idea what a Secure File Transfer (SFT) service is and how it works?
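In practice, "secure file transfer" usually just means sending the file over an encrypted channel such as SFTP or SCP (both run over SSH) instead of plain FTP or email. A minimal sketch, with a placeholder hostname and directory:
scp -r important-data/ partner@partner-server.example.com:/incoming/
# or interactively:
sftp partner@partner-server.example.com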
View 2 Replies
View Related
Oct 23, 2009
In reference to my previous post, I want to transfer across 7 GB of data, approximately 80,000 files I believe (due to a gallery script).
It's currently on another host (a shared web hosting account) with its own control panel, which offers no options beyond managing databases. The only way I can see to do this is via FTP, but that would take me days. I've tried compression and backup scripts, but the execution time limit on the host's server is too low to let the files be zipped. Are there any other ways? Can I log in to my VPS via SSH and somehow pull the files off the other host's server?
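Yes: from an SSH session on the VPS you can pull the files straight off the old host's FTP server, which takes your own connection out of the equation. A rough sketch using lftp (you may need to install it first, e.g. yum install lftp; credentials, hostname, and paths are placeholders):
lftp -u ftpuser,ftppass -e "mirror --parallel=4 /public_html /home/me/site; quit" oldhost.example.com
# a single-stream alternative:
wget -m ftp://ftpuser:ftppass@oldhost.example.com/public_html/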
View 6 Replies
View Related
Nov 8, 2008
I want to set up my desktop as a kind of file server, so I can access all my files on my home desktop from school (and back up my files to my reliable desktop as opposed to my not-so-reliable laptop).
The next thing I want to do is access my desktop using remote access, so I can control everything on it while I'm not there.
My laptop is running Vista Home Premium; I don't think that matters too much. My desktop is running XP Home Edition.
I have a No-IP account, but I don't really know what my next step would be. I'm guessing I need to set up some sort of FTP server on my desktop? And I have no clue how I'd do the remote desktop part.
View 7 Replies
View Related
Apr 26, 2007
I am unable to find out how to resume a file transfer via SFTP on the command line.
I use the put command to upload the file, but when the connection fails and I start again, the transfer starts from the beginning. How can I make it check the already-uploaded part and then resume?
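A plain sftp put has no resume in many builds (some clients offer reget/reput, but availability varies). If you can run rsync over the same SSH connection, it keeps partial files and continues where it left off; a sketch with placeholder names:
rsync --partial --progress -e ssh /path/to/bigfile user@remotehost.example.com:/path/to/dest/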
View 1 Replies
View Related
Dec 29, 2007
I keep getting this error whenever I try to upload something that is too large.
It transfers fine for a few minutes, then it stalls and eventually I get this error message.
Small files are fine since they don't take long to transfer; it just seems that I can't have a file transferring for too long.
I can actually get the file through, but since I keep getting that error I have to keep clicking the transfer button manually. It takes about 20 tries before I can finish a 30 MB file, and I have a lot of files to transfer, so that's very troublesome.
Does anyone know what the problem might be? I tried turning off the firewall and opening ports on my router, but it still doesn't work. I'm using CuteFTP.
View 2 Replies
View Related
Jul 11, 2007
I have a problem:
Searching /root....
Found cpmove-clanpz.tar.gz!
Moving Packge to /root/cprestore/cpmove-clanpz.tar.gz
Extracting tarball....................
Done
Extracting Domain....Done
Sorry, the copy failed. Unable to find the cpanel user file. Is the archive missing (cwd: /root/cprestore loaded cpmove-clanpz/cp/clanpz)?
I have checked these four steps:
1. /scripts/pkgacct username
2. Transferred the backup to the new server
3. /scripts/restorepkg username
4. Got this error
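For reference, the manual version of that transfer is roughly the following (the account name and new-server hostname are placeholders; pkgacct normally drops the archive in /home):
/scripts/pkgacct username                      # on the old server
scp /home/cpmove-username.tar.gz root@newserver.example.com:/home/
/scripts/restorepkg username                   # on the new server
The error above often indicates an incomplete or mismatched cpmove archive, so it may be worth re-creating it and checking its size before restoring again.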
View 14 Replies
View Related
Feb 26, 2009
I have a unique tool for moving your files from an old host server to a new one, whatever your reason for moving.
Find out more later in my post...
View 6 Replies
View Related
Feb 10, 2015
I'm running Plesk Panel for Linux with Presence Builder. I don't want my users to be able to upload their websites to the hosting account via File Manager. How can I do that?
View 2 Replies
View Related
May 26, 2008
Say I have two websites, and they both use file.php, which is located at mainserver.com/file.php.
I want to use the file like this:
website1.com/file.php
website2.com/file.php
View 2 Replies
View Related
Mar 6, 2008
My error logs are growing rapidly, all showing the same message:
$ug-non-zts-20020429/ffmpeg.so' - /usr/local/lib/php/extensions/no-debug-non-zts-20020429//usr/local/lib/php/extensions/no-debug-non-zts-20020429/ffmpeg.so: cannot open shared object file: No such file or directory in Unknown on line 0
root@server [~]# ls /usr/local/lib/php/extensions/no-debug-non-zts-20020429
./ ../ eaccelerator.so*
I'm using cPanel 11 / CentOS 4.
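The warning means PHP is still being told to load ffmpeg.so even though the file is gone from the extensions directory. A sketch of how to track down and silence it (the php.ini path is an assumption; adjust for your build):
grep -n "ffmpeg" /usr/local/lib/php.ini
# comment out or delete the matching extension=ffmpeg.so line, then restart Apache,
# or reinstall ffmpeg-php so the .so file exists again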
View 1 Replies
View Related
Jun 16, 2008
I have a server running CentOS. I need to edit the hidden .htaccess file from cPanel's File Manager, but hidden files are not shown.
How can I change the setting so that hidden files appear in cPanel's File Manager?
View 6 Replies
View Related
Sep 17, 2014
How can we stop Plesk from resetting the file permissions on a DLL file located under
C:\Program Files (x86)?
Specifically, we have a file, jmail.dll, here:
C:\Program Files (x86)\Dimac\w3JMail
By default, Plesk sets the permissions to DENY for PSACLN.
But the JMail plugin cannot work with these permissions!
We change this to ALLOW for READ & EXECUTE and DENY for WRITE, and everything works fine.
But every time Plesk does an update, it reverts the change!
This means that a number of our customers' contact forms stop working!
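One workaround is to re-apply the ACL after each Plesk update with a small script, for example using icacls (the path is as reconstructed above and may differ on your system):
icacls "C:\Program Files (x86)\Dimac\w3JMail\jmail.dll" /remove:d PSACLN
icacls "C:\Program Files (x86)\Dimac\w3JMail\jmail.dll" /grant "PSACLN:(RX)"
icacls "C:\Program Files (x86)\Dimac\w3JMail\jmail.dll" /deny "PSACLN:(W)"
That clears the blanket DENY entry and then grants READ & EXECUTE while denying WRITE, matching the manual change described above.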
View 12 Replies
View Related
Nov 22, 2008
I'm trying to do this
/usr/bin/gzip -p /home/mysite/public_html/shop/feeds/myfile.xml.zip > /home/mysite/public_html/shop/feeds/myfile.xml
But it just tells me
/usr/bin/gzip: invalid option -- p
X-Powered-By: PHP/5.2.5
Content-type: text/html
How do I find the correct option to unzip the first file into the second file?
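gzip has no -p option; decompression is -d (or the gunzip wrapper), and -c writes the result to standard output so it can be redirected. Which tool applies depends on what the .zip file actually is, so both variants are sketched here:
# if the file is really gzip data despite the .zip name:
gunzip -c /home/mysite/public_html/shop/feeds/myfile.xml.zip > /home/mysite/public_html/shop/feeds/myfile.xml
# if it is a real zip archive:
unzip -p /home/mysite/public_html/shop/feeds/myfile.xml.zip > /home/mysite/public_html/shop/feeds/myfile.xml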
View 8 Replies
View Related
Oct 10, 2014
I manage a Linux Apache web server with a few WordPress blogs, and from time to time I see someone inject a malicious .php file into the wp-content/uploads/2014/10/ directory.
I think it's some bad plugin or theme, but it affects more than one blog. I upgrade and update WordPress, but it keeps happening.
How can I set up some monitoring that tells me which PHP file (or even which line in a PHP file) planted that malicious .php? I have root access on the Linux box, so I can set up anything.
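The kernel audit system can record which process touched the uploads directory. A rough sketch, assuming the blog lives under /var/www (adjust the path for your setup):
auditctl -w /var/www/html/wp-content/uploads -p wa -k wp_uploads
# later, see what wrote there:
ausearch -k wp_uploads
The audit records include the writing process's PID and executable, which you can then match against the Apache and PHP logs to find the offending script.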
View 3 Replies
View Related
Apr 5, 2007
I'm having a long-running issue where my databases are too large to import through phpMyAdmin using Plesk. Unfortunately, I don't have direct access to phpMyAdmin and can only reach it as a database user through Plesk.
I have tried editing php.ini with the following changes:
upload_max_filesize = changed this to 64M
post_max_size = changed this to 32M
maximum_execution_time = changed this to 300
maximum_input_time = changed this to 300
Why am I still not able to import my databases, which are about 8 MB each?
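Two things worth double-checking: PHP's actual directive names are max_execution_time and max_input_time (there are no maximum_* variants), and post_max_size must be at least as large as upload_max_filesize or the larger limit never takes effect. A sketch of the relevant php.ini section (restart the web server afterwards):
upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 300
max_input_time = 300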
View 4 Replies
View Related