A few days ago, a friend of mine studying in America recommended a popular new transfer tool, Qoodaa. He told me it was quite a good piece of software for downloading files and movies. At first I was skeptical, but after using it I found Qoodaa to be a good choice. Here are the features I've noticed:
1. It uploads movies faster than any other software I've used before.
2. It downloads files quickly through download links; in short, it's a time-saver with high efficiency.
3. No space limit. No matter where you are, it downloads fast.
4. Qoodaa is a clean program, secure and easy to use.
It really can surprise you.
I'm the kind of person who likes to share with others, so if you have something good, please share it with me.
Compared to others, my sites may not be that big, but I have one site that is 5 GB and another that is about 9 GB. I was wondering what the best or most recommended way is to transfer these sites to a new host.
I tried downloading the whole site using FireFTP, but I always seem to get about a third of the way done before something messes up the connection. Are there any better tools or methods for this?
I also have some pretty large DBs that I'll need to transfer as well.
My client's website needs to hold files that are around 60 or 70 MB. The host only allows files up to 10 MB. Is that typical?
Right now I'm linking to a file-storage service, but I'd rather make the files available from my own site without sending visitors to a third-party site. He doesn't want his files zipped either; they should be a straight download.
So I've recently ordered a Supermicro 4U server with 24x 1 TB HDs and 64 GB RAM, set up in RAID 10. I'm running Debian 5.0 and have installed lighttpd. All the content I serve is video files (AVI, MP4, MKV, OGM), each about 100-500 MB in size. I'm wondering how I can optimize lighttpd to get the best performance out of it. I look forward to your replies.
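For large static video files, lighttpd tuning mostly means enabling zero-copy serving and letting the kernel page cache do the caching. A sketch of the relevant lighttpd.conf directives; the values are illustrative assumptions, not benchmarks, so check them against the documentation for your lighttpd version:

```
# /etc/lighttpd/lighttpd.conf -- illustrative values only
server.network-backend     = "sendfile"  # zero-copy file serving on Linux
server.max-worker          = 4           # spread load across processes
server.max-keep-alive-idle = 10          # free connections from idle players
server.stat-cache-engine   = "simple"    # cache stat() results
# With 64 GB RAM, hot files end up in the kernel page cache automatically;
# no application-level caching is needed for static video.
```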
I have been trying, quite unsuccessfully, to import a large SQL DB file via phpMyAdmin for one of my clients. Since the file is about 250 MB, I get a server timeout error. How can I do this via SSH? I have a 64-bit CentOS 6.5 server running Plesk 12.0.18.
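The usual route for a dump this size is the mysql command-line client over SSH, which has no PHP timeout at all. A command sketch, not runnable as-is: dbuser, dbname, and the dump path are placeholders.

```
# Import directly, bypassing phpMyAdmin and all PHP limits:
mysql -u dbuser -p dbname < /path/to/dump.sql

# If the dump is gzipped, pipe it through zcat:
zcat /path/to/dump.sql.gz | mysql -u dbuser -p dbname

# Run it inside screen (or with nohup) so a dropped SSH
# session doesn't kill a long-running import:
screen -S import
mysql -u dbuser -p dbname < /path/to/dump.sql
```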
I'm looking for advice on good hosting setups for getting large amounts of disk space.
I would like to be able to offer up to 2 GB of storage space to hundreds, maybe up to a few thousand users; any solution should scale well. The files would be static and might be up to 400 MB in size.
It would be nice to give users FTP access to their disk space, although that's not a core requirement.
I'm currently running on a VPS. My site allows large file uploads and downloads, with files over 600 MB in size.
The server has issues when the site gets three or more simultaneous requests for large file downloads. I'm trying to grow this site to thousands of users, and that's hard to do when it can't handle even three.
I've been told by my host that I need to upgrade to a dedicated server. My VPS has only 512 MB of RAM, and a single large file download eats up that RAM, which is causing the issue.
I'm a newbie, and while I knew I was taking a risk by going with a VPS, I find it a bit annoying that these guys advertise 1 TB of bandwidth per month when I can't even serve 1 GB of concurrent downloads... maybe it's just me.
Anyway, I am now looking into moving the large files and the upload/download handling over to Amazon S3. If I do this, I expect my RAM usage on the VPS to decrease greatly. Is this correct? If my PHP code runs on the VPS but the actual file download over HTTP comes from S3, that should not be a heavy load on my box, correct?
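That is the idea: once the bytes flow from S3, the VPS only generates links, which costs almost no RAM. A sketch with the AWS CLI, assuming awscli is installed and configured; the bucket name and file are placeholders:

```
# Upload the large file once:
aws s3 cp bigfile.zip s3://my-bucket/downloads/bigfile.zip

# Hand out a short-lived download URL (here: valid 1 hour);
# the PHP app would generate the equivalent via the AWS SDK:
aws s3 presign s3://my-bucket/downloads/bigfile.zip --expires-in 3600
```

The presigned-URL approach also keeps the files private: visitors can only download through links your application chooses to issue.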
Just moved to a new server, and of course 10 GB doesn't seem that large for a server, but for some reason wget is not able to handle the transfer of that backup for me... it transfers about 1 MB and then tells me "successful transfer..."
The old server is running cPanel, and the new server is just a plain server that I haven't set anything up on yet.
How can I get this full backup over to the new server?
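One likely explanation, if the backup was being fetched over HTTP: a transfer that "succeeds" after about 1 MB often means wget received a login or error page rather than the archive itself. Pulling the file over SSH avoids that entirely. A command sketch; host names and paths are placeholders based on cPanel's usual backup location:

```
# Run on the new server; copies the backup over SSH:
scp root@old-server:/home/username/backup-full.tar.gz /root/

# Or resumable, with progress, if rsync is installed on both ends:
rsync -avP root@old-server:/home/username/backup-full.tar.gz /root/
```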
I run a large business, and sometimes I have to transfer large, very important data files to my business partner. I worry about my data because many of my competitors would certainly try to steal it, so there is a huge amount of risk involved in sharing important data over the Internet. I recently heard about a secure file transfer technique from a friend who works at a well-established software company. Does anyone have any idea what a Secure File Transfer (SFT) service is and how it works?
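"Secure file transfer" generally combines two things: an encrypted transport (SFTP/SCP over SSH, or FTPS) so the data cannot be sniffed in transit, and optionally encrypting the file itself before sending, so even the receiving server only ever holds ciphertext. The sketch below shows the second part with OpenSSL; the passphrase and file names are placeholders, and the scp line is a commented placeholder for the transport step.

```shell
# Transport sketch (placeholder host/path):
#   scp confidential.tar.gz partner@partner-host:/incoming/

# End-to-end encryption sketch with OpenSSL (symmetric key,
# shared with the partner out of band):
echo "quarterly figures" > /tmp/secret.txt
openssl enc -aes-256-cbc -pbkdf2 -salt -pass pass:demo123 \
    -in /tmp/secret.txt -out /tmp/secret.txt.enc
# ...file is transferred... recipient decrypts:
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:demo123 \
    -in /tmp/secret.txt.enc -out /tmp/secret.out
cat /tmp/secret.out
```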
In reference to my previous post, I want to transfer across 7 GB of data, approximately 80,000 files I believe (due to a gallery script).
It's currently on another host (a web-hosting account) that uses its own control panel, which has no options beyond managing databases. The only way I can see to do this is via FTP, but that would take me days. I've tried compression and backup scripts, but the execution-time limit on the host's server is too low to allow the files to be zipped. Are there any other ways? Can I log in to my VPS via SSH and somehow pull the files off the other host's server?
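Yes: SSH into the VPS and pull the whole tree from the old host with lftp's mirror mode, which runs several FTP connections in parallel and can resume an interrupted run, both of which matter with 80,000 small files. A command sketch; the credentials, host, and paths are placeholders:

```
# Run on the VPS; mirrors the remote tree to a local directory:
lftp -u ftpuser,ftppass -e \
  'mirror --continue --parallel=4 /public_html /home/me/site; quit' \
  old-host
```

`--continue` means a second invocation skips files already transferred, so a dropped connection costs little.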
I want to set up my desktop to be a kind of file server, so I can access all my files on my home desktop from school (and back up all my files to my reliable desktop, as opposed to my not-so-reliable laptop).
The next thing I want is to reach my desktop using remote access, so I can control everything on it while I'm not there.
My laptop is running Vista Home Premium; I don't think that matters too much. My desktop is running XP Home Edition.
I have a No-IP account, but I don't really know what my next step would be. I'm guessing setting up some sort of FTP server on my desktop? And I have no clue how I'd do the remote desktop part.
I can't find out how to resume a file transfer via SFTP on the command line.
I use the PUT command to upload a file, but when the connection fails and I start again, the transfer starts from the beginning. How can I make it check the already-uploaded part and then resume?
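Reasonably recent versions of the OpenSSH sftp client can do exactly this: `reput` (or `put -a`) checks the size of the partial remote file and appends from there instead of restarting. A session sketch with placeholder names; rsync over SSH is an alternative if it is installed on the server:

```
# Reconnect after the failure, then resume instead of restarting:
#   sftp user@host
#   sftp> reput bigfile.iso /remote/path/bigfile.iso
# (equivalently: put -a bigfile.iso /remote/path/bigfile.iso)

# Alternative -- rsync resumes automatically on re-run:
#   rsync --partial --append-verify bigfile.iso user@host:/remote/path/
```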
I keep getting this error whenever I try to upload something that's too large.
The transfer runs fine for a few minutes, then it stalls and I eventually get the error message.
Small files are fine since they don't take long to transfer; it just seems that I can't have a file transferring for too long.
I can actually get the file through, but since I keep getting that error, I have to keep manually clicking the transfer button. It takes about 20 tries before I can finish a 30 MB file, and I have a lot of files to transfer, so it's very troublesome.
Does anyone know what the problem might be? I tried turning off the firewall and opening ports on my router, but it still doesn't work. I'm using CuteFTP.
Sorry, the copy failed. Unable to find the cpanel user file. Is the archive missing (cwd: /root/cprestore loaded cpmove-clanpz/cp/clanpz)? checked 4 files.....
1. /scripts/pkgacct username
2. Transfer the backup to the new server
3. /scripts/restorepkg username
4. This error
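"Unable to find the cpanel user file" usually means restorepkg could not find the expected layout inside the archive, often because the archive is truncated or not where restorepkg looks. A sanity-check sketch before retrying; the archive name cpmove-clanpz comes from the error message above, and the paths are assumptions based on cPanel defaults:

```
# Confirm the archive is intact and contains the cp/<user> file:
tar -tzf cpmove-clanpz.tar.gz | grep 'cp/clanpz'

# Compare sizes on old and new server -- a truncated transfer
# is a common cause:
ls -l cpmove-clanpz.tar.gz

# restorepkg can be pointed at the archive explicitly:
/scripts/restorepkg /home/cpmove-clanpz.tar.gz
```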
My error logs are growing rapidly, all showing the same message:
$ug-non-zts-20020429/ffmpeg.so' - /usr/local/lib/php/extensions/no-debug-non-zts-20020429//usr/local/lib/php/extensions/no-debug-non-zts-20020429/ffmpeg.so: cannot open shared object file: No such file or directory in Unknown on line 0
root@server [~]# ls /usr/local/lib/php/extensions/no-debug-non-zts-20020429
./ ../ eaccelerator.so*
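The message means the loaded php.ini still has an `extension=ffmpeg.so` line, but the shared object is gone (the directory only contains eaccelerator.so). Either reinstall ffmpeg-php or comment the line out. A sketch on a throwaway copy; the real php.ini path is an assumption, so confirm it with `php -i | grep php.ini` first:

```shell
# Demonstrate the edit on a scratch copy of php.ini:
printf 'extension=eaccelerator.so\nextension=ffmpeg.so\n' > /tmp/php.ini.demo
sed -i 's/^extension=ffmpeg.so/;extension=ffmpeg.so/' /tmp/php.ini.demo
cat /tmp/php.ini.demo

# On the real server (path is an assumption -- verify first):
#   sed -i 's/^extension=ffmpeg.so/;extension=ffmpeg.so/' /usr/local/lib/php.ini
#   service httpd restart
```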
I manage a Linux Apache web server with a few WordPress blogs, and from time to time I see someone inject a malicious .php file into a wp-content/uploads/2014/10/ directory.
I think it's some bad plugin or theme, but it happens on more than one blog, and I keep WP upgraded and updated.
How can I set up a monitor that tells me which PHP file (or even which line in a PHP file) injected the malicious .php? I have Linux root access, so I can set up anything.
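With root access, auditd can answer "which process wrote this file": a watch on the uploads directory records the PID, working directory, and executable for every write, which points at the vulnerable script. The blog path below is an assumption. As a cheaper complement, a cron job can flag any .php file under uploads/, since that directory should never legitimately contain PHP; the runnable part below demonstrates that check locally.

```shell
# auditd sketch (run as root; path is a placeholder):
#   auditctl -w /var/www/blog/wp-content/uploads -p w -k wp-inject
#   ausearch -k wp-inject     # later: shows pid/exe/cwd of each writer

# Cron-style check, demonstrated on a local stand-in directory:
mkdir -p /tmp/uploads/2014/10
touch /tmp/uploads/2014/10/evil.php   # simulated injected file
find /tmp/uploads -name '*.php'       # anything printed is suspect
```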
I'm having a long-running issue where my databases are too large to import in phpMyAdmin through Plesk. Unfortunately I don't have direct access to phpMyAdmin and can only reach it as a DB user through Plesk.
I have tried to edit php.ini in the following locations:
upload_max_filesize = changed this to 64M
post_max_size = changed this to 32M
maximum_execution_time = changed this to 300
maximum_input_time = changed this to 300
Why am I still not able to import my DBs, which are only about 8 MB each?
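Two likely culprits, stated as assumptions worth checking: PHP's actual directive names are `max_execution_time` and `max_input_time` (no "maximum_" prefix), so misspelled lines are silently ignored, and `post_max_size` must be at least as large as `upload_max_filesize` for uploads to succeed. Also confirm which php.ini Plesk's PHP actually loads (`php -i | grep "Loaded Configuration File"`). The sketch writes the corrected lines to a scratch file for illustration:

```shell
# Corrected directive spellings and a post_max_size >= upload_max_filesize:
printf '%s\n' \
  'upload_max_filesize = 64M' \
  'post_max_size = 64M' \
  'max_execution_time = 300' \
  'max_input_time = 300' > /tmp/php.ini.check
grep 'max_' /tmp/php.ini.check
```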