Share a Good Transnational Tool for Large File Transfers
Jun 10, 2008
I came across Qoodaa quite by chance.
A few days ago, a friend of mine studying in America recommended a popular new transfer tool called Qoodaa. He told me it was quite a good program for downloading files and movies. At first I was skeptical, but after using it I found Qoodaa to be a good choice. I have summarized some of its features:
1. It uploads movies faster than any other software I have used before.
2. It downloads files quickly from download links; in a word, it is a time-saver with high efficiency.
3. No space limits. No matter where you are, it downloads fast.
4. Qoodaa is a green (install-free) program that is secure and easy to use.
It really can surprise you.
I am someone who likes to share with others, so if you have something good, please share it with me.
I'm sure you all may have heard this question before, so I'm sorry if I'm beating a dead horse... I just can't seem to find a good answer. I am interested in setting up a file server / file share on a VPS so that I can create a mapped drive on a Windows PC that points to the share on the VPS. I have a client who currently uses a physical server for this task; however, that server is under-utilized and somewhat unnecessary. I mentioned the possibility of moving to a VPS and he seemed interested. I decided to purchase an entry-level account from VPSLAND for testing purposes prior to moving forward with the project. I can't seem to get anything to work, so I'm looking for ideas.
I purchased a VPSLAND Windows-based EZ-series VPS with Plesk and all the other bundled goodies.
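For reference, mapping the share from the Windows PC would look something like the line below; the host name, share name, and user are placeholders. One caveat worth checking first: many ISPs block outbound TCP port 445, so SMB straight across the internet often fails, and a VPN tunnel to the VPS is the usual workaround.

```
:: Map the VPS share to Z:; the "*" prompts for the password.
:: vps.example.com, "fileshare", and "vpsuser" are placeholders.
net use Z: \\vps.example.com\fileshare * /user:vpsuser /persistent:yes
```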
Compared to others, my sites may not be that big, but I have one site that is 5GB and another that is about 9GB. I was wondering what the best or most recommended way is to transfer these sites to a new host.
I tried downloading a whole site using FireFTP, but I always seem to get about a third of the way done before something messes up the connection. Are there any better tools or methods for this?
I also have some pretty large databases that I'll need to transfer as well.
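One approach that avoids the FireFTP problem: if both hosts allow SSH, rsync can pick up where a dropped connection left off, and the databases can be piped across without an intermediate file. A sketch, with all host names, paths, and credentials as placeholders (inline passwords are for illustration only; prefer a ~/.my.cnf):

```
# Resumable site copy; re-running the same command continues a broken transfer
rsync -avz --partial --progress -e ssh user@oldhost:/home/user/public_html/ /home/user/public_html/

# Stream a database dump straight into the new server's MySQL
ssh user@oldhost "mysqldump -u dbuser -p'secret' sitedb | gzip" \
    | gunzip | mysql -u dbuser -p'secret' sitedb
```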
I'm trying to find a low-cost solution for real-time file share replication in a Windows environment.
It doesn't look like there are any open-source Windows cluster filesystems around, so the only viable option I found would be running OpenFiler in a replication cluster on Hyper-V nodes. Has anyone worked with this? Does it work reliably?
The required I/O throughput on these shares would be minimal; my biggest concern is 100% availability.
There are always people who would like to know what the PHP settings are on a server. Is it a security risk to expose a phpinfo.php file on a website, where anybody who visits the site can view it?
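Generally it is a risk: phpinfo() reveals exact version numbers, file system paths, and loaded modules, which is useful reconnaissance for an attacker. If the file has to stay, one common mitigation is restricting it by IP; a minimal .htaccess sketch in Apache 2.4 syntax (the address is a placeholder):

```
# Only the listed IP can view phpinfo.php; everyone else gets a 403
<Files "phpinfo.php">
    Require ip 203.0.113.5
</Files>
```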
I'd appreciate it if someone could share a 100MB download link that I can use to test Cogent's speed to my network, ideally served from a machine plugged into a 100Mbps switch port, to see whether it will max out or not.
I recently started building out a new network rack to support a production web site. The new equipment stack includes a disk array providing a CIFS file share to store images to be served up by Apache.
I have had zero luck getting Apache to properly access the imagestore on the network share. I've read more Google pages on this subject today than I can count, but I am still not having any success getting this working.
I'll do my best to explain the configuration.
I have an ESXi host running several virtual machines. Each machine needs to be able to access the shares. Each host has multiple network interfaces, each connected to a separate subnet. The virtual machines are running Windows Server 2012 Datacenter edition.
The disk array is file-mode access, with NFS and CIFS shares. It has interfaces on both subnets that each VM can reach. I have established a standalone CIFS server with the shares configured. They are accessible from the VMs.
I have mapped the share to a drive letter on the VM client, and it works properly from the logged in account. I have full control over files on the file system (create, modify, delete).
The VM has Apache 2.4.9 installed.
Things I've tried with no success:
- created a symlink to the CIFS-mounted drive in the webroot directory
- added an alias to the CIFS-mounted drive
- added the aliased directory using the <Directory> directive
- added the alias and directory directives using UNC references
I am mostly seeing errors like "path is invalid", but when I try to add the mapped drive (F:) or the UNC-referenced directory, the Apache service won't start.
I added a separate user for the Apache service, and added it to the group that has privileges to talk to the share, still didn't work.
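For what it's worth, the usual culprit on Windows is that mapped drive letters are per-logon-session: a service never sees the F: mapping made in your desktop session, and the default LocalSystem account has no credentials on the filer. The standard recipe is to set the Apache service to log on as the account that has rights on the share (services.msc, Log On tab) and to reference the share only by UNC path, which Apache on Windows accepts with forward slashes. A hedged httpd.conf sketch, where "filer" and the share name are placeholders:

```
# Serve images straight from the CIFS share via a forward-slash UNC path;
# the service logon account must have read rights on \\filer\imagestore.
Alias /images "//filer/imagestore"
<Directory "//filer/imagestore">
    Require all granted
</Directory>
```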
We have found that we need to limit the amount of CPU usage by users on our video share server. On this server we currently have 20 users on a shared plan. We thought the obvious bandwidth usage would be the biggest challenge; as it turns out, we haven't gone over the 2TB that we have.
We have come up with an encoding process that uses the H.264 codec and gives us excellent results in terms of quality, but it is very CPU-intensive, to the point of really slowing down the server when 10 or more users are encoding their videos simultaneously.
Can someone suggest a script that would allow us to limit the file size, in terms of MB/GB, that each user could upload per month?
So, for example, a client who pays $10.00 per month would be limited to a total of 900MB of uploads per month, versus a client paying $50.00 per month who would have the ability to upload, say, 8GB per month.
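The accounting side of that is simple enough to run from cron. A rough sketch, where the quota file format, the per-month upload layout, and the "uploads.disabled" flag convention are all hypothetical; the upload script would need to honor the flag:

```
#!/bin/sh
# Monthly upload accounting sketch; all paths and conventions are made up.
QUOTAS=/etc/upload-quotas.conf   # lines like: username 900   (limit in MB)

while read user limit_mb; do
    dir=/home/$user/uploads/$(date +%Y-%m)    # assumes one dir per month
    used_mb=$(du -sm "$dir" 2>/dev/null | cut -f1)
    used_mb=${used_mb:-0}
    if [ "$used_mb" -ge "$limit_mb" ]; then
        touch "/home/$user/uploads.disabled"  # upload script checks this flag
    else
        rm -f "/home/$user/uploads.disabled"
    fi
done < "$QUOTAS"
```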
I have a 1u server that I'm looking to colo. I looked around the forums but I can't seem to find exactly what I need.
I'd prefer if the hosting company was large, owned its own datacenter (as opposed to leasing), offered proactive (automated) monitoring services, remote reboot, and a superb network with little or no downtime.
I like LiquidWeb, but it looks like they only offer 500GB/month up and down, or a dedicated line capped at 10Mbps. I need something in between: around 1000GB/month up and down on a 100Mbps port. Support doesn't really matter, nor does price.
I have a web site currently at Ipower. Until a few months ago, the transfer rate was good, at over 1.5 megabytes per second, but after a transition to new servers it's now down to 350 kilobytes per second.
Ipower's advertising claims 15,000GB of bandwidth per month, but actually using that would require a continuous transfer rate of 5.787 megabytes per second (15,000,000MB spread over a 30-day month of 2,592,000 seconds), a 16.5-fold increase over the current 350-kilobyte-per-second rate. Yet another example of false advertising.
Anyway, other than signing up with a service, testing download rates, canceling the service, and signing up with another, can anyone here recommend a web hosting service with transfer rates of around 1 to 2 megabytes per second, or a link to a review of web hosts that includes transfer rates?
My client's website needs to hold files that are around 60 or 70 MB. The host only allows files up to 10 MB. Is that typical?
Right now I'm linking to a file-storage service, but I would rather make the files available from my site without sending visitors to a third-party site. He doesn't want to zip his files either; it needs to be a straight download.
So I've recently ordered a Supermicro 4U server with 24 x 1TB HDs and 64GB RAM, configured in RAID 10. I'm running Debian 5.0 and have installed lighttpd. All the content I serve consists of video files (AVI, MP4, MKV, OGM), and each file is about 100-500MB in size. I'm wondering how I can optimize lighttpd to get the best performance out of it. I look forward to your replies.
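For large static video files, the handful of lighttpd 1.4 settings usually worth checking looks something like the excerpt below; treat the values as starting points rather than definitive tuning:

```
# lighttpd.conf excerpt for large static downloads (values are starting points)
server.event-handler     = "linux-sysepoll"   # scalable event handling on Linux
server.network-backend   = "linux-sendfile"   # zero-copy file serving
server.max-fds           = 4096               # plenty of file descriptors
server.stat-cache-engine = "simple"           # cache stat() calls
server.max-worker        = 4                  # spread load across processes
# Optionally cap per-connection speed so one client can't hog the pipe:
connection.kbytes-per-second = 2048
```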
The subject pretty much sums it up: is there a method or solution for multiple websites (which reside on the same dedicated server) to share just one .htpasswd file, or to automate the mirroring of said .htpasswd file?
If so, any suggestions for methodology or products that would facilitate this would be most welcome. Thanks in advance, friends!
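If all of the sites are served by the same Apache instance, mirroring may be unnecessary: every vhost can point at one canonical file outside the docroots. A sketch, with the path being arbitrary:

```
# Same lines in each site's .htaccess or vhost config
AuthType Basic
AuthName "Restricted"
AuthUserFile /home/shared/.htpasswd   # one shared copy, outside any docroot
Require valid-user
```

If the sites ever move to separate servers, a cron job rsync'ing that single file around accomplishes the mirroring.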
I have been trying, quite unsuccessfully, to import a large SQL database file via phpMyAdmin for one of my clients. Since the file is about 250MB, I get a server timeout error. How can I do this via SSH? I have a 64-bit CentOS 6.5 server running Plesk 12.0.18.
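From the shell this sidesteps PHP's limits entirely, since mysql reads the file directly. A sketch, with the database name and credentials as placeholders:

```
# 1. Copy the dump to the server
scp dump.sql root@server.example.com:/root/
# 2. Log in and import; no phpMyAdmin timeouts apply
ssh root@server.example.com
mysql -u dbuser -p clientdb < /root/dump.sql
# A gzipped dump can be streamed instead:
zcat /root/dump.sql.gz | mysql -u dbuser -p clientdb
```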
I'm looking for suggestions on good hosting setups for getting large amounts of disk space.
I would like to be able to offer up to 2GB of storage space each for hundreds, maybe up to a few thousand, users; any solution should scale well. The files would be static files that might be up to 400MB in size.
It would be nice to be able to give users FTP access to their disk space, although it's not a core requirement.
I'm currently running on a VPS. My site allows for large file uploads and downloads, with files over 600MB in size.
The server has issues when the site gets three or more requests for large file downloads. I'm trying to grow this site to thousands of users and it is hard to do when the site can't handle even three.
I've been told by my host that I need to upgrade to dedicated. My VPS only has 512MB of RAM, and one large file download eats up that RAM. This is causing the issue.
I'm a newbie, and while I knew I was taking a risk by going with a VPS, I do find it a bit annoying that these guys advertise 1TB of bandwidth per month but I can't even support downloading 1GB at the same time... maybe it's just me...
Anyway, I am now looking into moving the large files and the upload/download over to Amazon S3. If I do this I am expecting my RAM usage on the VPS to greatly decrease. Is this correct? If my PHP code is running on the VPS, but the actual file download via HTTP is coming from S3, that should not be a heavy load on my box, correct?
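Broadly, yes: if the download bytes come from S3, the VPS only handles the PHP request and a redirect, which is light work. If the files must stay private, the usual pattern is a short-lived pre-signed URL; with the AWS CLI it looks like this (bucket and key are placeholders):

```
# Emit a time-limited HTTPS download URL for a private S3 object
aws s3 presign s3://my-bucket/files/big-video.mp4 --expires-in 3600
```

PHP code can generate the same kind of URL via the AWS SDK and redirect the user to it, so the file never passes through the VPS.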
Just moved to a new server, and of course 10GB doesn't seem that large for a server, but for some reason wget is not able to handle the transfer of that backup for me... it transfers about 1MB, then tells me "successful transfer..."
The old server is using cPanel, and the new server is just a plain old server that I haven't loaded up yet.
How can I get this full backup over to the new server?
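Since both boxes have SSH, scp or rsync from the shell is more dependable than wget for a file this size, and rsync's --partial can resume a dropped copy. A sketch; the backup filename is a placeholder for whatever cPanel actually generated:

```
# Run from the NEW server; pull the backup straight over SSH
scp root@oldserver:/home/cpmove-username.tar.gz /home/

# or, resumable if the link drops partway:
rsync -av --partial --progress root@oldserver:/home/cpmove-username.tar.gz /home/
```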
I run a large-scale business, and sometimes I have to transfer large and very important data files to my business partner. I worry about my data because many of my business competitors would certainly try to steal it, so there is a huge amount of risk involved in sharing important data over the Internet. I recently heard about a secure file transfer technique from a friend who works at a well-established software company. Does anyone have any idea what a Secure File Transfer (SFT) service is and how it works?
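In practice, "secure file transfer" usually just means sending files over an encrypted, authenticated channel, most commonly SFTP or SCP, which tunnel the transfer through SSH so the data cannot be read or tampered with in transit. A minimal sketch, assuming the partner runs an SSH server; host, key name, and paths are placeholders:

```
# One time: create a key pair; give the partner the PUBLIC half only
ssh-keygen -t rsa -b 4096 -f ~/.ssh/partner_key

# Each transfer: the file travels encrypted end to end
scp -i ~/.ssh/partner_key confidential-data.zip partner@files.partner.example.com:/incoming/
```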
In reference to my previous post, I want to transfer across 7GB of data, approximately 80,000 files I believe (due to a gallery script).
It's currently on another host (a web hosting account) that uses its own control panel, which has no options beyond managing databases. The only way I can see to do this is via FTP, but that would take me days. I've tried compression and backup scripts, but the execution time limit on the host's server is too low to allow the files to be zipped. Are there any other ways? Can I log in to my VPS via SSH and somehow pull the files off the other host's server?
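Yes: SSH into the VPS and pull from there, so the copy runs server-to-server instead of through your own connection. lftp's mirror mode handles tens of thousands of files over plain FTP and can resume if interrupted; the credentials and paths below are placeholders:

```
# Run ON the VPS; re-running with -c continues an interrupted mirror
lftp -u ftpuser,ftppass -e "mirror -c --parallel=4 /public_html /home/user/public_html; quit" ftp://oldhost.example.com
```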
I want to set up my desktop as a kind of file database, so I can access all the files on my home desktop from school (and back up all my files to my reliable desktop, as opposed to my not-so-reliable laptop).
The next thing I want to do is access my desktop using remote access, so I can control everything on my desktop while I'm not there.
My laptop is running Vista Home Premium; I don't think that matters too much. But my desktop is running XP Home Edition.
I have a No-IP account, but I don't really know what my next step would be. I'm guessing I need to set up some sort of FTP server on my desktop? And I have no clue how I'd do the remote desktop part.
I can't find out how to resume a file transfer via SFTP on the command line.
I use the put command to upload the file, but when the connection fails and I start again, the transfer starts from the beginning. How can I make it check the already-uploaded part and then resume?
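Plain put always starts over, but newer OpenSSH releases add resume support in sftp itself (check `man sftp` for the reget/reput commands); rsync over SSH achieves the same thing on older systems. Host and file names below are placeholders:

```
sftp user@host.example.com
sftp> reput bigfile.iso /remote/path/bigfile.iso   # checks the partial remote copy, resumes
# (reget does the same for downloads)

# Alternative where reput isn't available: rsync keeps and reuses partial files
rsync --partial --progress -e ssh bigfile.iso user@host.example.com:/remote/path/
```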
I seem to keep getting this error whenever I try to upload something that's too large.
It transfers fine for a few mins, then it stalls and eventually I get this error message.
Small files are fine since they don't take long to transfer, it just seems that I can't have a file transferring for too long.
I can actually get the file through, but since I keep getting that error, I have to keep manually clicking the transfer button, so it takes about 20 tries before I can finish a 30MB file. And I have a lot of files to transfer, so that's very troublesome.
Does anyone know what the problem might be? I tried turning off the firewall and opening ports on my router, but it still doesn't work. I'm using CuteFTP.
Sorry, the copy failed. Unable to find the cpanel user file. Is the archive missing (cwd: /root/cprestore loaded cpmove-clanpz/cp/clanpz)? checked 4 files.....
1. /scripts/pkgacct username
2. Transfer the backup to the new server
3. /scripts/restorepkg username
4. This error
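That error usually means restorepkg can't find the archive where it expects it: an intact cpmove tarball in /home, not an unpacked cpmove-user/ tree. A sketch of the usual sequence, using the username from the error message above:

```
# On the old server: package the account
/scripts/pkgacct clanpz              # writes /home/cpmove-clanpz.tar.gz

# Copy the tarball itself (do not extract it) to the new server
scp /home/cpmove-clanpz.tar.gz root@newserver:/home/

# On the new server: restore; restorepkg looks for the cpmove archive in /home
/scripts/restorepkg clanpz
```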