I have two connections: ADSL (2.5 Mbps down, 512 Kbps up) and Cable (256 Kbps down, 56 Kbps up). Uploads are free on ADSL; Cable, however, is truly unlimited.
I'm wondering if there is any way to upload using ADSL and download using Cable, since there is some amount of inward data transfer even while uploading (P2P), and ADSL is very expensive, so I want to use Cable for the downloads.
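A single TCP connection can't split its two directions across different links, but whole connections can be steered per application with policy routing. A minimal iproute2 sketch, assuming the ADSL link is ppp0 with local IP 10.0.0.2 and gateway 10.0.0.1, and the Cable link is eth1 with gateway 192.168.1.1 (all interface names and addresses are placeholders for your own setup; run as root):

```shell
# Add a second routing table for the ADSL link (the echo only needs to run once)
echo "100 adsl" >> /etc/iproute2/rt_tables
ip route add default via 10.0.0.1 dev ppp0 table adsl

# Anything sourced from the ADSL-side address goes out via ADSL...
ip rule add from 10.0.0.2/32 table adsl

# ...and the system default route stays on Cable for everything else
ip route replace default via 192.168.1.1 dev eth1
```

A P2P client that can bind to a specific local IP (most can) would then use ADSL when bound to 10.0.0.2. Note the caveat: a connection bound to that address uses ADSL in both directions, so this splits traffic per connection, not per direction.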
I have a Linux server running CentOS 5.2 with the CSF firewall, and I have a question.
How can I limit download threads, i.e. cap the number of simultaneous connections a single visitor can open while downloading a file? For example: 4, 8, or 16 connections (my guests use download accelerators such as Internet Download Manager).
For example, my website is [url] and the direct links are [url]. How do I limit the connections when a guest downloads with software like Internet Download Manager or FlashGet?
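CSF can do this through its per-IP connection limiting. A sketch of the relevant /etc/csf/csf.conf settings (the values are examples; download accelerators typically open 4-16 parallel connections):

```ini
# Limit concurrent connections per source IP to a port, as "port;limit" pairs
# (implemented via the iptables connlimit module)
CONNLIMIT = "80;8,443;8"

# Optionally, temporarily block IPs that exceed a total connection count
CT_LIMIT = "16"
CT_PORTS = "80,443"
```

After editing, reload with `csf -r`. With CONNLIMIT = "80;8", the ninth simultaneous connection from one IP to port 80 is refused, which caps accelerator threads without blocking the visitor outright.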
I'm currently running on a VPS. My site allows large file uploads and downloads, with files over 600MB in size.
The server has issues when the site gets three or more requests for large file downloads. I'm trying to grow this site to thousands of users and it is hard to do when the site can't handle even three.
I've been told by my host that I need to upgrade to a dedicated server. My VPS only has 512MB of RAM, and a single large file download eats it up. That is what's causing the issue.
I'm a newbie, and while I knew I was taking a risk going with a VPS, I find it a bit annoying that these guys advertise 1TB of bandwidth per month yet I can't even serve three 1GB downloads at the same time... maybe it's just me...
Anyway, I'm now looking into moving the large files and the upload/download over to Amazon S3. If I do this, I expect my RAM usage on the VPS to decrease greatly. Is this correct? If my PHP code runs on the VPS but the actual file download over HTTP comes from S3, that shouldn't be a heavy load on my box, correct?
How can I make an FTP account for my clients to upload files to, where they can't see the files after uploading? I want to hand out just one FTP user/pass to all my clients, but after uploading, they shouldn't see their file or any other files in the folder.
Maybe there's a way for files to auto-move to another folder after upload?
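The auto-move approach works well with a small script run from cron. A sketch, assuming files untouched for a minute have finished uploading; the directory names are examples for your own paths:

```shell
#!/bin/sh
# Sweep an FTP drop directory, moving finished uploads into a processed
# directory that the FTP user cannot see.
move_uploads() {
    incoming=$1
    dest=$2
    # -mmin +1 skips files modified within the last minute (possibly still uploading)
    find "$incoming" -type f -mmin +1 -exec mv {} "$dest/" \;
}
# From cron, every minute:
#   * * * * * /usr/local/bin/move-uploads.sh
```

Since the destination sits outside the FTP user's home, the folder the clients see stays empty a minute or so after each upload.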
I have created an FTP account with one user that can upload and download.
How can I create another user through the shell that has upload permission only? I need my clients to be able to upload files, but not download or overwrite them.
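A sketch of the shell side, assuming the FTP daemon authenticates against system accounts (e.g. vsftpd or ProFTPD with local users); the account and directory names are placeholders, and the commands need root:

```shell
# Create a login that can't get a shell, only FTP
useradd -m -d /home/clientdrop -s /sbin/nologin clientdrop
passwd clientdrop

# Write + execute but no read: the user can upload into the directory
# and enter it, but cannot list or download what is already there
mkdir -p /home/clientdrop/incoming
chown clientdrop /home/clientdrop/incoming
chmod 733 /home/clientdrop/incoming
```

Blocking overwrites is a daemon setting rather than a permission; in ProFTPD, for example, `AllowOverwrite off` in the relevant `<Directory>` block refuses uploads that target an existing filename.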
I thought I knew enough about .htaccess to do this, but I can't seem to work it out. What I want is: if a user visits domain.com/folder, check whether the folder exists, and if so, show it as normal (e.g., domain.com/support).
If a user visits domain.com/dynamicusername (where dynamicusername is not a physical folder), redirect to dynamicusername.domain.com.
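mod_rewrite can express this directly: the `-d` and `-f` file tests skip real folders and files, and anything else is redirected to the matching subdomain. A sketch for the .htaccess in the document root (domain.com is a placeholder):

```apache
RewriteEngine On
# Leave real directories and files alone, so /support etc. work as normal
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
# Anything else: /dynamicusername -> http://dynamicusername.domain.com/
RewriteRule ^([^/]+)/?$ http://$1.domain.com/ [R=301,L]
```

The subdomains themselves still need to resolve, so a wildcard DNS record (*.domain.com) and a matching vhost alias are assumed.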
When I open a subscription from the admin side of PPA and select "Login as user", I've noticed it's different from actually logging in as that user. For example, "Add domain alias" is missing when I log in as a customer, but not as an admin. I need my customers to add and manage their own aliases; how do I enable that feature on the client login side?
i had upload problems with Aspirationhosting since signing up yesterday.
Tried the following -
1. FileZilla FTP/SFTP upload of an 8MB zip package gets timeouts from time to time
2. Tried other FTP clients, with the same result
3. Uploading to another hosting company the same way is very fast
4. ISP speed test shows 180-230 KB/s
5. cPanel web upload to the AH server only gets "dead" pages, or takes hours for an 8MB package
6. Contacted support; the ticket is still open. Almost every possible issue has been considered, but none cracked it
Here is the error that appears from time to time while FileZilla uploads the unpacked site files: "Error: Directory /home2/XXXXXX/public_html/XXXXX/directory1/2/3: no such file or directory". And if SFTP is used to upload the packaged site (only 8MB), the error is a timeout...
I'm having problems uploading a 250MB SQL DB through phpMyAdmin on localhost. Can someone please help me out? Is there a better way to do this? It keeps telling me that the file is too large to upload through phpMyAdmin. I have edited the php.ini file, but no luck; it still says the same thing.
What I basically want to do is look at the old database tables from my old site. I need all my members' information so I can start migrating it. Is there software that just lets me look at my SQL DB? I have a local copy on my hard drive.
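For a dump this size, the usual route is to skip phpMyAdmin's upload limit entirely and feed the file to the mysql command-line client. A sketch (the database name, user, and path are placeholders):

```shell
mysql -u root -p -e "CREATE DATABASE oldsite"
mysql -u root -p oldsite < /path/to/old_site_dump.sql
```

Once imported, any MySQL client, phpMyAdmin included, can browse the tables; for read-only poking around, desktop tools such as MySQL Workbench or HeidiSQL also work.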
Running an IPB 2.3.4 forum site. Current stats: our members have made a total of 245,827 posts; we have 14,673 registered members; the total number of topics is 75,099; at least 150 members are online, plus 200 guests.
What would be a good my.cnf config for this kind of server? I'm running LiteSpeed.
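There's no single perfect my.cnf; it depends on total RAM and the storage engine (IPB 2.3.x typically runs on MyISAM). As a hedged starting point for a dedicated box with a few GB of RAM, to be adjusted against SHOW STATUS output, not taken as tuned figures:

```ini
[mysqld]
key_buffer = 256M          # MyISAM index cache; the main knob for this workload
table_cache = 1024
thread_cache_size = 32
query_cache_type = 1
query_cache_size = 64M     # forums are read-heavy, so caching repeated queries helps
max_connections = 300
tmp_table_size = 64M
max_heap_table_size = 64M
```

Watch Key_reads versus Key_read_requests and Created_tmp_disk_tables after a day of traffic, and grow the corresponding buffers if the ratios look poor.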
Whenever I try to upload large files to my server, the upload restarts again and again and never actually completes; it just keeps overwriting the previous partial file. I don't get any errors. It just automatically re-uploads and overwrites the file every time.
I think I messed up my PHP config, and I can't upload anything with PHP now. The directory is chmodded to 777, and file_uploads = On in php.ini.
I'm running lsphp5 with Suhosin. When I try to import a DB via phpMyAdmin, I get the error "Uploading is not allowed", and when I try to upload a file via a PHP script, I can't.
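With Suhosin in the mix, uploads can be blocked even when the core settings look right; in particular, suhosin.upload.max_uploads set to 0 disables file uploads outright. A php.ini checklist sketch (the limits are examples):

```ini
file_uploads = On
upload_max_filesize = 64M
post_max_size = 64M           ; must be >= upload_max_filesize
upload_tmp_dir = /tmp         ; must exist and be writable by the PHP user

; Suhosin's own upload controls
suhosin.upload.max_uploads = 25   ; 0 here blocks all uploads
```

With lsphp5, the change only takes effect after the LiteSpeed PHP processes are restarted.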
I have a vBulletin forum. In the AdminCP, the upload limit is set high, e.g. .zip attachments up to 3MB, but when I upload a .zip in a thread I can't go over 1MB, and I get a database error!
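A database error at a roughly 1MB threshold is the classic symptom of MySQL's max_allowed_packet: vBulletin stores attachments in the database by default, and the old MySQL default packet size is 1MB, so larger inserts fail. A my.cnf sketch (size it above your largest attachment):

```ini
[mysqld]
max_allowed_packet = 16M
```

PHP's upload_max_filesize and post_max_size also have to clear the attachment size, but those limits usually fail silently rather than with a database error.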
I'm having problems on my server: I can't upload files via a PHP script because of a timeout. Any upload that takes 2-3 minutes times out; everything quicker than that uploads normally.
Execution time is set to 3000; same problem.
The file size limit is set to over 200MB (I'm trying to upload 20-30MB); still a timeout.
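max_execution_time does not cover the time PHP spends receiving the request body; that is governed by max_input_time, and the web server has its own timeouts on top. A php.ini sketch (values are examples):

```ini
max_execution_time = 3000
max_input_time = 600        ; seconds allowed for receiving the upload itself
upload_max_filesize = 200M
post_max_size = 210M        ; slightly larger than upload_max_filesize
```

If the timeout persists, check the web server layer too, e.g. Apache's Timeout directive or, under FastCGI, mod_fcgid's IPCCommTimeout.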
I have a VPS with vps4less and run a Counter-Strike server on it. When I'm alone, I have a 65 ms ping; when my friends connect, it goes up to 120 ms. They say I have 10 Mbps unmetered. Is there any way to check my upload speed?
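The cleanest check is iperf between the VPS and another machine you control; failing that, timing an scp of a known-size file gives a rough number. A sketch (the IP is a placeholder for your second box, where `iperf -s` is already running):

```shell
# Sustained throughput test, 20 seconds, VPS -> remote
iperf -c 203.0.113.10 -t 20

# Rough alternative: push a 50MB test file and divide size by elapsed time
dd if=/dev/zero of=testfile bs=1M count=50
time scp testfile user@203.0.113.10:/tmp/
```

10 Mbps is about 1.25 MB/s, so the 50MB scp should take roughly 40 seconds if the line delivers what's advertised.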
When I try to install ffmpeg, it fails. The server cannot upload even a 1KB file from PHP: $_FILES['xxx']['size'] returns 0 and $_FILES['xxx']['tmp_name'] returns ''.
I am not exactly sure where to post this so I figured I would try here.
I have quite a few customers I host on a dedicated server. I would like to offer them the ability to backup any kind of data they want to on the server as well.
I am looking for a simple program that i could distribute to my customers and all they have to do is:
1. Install the application
2. Type in the UN/PW I provide them
3. Select the directories they would like automatically uploaded
4. Select the frequency of the automatic upload
Does anyone know of good software I can use that is that simple to use?
I'm using Transmit on OS X as an FTP client. I've been trying to upload a folder of images to my site for weeks now, and every time I try, the server seems to hang or cut the transfer off, and the images never get 100% uploaded. There is still plenty of space available, the images are all under 200KB, and I'm using passive mode. I have this problem on this server and a few others; with some other servers, I don't have this problem at all, and the images upload just fine.