I was asked to redesign a website. Nothing too complicated. I've got a username and password for the hosting provider, so I can see the files of the website that are already there, but how can I download them? Only uploading seems to be possible. And if so, why is that?
Has anybody ever come across a server problem where, when a file is visited (e.g. domain.com/sub/file.html or domain.com/sub/file2.php), the browser attempts to download the file automatically as opposed to displaying the page?
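For what it's worth, this symptom usually means the server is sending the wrong Content-Type (often application/octet-stream), or the PHP handler isn't registered, so the browser treats the file as a binary download. A hedged .htaccess sketch of the kind of fix involved; the exact handler name varies by host, so this is an assumption to check with the provider, not a drop-in fix:

```apacheconf
# Sketch only — re-register the type/handler if they were lost.
# The handler name "application/x-httpd-php" is host-dependent.
AddType text/html .html
AddHandler application/x-httpd-php .php
```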
I run a download website hosting many files (entirely legal, non-pornographic), ranging in size from 100KB to 500MB. Some users of the site are complaining of files not downloading, or being truncated (i.e. the browser says the download is complete even though the file has not been fully downloaded).
Here is one reply I received:
"No specific massage appears, only sign the download is complete and infact it is not. About 7 mega down loaded instead of 79 mega. I use fire fox browser and ADSL internet line speed 512"
All the files do, however, seem to download perfectly on my own computer, though I have a good 8Mb line, which leads me to believe it may be something to do with a timeout setting?
I have a website on Bluehost and I have a major problem with download speed from my site. For example, if you go to any game page on my site (the pages that download the game to play), the download of the game (a Flash file) takes DOUBLE the normal time.
I am on shared hosting; I know it may be less than perfect, but not to that level of slowness.
I have no errors in the log file.
When I looked up the server, I found it hosts 1,260 websites.
My site is www.samargames.com; you can check any game page.
How can I fix the slow downloading of the Flash files from my Bluehost account?
Is it possible to allow only one IP at a time to download files from a directory? The web server is Apache; maybe there is an Apache module for this, or some built-in configuration that can be applied via .htaccess.
When I download a file from my server, only certain extensions work properly. This is really annoying, since I want to be able to see how much time is left to finish a download.
For example, I uploaded a video with a .vob extension:
file.vob --> does not show filesize when downloading
If I rename the same file to a different extension:
file.avi --> works fine, shows filesize when downloading
file.mp3 --> works fine, shows filesize when downloading
file.rar --> works fine, shows filesize when downloading
file.mp4 --> does not show filesize when downloading
file.wmv --> does not show filesize when downloading
These are direct download links, not using any download scripts or anything. Why do some extensions display the filesize and some not? I am using an Apache 2.x server.
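One plausible explanation (an assumption, not something confirmed in the post): Apache has no MIME type registered for .vob, .mp4 and .wmv, serves them under a default text type, and mod_deflate then gzips them on the fly, which removes the Content-Length header the browser needs to show the filesize and time remaining. A sketch of .htaccess lines that would register the types; the specific type strings are best-effort guesses:

```apacheconf
# Sketch — declare proper video types so Apache sends a real Content-Type
# and the files are no longer run through on-the-fly compression.
AddType video/mp4      .mp4
AddType video/x-ms-wmv .wmv
AddType video/mpeg     .vob
```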
On some of the computers I use to access the internet, downloading is not possible, but I frequently need it.
Can I download files from internet resource sites directly to a hosting server (just like on a PC)? Can I do it with SSH? What are the minimum requirements? SSH on shared hosting? A VPS with root access?
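This is exactly what SSH access makes possible: with a shell on the host, a fetch tool runs server-side and the file never touches the local PC. A minimal sketch, assuming the host provides wget or curl; the URL and destination path below are placeholders, and the real command is left commented out:

```shell
#!/bin/sh
# Sketch: server-side download over SSH. URL and destination are placeholders.
url="http://example.com/archive.zip"
dest="$HOME/archive.zip"
cmd="wget -O $dest $url"     # or: curl -o "$dest" "$url" if wget is missing
echo "would run: $cmd"       # replace this echo with the real command when ready
```

On most shared hosts with SSH this works as-is; without SSH, some control panels offer a "fetch from URL" feature instead.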
This is my 2nd server; I had a server with LeaseWeb and no problems whatsoever setting up uTorrent on it.
I've set it up on my 1&1 server but nothing is moving. I've tried random ports, and I've tried turning the Cisco firewall off in the 1&1 web control panel, but nothing moves whatsoever.
What have I missed here? There's no way they can block all ports; their site says all ports are available. But even if I picked a random port and wasn't connectable, it would still download, just slowly. Has anyone else with a 1&1 server had downloading problems with uTorrent?
I installed Apache, MySQL and PHP on my Windows Vista laptop, and want to test HTTP downloading. That means selecting a file (for example, contact.php) from a page, clicking download, and having it downloaded to my desktop.
Do we need to install any other software to do that?
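No extra software should be needed: a browser downloads a file whenever the server sends a Content-Disposition: attachment header. For static files this can be done in .htaccess, a sketch assuming mod_headers is enabled (note that a .php file would still be executed by the server first, so to offer its source for download you would serve a renamed copy such as contact.php.txt):

```apacheconf
# Sketch — force "Save As" for matching static files (requires mod_headers)
<FilesMatch "\.(zip|txt|pdf)$">
    Header set Content-Disposition "attachment"
</FilesMatch>
```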
I'm setting up a web site for my online music library, doing it the hard way and learning as I go!
What I want to do is keep all the audio files secure so only registered users can get at them. How do I do this? FTP? Permissions? Can I pass the user data from the client database somehow, or do I have to set it up manually for each client?
I'm using PHP and MySQL and have a table set up with all the file locations, and that side of things is mostly working well. Once a user gets the URL of a file, how do I make sure only that user can download it?
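One common pattern (a sketch of the idea, not the only way): keep the audio files in a directory the web server refuses to serve directly, and route every download through a PHP gateway script that checks the user's session against the MySQL table before streaming the file. The gateway reads from the filesystem, so the block below does not affect it. An .htaccess sketch for the audio directory, in Apache 2.2-style syntax:

```apacheconf
# Sketch — deny all direct HTTP access to the audio files;
# only the authenticated gateway script can read and stream them.
Order deny,allow
Deny from all
```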
I've tried searching the web for info but I have the sneaking suspicion I'm not asking the right question.
When you delete a site backup from the "Backup Manager" panel, it is removed and no longer displayed in the panel. However, I cannot tell whether this action actually does anything with the real site backup files in /var/lib/psa/dumps. Does it merely remove the entry from PSA's database without touching any actual files? If so, how are site backup files supposed to be managed, if this action doesn't actually delete them?
I just changed hosts, thinking it would be faster. My homepage uses 53MB of RAM. My old host let me use ini_set to change the limit (I currently have it at 128MB); my new host doesn't (it's capped at 48MB).
So, I've cut some of the site's functionality to make it work on the new host. Should I upgrade to a hosting plan with more RAM? Given that the site already works reasonably well on the new host, what kind of performance boost could I expect from doubling the RAM?
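Before paying for an upgrade, it may be worth checking whether the host honors overrides from .htaccess, since some hosts block ini_set() but still allow mod_php directives. A sketch, assuming PHP runs as mod_php and the host permits the override; the value is hypothetical:

```apacheconf
# Sketch — raise the PHP memory limit (only works under mod_php with
# overrides allowed; ignored or an error on suExec/FastCGI setups)
php_value memory_limit 128M
```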
I am going to upgrade my servers and move all the accounts. Of course, the DNS IPs will change.
Last time I did this, I just created exactly the same nameservers on the new server (e.g. ns11.server.com and ns12.server.com) and updated the IP addresses of the nameservers at the registrar. However, it took more than 2 days for some domains to pick up the new nameserver IPs. It was a nightmare.
So my options are:
1) Do the same thing as before.
2) Create new nameserver addresses (e.g. ns5.server.com and ns6.server.com) and update the DNS info for all the domains.
When I use the website copy function, the website files are copied to another domain but the database remains on the old site. I use this function to move a website from the development state to the production state. All sites work fine, but when I schedule backups, all the databases are saved on the old domain. How do I move the database to the production domain?
I live in Hawaii and half my sites serve Hawaii. Web hosts in Hawaii are really expensive. Does it matter where on the mainland US I host my sites? Would they be fastest if I host them in California, given that it's the closest to Hawaii?
These new "rules" make BFD ban faster, checking every minute. Stock BFD only checked every 10 minutes and could miss attackers who show up at the right time. Now we keep 10 minutes' worth of IPs, and ban using that list.
I feel that APF and BFD are still the best choices for protecting my server. cPanel's new "cphulk" feature has a long way to go to be as good; plus, with BFD you have total control: you can add and change rules to suit your needs as they grow, or modify them for particular problems.
The changes I made are based on the latest version of BFD, V0.9; you should have that version installed and WORKING ALREADY.
Remember, they are simply shell scripts that define the log file to keep track of and what keywords to trigger on. You can view them with any text reader.
WARNING: These work for me, USE AT YOUR OWN RISK, always make sure you add your current IP in /usr/local/bfd/ignore.hosts (and) /etc/apf/allow_hosts.rules so you don't accidentally ban yourself!
Inside the tar.gz file below are my modified "rules" files for exim, pure-ftpd, rh_imap, rh_pop3, sendmail and sshd. No changes to the BFD V0.9 main program are needed.
You should change the cron job to run BFD every minute; edit this file: /etc/cron.d/bfd
Change the line in that file to this so it runs every minute: */1 * * * * root /usr/local/sbin/bfd -q
I checked the CPU load, and since it's reading only a small part of the log file every minute, the CPU load isn't bad; it finishes in about 8 seconds on my system. Expect a small rise in load average since it is doing work more often.
The "rules" files live in this directory on your server: /usr/local/bfd/rules
The "rules" files should be REPLACED with the new ones. If you want to keep the old ones around, MOVE THEM OUT to another directory NOT INSIDE the "rules" directory, or else they will be run when BFD runs.
If you need apache, proftpd or other "rules", you will have to modify them yourself; otherwise you should move them out of the "rules" DIRECTORY, as they will not do much with BFD set to run every minute (unless you modify them). I only modified the rules I needed for my server; feel free to post your own mods here.
OK enough, here's the file:
(it's also attached to this message, see below)
This file will only be around for a few months on this free upload site. Someone please put it in a good place/mirror and post a link, thanks.
This runs every minute but keeps a list of the last 10 minutes of bad IPs in a file in tmp, trimming the file every minute so only new IPs are saved.
You can see the lists of IPs in files such as:
/usr/local/bfd/tmp/.exim
/usr/local/bfd/tmp/.sshd
The marker "----" (four dashes) is used to mark each minute and is ignored by BFD but used to trim the old IPs off the file.
If the number of "----" markers is more than 10, it trims the top of the file up to the first marker on every run. If the file doesn't exist, it is created.
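The trim step described above can be sketched like this (a toy reconstruction, not the actual BFD code; the file name and the threshold N are made up for the demo, where BFD itself would use 10 minutes):

```shell
#!/bin/sh
# Demo of the "----" marker trim: keep only the last N runs' worth of IPs.
TMPFILE=/tmp/.bfd_demo
N=2                                                     # BFD would use 10
printf '%s\n' ---- 1.2.3.4 ---- 5.6.7.8 > "$TMPFILE"    # simulate two earlier runs
echo "----"    >> "$TMPFILE"                            # marker for this run
echo "9.9.9.9" >> "$TMPFILE"                            # newly seen bad IP
# More than N markers? Drop the oldest minute's IPs by trimming the top
# of the file up to (and including) the first marker.
if [ "$(grep -c '^----$' "$TMPFILE")" -gt "$N" ]; then
    sed -i '1,/^----$/d' "$TMPFILE"
fi
cat "$TMPFILE"
```

After the run, the file holds the two most recent minutes of IPs plus the new one, which is exactly the rolling window the rules ban from.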
The exim filter's "grep" part was modified slightly because the old one was producing bad data every once in a while. The others are all the default filters that come with V0.9.
(BFD people feel free to add this to the next version update, I consider it GPL)
I have heard that DirectAdmin is much faster than cPanel and requires far fewer resources. I was just wondering: if I get a VPS with 128MB RAM running DirectAdmin, would it be better than a VPS with 256MB RAM running cPanel? I am running 3 websites with 10 to 20 users at a time (50 at most). I like both cPanel and DirectAdmin, but if DirectAdmin can give more speed on 128MB, then I won't spend more money on 256MB with cPanel. If anybody has an opinion, please post it here.
So I, a firm *nix and Apache user, have been reduced to using IIS along with ISAPI for redirection at work. I can set up the redirections just fine using the GUI, but I am a *nix man and doing this through the GUI is SLOW, especially when it has to be done on 3 servers at a time and I can only access those servers through a Citrix environment.
And I need to add redirects many times a week. Is there any way to set up ISAPI redirects from a command line? Google has offered me nothing.
I want to make a website to show some movies, like YouTube, but my videos will play in Media Player. I want to build my own server for hosting and I need more than 1TB of storage. I don't know what I need for my own server. I just found things like the Netgear ReadyNAS Duo RND2150 500GB, and I don't know whether it will work for me or not.