I am trying to move filename.tar.gz from server A to server B, with root on both. I have moved the file into a web-accessible folder and chmodded it to 755, but when I go to wget the file from server A I get...
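A rough sketch of the two usual routes, assuming hypothetical hostnames and paths (adjust them to your setup):

Code:
# Pull the archive over HTTP from server A (the file must be readable by the web server):
wget http://serverA.example.com/filename.tar.gz
# Or, since both boxes have root, skip the web server and copy it over SSH:
scp /path/to/filename.tar.gz root@serverB.example.com:/path/to/destination/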
I've just done a 'wget' on a file that's quite large (2.3 GB), and now I want to serve the file through one of my accounts. The ownership and permissions of the file have already been changed to reflect the specific account; however, when I browse to the file through the web it will not pick up the file size, nor allow me to download it, stating 403 - Forbidden Access.
Is there some setting that needs to be changed to allow users to download a file of this size, or am I missing a permission command?
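A 403 on a single large file usually points at permissions or ownership somewhere along the path rather than a size limit; a rough checklist, with a hypothetical account and path:

Code:
namei -l /home/account/public_html/bigfile.tar.gz   # shows the permissions of every directory in the path
chmod 644 /home/account/public_html/bigfile.tar.gz  # the file itself should be world-readable
chmod 755 /home/account/public_html                 # parent directories need the execute bit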
My crontab uses wget to access a PHP file. It works just fine; the only problem is that each time the cron job runs it creates a file in the root of my site called filename.php.10XX, where filename is the name of the file that wget fetches and 10XX is just the number of times the job has been run.
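wget saves the response to a file by default and appends .1, .2, and so on when the name already exists, so discarding the output stops the clutter; a sketch with a hypothetical URL and schedule:

Code:
*/5 * * * * wget -q -O /dev/null http://example.com/filename.php
# or, equivalently, print to stdout and throw everything away:
# */5 * * * * wget -qO- http://example.com/filename.php > /dev/null 2>&1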
I can't find wget on a hosting account. The SSH command find / -name wget returns nothing; however, wget works properly on the host. What could the problem be?
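find silently skips directories it cannot read and is slow across the whole of /, so asking the shell where it resolves wget from is a quicker check (a generic sketch, not specific to any control panel):

Code:
which wget        # prints the path the shell would execute
type -a wget      # also reveals aliases or shell functions named wget
find /usr/bin /usr/local/bin /bin -name wget 2>/dev/null   # limit find to the usual binary locations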
I have been trying to get this file using wget, but the site uses some sort of advanced authentication that I can't get around with wget. It doesn't use cookies; it uses some other authentication method.
How can I log in to that website using wget? The form field names are usernamlc and passlc; if I can somehow POST those two using wget, I can get the download link.
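If it turns out to be an ordinary POST login form, wget can submit those two fields and keep whatever session comes back; a hedged sketch with hypothetical URLs (if the scheme really is token- or JavaScript-based, this will not be enough):

Code:
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'usernamlc=MYUSER&passlc=MYPASS' \
     http://example.com/login.php -O /dev/null
# then reuse the session for the actual file:
wget --load-cookies cookies.txt http://example.com/path/to/file.zip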
I am trying to move a client over from Verio's hosting to my VPS. The Verio control panel looks very amateurish and there is no option to do a full account backup. There is no SSH either.
I tried wget -r ftp://user:pass@hostname/directory/*, which was mentioned in another WHT thread, but after it connects it shows the following and stops there:
Quote:
Logging in as accountname ... Logged in!
==> SYST ... done.    ==> PWD ... done.
==> TYPE I ... done.  ==> CWD /directoryname ... done.
==> PASV ...
Can anyone suggest a command that will work? There is only about 35 MB of files there, but I spent hours yesterday trying to download through FTP.
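Hanging right after "==> PASV ..." usually means the passive-mode data connection is being blocked by a firewall, so forcing active FTP (or mirroring with lftp, if it happens to be installed) is worth a try; credentials and paths below are placeholders:

Code:
# Force active-mode FTP instead of passive:
wget -r --no-passive-ftp "ftp://user:pass@hostname/directory/"
# Or mirror the directory with lftp:
lftp -u user,pass hostname -e "mirror /directory /local/destination; quit"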
The most popular way of downloading these scripts is to use the "wget" command, which is why we sometimes need to restrict the wget command on our servers: it gets used to download destructive scripts and run them through a server-side program such as PHP or Perl.
Now I try to start this session with this command ...
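Assuming the goal here is to stop web-server users from invoking wget at all, the approach usually suggested is to make the binary executable by root only; the path is the common one and should be verified first:

Code:
which wget                # typically /usr/bin/wget
chown root:root /usr/bin/wget
chmod 700 /usr/bin/wget   # only root can run it now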
I had made a page on my site earlier, and when I went to it, it gave me a 403 error, so I tried fixing the .htaccess. Then the site got messed up, so I reset the .htaccess to its original settings, and now anything but the home directory gives me a 403 error. Can anyone help?
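Since only the home directory works, the usual suspects are missing execute bits on the subdirectories or a leftover deny rule in a stray .htaccess; a rough check with a hypothetical docroot:

Code:
ls -la /home/account/public_html                              # look for stray .htaccess files and odd permissions
find /home/account/public_html -type d -exec chmod 755 {} \;  # directories need 755
find /home/account/public_html -type f -exec chmod 644 {} \;  # regular files 644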
I got all my account backups from the reseller panel I had earlier and then restored them on the dedicated server using the multiple account restore feature of WHM. The problem is that when I try to open those transferred websites, they show:
Forbidden
You don't have permission to access /forums on this server.
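After a WHM restore this is most often wrong ownership or permissions on the restored account; a hedged sketch with a hypothetical username (on some cPanel setups public_html itself keeps the group "nobody", so compare against a working account first):

Code:
chown -R user1:user1 /home/user1/public_html
find /home/user1/public_html -type d -exec chmod 755 {} \;
find /home/user1/public_html -type f -exec chmod 644 {} \;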
I've recently been trying to move an account between servers, but the backup file is always incomplete. I was told it is possible that there are too many files.
I decided to tar some of them and move them manually, but I cannot access the tar file. I have already changed all the permissions (644), the owner, and the group, but I still get a 403 Forbidden error. Is it possible that the file is too big (9 GB), and if it is, how do I change the file size limit?
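If the server is running an older 32-bit Apache build, files over 2 GB can fail with exactly this kind of 403; splitting the archive, or copying it over SSH instead of HTTP, sidesteps the web server entirely. Paths and hostnames below are placeholders:

Code:
split -b 1900M backup.tar backup.tar.part_       # break the archive into web-safe chunks
# on the destination server, after fetching every part:
cat backup.tar.part_* > backup.tar
md5sum backup.tar                                # compare checksums on both ends
# or skip HTTP altogether:
scp backup.tar user@newserver.example.com:/home/user/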
The installation completed fine without errors and I've changed the required permissions to 777 as stated on the official website, but I could not get rid of the 403 Forbidden error. The LiveZilla folder is in the root of public_html.
I'm still having issues with LiveZilla whenever I launch the chat system.
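On servers running suPHP or suEXEC, 777 permissions themselves trigger errors, so 755 on directories and 644 on files with the correct owner is usually what is wanted; a sketch assuming the chat sits in public_html/livezilla:

Code:
find ~/public_html/livezilla -type d -exec chmod 755 {} \;
find ~/public_html/livezilla -type f -exec chmod 644 {} \;
ls -ld ~/public_html/livezilla    # confirm the owner matches the account user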
The folder is, according to my FTP program, chmod 755, but when I visit the directory through a web browser it gives me the error "403 Forbidden". When I try to change the chmod in my FTP program I get the error "550 SITE CHMOD command failed." What can the problem be? Yesterday I could access that folder with the MRTG graphs, and today it's just "403 Forbidden". What could have happened?
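The failing SITE CHMOD suggests the directory is no longer owned by the FTP user (MRTG run from root's cron tends to recreate files as root), and a missing index file also produces a 403 when directory listings are off; a rough check over SSH as root, with a hypothetical path:

Code:
ls -ld /home/account/public_html/mrtg            # see who owns the directory now
chown -R account:account /home/account/public_html/mrtg
ls /home/account/public_html/mrtg/index.html     # no index + listings off also means 403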
I'm looking into ways to utilize dynamic merging of CSS/JS files so that the browser only has to download one file of each type. I have a solution for how I would implement this on a non-CMS site. What complicates things is that most of the sites I manage are based on the Joomla! CMS.
Quite often, a webpage that has been extended with 3rd party extensions can have as many as 10-12 CSS/JS files. My problem is that I haven't found a way to suppress the links from being added to the page without having to hack every single extension being used.
One thought I had was to use mod_rewrite with the Forbidden flag for all of the CSS and JS files being requested, except for the combined ones I created. My question is: will this save much load time, since the browser will still need to make the same number of requests? I am assuming it will be at least marginally better, since nothing is being downloaded, but I wasn't sure if it would be noticeable.
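For what it is worth, a sketch of those rewrite rules, assuming hypothetical combined file names (combined.css / combined.js) appended to the site's .htaccess from the shell:

Code:
cat >> /home/account/public_html/.htaccess <<'EOF'
RewriteEngine On
# let only the merged files through; any other .css/.js request gets a 403
RewriteCond %{REQUEST_URI} !/(combined\.css|combined\.js)$
RewriteRule \.(css|js)$ - [F,L]
EOF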