Unable To Delete Folders In FTP
Mar 28, 2009: I'm using FireFTP and I chmod the folders to 777, but it seems I'm still unable to delete the folders; I get the message "550 directory not empty".
When people run a forum, the template and forum data folders may end up containing files owned by "nobody".
The users cannot delete those files themselves;
an admin has to log in as root over SSH to delete them.
How can the files be created with the user's own permissions instead of nobody's?
The server is CentOS with cPanel and suEXEC.
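The lasting fix is usually to run PHP as the account user rather than as Apache's "nobody" user; until then, a root-side cleanup like the sketch below can hand the files back. The account name "user1" and the docroot path are assumptions for illustration only.
Code:
# minimal sketch, run as root: find files created as "nobody" under the
# account's docroot and give them back to the account user (names assumed)
find /home/user1/public_html -user nobody -exec chown user1:user1 {} +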
I heard the following folders are used by FrontPage:
_private
_vti_bin
_vti_cnf
_vti_log
_vti_pvt
_vti_txt
I was wondering: can I delete these folders if I'm not using FrontPage?
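If FrontPage Server Extensions really aren't in use, the directories can simply be removed; a minimal sketch, assuming the site lives in ~/public_html and that a backup exists first:
Code:
# remove the FrontPage extension folders from the document root
# (path is an assumption; take a backup before deleting anything)
cd ~/public_html && rm -rf _private _vti_bin _vti_cnf _vti_log _vti_pvt _vti_txt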
After enabling File Sharing in Plesk 12, the service appears to function correctly. If only a few files are uploaded, they are fine. However, after uploading a directory structure of a few MB with subdirectories, File Sharing stops working. It shows at the top of the screen:
"Internal error: The File Sharing service is temporarily unavailable because the site is down for maintenance."
it also says in the main window:
"Unable to load folders list."
Clicking on Upload Files, that dialog also shows "Unable to load folders list."
I have checked that the ownership of those files on the server is www-data:www-data and that permissions are 755 for directories and 644 for files, since ownership and permissions seemed like a likely cause; however, that doesn't fix it. Removing all the files in the share does fix it, but that's not useful if the files should be available for sharing.
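For completeness, this is the kind of reset that was applied; the share path used here is an assumption and the real Plesk file-sharing storage location may differ:
Code:
# assumed path to the file-sharing storage; adjust to the real location
SHARE=/var/www/vhosts/example.com/file-share
chown -R www-data:www-data "$SHARE"
find "$SHARE" -type d -exec chmod 755 {} +
find "$SHARE" -type f -exec chmod 644 {} +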
My default certificate expired recently. I created a new certificate "default certificate 2".
I used this certificate for "Secure the Panel"
I went to "Tools and Settings" -> "IP addresses" and made this certificate the default for all the IPs. On the page "Tools and Settings" -> "SSL certificates", it says Used: 0 next to "default certificate".
But when I try to delete it, it tells me: "Error: Unable to remove certificates: one or several certificates are assigned to the IP addresses/domains."
Is there anywhere I can check where this certificate is still used?
I found these folders in the root
/usr/bin/c99
/usr/include/boost/numeric/interval/detail/c99_rounding_control.hpp
/usr/include/boost/numeric/interval/detail/c99sub_rounding_control.hpp
What are these? Is this normal, or has somebody hacked our server?
What should I do?
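One way to check is to ask the package manager which package, if any, owns those paths; a minimal sketch, assuming an RPM-based system such as CentOS:
Code:
# ask RPM which installed package owns each path; "is not owned by any
# package" would be the suspicious answer worth investigating further
rpm -qf /usr/bin/c99
rpm -qf /usr/include/boost/numeric/interval/detail/c99_rounding_control.hpp
file /usr/bin/c99    # shows whether it is a script or a compiled binary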
Well I finally got around to getting my IIS up and running which will save some time with uploading various files to check that they are working correctly but now I have run into a new problem. What used to happen with my IIS is it would list out all of the folders which I had in the wwwroot and I would simply navigate through and select which site needed to be tested.
At the moment, I have cleared out the wwwroot folder entirely since all of the stuff in there was to do with a "Windows XP Professional" page which appeared upon installation.
However, now that I don't need it anymore, I decided to clear it out and test IIS by making a new folder called "sites" in wwwroot. Now, though, it simply comes up with a "Directory Listing Denied. This Virtual Directory does not allow contents to be listed." error message, even though I have changed the permissions on the wwwroot folder to allow writing, etc.
Could this be because it's IIS 5.1 and I need to install IIS 6.0 instead or is something else wrong? I know for a fact that my operating system (Windows Media Center Edition 2005) will do this list as I have had it before, back before I installed Vista and then decided to come back to MCE.
In the public_html directory, I have
php_value user_agent "Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.8.1) Gecko/20061010 Firefox/2.0"
<IfModule mod_security.c>
SecFilterScanPost
</IfModule>
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
ErrorDocument 404 /404.html
RewriteCond %{REQUEST_FILENAME} !\.(jpg|jpeg|gif|png|css|js)$
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule .* index.php [L]
</IfModule>
Then I created a new directory named caller
There is an exact same .htaccess in public_html/caller
However, the .htaccess in the public_html directory rewrites all requests to index.php, while the .htaccess in public_html/caller is supposed to rewrite all requests to /caller/index.htm.
The thing is, when I access
[url] , the file that actually gets called is always /public_html/index.php.
How can I arrange it so that /caller/index.htm is called instead?
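A per-directory rule set in public_html/caller normally takes precedence over the parent one. A minimal sketch of what that subdirectory .htaccess could look like; the RewriteBase and target filename come from the description above, and the !-f condition is an assumption to keep real files reachable:
Code:
RewriteEngine On
RewriteBase /caller
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule .* index.htm [L]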
Say I have the domain zzzzz.com with some folders, say a, b and c.
Would an SSL certificate installed for the main domain zzzzz.com work for https://zzzzz.com/a and so on, or would a wildcard certificate be required for that?
And what if that /a folder is actually a subdomain? Since you can access subdomains via url/folder instead of folder.url, would SSL work on it using the url/folder form instead of the subdomain URL?
What should be the ideal chmod permissions for public_html and other folders?
Is 755 fine? What is the most secure option?
I need to find all www folders within the home directory that have 777 permissions and chmod them to 750.
If possible, post the command to do it.
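A minimal sketch, assuming the accounts live under /home and the folders are literally named "www":
Code:
# find directories named www with mode exactly 777 and reset them to 750
find /home -type d -name www -perm 0777 -exec chmod 750 {} \;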
I can send e-mails out but many times they go directly to the recipient's SPAM box.
Also, my server cannot send to any company e-mail that uses MXLogic.net's services.
Going to [url], my server IP is on ZERO blacklists.
/etc/resolv.conf appears correct
/etc/hosts also appears correct
Is there something that I am overlooking?
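For what it's worth, a quick check worth running from the shell is whether the server's forward and reverse DNS and SPF records line up, since strict filters commonly reject mail from IPs without a matching PTR record. A minimal sketch; the IP and hostname below are placeholders:
Code:
dig -x 203.0.113.10 +short       # PTR (reverse DNS) for the server IP
dig A mail.example.com +short    # forward record should match that IP
dig TXT example.com +short       # any published SPF record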
I'd like to know how to secure an SMF site. Are the default permissions good enough?
I have a server on which all files and folders are stored. There are a couple of folders that are currently only accessible by one machine; when any of the other machines accesses those folders, they are displayed as empty. I have checked all permissions, and the file-viewing settings on every machine are the same; the folders that all machines can access have the same settings as the couple that are only accessible from this one machine.
If I copy and paste the folder, all machines then have access to it, although this would be a lengthy exercise for the full hard drive.
I would like to know how I can hide my website's sub-folders and files from public view. For instance:
Currently, anyone can browse any folder or file; opening a link like www.yoursite.com/photos/ shows all the sub-folders and files in that "photos" folder, and from there you can also reach the other folders at higher levels by going up to the top-level folders.
I think there should be a way to force my users to type the full path of the photo or file they want, without being able to browse my folders and sub-folders.
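On Apache, directory listings can be turned off so that visitors must know the exact file path; a minimal sketch of an .htaccess line for the document root, assuming the host allows Options overrides:
Code:
Options -Indexes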
my server has just one site, apache conf has such a line:
Code:
ScriptAlias /cgi-bin/ "/usr/local/apache/htdocs/cgi-bin/"
and
Code:
<Directory "/usr/local/apache/htdocs/cgi-bin">
AllowOverride None
Options None
Order allow,deny
Allow from all
</Directory>
then my CGI scripts run fine under that folder,
/usr/local/apache/htdocs/cgi-bin
but I have another script that needs its own cgi-bin, under its own folder below the htdocs folder, like
/usr/local/apache/htdocs/anotherscript/cgi-bin
however, when I run the said script in browser:
Code:
http://www.mydomain.com/anotherscript/cgi-bin/abc.cgi
the browser just shows the source code instead of running it. I am guessing something is wrong in httpd.conf.
How do I set up Apache's httpd.conf to meet this requirement?
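ScriptAlias only marks that one aliased directory as CGI, so scripts anywhere else get served as plain files. A minimal sketch of an extra httpd.conf block that enables CGI execution for the second folder, assuming mod_cgi is loaded and the scripts end in .cgi:
Code:
<Directory "/usr/local/apache/htdocs/anotherscript/cgi-bin">
    Options +ExecCGI
    AddHandler cgi-script .cgi
    AllowOverride None
    Order allow,deny
    Allow from all
</Directory>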
I am designing a site for a client, and in all the years I've done design I have never come up against the phenomenon I'm seeing on their VPS. It's Linux, and I am uploading files with WS_FTP Home.
I am uploading files and folders to their public_html/domain.com/ (I use "domain" here for their privacy). In some folders (directories), after doing so, a mystery folder suddenly appears named 5", and as you enter that folder the path shows "public_html"; go into that one and you come to the domain.com folder again, and the deeper you go the more this mirroring of folders repeats. Example:
public_html/domain.com/images/5"/public_html/domain.com/images/file
***The file at the end, whether it's an image (jpg, png, etc.), is created as the last element of the path as a folder, not a file. I should also mention that as you go deeper into the 5" mystery directory, the FTP client no longer shows anything in the path past the 5" entry, even as you go further in.
Also, it doesn't let you delete these 5" folders regardless of permissions. And this folder shows up in many areas of this website's directory structure, mostly where images are (I don't know if that is just a coincidence).
I hope all this makes sense. Has anyone seen this before, and what could the cause be? Their host doesn't seem to know the reason and says they cannot see it, even though others can. They say the FTP program is the cause, not their server.
My comeback to that is that I've used this FTP client for years and have never seen this happen before. It only happens with this one client's server.
The scenario is that a client wants to enable unlimited URLs for his individual customers,
i.e., [url]
Platform: W2K3 IIS6
I only know two ways of doing it:
1. Create a real folder per user (/username1, /username2), but this will be really messy, and I remember there is a limit of up to 36,000 sub-folders within a root folder under Windows (correct me if I am wrong).
2. Create virtual directories under IIS Manager using an ASP/ASP.NET script. This is the easiest, but it has two problems:
a. If I have, say, 10,000 virtual directories and then try to expand that root folder in IIS Manager, IIS Manager will hang for sure.
b. Having such a huge number of virtual directories inevitably means a huge IIS metabase, which means a much greater chance of corrupting it, so it's very dangerous.
I really hope someone can give me some hints on how to do this in a scalable way. I know many Web 2.0 sites do this even using IIS6, i.e., [url]
How do you guys deny running perl/bash scripts from /tmp, /var/tmp and /dev/shm? I've tried to build a simple shell wrapper, but that approach doesn't work if you run, for example, SpamAssassin on the same server (it needs direct I/O to/from the perl binary). I'm looking into some kind of binary wrapper or patch that will deny running perl scripts from public folders (the same for shell scripts would also be great). Any ideas or solutions?
If anyone interested in primitive shell wrapper code:
Code:
#!/bin/sh
# Wrapper installed in place of /usr/bin/perl; the real binary has been
# renamed to perl.orig. Refuse to run anything that references /tmp/,
# otherwise pass all arguments through unchanged (including no arguments).
case "$*" in
  */tmp/*) echo "perl: running scripts from /tmp is not allowed" >&2; exit 1 ;;
  *) exec /usr/bin/perl.orig "$@" ;;
esac
We've had someone starting nobody PERL procs on a box and we can't quite track it down or read the file to see what it is. What he does is to create a folder in /tmp, execute the script from there and delete the folder as soon as it's running (yes, /tmp is mounted noexec, makes no difference). We've managed to discover and block the IP that was doing this, but that's no fix. He hasn't been back since banning the IP...so far.
What we would like to do is find or write a script that can watch the /tmp folder and copy newly created directories and their contents to another directory (email notification would also be helpful), in order to see what the heck it's doing and hopefully figure out how it's getting in; a sketch of one possible watcher is below. Nothing in any logs this time, and the PERL process seems to be able to hide itself from ps. That bit worries me quite a lot, but none of the binaries appear to have been changed, and it doesn't appear we've been rooted in any way.
Thoughts on this, ideas and suggestions welcome.
Failing that, is it possible without breaking the box to prevent the creation of new directories in /tmp? This I seriously doubt, but if all they need to do is create a folder and work from there, noexec is a joke.
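A minimal sketch of such a watcher, assuming the inotify-tools package is installed, a capture directory /root/tmp-captures already exists, and a working local mail command; all of those are assumptions to adapt:
Code:
#!/bin/sh
# Watch /tmp recursively and grab a copy of anything new as soon as it
# appears, then send a notification; paths and address are placeholders.
ADMIN="root@localhost"
DEST=/root/tmp-captures
inotifywait -m -r -e create --format '%w%f' /tmp | while read path; do
    cp -a "$path" "$DEST"/ 2>/dev/null
    echo "New item in /tmp: $path" | mail -s "/tmp activity" "$ADMIN"
done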
On my site, there is a folder containing somewhere around 75,000 images.
I'm on shared hosting, but I do have shell access.
I want to basically copy this folder to another directory in the site. Can anyone give me any pointers?
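From the shell this is a single command; a minimal sketch, assuming the source is ~/public_html/images and the destination is ~/public_html/gallery (both names are placeholders):
Code:
cp -a ~/public_html/images ~/public_html/gallery
# or, for a copy that can be resumed if the connection drops:
rsync -a ~/public_html/images/ ~/public_html/gallery/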
I have just discovered in my access log some people accessing urls like this:
domain.com/index.php/index.php/ or even
domain.com/index.php/index.php/index.php
I have opened those links in browser and they worked. Why?
My .htaccess is empty and I have no 'index.php' folder, but I do have an 'index.php' file.
I think this is happening to most php sites, not just mine. This is a random example:
[URL] ....
My question is: how do I force Apache to return a 404 error when such a URL is accessed?
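The trailing /index.php/... part is PATH_INFO, which Apache passes on to the script by default. A minimal sketch of the usual switch that makes those URLs return 404; it can go in the vhost or, if FileInfo overrides are allowed, in .htaccess:
Code:
AcceptPathInfo Off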
I'm trying to write a cron/SSH job to recursively remove folders in a "data" folder that are older than X days.
I've been able to remove files, but not folders.
This is the code I have so far, but if someone can point out how to remove folders older than X days, that'd be great ........
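A minimal sketch of the folder case, assuming X is 7 days and the data folder is /home/user/data (both are placeholders):
Code:
# delete sub-folders of the data directory not modified in the last 7 days
find /home/user/data -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} \;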
OK so I'm on the shell and looking at the contents of my home/ directory which has all the various accounts. In each account directory are subdirs such as mail, logs, tmp, etc, and public_html. Is there a way (or a ZIP command) to be in the home directory and create a massive ZIP file containing all the account directories, each containing ONLY the respective public_html subdir for that account?
Using the command "zip -r Backups.zip ./" seems to include all those extra folders (such as mail, etc.) that aren't needed.
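Info-ZIP's -i (include) option can restrict the recursion to just the public_html trees; a minimal sketch, run from the home directory, assuming the layout /home/&lt;account&gt;/public_html described above:
Code:
# archive only each account's public_html sub-tree, skipping mail, logs, tmp, ...
zip -r Backups.zip . -i "*/public_html/*"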
My blog is set up to display in the root of my domain, although the files on the server sit within their own folder:
i.e.
Server files:
Public_html/wordpressfiles/
Browser displays:
www . mydomain . com/
(displays pages from /wordpressfiles)
The problem I have is that I can't access individual directories within the root, unrelated to wordpress.
e.g
I have
Public_html/folder2/...
Setup on the server, but if I enter the path in my browser:
www . mydomain . com/folder2
wordpress thinks I want to access:
www . mydomain . com/wordpress/folder2
...which doesn't exist.
How can I regain access to folders in the root without WordPress interfering?
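For reference, the stock WordPress rewrite block includes a !-d condition that is supposed to leave real directories such as /folder2 alone; this is the standard rule set for the .htaccess in Public_html, shown here only for comparison with whatever is currently in place:
Code:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>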
Something weird is happening here. I have tried every command string possible...
There are a number of folders I want to remove off my server, tried the good old and simple...
rm -r /folder/
And I ended up with a prompt for every file, as long as my screen. No matter what I do, as it goes recursively into the directory it asks me whether I want to remove each file individually. Whatever command or option I try, it insists on asking me before deleting each file.
Could this be a configuration option in CentOS?
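This is typically not an rm bug but a shell alias: on CentOS, root's ~/.bashrc usually aliases rm to 'rm -i', which forces the per-file prompt. A minimal sketch of two ways around it:
Code:
\rm -r /folder/    # the backslash bypasses the alias for this one command
rm -rf /folder/    # -f suppresses the prompts (be sure of the path first)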
I am going to back up the whole server (rsync).
I understand it is not efficient to back up the local "backup" folder, to avoid duplicated backups.
1. Should I exclude additional folders from the backup process?
2. I noticed it is common to exclude the "proc" folder as well ... Why?
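/proc (like /sys and /dev) is a virtual filesystem generated by the kernel at runtime, so there is nothing on disk worth restoring and reading it can even stall the copy. A minimal sketch of a full-server rsync with the usual exclusions, assuming the destination is /mnt/backup:
Code:
rsync -aAXv / /mnt/backup \
  --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run \
  --exclude=/tmp --exclude=/mnt --exclude=/media --exclude=/backup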
I have 2 domains on 1 account. My main website is www.aviationcafe.net and I added on www.modelcuir.com; that's what it looks like to the public, but with my host it will be www.modelcuir.aviationcafe.net.
I noticed in the files area that modelcuir is its own folder. I can password protect that, but it would stop people getting onto the website completely, and I only want to stop them getting into the members area.
I can't create a members area either unless I can add a new folder, which I don't think I can.
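If a members sub-folder can be created under the addon domain, the protection can be limited to just that folder with a small .htaccess; a minimal sketch, where the folder name "members" and the .htpasswd path are assumptions:
Code:
# contents of modelcuir/members/.htaccess; create the password file with
# "htpasswd -c /home/username/.htpasswd someuser"
AuthType Basic
AuthName "Members Area"
AuthUserFile /home/username/.htpasswd
Require valid-user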