I'm using FireFTP and I chmod'ed the folders to 777, but it seems I'm still unable to delete the folders, as I get the message "550 Directory not empty".
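That 550 is the server refusing the delete because FTP's remove-directory command (RMD) only works on empty directories; permissions aren't the issue, so 777 can't help. The contents have to go first, deepest files first, which is what a client's recursive delete does. The same rule, demonstrated locally:

```shell
# rmdir (like FTP's RMD behind "delete folder") refuses a non-empty directory:
mkdir -p demo/sub
touch demo/sub/file.txt
rmdir demo 2>/dev/null || echo "refused: directory not empty"
# Delete depth-first instead: contents first, then the now-empty directories.
rm -rf demo
[ ! -d demo ] && echo "gone"
```

If FireFTP's recursive delete keeps failing, a shell or a client with a working recursive delete (e.g. lftp's `rm -r`) can do the depth-first removal instead.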
Well, I finally got around to getting my IIS up and running, which will save some time when uploading various files to check that they're working correctly, but now I've run into a new problem. What used to happen with my IIS is that it would list all of the folders I had in wwwroot, and I would simply navigate through and select whichever site needed to be tested.
At the moment, I have cleared out the wwwroot folder entirely, since everything in there related to a "Windows XP Professional" page that appeared upon installation.
However, now that I don't need it anymore, I decided to clear it out and test IIS by making a new folder called "sites" in wwwroot. Now, though, it simply comes up with the error "Directory Listing Denied. This Virtual Directory does not allow contents to be listed.", even though I have changed the permissions on the wwwroot folder to allow writing and so on.
Could this be because it's IIS 5.1 and I need to install IIS 6.0 instead, or is something else wrong? I know for a fact that my operating system (Windows Media Center Edition 2005) will produce this listing, as it did before I installed Vista and then decided to come back to MCE.
Say I have the domain zzzzz.com with some folders, say /a, /b and /c.
Would an SSL certificate installed for the main domain zzzzz.com work for https://zzzzz.com/a and so on, or would a wildcard certificate be required for that?
And what if that /a folder is actually a subdomain? Since you can access subdomains via url/folder instead of folder.url, would SSL work on it using the url/folder option instead of the subdomain URL?
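A certificate is matched against the hostname only; the path part of the URL (/a, /b, ...) never enters into it. A quick local demonstration with a throwaway self-signed certificate (openssl assumed available; zzzzz.com is the placeholder domain from the question):

```shell
# Generate a self-signed cert for zzzzz.com and inspect what it actually names.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
    -subj "/CN=zzzzz.com" \
    -keyout /tmp/zzzzz-key.pem -out /tmp/zzzzz-cert.pem 2>/dev/null

# The subject names only the host -- no path appears anywhere in the cert,
# so https://zzzzz.com/a and https://zzzzz.com/b are covered automatically.
openssl x509 -in /tmp/zzzzz-cert.pem -noout -subject
```

A real subdomain (a.zzzzz.com) is a different hostname, so it needs its own coverage (a wildcard or SAN certificate); but when the same content is reached as zzzzz.com/a, the browser only checks the name zzzzz.com, so the main-domain certificate works.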
I have a server on which all files and folders are stored. There are a couple of folders which are currently only accessible by one machine; when the other machines access them, they are displayed as empty. I have checked all permissions: every machine's file-viewing settings are the same, and the folders accessible by all machines have the same settings as the couple that are only accessible from this one machine.
If I copy and paste such a folder, all machines then have access to it, although this would be a lengthy exercise for the full hard drive.
I would like to know how I can hide my website's subfolders and files from public view. For instance:
Currently, anyone can browse to any folder or file: opening a link like www.yoursite.com/photos/ shows all the subfolders and files in that "photos" folder, and from there you can reach the other folders at another level by going up to the top-level folders.
But I think there should be a way to force my users to type the full path of the photo or file they want, without being able to browse my folders and subfolders.
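On an Apache server this behaviour is directory indexing, and it can be switched off so that a request for /photos/ returns 403 Forbidden while direct file URLs such as /photos/somefile.jpg keep working. A sketch, assuming the host permits .htaccess overrides:

```apache
# .htaccess in the web root (or just inside photos/)
Options -Indexes
```

If .htaccess files are ignored, the same `Options -Indexes` line goes in the site's `<Directory>` block in the main server configuration.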
My server has just one site, and the Apache conf has these lines:
Code:
ScriptAlias /cgi-bin/ "/usr/local/apache/htdocs/cgi-bin/"
and
Code:
<Directory "/usr/local/apache/htdocs/cgi-bin">
    AllowOverride None
    Options None
    Order allow,deny
    Allow from all
</Directory>
Then my CGI script runs fine under the said folder,
/usr/local/apache/htdocs/cgi-bin
but I have another script that needs its own cgi-bin under its own folder below htdocs, like:
/usr/local/apache/htdocs/anotherscript/cgi-bin
However, when I run that script in a browser:
Code: http://www.mydomain.com/anotherscript/cgi-bin/abc.cgi
the browser just shows the source code instead of running it. I am guessing something is wrong in httpd.conf.
May I know how to set up Apache's httpd.conf to meet this requirement?
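A hedged sketch of the httpd.conf change, mirroring the directives already in use for the first cgi-bin (paths taken from the question; Apache needs a restart afterwards):

```apache
# Option 1: a second ScriptAlias for the new location
ScriptAlias /anotherscript/cgi-bin/ "/usr/local/apache/htdocs/anotherscript/cgi-bin/"

<Directory "/usr/local/apache/htdocs/anotherscript/cgi-bin">
    AllowOverride None
    Options None
    Order allow,deny
    Allow from all
</Directory>

# Option 2: no alias -- let .cgi files execute in place under that directory
# <Directory "/usr/local/apache/htdocs/anotherscript/cgi-bin">
#     Options +ExecCGI
#     AddHandler cgi-script .cgi
# </Directory>
```

Seeing the raw source in the browser is the classic symptom of Apache serving the .cgi file as plain content because neither a ScriptAlias nor an ExecCGI handler covers that directory.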
I am designing a site for a client, and in all the years I've done design, I've now come up against a phenomenon with the VPS server they have. It's Linux, and I'm uploading files using WS_FTP Home.
I am uploading files and folders to their public_html/domain.com/ (I use "domain" here for their privacy), and in some folders (directories), after doing so, a mystery folder suddenly appears named 5". As you enter that folder, the path shows "public_html"; if you go into that one, you come to the domain.com folder again; and if you go deeper into that one, this mirroring of folders repeats. Example:
public_html/domain.com/images/5"/public_html/domain.com/images/file — the file, whether it's an image (jpg, png, etc.), appears as the last item in the path as a folder, not a file. I should also mention that as you go deeper into the 5" mystery directory, the FTP client no longer shows any of the path past the 5" folder, even as you go further in.
Oh, and it doesn't allow you to delete these 5" folders regardless of permissions. And this folder seems to show up in many areas of the website's directory structure, mostly where the images are (I don't know if that is just a coincidence).
So I hope all this makes sense. Has anyone seen this before, and any idea what the cause could be? Their host doesn't seem to know the reason and says they cannot see it even though others can. They say the FTP program is the cause, not their server.
My comeback to that is that I've used this FTP client for years and have never seen this happen before. It's only with this one client's server.
The scenario is that a client wants to enable unlimited URLs for his individual customers, i.e. [url]. Platform: W2K3, IIS6.
I only know two ways of doing it:
1. Create a real folder per user (/username1, /username2), but this will be really messy, and I remember there is a limitation of up to 36,000 subfolders within a root folder under Windows (correct me if I am wrong).
2. Create virtual directories under IIS Manager using an ASP/ASP.NET script. This is the easiest, but it has two problems:
a. If I have, say, 10,000 virtual directories and then try to expand that root folder under IIS Manager, IIS Manager will hang for sure.
b. Having such a huge number of virtual directories inevitably means a huge IIS metabase, which means a great chance of corrupting it, so it's very dangerous.
I really hope someone can give me some hints on how to do this in a scalable way. I know many Web 2.0 sites do this even using IIS6, i.e. [url].
How do you guys deny execution of Perl/bash scripts from /tmp, /var/tmp and /dev/shm? I've tried to build a simple shell wrapper, but that's no good if you run, for example, SpamAssassin on the same server (it needs direct I/O to/from the perl binary). I'm looking for some kind of binary wrapper or patch that will deny running Perl scripts from public folders (the same for shell scripts would also be great). Any ideas or solutions?
If anyone is interested, here is the primitive shell wrapper code:
Code:
#!/bin/sh
# Wrapper standing in for /usr/bin/perl; the real binary was renamed perl.orig.
# Refuse to run when any argument mentions /tmp/; otherwise pass everything through.
case "$*" in
    */tmp/*) exit 1 ;;
    *) exec /usr/bin/perl.orig "$@" ;;
esac
We've had someone starting nobody Perl procs on a box, and we can't quite track it down or read the file to see what it is. What he does is create a folder in /tmp, execute the script from there, and delete the folder as soon as it's running (yes, /tmp is mounted noexec; it makes no difference, since the script is run via the perl binary rather than executed directly). We've managed to discover and block the IP that was doing this, but that's no fix. He hasn't been back since we banned the IP...so far.
What we would like is a script that can watch the /tmp folder and copy newly created directories and their contents to another dir (notifying via email would also be helpful), in order to see what the heck it's doing and hopefully figure out how it's getting in. There's nothing in any logs this time, and the Perl process seems to be able to hide itself from ps. That bit worries me quite a lot, but none of the binaries appear to have been changed, and it doesn't appear we've been rooted in any way. Thoughts, ideas and suggestions are welcome.
Failing that, is it possible, without breaking the box, to prevent the creation of new directories in /tmp? I seriously doubt it, but if all they need to do is create a folder and work from there, noexec is a joke.
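A minimal sketch of such a watcher, using inotifywait from inotify-tools (assumed installed); the quarantine path and the mail notification are illustrative, and the long-running watch loop is left commented out so the copying logic can be read on its own:

```shell
#!/bin/sh
# Quarantine newly created /tmp directories so they survive the attacker's cleanup.
QUARANTINE=${QUARANTINE:-/root/tmp-watch}   # illustrative destination
mkdir -p "$QUARANTINE" 2>/dev/null

quarantine_dir() {
    # Copy one newly seen directory (with contents) into the quarantine area.
    src=$1
    [ -d "$src" ] || return 1
    dest="$QUARANTINE/$(date +%Y%m%d%H%M%S)-$(basename "$src")"
    cp -a "$src" "$dest" && echo "quarantined $src -> $dest"
}

# The actual watcher (runs forever; uncomment to deploy):
# inotifywait -m -e create,moved_to --format '%w%f' /tmp | while read -r p; do
#     quarantine_dir "$p" &&
#         echo "new directory in /tmp: $p" | mail -s "/tmp watcher" root
# done
```

inotify fires on mkdir immediately, so the copy may well win the race against a script that deletes its own folder moments later, but it is still a race, not a guarantee.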
OK, so I'm on the shell looking at the contents of my home/ directory, which has all the various accounts. In each account directory are subdirs such as mail, logs, tmp, etc., and public_html. Is there a way (or a zip command) to be in the home directory and create one big ZIP file containing all the account directories, each containing ONLY the respective public_html subdir for that account?
Using the command "zip -r Backups.zip ./" seems to include all those extra folders (such as mail, etc.) that aren't needed.
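zip's -i (include) option restricts the archive to entries matching a pattern, which should do exactly this from inside home/. A sketch, run here against a throwaway layout with made-up account names:

```shell
# Build a tiny stand-in for home/ and archive only the public_html trees.
cd "$(mktemp -d)"
mkdir -p acct1/public_html acct1/mail acct2/public_html
echo hello > acct1/public_html/index.html
echo skip  > acct1/mail/message

# -i keeps only matching entries; quote the pattern so the shell doesn't expand it.
zip -qr Backups.zip . -i '*/public_html/*'

unzip -l Backups.zip   # shows the public_html entries but not mail/
```

Run from the real home/ directory, the same `zip -qr Backups.zip . -i '*/public_html/*'` should skip mail, logs, tmp and the rest automatically.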
Something weird is happening here. I have tried every string possible...
There are a number of folders I want to remove from my server, so I tried the good old and simple...
rm -r /folder/
And I ended up with a string of prompts as long as my screen. No matter what I do, as it recurses into the directory it asks me if I want to remove each file individually. No matter what options or actions I take, it insists on asking me as it goes to delete each file.
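There are two usual causes of this per-file prompting, and neither is cured by changing the command string: either rm is aliased to `rm -i` (common for root on Red Hat-style systems), or the files are write-protected, in which case rm asks before removing each one unless -f is given. A sketch of both checks (the /some/folder path is a placeholder):

```shell
type rm                  # reveals an alias such as: rm is aliased to `rm -i'
rm -rf /some/folder      # -f: ignore write-protection and never prompt
\rm -rf /some/folder     # the backslash bypasses any shell alias for rm
```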
I have 2 domains on 1 account. My main website is www.aviationcafe.net, and I added on www.modelcuir.com; that's what it looks like to the public. But with my host it appears as www.modelcuir.aviationcafe.net.
I noticed in the files area that modelcuir has its own folder. I can password-protect that, but it would stop people getting onto the website completely, and I only want to stop them getting into the members area.
I can't create a members area either, unless I can add a new folder, which I don't think I can.
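On a typical Apache host, this is usually done with a .htaccess file placed only in the members folder, so the rest of the site stays public. A sketch; the folder name members/ and the paths are illustrative, and the password file would be created separately with `htpasswd -c`:

```apache
# .htaccess inside the members/ directory only -- everything else stays public
AuthType Basic
AuthName "Members Area"
AuthUserFile /home/youraccount/.htpasswd
Require valid-user
```

This way the addon domain itself loads normally, and only requests under members/ trigger the login prompt.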
Linking domains to folders: I'm using Webmin on a Debian server, and setting up DNS with BIND isn't a problem, I can do that. But what if I want to link, let's say, example.com to the folder hdoc/example/? How would one do that?
At the moment, when setting up the zones, they all point to the main page.
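In Apache terms, pointing a domain at a folder is a name-based virtual host: DNS only gets the browser to the server, and the vhost's ServerName then decides which folder answers. A minimal sketch (the hdoc/example path is from the question, written here as an absolute path, so it may need adjusting):

```apache
<VirtualHost *:80>
    ServerName   example.com
    ServerAlias  www.example.com
    DocumentRoot /hdoc/example
</VirtualHost>
```

Webmin's Apache module can create an equivalent virtual host through its GUI, one per domain, each with its own DocumentRoot.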
I am working with an Apple Lion Server. I want to give users the ability to access certain share points with a web browser via the WebDAV protocol. The OS lets you define share points in the GUI, and there you can specify that users may access the share points via WebDAV, but it is still not possible to reach the folders from a browser. You just get an error from the web server after logging in:
You don't have permission to access /webdav/ on this server.
So I have looked at the relevant configuration file, "httpd_webdavsharing.conf" (Apache v2.2):
Code:
#
# Apache Config for WebDAV Sharing
# Activated and deactivated by com.apple.webapp.webdavsharing webapp
#
RegisterResource "WebDAV Sharing: %c %s" /webdav main webdav
RewriteEngine On
RewriteMap webdavmap prg:/usr/libexec/webdavsharing/webdavsharing_mapper
[Code] .....
Is there a way to modify this configuration so that it allows the desired access?