Tar Many Directories, In Pieces? I Have A Start, Can You Fill In?
Sep 12, 2007
I'm trying to tar a ton of directories and files, breaking them up into pieces in an attempt to get around the "too many arguments" problem.
Here's what I tried...
tar -czf [s-z]* ../as-z.tgz
I thought that would tar all directories and files starting with s, and ending in z, placing the tar one directory up.
Instead, I received the following error:
--------
tar (child): s1111: Cannot open: Is a directory
tar (child): Error is not recoverable: exiting now
Broken pipe
--------
s1111 was the first directory starting with s.
I've been fighting this issue all day long, and tried numerous other things that didn't work.
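For what it's worth, a minimal sketch of one way this could work: with tar the archive name has to come straight after -f, so the original command had the arguments reversed. The list-file path /tmp/s-z.list below is just an assumption for illustration.
Code:
# Archive name first (right after -f), then the things to pack:
tar -czf ../as-z.tgz [s-z]*

# If the expanded file list really is too long for the shell, let GNU tar read
# the names from a file instead (--files-from):
find . -maxdepth 1 -name '[s-z]*' > /tmp/s-z.list
tar -czf ../as-z.tgz --files-from=/tmp/s-z.list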
I am having a problem with a server. core.xxxx files keep appearing on all of the sites on the server and are filling it up. Quotas were disabled because some people were having trouble logging in due to this error:
Quote:
Sorry for the inconvenience!
The filesystem mounted at /home/*** on this server is running out of disk space. cPanel operations have been temporarily suspended to prevent something bad from happening.
Please ask your system admin to remove any files not in use on that partition.
How do I remove all of them so they don't appear again? On some sites there are thousands of core.xxxx files, weighing over 60GB.
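A rough sketch of the kind of cleanup this usually needs, assuming the dumps live under /home and follow the usual core.<pid> naming; review the paths before running anything on a live box.
Code:
# Remove existing core dumps (assumes names like core.12345 under /home):
find /home -type f -regex '.*/core\.[0-9]+' -delete

# Stop new core dumps from being written (PAM-based logins and setuid programs):
echo '* hard core 0' >> /etc/security/limits.conf
echo 'fs.suid_dumpable = 0' >> /etc/sysctl.conf
sysctl -p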
Which of the directories /usr and /var can safely be renamed for a moment, with their contents mounted on a new HD, so that SSH and the server will not stop working?
So one of my hypervm servers got a lot of corrupt binaries in the operating system (CentOS).
My datacenter has plugged in an external hard drive so I can backup my files there so the DC can reformat the server.
I know that I will need to copy: /vz and /home and /etc
But.. the one important thing is the hypervm MySQL database! What do I have to back up? /var/lib/mysql? Any other directories?
And how would I restore that MySQL to the new server, when hypervm is installed and the server is formatted?
I will be installing hypervm on the server again when it's done formatting.
I know that hypervm has exploits, so I am going to stop hypervm once I get the server up and running back on its feet. And then switch to VZWave when the production version comes out.
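As for the MySQL side, a hedged sketch, assuming mysqld can still run on the broken server and the external drive is mounted at /mnt/backup (that mount point is an assumption). Copying /var/lib/mysql raw also works, but only with mysqld stopped and a matching MySQL version on the rebuilt server, which is why a dump is usually safer.
Code:
# Dump every database (including hypervm's) to the backup drive:
mysqldump --all-databases > /mnt/backup/all-databases.sql

# On the rebuilt server, after hypervm and MySQL are installed again:
mysql < /mnt/backup/all-databases.sql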
I am working on a website which is planned to have many articles, each with their own directory [url].
I would like the website to be as automated as possible in terms of adding new content and the like.
After a little bit of research, I learned about Apache's ErrorDocument directive, and IIS's equivalent.
What I had in mind was that users would type in the URL they wanted (such as [url]), and the server would not be able to find this directory and would try to return a 404.
The file I would tell the server to use for 404 messages would be a PHP file. It would read the URL (to find out what the user wants to look at), convert the data from the URL into the same format as the article details held in my database, and then check whether any articles match what the user was looking for.
My understanding is that if an article exists, I will be able to extract the article information from the database and present it as the web page while manually sending an HTTP 200 response. If the article did not exist, a standard 404 page could be offered.
My questions are: Is my understanding of the process here correct?
Is the process and procedure that I want to follow correct? Does it have any flaws?
Other than how to tell the server about the custom ErrorDocument PHP script, is there anything about the process that differs between Apache and IIS?
Assuming that the above works for typical users, would it work properly for search engines as well?
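For reference, a minimal sketch of the Apache side under the assumptions above. The handler name article.php and the .htaccess path are made up for illustration; inside that script, $_SERVER['REQUEST_URI'] still holds the originally requested path after the internal ErrorDocument redirect, so it can look the article up and send header('HTTP/1.1 200 OK') on a hit.
Code:
# Assumes Apache with .htaccess overrides enabled; article.php is hypothetical.
echo 'ErrorDocument 404 /article.php' >> /var/www/html/.htaccess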
There is no directory for the root account on my server. The server admin tells me this: "It looks like all the files in there were deleted. I suggest you terminate and recreate the site in order to reset all the relevant directories."
Can any one tell me the ssh command that will do this for me?
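If it really is just root's home directory that is gone, a hedged sketch over SSH is below; this assumes the missing directory is /root, whereas "terminating and recreating the site" sounds like a control-panel operation rather than a single command.
Code:
# Recreate root's home with the usual ownership and permissions:
mkdir -m 700 /root
chown root:root /root
# Restore the default shell startup files from the skeleton directory:
cp /etc/skel/.bash_profile /etc/skel/.bashrc /root/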
Whenever I try to connect via FTP with any of my server accounts, it does not show any directories inside public_html, but when I use the cPanel file manager it works fine. I also tried different FTP client programs, but the problem is still not solved. Can any of you help me? FTP connects successfully but does not show files and folders.
When I FTP into my server, I can't see files and folders starting with dots, such as .thumbs or .htaccess. How do I configure my server (through SSH) so that these files are visible rather than hidden? I'm running Fedora on my server.
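A hedged sketch, assuming the FTP daemon is vsftpd (the equivalent setting differs for ProFTPD or Pure-FTPd):
Code:
# Tell vsftpd to include dot-files such as .htaccess in directory listings:
echo 'force_dot_files=YES' >> /etc/vsftpd/vsftpd.conf
service vsftpd restart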
On my server the permissions got corrupted, and cPanel support said to change the permissions manually for all the domains. How can I change the permissions for all the domains using a script?
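A rough sketch of the kind of loop that can do this across all cPanel accounts, assuming the standard /home/<user>/public_html layout and the usual 755/644 scheme; try it on a single account first.
Code:
# Reset directory and file permissions for every cPanel account's public_html:
for userfile in /var/cpanel/users/*; do
    u=$(basename "$userfile")
    find "/home/$u/public_html" -type d -exec chmod 755 {} \;
    find "/home/$u/public_html" -type f -exec chmod 644 {} \;
done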
How do I password-protect directories when there is no control panel? I am planning to change the login details of the protected directories every few days, as the data is top secret. I know how to do this using a control panel such as cPanel or Plesk, but I have no control panel on this server.
I intend to share the files under these protected directories only with my team, so please help me with the commands or code if there are any.
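Without a control panel this is normally done with Apache's htpasswd utility plus a .htaccess file. A minimal sketch, assuming Apache with AllowOverride AuthConfig enabled; the paths and user names are assumptions.
Code:
# Keep the password file outside the web root:
htpasswd -c /etc/httpd/team.htpasswd alice   # -c creates the file; prompts for a password
htpasswd /etc/httpd/team.htpasswd bob        # add further team members (no -c)

# Protect the directory:
cat > /path/to/protected/.htaccess <<'EOF'
AuthType Basic
AuthName "Restricted"
AuthUserFile /etc/httpd/team.htpasswd
Require valid-user
EOF
Changing the login details every few days is then just a matter of re-running htpasswd for each user.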
For some reason I can't access any of my accounts on my dedicated server via FTP. It simply times out when it tries to display the directories.
Here's a log from FileZilla...
Code:
Status: Resolving address of testdomain.com
Status: Connecting to 64.237.58.43:21...
Status: Connection established, waiting for welcome message...
Response: 220---------- Welcome to Pure-FTPd [TLS] ----------
Response: 220-You are user number 3 of 50 allowed.
Response: 220-Local time is now 19:39. Server port: 21.
Response: 220-This is a private system - No anonymous login
Response: 220-IPv6 connections are also welcome on this server.
Response: 220 You will be disconnected after 15 minutes of inactivity.
Command: USER testaccount
Response: 331 User testaccount OK. Password required
Command: PASS ********
Response: 230-User testaccount has group access to: testaccount
Response: 230 OK. Current restricted directory is /
Command: SYST
Response: 215 UNIX Type: L8
Command: FEAT
Response: 211-Extensions supported:
Response: EPRT
Response: IDLE
Response: MDTM
Response: SIZE
Response: REST STREAM
Response: MLST type*;size*;sizd*;modify*;UNIX.mode*;UNIX.uid*;UNIX.gid*;unique*;
Response: MLSD
Response: ESTP
Response: PASV
Response: EPSV
Response: SPSV
Response: ESTA
Response: AUTH TLS
Response: PBSZ
Response: PROT
Response: 211 End.
Status: Connected
Status: Retrieving directory listing...
Command: PWD
Response: 257 "/" is your current location
Command: TYPE I
Response: 200 TYPE is now 8-bit binary
Command: PASV
Response: 227 Entering Passive Mode (64,237,58,43,145,153)
Command: MLSD
Response: 150 Accepted data connection
Response: 226-ASCII
Response: 226-Options: -a -l
Response: 226 18 matches total
Error: Connection timed out
Error: Failed to retrieve directory listing
I have restarted the FTP service several times, but it still doesn't load.
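Since the login works and the failure only happens once the MLSD data transfer starts, the usual suspect is a firewall blocking the passive data ports. A hedged sketch for Pure-FTPd, with the 30000-35000 range and the config path as assumptions (they vary by install):
Code:
# Pin Pure-FTPd's passive ports to a known range:
echo 'PassivePortRange 30000 35000' >> /etc/pure-ftpd/pure-ftpd.conf
/etc/init.d/pure-ftpd restart

# Then allow that range through the firewall:
iptables -A INPUT -p tcp --dport 30000:35000 -j ACCEPT
service iptables save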
I have many directories from years ago on a fileserver. Many of them are indexed on Google. I have five webservers reading from one fileserver for the data. I'd like to keep copies of the most recent data on the webservers and through mod_rewrite tell it to use local copies instead of reading through the fileserver. I would like to move most of my older directories into an "archived" directory.
If I move most of my older directories into an archived directory, the Google listings will then 404. I could put a 404 doc up redirecting to the index page, but Google will drop the link, which I would like to avoid.
I could add a new RewriteCond / RewriteRule (below) for each new directory. But, it would need to be added in the httpd.conf file for each of the webservers and restarted. With the number of new directories I add, this would be infeasible.
Does anyone know of a way to accomplish what I am trying to do? I do not need the older directories in the "archived" directory to be accessible or rewritten, I just don't want to lose the search engine listings from them. And, I want to reduce the strain on the fileserver for these files.
I also don't know of a way to say: rewrite everything locally unless it is in the /dir1/archived directory, and if there is a 404 (the directory probably existed at some point), redirect it to the index and give a 200 status code.
Does anyone have any suggestions? I tried mod_cache but it doesn't seem all that speedy.
RewriteEngine on
RewriteCond %{REQUEST_URI} ^/dir1/dir2
RewriteRule ^/dir1/dir2/(.*)$ /path/to/dir1-localcache/dir2/$1
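One hedged alternative to adding a rule per directory: a single rule that only rewrites when a local copy of the requested file actually exists, so new directories need no config change and anything archived simply falls through to the fileserver path. This assumes RewriteEngine on is already set (as above) and uses the same /path/to/dir1-localcache layout from the existing rule; the httpd.conf path is an assumption.
Code:
cat >> /etc/httpd/conf/httpd.conf <<'EOF'
# Serve /dir1/<path> from the local cache only when a local copy exists;
# everything else (including /dir1/archived/...) falls through untouched.
RewriteCond /path/to/dir1-localcache/$1 -f
RewriteRule ^/dir1/(.*)$ /path/to/dir1-localcache/$1 [L]
EOF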
How can I hide all files and directories in public_html so that apps such as FlashGet's Site Explorer and similar software will not show any files or directories in public_html?
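A hedged sketch, assuming Apache: site-explorer style tools rely largely on automatic directory listings, so turning indexes off hides anything that is not explicitly linked (it cannot hide files whose URLs are linked or guessed). The .htaccess path is an assumption.
Code:
echo 'Options -Indexes' >> /home/username/public_html/.htaccess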
Due to a major bug found in some software installed on our server, I would like to scan for all directories set with 0777 permissions. Is there a bash script or command I can run to get a list, or possibly even chmod the directories to something else?
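A minimal sketch, scanning under /home (adjust the path to suit); review the list before changing anything.
Code:
# List every directory with exactly 0777 permissions:
find /home -type d -perm 0777 > /root/0777-dirs.txt

# After reviewing the list, tighten them to 0755:
find /home -type d -perm 0777 -exec chmod 755 {} \;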
I have got a server running Windows XP Professional (I know it is not an optimal choice), Apache, PHP, MySQL and so on.
What I want to do is to enable access to the disc on the server so that people can upload and download files and directories from the server. I know it is not a safe thing to share the folder on the network. But I don't have to enable access to the whole disc, I can do it only with one given directory.
I shared it in this way: Name_of_directory -> RMB (right mouse button) -> Sharing and Security -> Sharing -> Network sharing and security: [X] Share this folder on the network, Share name: [Name_of_directory], [X] Allow network users to change my files.
The other thing was to change name of the server: My Computer -> RMB -> Properties -> General -> I changed name to "hpserwer".
I enter "Start -> Search" on my laptop (not on remote desktop but on my laptop when I am connected to local network of the company). I choose "Find other elements -> Computers" and write "hpserwer". The result is "no founds". What should I do?
Is there a way to change your server so that if you're trying to link to site.com/a/page.html through a link on site.com/b/page.html you can simply put /a/page.html instead of having to put ../a/page.html? I've noticed that some sites don't use the .. and was wondering how I could do the same.
We have a weird issue on one of our CentOS 5 servers, running the 2.6.18-53.1.4.el5 kernel.
For some reason we can't mkdir any directories that are numerically named,
i.e. mkdir 5789 doesn't work.
Alphabetic and alphanumeric names work fine, i.e. mkdir ab567a.
Quote:
[root@]# mkdir 6
mkdir: cannot create directory `6': No such file or directory
[root@]# mkdir 6ab
[root@]#
Someone suggested the adore-ng rootkit was to blame, but we have run OSSEC's rootcheck, rkhunter, and chkrootkit, and even run ClamAV over the drives, and can't find anything nasty.
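A hedged diagnostic sketch for narrowing down whether the failure comes from userland or from a hooked kernel call (the classic rootkit pattern), assuming strace and rpm are available on the box:
Code:
# Watch the actual mkdir() syscall: if the syscall itself returns ENOENT even
# though the parent directory exists, something below userland is interfering.
strace -e trace=mkdir /bin/mkdir 6

# Check whether coreutils (which provides /bin/mkdir) has been tampered with:
rpm -V coreutils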
I had what I thought was a fairly smooth install via yum of Pure-FTPd on a Fedora 7 SELinux server. I configured it to use its own PureDB virtual user system, and I added a few users using # pure-pw useradd to test things out. However, upon successfully logging in...
[21:51:34] USER test1
[21:51:34] 331 User test1 OK. Password required
[21:51:34] PASS (hidden)
[21:51:34] Cannot login waiting to retry (30s)...
[21:51:34] Server closed connection
...so, I check /var/log/messages to find...
Jul 31 21:50:25 homeserve pure-ftpd: (?@192.168.0.134) [INFO] New connection from 192.168.0.134
Jul 31 21:50:25 homeserve pure-ftpd: (?@192.168.0.134) [ERROR] Home directory not available - aborting
Clearly, something is awry. When I created the users, I explicitly specified their home directory using pure-pw's -d flag, and I can confirm that it was entered correctly by viewing the /etc/pure-ftpd/pureftpd.passwd. The directories, of course, do exist with the proper permissions and ownership.
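Two hedged things worth checking under that setup: pure-pw edits only take effect once the PureDB file is rebuilt, and an SELinux-enforcing Fedora box may deny the FTP daemon access to home directories. Paths below assume the /etc/pure-ftpd layout already mentioned; make sure the daemon's PureDB setting points at the same .pdb file.
Code:
# Rebuild the PureDB from the passwd file (pure-pw changes are not live until then):
pure-pw mkdb /etc/pure-ftpd/pureftpd.pdb -f /etc/pure-ftpd/pureftpd.passwd

# Look for SELinux denials and, if they show up, allow FTP access to home dirs:
grep ftpd /var/log/audit/audit.log | tail
setsebool -P ftp_home_dir 1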
I was wondering if it would be advisable to use the Password Protected Directory option in cPanel to limit part of my website to no more than 1,000 paying customers (yearly subscription). Can cPanel handle this? Would authenticating via .htaccess and .htpasswd be too slow? Would management become too much? Has anyone attempted this? Are there any good alternatives, such as open source programs? I've looked and found a few that are expensive and do way more than what I need.
I have directory1, directory2, directory3 etc and each has directories inside them. Is it possible to zip them all (directory1, directory2, directory3 and their contents) into one .zip file? If so, what is it?
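A minimal sketch, assuming the Info-ZIP zip command is installed and the directories sit in the current working directory:
Code:
# -r recurses into each directory, packing them and their contents into one archive:
zip -r everything.zip directory1 directory2 directory3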