Archiving A Whole Directory
Sep 29, 2007
How do I archive a whole directory using ssh?
I know about gzip and tar, but what command do I use to make an archive of a whole directory?
[Directory ---> One_Archive.tar]
It will be easier to transfer big directories this way, by archiving everything into one file.
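A minimal sketch of the usual approach, with the directory name, host, and paths as placeholders: tar bundles the directory into one file (add -z to gzip it), and scp carries it over SSH.

# Bundle the whole directory into one archive
tar -cvf One_Archive.tar mydir/
# Or gzip it at the same time
tar -czvf One_Archive.tar.gz mydir/
# Copy the result to the remote machine over SSH
scp One_Archive.tar.gz user@remote.example.com:/path/to/destination/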
View 3 Replies
Apr 27, 2007
I have many directories from years ago on a fileserver. Many of them are indexed on Google. I have five webservers reading from one fileserver for the data. I'd like to keep copies of the most recent data on the webservers and through mod_rewrite tell it to use local copies instead of reading through the fileserver. I would like to move most of my older directories into an "archived" directory.
If I move most of my older directories into an archived directory, the Google listings will then 404. I could put a 404 doc up redirecting to the index page, but Google will drop the link, which I would like to avoid.
I could add a new RewriteCond / RewriteRule (below) for each new directory. But it would need to be added to the httpd.conf file on each of the webservers, and each would need to be restarted. With the number of new directories I add, this would be infeasible.
Does anyone know of a way to accomplish what I am trying to do? I do not need the older directories in the "archived" directory to be accessible or rewritten, I just don't want to lose the search engine listings from them. And, I want to reduce the strain on the fileserver for these files.
I also don't know of a way to say: rewrite everything locally unless it is in the /dir1/archived directory; and, if there is a 404, assume the directory probably existed at some point, redirect it to the index page, and return a 200 status code.
Does anyone have any suggestions? I tried mod_cache but it doesn't seem all that speedy.
RewriteEngine on
RewriteCond %{REQUEST_URI} ^/dir1/dir2
RewriteRule ^/dir1/dir2/(.*)$ /path/to/dir1-localcache/dir2/$1
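A hedged sketch of one way to get serve-locally-with-fallback behavior without a rule per directory: let mod_rewrite test whether the file exists in the local cache and only rewrite when it does, so anything missing (including /dir1/archived content) falls through to the fileserver. Paths are placeholders, and this is untested against the setup described above.

RewriteEngine on
# $1 is the path captured by the RewriteRule below; rewrite to the
# local cache only when the cached copy actually exists
RewriteCond /path/to/dir1-localcache/$1 -f
RewriteRule ^/dir1/(.*)$ /path/to/dir1-localcache/$1 [L]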
View 0 Replies
View Related
Jul 20, 2008
I was hesitant to even post this because I've been on the net for so long, and I know just about every method out there. However, large corporations obviously can't live with losing data, so I figured I'd see what everybody else is doing and work out my next battle plan for archiving data.
Here's what happened. I had triple backups of some data. I burned them twice on DVD (yes, two DVDs of each archive) and also had them on a WD hard drive. ALL are dead or unreadable. I had the discs in a DVD binder, and they were never touched for a good 5 years. I put them in a player (multiple players) and only certain parts were readable.
Other parts came up with errors. (Yes, I'm aware there are companies that offer recovery for both CDs/DVDs and hard drives, but they always charge a fortune and in some cases aren't worth it.) This was Memorex media, by the way, which might have had something to do with it, but at the time I thought it was the best. Next I went to the hard drive. This was stored in a room-temperature house over a good 5 years, and when I plugged it into a computer's IDE port, nothing. I'm not sure exactly what's wrong with the drive, and I'm sure it could be fixed and recovered, but once again, that's not my question.
My Question: What methods are you using to store your archives over long periods of time? Any recommendations on hardware? I'm thinking of DLT Tape backups and of course doing RAID next but don't want to run into the same problem in 5 years. How can I sleep at night (which is already impossible) knowing that in 5 or 10 years when I'm ready to get into my backups that they will be there?
What are you doing and what do you recommend? Better yet, what the hell is Hollywood doing to keep all their music and movies safe?
View 3 Replies
View Related
Jan 17, 2008
I've been running a server on Windows SBS 2003 for 2 years now, and within the past 2 weeks the server has been locking up or becoming very sluggish after only a few hours of use.
I've noticed that STORE.EXE is using a lot of memory. There are a number of people in the building using this server for various tasks with Microbiz, QuickBooks, and Exchange Server. I researched the topic and was told that if I can archive Exchange, it should help with performance. Does anyone know how to do this, or how to lower the server's memory use?
View 8 Replies
View Related
May 31, 2007
I have a situation like this:
There is a directory, say "Master", and inside "Master" there is a sub-directory, "Slave". A user who has access to "Master" should be able to access "Slave" automatically. However, a user who has access to "Slave" should not have access to "Master". Inside cPanel this type of protection is not possible.
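It can, however, be done with plain .htaccess files; a hedged sketch, with the usernames and the password-file path as placeholders. The deeper .htaccess overrides the parent's Require line, and keeping the same AuthName lets the browser reuse the Master credentials for Slave.

# Master/.htaccess -- only the master user may enter
AuthType Basic
AuthName "Protected"
AuthUserFile /home/user/.htpasswd
Require user master_user

# Master/Slave/.htaccess -- both users may enter
AuthType Basic
AuthName "Protected"
AuthUserFile /home/user/.htpasswd
Require user master_user slave_user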
View 3 Replies
View Related
Dec 11, 2008
How do I direct my httpd file to point to:
home/USER/public_html
instead of:
C:\Users\test\ etc...
I want to do this to make my test server just like the remote server.
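A minimal sketch of the httpd.conf change, using the path from the question. On Windows, Apache resolves a path without a drive letter against the drive ServerRoot lives on, so (as an assumption about the layout) C:\home\USER\public_html would satisfy it.

# Serve the site from the Unix-style path
DocumentRoot "/home/USER/public_html"
<Directory "/home/USER/public_html">
    Options Indexes FollowSymLinks
    AllowOverride All
    Order allow,deny
    Allow from all
</Directory>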
View 5 Replies
View Related
Sep 18, 2008
I want to move the entire contents of a directory tree to another directory.
So, for example, we may have a directory with 15 directories inside, and each of those directories contains files. I want to copy all the files from the directory tree into another directory located somewhere else on the file system. I want only the files to end up in the other directory, not the directory structure.
I'm running the latest version of CentOS.
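A common one-liner for this, with the paths as placeholders; note that files sharing a name in different subdirectories will overwrite one another in the flat destination.

# Copy every regular file in the tree into one flat directory
find /path/to/source -type f -exec cp {} /path/to/destination/ \;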
View 4 Replies
View Related
Nov 1, 2009
Which files could I safely delete/archive from the usr directory? Also, what is the command to list each subdirectory's size?
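For the size question, du does this; a sketch with standard Linux flags:

# Size of each subdirectory under /usr in KB, largest last
du -sk /usr/* | sort -n
# Human-readable variant
du -sh /usr/*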
View 8 Replies
View Related
Jan 31, 2008
If we want to send the output of "tar" on a directory to a different server, how can we do that?
I tried this command, but it does not work:
tar -cvf oldserverzip.tar /directory | ssh newserver@newserver.com
Basically, what we want is to move a site "dump" to another server without creating a copy on the source server (as there is no space available there to create a "dump").
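The usual pattern is to write the tar stream to stdout with -f - and give ssh a command that writes it to a file on the destination, so nothing is stored on the source server; the host and paths here are placeholders.

# Stream the archive straight over SSH
tar -cvf - /directory | ssh user@newserver.example.com "cat > /path/to/oldserverzip.tar"
# Or compress it in transit
tar -czvf - /directory | ssh user@newserver.example.com "cat > /path/to/oldserverzip.tar.gz"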
View 12 Replies
View Related
Aug 14, 2007
Cpanel 11
Fedora
Is there any particular reason or advantage to having SSL files load from a different directory than the one the HTTP files load from?
For instance, the HTTP home files are in
/home/domain
Is there any reason that an index.html (an insecure file) would need to redirect to secure files in a different directory?
View 4 Replies
View Related
May 6, 2007
How do I go about unlocking my bin directory and making it writable again? I tried to do installworld with the latest FreeBSD release, and as soon as it gets to the bin directory I get a permission denied error. I looked at the permissions and ownership compared to another FreeBSD box, and they all look the same. I tried to create a file using touch just to test it, and I received a permission denied error.
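One possibility worth checking, as an assumption since the post doesn't show it: FreeBSD marks system binaries with the schg (system immutable) flag, which causes permission-denied errors even for root and doesn't show up in a plain ls -l. A sketch:

# Show BSD file flags; look for "schg" in the flags column
ls -lo /bin
# Clear the immutable flag recursively, then retry installworld
chflags -R noschg /bin

If the machine runs at a raised securelevel, the flag can only be cleared in single-user mode.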
View 3 Replies
View Related
Apr 21, 2007
I want to set up a directory so that only a script can see the files in it. The scripts will be getting the file names from the database, and I have turned off directory browsing. I have also set up a robots.txt and stopped robots from scanning the directory. Is there anything else I can do?
Also, is there any software that allows you to download an entire site, even if it is gigs of content? If so, how do I stop that from happening? I am on Linux: CentOS running cPanel.
View 2 Replies
View Related
Mar 9, 2009
I know that /www is an alias (symlink) for public_html, but I came across an account with virtual_html. Is there a difference between virtual_html and public_html?
And while I assume secure_html is for secure files, how does it tie in?
View 5 Replies
View Related
Aug 20, 2008
I'm trying to use SSH to delete a directory. It won't delete via FTP.
I've tried variations of the below, but nothing seems to work:
rm -f [folder]
rm /-f [folder]
The directory that should be deleted is test.crossroadsclub.net, so I use that as the folder.
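rm needs the recursive flag to remove a directory; -f alone applies only to files, and the stray slash in "rm /-f" makes it an invalid path. A sketch; double-check the path first, since this is unrecoverable:

# Remove the directory and everything in it
rm -rf test.crossroadsclub.net
# Or, if it is already empty, the safer:
rmdir test.crossroadsclub.net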
View 5 Replies
View Related
May 23, 2008
How do I protect a file directory from being accessed via a web browser while still allowing scripts like Flash to access it?
I have a folder with XML files in it, and I don't want a user pulling them up via their web browser.
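A hedged sketch using mod_rewrite in an .htaccess file inside the XML folder: requests whose Referer is not your own site get a 403. The Referer header is client-supplied and trivially spoofed, so this only deters casual browsing; the domain is a placeholder.

RewriteEngine on
# Forbid .xml requests that don't come from a page on our own site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.xml$ - [F]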
View 3 Replies
View Related
Oct 30, 2008
I need to make this writable, but I can't find it on my server with my FTP client. Can someone tell me what it is and where it's likely to be?
Maybe I have to create one?
View 1 Replies
View Related
Aug 6, 2008
I have a 10 GB partition on which the /var directory is mounted, and it is now 97% full. What can I do, and how can I manage it? I have no other option, because no other space is available to create a new partition and mount it there.
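Before moving anything, it helps to see what is actually using the space; old logs under /var/log are the usual culprit. A sketch with portable flags (-x / -xdev keep the tools on the /var filesystem):

# Per-directory usage in KB, largest last
du -xsk /var/* | sort -n
# Individual files over 100 MB
find /var -xdev -type f -size +100M -exec ls -lh {} \;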
View 10 Replies
View Related
May 8, 2008
The caching directory I created for eAccelerator was deleted by itself. There are 2 copies of eAccelerator, one for Apache and one for LiteSpeed. The one for Apache got deleted today. Could it be due to the fact that the directory hasn't been modified or accessed in nearly 2 weeks? It's been that long since Apache was used.
View 4 Replies
View Related
Nov 4, 2008
How do I do this in IIS?
I need users to be able to access resources such as PDFs and videos from my website once they have logged in, but I need to block users from accessing the resources directly by URL.
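A hedged sketch for IIS 7 with the ASP.NET integrated pipeline, assuming forms authentication is already in place: a web.config dropped into the resources folder that denies anonymous users. For static files such as PDFs and videos this only takes effect when the managed pipeline handles them (e.g. runAllManagedModulesForAllRequests).

<!-- web.config in the protected folder; "?" means anonymous users -->
<configuration>
  <system.web>
    <authorization>
      <deny users="?" />
    </authorization>
  </system.web>
</configuration>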
View 6 Replies
View Related
Jul 26, 2008
I updated my old pages in the FTP directory wwwroot: I deleted the old files and uploaded new pages with new names. Now I am not able to see the new website if I type the exact address; the old page is still shown. However, if I omit the www part from the address, I can see the new page.
View 5 Replies
View Related
Jul 9, 2008
I want to hide the directory listing on my site:
[url]
How do I hide the content list there?
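Assuming Apache, the usual fix is to switch off automatic directory indexes, either in the vhost config or in an .htaccess file in that directory (which needs AllowOverride Options):

# Disable the auto-generated file listing; a request for the bare
# directory then returns 403 unless an index file exists
Options -Indexes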
View 4 Replies
View Related
Jul 30, 2008
I would like to make some kind of script (probably .sh?) that automatically takes a directory, makes a copy of it, makes a gzipped tar, and then ships it over to an FTP server. I would like this to happen twice a day (i.e. every 12 hours).
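A minimal sketch of such a script; the paths, host, and credentials are placeholders, and it assumes lftp is installed (plain ftp or curl would also do). The commented crontab line underneath runs it every 12 hours.

#!/bin/sh
# backup.sh -- archive a directory and upload it via FTP
SRC=/path/to/directory
STAMP=$(date +%Y%m%d-%H%M)
ARCHIVE=/tmp/backup-$STAMP.tar.gz

tar -czf "$ARCHIVE" "$SRC"
lftp -u ftpuser,ftppass ftp.example.com -e "put $ARCHIVE; bye"
rm -f "$ARCHIVE"

# crontab entry (crontab -e): at minute 0 of every 12th hour
# 0 */12 * * * /path/to/backup.sh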
View 13 Replies
View Related
Jun 12, 2008
I am running phpSuExec on the server, and it seems all php.ini settings are being applied per directory; I would like to have them applied per user.
View 6 Replies
View Related
Jan 20, 2008
I upgraded from Apache 1.3.7 to the latest version.
Everything works nicely except the cgi-bin directory.
When a user tries to access a script, or even a standard text file, it throws up the following error:
Not Found
The requested URL /cgi-bin/first.txt was not found on this server.
Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
When they try and access the cgi-bin directory itself, they get
Forbidden
You don't have permission to access /cgi-bin/ on this server
Now, I've checked the httpd.conf file, and this is what it has for the cgi-bin:
<IfModule alias_module>
ScriptAlias /cgi-bin/ "/usr/local/apache/cgi-bin/"
</IfModule>
<Directory "/usr/local/apache/cgi-bin">
AllowOverride None
Options None
Order allow,deny
Allow from all
</Directory>
And the error logs say:
[Sun Jan 20 18:09:56 2008] [error] [client xx.xx.xx.xx] File does not exist: /home/goewowc/public_html/404.shtml
[Sun Jan 20 18:09:56 2008] [error] [client xx.xx.xx.xx] script not found or unable to stat: /usr/local/apache/cgi-bin/first.txt
The cgi-bin directory is chmodded correctly; the files are also chmodded correctly and belong to the correct group.
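One hedged reading of those logs: the script-not-found line shows Apache resolving /cgi-bin/ to the server-wide /usr/local/apache/cgi-bin/, while the 404 document is looked up under /home/goewowc/public_html, which suggests the virtual host is missing its own ScriptAlias. A sketch, with the cgi-bin path an assumption:

<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /home/goewowc/public_html
    # Map this site's /cgi-bin/ to its own directory, not the global one
    ScriptAlias /cgi-bin/ /home/goewowc/public_html/cgi-bin/
</VirtualHost>

Note also that everything under a ScriptAlias directory is treated as a CGI program, so a plain first.txt will never be served as text from there.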
View 3 Replies
View Related
May 9, 2007
css $ ls -l /san
ls: reading directory /mountfd: Too many open files
total 0
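"Too many open files" is normally the per-process file-descriptor limit being exhausted, not a problem with the directory itself; a sketch of the standard checks, assuming Linux:

# Current limit for this shell
ulimit -n
# Raise it for this session (persistent limits belong in /etc/security/limits.conf)
ulimit -n 4096
# System-wide descriptor counts: allocated, free, maximum
cat /proc/sys/fs/file-nr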
View 3 Replies
View Related
Aug 13, 2008
I'm trying to create a directory, "hello", so I log in via SSH and cd to /proc, as this is where I want the directory to be. I type "mkdir hello" and I get the following:
mkdir: cannot create directory `hello': No such file or directory
I've searched around for the error, and I've found two commonly mentioned causes:
Permissions
Parent directory doesn't exist.
However, the parent directory /proc does exist, and it has the permissions 7777.
View 8 Replies
View Related
Jun 21, 2008
I'm trying to understand how to create a remote directory, basically for external processing. Meaning:
I create let's say..
/bla/share
Then I patch that share into a running Apache/PHP setup on that server, connect externally to MySQL, and begin my remote processing.
I've tested the bandwidth between the 2 servers at about 5 Mbit/s, and I'm looking to process about 176 GB of data.
Of course, I have a dozen questions on this issue, since I've never done this before and don't know exactly how to establish a remote directory.
What is the easiest command-line software to use for a remote directory: sshfs, NFS, or something else? (Especially for an older OS, FreeBSD 4.9.)
Would data manipulation at a sustained 5 Mbit/s be really slow for processing 176 GB of data, give or take? (We're not talking about copying, that I know of; just resizing files, etc.)
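For the mount itself, sshfs needs the least setup; a sketch with the host and paths as placeholders. (Whether FUSE/sshfs builds on something as old as FreeBSD 4.9 is doubtful, so NFS may be the realistic choice there.)

# Mount the remote share locally over SSH
mkdir -p /mnt/share
sshfs user@remote.example.com:/bla/share /mnt/share
# ... process files in /mnt/share ...
# Unmount when done (Linux; on BSD, umount /mnt/share)
fusermount -u /mnt/share

For scale: reading all 176 GB once over a 5 Mbit/s link works out to about 1,408,000 Mbit / 5 Mbit/s, roughly 280,000 seconds or 78 hours, so anything that touches every byte will be slow; operations that only rewrite metadata would be fine.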
View 0 Replies
View Related