Multiple Large Core Files
May 20, 2007
Just noticed quite a few large core.* files within one of our websites (within a subfolder of public_html). Anyone know what these are and how they got there?
Linux Fedora 6, Apache 2 with mod_security, MySQL.
Our mod_sec logs get incredibly large very quickly. In the configuration for mod_security, we have specified logging options as
SecAuditEngine RelevantOnly
SecAuditLogRelevantStatus "^[45]"
but the mod_sec.log grows to almost 10 GB in a matter of 5-6 days before it is rotated to mod_sec.log.1 and a new one is created.
Is there a way we can specify that a max size of one log file is 1 GB, for example?
Another question: how come it gets so huge so quickly? We thought that logging "RelevantOnly" would only record requests that are deemed security risks.
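On the "why": SecAuditLogRelevantStatus "^[45]" marks every response with a 4xx or 5xx status as relevant, so routine 404s (missing images, bot probes) each get a full audit entry, which is usually what balloons the log. mod_security itself has no size cap, but logrotate can enforce one; a minimal sketch, assuming the log sits at /var/log/mod_sec.log (adjust the path to your setup):
Code:
/var/log/mod_sec.log {
    size 1024M
    rotate 5
    compress
    missingok
    notifempty
    copytruncate
}
copytruncate rotates the file without requiring an Apache restart to reopen the log.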
I have a customer who wants to sell access to videos of conferences he runs.
Each FLV video is approx 1 to 1.5 hours long and about 380MB, and there will be about 12 videos per conference.
Approx 4-8 conferences per year.
My customer suggests 10 - 20 people will buy access to watch each video.
Access to watch the videos will be through a password protected webpage.
The issue: the current site hosting company only allows uploads up to 150MB per file.
Can I host the Flash videos elsewhere and deliver them through the password-protected web page, without anyone else being able to see them via the server they are hosted on?
This would also reduce the bandwidth going through his current site server.
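Hosting the videos on a second server behind the protected page is a common setup. A lightweight way to stop people fetching the FLVs directly from that host is a referer check there (a sketch; example.com is a placeholder for the protected site, and referers can be spoofed, so a token-generating download script is the stronger option):
Code:
# .htaccess on the server storing the videos (mod_rewrite required)
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.flv$ - [F]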
I am trying to locate which large files are filling up / on the server, but I am having trouble using the find command to do this.
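A sketch of the usual drill-down with du, staying on the root filesystem and repeating on whichever directory is biggest:
Code:
# per-directory totals (KB) for the top level of /
du -kx --max-depth=1 / | sort -rn | head -20
# then recurse into the largest entry, e.g.:
du -kx --max-depth=1 /var | sort -rn | head -20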
I'm working on a web site which will basically be a Flash games portal. I have a dedicated server running Apache 2 on a 100mbit dedicated line, but my download speed for large files (Flash files of over 5MB) is really slow. I am thinking this is because of Apache, but I don't know much about this. I've read that I should change to a lighter HTTP server for serving static files. The way my server is set up, I have 2 virtual machines running, one doing the PHP processing and the other serving static files, both running Apache, so if I have to change HTTP server for the static files it would be very easy. Although I am not sure if this is necessary, or if I can tune Apache to push files faster than this.
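Before swapping in another server, a couple of Apache directives are worth ruling out; a sketch (confirm they suit the build in use):
Code:
# httpd.conf on the static-file VM
EnableSendfile On    # let the kernel send file data straight to the socket
EnableMMAP On        # memory-map files rather than read() them
KeepAlive On
Since both VMs share one physical disk and NIC, fetching a single large file with wget from another host helps separate Apache tuning from an I/O or virtualization bottleneck.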
I'm facing a very strange FTP issue with one of my shared-hosting accounts. All of my other servers are having no problems; only on this one, when I try to upload a file (any file) larger than 500KB from my local PCs, in most cases the file will stop uploading during the process and hang there until it times out.
There are 2 interesting things though:
The file transfer typically hangs when approximately 248KB of the file have been transferred.
For example, if I randomly pick a file and attempt to upload it to my host 10 times: 5 times it will hang when 248KB of the total size have been transferred, 3 times it will hang at other points near 248KB (224KB or 280KB typically), 1 time it will hang at another random point, and 1 time it might be uploaded successfully (yes, there is still a tiny chance for the file to upload successfully).
My normal upload speed is 80-100KB/s. Lately I found that when I limit the upload speed in my FTP client (e.g. max. 30KB/s), everything WILL WORK without any problem! No hangs, no interruptions. Whereas when I remove the speed limit and let it upload at my regular speed, the problem appears again.
It seems to me that the FTP upload hangs only when the speed is higher than 60KB/s. However, my host provider told me that they have customers uploading without any problem at over 400KB/s, and they said "there's no problem or limitations on the server at all".
Up until now, I have done the following to troubleshoot the issue, with no luck:
Contacted my host.
Disabled/Enabled the PASV mode on my FTP client.
Tried different FTP clients on different computers (FlashFXP and Filezilla).
Rebooted my router and reset everything to the factory default settings.
Contacted my ISP about the issue; they "did something" but nothing helped.
Rebooted all my PCs.
Disabled both firewalls on my PC and on the router.
Furthermore, I asked another friend of mine in another city, with another ISP, to test the FTP uploading, but unfortunately he got the exact same problem. And I've done hours of searching on the internet, but no one seems to have had the same problem.
I just logged into my VPS and was astonished by how much space I have in use.
8.09GB... but I can't figure out what's using up so much space!?
How can I find out where the large files are located? The usage is increasing daily.
I use LXAdmin with HyperVM Control Panel
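To find the individual files responsible, a sketch that lists everything over 100MB (logs and forgotten backups are the usual culprits on a VPS):
Code:
# files larger than 100MB (102400 KB), with sizes
find / -type f -size +102400k -exec ls -lh {} \; 2>/dev/null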
I have a Debian box and have archived a gallery into a .tar file, 5.77GB.
I have a CentOS box and have used wget to bring the data file over to the new server.
However, wget only reports the file as 1.8GB when it starts downloading.
I have terminal access to both servers, just trying to bring my files over from one server to another.
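A likely cause: wget builds older than 1.10 lack large-file support, so reported sizes wrap around at 4GB, and 5.77GB minus 4GB is roughly the 1.8GB shown. With shell access on both boxes, copying over SSH sidesteps the problem entirely (hostnames and paths below are placeholders):
Code:
# run on the CentOS box; -P makes the transfer resumable with progress shown
rsync -avP user@old-server:/path/to/gallery.tar /home/user/
# or a plain one-shot copy
scp user@old-server:/path/to/gallery.tar /home/user/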
I've been using Lypha for the past 4 years, but they've taken the last straw (gigabytes of backups went missing and they won't reply to emails as to why).
Looking for a web hosting package for under $10/month that has enough disk space/bandwidth to let me back up large audio/video files to it, as well as handle normal site operation (I use it for a portfolio website, as well as hosting additional domains).
I am developing a web application for a private investigative firm. They do surveillance work and therefore have surveillance videos. I would like the capabilities of uploading the videos online and allowing the client to login and view their surveillance video online.
Currently, we get the video from the PI, put it on a DVD and then mail it to the client.
This takes too long. We want the client to be able to view the video online.
Some of these videos can be up to 2 hours long.
First, is this even possible?
Second,
- how much bandwidth would a website like this take?
- Is there a host that can hold hundreds of GB of video?
I want to convert it to flash to save file size and also so I can stream it.
I have some hundreds of MBs to move, and I'm definitely not doing it by transferring via my PC over FTP.
I've seen all the tutorials on how to move your MySQL databases, but what about full folders etc.? How do I move those (PuTTY?)?
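From a PuTTY shell on the old server, whole directory trees can be pushed straight to the new box; a sketch (usernames and paths are placeholders):
Code:
# preserves permissions/timestamps and can be re-run to resume
rsync -avz /home/user/public_html/ user@newserver:/home/user/public_html/
# or stream a tarball across without a temporary file
tar czf - /home/user/public_html | ssh user@newserver 'tar xzf - -C /'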
Files such as core.12762 keep appearing in the root web directory. The number of core files appearing seems to be proportional to traffic: on average one core file per few million PHP requests.
I'd appreciate advice on how to troubleshoot this issue, or how to disable the core dumping. As it is one error per few million requests, it does not seem significant enough to spend several hours troubleshooting.
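These core.NNNN files are crash dumps the kernel writes when a process (here presumably Apache/PHP) segfaults; NNNN is the PID of the crashed process. Given the low crash rate, simply disabling dumps is reasonable; a sketch:
Code:
# /etc/security/limits.conf: forbid core dumps for login sessions
*    hard    core    0
Daemons started at boot don't read limits.conf, so for Apache also add ulimit -c 0 near the top of its init script (the path varies by distro).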
I've found some core.xxxxx files in some directories under some of our accounts.
I think those could be a trojan. Is that right?
* Many of those files are 0 bytes, but some others are about 2-3MB.
And how do I remove those? What should I do to have them removed successfully?
This is part of one of those files:
Code:
^?ELF^A^A^A^@^@^@^@^@^@^@^@^@^D^@^C^@^A^@^@^@^@^@^@^
@4^@^@^@^@^@^@^@^@^@^@^@4^@ ^@U^@^@^@^@^@^@^@^D^@^@^@Ô
[rest of the binary dump omitted: the file is binary ELF data, not readable text]
We are using cPanel on CentOS 4.5, and have ClamAV installed on the server.
I ran ls -la / and saw something new that I hadn't seen before; look at this result:
-rw------- 1 root root 9375744 Jun 2 14:10 core.21044
-rw------- 1 root root 9375744 Jun 2 14:11 core.21056
-rw------- 1 root root 9379840 Jun 2 14:44 core.22839
-rw------- 1 root root 9379840 Jun 2 14:56 core.22973
-rw------- 1 root root 9371648 Jun 2 14:59 core.22997
-rw------- 1 root root 9371648 Jun 2 15:02 core.23182
-rw------- 1 root root 9506816 Jun 22 05:26 core.26811
-rw------- 1 root root 9367552 Jun 18 04:20 core.27185
-rw------- 1 root root 9371648 Jun 18 04:22 core.27245
-rw------- 1 root root 9371648 Jun 18 04:23 core.27289
-rw------- 1 root root 9367552 Jun 18 04:24 core.27306
-rw------- 1 root root 9297920 Jun 15 06:39 core.420
-rw------- 1 root root 9367552 Jun 18 04:28 core.7092
I ran cat on one of them and saw something about the kernel; the rest of the file is filled with meaningless characters.
They also take up a huge amount of space. What are these files for?
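The ^?ELF header is the giveaway: these are core dumps (crash images the kernel writes when a process dies), not trojans, and the number in the name is the PID of the crashed process. A sketch for identifying the crashing program before cleaning them up (the httpd path is an assumption for a cPanel build):
Code:
file /core.21044    # names the program that dumped core
gdb /usr/local/apache/bin/httpd /core.21044    # then type 'bt' for a backtrace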
I have four servers, each with a quad Xeon, 4GB RAM, and 2x300GB SAS 15K RAID0 hard drives, pushing a total of 1.6gbit. They serve a lot of zip files with an average file size of 180MB. My question is, how can I optimize lighttpd 1.4.19 to push its max with very low IO-wait? I've looked up some stuff and only found options that deal with lighttpd 1.5 and use Linux-AIO for the network backend. Currently I use writev with 16 workers and a read/write idle timeout of 10s. Logging is off, too.
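On the 1.4.x branch there is no AIO backend, but a few settings are worth testing (a sketch; confirm each against the installed build):
Code:
# lighttpd.conf
server.network-backend = "linux-sendfile"   # zero-copy sends instead of writev
server.max-worker = 16
server.stat-cache-engine = "simple"         # cache stat() results between requests
With 180MB average files on two-disk RAID0, IO-wait is likely seek-bound, so sendfile mainly saves CPU copies; more spindles or more RAM for the page cache may matter more than server tuning.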
Something weird happening here. I have tried every string possible...
There are a number of folders I want to remove off my server, tried the good old and simple...
rm -r /folder/
And then I ended up with prompts as long as my screen. No matter what I do, as it goes recursively into the directory it asks me if I want to remove each file individually. No matter what string or action I take, it insists on asking me as it goes to delete each file.
Could this be a configuration option in CentOS?
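It is a CentOS default: root's shell aliases rm to rm -i (see /root/.bashrc), which is what forces the per-file prompt. The usual workarounds:
Code:
\rm -rf /folder/    # a leading backslash bypasses the alias for one command
# or drop the alias for the current session
unalias rm
rm -rf /folder/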
Now core files (for example core.18920) are filling the server; here is a sample:
-rw------- 1 someuser somegroup 3.9M Mar 17 16:15 core.18920
All server accounts have these files; every user running PHP files on their websites gets core.**** files, and all the websites' quotas are going to be exceeded.
I guess the problem comes from Apache. I did a rebuild using WHM, but the problem remains.
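Rebuilding Apache won't stop the dumps until whatever is segfaulting is identified. Apache records every crash, so a quick check (the log path is an assumption for a cPanel box):
Code:
grep 'exit signal' /usr/local/apache/logs/error_log | tail -5
Lines like "child pid 12345 exit signal Segmentation fault (11)" confirm a crashing Apache/PHP process; a recently added PHP extension or broken module is the usual suspect.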
I just want to know: is it safe to do a remote daily backup of about 70,000 files?
File sizes are about 200KB, and every day I have about 1,000 new files. rsync should first check the old files, because I delete about 30-50 of them daily, and then back up the 1,000 new files.
So how long will it take each time to compare those 70,000 files?
I have 2 options now:
1. Using a second HDD and RAID 1.
2. Using rsync and backing up to my second server, so I can save about $70 each month (see the sketch below).
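On the rsync question: it compares file lists by size and modification time rather than content, so scanning 70,000 small files typically takes seconds to a couple of minutes over SSH. A sketch of option 2 (hosts and paths are placeholders):
Code:
# nightly cron job; --delete drops the 30-50 files removed locally each day
rsync -az --delete /home/user/files/ backup@second-server:/backup/files/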
Does writing large files (i.e., 10GB backups in one archive) carry any risk of damaging a Linux filesystem?
I've got a client who wants to host audio files... Here are the sizes:
50 x 75MB
300 x 10MB
400 x 5MB
That totals 8,750MB, or 8.75GB... If he gets hundreds of visitors, it could end up being thousands of GBs of bandwidth.
I don't know what to look for to support so much bandwidth... Do you buy bandwidth? Are there special companies out there that host it for you?
The domain has PHP settings in Plesk set to 2G, and I get this error when uploading a 48MB file using WordPress. I assume I need to modify this manually in a conf file somewhere to allow uploading large files?
Requested content-length of 48443338 is larger than the configured limit of 10240000..
mod_fcgid: error reading data, FastCGI server closed connection...
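That message comes from mod_fcgid, not PHP, which is why the 2G PHP setting in Plesk has no effect: mod_fcgid caps the request body it will hand to the FastCGI backend (here 10240000 bytes, about 10MB). A sketch of the fix (the file location varies by Plesk version, and on mod_fcgid older than 2.3.6 the directive is spelled MaxRequestLen):
Code:
# e.g. in /etc/httpd/conf.d/fcgid.conf, then restart Apache
FcgidMaxRequestLen 134217728    # allow request bodies up to 128MB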
I am having a problem with a server. core.xxxx files have started appearing on all the sites on the server, and as a result they are filling it up. Quotas were disabled because some people had issues logging in because of this error:
Quote:
Sorry for the inconvenience!
The filesystem mounted at /home/*** on this server is running out of disk space. cPanel operations have been temporarily suspended to prevent something bad from happening.
Please ask your system admin to remove any files not in use on that partition.
How do I remove all of them so they don't appear again? On some sites there are thousands of core.xxxx files, weighing over 60GB.
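A sketch for the cleanup; previewing with ls first avoids removing anything unexpected, and the -exec form works on older find builds:
Code:
# preview what would be deleted
find /home -type f -name 'core.[0-9]*' -exec ls -lh {} \;
# then remove
find /home -type f -name 'core.[0-9]*' -exec rm -f {} \;
To stop them reappearing, disable core dumps (a * hard core 0 line in /etc/security/limits.conf, plus ulimit -c 0 in the Apache init script) while the underlying crash is tracked down.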
I am getting really frustrated in my attempts to automatically uncompress a bunch of .tar.gz files within a directory.
I was told to try tar -xvzf $i; from within the directory containing the tarred files, but all that returned was errors.
Even a Google search hasn't turned up info that a newbie like me can understand.
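The $i in that advice was a loop variable from a script; run on its own it expands to nothing, hence the errors. The full loop is short:
Code:
# run inside the directory holding the archives
for f in *.tar.gz; do
    tar -xzf "$f" || echo "failed: $f"
done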
I want to install a script (a simple WordPress blog) on my website, but now I'm just astonished at how many files I have to upload to my server's directory! Uploading those files one by one will take forever. Is there a way to upload multiple files all at once?
I'm sure I've asked this before, but I can't seem to find my old post via search. I have just moved to a new server and I need to change the IP addresses in the DNS on the new server (I have copied the zones from the old one) to replace my old server's IP with my new server's, so that when I change my nameservers the new server handles serving the site.
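A sketch using sed across the copied zone files (the IPs are placeholders; the path assumes BIND's standard layout, and each zone's SOA serial should be bumped afterwards so the change propagates):
Code:
cp -a /var/named /var/named.bak    # back up first
sed -i 's/203\.0\.113\.10/198\.51\.100\.20/g' /var/named/*.db
rndc reload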
I have a few directories on my server which have something in the region of 200+ subdirectories containing 1 or 2 files each. I'm wondering if anyone knows how to move all the files in the subdirs into the main directory without me going into each folder and doing it by hand.
Servers are running CentOS 4.
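A sketch; note that identically named files from different subdirectories will collide, so listing first is wise:
Code:
# dry run: list what would move
find /path/to/main -mindepth 2 -type f
# move everything up into the main directory
find /path/to/main -mindepth 2 -type f -exec mv {} /path/to/main/ \;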
I have one server right now and will be adding a second server in the next 4-5 months. In the meantime I've upgraded the code on my community site to act as if there are already two servers. Mainsite.com has all the PHP files, and a separate HD for the MySQL DB only; [url] has all the user images and videos.
The problem I'm having right now is getting permission to delete files on [url] via [url].
What I've Tried:
vsFTPd - it has a small footprint, but it's not reliable; it needed to be restarted a few times while testing on the beta site.
usermod - this is messy: changing chmods, owners, groups, etc. I tried all of that but still had problems getting the Mainsite to be part of the group without having to change all the chmods and groups.
NFS mount - haven't tried yet... (a sketch below)
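For reference, NFS fits this layout: the web server mounts the media box's directory and deletes files as if they were local, with no FTP or group juggling. A minimal sketch (IPs and paths are placeholders):
Code:
# on the media server, in /etc/exports, then run: exportfs -ra
/home/user/media  10.0.0.2(rw,sync,no_root_squash)
# on the web server
mount -t nfs 10.0.0.1:/home/user/media /mnt/media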
Does anyone run a multiple-server site? My site's pretty big, been around for 4 years, and I've upgraded servers 3 times since then.
If I can't get this to work right, my final option would be to continue to upgrade to a new server each year, every time a bigger and faster server setup becomes available.
Current server: dual Xeon 5310, 4GB, 3 x 146GB SCSI/SAS. The server is doing fine with this setup, but with the new site upgrade and features it's going to top things out. I can keep throwing RAM at the server and move MySQL off-site.
The site has a little over 1 million users, with about 60-70% of them active daily...
I have a lot of files in about 100-200 subfolders, and I'm wondering if anyone has a script I can run to move the files from the subfolders (mostly .zip files) to the root folder, then extract the zips (which will normally result in a .rar file), then extract that file and leave the extracted files in the same folder so I can easily access them. The folders/files are old backups of one of my computers that I need to get data out of. I have access to Linux and Windows systems, whichever is easiest.
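On the Linux side, a sketch (assumes the unzip and unrar packages are installed; run it from the root folder on a copy of the data, since extraction overwrites on name clashes):
Code:
#!/bin/bash
# pull zips up from the subfolders, then expand zip -> rar -> files
find . -mindepth 2 -type f -name '*.zip' -exec mv {} . \;
for z in *.zip; do
    unzip -o "$z"        # typically yields a .rar, per the backup layout
done
for r in *.rar; do
    unrar x -o+ "$r"     # extract in place, overwriting without prompting
done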
I very much like the "readme.txt" that Apache appends to directory listings. That is, when a browser GETs a directory, the text in that file is placed underneath the list of files therein.
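For reference, this is mod_autoindex behaviour, controlled by two directives (in httpd.conf or .htaccess):
Code:
ReadmeName readme.txt    # file whose text is appended below the listing
HeaderName header.html   # optional file shown above it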
We're managing a CMS script which we offer as a hosted solution / leased script. This means we have several accounts on our server, one per client/script user. They all run the same script software, and occasionally we have to upgrade the script on these sites. Until now this has been done manually.
Connect with FTP -> upload new files -> Done.
This is very easy work, but it takes an annoying amount of time due to FTP needing to set up connections, switch accounts, etc., and if you have 100s or 1000s of accounts this becomes... yes... you know.
The question now is, how can this be done more easily? I was hoping it would be possible to automate this somehow (we have root access to our server).
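With root access, the FTP step can be dropped entirely and the upgrade pushed from the server itself; a sketch (the staging path and the /home layout are assumptions; test on a single account first):
Code:
#!/bin/bash
RELEASE=/root/cms-release/           # staging copy of the new script files
for acct in /home/*/public_html; do
    rsync -a "$RELEASE" "$acct/"
    # restore each account's ownership on the new files
    chown -R "$(stat -c %U "$acct")": "$acct"
done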