My client's website needs to hold files that are around 60 or 70 MB, but the host only allows files up to 10 MB. Is that typical?
Right now I'm linking to a third-party file storage service, but I'd rather make the files available from my own site without sending visitors elsewhere. He doesn't want his files zipped, either - just a straight download.
So I've recently ordered a Supermicro 4U server with 24x 1 TB HDs in RAID 10 and 64 GB RAM. I'm running Debian 5.0 and have installed lighttpd. All the content I serve is video files (AVI, MP4, MKV, OGM), each about 100-500 MB in size. How can I optimize lighttpd to get the best performance out of it? I look forward to your replies.
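For what it's worth, a starting point on a setup like that (the directive names below are real lighttpd 1.4 options; the values are guesses to tune, not a definitive recipe):

```
# /etc/lighttpd/lighttpd.conf (excerpt) - values are starting guesses
server.network-backend    = "linux-sendfile"  # zero-copy writes for static files
server.max-keep-alive-idle = 10               # free connections faster
server.max-fds            = 4096              # many concurrent video streams
# Let the kernel do the caching: with 64 GB RAM the page cache will hold
# the hot files, so no extra caching layer is usually needed.
```

For purely static video, lighttpd with sendfile is mostly network- and disk-bound, so the RAID layout and NIC tend to matter more than the config.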
I have been trying, quite unsuccessfully, to import a large SQL db file via phpMyAdmin for one of my clients. Since the file is about 250 MB, I get a server timeout error. How can I do this via SSH instead? I have a 64-bit CentOS 6.5 server running Plesk v12.0.18.
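The usual approach, assuming hypothetical names for the database, user, and file path: upload the dump to the server, then run the import from the shell, where no web-server or PHP timeout applies.

```shell
# copy the dump up, then feed it straight to mysql (names are placeholders)
scp backup.sql root@server:/root/
mysql -u dbuser -p clientdb < /root/backup.sql

# for very long imports, run under nohup so an SSH disconnect doesn't kill it
nohup mysql -u dbuser -p clientdb < /root/backup.sql &
```

A 250 MB file typically imports in minutes this way, with no splitting needed.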
I'm looking for suggestions on good hosting setups for getting large amounts of disk space.
I would like to be able to offer up to 2 GB of storage space each to hundreds, maybe up to a few thousand users - any solution should scale well. The files would be static files that might be up to 400 MB in size.
It would be nice to be able to give users FTP access to their disk space, although it's not a core requirement.
I get three different sizes. I'd expect adding DROP TABLE statements to produce a larger file, but between the first and third methods I use, why is there a size difference of 2 KB?
Which is the right way to back up a MySQL database safely?
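For comparison, a common baseline (database name and credentials are placeholders) is mysqldump from the shell. Note that dumps embed a timestamp comment by default, and DROP TABLE statements depend on flags, which is one reason two dumps of the same data can differ by a couple of KB:

```shell
# --single-transaction: consistent snapshot of InnoDB tables, no locking
# --skip-dump-date: omit the timestamp comment, so identical data dumps
#                   come out byte-identical between runs
mysqldump -u root -p --single-transaction --skip-dump-date mydb > mydb.sql
```

Comparing two dumps with `diff` will then show whether the size difference is real data or just header comments and DROP TABLE lines.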
A few days ago, a friend of mine studying in America recommended a popular new transfer tool, Qoodaa. He told me it was quite good software for downloading files and movies. At first I was skeptical, but after using it I found Qoodaa a good choice. I've summarized some of its features:
1. It uploads movies faster than any other software I've used before.
2. It downloads files quickly from download links - in a word, it's a time-saver with high efficiency.
3. No space limit. No matter where you are, it downloads fast.
4. Qoodaa is lightweight software with good security and ease of use.
It really can give you an unexpected surprise.
I'm someone who likes to share with others, so if you have something good, please share it with me.
I'm currently running on a VPS. My site allows large file uploads and downloads, with files over 600 MB in size.
The server has issues when the site gets three or more requests for large file downloads. I'm trying to grow this site to thousands of users and it is hard to do when the site can't handle even three.
I've been told by my host that I need to upgrade to a dedicated server. My VPS only has 512 MB of RAM, and a single large file download eats up that RAM. This is causing the issue.
I'm a newbie, and while I knew I was taking a risk by going with a VPS, I find it a bit annoying that these guys advertise 1 TB of bandwidth per month but I can't even support 1 GB of simultaneous downloads... maybe it's just me...
Anyway, I am now looking into moving the large files and the upload/download over to Amazon S3. If I do this I am expecting my RAM usage on the VPS to greatly decrease. Is this correct? If my PHP code is running on the VPS, but the actual file download via HTTP is coming from S3, that should not be a heavy load on my box, correct?
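That expectation is generally right: if the browser fetches the file directly from S3, the VPS only serves the small PHP/HTML response, and RAM use should drop sharply. One common pattern (bucket and key names here are made up) is to hand out a time-limited pre-signed URL, for example with the AWS CLI:

```shell
# generate a link valid for one hour; the download then bypasses the VPS
aws s3 presign s3://my-bucket/uploads/big-file.zip --expires-in 3600
```

The PHP application only generates the link; the bytes flow between the client and S3.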
My error logs are growing rapidly, showing the same message over and over:
PHP Warning: Unable to load dynamic library '/usr/local/lib/php/extensions/no-debug-non-zts-20020429/ffmpeg.so' - /usr/local/lib/php/extensions/no-debug-non-zts-20020429//usr/local/lib/php/extensions/no-debug-non-zts-20020429/ffmpeg.so: cannot open shared object file: No such file or directory in Unknown on line 0
root@server [~]# ls /usr/local/lib/php/extensions/no-debug-non-zts-20020429
./  ../  eaccelerator.so*
I manage a Linux Apache web server with a few WordPress blogs, and from time to time I see someone inject a malicious .php file into the wp-content/uploads/2014/10/ directory.
I think it's some bad plugin or theme, but it happens on multiple blogs, even though I upgrade and update WP.
How can I set up some monitoring to tell me which PHP file (or even which line in which PHP file) injected that malicious .php? I have Linux root access, so I can set up anything.
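One low-effort option, assuming the inotify-tools package is installed (the uploads path is from the post; the log file name is made up): record every file creation under the uploads tree with a timestamp, then match the timestamps against the Apache access log to find the POST request, and hence the PHP script, that wrote the file.

```shell
# log every create/modify/move-in under wp-content/uploads, with timestamps
inotifywait -m -r --timefmt '%F %T' --format '%T %w%f %e' \
    -e create -e modify -e moved_to \
    /var/www/html/wp-content/uploads >> /var/log/wp-upload-watch.log &

# later: grep the access log for POSTs around a logged timestamp, e.g.
#   grep 'POST' /var/log/httpd/access_log | grep '10/Oct/2014:14:3'
```

A heavier alternative is the kernel audit daemon (`auditctl -w <dir> -p wa`), which can also record the writing process's PID.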
I'm having a lengthy issue where my databases are too large to import in phpMyAdmin using Plesk. Unfortunately I don't have direct access to phpMyAdmin and can only access it as a DB user through Plesk.
I have tried to edit php.ini in the following locations:
upload_max_filesize = changed this to 64M
post_max_size = changed this to 32M
maximum_execution_time = changed this to 300
maximum_input_time = changed this to 300
Why am I still not able to import my DBs, which are about 8 MB each?
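One likely culprit: two of the directive names listed above don't exist in php.ini (PHP uses `max_execution_time` and `max_input_time`, not `maximum_...`), and `post_max_size` must be at least as large as `upload_max_filesize`, or uploads get cut off. A fragment that should comfortably cover an 8 MB import, assuming Plesk is actually reading this php.ini for the PHP handler in use (Plesk also has per-domain PHP settings that can override it):

```
upload_max_filesize = 64M
post_max_size = 64M        ; must be >= upload_max_filesize
max_execution_time = 300   ; note: max_, not maximum_
max_input_time = 300
memory_limit = 128M        ; should generally exceed post_max_size
```

After editing, restart Apache/PHP-FPM and confirm the live values with `php -i | grep max`.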
I have a website with about 20K users, and I'm currently on a VPS plan at LunarPages.
However, I have run into out-of-memory trouble. Although I have configured Apache and MySQL carefully, the 512 MB of memory is not enough. As a result, the user experience has not been good these days because my site is very unstable.
I contacted Lunarpages, asking them whether I can upgrade my VPS to bigger RAM, but they said the ONLY way to get a RAM bigger than 512M is to upgrade to dedicated hosting plan.
The following are some stats of my website:
Total members: 20k
Online at the same time: max 600, average 300
The Lunarpages VPS plan (www[dot]lunarpages[dot]com/virtual-private-server/): disk space 20 GB, RAM 512 MB, price $42/mo
Now I am not sure whether to migrate to a dedicated hosting plan, because currently the main problem is just the amount of RAM. Other resources (CPU, network, etc.) are not my bottleneck. So it seems not worthwhile to migrate to a dedicated plan at double the price (almost 3x if I need 1 GB of RAM) just for more RAM.
Can you guys give some suggestions to choose a VPS provider for my site? The factors taken into my consideration include:
* RAM: at least 1 GB for peak, 768 MB guaranteed. The bigger, the better; nice if I can choose a larger size when needed.
* Price.
* Bandwidth: 1 TB/month?
* Ease of upgrading to a dedicated host, just in case I have to one day.
* Whether there are coupons for a lower price.
I've been with zone.net for a couple of months now. I have a guaranteed 512 MB of memory, which I seem to be constantly hitting, and that seems to result in processes being killed and HTTP access vanishing. It's growing quite annoying.
I'm looking into moving onto a new provider that can provide more guaranteed RAM for about the same price.
Space isn't a huge deal; I'd do fine with a meager 5 GB. Bandwidth: I need at least 200 GB, but wouldn't mind more.
I'd like to stay managed if possible, as I'm not as well versed in server workings as I should be. Also am in need of cPanel, which I know is a spendy sucker.
My budget is something around $70 a month, and I don't really want to go much higher than that. Still a poor college boy :/
Can anyone suggest such a provider? I've browsed around a lot of the VPS hosts but can't seem to find one that has as much RAM as I need for a decent price. All the ones that seem to have 512MB+ are pretty expensive, and offer a lot more other stuff (space/bandwidth) than I need.
As a final note, line speed isn't that big of a deal. I'm currently on 3 Mbit and surviving, but going back to a higher-speed line would be great.
Just a quick question about backing up a large MySQL DB. I have a database that is 50 GB with about half a billion rows in it. One table is about 40 GB by itself; the other 10 GB consists of smaller tables.
The problem is, I want to back the database up and be able to keep it LIVE at the same time (as it will fall behind quickly if it's pulled for more than a few hours, as there are somewhere in the area of a million entries an hour, plus other deletions and queries).
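Assuming the big tables are InnoDB, mysqldump can take a consistent snapshot while the database stays live (the database name is a placeholder):

```shell
# --single-transaction: consistent read view, no table locks on InnoDB
# --quick: stream rows to the output instead of buffering the 40 GB table
mysqldump -u root -p --single-transaction --quick bigdb | gzip > bigdb.sql.gz
```

With a million rows an hour coming in, a longer-term option is a replication slave: back up the slave at leisure while the master keeps taking writes, and the slave catches up from the binary log afterwards.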
I'm currently using iptables to ban IP addresses from the servers, like:
Code:
iptables -A INPUT -s xxx.xxx.xxx.xxx -j DROP
I ran a "spam trap" for the last few months, and now I have over 11,000 IP addresses that were trying to spam my website (guestbooks, phpBB and forms), and I want to ban them all (I'm pretty sure bots run on them).
My question: is iptables the way to do it? Does banning such a large number of addresses have any significant performance or other issues I should be aware of (besides the fact that I may be banning some legitimate traffic)? Is -A INPUT the way to ban them all, or is there a more appropriate way of banning this many addresses?
I'm on CentOS 4.5 i686, Apache/1.3.37, Pentium D 930, 2GB RAM.
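With 11,000 separate `-A INPUT` rules, every packet is checked against the list linearly, which does cost CPU. The usual fix is ipset: load all addresses into one hash-based set and match the whole set with a single iptables rule. A sketch (the file and set names are made up; two sample addresses stand in for the real list, and the privileged commands are shown as comments since they need root and ipset support, which may require a newer kernel than stock CentOS 4.5):

```shell
# stand-in for the real 11,000-entry list, one IP per line
printf '%s\n' '203.0.113.5' '198.51.100.7' > spam-ips.txt

# build an "ipset restore" file: one create line, then one add line per IP
{
  echo 'create spamban hash:ip family inet hashsize 16384'
  awk '{ print "add spamban", $1 }' spam-ips.txt
} > spamban.restore

# as root on the server:
#   ipset restore < spamban.restore
#   iptables -I INPUT -m set --match-set spamban src -j DROP
```

Hash-set lookups are effectively constant-time, so the size of the ban list stops mattering for per-packet cost.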
I wasn't sure where to post this, so here goes. I need to migrate a MySQL DB. In the past I have just created an SQL file and used that method (sometimes having to split the SQL file up), but now the DB is about 50 MB and 733,233 records.
Is there an easier way to migrate the Database from one server to another?
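At 50 MB the dump is small enough to stream directly between servers over SSH, which avoids both the intermediate file and any splitting (hostnames, database name, and credentials below are placeholders):

```shell
# dump on the old box and load into the new one in a single pipe
mysqldump -u root -p --single-transaction olddb \
  | ssh user@newserver "mysql -u root -p newdb"
```

The target database has to exist first (`CREATE DATABASE newdb;` on the new server).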
I'm selling downloads of music files. The zip files are quite large. I've had several people complain that they get a message that the server resets their connection before the download finishes.
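A first diagnostic step before changing anything server-side: check whether the server honors HTTP range requests, since clients on flaky connections can then resume a broken download instead of starting over (the URL is a placeholder):

```shell
# an "Accept-Ranges: bytes" header and a 206 response mean resume works
curl -sI -H 'Range: bytes=0-99' https://example.com/downloads/album.zip | head -n 5
```

If the zips are served through a PHP script rather than as plain files, the script has to implement Range support itself, and PHP execution-time limits are a common cause of mid-download resets.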