What sort of hosting requirements do you think I would need for...
A site with a MySQL database holding 100,000+ records in a single table, executing simple SELECT statements against that table at a rate of 50 queries per second? (All other things should be negligible at this scale.)
I'm planning to set up a load balancer system soon and I want to redirect users based on their location. I'm wondering if anyone knows how best to do this (the servers will be hosting streaming media and the load balancer will be a Linux-based server).
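One common approach on a Linux balancer is country-based redirection with nginx's GeoIP module; the fragment below is a sketch only, and every hostname and the database path are illustrative assumptions, not part of the original question:

```nginx
# nginx.conf fragment -- requires ngx_http_geoip_module and a GeoIP
# country database (path below is the conventional default, verify it)
geoip_country /usr/share/GeoIP/GeoIP.dat;

# Map the visitor's country code to a regional streaming node
# (hostnames are hypothetical examples)
map $geoip_country_code $stream_backend {
    default  us.media.example.com;   # fallback region
    DE       eu.media.example.com;
    FR       eu.media.example.com;
    JP       ap.media.example.com;
}

server {
    listen 80;
    location / {
        # Send the viewer to the nearest streaming server with a 302
        return 302 http://$stream_backend$request_uri;
    }
}
```

GeoIP-aware DNS (returning a region-specific A record) is the other common route, and it avoids the extra HTTP round trip, at the cost of DNS caching making the routing less precise.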
Without having all of the operating systems at my disposal for testing, I would like to figure out a way to determine the operating system of a remotely accessed Linux machine.
It seems pretty strange though, since cPanel reports both machines I am using as being
CENTOS Enterprise 4.5 i686, yet one's uname -a reports:
Code: Linux hostsentry.crucialwebhost.com 2.6.9-023stab044.4-enterprise #1 SMP Thu May 24 17:41:23 MSD 2007 i686 i686 i386 GNU/Linux
while the other's reports:
Code: Linux main.7kb.org 2.6.9-55.0.6.ELsmp #1 SMP Tue Sep 4 21:36:00 EDT 2007 i686 i686 i386 GNU/Linux
I'm assuming there is a way to determine the OS from this information. Anyone know how?
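A sketch of how this is usually done: uname reports the kernel build, not the distribution, so check the distro's release file instead. The kernel-string reading in the comments is an interpretation of the two strings above, not something uname itself tells you:

```shell
# On RPM-based distros (RHEL/CentOS/Fedora) the distribution records
# itself in a release file; this is the authoritative answer:
cat /etc/redhat-release 2>/dev/null || true   # e.g. "CentOS release 4.5 (Final)"

# The kernel version string carries hints too: "023stab044.4" is an
# OpenVZ/Virtuozzo kernel (typical of a VPS), while "2.6.9-55.0.6.ELsmp"
# is a stock RHEL/CentOS "Enterprise Linux" SMP kernel.
uname -r
```

So both boxes can honestly report CentOS 4.5 while running very different kernels: one is likely a Virtuozzo container, the other a stock install.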
After creating a new cron job, when I try to open Scheduled Tasks I get this error:

Internal error: DateTime::__construct(): Failed to parse time string (2015-05-28 0.12:0:37) at position 17 ): Unexpected character
File: Helper.php  Line: 143  Type: Exception
Which type of CPU would be better for a web server that will run Windows, PHP, ColdFusion, mail, DNS, and IIS? Would a dual-core CPU with a higher clock speed do better than a quad-core CPU with a lower clock speed? For instance, would a dual-core 3 GHz processor do better than a quad-core at 2.4 GHz?
I have set up Backup Manager to back up files and configurations every week for every subscription. This has worked really well; however, we are now considering switching to a daily schedule.
Since that will eat into the server's free space, I was wondering whether there is a way, using cron jobs, to remove backups older than 24 weeks. How would I write such a cron job script?
If possible, I would also like to know where Backup Manager physically stores its backups.
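A cleanup like this is usually a one-line `find`. The demo below runs in a temp directory so it is safe to try; on a real server you would point `BACKUP_DIR` at Plesk's dump directory (commonly `/var/lib/psa/dumps`, but verify that on your installation):

```shell
# Demo in a temp dir; BACKUP_DIR is a stand-in for the real dump path
BACKUP_DIR=$(mktemp -d)
touch -d '200 days ago' "$BACKUP_DIR/old.tgz"   # older than 24 weeks
touch "$BACKUP_DIR/new.tgz"                     # recent backup

# 24 weeks = 168 days; -mtime +168 matches files not modified
# in the last 168 days, and -delete removes them
find "$BACKUP_DIR" -type f -name '*.tgz' -mtime +168 -delete

ls "$BACKUP_DIR"    # only new.tgz remains
```

A crontab entry such as `0 3 * * * find /var/lib/psa/dumps -type f -mtime +168 -delete` would run the cleanup nightly; it is worth testing with `-print` in place of `-delete` first to see exactly what would be removed.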
How can I make it so that only the system (the scheduled task or the server itself) can access the file and execute it? I tried setting chmod 700 on the PHP file, but everyone is still able to access and execute it.
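chmod alone often doesn't help here, because web requests are served by the web server process, which may run as the file's owner; the usual fix is to move the script outside the web root so it is never URL-reachable at all. A sketch, using temp directories as stand-ins for the real (hypothetical) paths:

```shell
# Temp dirs stand in for real paths:
WEBROOT=$(mktemp -d)    # stands in for e.g. /home/user/public_html
PRIVATE=$(mktemp -d)    # stands in for a directory outside the web root
printf '<?php echo "task ran"; ?>\n' > "$WEBROOT/task.php"

# Move the script out of the web root so the web server never serves it,
# then restrict it to its owner; cron runs as that owner, so it still works
mv "$WEBROOT/task.php" "$PRIVATE/task.php"
chmod 700 "$PRIVATE/task.php"

# The crontab entry for the owning user would then be something like:
# 0 * * * * /usr/bin/php /path/outside/webroot/task.php
ls -l "$PRIVATE/task.php"
```

With the file outside the document root, no URL maps to it, so "everyone can access it" stops being possible regardless of its mode bits.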
I've just migrated server to server but I didn't use the migration manager, I simply backed up each domain's content and restored those backups on the new server.
The issue I have run into is that the scheduled tasks at the admin level (Tools & Settings > Scheduled Tasks) work, but any scheduled tasks under a domain do not. The scheduled tasks under the domains show as inactive (no green tick), and when I click to enable them, I get the following message:
"Information: The scheduled task Google Product Feed Daily Scheduled Task was switched on."
If I then look at the page, the scheduled task is still not enabled, even though the message above says it was switched on. Going into the scheduled task and switching it on from there produces the same message, but it still doesn't switch on.
How can I rebuild all of the scheduled tasks within Plesk?
The cron service is running and works correctly, and the scheduled tasks in the Plesk administration panel (Tools & Settings > Scheduled Tasks, root user) work fine, but when a customer sets up a new scheduled task from their domain panel, it doesn't work.
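When domain-level tasks exist in Plesk but never make it into the system crontabs, Plesk's built-in repair utility can regenerate the server configuration. This is a sketch, not a guaranteed fix: the command is available in Plesk 12 and later, must be run as root, and its flags vary by version, so check `plesk repair --help` on your server first.

```
plesk repair all -y
```

It is also worth taking a backup before running it, since the repair touches many configuration files at once.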
I installed LSWS without an Apache conf file (httpd.conf). Then I created a new virtual host using the "suEXEC" template. I added a new user via SSH, made a home dir for him, and chowned his home dir plus all his files to hisusername:hisusername. His home dir (/home/user/) is chmoded to 755 and his /public_html to 711. This worked fine, but after I installed a phpBB3 forum and tried to chmod config.php to 600, I got an error on the forum:
Fatal error: require() [function.require]: Failed opening required './config.php' (include_path='.:/usr/local/lib/php') in /home/username/public_html/common.php on line 127
When I was using LSWS with an Apache conf file and had suEXEC + suPHP configured for Apache, I was able to chmod the config file to 600 and it worked fine. I have no idea what the problem could be now.
It works fine when I chmod config.php to 755, but for security reasons I need a way to set it to 600. LiteSpeed is running as nobody:nobody. External App settings: LSAPI App $VH_NAME_lsphp uds://tmp/lshttpd/$VH_NAME_lsphp.sock
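The "running as nobody:nobody" detail is very likely the whole story: a file at mode 600 is readable only by its owner, so a PHP handler running as nobody cannot open a config.php owned by the site user. A minimal self-contained demonstration of the permission logic (nothing LiteSpeed-specific in it):

```shell
# Why require() fails on a 600 file owned by someone else:
f=$(mktemp)
chmod 600 "$f"
perms=$(stat -c '%a' "$f")
echo "$perms"   # 600: readable and writable by the owner only

# Any other non-root user -- such as a PHP process running as
# nobody:nobody -- gets "permission denied" opening this file, which
# is exactly the failed require('./config.php'). The fix is to make
# the PHP external app run as the vhost owner (the suEXEC setting in
# the virtual host config), not to loosen the file to 755.
```

So the thing to verify is which user the lsphp processes actually run as for that vhost; if it is nobody rather than the site user, the suEXEC user/group setting didn't take effect for the External App.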
I have Fedora Core 5 with Apache 2.2.2 and VirtualHosts set up, currently running mod_php and mod_suexec. I would like to switch to running PHP under suEXEC, because I need PHP to edit files that "nobody" doesn't have access to (777 is not an option).
Right now mod_suexec works great with Perl, but not PHP. So I ask, how can I get them to play nice?
I've been attempting to set up a server running Apache 2+ and PHP 5, and I was running into issues installing PHP as CGI. All my scripts require the shebang #!/usr/bin/php at the top to execute properly. Does anyone know a good site/how-to that explains how to do this?
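A classic way to run PHP as CGI under suEXEC is a tiny per-user wrapper script that execs the php-cgi binary, so the scripts themselves don't each need a shebang. A sketch, where the wrapper location and the /usr/bin/php-cgi path are assumptions to adjust for your build:

```shell
# Create a minimal CGI wrapper (normally placed in the user's cgi-bin)
wrapdir=$(mktemp -d)    # stand-in for the real cgi-bin directory
cat > "$wrapdir/php-wrapper" <<'EOF'
#!/bin/sh
exec /usr/bin/php-cgi
EOF

# suEXEC requires the wrapper to be executable and owned by the vhost user
chmod 755 "$wrapdir/php-wrapper"
head -1 "$wrapdir/php-wrapper"
```

Apache is then told to route .php requests through the wrapper, typically with mod_actions directives along the lines of `AddType application/x-httpd-php .php` plus `Action application/x-httpd-php /cgi-bin/php-wrapper`; the exact handler name is your choice.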
Has anyone else here run suEXEC with Apache? If so, could you tell me what you compiled it with? Just curious, as I think I'm doing everything right, yet I still fail.
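For comparison, a typical set of Apache configure flags for suEXEC looks like the following; the caller user, docroot, and log path are illustrative and must match your own layout, since suEXEC hard-codes them at compile time:

```
./configure --enable-suexec \
  --with-suexec-caller=apache \
  --with-suexec-docroot=/home \
  --with-suexec-logfile=/var/log/httpd/suexec_log
```

After building, `suexec -V` prints the compiled-in settings, which is the quickest way to see whether a failing setup was built with a docroot or caller that doesn't match how it is actually being invoked.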
I just heard this story on NPR yesterday discussing cloud computing, how you can use external computers to do super-computer sized tasks without having the hardware in house yourself.
If we host colocated servers, how feasible is it to get our servers into that game?