A few accounts that will probably hold a few gigabytes of data each. Each account needs to use IMAP and support up to 5 customer service reps accessing the same account simultaneously. *We tried this previously with Google Apps for domains but were limited to 10 simultaneous IMAP connections.*
Just a basic LAMP setup is fine; MySQL isn't even necessary. I will be hosting a basic form for hi-res photo uploads, though, so I need a fairly high timeout and memory limit for PHP.
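The kind of PHP settings I have in mind look roughly like this (a sketch; the actual values are guesses on my part):

upload_max_filesize = 64M
post_max_size = 64M
memory_limit = 128M
max_execution_time = 300
max_input_time = 300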
As far as traffic is concerned, I don't anticipate more than a handful of users on the site at a time.
My question is: do you think a good shared host in the $10-a-month range could handle a project like this? Can you recommend one?
I could set up a VPS on Linode for $20 a month that I'm sure could handle this, but I'm not a server admin and don't want to risk managing sensitive data myself.
I have a Windows 2003 server with 2GB of RAM running IIS and SQL Server 2005 Express. I've noticed that the server only uses around 1.2GB of its physical memory (it has been up a few days), while page file usage seems high at 1.1GB.
SQL Server has been allocated a maximum of 1GB of memory, while IIS has no limit on memory usage.
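If it's relevant, the cap was applied the usual sp_configure way, roughly like this (the instance name is a guess here, and I realize 2005 Express limits the buffer pool to 1GB on its own anyway):

sqlcmd -S .\SQLEXPRESS -Q "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; EXEC sp_configure 'max server memory (MB)', 1024; RECONFIGURE;"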
The server doesn't really show any slowness at this point (not a lot of traffic). But to my understanding, other than the processes that always use some virtual memory (around 400MB usually), the page file should only be used when there is insufficient physical memory. So is there something wrong with my memory usage: 1.1GB in the page file while almost 50% of physical RAM is still free?
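For what it's worth, the figures above came from Task Manager; the same counters can be sampled from a command prompt like this (a sketch; these are the counter names as I see them on 2003):

typeperf "\Memory\Available MBytes" "\Paging File(_Total)\% Usage" -sc 5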
The upgrade from 11.5 reported that it had completed, but when I log in I get the grey bar on the left with the various options (Hosting Services, Tools & Settings, etc.; only Profile & Preferences works) and a main white screen with the words 'File Not Found' in large black lettering.
The error logs are growing rapidly, all showing the same message:
PHP Warning: Unable to load dynamic library '/usr/local/lib/php/extensions/no-debug-non-zts-20020429/ffmpeg.so' - /usr/local/lib/php/extensions/no-debug-non-zts-20020429//usr/local/lib/php/extensions/no-debug-non-zts-20020429/ffmpeg.so: cannot open shared object file: No such file or directory in Unknown on line 0
root@server [~]# ls /usr/local/lib/php/extensions/no-debug-non-zts-20020429
./  ../  eaccelerator.so*
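So ffmpeg.so really is gone from the extensions directory, but something is still telling PHP to load it. My next step is to hunt down the stale reference with something like this (the php.ini path is a guess for this build):

root@server [~]# grep -n "ffmpeg" /usr/local/lib/php.ini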
I have a website backup file (let's call it 'filename.tar.tgz') that was generated by a home-grown web hosting panel and is ~1.6GB in size. It lives on a WinXP computer, but I also have a copy on a *nix machine.
I have attempted to restore the backup using the normal restore process provided by the site admin panel, but it never completes because of the size of the file. So I need to retrieve the folders/files from within 'filename.tar.tgz' myself, so that I can re-upload them through normal FTP.
I have had no success extracting the files/folders using tar, gtar, gunzip, etc. on the Linux box, and 7-Zip won't open it either. The Linux terminal reports a 'stdin: not in gzip format' error when trying to decompress/extract the files.
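The failed attempts looked roughly like this (reconstructed from memory, so the exact invocations may have differed):

$ gunzip filename.tar.tgz
gzip: filename.tar.tgz: not in gzip format
$ tar -xzf filename.tar.tgz
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now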
What I need is the exact syntax (with any switches) that I can use in my Linux terminal shell to extract this archive so that I can access the files within.
I've recently been trying to move an account between servers, but the backup file always comes out incomplete. I was told it's possible that there are too many files.
I decided to tar some of them and move them manually, but I cannot access the tar file. I have already changed all the permissions (644) and the owner and group, but I still get a 403 Forbidden error. Is it possible that the file is too big (9GB), and if it is, how do I change the file size limit?
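For reference, this is roughly what I ran before trying to download the file over HTTP (the file and account names here are made up):

chmod 644 backup.tar
chown someuser:someuser backup.tar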
I'm having an issue with my current robots.txt file, which is not properly handling requests/blocking content from being accessed. What I want is to allow only bots like Google, Yahoo, MSN/Bing, and the Alexa ranking bot, and to block all other bots. My current robots.txt is below.
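To make the goal concrete, the end result I'm after would look something like this (a sketch; the user-agent tokens are my guesses: Googlebot, Slurp for Yahoo, msnbot/bingbot for MSN/Bing, ia_archiver for Alexa):

User-agent: Googlebot
Disallow:

User-agent: Slurp
Disallow:

User-agent: msnbot
Disallow:

User-agent: bingbot
Disallow:

User-agent: ia_archiver
Disallow:

User-agent: *
Disallow: /

As I understand it, each named bot should match its own group (an empty Disallow allows everything), and anything else should fall through to the final block-all group, but my current file isn't behaving that way.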