Very Large Disk Partitions - Practical Limit

Aug 5, 2009

We're outgrowing our current bulk storage system and I'd like to solicit opinions.

With 2 TB disks and a 16-disk array, it's possible to have a single 28 TB volume (after deducting RAID 5 parity overhead and a hot-spare disk). I've seen arrays from Aberdeen with 48 and 96 disks, for nearly 200 TB. Windows supports up to 256 TB per volume when 64K cluster sizes are used.
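To spell out the arithmetic (a rough sketch, treating every disk as a flat 2 TB and ignoring formatting overhead):

Code:
# 16 disks, less one hot spare and one disk's worth of RAID 5 parity
disks=16; spare=1; parity=1; per_disk_tb=2
echo "$(( (disks - spare - parity) * per_disk_tb )) TB usable"   # (16-1-1)*2 = 28 TB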

Our backup system uses a ton of storage space, and it would be far more convenient, and more efficient from a utilization standpoint, to access that space as a single volume.

Breaking it up into smaller chunks, such as 2 TB each, means we have to make a "best guess" at balancing allocation against actual need.

For example, if we assign 25 servers to each 2 TB volume for backup storage purposes, some volumes might only see 800 GB of consumption (the remaining 1.2 TB allocated but not used) while other volumes might get 1.6 TB used (the remaining 400 GB allocated but not used). Key concept: wasted space, because we have to over-estimate need to ensure adequate headroom.

From the opposite viewpoint, if we had a sudden increase in need that exceeded the available space allocated to that volume, we'd have to move that server to a different volume. Key concept: increased admin workload to monitor and re-balance distribution as needed.

Now if we used one giant volume, there would be no guesswork and no "allocating more than we think is needed" across a bunch of small volumes. All servers would share one huge common pot.

But there has to be a practical limit from a system-overhead standpoint. Our backup sets consist of a few multi-gigabyte files, so using 64K clusters will not cause much waste from slack space.
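A quick worst-case estimate of that slack (each file wastes at most one partial cluster), assuming a few thousand backup files as an illustration:

Code:
# Worst case: one nearly-empty 64 KB cluster wasted per file
files=5000; cluster_kb=64
echo "$(( files * cluster_kb / 1024 )) MB maximum slack"   # ~312 MB on a 28 TB volume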

I'd like to get your opinions on maximum disk volume sizes from a practical standpoint.

View 1 Replies



Plesk 12.x / Linux :: Content-length Limit When Uploading Large Files

Jun 18, 2015

Domain has PHP Settings in Plesk set to 2G, and I get this error when uploading a 48MB file using WordPress. I assume I need to modify this manually in a conf file somewhere to allow uploading large files?

Requested content-length of 48443338 is larger than the configured limit of 10240000..

mod_fcgid: error reading data, FastCGI server closed connection...
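That 10240000-byte limit is mod_fcgid's request-body cap, not a PHP setting, so raising the PHP limits in Plesk won't touch it. A hedged sketch of the usual fix on a CentOS/Plesk box; the conf file path below is an assumption (it may be a Plesk-managed include instead), and older mod_fcgid builds call the directive MaxRequestLen rather than FcgidMaxRequestLen:

Code:
# Allow request bodies up to ~2 GB to match the PHP setting (path is illustrative)
echo 'FcgidMaxRequestLen 2147483647' >> /etc/httpd/conf.d/fcgid.conf
service httpd restart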

View 1 Replies View Related

Windows 08 - IIS7 - Limit Disk Usage & FTP

Aug 30, 2008

I'm starting a webhosting business in the next few months (working on the panel), and was wondering what the best method is to limit the amount of disk space a user can use? I know about Disk Quota, but that would be a pain to use. Is there anything built into IIS7?

Also, is it possible to use a SQL 05 DB for FTP user accounts with IIS7? If not, is there any other way to have FTP accounts *without* having to create a Windows user account?

View 5 Replies View Related

Limit Disk Inodes On A Per User Basis Or Server Wide

Oct 29, 2008

We have a question for everyone and any help would be appreciated: we are looking to limit disk inodes on a per-user basis or server-wide. We would like to know if anyone can point us to how this is accomplished.
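Standard Linux quotas can limit inodes as well as blocks, provided the filesystem is mounted with usrquota/grpquota. A minimal per-user sketch (the username and numbers are made up for illustration); server-wide, the only hard ceiling is the inode count the filesystem was created with:

Code:
# block soft/hard = 0 (unlimited); inode soft 100000, inode hard 110000
setquota -u someuser 0 0 100000 110000 /home
quota -u someuser      # show that user's usage and limits
repquota -a            # per-user report across all quota-enabled filesystems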

View 3 Replies View Related

Plesk Automation :: Update Hard Disk Memory Limit For The Created Subscription

Mar 3, 2014

Is there any way to change the hard disk limit for a created subscription? It is 4 GB and I have to increase it to 8 GB.

View 1 Replies View Related

Host My Own Sites - Is It Practical

Sep 22, 2008

I'm thinking of using one of my computers at home as a dedicated server to host my own sites, and I would like to get your opinions on whether that would be a practical thing to do or not.

Dedicated Server: I put together:

Intel Pentium Dual Core 2.8GHz
3GB DDR2
1TB Seagate HD
GeForce 9800 GX2 1GB
Gigabit LAN
Windows XP Pro / IIS 5.1
Smart Firewall/Router Symantec
APC Smart-UPS battery (Full 15 hrs.)

Dedicated Connection: A business account will run me $80 CDN/month:

Speed Download - Up to 16 Mbps
Speed Upload - Up to 1 Mbps
Transfer/month - 200 GB
IP Addresses - 2 dynamic & 1 reserved.

The Sites:

Both sites are academic-based - together they receive approx. 200,000 visits a month / 50 to 60 GB transfer and growing. I'm also in the process of publishing a 3rd site, but overall I anticipate the 3 sites' transfer to hover around 100GB/mo unless Digg/StumbleUpon/Google all decide to have an orgy-traffic linkage at once and push it higher.
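One piece of arithmetic worth running before committing (a rough sketch, ignoring protocol overhead): a 1 Mbps uplink moves only about 324 GB a month even saturated around the clock, so 100 GB/month would keep the line busy roughly a third of the time, and any Digg-style spike gets throttled by the 1 Mbps ceiling long before the 200 GB cap matters.

Code:
# Monthly capacity of a 1 Mbps uplink at 100% utilisation (decimal units)
mbps=1
echo "$(( mbps * 125 * 3600 * 24 * 30 / 1000000 )) GB/month"   # ~324 GB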

View 14 Replies View Related

Windows Versus Linux Host: Any PRACTICAL Difference

May 15, 2008

I (and my clients) have a few very small, simple-minded websites...a few PHP programs for simple form fetch-and-forward. Is there much PRACTICAL difference between a Windows-based host and a Linux-based host?

View 3 Replies View Related

Partitions On Linux

Aug 2, 2009

I have a linux server with cPanel and 500GB Disk Space. After investigating I think I would have the following partitions.

/
/boot
/home
/dev/shm
/tmp
/usr
/var

How much should I allow for each partition? I will be using the server for hosting accounts, shared and reseller.

Also, what would you recommend for the swap size?
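For what it's worth, here is one hedged starting point for a 500 GB cPanel box, written as LVM commands so the sizes are explicit; the volume group name and every size are assumptions to tune to your own account mix, /boot (100-250 MB or so) normally stays a plain partition outside LVM, and /dev/shm is a tmpfs that needs no volume of its own:

Code:
# Illustrative layout only - adjust sizes to taste
lvcreate -L 4G   -n swap VolGroup00      # ~1-2x RAM is the usual rule of thumb
lvcreate -L 10G  -n root VolGroup00      # /
lvcreate -L 10G  -n usr  VolGroup00      # /usr
lvcreate -L 40G  -n var  VolGroup00      # /var (logs, MySQL, cPanel backups grow here)
lvcreate -L 2G   -n tmp  VolGroup00      # /tmp (mount with noexec,nosuid)
lvcreate -l 100%FREE -n home VolGroup00  # /home gets the rest for the accounts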

View 9 Replies View Related

Recovering NTFS Partitions

Mar 5, 2009

I have a Vista machine. I installed CentOS 5.1 by selecting the C: (active) partition and formatting it as an ext3 partition. After installation, Hardware > Hard Disks shows only one NTFS partition, but I actually have 4 NTFS partitions. When I try to mount that partition using ntfs-3g, I get a "/dev/sda3: permission denied" error.
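Most likely the CentOS installer rewrote the partition table, so the other three NTFS partitions no longer have table entries even if their data is still on disk. TestDisk can often rescan the drive and rebuild the table; a hedged sketch (only safe if the lost partitions were not actually overwritten during the install):

Code:
# testdisk is available from a third-party repo such as EPEL/RPMForge on CentOS
yum install testdisk
testdisk /dev/sda      # Analyse -> Quick/Deeper Search, then write the found table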

View 2 Replies View Related

Centos + Cpanel And Partitions

Mar 31, 2008

I have a dedi with 2 x 250 GB HDs in RAID 1.
Right now I just have one partition, /, for the whole drive.

Should I re-install and make proper partitions?

Which partitions would you recommend?

I will backup to a local folder /cpbackup and rsync from there to nas.

View 2 Replies View Related

Creating VPS Partitions On Dedicated Server For My Own Use

Apr 26, 2009

I'm looking for more experienced Linux users to help me partition my dedi into VPSes. I have an Intel Quad-core 2.4 GHz, 500 GB HDD, 2 GB DDR RAM dedicated server with a max 100 Mbit connection and 2000 GB BW/mo. It has CentOS 5.3 (centos-release-5-3.el5.centos.1) installed and I want to install the DirectAdmin CP soon.

I'm not a reseller or webhost and don't intend to become one. This server is for my exclusive use.

I want to use half the server to run virtual instances of a Windows 2008 server and a KDE or similar Linux virtual desktop using FreeNX, as well as 4PSA VoipNow or similar software. The other half of the drive will be used to run my business's websites, mail server, a DNS server, etc.

I have six IP addresses for this server that can be used to this end and will host at least three websites (under separate domain names) and one or two blogs for which I will install requisite software.

I understand that the RHEL 5 embedded virtualization software will allow me to partition the server into VPS for various purposes.

Here are the outputs from fdisk -l and parted -l respectively for the current HDD partitions.

Disk /dev/sda: 500.1 GB, 500107862016 bytes
255 heads, 63 sectors/track, 60801 cylinders
Units = cylinders of 16065 * 512 = 8225280 bytes

Device Boot Start End Blocks Id System
/dev/sda1 * 1 13 104391 83 Linux
/dev/sda2 14 60801 488279610 8e Linux LVM
[root@denprivatevaert ~]# parted -l

Model: ATA ST3500320AS (scsi)
Disk /dev/sda: 500GB
Sector size (logical/physical): 512B/512B
Partition Table: msdos

Number Start End Size Type File system Flags
1 32.3kB 107MB 107MB primary ext3 boot
2 107MB 500GB 500GB primary lvm

Error: Unable to open /dev/md0 - unrecognised disk label.

For the DA install, so I don't have to try to figure out where things are, I'd like to use their more complex partition structure as follows:

/boot 40 meg
swap 2 x memory
/tmp 1 Gig. Highly recommended to mount /tmp with noexec,nosuid in /etc/fstab
/ 6-10 Gig
/usr 5-12 gig. Just DA data, source code, frontpage.
/home rest of drive. Roughly 80% for user data. Mount with nosuid in /etc/fstab if possible.

I will install Dovecot to be able to create SSL access to my webmail, so I don't need a separate '/var' partition.

What I want to know is:

1) Should I install virtualization and partition the drive prior to having DA installed?

2) How do I best partition the drive into VPSes so I can run distinctly different virtual instances of different OSes and/or programs, as well as use half for websites, blogs, servers, etc.?

3) What else do I have to keep in mind when doing this?

I'd appreciate any positive, useful response and information on getting this done and I'd like to try to get this done by Monday or Tuesday of next week so DA can be installed on the appropriate partition.
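For what it's worth, the usual approach on RHEL/CentOS 5 is not to re-slice the physical disk but to carve logical volumes out of VolGroup00 (after shrinking the existing root LV or reinstalling with free space left in the VG) and hand each one to a Xen guest as its virtual disk. A rough sketch; the names and sizes below are illustrative assumptions:

Code:
vgs                                        # check how much unallocated space the VG has
lvcreate -L 100G -n win2008   VolGroup00   # backing store for the Windows 2008 guest
lvcreate -L 60G  -n linuxdesk VolGroup00   # backing store for the KDE/FreeNX desktop
lvcreate -L 40G  -n voipnow   VolGroup00   # backing store for the VoIP appliance
# Point virt-manager / virt-install at /dev/VolGroup00/<name> when creating each
# guest; whatever is left in the VG stays with dom0 for the websites, mail and DNS.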

View 0 Replies View Related

Partitions For New Server :: Error Message Is Not_enough_space

Apr 15, 2008

I just installed a new server with partitions below

# df
Filesystem 1K-blocks Used Available Use% Mounted on
/dev/mapper/VolGroup00-LogVol00 470889384 2015312 444568408 1% /
/dev/sda1 101086 28075 67792 30% /boot
/usr/tmpMnt 290503 10289 265214 4% /tmp

However, in HyperVM, it shows
VolGroup00 465.53 GB 465.53 GB
/(VolGroup00-LogVol00) 25.7 GB 459.85 GB

And I could not create a VPS. The error message is not_enough_space[].

I have searched on the lxlabs board, and it seems I need another partition for data?
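The df output suggests the root logical volume already occupies virtually the whole 465 GB volume group, which leaves HyperVM nothing to carve per-VPS volumes out of, hence not_enough_space. A hedged sketch of how to confirm it and what the usual fix looks like (the names match the output above; the sizes are illustrative):

Code:
vgs VolGroup00     # the VFree column shows how much unallocated space remains
lvs                # confirm LogVol00 is consuming nearly all of it
# The common fix is to keep / small (20-40 GB or so) and leave the rest of the
# volume group unallocated so HyperVM can create a logical volume per VPS,
# which on most setups means shrinking the root LV offline or reinstalling.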

View 4 Replies View Related

Set Up Cpanel Users To Use Different Partitions (home Dirs)

Dec 1, 2007

Is it possible to specify where your CPanel user's data is stored?

Let's say I have four hard drives without RAID, I have hard drive one on /home, hard drive two on /home2, hard drive three on /home3, and so on. Is it possible to setup users on the different partitions to spread out disk usage?

To explain further, I would like to set it so maybe one reseller account was using /home2, then another was using /home4, and another using /home.

Any ideas on how to go about splitting up users' data across separate partitions?
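cPanel can spread accounts across multiple home partitions on its own: /etc/wwwacct.conf holds the default home directory and a "home match" pattern, and new accounts land on the matching partition with the most free space; existing accounts can be moved afterwards with WHM's "Rearrange an Account". A hedged sketch (the exact field names are from memory, so verify them in WHM's Basic Setup):

Code:
grep -E '^HOME' /etc/wwwacct.conf
# HOMEDIR /home      <- default home partition
# HOMEMATCH home     <- also consider /home2, /home3, ... when creating accounts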

View 5 Replies View Related

Does The Use Of Partitions Prevent Hackers From Getting Access To The Linux Server

May 7, 2007

Does the use of partitions prevent hackers from getting access to the entire Unix server?

View 2 Replies View Related

Use Almost Disk Space Will Harm Hard Disk

Apr 29, 2008

My server has a small SAS disk (about 73 GB). If I use 90% of its disk space, is that a good idea? Will it harm the physical HDD?

View 6 Replies View Related

2 SATAII Disk W/RAID0 Vs One SAS Disk

Jan 8, 2009

I am looking for better disk performance. Due to the tight budget, I have to choose one of the following options as my disk setup:

2 SATAII disk w/RAID0, 7200rpm, 32M cache for each disk

1 SAS disk, 15000rpm, 16M cache.

Which one will be better, and by how much, if everything else (hardware & OS) is the same?

View 14 Replies View Related

Disk Image To Disk Partition (Xen)

Sep 1, 2008

Can a Xen disk image be converted to a disk partition?

Someone is asking whether I can host his disk image from his current host, which he is leaving because of poor I/O (wonder why that would be). I can host a disk image, but I don't like disk images (slow, and 100GB isn't very 'comfortable' either). Is there any way out there to convert a disk image into a normal partition?
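If it is a raw Xen image you can usually just block-copy it onto a partition or logical volume that is at least as large; qcow2/vmdk-style images would need converting to raw first (qemu-img convert can do that). A minimal sketch with made-up names and paths:

Code:
lvcreate -L 100G -n guest1 vg0               # destination must be >= the image size
dd if=/path/to/guest1.img of=/dev/vg0/guest1 bs=4M
# If the image holds a whole disk (partition table) rather than a bare filesystem,
# kpartx -a /dev/vg0/guest1 will expose the partitions inside it.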

View 3 Replies View Related

Apache :: Use Disk To Enhance For Page Cache And Database Cache For Disk

Apr 24, 2013

I use Apache with CentOS VPS hosting for my blog. I only host one blog in this VPS account. I have 1.5GB RAM and about 7,500 page views per day. My page loading time is 2-3 seconds (according to the Pingdom tool).

I want to know which W3 Total Cache option gives the best performance (fastest page loading) for a VPS-hosted blog. Currently I use "Disk: Enhanced" for the page cache and "Disk" for the database cache.

View 2 Replies View Related

SQL Too Large

Apr 5, 2007

I'm having a lengthy issue where my databases are too large to import in phpMyAdmin using Plesk. Unfortunately I don't have direct access to phpMyAdmin and can only access it as a DB user through Plesk.

I have tried to edit php.ini in the following locations:

upload_max_filesize = changed this to 64M

post_max_size = changed this to 32M

maximum_execution_time = changed this to 300

maximum_input_time = changed this to 300

Why am I still not able to import my DBs, which are about 8 MB each?
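A few things worth checking, as a hedged sketch: the directives are actually named max_execution_time and max_input_time (no "imum"), post_max_size must be at least as large as upload_max_filesize, the edit has to land in the php.ini that the web server's PHP really loads, and Apache needs a restart before any of it applies:

Code:
php -i | grep 'Loaded Configuration File'   # CLI's php.ini can differ from Apache's
# In the php.ini Apache loads:
#   upload_max_filesize = 64M
#   post_max_size       = 64M     ; must be >= upload_max_filesize
#   max_execution_time  = 300
#   max_input_time      = 300
service httpd restart                        # or: apachectl graceful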

View 4 Replies View Related

Large Servers

Oct 2, 2009

What type of hosting do big sites like DeviantArt use, or any kind of site with a lot of GB or TB of data? And how much does it cost?

View 7 Replies View Related

VPS With Large RAM, Or Dedicated

May 20, 2008

I have a website which has about 20K users, and now I am using VPS plan at LunarPages.

However, I have run into out-of-memory trouble. Although I have configured my Apache and MySQL carefully, the 512M memory is not enough. Therefore, the user experience has not been good these days because my site is very unstable.

I contacted Lunarpages, asking them whether I can upgrade my VPS to bigger RAM, but they said the ONLY way to get a RAM bigger than 512M is to upgrade to dedicated hosting plan.

The following are some stats of my website:

Total Members: 20k
Online at the same time: max 600, average 300

The Lunarpages VPS plan:
www[dot]lunarpages[dot]com/virtual-private-server/
disk space: 20G
RAM: 512M
price: $42 / mo

Now I am not sure whether to migrate to a dedicated hosting plan, because currently the main problem is just the size of the RAM. Other resources (CPU, network, etc.) are not my bottleneck. So it seems not worthwhile for me to migrate to a dedicated hosting plan at double the price (even more, almost 3x if I need 1G RAM), just for a larger amount of RAM.

Can you guys give some suggestions to choose a VPS provider for my site?
The factors taken into my consideration include:

* RAM size: at least 1G for peak, 768M guaranteed. The bigger, the better. Nice if I can choose a larger size when needed.
* price
* bandwidth: 1T/mon?
* easy to upgrade to dedicated host: just in case that one day I will have to use dedicated.
* whether there are coupons for a lower price.

View 9 Replies View Related

VPS With A Large Amount Of RAM

May 22, 2007

I've been with zone.net for a couple of months now, and I have a guaranteed 512MB of memory, which I seem to be hitting constantly; that seems to result in processes being killed and HTTP access vanishing. It's growing quite annoying.

I'm looking into moving onto a new provider that can provide more guaranteed RAM for about the same price.

Space isn't a huge deal, I'd do fine with a meager 5GB. Bandwidth I need at least 200GB, but wouldn't mind more.

I'd like to stay managed if possible, as I'm not as well versed in server workings as I should be. Also am in need of cPanel, which I know is a spendy sucker.

My budget is something around $70 a month, and I don't really want to go much higher than that. Still a poor college boy :/

Can anyone suggest such a provider? I've browsed around a lot of the VPS hosts but can't seem to find one that has as much RAM as I need for a decent price. All the ones that seem to have 512MB+ are pretty expensive, and offer a lot more other stuff (space/bandwidth) than I need.

As a final note, the line speed isn't that big of a deal. I'm currently on a 3mbit and am surviving, but going back to a higher speed line would be great

View 11 Replies View Related

Backing Up Large DB

Jan 17, 2007

Just had a quick question about backing up a large MySQL DB. I have a database that is 50 GB with about half a billion entries in it. One table itself is about 40 GB; the other 10 GB consists of smaller tables.

The problem is, I want to back the database up and be able to keep it LIVE at the same time (as it will fall behind quickly if it's pulled for more than a few hours, as there are somewhere in the area of a million entries an hour, plus other deletions and queries).
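If the 40 GB table is InnoDB, mysqldump with --single-transaction takes a consistent snapshot without locking the tables, so the database stays live while the dump runs; for MyISAM the usual answer is a replication slave that you back up instead. A sketch with placeholder credentials:

Code:
mysqldump --single-transaction --quick -u backupuser -p thedatabase \
    | gzip > /backups/thedatabase-$(date +%F).sql.gz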

View 3 Replies View Related

Ban A Large Number Of IPs

May 30, 2007

I'm currently using iptables to ban IP addresses from the servers, like:

Code:
iptables -A INPUT -s xxx.xxx.xxx.xxx -j DROP
I ran a "spam trap" for the last few months and now I have over 11000 IP addresses who were trying to spam on my website (guestbooks, phpBB and forms) and I want to ban them all (pretty sure bots run from them).

My question - is iptables the way to do it? I mean, does banning such a large number of addresses have any significant performance or other issues I should be aware of (apart from the fact that I may be banning some legitimate traffic)? Is -A INPUT the way to ban them all, or is there a more appropriate way of banning such a number of addresses?

I'm on CentOS 4.5 i686, Apache/1.3.37, Pentium D 930, 2GB RAM.
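11,000 individual rules will work, but iptables checks them one by one for every packet, which starts to cost real CPU at that size. The usual answer is ipset, which stores the addresses in a hash and needs only a single iptables rule; whether a package exists for a box as old as CentOS 4.5 is another question, and older ipset releases use a slightly different syntax, so treat this as a sketch of the approach:

Code:
ipset create spammers hash:ip
while read ip; do ipset add spammers "$ip"; done < banned_ips.txt
iptables -I INPUT -m set --match-set spammers src -j DROP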

View 11 Replies View Related

Large MySQL Migration

Feb 2, 2008

I wasn't sure where to post this, so here goes: I need to migrate a MySQL DB. In the past I have just created an SQL file and used that method (sometimes having to split the SQL file up), but now the DB is about 50 MB with 733,233 records.

Is there an easier way to migrate the Database from one server to another?
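A 50 MB database is small enough to stream straight from the old server into the new one in a single pipe, which skips creating and splitting dump files entirely. A sketch with placeholder hosts, names and credentials (the target database must already exist):

Code:
mysqldump -uolduser -p'oldpass' mydb | ssh user@newserver "mysql -unewuser -p'newpass' mydb"
# Passwords inline only for illustration; an option file (~/.my.cnf) is cleaner.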

View 8 Replies View Related

8gb Ram And Large Bandwidth Less Than $300/month

Apr 5, 2009

I'm trying to find a server that can offer about 7-8 TB of bandwidth with 6-8 GB of RAM. Does anyone know a good provider?

View 14 Replies View Related

Large E-mail Accounts

Mar 28, 2009

Any recommendation for a good e-mail service (IMAP) for large accounts, between 5 and 10 GB each?

View 13 Replies View Related

Allowing Large Downloads

Mar 9, 2009

I'm selling downloads of music files. The zip files are quite large. I've had several people complain that they get a message that the server resets their connection before the download finishes.

What can I do to allow these large downloads?

I'm on IIS.

View 21 Replies View Related

How To Copy Large Directory Via SSH

Jul 9, 2009

I have a large directory which I want to copy to another account on the same server. It's one folder which contains 20,000+ files and is around 2 GB in size.

I used:

Quote:

cp -r /home/useraccount/public_html/foldername /home/useraccount2/public_html

It worked, but some of the files didn't transfer correctly.
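For a copy that size, rsync tends to be the safer tool: plain cp -r does not preserve ownership or permissions (cp -a does), and rsync can simply be re-run to pick up anything that was missed or changed. A sketch reusing the same paths:

Code:
rsync -a /home/useraccount/public_html/foldername /home/useraccount2/public_html/
# Re-running the command copies only what is missing or changed; add -c to force
# checksum comparison, and chown -R the result if it should belong to the second
# account's user.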

View 11 Replies View Related






