Huge File To Transfer
Jul 8, 2009
Just moved to a new server. Of course, 10GB doesn't seem that large for a server, but for some reason wget is not able to handle the transfer of that backup for me: it transfers about 1MB, then tells me "successful transfer..."
The old server is using cPanel, and the new server is just a plain old server that I haven't loaded up yet.
How can I get this full backup over to the new server?
View 11 Replies
Feb 8, 2008
I need to transfer a client's site files (over 220 MB) to my server. The client does not use cPanel or have SSH access.
FTP is horribly tedious. I have created the account on my server and have SSH enabled. I have a feeling I can use wget to download the files to the account's home directory, but I am not sure of the correct syntax to recursively download all the directories and the files.
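wget can indeed mirror a whole tree over FTP from the shell on the new server. A sketch with placeholder credentials and hostname:

```shell
# -m (mirror) implies recursive retrieval with infinite depth and timestamping;
# -nH drops the hostname from the local directory layout.
# Run this from inside the new account's home directory so files land there.
wget -m -nH --user=clientuser --password='secret' ftp://ftp.client-site.example/public_html/
```

Quoting the password protects any shell-special characters in it.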
View 6 Replies
View Related
Jun 23, 2008
I had several user accounts that were pushing their quota. I was digging around in SSH and found that the INBOX file in /home/username/mail was huge even though the user does not keep messages on the server. I deleted this file to free up space and all seemed fine. A couple of seconds later I checked and the file had been recreated with new incoming mail.
My question is how do I keep this file from growing out of control? One of the users I had for almost 2 years had an INBOX file of almost 2GB!
Server Details:
VPS running WHM 11.23.2 cPanel 11.23.3-R25623
Redhat 9
View 5 Replies
View Related
Jan 24, 2008
I found an 18GB bandwidth.log file in the /etc/log/ directory. What is the bandwidth.log file for, and what could make it grow to 18GB, especially in a single night?
View 4 Replies
View Related
Oct 24, 2014
I have enabled the mod_security system, and in one day the modsec_audit.log file has grown to more than 700MB. Is there any way to reduce the number of messages that this module logs?
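If this is ModSecurity's audit engine (an assumption about the setup), the usual way to cut the volume is to log only transactions that actually triggered a rule, rather than every request:

```apache
# Log only "relevant" transactions instead of everything.
SecAuditEngine RelevantOnly
# Optionally also restrict logging to error-ish status codes (skipping plain 404s):
SecAuditLogRelevantStatus "^(?:5|4(?!04))"
```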
View 4 Replies
View Related
May 21, 2007
How can I transfer files from an old hosting provider to a new one with zero downtime?
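A common sequence, assuming SSH access on at least one side (hostnames and paths below are placeholders):

```shell
# 1. Days ahead, lower the domain's DNS TTL (e.g. to 300 seconds) so the
#    eventual switch propagates quickly.
# 2. Bulk-copy the site while the old host stays live:
rsync -avz olduser@old-host.example:/home/site/ /home/site/
# 3. Just before switching DNS to the new IP, run rsync again to pick up
#    anything that changed since the first pass, then update the A record:
rsync -avz olduser@old-host.example:/home/site/ /home/site/
```

Because the second pass only copies the delta, the final cutover window is short.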
View 5 Replies
View Related
Mar 28, 2008
I have a situation here.
I wanted to transfer a file (about 10GB) from one server to another server. Both server are on the same LAN.
On server1, I tarred a folder up as file.tar.gz.
From server2, I used wget http://server1.com/file.tar.gz and it says "404 Not Found".
I suspect it won't let me download it because the file is too big.
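A 404 usually means the URL doesn't resolve to a file inside the web server's document root, not that the file is too big. On the same LAN, copying over SSH sidesteps the web server entirely; a sketch with placeholder names:

```shell
# Run on server2; user, host, and path are placeholders.
scp user@server1.lan:/root/file.tar.gz /root/
# Or, resumable if the link drops partway through 10GB:
rsync -avP user@server1.lan:/root/file.tar.gz /root/
```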
View 7 Replies
View Related
Jan 21, 2008
I run a large-scale business, and sometimes I have to transfer large and very important data files to my business partners. I worry about my data because many of my competitors would certainly try to steal it, so there is a huge amount of risk in sharing important data over the Internet. I recently heard about a secure file transfer technique from a friend who works at a well-established software company. Does anyone know what a Secure File Transfer (SFT) service is and how it works?
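As a general point (not specific to any one vendor's "SFT" product), SFTP and SCP run over SSH, so both the login and the data stream are encrypted in transit. Placeholder host and filenames:

```shell
# Interactive encrypted session:
sftp partner@partner-host.example
# Or a one-shot encrypted copy:
scp confidential.tar.gz partner@partner-host.example:/incoming/
```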
View 2 Replies
View Related
Nov 8, 2008
I want to set up my desktop to be a kind of file server, so I can access all my files on my home desktop from school (and back up all my files on my reliable desktop, as opposed to my not-so-reliable lappy).
Then the next thing I wanted to do is to be able to access my desktop using remote access. So I can control everything on my desktop, while I'm not there.
My laptop is running Vista Home Premium; I don't think that matters too much. My desktop is running XP Home Edition.
I have a No-IP account, but I don't really know what my next step would be. I'm guessing I should set up some sort of FTP server on my desktop? And I have no clue how I'd do the remote desktop part.
View 7 Replies
View Related
Apr 26, 2007
I am unable to find out how to resume a file transfer via SFTP on the command line.
I use the PUT command to upload a file, but when the connection fails and I start again, the transfer starts from the beginning. How can I make it check the already-uploaded part and then resume?
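Newer OpenSSH sftp clients have resume commands built in; older ones can approximate a resume with an appending put. A sketch (hostname and paths are placeholders):

```shell
sftp user@host.example
# Inside the sftp session:
#   reput local.tar.gz /backup/local.tar.gz    # resume an interrupted upload
#   reget /backup/remote.tar.gz local.tar.gz   # resume an interrupted download
#   put -a local.tar.gz                        # append to an existing partial upload
```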
View 1 Replies
View Related
Dec 29, 2007
I seem to keep getting this error whenever I try uploading something that's too large.
It transfers fine for a few mins, then it stalls and eventually I get this error message.
Small files are fine since they don't take long to transfer, it just seems that I can't have a file transferring for too long.
I can actually get the file through eventually, but since I keep getting that error, I have to keep manually clicking the transfer button. It takes about 20 tries to finish a 30MB file, and I have a lot of files to transfer, so that's very troublesome.
Does anyone know what the problem might be? I tried turning off the firewall and opening ports on my router, but it still doesn't work. I'm using CuteFTP.
View 2 Replies
View Related
Jun 10, 2008
I discovered Qoodaa quite by chance.
A few days ago, a friend studying in America recommended a popular new transfer tool, Qoodaa, and told me it was quite a good piece of software for downloading files and movies. At first I was skeptical, but after using it I found Qoodaa is a good choice. I have summarized some of its features:
1. Its upload speed for movies is faster than any other software I have used.
2. It can download files quickly through download links; in a word, it is a time-saver and highly efficient.
3. No space limits. No matter where you are, it downloads fast.
4. Qoodaa is lightweight software with high security and ease of use.
It really can give you an unexpected surprise.
I am a person who likes to share with others, so if you have something good, please share it with me.
View 0 Replies
View Related
Jul 11, 2007
I have problem:
Searching /root....
Found cpmove-clanpz.tar.gz!
Moving Packge to /root/cprestore/cpmove-clanpz.tar.gz
Extracting tarball...............
Done
Extracting Domain....Done
Sorry, the copy failed. Unable to find the cpanel user file. Is the archive missing (cwd: /root/cprestore loaded cpmove-clanpz/cp/clanpz)?
The sequence I went through was:
1. /scripts/pkgacct username...
2. Transfer backup to new server
3. /scripts/restorepkg username
4. this error
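One thing worth checking is whether the archive actually contains the cp/<user> metadata file that the restore script is complaining about. A sketch (the filename matches the log above; the expected internal path is an assumption based on the error message):

```shell
# List the archive's contents and look for the cpanel user file the restore needs.
tar -tzf /root/cprestore/cpmove-clanpz.tar.gz | grep 'cp/clanpz'
```

If grep prints nothing, the backup was created without that metadata and may need to be regenerated with /scripts/pkgacct on the old server.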
View 14 Replies
View Related
Feb 26, 2009
I have a unique tool to move your files from an old host server to a new one.
Find out about it later in my post......
View 6 Replies
View Related
Feb 10, 2015
I'm running Plesk Panel for Linux with Presence Builder. I don't want my users to be able to upload their websites to the hosting via File Manager. How can I do that?
View 2 Replies
View Related
Feb 3, 2008
I'm sure this question has been asked before, but I'm looking for a nice and simple way of breaking up log files into smaller chunks.
I've been running Apache 2 on a VPS for the past few months, and one of the access.log files is now 700MB, a bit of a waste of space. I'm currently just doing:
CustomLog /var/www/logs/domain.com/access.log combined
ErrorLog /var/www/logs/domain.com/error.log
In my apache config.
Is there any easy way of telling apache to just keep the last week or months worth of logs?
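Apache's bundled rotatelogs tool can do this through piped logging, starting a fresh file every 24 hours (86400 seconds) so old ones can be pruned by cron; the rotatelogs path below is a guess, so check it with `which rotatelogs` first:

```apache
CustomLog "|/usr/sbin/rotatelogs /var/www/logs/domain.com/access.%Y-%m-%d.log 86400" combined
ErrorLog  "|/usr/sbin/rotatelogs /var/www/logs/domain.com/error.%Y-%m-%d.log 86400"
```

Alternatively, the system's logrotate can rotate, compress, and delete logs by age without touching the Apache config.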
View 7 Replies
View Related
May 15, 2008
The error logs on my web server keep growing to stupidly large sizes within a couple of weeks.
When I look through the error logs, they seem to show exactly the same line, just from different IP addresses. The line is as follows:
[Sun May 11 07:11:41 2008] [error] [client ###.###.###.###] File does not exist: /var/www/phpmyadmin/tracker
View 5 Replies
View Related
Mar 24, 2008
I've been using mod_security for a long time, but apparently I accidentally enabled some kind of log that uses MySQL. I don't remember it being there before, but the point is: the database is something like 145100k!
Which is HUGE for a database..
How can I disable this stupid log?
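Assuming the growth comes from ModSecurity's audit logging (the MySQL storage itself would be a separate collector script), the logging can be switched off or restricted in the ModSecurity configuration:

```apache
# Stop audit logging entirely:
SecAuditEngine Off
# Or keep it, but only for requests that actually triggered a rule:
# SecAuditEngine RelevantOnly
```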
View 2 Replies
View Related
Jun 11, 2008
I'm just curious as to what kind of things the huge sites--Youtube, Myspace, etc.--are doing to try to keep scalable. What sites do you guys just hate for failing in this regard, and perhaps most importantly, what are some ways we can prevent downtime?
View 4 Replies
View Related
Oct 26, 2008
I want a new dedicated server with 3TB+ bandwidth, for the best price and quality.
View 14 Replies
View Related
Oct 30, 2007
Where do you go to host HUGE websites, YouTube-like sites with HUGE bandwidth usage?
I don't believe people use hosts like Rackspace with their 150GB/month packages, unless they want to pay an absurd amount of money... so where do these guys host? What kind of hosts are they?
View 9 Replies
View Related
Oct 29, 2007
I have been receiving a huge Logwatch report; it seems that Logwatch is not parsing the /var/log/secure file, and is sending the raw log entries instead of a summary. I get thousands of lines like:
Cp-Wrap: Pushing "47 GETDISKUSED pvargas lights.com.co" to '/usr/local/cpanel/bin/eximadmin' for UID: 47 : 25 Time(s)
Cp-Wrap: Pushing "47 GETDISKUSED r.perez konecrans.com" to '/usr/local/cpanel/bin/eximadmin' for UID: 47 : 69 Time(s)
Cp-Wrap: Pushing "47 GETDISKUSED r.rodriguez konecrans.com" to '/usr/local/cpanel/bin/eximadmin' for UID: 47 : 114 Time(s)
I have upgraded to the most recent version of Logwatch with default configuration. Any ideas on what could be wrong?
View 4 Replies
View Related
Nov 26, 2007
I'm looking for ways to improve database performance when I have to modify a large table (several million rows), e.g. by adding a column. Currently this takes several hours, which is too slow. The bottleneck is disk I/O. I am considering either partitioning the table over several InnoDB files on several disks, or going to RAID-5 or RAID-10, if this will give me better write performance.
The database is 130GB in size, and the problem table (which I make periodic changes to) is the largest table on the server. I cannot have 3 hours of downtime each time I make a change, and adding blank fields (to be used later when a new field is needed) is not an option.
Each time I add a column, the cpu goes into high (80%) io wait state for about 3 hours.
I have a hack that would allow me to split the large table into multiple smaller tables based on some criteria (for example, forumID or similar). Here are a couple of ideas; I'd like to know which is best, and I'm open to new ones:
1. Split the table into 3 or 5 smaller tables, each on its own disk. The disk I/O would then not be so bad, and it might only take 1 hour to perform the table change. But this might not work, because the changes to the database (as in adding a column) might be serial, meaning only one disk is written to at a time. (Then again, maybe it will work if I launch 3 different scripts, one to update each table at once.)
2. Do RAID 5 or 10, and have 3 or 5 disks. This again might not help at all because of the above issue with MySQL writing serially.
I am using the latest MySQL 5.0.45 with the InnoDB engine on Debian Etch Linux.
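One workaround worth weighing against the disk options above is a shadow-table swap: build the altered copy while the original stays readable, then switch the names atomically. A rough SQL sketch (table and column names here are made up; writes arriving during the copy need separate handling, e.g. triggers or a brief write freeze at swap time):

```sql
CREATE TABLE posts_new LIKE posts;
ALTER TABLE posts_new ADD COLUMN extra_flag TINYINT NOT NULL DEFAULT 0;  -- instant: the table is empty
INSERT INTO posts_new SELECT *, 0 FROM posts;                            -- long bulk copy; original stays readable
RENAME TABLE posts TO posts_old, posts_new TO posts;                     -- atomic swap
```

The downtime shrinks from the full ALTER duration to roughly the RENAME, at the cost of temporarily needing double the table's disk space.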
View 4 Replies
View Related
Nov 6, 2009
I need around 300+GB bandwidth, 20+GB space, and a 2-5MB SQL database. Is it advisable to take HostGator's starting plan (Hatchling)?
Is HostGator worth that?
View 14 Replies
View Related
Dec 11, 2008
I have one domain hosting a lot of subdomains, and for some reason it constantly shows 4% CPU usage and 33% memory usage. Since that domain is inactive, could that usage come from the addon domains and simply not be presented correctly in WHM?
View 4 Replies
View Related
Sep 21, 2008
I have done my research, befriended a few super-proxy webmasters, and learned everything I need to know about being successful in the proxy business. So I am selling almost all my websites to fund this huge project, and I will also be flipping proxies from time to time to fund it further. This will be a year-long project and will become my full-time job sooner or later. My goal is to have 1,000 proxy sites.
So with this knowledge, my questions are the following;
1) Which hosting plan should I get right now "Reseller" or "VPS"?
2) Which one would be more profitable in the short term?
View 7 Replies
View Related
Jul 8, 2007
Just a few minutes ago my site went down, so I went to check through PuTTY, and when I ran top this is what I got:
top - 09:49:35 up 5 days, 14:41, 2 users, load average: 192.59, 109.31, 62.29
Tasks: 299 total, 3 running, 296 sleeping, 0 stopped, 0 zombie
Cpu(s): 4.0% us, 5.3% sy, 0.0% ni, 0.0% id, 88.7% wa, 0.3% hi, 1.7% si
Mem: 1009272k total, 1001268k used, 8004k free, 124k buffers
Swap: 3919840k total, 1518816k used, 2401024k free, 14676k cached
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
14263 apache 17 0 201m 9m 3788 D 1.0 1.0 0:04.74 httpd
16772 apache 17 0 152m 13m 5340 R 1.0 1.4 0:00.82 httpd
16881 apache 16 0 155m 14m 5368 D 1.0 1.4 0:00.52 httpd
16767 apache 16 0 154m 14m 5352 D 0.7 1.4 0:00.48 httpd
16864 apache 16 0 155m 15m 5364 D 0.7 1.6 0:00.80 httpd
16874 apache 17 0 155m 14m 5416 D 0.7 1.4 0:00.60 httpd
8900 apache 17 0 200m 12m 3844 D 0.3 1.3 0:10.60 httpd
13680 apache 17 0 202m 10m 3944 D 0.3 1.0 0:06.05 httpd
14687 apache 17 0 202m 11m 4060 D 0.3 1.2 0:06.12 httpd
14838 apache 16 0 206m 16m 5624 D 0.3 1.6 0:08.19 httpd
15858 apache 17 0 152m 13m 5452 D 0.3 1.4 0:01.39 httpd
16593 apache 17 0 150m 9180 3664 D 0.3 0.9 0:00.49 httpd
16668 apache 17 0 200m 7304 3496 D 0.3 0.7 0:00.72 httpd
16703 apache 17 0 149m 7208 3192 D 0.3 0.7 0:00.61 httpd
16750 apache 17 0 151m 14m 5268 D 0.3 1.5 0:00.81 httpd
16855 apache 17 0 200m 6616 3480 D 0.3 0.7 0:00.68 httpd
16863 apache 17 0 156m 13m 5500 D 0.3 1.3 0:00.61 httpd
But after a few minutes the server load went back down to 5. What could have caused the huge server overload?
Server spec:
64 3500+
1Gb of Ram
View 9 Replies
View Related
Mar 13, 2007
A while back I found a great deal on SSL certificates, so I purchased a bulk package of about 10 of them and used several at the time. Now that I've gone back to use the rest of my pre-purchased SSL certificates (more than a year later), the "contracts" have apparently EXPIRED, and the money that was put into those contracts has been frozen along with them! WHAT THE F#$@!
That is such BS! When you pay money for something you should get something in return.
What have I learned? That this seems extremely manipulative of RapidSSL and GeoTrust...
I WILL NEVER PURCHASE AN SSL FROM Rapid SSL or Geo Trust AGAIN! and I hope this post inspires others to select one of the many other certificate sellers out there that are more upfront about their business.
I have contacted both of them and both are telling me that they cannot help me.
Now that I am looking for a new SSL provider, can someone recommend a good, respectable company?
View 12 Replies
View Related
Nov 8, 2007
My server has huge server loads of 25+ at random. When I log in as root and run the top -s command, the highest CPU usage is less than 5%, and the total is less than 50%. Yet my server load can reach as high as 80.
I also get the "lfd: High 5 minute load average alert " email, but that also does not show what process uses such high resources.
How can the source of the huge server load be seen and explained?
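Load average counts processes waiting on disk or NFS I/O as well as those using CPU, so load can reach 80 while top shows almost no CPU use. A few diagnostic commands worth trying (iostat is part of the sysstat package and may need installing):

```shell
# Processes in uninterruptible sleep ("D" state) are usually blocked on
# disk or NFS I/O; they drive the load average up without burning CPU.
ps -eo state,pid,comm | awk '$1 ~ /^D/ {print}'

# Count how many such processes exist right now ("|| true" because grep -c
# exits nonzero when the count is zero):
ps -eo state | grep -c '^D' || true

# Per-device utilisation, refreshed every 5 seconds (sysstat package):
# iostat -x 5
```

A persistently high D-state count points at the disk subsystem (or heavy swapping) rather than any single CPU-hungry process, which is why top's %CPU column looks innocent.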
View 4 Replies
View Related