I've been developing websites for clients on and off for about 5 years, and I've recently gotten more serious about it as my client base grows. As I take on more clients, I need to figure out where to host their sites.
So, the question is: VPS, dedicated, or shared? And how do I compete with companies that charge $6.99 for so much data transfer and storage?
I have a client who had me redesign his band's website, which is currently hosted at 1and1.com. I've never dealt with them before, so I was wondering what everyone's thoughts are on them. So far, technical support has been giving us a hard time retrieving the cPanel password, which is different from the FTP one. Other than that, is this a reliable host for a band selling their music, or would I be better off switching to another host such as jaguarpc.com?
I will soon be meeting a few people who plan on selling me clients that are currently hosted with them.
I have never bought/sold clients before and would like some guidance from the community here on WHT.
Correct me if I'm wrong:
I should consider purchasing those clients whose contracts are going to expire soon, preferably within the next 6 months?
Should I, or should I not, purchase clients from whom they have recently taken a renewal fee that covers at least another 6 months?
How will they calculate the prices they should sell their clients to me for? Are they going to add up a year's worth of hosting before they set a final sale price? At the end of the day, I don't want to take on clients who have already paid for their hosting and won't be up for renewal anytime soon. If I take them on board now, they will be using up my hosting resources at my expense, and I will have to wait until their next renewal is due before seeing any revenue.
The guys I will be buying the clients from currently price their hosting plans as follows:
Monthly prices: £1.90, £5.00, £9.50, £15.50
Yearly prices: £19.00, £50.00, £95.00, £155.00
Depending on what packages their clients are currently on, roughly how much do you think 5 clients would be worth on average?
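For a rough sense of the numbers, here is a back-of-the-envelope sketch. Client books are often priced at some multiple of monthly recurring revenue; the 10x multiple used here is purely an illustrative assumption, not an industry rate:

```python
# Back-of-the-envelope valuation of a small client book.
# The 10x-monthly-revenue multiple is an assumption, not a quoted rate.
monthly_plans = [1.90, 5.00, 9.50, 15.50]  # GBP, from the seller's price list

avg_monthly = sum(monthly_plans) / len(monthly_plans)  # average plan price
clients = 5
multiple = 10  # months of revenue the book might sell for (assumed)

estimate = avg_monthly * clients * multiple
print(f"~£{estimate:.2f} for {clients} clients")  # ~£398.75 with these numbers
```

The real number shifts with which packages the 5 clients are actually on, and contracts expiring soon are worth more to the buyer than ones already paid up for a year.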
Before I purchase, what else do I need to look into?
What other questions do I need to ask, besides how long they expect their clients to stay with them/(me)?
I FTPed a high-res photo to a client. They got it fine.
Today I tried to FTP several files to another client, but the transfer would not go through. Both times I uploaded the files to my web hosting service. When I tried to send several files at once it did not work; I even tried zipping them.
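If it's the multi-file batch that's failing, scripting the upload as one session can help isolate the cause (timeout vs. permissions vs. file size). A minimal sketch, with placeholder hostname and credentials rather than any real account details:

```python
# Minimal sketch: upload several files in one FTP session.
# Hostname, username, and password below are placeholders.
import os
from ftplib import FTP

def upload_files(host, user, password, paths, remote_dir="/"):
    """Upload each local path in `paths`; returns the names stored."""
    stored = []
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        for path in paths:
            name = os.path.basename(path)
            with open(path, "rb") as fh:
                # binary mode avoids corrupting images and zip archives
                ftp.storbinary(f"STOR {name}", fh)
            stored.append(name)
    return stored

# usage (placeholders):
# upload_files("ftp.example.com", "user", "secret", ["a.jpg", "b.jpg"])
```

If one file in the batch fails, the traceback will name it, which is more informative than a GUI client silently stalling.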
I would like to hear about CirtexHosting from its clients. I found the company's deals attractive, and I am currently looking for somebody to tell me whether it is really worth getting an account with them. How is their shared hosting quality? The pre-sales support seems fast and informative, but what about customer support?
We're having a small problem with one of our Ventrilo boxes (Linux) that seems to be blocking users from connecting. The DC (The Planet) is reporting that the issue is likely on the user's end, and my tech is saying the same thing. I have reason to believe they're both wrong, but wanted to see if anyone had any ideas.
Here are 3 traceroutes that show normal activity until the user reaches the DC network, at which point they can't reach our server.
All 3 of these are from different users experiencing the exact same problem.
Tracing route to ca.13.1343.static.theplanet.com [22.214.171.124] over a maximum of 30 hops:

  1     2 ms     1 ms    <1 ms  192.168.1.1
  2     *        *        *     Request timed out.
  3    70 ms    72 ms    24 ms  126.96.36.199
  4   127 ms     *       13 ms  10g-9-3-ur02.longmont.co.denver.comcast.net [68.86.103.157]
  5   103 ms    87 ms    66 ms  10g-9-1-ur01.longmont.co.denver.comcast.net [68.86.103.161]
  6    12 ms    11 ms    12 ms  10g-3-1-ar01.denver.co.denver.comcast.net [68.86.103.154]
  7     9 ms     9 ms    23 ms  188.8.131.52
  8    11 ms    12 ms    11 ms  184.108.40.206
  9    45 ms    42 ms    33 ms  tbr2-p013702.dvmco.ip.att.net [220.127.116.11]
 10    31 ms    34 ms   126 ms  tbr2-cl33.dlstx.ip.att.net [18.104.22.168]
 11   145 ms    32 ms    31 ms  gar1-p3100.dlrtx.ip.att.net [22.214.171.124]
 12    38 ms    32 ms    31 ms  126.96.36.199
 13    30 ms    34 ms    32 ms  te7-1.dsr02.dllstx3.theplanet.com [188.8.131.52]
 14    34 ms    32 ms    30 ms  vl41.dsr01.dllstx4.theplanet.com [184.108.40.206]
 15    59 ms   142 ms    48 ms  gi1-0-2.car02.dllstx4.theplanet.com [220.127.116.1134]
 16-30   *        *        *    Request timed out.
Tracing route to ca.13.1343.static.theplanet.com [18.104.22.168] over a maximum of 30 hops:
  1    <1 ms    <1 ms    <1 ms  192.168.1.1
  2     8 ms     7 ms    10 ms  22.214.171.124
  3     9 ms     *        7 ms  GE-2-37-ur01.aberdeen.wa.seattle.comcast.net [68.86.98.9]
  4     9 ms    13 ms     *     te-5-2-ur01.olympia.wa.seattle.comcast.net [68.86.96.6]
  5    11 ms     *       12 ms  te-8-4-ar01.burien.wa.seattle.comcast.net [68.86.96.10]
  6    14 ms    19 ms    14 ms  126.96.36.199
  7    69 ms    69 ms    70 ms  tbr1011401.st6wa.ip.att.net [188.8.131.52]
  8    72 ms    76 ms    74 ms  tbr2-cl10.sffca.ip.att.net [184.108.40.206]
  9    73 ms    66 ms    67 ms  tbr1-cl30.sffca.ip.att.net [220.127.116.11]
 10    67 ms    68 ms    69 ms  tbr1-cl3.la2ca.ip.att.net [18.104.22.168]
 11    70 ms    67 ms    67 ms  tbr1-cl20.dlstx.ip.att.net [22.214.171.124]
 12    67 ms    68 ms    67 ms  gar1-p340.dlrtx.ip.att.net [126.96.36.199]
 13    65 ms    66 ms    65 ms  188.8.131.52
 14    66 ms    66 ms    65 ms  te9-1.dsr02.dllstx3.theplanet.com [184.108.40.206]
 15    68 ms    67 ms    68 ms  vl42.dsr02.dllstx4.theplanet.com [220.127.116.11]
 16    67 ms    66 ms    66 ms  gi1-0-1.car02.dllstx4.theplanet.com [18.104.22.168]
 17-30   *        *        *    Request timed out.
  1     9 ms     5 ms     7 ms  22.214.171.124
  2     7 ms     7 ms     *     GE-2-37-ur01.gigharbor.wa.seattle.comcast.net [68.86.99.9]
  3     *        *        *     Request timed out.
  4    13 ms     9 ms    11 ms  126.96.36.199
  5    64 ms    66 ms    64 ms  188.8.131.52
  6    67 ms    67 ms   273 ms  tbr2-cl10.sffca.ip.att.net [184.108.40.206]
  7    67 ms    73 ms    69 ms  tbr1-cl30.sffca.ip.att.net [220.127.116.11]
  8    66 ms    66 ms    65 ms  tbr1-cl3.la2ca.ip.att.net [18.104.22.168]
  9    68 ms    64 ms    63 ms  tbr1-cl20.dlstx.ip.att.net [22.214.171.124]
 10    67 ms    70 ms    66 ms  gar1-p340.dlrtx.ip.att.net [126.96.36.199]
 11    70 ms    68 ms    67 ms  188.8.131.52
 12    72 ms    69 ms    66 ms  te9-1.dsr02.dllstx3.theplanet.com [184.108.40.206]
 13    63 ms    65 ms    65 ms  vl42.dsr02.dllstx4.theplanet.com [220.127.116.11]
 14    69 ms    75 ms    69 ms  gi1-0-1.car02.dllstx4.theplanet.com [18.104.22.168]
 15-20   *        *        *    Request timed out.
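All three traces die at the same spot: the last hop that answers is a car02.dllstx4.theplanet.com router inside the DC's own network. When comparing many user traces, a quick script can confirm where packets stop. A minimal sketch (the sample input is trimmed from the traces above):

```python
import re

def last_responding_hop(tracert_output):
    """Return (hop_number, host) of the last hop that answered in a
    Windows tracert dump, or None if no hop responded."""
    last = None
    for line in tracert_output.splitlines():
        m = re.match(r"\s*(\d+)\s+(.*)", line)
        if not m:
            continue
        hop, rest = int(m.group(1)), m.group(2)
        if "Request timed out" in rest:
            continue
        # the hostname or IP is the last whitespace-separated field
        last = (hop, rest.split()[-1])
    return last

sample = """  1     2 ms     1 ms    <1 ms  192.168.1.1
  2     *        *        *     Request timed out.
 15    59 ms   142 ms    48 ms  gi1-0-2.car02.dllstx4.theplanet.com
 16     *        *        *     Request timed out."""
print(last_responding_hop(sample))
```

If every affected user's trace ends at the same aggregation router past the provider's edge, that supports the theory that the drop is on the DC side (a filter or null route) rather than on the users' connections.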
I am trying to find a good solution for email. I use a Linux-based server running CentOS, WHM and cPanel. I absolutely hate the webmail clients that come with cPanel (Horde, SquirrelMail, NeoMail), so I use Outlook instead. The problem is, I use the same email account from different computers and end up getting the same messages twice. This does not seem to matter whether I use POP3 or IMAP; I still get the same emails in both places.
What I would like is a mail client, either web-based or something I can add to my server, that lets me take advantage of IMAP and have it work the way it is supposed to. I like the functionality of POP3 in Outlook because it makes it easy to delete messages and maintain a clean inbox.
I have 20 clients who are on different networks and in different countries, but all of a sudden their IPs keep getting blacklisted in Spamhaus, CBL, DSBL, etc., and they cannot send email. I am very tired of this.
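When many unrelated IPs land on blocklists at once, it helps to check each one programmatically rather than by hand. A DNSBL is queried by reversing the IP's octets and resolving the result under the blocklist's zone; any answer means "listed". A minimal sketch against Spamhaus ZEN:

```python
# Sketch: check whether an IP is listed on a DNSBL (Spamhaus ZEN here).
import socket

def dnsbl_name(ip, zone="zen.spamhaus.org"):
    """Build the reversed-octet query name for a DNSBL lookup."""
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_listed(ip, zone="zen.spamhaus.org"):
    try:
        socket.gethostbyname(dnsbl_name(ip, zone))
        return True   # got an A record back: the IP is listed
    except socket.gaierror:
        return False  # NXDOMAIN: not listed

print(dnsbl_name("192.0.2.1"))  # 1.2.0.192.zen.spamhaus.org
```

Running this per client IP across CBL, ZEN and the rest shows whether the listings share a cause (for example, the same compromised mail relay) or are genuinely independent.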
I have been researching i3d.net as a dedicated provider.
I have searched the archives, and before making a final decision I would like to kindly ask current (and even past) clients of i3d.net to share some insight into the quality of this provider and its network.
One of my clients has been under a bandwidth attack for the past few days. He went from 6,000 "hits" (file requests, not users) on his site on Monday to over 11,400,000 for the day yesterday (Friday), and today is a further escalation.
His site usually does about 2-6 GB of traffic a month, and he is now over 450 GB for the month of December as of earlier today (Dec 6th, 6 days into the month). The traffic is from (or, if proxies are involved, appears to be from) multiple residential IPs, with over 90 simultaneous users on the site at any given time.
First they were hitting a particular image, and when access to that image was removed, they simply started hitting the home page instead.
Besides taking his site offline, any thoughts on how to keep him online? I discussed with him moving to his own dedicated 10 Mbps unmetered system, but with the escalation here, I'd imagine they'd use up all that bandwidth in short order as well.
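One stopgap while deciding on hosting is to mine the access log for the heaviest requesters and block them at the firewall. A minimal sketch, assuming a combined-format log where the client IP is the first field (per-IP blocking won't hold up against a large distributed attack, which needs provider-level or upstream filtering):

```python
from collections import Counter

def abusive_ips(log_lines, threshold=1000):
    """Count requests per client IP (first field of a combined-format
    access log) and return IPs at or above `threshold`, busiest first."""
    counts = Counter(line.split()[0] for line in log_lines if line.strip())
    return [(ip, n) for ip, n in counts.most_common() if n >= threshold]

# tiny demo with a fake log
demo = ['203.0.113.7 - - [06/Dec/2008] "GET / HTTP/1.1" 200 512'] * 3 + \
       ['198.51.100.2 - - [06/Dec/2008] "GET / HTTP/1.1" 200 512']
print(abusive_ips(demo, threshold=2))
```

Feeding the output into iptables or the host's firewall at least trims the worst offenders; with 90+ rotating residential IPs, though, the realistic options are rate limiting at the edge or a DDoS-mitigation service.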
Any experience with how long these attacks generally continue? All I can think is that he has hurt a competitor who is now using this as a tactic to cripple his online business. The market for the types of products they sell is pretty small, though, and I can't imagine one of the competitors really has the knowledge to do this, or even to coordinate it, themselves (or they'd be in another business making more $$$). If they're not coordinating it themselves, any idea how much it might be costing the competitor to have someone else run such an attack?
I was just wondering how long it takes for a VPS business to reach a certain number of clients.
I'm sure there are lots of variables, like how long the company has been in business, what sort of product it provides, the market it targets, and more, but elaborate as much as you want, or just ballpark it.
I am a hosting reseller, and the parent company is upgrading their servers to PHP 5. This may break some of my clients' sites (osCommerce, Zen Cart). Should we fix those sites for free, or should my clients pay for the upgrade?
I'm currently running CentOS 5. I recently installed Squid 2.6.STABLE6 for a client to use as a proxy. However, it seems that sites like whatismyip.com and ipchicken.com are reporting my client's IP address and not the server's.
There is only one IP on my server, and I think the problem may be related to the X-Forwarded-For header (correct me if I am wrong).
Is there any way to have sites see the server's IP address when my customer is using the proxy?
My squid.conf looks like the following:

Code:
http_port 8080
forwarded_for off
icp_port 0
cache_mem 64 MB
cache_dir ufs /var/spool/squid 100 16 128
maximum_object_size 4096 KB
cache_store_log none
cache_access_log /var/log/squid/access.log
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
visible_hostname proxyserver
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src xxx.xx.xxx.xxx
acl SSL_ports port 443 563 10000
acl Safe_ports port 80
acl Safe_ports port 21
acl Safe_ports port 443 563
acl Safe_ports port 70
acl Safe_ports port 210
acl Safe_ports port 1025-65535
acl Safe_ports port 280
acl Safe_ports port 488
acl Safe_ports port 591
acl Safe_ports port 777
acl Safe_ports port 901
acl purge method PURGE
acl CONNECT method CONNECT
acl LocalNet src xxx.xx.xxx.xx
http_access allow manager localhost
http_access deny manager
http_access allow purge localhost
http_access deny purge
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow LocalNet
http_access deny all
icp_access allow all
log_fqdn on
##### This section is to make the proxy transparent
#httpd_accel_with_proxy on
#httpd_accel_uses_host_header on
#httpd_accel_host virtual
#httpd_accel_port 80
######------------------------------
error_directory /usr/share/squid/errors/English
#httpd_accel_uses_host_header off
#anonymize_headers deny From Referer Server
forwarded_for on
http_port ServerIP:8080 transparent
# no forwarded_for, quite useless for an anonymizer
forwarded_for off
# no client stat
client_db off
# Paranoid anonymize
header_access Allow allow all
header_access Authorization allow all
header_access Cache-Control allow all
header_access Content-Encoding allow all
header_access Content-Length allow all
header_access Content-Type allow all
header_access Date allow all
header_access Expires allow all
header_access Host allow all
header_access If-Modified-Since allow all
header_access Last-Modified allow all
header_access Location allow all
header_access Pragma allow all
header_access Accept allow all
header_access Charset allow all
header_access Accept-Encoding allow all
header_access Accept-Language allow all
header_access Content-Language allow all
header_access Mime-Version allow all
header_access Retry-After allow all
header_access Title allow all
header_access Connection allow all
header_access Proxy-Connection allow all
header_access All deny all
header_access Cookie allow all
header_access Set-Cookie allow all
header_replace User-Agent Anonymous Proxy at example.com
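One likely culprit stands out in the config above: `forwarded_for` appears twice, and for a single-value squid.conf directive the occurrence that comes last in the file wins. With `forwarded_for on` in effect, Squid adds an X-Forwarded-For header carrying the client's IP, which is exactly what sites like whatismyip.com report. A minimal sketch of the change (assuming Squid 2.6; with `off`, Squid 2.x sends "unknown" instead of the client address):

```
# keep a single forwarded_for directive, and keep it off, so Squid does
# not pass the client's IP in the X-Forwarded-For header
forwarded_for off

# if the box ever gains a second IP, this pins the source address Squid
# uses for outgoing requests (xxx.xx.xxx.xxx is a placeholder)
# tcp_outgoing_address xxx.xx.xxx.xxx
```

After removing the duplicate and reloading Squid, whatismyip.com-style sites should show the server's address.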
I have been a reseller with SkyNetHosting.Net for about 3 months now, and yes, we are new to the hosting business. For the past 2 months I have been having issues with my host server's firewall. Every time I get blacklisted I have to submit a ticket asking for my IP to be whitelisted. That's fine with me, but I don't think my clients are happy with it, as we are seeing an almost 90% non-renewal rate.
They keep saying it's my fault. OK, I can take that. But how do I explain it to my clients and, most importantly, to my clients' visitors?
Is there anything in the firewall settings that they could tweak to minimize this?
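The posts never say which firewall the host runs, but cPanel servers very commonly sit behind CSF/LFD. If that is the case here (an assumption), these are the kinds of csf.conf knobs that govern how quickly repeated logins trip a block; the values are illustrative, not recommendations:

```
# /etc/csf/csf.conf - illustrative values, assuming the host runs CSF/LFD
LF_FTPD = "20"        # failed FTP logins tolerated before a block
LF_FTPD_PERM = "300"  # make the block temporary (seconds) rather than permanent
CT_LIMIT = "0"        # disable connection-tracking blocks, or raise the limit
```

Adding the reseller's static office IPs to /etc/csf/csf.ignore would also stop LFD from blocking them at all, without loosening the firewall for everyone else.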
I never had any problems when I was at HostGator...
My experience with you guys so far has been excellent, especially the support department, and if I had to single out one tech support employee, it would be Nathan. Thumbs up for his fast and reliable support.
However, I have one major complaint, and that is your firewall: I keep getting blocked/blacklisted even after being connected to the Luna server for a mere moment, less than 10 seconds. Yes, you heard me: just 10 seconds after I go online and browse my sites, there is a high chance of my IP getting blacklisted.
We're not a pure web hosting company; we mostly take on web design projects where the clients also host their websites on our server. So you see, we maintain most of our clients' websites, and regular FTP access to multiple sites is required.
Here is the list of things I do first thing when I go online, as do my employee and my business partner. Multiply the list below by 4 and we will get blacklisted for sure, most of the time!
1. Log in to WHMCS
2. Log in to LiveZilla chat support
3. Log in to webmail to check emails
4. FTP updates to clients' sites
I hope you can take another look at the firewall sensitivity settings so that it affects our business less, or better still, not at all.
My host's reply to the ticket:
I recommend changing ALL your FTP/cPanel passwords at once, and if you are saving them in your browser or FTP clients, stop doing so. If possible, try logging in from a different, clean laptop/PC and see if you get the same problem.
Customers who save their FTP login credentials in FTP programs such as FileZilla, CuteFTP, WS_FTP Pro, Dreamweaver or FrontPage are prone to malicious script injections from their PCs, carried out over FTP with their legitimate cPanel login details while the owner of the domain/account is not even aware of it. Your login credentials are leaked to the hackers once a Trojan or virus gets installed on your Windows machine.
The easiest way to save your login credentials would be to save them in a text document without saving the domain name or login host information in the same document. To be absolutely sure your FTP account won't get compromised, we highly recommend you choose a strong password containing a combination of upper and lower case letters, numbers, and special characters such as $?£;: when adding a new FTP login name from your cPanel control panel. Those who manage multiple websites may not like this change, but losing your data and then losing your rankings in search engines will create more trouble.
Honestly, I don't think this issue can be resolved on the user end, as we are currently facing a monthly non-renewal rate of 90% from our clients. If this goes on, we will end up losing our reputation and, most importantly, our business.
Yes, I can pass on the same message that your tech support gave me. But my clients do not understand it, and they would rather find a host that is less complicated and less "firewall sensitive". Is there some way you can recalibrate the firewall sensitivity so we can all have our peace?
My client on chat support:
My cPanel username is 'justin'.
This is with regards to my problem with the IP whitelist.
I need a no-nonsense answer.
I have visitors from the US complaining that they cannot view my site.
I cannot possibly whitelist everyone.
So, is it possible to tweak the filtering/firewall settings?
If it is not possible, I would like to exercise my money-back guarantee and close my account.
I would like a day to download my files and databases if that is possible.
Yes, it's quite true. DreamHost representatives are asking their clients to block GoogleBot through the .htaccess file, because their websites were "hammered by GoogleBot".
DreamHost representatives are also guiding their clients to make all their websites "unsearchable and uncrawlable by search engine robots", because they cause "high memory usage and load on the server".
PS: This is NOT a rumor. Multiple people are already complaining about DreamHost asking clients to block search engine spiders. A post on QuickOnlineTips confirms this too.
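For context, blocking a crawler by User-Agent in .htaccess generally looks like the sketch below. This is a generic mod_rewrite example, not DreamHost's exact instructions, and blocking Googlebot this way will eventually drop a site from Google's index:

```
RewriteEngine On
# return 403 Forbidden to any request whose User-Agent contains "Googlebot"
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule .* - [F,L]
```

A less drastic option is a robots.txt Disallow rule, which asks the crawler to stay away instead of rejecting every request with an error.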
So the other day one of my clients rented a Windows 2003 server. He has no idea how to run a DNS server on it and demanded we do something about it. Since he had no management plan, we let him know that if he got himself a management plan we could help him out, but it looks as though he was on a tight budget and couldn't opt in for it. The same thing happened last week, and again with another client yesterday!
To help these customers out I was wondering if I could start a DNS service. Each client would get access to it for free regardless of their operating system or management plan. They could point their domains to the nameservers and use the DNS service to point the domain's A records to their servers.
From the start I have been using EditDNS, and I'm still using it. But come to think of it, it's getting more unstable day by day, even though they have 5 servers spread across the globe. I am planning to move to EveryDNS, and with this move I'd like to use their services for my clients too. Rather than running my own DNS servers, I feel this could work out to be more efficient and reliable.
So what I am planning to do here is create private nameservers that mask EveryDNS's nameservers, and then have a custom script site built to interact with EveryDNS's APIs. My customers would then use the script's interface to create/delete/edit records, etc. Does that sound like a good idea?
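As a sketch of what the custom layer would do, here is the validation side of such a wrapper in Python (the real build is described as PHP/MySQL; the upstream endpoint in the comment is hypothetical, since EveryDNS's actual API is not described here):

```python
# Sketch of the wrapper layer: validate a customer's record request
# before forwarding it to the upstream DNS provider's API.
import ipaddress

ALLOWED_TYPES = {"A", "AAAA", "CNAME", "MX", "TXT"}

def build_record(domain, rtype, name, value):
    """Validate and normalize one DNS record request; raises ValueError."""
    rtype = rtype.upper()
    if rtype not in ALLOWED_TYPES:
        raise ValueError(f"unsupported record type: {rtype}")
    if rtype in ("A", "AAAA"):
        ipaddress.ip_address(value)  # raises ValueError on a bad address
    return {"domain": domain, "type": rtype, "name": name, "value": value}

# the wrapper would then POST the dict to the provider, e.g. (hypothetical):
# requests.post("https://api.example-dns.com/records", json=record, auth=...)

print(build_record("example.com", "a", "www", "203.0.113.10"))
```

Keeping validation in the wrapper means a customer typo never reaches the upstream provider, and it also gives one place to log every change per client.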
Anyway I need some advice, suggestion or help in this matter. Has anyone tried this stunt out before? If so, how successful were they/you? How are your customer's reaction towards this (satisfied or not?). Does it stay efficient? (both in terms of money, time and energy spent).
Also, if anyone could suggest me any programmer who could do this for me (someone well versed with DNS/EveryDNS on PHP/MySQL) it would be nice.
Edit: I guess ServerBeach/Peer 1 already has implement this. If anyone really knows whats going on there - please share.