How To Deal With Spam When Keeping A Client Database
Jan 1, 2009
I am using an email list manager to keep in contact with our clients, and I occasionally need to send a few thousand emails to my members. I have noticed that Gmail, Hotmail and others consider you a spammer if you send too many emails.
But my contacts are legitimate. How can I manage this list fairly?
In this PHP page I don't have any code related to mail, but the header clearly shows:
Return-path: <email@domain.com>
Received: from apache by server-name with local (Exim 4.63) (envelope-from <email@domain.com>) id 1HIhVz-0006CN-0l for kris.bodine@okstate.edu; Sun, 18 Feb 2007 15:35:23 +0700
To: kris.bodine@okstate.edu
Subject: Good Day
X-PHP-Script: www.domain.com/path/filename.php for 195.166.234.251
From: Victor Tommy <tommyvic@mail.ru>
Reply-To: vic-tommy@hotmail.com
MIME-Version: 1.0
Content-Type: text/plain
Content-Transfer-Encoding: 8bit
Message-Id: <E1HIhVz-0006CN-0l@server-name>
Date: Sun, 18 Feb 2007 15:35:23 +0700
The Plesk panel indicates that the Spam folder can be retrieved using an email client. We are using Outlook 2013 and cannot figure out a way to make this happen. We can see the Spam folder using webmail, but its contents never show up in Outlook...
From the admin account there is no problem creating databases. I've run bootstrap.sh repair successfully, but it is still not working. Plesk 12.0.18 #55, Ubuntu Linux 10.04.
I was doing a search on Google and retrieved some files from sites that should not be available to the public. I investigated one site a little and it looks like they are running ASP. I know that on Linux servers you can place an .htaccess file to restrict bots from accessing certain directories, but how can you do it on a Windows server running IIS? I would like to get in contact with these companies and let them know about the issues I ran into with their sites.
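For what it's worth, the IIS equivalent of per-directory .htaccess rules is a web.config file. A sketch for IIS 7+ using Request Filtering — the directory name "private" is a placeholder, not something from the original post:

```xml
<!-- Hypothetical web.config in the site root (IIS 7+).
     hiddenSegments makes the named directory unreachable over HTTP. -->
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <hiddenSegments>
          <add segment="private" />
        </hiddenSegments>
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

A robots.txt works on any server too, but it only deters well-behaved bots; request filtering actually blocks the requests.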
We have a few single-CPU (54xx quad-core) systems running Hyper-V, and looking at the Hyper-V Logical Processor Total value in Perfmon, it stays pretty much between 85% and 100% all day long. Performance is mostly OK, with an occasional hesitation, but the biggest reason is that we are trying to avoid doubling the cost of the SPLA license by adding a second CPU. Most motherboards we have only hold 16 to 24 GB of memory, and with a second CPU both would probably sit at less than 40% or 50%.
Are there any problems keeping a 54xx, or any CPU for that matter, running flat out as long as it's cooled OK?
We are in the process of replacing our site & hardware (all co-located servers) and moving to a five server config with 2 Apache servers, 2 MySQL servers and a stage server (where pages are prepared). The Apache servers utilise mod_proxy_balancer and a PHP/MySQL script to connect to either of the MySQL servers which replicate with each other.
The bit I am stuck on is keeping both Apache servers in sync with each other in terms of files (pages & images) - has anyone got any suggestions that could help with this?
Ideally, the images would be placed on the stage server and automatically copied to the 2 Apache servers. I did have one idea which was to put the images on one of the MySQL servers and use that as a file store but then if that goes down, we are in the poo so to speak.
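One common sketch, assuming SSH key access from the stage server and hypothetical hostnames web1/web2: push the content out with rsync on a cron schedule, so anything placed on stage lands on both Apache servers within minutes.

```
# Hypothetical crontab on the stage server: every 5 minutes, mirror the
# document root to both Apache boxes; --delete removes files that were
# deleted on stage so the three copies stay identical.
*/5 * * * * rsync -az --delete /var/www/htdocs/ web1:/var/www/htdocs/
*/5 * * * * rsync -az --delete /var/www/htdocs/ web2:/var/www/htdocs/
```

This avoids the single point of failure of using one MySQL server as a file store, since each web server keeps a full local copy.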
What's the most practical way to move ALL the content from one domain to another, in terms of keeping decent search ranking as well as forwarding to the new domain?
So like say the current URL is:
[url]
And I want it to be:
[url]
So if you went to the example.com version it'd automatically forward to the example.org version and then the search engines would pick up on that and update the URL.
I've got full control over all the DNS zones, nameservers, etc etc...so whatever is the best option I can do it. I just don't know what that best option is.
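The usual answer here is a site-wide 301 (permanent) redirect on the old domain - search engines treat it as "the content has moved, transfer the ranking". A sketch, assuming Apache serves the old domain and using example.com/example.org as in the post:

```apache
# Hypothetical vhost for the old domain: every path gets a 301 redirect
# to the same path on the new domain.
<VirtualHost *:80>
    ServerName example.com
    ServerAlias www.example.com
    Redirect permanent / http://example.org/
</VirtualHost>
```

DNS alone can't do this; the redirect has to come from a web server still answering for the old name.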
How can it be determined that multiple domains are "affiliated" with each other? For example, at the link below, on the left side of the page it says that yahoo.com, yahoo.net, and yahooligans.com are owned together. Does this mechanism use WHOIS records, or does it use the IP addresses to which each address resolves (which would only catch sites that use name-based virtual hosting on the same IP address, right?)
[url]
If people run multiple sites, say personal and business, and they don't want them to be affiliated, how can they make sure they can’t be linked with each other through such mechanisms?
I have a 100 Mbps unmetered dedicated server + cPanel for $425 at 10TB.com, and I would like to hear if you guys can advise me on a better deal. I have seen great offers for less lately, so I wonder if I am paying more than the average for a 100 Mbps unmetered dedicated server + cPanel. I can't complain - the service is good - though ThePlanet was better, but even more expensive.
I have an adult video site, so all I need is bandwidth.
I'd appreciate your feedback on current prices versus what I'm paying. Is $425 for an unmetered 100 Mbps dedicated server + cPanel okay, or am I paying a lot more? After all, 10TB is a reseller, so should I go to another reseller that charges less?
Recently I noticed the load on one of my servers was way beyond what I would expect it to be. I run multi-processor servers, and even during a backup the load is only around 1.5.
But lately I noticed peak loads that high under normal web traffic.
I know 1.5 is low on a multi-processor server, but I am hoping to add much more to those machines, and with sustained load that high it leaves no room for expansion. The servers are not cheap, so adding another server to the cluster can only be done if I make money from the last one I added.
I checked the traffic levels and they were very high. After further review I found bots hitting sites at over 1,200 pages a minute. Multiply that by a few hundred bots and clearly I could have a load issue. The potential is there to bring any server to its knees when delivering those volumes.
I wrote a program to watch connections and block the abusive bots. While logging, I became aware of over 600 bots crawling my servers. Many bots from Japan, China, Germany and so on - useless to my customers even if they are legitimate search indexes.
Another problem I see is that the bots run from many IP addresses and hit the same sites from multiple IPs at the same time. Why would they need to do that?
Among other things, I decided to validate Googlebot, MSN and Yahoo with DNS lookups so I could determine that they were actually those companies' bots and not imposters. In 24 hours I found valid bots from the big three hitting one server from 1,100 different IPs.
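The validation step described here is normally a forward-confirmed reverse DNS check: reverse-resolve the IP, confirm the hostname belongs to the crawler's domain, then resolve the hostname forward and confirm it maps back to the same IP. A sketch in Python (the function names and the exact suffix list are mine, not the author's):

```python
import socket

# Crawler hostname suffixes published by the big three at the time;
# treat the exact list as an assumption.
CRAWLER_SUFFIXES = (".googlebot.com", ".google.com",
                    ".search.msn.com", ".crawl.yahoo.net")

def hostname_matches(hostname, suffixes=CRAWLER_SUFFIXES):
    """Pure check: does the rDNS hostname end in a known crawler domain?"""
    host = hostname.lower().rstrip(".")
    return any(host.endswith(s) for s in suffixes)

def is_official_crawler(ip):
    """Forward-confirmed reverse DNS: the rDNS name must be in a known
    crawler domain AND resolve back to the same IP."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]               # reverse lookup
        if not hostname_matches(hostname):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]   # forward lookup
        return ip in forward_ips                             # must round-trip
    except (socket.herror, socket.gaierror):
        return False
```

A bot claiming "Googlebot" in its User-Agent but failing this round-trip is an imposter.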
Now we are looking at thousands of valid bots and thousands more email harvesters and content thieves.
As a host, the number of sites I can host on a server is greatly reduced by the bot traffic. My customers do not want to hear that their website was being crawled at 3,000 pages a minute and that is why they could not access it. Of course they will blame it on me.
I was able to filter the bots at the firewall level and drop connections based on reverse DNS lookups and site crawl rates, and now my server load sits around 0.05 most of the time, even with hundreds of pages a minute being accessed.
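As a sketch of that kind of firewall-level throttle (the thresholds are made up and this is not the author's actual ruleset; it assumes iptables with the recent and state modules):

```
# Hypothetical rate limit: track new HTTP connections per source IP and
# drop any source opening more than 10 in a 10-second window.
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --set
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
    -m recent --update --seconds 10 --hitcount 10 -j DROP
```

Legitimate crawlers that pass the reverse-DNS check can be whitelisted ahead of these rules.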
I am wondering how the rest of you hosts deal with this problem. Do you leave it up to your hosting customers? Or do you have some type of filter to get rid of the bots?
When you have a few sites it is not really a problem, but as you grow it spirals exponentially out of control.
My site is hosted on Dreamhost and gets over 1 million hits a day. The site is highly optimized, so it can handle the load easily without slowing the server down. Most pages have a loading time of under 0.2 seconds.
However, Dreamhost is now telling me that I'm using up too many "connections" and has limited me to 150 connections every 3 seconds (or so they say). Now 503 errors are coming up left and right, and it's highly annoying to me and my users. Oh, and Dreamhost has mentioned several times that I'm oh-such-a-very-good candidate for an upgrade to $400/mo dedicated hosting (from $8/mo currently).
So my question is, is this connection restriction really a valid concern of Dreamhost or are they just trying to milk me for money because my site is popular?
We have a dedicated server client who hosts a big forum with a 1 GB SQL database: Dual Xeon, 3 GB RAM, SCSI 10K RPM main drive, cPanel server with cPanel auto backup.
The load is always under 2 and RAM is MORE than enough.
Every time the auto backup runs, his forum HANGS for at least 10 minutes due to the large SQL size.
The issue is that he targets two very different time zones, so he expects NO downtime/hangs at all.
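One possible workaround (not cPanel's built-in backup, and it assumes the forum tables are InnoDB — it does not help MyISAM): dump the database separately with mysqldump's --single-transaction, which takes a consistent snapshot without locking the tables, and exclude the database from the cPanel backup run.

```
# Hypothetical root crontab entry: non-blocking nightly dump at 04:00.
0 4 * * * mysqldump --single-transaction --quick forum_db | gzip > /backup/forum_db.sql.gz
```

With the SQL handled this way, the cPanel backup only has to copy files, which is far less disruptive to the forum.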
Basically, my client doesn't like Yahoo as the host for their website, but wants to keep their "Yahoo! Business Mail". How can I do this without doing any "forwarding"?
They will be using Dreamhost to host their website.
I'd like to create a forwarding address, and not have the original account keep the mail - but without having to delete the original account, since I'd like to keep my existing emails.
For instance if the old account is apples@domain.com, I'd like to forward new emails to oranges@new.com, where apples doesn't keep the new emails. Currently in cPanel when I create a forwarder apples@domain.com=>oranges@new.com, new emails are sent to both accounts.
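One low-level option - a sketch assuming a standard Exim/sendmail-style setup underneath cPanel, not a documented cPanel feature - is a ~/.forward file in the account's home directory. A file containing only the destination address redirects mail without keeping a local copy; local delivery would only happen if the account itself were also listed (e.g. a \apples line):

```
oranges@new.com
```

The existing mailbox and its old messages stay untouched; only new mail is diverted.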
Whenever I try out a vps or dedicated the first thing I do is run Unixbench WHT on it.
I have been trying to compile it on different systems - a mixture of 32-bit, 64-bit, Fedora, Ubuntu, etc. - and in many instances the compilation fails the first time around.
E.g. on a 32-bit Ubuntu system I had to switch to gcc-3.4, instead of 4.0.3, for the compilation to succeed.
Is there any place where I can try to have these issues resolved when they arise?
Unsurprisingly, it always appears to compile okay on 32-bit Fedora/CentOS Virtuozzo systems.
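For what it's worth, switching compilers usually just means overriding CC on the make command line (this assumes the Unixbench Makefile honours the CC variable - treat that as an assumption):

```
make clean
make CC=gcc-3.4
```

That avoids editing the Makefile by hand on each box you test.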
We've browsed/searched a lot over the last few years, and I believe FDCServers has the best bandwidth deal, with stability and good service.
We've been searching over the last few days and found some other companies (e.g. 10TB.com). Are there any other companies that offer something as 'good' as FDC regarding bandwidth?
10TB says they use SoftLayer's DC, but it seems they don't use their network (I got that information in a chat with SL's sales), so in my opinion there is no benefit to being in SL's DC if we don't use their network, which is really good/stable. We've also found that 10tb.com is a new company, and their site has 404s and incorrect contact e-mail addresses, which made me 'quit' the signup process... (it is hard to 'trust' a new company with a lot of bandwidth - and errors on their website - when others that have been on the market for a long time couldn't offer it).
What do you think about 10tb.com? Other options? What about FDC? Is it worth going with them to host streaming? We need as much bandwidth as possible, but we also need quality.
I'm thinking about a couple of different deals, and as I'm not a hardware expert I wondered if anyone could help me out with this.
Server 1: Quad-Core Xeon X3220 2.4 Ghz/2x4MB (expensive) 1066FSB 2GB RAM ECC 500 GB SATA 7200 133 RPM
Server 2 (the cheapest): AMD Athlon 64 X2 6000+ Dual Core 6 GB DDR2 RAM 2 x 750 GB SATA II
Server 3 (medium price): Core 2 Duo 2x2.0 GHz, 3 GB DDR2 RAM, 2x320 GB SATA
I have a website where each page is around 200 KB and uses a simple MySQL DB. Around how many pageviews a day would each of the servers above manage? They are all on 100 Mbit dedicated lines. Which server is the best choice?
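As a rough upper bound on the bandwidth side alone (ignoring CPU, MySQL and traffic burstiness - so a ceiling, not a prediction), a saturated 100 Mbit line serving 200 KB pages works out to:

```python
# Back-of-envelope: pages/second and pages/day for a fully saturated
# 100 Mbit/s line serving 200 KB pages (no overhead, no burstiness).
page_kb = 200
line_mbit = 100

pages_per_sec = (line_mbit * 1000 / 8) / page_kb  # 12,500 KB/s over 200 KB pages
pages_per_day = pages_per_sec * 86400             # seconds in a day
print(pages_per_sec, int(pages_per_day))          # → 62.5 5400000
```

So all three servers have line capacity for millions of views a day; in practice the CPU and MySQL side will be the limit long before the pipe is.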
I'm after a Linux shared hosting package with all the usuals - Webalizer, PHP, MySQL, phpMyAdmin, subdomains, FTP accounts - that will give us a percentage when we refer a customer.
Basically every time we get a new client we'd like to get a little pay out for it.
We'd go for reliability over cost. Is anyone running anything like this?
It looks like my SuperMicro H8DME-2 motherboard has locked itself. (I set a password, and it worked fine until a few days ago; now the password does not work anymore, and it is locked.)
Is there any good way to clear this password while keeping all the settings?