There are always people who would like to know what the php settings are on the server. Is it a security risk to share the phpinfo.php file on a website, with anybody who visits that website, able to view it?
A friend of mine owns a hosting company, and a client of his asked to have mbstring and mysqli installed. What he wants to know is: are there any security risks if he installs those on his server?
Also, if there are not, how does he go about installing them on the server?
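For the install side, on a typical CentOS/RHEL box using the distro's PHP packages it usually comes down to a package install plus an Apache restart. A minimal sketch, assuming yum-based packaging (package names vary by distro and PHP version; on some builds mysqli ships inside the php-mysql package):

```
# Install the extensions from the distro repos (names may differ per distro).
yum install -y php-mbstring php-mysql

# Restart Apache so mod_php picks up the new modules.
service httpd restart

# Verify the extensions are loaded.
php -m | grep -Ei 'mbstring|mysqli'
```

Neither extension is considered a notable security risk by itself; the risk lives in the PHP code that uses them.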
I want to develop and deploy a security strategy to make my single dedi and two VPSes (all with similar hardware configurations, running Linux CentOS 5.2+ with the DirectAdmin control panel and Xen virtualization) as secure as possible, both internally and externally.
I hope you'll freely share your best practices, recognizing that this is the kind of thread multiple members will read for a long time to find out WHO the WHT experts are and what they recommended this newb do. While I hope you'll read the whole post, because I may raise issues you've never thought about or legitimate security issues you've tried to make others aware of but to no avail, I don't expect everyone to respond to every word of this long post. Please feel free to provide solutions-oriented comments and/or constructive direction, based on your area of expertise, only on the specific issues you want to address.
A little background is helpful:
I'm not a reseller nor will I be running anything that needs DDOS-like protection. I'll be running some virtual OS instances, trying out VoIP software and installing and running a virtual Linux desktop from my dedi and creating a mirror for the VPS for my websites, blogs, and email. One VPS will be the slave server to the dedi. I will be running my own DNS, mail and virtual servers on both VPS and the dedi as well. I'll also be backing up data on one of the VPS. All of these activities, I know, present security issues I need to confront.
I'm looking for primarily open source solutions to protect my small server network since first, it fits my budget and, second, I find most proprietary software restrictive and easier to exploit with backdoors, etc. I'd prefer an open source alternative that's of the same high quality and security as a proprietary service. But, if you think a proprietary product or service far outstrips anything open source and you've deployed it for clients or used it for your own servers, let me know. (I prefer to hear actual, first person, end-user accounts/suggestions.)
I'm a quick study--in fact, warp speed--so I can learn what I need to if I have good direction (which is why I came here to ask). But, since I'm not yet an expert, please expect clarification questions.
So, here's what I want to know:
1) I will be logging in via secure, encrypted SSH to run commands and manage software but what's the best secure file and data transfer method/software to use? Can I make SSH more secure? Should I run a VPN from one of the boxes? Is using a secure web interface safe for managing or monitoring my server?
2) What's the best firewall for a dedi and will that firewall work for a VPS?
3) Same question for anti-malware (antispyware/antivirus/antispam) software. I see Kaspersky and Dr. Web a lot, as well as SpamAssassin (which is open source), but what are some other options? Aren't server hackers expecting most servers to have the same protection software, and doesn't that make them easier to hack?
4) What are some of the ways my servers can be exploited? For example, can others use my email servers to send spam or other servers to commit illegal acts? (I want to avoid getting my server taken down or my IPs blacklisted for someone else's activities). How do I prevent such exploitation?
5) What's the best and safest way to backup and/or sync my servers? What kinds of encryption should I use for the data on my servers? My internal servers like mail, file and virtual servers and appliances?
6) Other than software, what are some of the best methods for protecting my servers from DNS attacks, spam, viruses, hacking, etc.? Should I write specific commands into certain files or run them on a bash shell?
7) Are there GOOD websites or blogs that cover this subject? I can't afford to buy a library of books and wouldn't have time to read them. Also, by the time I did, the information would be outdated. I need to keep up. Finally, I learn best by doing and need to hit the ground running; information needs to be somewhat noob-friendly and definitely actionable.
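On the SSH part of question 1: yes, SSH can be tightened considerably, and the usual baseline is key-only logins with direct root logins disabled. A minimal sketch of the relevant /etc/ssh/sshd_config directives (the account name is a hypothetical example; keep a second session open while testing so you don't lock yourself out):

```
# /etc/ssh/sshd_config (excerpt)
Protocol 2
PermitRootLogin no            # log in as a normal user, then su or sudo
PasswordAuthentication no     # public keys only
PubkeyAuthentication yes
AllowUsers admin              # hypothetical admin account name
MaxAuthTries 3
```

For the file-transfer question, SFTP and SCP then ride over the same hardened SSH channel, so no additional daemon is needed.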
Also, what about implementing general server privacy practices? For example, I invest in truly private domain name registration (read: privacyprotect.org) and, in addition, private DNS for my website and blog domain names. I will be employing other (legal) techniques that prevent too much info from being revealed in my email headers without getting my email sent to spam. In some cases, I use encrypted email.
If I'm taking those steps, doesn't it make sense to implement a strategy that prevents as many people as possible from physically locating my servers in the first place--to force them to spend significant time (and money, if they're serious) trying to figure out where my IP addresses go, by using some kind of stealth DNS?
The analogy that comes to mind is using a correctly configured, encrypted, and anonymous VPN, SSH tunnel, or proxy server to mask the IP address that leads to your home ISP and, ultimately, to your house. Not to protect yourself from law enforcement, because if you're doing illegal stuff online you SHOULD be caught, but to protect yourself from nefarious individuals, nosy neighbors, stalkers, or ISPs logging your every internet move. Is there a way to do this with my dedi and VPSes, to prevent unnecessary location discovery and thus targeting, logging, sniffing, etc.?
What other things should I be thinking about? Tell me what I'm missing but please don't just share potential nightmare scenarios without telling me HOW to avoid them.
Again, the advice that's most helpful to me focuses on constructive, actionable solutions; what I CAN do, use, implement, deploy, etc. to develop and execute a strong security strategy for my servers. Again, if you share a negative scenario, please share a positive, effective solution. Tell me how I CAN effectively implement best security practices, even as a noob (since we ALL start as noobs, right?).
I already know this won't be easy but I'm up for the challenge and like the control I'll have managing my own servers. So, I'm also not looking to pay anyone else to manage my digital assets (including my DNS) or for average end-user (retail) solutions designed for truly non-technical folks but ineffective for power users. Been there, done that, lost a lot of data, especially lately.
Finally, though I won't totally cheap out, I don't have thousands of dollars to invest in enterprise-level services I don't need for just one dedi and two small VPSes. To me, in terms of scale, this is not unlike securing my home network of a couple of laptops and a desktop workstation from drive-by hacking and other threats. In addition to using open source software, if I can do something myself, I'd rather do that than pay someone else.
If I can rebuild my Windows desktop from bare metal (more than once, in fact) and install a home network and secure both as well as any service can, I can do this.
I just started using FileZilla Client, as a way of allowing business clients to upload to an ftp account at my website (the ftp account is a subdirectory of my public_html directory, and has its own username and password).
I noticed that, along with other information for each file listed at that subdirectory, FileZilla also posts info on "owner" and "group". It turns out that, for each of these fields, FileZilla displays the username of my entire site -- not the username specifically associated with the particular ftp account to which FileZilla had connected. Thankfully, it doesn't also display the password that goes along with it!
I'm wondering if anyone would know:
- does this constitute a significant security risk?
- is this because of actions on the part of my web host, or because of FileZilla's programming? (i.e., would the same thing occur in all FTP clients?)
- if this is a significant security risk, would there be any workaround?
I'm sure you all may have heard this question before, so I'm sorry if I'm beating a dead horse...just can't seem to find a good answer. I am interested in setting up a fileserver / fileshare on a VPS so that I can create a mapped drive on a windows PC which points to the fileshare on the VPS. I have a client who currently uses a physical server to perform this task, however this physical server is under-utilized and somewhat unnecessary. I mentioned the possibility of moving to a VPS and he seemed interested. I decided to purchase an entry-level account from VPSLAND to use for testing purposes prior to moving forward with the project. I can't seem to get anything to work so I'm looking for ideas.
I purchased a VPSLAND Windows-based EZ-series VPS with Plesk and all the other bundled goodies.
I'm trying to find a low-cost solution for real-time file share replication in a Windows environment.
It doesn't look like there are any open source windows cluster filesystems around, so the only viable option I found would be running OpenFiler in a replication cluster on Hyper-V nodes. Has anyone worked with this, does it work reliably?
The required IO throughput on these shares would really be minimal and my biggest concern is 100% availability.
I'd appreciate it if you could share a 100MB download link that I can use to test Cogent's speed to my network, hopefully from a server plugged into a 100Mbps port at the switch, to see whether it will max out or not.
I recently started building out a new network rack to provide a production web site. The new equipment stack includes a disk array providing a CIFS file share to store images to be served up by Apache.
I have had zero luck in getting Apache to properly access the imagestore from the network share. I've read more Google pages on this subject today than I can count, but I am still not having any success getting this working right.
I'll do my best to explain the configuration.
I have an ESXi host running several virtual machines. Each machine needs to be able to access the shares. Each host has multiple network interfaces, each connected to a separate subnet. The virtual machines are running Windows Server 2012 Datacenter edition.
The disk array is file mode access, with NFS and CIFS shares. It has interfaces on both subnets that each VM can reach. I have established a stand alone CIFS server, with the shares configured. They are accessible from the VMs.
I have mapped the share to a drive letter on the VM client, and it works properly from the logged in account. I have full control over files on the file system (create, modify, delete).
The VM has Apache 2.4.9 installed.
Things I've tried with no success:
- created a symlink to the CIFS-mounted drive in the webroot directory
- added an alias to the CIFS-mounted drive
- added the aliased directory using the <Directory> directive
- added the alias and directory directives using UNC references
I am mostly seeing errors like "path is invalid", but when I try to add the mapped drive (F:) or the UNC-referenced directory, the Apache service won't start.
I added a separate user for the Apache service, and added it to the group that has privileges to talk to the share, still didn't work.
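One thing worth checking: on Windows, mapped drive letters belong to a logon session, so the Apache service (often running as LocalSystem, which has no network credentials) never sees your mapped drive. The usual route is a UNC path in the config, plus running the Apache service as an account that has rights on the share (set via the service's Log On tab in services.msc). A sketch with hypothetical server and share names:

```
# httpd.conf (excerpt) - forward slashes work in Apache paths on Windows
Alias /images "//fileserver/imagestore"
<Directory "//fileserver/imagestore">
    Options Indexes
    Require all granted
</Directory>
```

The "service won't start" symptom with a drive letter is consistent with this: the drive simply doesn't exist in the service's session.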
We have found that we need to limit the amount of CPU usage by users on our video share server. On this server we currently have 20 users on a shared plan. We thought that the obvious bandwidth usage would be the biggest challenge; as it turns out, we haven't gone over the 2 TB that we have.
We have come up with an encoding process that uses the H.264 codec and gives us excellent results in terms of quality, but it is very CPU-intensive, to the point of really slowing down the server when 10 or more users are simultaneously encoding their videos.
Can someone suggest a script that would allow us to limit the file size, in terms of MB/GB, that each user can upload per month?
So, for example, a client pays $10.00 per month and we want to limit their uploads to a total of 900 MB per month, vs. the client paying $50.00 per month who would have the ability to upload, say, 8 GB per month.
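I don't know of an off-the-shelf script for this, but the core check is simple enough to sketch in shell. This assumes each user's uploads land in their own per-month directory (the layout and quota numbers are hypothetical; reset or rotate the directories monthly):

```shell
# check_quota: prints "ok" if an incoming upload fits within the user's
# monthly cap, "over-quota" otherwise.
#   $1 = user's upload directory for the current month
#   $2 = monthly quota in KB
#   $3 = size of the incoming file in KB
check_quota() {
    upload_dir=$1
    quota_kb=$2
    new_kb=$3
    used_kb=$(du -sk "$upload_dir" | awk '{print $1}')  # KB used so far
    if [ $((used_kb + new_kb)) -gt "$quota_kb" ]; then
        echo "over-quota"
        return 1
    fi
    echo "ok"
}
```

Your upload handler would call this before accepting the file, mapping the quota (900 MB vs. 8 GB) from the client's plan.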
A few days ago, a friend of mine studying in America recommended a popular new transfer tool, Qoodaa. He told me it was quite good software for downloading files and movies. At first I was skeptical, but after using it I found Qoodaa to be a good choice. I have summarized some of its features:
1. Its upload speed for movies is faster than any other software I've used before.
2. It can download files quickly through download links; in a word, it is a time-saver with high efficiency.
3. No space limits. No matter where you are, it downloads fast.
4. Qoodaa is lightweight, portable software with high security and easy use.
It really can surprise you.
I am someone who likes to share with others, so if you have something good, please share it with me.
The subject pretty much sums it up: is there a method or solution for multiple websites (which reside on the same dedicated server) to share just one .htpasswd file, or to automate the mirroring of said .htpasswd file?
if so any suggestions for methodology or products that would facilitate this action would be most welcome, thx in advance friends..!
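In most cases nothing needs mirroring at all: AuthUserFile takes an absolute path, so every site on the box can point at a single file kept outside any docroot. A sketch (the shared path is a hypothetical example; the file must be readable by the Apache user):

```
# In each site's .htaccess or <VirtualHost> block
AuthType Basic
AuthName "Restricted"
AuthUserFile /home/shared/.htpasswd   # one central file for all sites
Require valid-user
```

If each site must keep a local copy for some reason, a cron'd rsync or a symlink from each site to the master file would automate the mirroring.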
I currently do daily, weekly, and monthly backups to Rsyncpalace. cPanel does a backup of all user accounts to a folder and they are rsync'd offsite via SSH.
My questions are: Should I be comfortable or concerned that all of my websites' data is neatly bundled, stored in plaintext (tar) format, and protected by only a single login and password?
Am I exposed to any more or less risk of tampering with my data than on my webserver itself?
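If the plaintext tarballs are the worry, encrypting each archive before the offsite rsync is cheap. A minimal sketch using openssl's symmetric encryption (passphrase handling is simplified for illustration; in practice read it from a root-only file rather than putting it on the command line):

```shell
# Encrypt a backup tarball with AES-256-CBC before shipping it offsite,
# and decrypt it again on restore. Add -pbkdf2 on newer openssl versions.
encrypt_backup() {                     # $1 = file, $2 = passphrase
    openssl enc -aes-256-cbc -salt -pass pass:"$2" -in "$1" -out "$1.enc"
}
decrypt_backup() {                     # $1 = .enc file, $2 = passphrase
    openssl enc -d -aes-256-cbc -pass pass:"$2" -in "$1" -out "${1%.enc}.out"
}
```

The offsite host then only ever holds ciphertext, so a compromise of that single login exposes far less.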
I am planning to use CGI for my web installations and there appears to be a whole lot of conflicting info about setting file permissions in the user's folder.
What are the permissions actually required for reading and writing into the web users directory?
A lot of them say 755, but that doesn't make sense to me, as it gives every user on the system read access to the whole web directory tree.
Other than the initial index .php, .cgi, or other files that need to be read by the webserver process, shouldn't every other file be 700 or 600, since every subsequent file access is done under the control of the CGI program?
Unless a file is to be served directly by the web server process and is not in a ScriptAlias directory or is not marked as a CGI shouldn't the permissions on that file be 600 or 700?
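That reasoning holds under suexec-style setups, where the CGI runs as the file owner. The tightening step can be sketched like this (the paths and the .cgi-only naming convention are assumptions about your layout):

```shell
# lock_down: owner-only permissions for a web user's tree.
#   CGI programs stay executable by their owner; other files become
#   owner read/write; directories allow traversal but not listing.
lock_down() {
    find "$1" -type f -name '*.cgi' -exec chmod 700 {} +
    find "$1" -type f ! -name '*.cgi' -exec chmod 600 {} +
    find "$1" -type d -exec chmod 711 {} +
}
```

Files the web server process itself must read directly (a static index page, images) would stay at 644, per the exception you describe.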
I'd also like to know if there are some guides as to how the CGI security issues operate.
My website, a free classified ads site, is hosted by XO, the hosting company. I'm introducing a feature where advertisers can, for free, post pictures of the things that they're advertising -- that is, where advertisers can upload a JPEG or a GIF. I understand that this can open my site up to the uploading of malicious code, and that I should put safeguards in place to make sure that only JPEGs and GIFs get uploaded. However, I'm wondering if XO doesn't include some built-in safeguards that would keep malicious code from getting executed. In other words, since a professional hosting company runs the servers -- not me -- do I need to be worried about security at all?
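You still need to worry: shared hosts rarely validate uploads for you, and an extension check alone is easy to spoof. Checking the file's magic bytes is a common second gate; a sketch of that check in shell (your real upload handler would do the equivalent in whatever language it's written in):

```shell
# is_image: succeed only if the file starts with a JPEG or GIF signature.
is_image() {
    # First 6 bytes as lowercase hex, e.g. "474946383961" for "GIF89a".
    sig=$(head -c 6 "$1" | od -An -tx1 | tr -d ' \n')
    case "$sig" in
        ffd8ff*)                   return 0 ;;  # JPEG SOI marker
        474946383761|474946383961) return 0 ;;  # GIF87a / GIF89a
        *)                         return 1 ;;
    esac
}
```

Also store uploads outside any script-executable directory, so that even a file that slips through can't be executed by the web server.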
Code:
[root@ns1 conf.d]# cat /etc/httpd/conf.d/php.conf
#
# PHP is an HTML-embedded scripting language which attempts to make it
# easy for developers to write dynamically generated webpages.
#
LoadModule php5_module modules/libphp5.so
SetOutputFilter PHP
SetInputFilter PHP
AddType application/x-httpd-php .php
AddType application/x-httpd-php-source .phps

This prevents file.php.gif from executing on a non-chrooted site, but on my chrooted sites, file.php.gif will execute as a PHP file. Any idea why? Is there some other config I have to change?
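A common cause with AddType-based setups is that Apache treats every dot-suffix as an extension, so file.php.gif can still pick up the PHP type under some configurations. Binding the handler only to names that end in .php sidesteps that; a sketch of a replacement for the two AddType lines (adjust the handler name to match your PHP build):

```
# Only files whose final extension is .php are handed to PHP.
<FilesMatch "\.php$">
    SetHandler application/x-httpd-php
</FilesMatch>
```

Why the chroot changes the behaviour is hard to say without seeing the chrooted vhost config; it's worth confirming which php.conf the chrooted sites actually load.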
It is currently with company A who charge quite a lot to keep it there. I want to move it to company B who are my hosts and with whom I get 1 free domain name.
One added complication is that the domain is in a friend's name, but I have the logon and can change the name to my own any time I want.
Company B said "After it is on our registrar, you will be the only one that can renew it as long as it doesn't expire for longer than 90 days."
This has me worried: if I haven't renewed it within 90 days, can it be stolen from me? Have I misunderstood, or is this a risk?
If so, would I be better advised to renew it in my friend's name with company A?
Anyone have a cPanel server running 64-bit CentOS or Fedora? I would appreciate a phpinfo; you can remove it afterwards. I need to debug something on 2 of my servers.
I'm a Windows guy and know little or nothing about Linux. How big a risk am I taking if I'm using a Linux VPS and never update/patch the kernel?
I'm using CentOS 5 and LxAdmin. I can update the control panel, but I cannot update/patch the kernel, since I don't know how to do that.
I'm using a unmanaged plan, so no help there.
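For reference, on a CentOS 5 box where the VPS runs its own kernel (Xen/KVM; under OpenVZ the host's kernel is shared and you can't patch it yourself), the update itself is short. A sketch, assuming yum:

```
# Fetch the latest kernel package and boot into it.
yum update -y kernel
reboot

# After the reboot, confirm the running version.
uname -r
```

An unpatched kernel mainly exposes you to local privilege escalation, so the risk grows with every account or script that can run code on the box.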
Some of my sites are running WordPress, but I'm always using the latest WP installation. I'm not using any plugins other than WG2, Gallery2, and Remove Max Width.
Nobody except me has access to the VPS, and I have no other FTP accounts or anything like that on the VPS.
I have no other scripts or any kind of dynamic pages on my VPS.
What kind of risk do I have here?
I currently plan to cancel my second VPS, which runs Win2003, and use only Linux in the future. I can cut my monthly expenses by 50% that way, but am I taking a big risk doing it?
I find that lots of hosts don't put the link to a phpinfo() script on their websites, even though that would save everyone a lot of unnecessary questions.
So I figure we should have a thread where people can add links to phpinfo scripts on their own hosts, or hosts where they happen to have the URL of a phpinfo script.
For those of you who own your own web hosting services, here's a chance to show off all your installed goodness, and all it takes is placing a simple link here (and put one in your FAQ section on your web site too, for the love of God)..
Since the forum doesn't allow editing your own posts (I still think that is nuts), please include all previous links in your post, so visitors will only need to read the LAST post to find all the updated links in one place.
mysql
MySQL Support: enabled
Active Persistent Links: 0
Active Links: 0
Client API version: 4.1.21
MYSQL_MODULE_TYPE: external
MYSQL_SOCKET: /var/lib/mysql/mysql.sock
MYSQL_INCLUDE: -I/usr/include/mysql
MYSQL_LIBS: -L/usr/lib/mysql -lmysqlclient

mysqli
MysqlI Support: enabled
Client API library version: 4.1.21
Client API header version: 4.1.20
MYSQLI_SOCKET: /var/lib/mysql/mysql.sock
At present I run SSH on a different port than normal to protect root. This has worked for two years, but having discovered that cPanel finally supports SFTP without needing shell access, I want to finally turn off FTP and require SFTP. The problem is the port I am using. Since it's a random port, I have been secure against root attacks (well, nothing has shown up). I am with LiquidWeb, which is fully managed, so I guess they take care of a lot of prevention.
This is what I am thinking of doing: move SSH back to port 22 (I only host a few friends' sites and want to be hosting 20 accounts by the end of the year to cover my costs), then disable the root password and require SSH keys. Would this be as secure as running SSH on a high-numbered port, or am I fooling myself?
I could also, for good measure, restrict root SSH/SFTP access (yes, I prefer SFTP for file management, as I am legally blind and using Transmit+BBEdit is a lot easier for me when editing files). The problem with restricting to certain IPs is that Shaw charges $30/month more for a static IP, and I am also at my mom's 25% of the time (and she is also with Shaw). I think the XXXX.vs.shawcable.net hostname is static, but I am not 100% sure.
I really do want to kill FTP, so that port 80 is the only non-SSL port open.
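For what it's worth, key-only auth on port 22 is generally considered at least as strong as password auth on a high port; the non-standard port mostly just cuts scanner noise, while keys remove the guessable secret entirely. The client-side half of the switch is brief (the hostname is a placeholder):

```
# On your Mac: generate a keypair and install the public half on the server.
ssh-keygen -t rsa -b 4096             # choose a passphrase
ssh-copy-id user@server.example.com   # appends to ~/.ssh/authorized_keys

# Then on the server, in /etc/ssh/sshd_config:
#   PasswordAuthentication no
#   PermitRootLogin no
# and: service sshd restart  (keep an existing session open while testing)
```

SFTP in Transmit then authenticates with the same key, so your workflow doesn't change.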