I would like to build a personal website powered by a content management system, and I'm wondering: how many SQL Server databases does a site like that require? At GoDaddy.com, the Windows plans offer one, two or three SQL Server databases and at least 10 MySQL databases.
My question is about Microsoft Access databases and ASP.NET. I know that you need SQL Server installed for your website to communicate with any SQL Server databases it uses. Do you also need Microsoft Access installed on a server where you are accessing an MS Access database?
How can I check (using SSH) which databases/users are causing server load on MySQL?
I've tried "mysqladmin proc stat", but that only shows current activity. How can I get stats for, say, the last 24 hours?
I've also seen slow query stats. What is the command for a more detailed report on the slow queries, i.e. which databases caused them, again over the last 24 hours for example?
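MySQL doesn't keep a per-database load history by default, so the usual approach is to enable the slow query log, let it collect for a day, and then summarise it. A rough sketch, assuming MySQL 5.1 or later; the log path and threshold are just examples:

Code:
# /etc/my.cnf, [mysqld] section (restart MySQL afterwards):
#   slow_query_log      = 1
#   slow_query_log_file = /var/log/mysql/mysql-slow.log
#   long_query_time     = 2

# after a day or so, summarise the ten slowest query patterns
mysqldumpslow -s t -t 10 /var/log/mysql/mysql-slow.log

# current activity per user/database (same data as mysqladmin proc)
mysql -u root -p -e "SHOW FULL PROCESSLIST;"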
I am using Windows XP Pro, and since I updated to Internet Explorer 7 I have been having problems getting websites to work on my PC in Internet Explorer, served by the local IIS web server. These websites use ASP and Access databases.
The two errors I get are:
Operation must use an updateable query.
or
Cannot modify the design of table 'TABLE_NAME'. It is in a read-only database. (they are not read only!)
Does anyone know how I can get this working so I can test my web sites locally before uploading them to the live sites?
Since I have never worked on the server end of things I had a quick question for all you web hosting gurus.
Is it possible to have PHP installed on ONE single server and still have the ability for the server to work with both MS Access AND MySQL at the same time?
I would think YES, but I am being told by our server branch at my current job that this is not the case. They claim there is no way for the server on one machine to be able to handle both types of databases. Are they right?
If they are wrong and it is possible to have one server run both types of databases, what steps would be necessary to set the server up to handle both? Do we need to tweak the php.ini file, or is there another way of letting the server work with both MySQL and MS Access?
Sorry if this question seems stupid or odd; as I said, I have minimal experience on the server end, but I am confident that a server can handle both.
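For what it's worth, PHP reaches MySQL through the mysql/mysqli extension and Access through ODBC, and both can be enabled in the same php.ini at the same time, so a single server can certainly work with both. A quick sketch of how you might check what is already loaded (extension file names vary by platform, and on a Linux box you would also need an ODBC driver that can read .mdb files, such as mdbtools):

Code:
# list the database-related extensions PHP currently has loaded
php -m | grep -i -E 'mysql|odbc|pdo'

# if one is missing, enable it in php.ini and restart the web server, e.g.
#   extension=mysqli.so
#   extension=odbc.so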
I have upgraded my Plesk MySQL installation from 5.1 to 5.6 and am having trouble setting the utf8mb4 charset for the server and databases. Special characters still display badly on my websites. I'm running Windows Server 2012 R2 and Plesk 12.
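In case it helps, this is the usual checklist for utf8mb4 on a stock MySQL 5.6; the my.ini location depends on your Plesk setup, and mydb/mytable below are placeholders. The application's own connection also has to use utf8mb4 (e.g. SET NAMES utf8mb4), otherwise the characters will still come out mangled.

Code:
# my.ini, [mysqld] section - then restart the MySQL service:
#   character-set-server = utf8mb4
#   collation-server     = utf8mb4_unicode_ci

# convert an existing database and table
mysql -u admin -p -e "ALTER DATABASE mydb CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;"
mysql -u admin -p -e "ALTER TABLE mydb.mytable CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;"

# verify what the server now reports
mysql -u admin -p -e "SHOW VARIABLES LIKE 'character_set%';"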
Do you want to run an online e-commerce store? Create a forum? Or develop the next Facebook or MySpace? If you answered yes to any of these questions, then there are very important factors you need to consider before choosing a host.
You have to understand that there's a difference between the number of databases and database size. UNLIMITED databases means you can create as many databases as you want, but of what use are unlimited databases if the disk space allowed for them is just 50MB, for example? Don't be deceived by the number of databases a web host is willing to churn out. Do a little research and ask questions. After all, you might not need unlimited databases in the first place.
So your host keeps at least a daily backup of your database, but if you don't get to that backup in time, you may find that all they have is a backup of an already crashed database. That means it's important to have a procedure for taking regular backups by other means too. How are you doing this?
We have been trying to import SQL databases through cPanel.
The old server the client was on did not have any control panel, while the new server the client is migrating to has cPanel. We try importing the SQL databases through cPanel, but because they are so large, they get corrupted during the import.
On the client's old server he was able to access all his SQL databases through FTP.
With this new cPanel server, we can't upload SQL databases by FTP. We want to use FTP because when we use cPanel, the databases corrupt.
So is there any way to upload SQL databases with FTP directly into MySQL on a cPanel server?
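If you have (or can request) SSH access on the new cPanel server, you can bypass the browser upload entirely: push the dump file up over FTP or SCP, then feed it straight to MySQL on the command line, which avoids the PHP upload and timeout limits that usually truncate big imports. A sketch with placeholder names:

Code:
# after uploading bigdump.sql (or bigdump.sql.gz) via FTP/SCP
mysql --user=cpuser_dbuser --password=dbpass cpuser_dbname < bigdump.sql

# or, if the dump is gzipped
gunzip < bigdump.sql.gz | mysql --user=cpuser_dbuser --password=dbpass cpuser_dbname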
Does anyone know anything about how to set up a MySQL database? I installed this script for a gallery for WordPress, but it won't create its own database tables. I already set a database up myself, but I'm still getting this error:
Quote:
SELECT * FROM wp_fim_cat ORDER BY date DESC
Is there something I'm supposed to add inside each database table? And how do I add it to make it work?
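One quick thing to check, if you have shell access, is whether the plugin's table was ever created and whether your WordPress database user is even allowed to create tables; the user and database names below are placeholders for your own:

Code:
mysql --user=wpuser --password=wppass wordpressdb -e "SHOW TABLES LIKE 'wp_fim_cat';"
mysql --user=wpuser --password=wppass wordpressdb -e "SHOW GRANTS;"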
Recently I have noticed at least a couple of SQL 2000 databases hacked on different servers. The hacker inserts JavaScript code whose source is a [url] file. The code is inserted into all records of some tables, especially tables that have HTML code in their records. We are running SQL 2000 on a non-default, otherwise unused port, we have firewalls enabled on all the servers, and we also have the latest updates and fixes applied. Can anybody throw some light on this and help me find the cause, and a solution to prevent it from happening again?
I have most of my sites on the very reliable clook.co.uk, but I only get around a gig of bandwidth per month per site.
I'm thinking of getting an unlimited hosting package so I could upload client's music files, large images, video clips etc.. Nothing is illegal or adult. Then I could host my sites as normal at Clook and then point to all the very large files being hosted at the new place.
I think I'd get about 10 gig of traffic a month, max. And as it'll just be an extended online hard drive, I've no real need for databases, email, PHP, cPanel etc...
I've seen some unlimited offers at hostpapa.co.uk for 4.99 a month with a free domain. Is there a catch? What do you recommend?
I've recently been asked to do a lot of data extraction from a state that has about 20 databases, each with between 10 and 100 tables. I often find myself diagramming things out on paper to try to visualize how everything works together.
I wondered if there's a tool that would "draw" the tables, columns, and relationships? I hesitate to say this, but almost like how MS Access does it, except something that runs on Linux and works with MySQL. Is there such a thing? I know about phpMyAdmin and MySQL's Query Browser, but they're not what I'm looking for.
I am running a dedicated server with Debian, and I installed community software that has a lot of MySQL entries, many of which need to be changed to fit my needs.
However, it is very hard to know exactly where each value I need to change is stored. Is there a way to search all database tables for a specific value?
For example, one thing that is stored in the database is the site's title displayed in the browser's title bar. The software does not give me the option to change it, so I have to find where it is located in the database and change it myself, but it would be extremely time-consuming to check all the tables one by one for any occurrences of the current title.
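One low-tech way to do it, assuming shell access (the database name, credentials and title string below are placeholders): dump the database to a text file and grep for the title; the table name appears in the INSERT statements around each match.

Code:
mysqldump --user=dbuser --password=dbpass communitydb > /tmp/communitydb.sql
grep -n "My Current Site Title" /tmp/communitydb.sql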
My HD was failing. So ServerBeach quickly set me up with a fresh box with the dying drive installed as a secondary drive. (nice job guys)
I'm able to mount the secondary drive and browse/copy the files... but when I get to the most important files of all, the database binaries, my /var/lib dir looks like this:
Code:
[root@rosemary v2]# ls -l /mnt/dying/var/lib/
total 68
...
-rw-r--r--  1 root root 2171 Sep 14 03:02 logrotate.status
drwxrwsr-x  6 root 41   4096 Oct 12  2005 mailman
drwxr-xr-x  2 root root 4096 Oct 12  2005 misc
?---------  ? ?    ?       ?            ? mysql
drwxr-xr-x  4 root root 4096 Oct 12  2005 nfs
drwxr-xr-x  2 ntp  ntp  4096 Sep 14 10:31 ntp
...

And if I try to cd into the dir, I get this:

Code:
[root@rosemary v2]# cd /mnt/dying/var/lib/mysql
-bash: cd: /mnt/dying/var/lib/mysql: Input/output error

I really *really* need this data!
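If it were my data I would stop reading the failing disk directly. A common approach, sketched below with example device names and paths for your setup, is to unmount the dying drive, image it onto the healthy disk with GNU ddrescue, then run fsck on the image and mount the copy read-only, leaving the original untouched:

Code:
umount /mnt/dying
# copy whatever is still readable onto the new disk first
ddrescue -n /dev/sdb1 /recovery/dying.img /recovery/dying.map
# repair and mount the image, never the original
fsck -y /recovery/dying.img
mkdir -p /mnt/recovered
mount -o loop,ro /recovery/dying.img /mnt/recovered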
I host with HostGator, and I was wondering if there are any software programs or services, or even something in my cPanel, that can automatically grab my MySQL databases and everything else on my server and make a backup on my personal PC.
I know I can do this manually, but I would like something that does it automatically once a week or so, to ensure my clients' data is always backed up.
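Not cPanel-specific, but one way people do this, assuming your plan has SSH enabled and you have an always-on machine at home that can run cron (all names and paths below are examples):

Code:
# run weekly from cron on the home machine; pulls a compressed dump over SSH
ssh cpaneluser@yourdomain.com "mysqldump --user=dbuser --password=dbpass dbname | gzip" > ~/db-backups/dbname-$(date +%F).sql.gz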
I've been running a forum for a while now, which of course had a database. Now that I've sold the vBulletin license, I'd like to delete the database that was created for it. How do I do that?
I've looked through cPanel but couldn't find an option for deleting a whole database. I might be looking in the wrong sections or something. Can someone help me out?
It would be even better if it were possible to reset my whole server through cPanel. Is this possible?
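In cPanel the delete option is on the MySQL Databases page, under "Current Databases". If you prefer SSH, a one-liner does it too; the database name below is a placeholder, and bear in mind this cannot be undone:

Code:
mysql -u root -p -e "DROP DATABASE username_vbulletin;"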
I have a corrupted VPS with some MySQL databases on it.
I want to back up the databases and restore them on the new server.
The cPanel service is down on the VPS, so I cannot transfer the accounts.
Is it possible to do it like this?
1- Zip or tar the username folder in the /home dir.
2- Zip or tar the database folder in /var/lib/mysql.
Either by SSH or the file manager through HyperVM.
Then transfer the files somewhere safe...
When the VPS is rebuilt, restore the archives and databases to the folders I backed up before the rebuild (a rough sketch of the commands is below).
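Roughly, the commands would look like this (the user and database names are placeholders, and the init script may be called mysqld on your distro). Note that copying the raw files out of /var/lib/mysql is only reliable if MySQL is stopped first and the new server runs the same MySQL version; for InnoDB tables you would need the whole /var/lib/mysql directory (including ibdata/ib_logfile) or a mysqldump instead.

Code:
# on the broken VPS
/etc/init.d/mysql stop
tar -czf /root/username-home.tar.gz -C /home username
tar -czf /root/dbname-mysql.tar.gz -C /var/lib/mysql dbname
/etc/init.d/mysql start

# on the rebuilt VPS, after copying the archives over
tar -xzf /root/username-home.tar.gz -C /home
tar -xzf /root/dbname-mysql.tar.gz -C /var/lib/mysql
chown -R mysql:mysql /var/lib/mysql/dbname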
There are also some accounts on a terminated VPS of which only an image is available, so for those I can only tar the files using a jailed shell account, move them to the new VPS and untar them.
My site is database driven and runs on around 15 MySQL databases. I want to download a local copy of these databases daily; however, if I try to back them up via cPanel (on a WHM VPS) I get blank files. Each database is around 55MB and growing. I can back them up one by one via phpMyAdmin, but it takes around 5 minutes per database, meaning around 40 minutes per night.
Is there any solution for having a script dump them automatically so I can then download them via FTP? I've tried [url].htm but it gives me blank files.
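One way to do it, sketched under the assumption that the VPS lets you run cron jobs and shell scripts (database names, credentials and paths are examples): dump each database with mysqldump from a nightly cron job into a non-web-accessible folder, then pull the .gz files down over FTP.

Code:
#!/bin/sh
# dump-dbs.sh - run nightly from cron, then fetch the files over FTP
backupdir=/home/cpaneluser/db-backups
mkdir -p "$backupdir"
for db in db1 db2 db3
do
    mysqldump --user=dbuser --password=dbpass "$db" | gzip > "$backupdir/$db-$(date +%u).sql.gz"
done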
I have one server with 500GB disks. The server is running HyperVM. My secondary backup disk is mounted at /backup. On the main node I installed ProFTPd, and I'm already making backups of cPanel coming through a VPS hosted on that server. I would like all backups made by users, or automated backups made by HyperVM, to go to /backup, so I did this:
1) I would like to make backups of the HyperVM databases. I did this:
[url]
Is this correct?
2) I went to Home/Central BackupServers and added:
[url]
Is this correct?
3) Then I went to Servers/localhost/Central Backup Config and selected 127.0.0.1 from the list.
[url]
Is this correct?
Is that all I need to do to make my backups, or is there anything left? If disk one fails, will I be able to restore the server after reinstalling HyperVM?
I am trying to develop a new way to secure my server, but I couldn't find any way to search for a specific word across the whole MySQL server (all databases).
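A crude but effective trick, assuming root shell access (the credentials and the search word are placeholders): dump the databases as plain SQL and grep the output, either all at once or database by database so you can see where the hits are.

Code:
# everything at once
mysqldump --all-databases --user=root --password=rootpass | grep -i "theword"

# or per database, just reporting which ones contain the word
for db in $(mysql -u root -prootpass -N -e "SHOW DATABASES;")
do
    mysqldump -u root -prootpass "$db" 2>/dev/null | grep -qi "theword" && echo "found in: $db"
done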
I have two servers which both run (very) large databases. To put that into context, one has about 40,000,000 database entries whilst the other has around 100,000,000.
These both currently run on 250GB SATA-IIs. I'm considering swapping in a 147GB 15k SAS drive in place of the 250GB SATA-IIs for the live databases.
Would this be beneficial? I've no experience with faster HDDs, especially on servers.
The reason I ask is that I have some web applications which query these databases and often take a while (minutes) to look data up.
Here is how to back up MySQL databases by cron and have the backups sent to you by email, or uploaded by FTP. It is based on a script I found on another website, but I can confirm it is fully working.
Change the commented variables in the following file and save it as backup.sh:
Code:
#!/bin/sh

# This script will backup one or more mySQL databases
# and then optionally email them and/or FTP them
#
# This script will create a different backup file for each database, by day of the week,
# i.e. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1).
# This is a trick so that you never have more than 7 days worth of backups on your FTP server:
# as the weeks rotate, the files from the same day of the previous week are overwritten.
#
# Example cron line:
# /bin/sh /home/user/directory/scriptname.sh > /dev/null

############################################################
#===> site-specific variables - customize for your site

# List all of the MySQL databases that you want to backup here,
# each separated by a space.
# If not run by root, only one db per script instance.
databases="mydbname"

# Directory where you want the backup files to be placed
backupdir=/home/mydomain/backups

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

# Other programs used by the script, full path names here too
gzip=/usr/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail

# MySQL username and password
userpassword=" --user=myuser --password=mypasswd"

# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

# Send backup? Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="mySQL Backup"
mailto="me@mydomain.com"

#===> site-specific variables for FTP
ftpbackup="y"
ftpserver="myftpserver.com"
ftpuser="myftpuser"
ftppasswd="myftppasswd"
# If you are keeping the backups in a subdir of your FTP root
ftpdir="forums"

#===> END site-specific variables - customize for your site
############################################################

# Get the day of the week (0-6).
# This allows us to save one backup for each day of the week.
# Just alter the date command if you want to use a timestamp instead.
DOW=`date +%w`

# Create our backup directory if it is not already there
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
  echo "Not a directory: ${backupdir}"
  exit 1
fi

# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
  $mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
done

# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
  rm -f ${backupdir}/${DOW}-${database}.sql.gz
  $gzip ${backupdir}/${DOW}-${database}.sql
done

# Send the backups via email
if [ $sendbackup = "y" ]
then
  for database in $databases
  do
    $uuencode ${backupdir}/${DOW}-${database}.sql.gz ${DOW}-${database}.sql.gz > ${backupdir}/${database}.sql.gz.uu
    $mail -s "$subject : $database" $mailto < ${backupdir}/${database}.sql.gz.uu
  done
fi

# FTP it to the off-site server
echo "FTP file to $ftpserver FTP server"
if [ $ftpbackup = "y" ]
then
  for database in $databases
  do
    echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
    ftp -n $ftpserver <<EOF
user $ftpuser $ftppasswd
bin
prompt
cd $ftpdir
lcd ${backupdir}
put ${DOW}-${database}.sql.gz
quit
EOF
  done
fi

# And we're done
ls -l ${backupdir}
echo "Dump Complete!"
exit

Upload backup.sh to your server, to any directory you want. A directory that is not web-accessible will stop your login information from being seen by just anyone.
The script contains your MySQL and FTP passwords, so rather than making it world-readable with 777, make it executable for your user only:

Code:
chmod 700 backup.sh

If you uploaded this file from a Windows machine you will need to convert it to Unix format. Run the following command by SSH in the appropriate directory:
Code:
dos2unix backup.sh

If you don't have dos2unix installed, you can install it using yum if you have that:

Code:
yum install dos2unix

If you don't have yum, get it here.
You may want to test the script at this point to make sure it's doing what you want it to. Change to the appropriate directory and run this command:
Code:
./backup.sh

Once you're happy with it, enter it into the crontab to run daily (or whenever you want). Cron jobs vary a lot depending on the configuration of your system, so check Google for how to do it on your system. The command you will need to run by cron is:
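(Going by the comment at the top of the script; the path is only an example, so point it at wherever you saved backup.sh.)

Code:
/bin/sh /home/user/directory/backup.sh > /dev/null

So a crontab entry to run it every night at 3am would look something like this:

Code:
0 3 * * * /bin/sh /home/user/directory/backup.sh > /dev/null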