Configuring Server To Handle Both MySQL And MS Access Databases
Mar 24, 2008
Since I have never worked on the server end of things I had a quick question for all you web hosting gurus.
Is it possible to have PHP installed on ONE single server and still have the ability for the server to work with both MS Access AND MySQL at the same time?
I would think YES, but I am being told by our server branch at my current job that this is not the case. They claim there is no way for the server on one machine to be able to handle both types of databases. Are they right?
If they are wrong and it is possible for one server to run both types of databases, what steps would be necessary to set it up that way? Do we need to tweak the php.ini file, or is there another way to let the server work with both MySQL and MS Access?
Sorry if this question seems stupid or odd; as I said, I have minimal experience on the server end, but I am confident that a server can handle both.
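For what it's worth, a rough sketch of what the setup usually amounts to, assuming a Windows server (Access normally requires Windows/ODBC) and a typical PHP 5 install; the extension file names below are the common defaults and may differ on your build:
Code:
# One PHP install can load both drivers side by side.
# In the php.ini that the web server reads, enable both extensions:
#   extension=php_mysql.dll   ; talks to MySQL
#   extension=php_odbc.dll    ; talks to Access .mdb files through an ODBC DSN
# Restart the web server, then list the modules PHP has loaded:
php -m
# Both "mysql" and "odbc" should appear in the output.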
My question is regarding Microsoft Access databases and ASP.NET. I know that you need SQL Server installed so that your website can communicate with any SQL Server databases in use. Do you need Microsoft Access installed on a server where you are accessing an MS Access database?
I am using Windows XP Pro, and since updating to Internet Explorer 7 I am having problems getting websites to work in Internet Explorer against the local IIS web server. These websites use ASP and Access databases.
The two errors I get are:
Operation must use an updateable query.
or
Cannot modify the design of table 'TABLE_NAME'. It is in a read-only database. (they are not read only!)
Does anyone know how I can get this working so I can test my web sites locally before uploading them to the live sites?
I have set up PHP and MySQL on a Windows IIS 6 server. PHP installed fine and works, but I have two problems:
1. When I browse to http://localhost I get a 403 error, but browsing to http://localhost/index.php works fine. It makes me think the server does not know which default document to serve when a directory is requested.
2. I have phpMyAdmin installed and the config file is set to 'config' authentication. That works, but every time I submit a form to create a new database (or do anything else) I get prompted for an HTTP username/password. I'm not sure if this is a phpMyAdmin issue or a PHP configuration issue.
How can I check (over SSH) which databases/users are causing load on MySQL?
I've tried "mysqladmin proc stat" but it only shows the current state. How can I get stats for the last 24 hours, for example?
I've also seen slow-connection stats. What is the command to get a more detailed report of the slow connections: which databases caused them, etc., again for the last 24 hours?
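A rough sketch of the usual approach, assuming shell access to my.cnf and permission to restart MySQL (the log path below is an assumption, and older MySQL versions use log-slow-queries where newer ones use slow_query_log):
Code:
# What is running right now, per user/database:
mysqladmin -u root -p processlist

# For history, enable the slow query log in /etc/my.cnf ([mysqld] section):
#   log-slow-queries = /var/log/mysql-slow.log
#   long_query_time  = 2
# then restart MySQL so it takes effect.

# After it has collected for a day or so, summarise the worst offenders:
mysqldumpslow -s t /var/log/mysql-slow.log | head -40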
I recently had a hard drive failure, and luckily I can still access certain directories on the failed drive. I can still access the /var/lib/mysql/ directory, which holds all the users' databases, and I have backed these up separately using tar.
Now what I need to know is: how do you restore these database files to another server? I tried simply untarring one of them into the new server's /var/lib/mysql/ directory and it broke MySQL - it went offline. I had to get a cPanel tech to bring MySQL back online.
How can I get these database files to fully work on a new server?
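A minimal sketch of how raw datadir files are usually moved, assuming both servers run a compatible MySQL version and the tables are MyISAM (InnoDB tables also need the ibdata/ib_logfile files from the old datadir); "dbname" is a placeholder:
Code:
# On the new server: stop MySQL before touching the datadir
/etc/init.d/mysql stop        # or: service mysql stop

# Extract the per-database directory into the datadir
tar -xzf dbname.tar.gz -C /var/lib/mysql/

# The files must be owned by the mysql user or the server will choke on them
chown -R mysql:mysql /var/lib/mysql/dbname

/etc/init.d/mysql start
mysqlcheck -u root -p dbname   # sanity-check the restored tables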
I'm using Windows XP and installed Apache, PHP 5 and MySQL 5. They were all working before, but recently I had to format my hard drive. Fortunately, I had an image of my computer from before, but MySQL was not installed when that image was created.
After deploying that image, everything seems to be working fine. The server is running and PHP scripts are executing as well. But I can't seem to get MySQL to work. I've followed many tutorials online, but have had no luck.
I have edited the php.ini file to point to the correct directory that holds the extensions (DLLs). I also enabled the following:
I already tested whether MySQL was correctly installed by using the command-line client, and it seems to be installed. I was able to log in and see the default databases.
But when I run the following PHP script:
<?php phpinfo(); ?>
I don't see the MySQL section anywhere, which is why I'm assuming that PHP is not currently working with MySQL for me. Interestingly, I do see the "mbstring" section, which I had assumed would only show up if MySQL were working with PHP. But nothing else related to MySQL shows up. I've attached part of the screenshot if it helps.
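A quick sketch of the usual checks on a Windows PHP 5 setup; the paths and extension file names below are the typical ones and may differ on yours:
Code:
# Which php.ini is actually being loaded? Apache sometimes reads a different
# one (e.g. C:\Windows\php.ini) than the PHP command line does.
# (php --ini needs PHP 5.2.3 or later; run php from its install dir if it is
# not on your PATH.)
php --ini

# List the modules the command line sees; "mysql"/"mysqli" should be in here:
php -m

# In the php.ini that the *web server* loads, these must be present and uncommented:
#   extension_dir = "C:\php\ext"
#   extension=php_mysql.dll
#   extension=php_mysqli.dll
# libmysql.dll must also be findable (add the PHP directory to PATH),
# and Apache must be restarted after any php.ini change.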
I've recently been asked to do a lot of data extraction from a state that has about 20 databases, each with between 10 and 100 tables. I often find myself diagramming things out on paper to try to visualize how everything works together.
I wondered if there's a tool that would "draw" the tables, columns, and relationships? I hesitate to say this, but almost like how MS Access does it - only one that runs on Linux and works with MySQL. Is there such a thing? I know about phpMyAdmin and MySQL's Query Browser, but they're not what I'm looking for.
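Not an answer to the tool question itself, but as a hedged illustration of the first step most reverse-engineering/diagramming tools need: they generally only want the schema or the foreign-key relationships, both of which you can pull from the command line ("dbname" is a placeholder, and the second query requires MySQL 5.0+ with InnoDB foreign keys):
Code:
# Dump structure only (no rows) for one database, ready to feed into a
# diagramming tool:
mysqldump --no-data -u root -p dbname > dbname-schema.sql

# Or list the declared relationships directly:
mysql -u root -p -e "
  SELECT table_name, column_name, referenced_table_name, referenced_column_name
  FROM information_schema.key_column_usage
  WHERE referenced_table_name IS NOT NULL
    AND table_schema = 'dbname';"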
I am running a dedicated server with Debian, and I installed a community software package that has a lot of MySQL entries, many of which need to be changed to fit my needs.
However, it is very hard to know exactly where each value I need to change is stored. Is there a way to search all database tables for a specific value?
For example, one thing that is stored in the database is the site's title displayed in the browser's title bar. The software does not give me the option to change it, so I have to find where it lives in the database and change it myself, but it would be extremely time-consuming to check all tables one by one for any occurrences of the current title.
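A crude but effective sketch, assuming shell access and that the value is stored as plain text ("mydb" and the title string are placeholders):
Code:
# Dump the whole database as plain SQL and grep for the string; the matching
# INSERT line tells you which table (and usually which column) to edit.
# --skip-extended-insert puts one row per INSERT line, which keeps the grep
# output readable.
mysqldump --skip-extended-insert -u root -p mydb > /tmp/mydb.sql
grep -n "My Current Site Title" /tmp/mydb.sql

rm /tmp/mydb.sql   # the dump may contain sensitive data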
My HD was failing. So ServerBeach quickly set me up with a fresh box with the dying drive installed as a secondary drive. (nice job guys)
I'm able to mount the secondary drive and browse/copy the files... but when I get to the most important files of all, the database binaries, my /var/lib dir looks like this:
Code:
[root@rosemary v2]# ls -l /mnt/dying/var/lib/
total 68
...
-rw-r--r--  1 root root 2171 Sep 14 03:02 logrotate.status
drwxrwsr-x  6 root 41   4096 Oct 12  2005 mailman
drwxr-xr-x  2 root root 4096 Oct 12  2005 misc
?---------  ? ?    ?       ?            ? mysql
drwxr-xr-x  4 root root 4096 Oct 12  2005 nfs
drwxr-xr-x  2 ntp  ntp  4096 Sep 14 10:31 ntp
...
And if I try to cd into the dir, I get this:
Code:
[root@rosemary v2]# cd /mnt/dying/var/lib/mysql
-bash: cd: /mnt/dying/var/lib/mysql: Input/output error
I really *really* need this data!
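The "?---------" entry and the Input/output error usually mean the drive can no longer read the blocks holding that directory. One common next step, offered only as a hedged sketch (the device name /dev/sdb1 and the mount points here are assumptions), is to stop reading the failing filesystem directly and image it with GNU ddrescue, then attempt the recovery against the image:
Code:
# Unmount the dying drive first so nothing else touches it
umount /mnt/dying

# Copy as much as possible to an image on a healthy disk, keeping a log file
# so the run can be resumed; -r3 retries the bad areas three times.
ddrescue -r3 /dev/sdb1 /backup/dying.img /backup/dying.logfile

# Mount the image read-only and see whether the mysql directory is readable now
mount -o loop,ro /backup/dying.img /mnt/rescued
ls -l /mnt/rescued/var/lib/mysql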
I host with hostgator and I was wondering if there are any software programs or services, or even something in my cpanel that can automatically grab my MySQL databases and everything on my server and make a backup on my personal PC?
I know I can do this manually, but I would like something that does it automatically once a week or so, to ensure my clients' data is always backed up.
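There are services and cPanel backup add-ons for this, but as a hedged sketch of the do-it-yourself route, assuming your HostGator plan includes SSH access and your PC can run cron (every host name, user and path below is a placeholder):
Code:
# Run weekly from the PC side, e.g. with a crontab entry such as:
#   0 4 * * 0  /home/me/bin/pull-site-backup.sh
# The script pulls a compressed dump of one database straight over SSH:
ssh cpaneluser@myserver.example.com \
    "mysqldump -u dbuser -p'dbpass' cpaneluser_mydb | gzip" \
    > ~/site-backups/mydb-$(date +%F).sql.gz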
I have a corrupted VPS with some MySQL databases on it.
I want to back up the databases and restore them on the new server.
The cPanel service is down on the VPS and I cannot transfer accounts.
Is it possible to do the following?
1- Zip or tar the username folder in the /home dir.
2- Zip or tar the database's folder in /var/lib/mysql
Either by SSH or the File Manager through HyperVM.
Then transfer the files somewhere safe...
When the VPS is rebuilt, restore the archives and databases to the folders I backed up before the rebuild.
There are also some accounts on a terminated VPS for which only an image of the VPS is available, so I can only tar the files using a jailed shell account, move them to the new VPS and untar them.
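As a hedged sketch of that plan (paths follow the standard cPanel layout; "username" and "dbname" are placeholders), the backup and restore sides would look roughly like this:
Code:
# --- On the broken VPS (SSH or jailed shell) ---
tar -czf /tmp/username-home.tar.gz -C /home          username
tar -czf /tmp/dbname-mysql.tar.gz  -C /var/lib/mysql dbname
# ...then scp/FTP both archives somewhere safe off the VPS.

# --- On the rebuilt VPS, as root ---
tar -xzf username-home.tar.gz -C /home
/etc/init.d/mysql stop
tar -xzf dbname-mysql.tar.gz -C /var/lib/mysql
chown -R mysql:mysql /var/lib/mysql/dbname
/etc/init.d/mysql start
# Note: copying raw files like this is only reliable for MyISAM tables on a
# matching MySQL version; InnoDB databases are safer to move via mysqldump.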
My site is database driven and runs on around 15 MySQL databases. I want to download a local copy of these databases daily; however, if I try to back them up via cPanel (on a WHM VPS) I get blank files. Each database is around 55 MB and growing. I can back them up one by one via phpMyAdmin, it just takes around 5 minutes per database, meaning around 40 minutes per night.
Is there any solution for having a script dump them automatically so I can download them via FTP? I've tried [url].htm but it gives me blank files.
Here is how to back up MySQL databases by cron and have the backups sent to you by email, or uploaded by FTP. It is based on a script I found on another website, though I can confirm it is fully working.
Change the commented variables in the following file and save it as backup.sh:
Code:
#!/bin/sh

# This script will backup one or more mySQL databases
# and then optionally email them and/or FTP them

# This script will create a different backup file for each database by day of the week
# i.e. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1)
# This is a trick so that you never have more than 7 days worth of backups on your FTP server.
# As the weeks rotate, the files from the same day of the prev week are overwritten.
# Cron line to run this script:
#/bin/sh /home/user/directory/scriptname.sh > /dev/null

############################################################
#===> site-specific variables - customize for your site

# List all of the MySQL databases that you want to backup in here,
# each separated by a space
# If not run by root, only one db per script instance
databases="mydbname"

# Directory where you want the backup files to be placed
backupdir=/home/mydomain/backups

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

# MySQL Username and password
userpassword=" --user=myuser --password=mypasswd"

# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

# Full paths to the helper programs used below; adjust to your system
gzip=/usr/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail

# Send Backup? Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="mySQL Backup"
mailto="me@mydomain.com"

#===> site-specific variables for FTP
ftpbackup="y"
ftpserver="myftpserver.com"
ftpuser="myftpuser"
ftppasswd="myftppasswd"
# If you are keeping the backups in a subdir to your FTP root
ftpdir="forums"

#===> END site-specific variables - customize for your site
############################################################

# Get the Day of the Week (0-6)
# This allows to save one backup for each day of the week
# Just alter the date command if you want to use a timestamp
DOW=`date +%w`

# Create our backup directory if not already there
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
   echo "Not a directory: ${backupdir}"
   exit 1
fi

# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
   $mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
done

# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
   rm -f ${backupdir}/${DOW}-${database}.sql.gz
   $gzip ${backupdir}/${DOW}-${database}.sql
done

# Send the backups via email
if [ "$sendbackup" = "y" ]
then
   for database in $databases
   do
      $uuencode ${backupdir}/${DOW}-${database}.sql.gz ${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu
      $mail -s "$subject : $database" $mailto < ${backupdir}/${DOW}-${database}.sql.gz.uu
   done
fi

# FTP it to the off-site server
echo "FTP file to $ftpserver FTP server"
if [ "$ftpbackup" = "y" ]
then
   for database in $databases
   do
      echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
      ftp -n $ftpserver <<EOF
user $ftpuser $ftppasswd
bin
prompt
cd $ftpdir
lcd ${backupdir}
put ${DOW}-${database}.sql.gz
quit
EOF
   done
fi

# And we're done
ls -l ${backupdir}
echo "Dump Complete!"
exit

Upload backup.sh to your server, to any directory you want. A directory which is not web-accessible will stop your login information being seen by just anyone.
You should chmod the file so it is executable; since it contains your MySQL and FTP passwords, 700 is a safer choice than anything world-readable:
Code:
chmod 700 backup.sh
If you uploaded this file from a Windows machine you will need to convert it to Unix format. Run the following command over SSH in the appropriate directory:
Code:
dos2unix backup.sh
If you don't have dos2unix installed, you can install it using yum if you have that:
Code:
yum install dos2unix
If you don't have yum, install dos2unix through your distribution's package manager instead.
You may want to test the script at this point to make sure it's doing what you want it to. Change to the appropriate directory and run this command:
Code:
./backup.sh
Once you're happy with it, enter it into the crontab to run daily (or whenever you want). Cron jobs vary a lot depending on the configuration of your system, so check Google for how to do it on your system. The command you will need to run by cron is the script itself, as in the example below.
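For illustration only (the path and time are assumptions; adjust them to wherever you put backup.sh), a crontab entry that runs the script every night at 2:30 am could look like this:
Code:
# Edit the current user's crontab with: crontab -e
# minute hour day-of-month month day-of-week  command
30 2 * * * /bin/sh /home/mydomain/backup.sh > /dev/null 2>&1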
I'm running a comic site on a dedicated linux box that has weekly releases. When a new release comes out I'll typically get 4000-5000 unique hits an hour. The problems started when I set up a MySQL download logging system, that would query and display the file names when the user loads the site, and update +1 when they download a comic. Yesterday after a release, the site became very slow to access, and after about 20 seconds you'd connect and be able to browse freely at nice speeds but if you went idle you'd have to wait about 20 seconds to connect again. The other domains on the server were running fine, and I could access my whm fine as well, the server load was 0.10 or lower.
I'm assuming there's a max connection limit somewhere, either on the MySQL or the Apache side, that's restricting the site from handling the load it's getting. I've poked around Google and researched it a bit but couldn't really find much. I don't have an enormous amount of time to invest in this because it's just a hobby, so I'd appreciate any help one of you could offer!
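A hedged sketch of where to look first; the httpd.conf path is the usual cPanel/WHM location and may differ on your box:
Code:
# How many connections MySQL allows vs. how many it has actually needed:
mysql -u root -p -e "SHOW VARIABLES LIKE 'max_connections';
                     SHOW STATUS    LIKE 'Max_used_connections';"

# Apache's own ceiling on simultaneous clients and its keep-alive settings:
grep -iE 'MaxClients|KeepAlive' /usr/local/apache/conf/httpd.conf

# If Max_used_connections is pinned at max_connections during a release,
# raising max_connections in /etc/my.cnf (and restarting MySQL) is the
# usual first adjustment.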
I can't find any straight answers when searching Google. I've been stuck in what has become a nightmare server move for about two weeks now.
After finally getting a mysql.sock issue fixed, I can get the MySQL DBs over to the new server, but it appears that their permissions need to be changed. Is there a way to handle this from the root level? I have roughly 120 DBs to process.
I am moving from a VPS to a dedicated server - both have root access.
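Assuming "permissions" here means filesystem ownership of the copied data directories (a common symptom after copying files around as root), a sketch of the root-level fix on a cPanel box:
Code:
# Everything under the datadir has to be owned by the mysql user, no matter
# which account did the copying; -R covers all 120 databases in one pass:
chown -R mysql:mysql /var/lib/mysql

# Then restart MySQL; cPanel servers ship a helper script for this:
/scripts/restartsrv_mysql     # or: /etc/init.d/mysql restart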
I have a server that is running linux with cpanel and I am running out of space on a partition.
I was getting this error: Drive Warning: /dev/sda3 (/var) is 83% full
I looked in the folder, went to /var/lib/mysql, and noticed that I have about 6 databases that are a little over 1 GB each. The sda3 partition only has a capacity of 9.9 GB, and it was suggested that I configure MySQL to hold my databases on another partition that has more space.
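A hedged outline of the usual datadir move; /home is assumed here to be the roomy partition (check with df -h first, and take a backup before touching anything):
Code:
# Pick the partition with space and create the new datadir
df -h
mkdir /home/mysql

# Stop MySQL and copy the data across, preserving ownership and permissions
/etc/init.d/mysql stop
cp -a /var/lib/mysql/. /home/mysql/
chown -R mysql:mysql /home/mysql

# Point MySQL at the new location: in /etc/my.cnf, under [mysqld], set
#   datadir=/home/mysql
# Keep (or explicitly set) the socket= line so clients can still find
# mysql.sock; some setups symlink /var/lib/mysql -> /home/mysql instead.

/etc/init.d/mysql start
# Once everything checks out, the old copy under /var/lib/mysql can be
# removed to reclaim the space on /var.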
I log in as Reseller1 (a clean login, not by switching from the PPA Admin to Reseller1), then create a webspace and a database. At that point I am able to see and create databases on every node, even on Reseller1's node (Node1), which is set to restricted. I assume this behaviour is not intended, is it?