We have a web-based product that uses sessions for every logged-in user. When a session times out, Session_OnEnd is supposed to fire and delete records from a table that stores some data for that user.
The problem is that Session_OnEnd is not firing, and therefore the table is not being cleared every time a user times out.
The server:
Windows 2003 Service Pack 1 (I know there has been a problem with Session_OnEnd not firing under SP2, but that does not affect SP1)
The whole site runs over SSL. We have switched this off and Session_OnEnd still doesn't fire.
The site is written in classic ASP.
We have over 200 other clients with the same setup who are not suffering from this problem, and I have come to the end of my knowledge about IIS (I am a web developer, not a server admin) and where to look next.
We've installed DomainKeys and SPF and taken all the measures suggested by Yahoo, but customer e-mails continue to be delayed, at times for days, and then of course they end up in the recipients' junk folders. Worst of all, Yahoo does not take the time to help or inform ISPs: no explanations, just generic tips.
So I was wondering if there is some sort of exim or sendmail script (procmail?) we can create that will do the following:
For every e-mail sent to Yahoo, the mail server sends another one to the recipient with a notice saying something like:
Attempts to deliver this e-mail to you were rejected by Yahoo mail service X number of times [or delayed X hours/days]. If this is unsatisfactory, we suggest you contact Yahoo support for a resolution.
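I'm not aware of a ready-made Exim/Procmail hook that does exactly this, but one rough approach is a cron-driven script that scans the mail log for Yahoo deferrals and sends the notice itself. The sketch below is only that, a sketch: the log path and line format are assumptions based on a default Exim install (deferred deliveries are logged with '=='), and of course the notice itself has to travel through the same Yahoo servers.
------------------------------------------------------------------
<?php
// Sketch: scan Exim's main log for deferred deliveries to Yahoo addresses
// and mail each affected recipient a notice. Log path and format are assumptions.
$log = '/var/log/exim/mainlog';

$deferred = array();
foreach (file($log) as $line) {
    // In the default log format, a deferred delivery is marked with '=='
    // followed by the recipient address.
    if (preg_match('/== (\S+@yahoo\.[a-z.]+)/i', $line, $m)) {
        $addr = strtolower($m[1]);
        $deferred[$addr] = isset($deferred[$addr]) ? $deferred[$addr] + 1 : 1;
    }
}

foreach ($deferred as $addr => $count) {
    $msg = "Attempts to deliver e-mail to you were deferred by the Yahoo mail "
         . "service $count time(s). If this is unsatisfactory, we suggest you "
         . "contact Yahoo support for a resolution.";
    mail($addr, 'Delivery to your Yahoo address is being delayed', $msg);
}
------------------------------------------------------------------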
I have a large survey (built with PHPSurveyor) running on my reseller web domain. It takes about 30 minutes to fill in the complete survey, but after 24 minutes my respondents get a session error and their data is lost.
Code: Warning: session_start() [function.session-start]: Cannot send session cache limiter - headers already sent (output started at /home/myhope/public_html/pro/index.php:1) in /home/myhope/public_html/pro/index.php on line 2
It could've gone into the php section as well...I think.
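Two observations, for what they're worth: the "headers already sent (output started at index.php:1)" warning usually means there is whitespace or a UTF-8 BOM before the opening <?php tag of index.php, and 24 minutes is suspiciously close to PHP's default session.gc_maxlifetime of 1440 seconds, so idle session data is probably being garbage-collected mid-survey. Assuming your host lets you override these values, a php.ini along these lines should keep sessions alive for the length of the survey:
------------------------------------------------------------------
; Sketch of php.ini overrides - values are examples, not requirements
session.gc_maxlifetime  = 3600   ; keep idle session data for 60 minutes
session.cookie_lifetime = 0      ; session cookie lasts until the browser closes
------------------------------------------------------------------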
I got a dedicated server two days ago, but "session.save_path" is not set. What I did was create a "php.ini" file, put the following settings into it, and upload it to my public directory.
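For reference, a minimal per-directory php.ini for this purpose would look something like the sketch below (the path is only a placeholder; a directory outside public_html keeps session files out of the web root):
------------------------------------------------------------------
; Sketch of a per-directory php.ini - the path below is only a placeholder
session.save_path = "/home/youraccount/tmp"
------------------------------------------------------------------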
When I try to log in to my Linux server from the GUI, I enter my username "root" and then the password, but I am not able to log in and get the following error:
"Your session only lasted less than 10 seconds. If you have not logged out yourself, this could mean that there is some installation problem or that you may be out of disk space. Try logging in with one of the failsafe sessions to see if you can fix this problem. View details in the ~/.xsession-errors file."
When I install Joomla, it says session.save_path = /tmp is unwritable. However, according to phpinfo(), the session.save_path on the server is /tmp, and all my PHP sites are working fine; I can see many sess_ files in /tmp. That means sessions are still being written into /tmp by the apache user, is that correct?
So why is the Joomla installation saying it's unwritable? I am on a Linux server.
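A likely explanation is that the virtual host Joomla runs under has an open_basedir (or safe_mode) restriction that excludes /tmp, so PHP reports the directory as unwritable for that site even though the apache user writes session files there for other sites. A throwaway diagnostic you could drop into the Joomla directory to confirm this (remove it afterwards):
------------------------------------------------------------------
<?php
// Throwaway check: why might /tmp look unwritable to this particular vhost?
$path = ini_get('session.save_path');
if ($path == '') {
    $path = '/tmp';
}
echo 'session.save_path: ' . $path . "\n";
echo 'open_basedir:      ' . ini_get('open_basedir') . "\n";
echo 'safe_mode:         ' . ini_get('safe_mode') . "\n";
echo 'is_writable:       ' . (is_writable($path) ? 'yes' : 'no') . "\n";
------------------------------------------------------------------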
I wrote a program in early 2006 for a PHP/MySQL site. At the time, the code was not very standard: it used uninitialized variables for include file paths (even though values were assigned to them before use), and the "sess" folder was created inside the website folder. The parameters for the SQL queries were also not escaped, but everything was working fine.
Now I have been informed that the insecure code in my program caused the server to crash and that I have to pay a penalty for it. Can anyone tell me whether the code below, or keeping the session files in a folder inside /www/, could make the sites hosted on the server where this program runs stop or crash for good?
------------------------------------------------------------------
function update_region($id, $regname, $regcom) {
    // Parameters are concatenated straight into the SQL string, unescaped
    $query = "UPDATE taxregion_mast SET taxregion_name = '" . $regname . "', "
           . "region_comments = '" . $regcom . "' WHERE region_id = " . $id;
    mysql_query($query);
}
------------------------------------------------------------------
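Whether it crashed the server is a separate question, but for reference, a safer version of the same function (a sketch only, sticking with the legacy mysql_* extension the original code uses) would cast and escape each parameter before building the query:
------------------------------------------------------------------
<?php
// Sketch: same update with escaped parameters (legacy mysql_* extension assumed)
function update_region_safe($id, $regname, $regcom) {
    $id      = (int) $id;                          // force the id to an integer
    $regname = mysql_real_escape_string($regname); // escape string values
    $regcom  = mysql_real_escape_string($regcom);

    $query = "UPDATE taxregion_mast SET taxregion_name = '" . $regname . "', "
           . "region_comments = '" . $regcom . "' WHERE region_id = " . $id;
    mysql_query($query);
}
------------------------------------------------------------------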
Is it possible to have one session under the XXX.XXX.XXX.XXX IP and a second session under a different IP allocated by your DC?
The box is in Europe. I need to log in to one session and download some files from a server, so the remote server will see one IP from that session and a different IP from the other.
Two of the reasons I need this are: 1. privacy, and 2. avoiding buying another box.
I've come across an issue where our users are not logging out of their Terminal Services sessions properly. Whether via TSWeb or MSTSC (Remote Desktop), if they close the browser or RDP window with the X, the session stays alive for up to a minute.
The problem with this is that we use Terminal Services to host an application for users who can't install it, so other users who log in (using a generic username and password) are adopting/hijacking the original session and seeing someone else's data.
Does anyone know of a way to force a new session each time a user connects to RDP, whether via TSWeb or MSTSC (Remote Desktop)?
I'd like to know if it's possible to log all SSH commands to a file, per session. For example, if I log in as user 'test123', then once I close the session, all the commands I ran would be saved to a file and either emailed to my server-logs address or kept on disk.
I am assisting a client who is linking to an online calculator. He is putting a frame on top of the calculator page so people will still see his information. However, for some reason he is getting a Session Timeout error in IE.
I don't get this error in Firefox using this method, or when going directly to the page in IE.
Let me give a better explanation:
If you visit: [url]
Just put in a fake name and email; it loads a frame at the top and then the online calculator, which is this page: [url]
Why am I getting a session timeout? Is there a better solution? I never get the same error if I go directly to: [url]
We want a frame or better solution because we still want the contact information to be in front of the consumer.
Does this maybe have to do with cookies and the frame?
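Quite possibly, yes. By default IE refuses cookies set from inside a third-party frame unless the framed site sends a P3P compact privacy policy, so the calculator's session cookie never sticks and its session appears to time out immediately. If the calculator page (or a proxy page you control) is PHP, the usual workaround is to send a P3P header before any output; the policy tokens below are only an example:
------------------------------------------------------------------
<?php
// Send a compact P3P policy so IE accepts the session cookie inside a frame.
// The token list is an example; it should reflect your real privacy policy.
header('P3P: CP="CAO PSA OUR"');

session_start(); // start the session after the P3P header has been queued
------------------------------------------------------------------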
I have a website where people perform a number of tasks, saving some data to temporary session files. If a user is idle for a certain amount of time and then performs an action, his or her work is gone.
I'd like to set the sessions to never expire, so that only a browser close would delete the temporary files.
I've tried looking around in the IIS manager, but I cannot find a way to do this.
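If the temporary session files happen to be PHP sessions (an assumption; if this is classic ASP or ASP.NET the setting lives elsewhere), the relevant knobs are the cookie lifetime, where 0 means "until the browser closes", and the server-side garbage-collection lifetime, which has to be raised so idle sessions are not deleted:
------------------------------------------------------------------
; Sketch of php.ini settings, assuming the temporary files are PHP session files
session.cookie_lifetime = 0       ; cookie (and thus the session) ends when the browser closes
session.gc_maxlifetime  = 86400   ; keep idle session data on disk for up to 24 hours
------------------------------------------------------------------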
I have a cPanel box. In WHM I used the "PHP Configuration Editor" and changed the PHP execution time (a minor change). After clicking save, I now get the following error on any PHP page using sessions:
Warning: session_start() [function.session-start]: open(/tmp/sess_1d374c43a0f726cd43776f9f92485bec, O_RDWR) failed: No such file or directory (2) in /home/continou/public_html/control/index.php on line 4
One thing I noticed is that it turned on PHPSuexec, which generally causes problems for me. I turned that off and the error message changed slightly (to the one above), but the problem is not solved.
I tried rebooting the server. /tmp does exist. I am now rebuilding Apache in the hope that that corrects the problem.
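One thing worth checking, purely a guess based on that error, is whether the PHP Configuration Editor rewrote session.save_path with a stray space or pointed it at a directory that no longer exists. The expected php.ini line and the usual /tmp permissions look like this:
------------------------------------------------------------------
; php.ini - the session path should be a single, existing, writable directory
session.save_path = "/tmp"

; /tmp itself normally carries the sticky bit and world-write permissions,
; i.e. drwxrwxrwt (chmod 1777 /tmp)
------------------------------------------------------------------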
And I get the following error:
-----------------------------------------------------
Warning: Unknown: Your script possibly relies on a session side-effect which existed until PHP 4.2.3. Please be advised that the session extension does not consider global variables as a source of data, unless register_globals is enabled. You can disable this functionality and this warning by setting session.bug_compat_42 or session.bug_compat_warn to off, respectively. in Unknown on line 0
-----------------------------------------------------
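The warning text itself points at the fix: the two settings below silence it (they simply disable the pre-4.2.3 compatibility behaviour, which is only needed if the script relies on register_globals-style session variables):
------------------------------------------------------------------
; php.ini - disable the old session compatibility behaviour and its warning
session.bug_compat_42   = Off
session.bug_compat_warn = Off
------------------------------------------------------------------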
I'm looking at allowing remote telnet into my server.
Like any security-minded administrator, I want to log what my users type in their telnet sessions.
I'm using the script command to generate transcripts of each user's session.
I have /etc/profile set to automatically start the script command to log user activity, and in /etc/bash.bash_logout I have a command that emails me the transcript of the users' session.
All of the above works well except for one thing:
users can type "exit" to escape from the script logging, and any commands they type after that won't get logged.
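One common workaround, sketched below and untested against this exact setup, is to start script with exec in /etc/profile so the logging wrapper replaces the login shell; typing exit then ends the whole session instead of dropping the user into an unlogged shell. Note this may change how /etc/bash.bash_logout fires, since the inner shell is no longer the login shell, so the emailing step may need to move into the wrapper.
------------------------------------------------------------------
# /etc/profile (sketch) - wrap the login shell in script via exec
# The guard variable prevents the wrapper from re-launching itself.
if [ -z "$UNDER_SCRIPT" ]; then
    export UNDER_SCRIPT=1
    exec /usr/bin/script -q "$HOME/transcript-$(date +%Y%m%d-%H%M%S)-$$.log"
fi
------------------------------------------------------------------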
For the past couple of days, I've noticed that there are a lot of Apache processes running in the "D" state and that my I/O wait is up to 80%. I straced one of the processes, and the result is that it's blocking on a PHP session file.
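For context: PHP's default file-based session handler holds an exclusive lock on the session file for the whole request, so concurrent requests from the same visitor queue up behind one another, which can show up as Apache processes piling up while waiting on the sess_ file (especially if the session directory sits on NFS or a slow disk). If that is the bottleneck here, one common mitigation (a sketch, applied in the application code) is to release the lock as soon as the script has finished writing to $_SESSION:
------------------------------------------------------------------
<?php
session_start();

// ...read from / write to $_SESSION as usual...
$_SESSION['last_seen'] = time();

// Release the session file lock early. The values already loaded into
// $_SESSION stay readable, but later changes will not be saved.
session_write_close();

// ...long-running work (database queries, external calls, output) goes here...
------------------------------------------------------------------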
On my (virtual) Ubuntu server (12.04.5 LTS), I am running the latest Plesk version (12.0.18 Update #35). For "normal" web use, I am running Apache (2.2.22) and PHP (5.3.10-1ubuntu3.16) as a FastCGI module.
Since I was checking a couple of other servers regarding PHP session management, I also checked this one. What I found is that on this server probably no session file garbage collection takes place. The basic PHP configuration (which I haven't touched so far) is done through files located in /etc/php5/cgi, with its main configuration file php.ini. Running phpinfo() confirmed the session settings made in php.ini:
With these settings (gc_probability = 0), I assume that no automatic garbage collection is started from within PHP (URL....). As described in that link, Debian and Ubuntu normally ship their own script in /etc/cron.d/php5 to do this garbage collection.
On my Plesk server, with no personal modifications to PHP, neither the automated garbage collection (in php.ini) is activated, nor does the standard Ubuntu/Debian cron job exist to delete outdated session files in /var/lib/php5. As a result, outdated files keep piling up in that directory on my server.
My questions now are: 1) Is the removal of the cron job /etc/cron.d/php5 something specific that happened during the installation of Plesk? Why? 2) Is there another (Plesk-specific) script that should do the work?
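As an interim measure (a sketch only, not a statement about what Plesk intends), you could either turn PHP's own probability-based garbage collection back on in /etc/php5/cgi/php.ini, or recreate a cron job that cleans /var/lib/php5 yourself:
------------------------------------------------------------------
; Option 1: /etc/php5/cgi/php.ini - let PHP run its own garbage collection
session.gc_probability = 1
session.gc_divisor     = 1000    ; roughly 0.1% of requests trigger a GC run
session.gc_maxlifetime = 1440    ; seconds a session may sit idle before deletion

# Option 2: a custom cron entry (e.g. /etc/cron.d/php5-cleanup) that removes
# session files in /var/lib/php5 untouched for more than 24 minutes
09,39 * * * * root find /var/lib/php5/ -type f -name 'sess_*' -cmin +24 -delete
------------------------------------------------------------------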
My question: we host around 40 websites that have hundreds of pictures; I have 60 GB dedicated to pictures alone. Since we are getting more clients, I want to get a bigger hard drive just for the pictures. My boss is obsessed with user access times, as in end users accessing the pictures on the web, and he thinks getting a bigger hard drive will slow down the server and hence slow down serving the pictures to end users. I guess what I'm asking is: does a bigger hard drive slow down a website?
Let me know if you need clarification; I wasn't sure how to word this.
I am running into a problem with my home Linux server running Fedora 8 (2.6.26.6-49.fc8). The problem is with my SSH sessions: they hang or freeze when I try to run certain commands such as ps -auxt or top, or when I try to edit /etc/httpd/conf/httpd.conf with vi or pico. However, I am able to edit my pure-ftpd.conf file with little to no trouble in either vi or pico. I have already done some research on openssh.org, but my problem does not resemble the TCP timeout issues I've seen described there. When the SSH session hangs, I'm forced to disconnect and then reconnect. Just curious if anyone has any ideas as to what I can try to resolve this problem.
I have installed Apache 2.4.10 with Tomcat 7 as the backend. A ProxyPass has been added in Apache to reach Tomcat over its HTTP port. The requirement now is to restrict each Context to 100 sessions only; how do I achieve this?
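Session limits are usually enforced on the Tomcat side rather than in Apache. If a per-webapp cap is acceptable, Tomcat's standard session manager supports maxActiveSessions on the Context (sketch below); once the limit is reached, requests that try to create a new session fail rather than being queued, so a friendly error page for that case may be worthwhile.
------------------------------------------------------------------
<!-- META-INF/context.xml of the webapp (or the matching Context element in conf) -->
<Context>
    <!-- Refuse to create more than 100 concurrent sessions for this Context -->
    <Manager maxActiveSessions="100" />
</Context>
------------------------------------------------------------------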
I have a diesel generator controller card (IB Lite, made by ComAp) whose built-in web server supports only a single connected user/session.
I want to put Apache in front of it and serve multiple connections while Apache keeps a single session with the IB Lite card in the background, no matter how many client sessions there are.
I tried ProxyPass, but it doesn't seem to be a solution.
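For completeness: a plain ProxyPass will not merge clients into one backend session; mod_proxy can at most be told to keep its connection pool down to a single backend connection, which helps only if the card's limitation is about parallel TCP connections rather than logins or cookies (an assumption). Also note the pool is per Apache child process, so with a prefork MPM this still allows one connection per child. A sketch, with 192.168.1.50 as a placeholder for the IB Lite's address:
------------------------------------------------------------------
# Sketch of an Apache vhost fragment; the backend address is a placeholder
<VirtualHost *:80>
    ServerName genset.example.com

    # max=1 limits the connection pool to one backend connection per worker
    ProxyPass        / http://192.168.1.50/ max=1 ttl=120
    ProxyPassReverse / http://192.168.1.50/
</VirtualHost>
------------------------------------------------------------------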