Search Results

Search found 17851 results on 715 pages for 'log aggregation'.


  • Variable disappears when I log in

    - by John
    Hello, I have a profile page where the profile is retrieved via GET. The index file has this: $profile = $_GET['profile']; When I log in on the profile page, the $profile variable disappears. Here is the form action on the login function: <form name="login-form" id="login-form" method="post" action="./index.php"> (The $profile variable is separate from the login username.) How can I make the page retain the $profile variable? Thanks in advance, John
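
    One common fix, sketched here with hypothetical field names and assuming index.php is the same script that reads $_GET['profile']: carry the profile value through the login POST, either in the form action's query string or as a hidden field, and read it back from either channel afterwards.

      <?php $profile = isset($_GET['profile']) ? $_GET['profile'] : ''; ?>
      <form name="login-form" id="login-form" method="post"
            action="./index.php?profile=<?php echo urlencode($profile); ?>">
        <!-- or post it back explicitly as a hidden field -->
        <input type="hidden" name="profile" value="<?php echo htmlspecialchars($profile); ?>">
        <!-- login fields ... -->
      </form>
      <?php
      // After the login POST, read the value from whichever channel it arrived on:
      $profile = isset($_GET['profile']) ? $_GET['profile']
               : (isset($_POST['profile']) ? $_POST['profile'] : '');
      ?>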

    Read the article

  • Log to Firefox Error Console from JavaScript

    - by Torsten Marek
    Is it possible to add messages to the built-in error console of Firefox from JavaScript code running in web pages? I know there's Firebug, which provides a console object and its own error console, but I was looking for a quick fix earlier on and couldn't find anything. I guess it might not be possible at all, perhaps to prevent malicious web pages from spamming the log?
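
    When this was asked, a plain page script could not write to Firefox's Error Console without an extension such as Firebug; current Firefox builds do expose a console object to pages natively. A minimal sketch that logs when a console is available and stays silent otherwise:

      // Log to the browser console if one is available, otherwise do nothing.
      function logMessage(msg) {
        if (window.console && typeof window.console.error === "function") {
          window.console.error(msg);   // shows up in the Browser/Web Console
        }
      }

      logMessage("Something went wrong on this page");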

    Read the article

  • Git log: fatal object [sha1] is corrupted

    - by Keyo
    Is there any way I can repair my repository with the commit history intact?
      # git log
      fatal: object 01aeb2bf2e93b238f0e0422816b3e55518321ae7 is corrupted
    From reading the link below it looks like I'll have to zap it and start over. http://www.miek.nl/s/7e76eadefe/
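
    A rough recovery sketch, assuming an intact copy of the repository exists somewhere (another clone or a remote); the paths below are illustrative:

      # Inspect the damage; lists corrupt or missing objects.
      git fsck --full

      # The loose object lives under .git/objects/<first 2 chars>/<remaining 38 chars>.
      ls -l .git/objects/01/aeb2bf2e93b238f0e0422816b3e55518321ae7

      # If another clone has a healthy copy, copy it over the corrupted file:
      cp /path/to/good/clone/.git/objects/01/aeb2bf2e93b238f0e0422816b3e55518321ae7 \
         .git/objects/01/aeb2bf2e93b238f0e0422816b3e55518321ae7

      # Re-check and confirm the history is readable again.
      git fsck --full
      git log --oneline | head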

    Read the article

  • Log into Launchpad from python script

    - by jack
    How can I log into my Launchpad account from a Python script? Any sample code would be appreciated. The login URL is https://launchpad.net/+login, which then redirects to something like https://login.launchpad.net/fJLVSRbxPfKTpVDr/+decide Thanks in advance!
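
    One approach is the launchpadlib package, which handles the OAuth dance behind that +login/+decide redirect for you; a minimal hedged sketch (the exact login_with arguments vary across launchpadlib versions):

      # pip install launchpadlib  (assumption: launchpadlib is available for your Python)
      from launchpadlib.launchpad import Launchpad

      # Opens a browser the first time so you can authorize the token at
      # login.launchpad.net; afterwards the credentials are cached locally.
      launchpad = Launchpad.login_with("my-script", "production")

      me = launchpad.me
      print(me.display_name)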

    Read the article

  • Configure SSIS logging to log to one file

    - by Pramodtech
    I know how to configure logging for individual packages through BIDS. The drawback I see is that I have to add a connection string for each task, and when I deploy these packages to the server I have to change the log file connection string for every package. Currently I have 32 packages, and this seems time consuming. Is there any way to set up logging for all packages in one place?
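
    One way to avoid touching a file connection string in every package is to switch them all to the SSIS log provider for SQL Server, so every package logs to one table in a central database. A hedged sketch of reading that table afterwards (table and column names assume SQL Server 2008's dbo.sysssislog; SQL Server 2005 used sysdtslog90):

      -- All packages pointed at the same logging database end up in one place.
      SELECT source     AS package_or_task,
             event,
             starttime,
             message
      FROM   dbo.sysssislog
      WHERE  event IN ('OnError', 'OnWarning')
      ORDER  BY starttime DESC;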

    Read the article

  • CodeIgniter Error Log Info + Errors

    - by fatnjazzy
    Hi, is there a way to save info and errors in the log without the debug messages? Why does the debug level appear along with info? If I want to log info such as "Account id 4345 was deleted by Admin", why do I need to see all of these:
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Config Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Hooks Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 URI Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Router Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Output Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Input Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Global POST and COOKIE data sanitized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Language Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Loader Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Config file loaded: config/safe_charge.php
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Config file loaded: config/web_fx.php
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Helper loaded: loadutils_helper
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Helper loaded: objectsutils_helper
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Helper loaded: logutils_helper
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Helper loaded: password_helper
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Database Driver Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 cURL Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Language Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Config Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Account MX_Controller Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 File loaded: ./modules/accounts/models/pending_account_model.php
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Model Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 File loaded: ./modules/accounts/models/proccess_accounts_model.php
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Model Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 File loaded: ./modules/accounts/models/web_fx_model.php
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Model Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 File loaded: ./modules/accounts/models/trader_account_type_spreads.php
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Model Class Initialized
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 File loaded: ./modules/accounts/models/trader_accounts.php
      DEBUG - 2010-12-27 08:39:13 --> 192.168.200.32 Model Class Initialized
    Thanks
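
    For reference, CodeIgniter's log_threshold (set in application/config/config.php) is cumulative: 1 = errors, 2 = adds debug, 3 = adds info, 4 = everything, so info cannot be switched on without debug in the stock Log class (if you are on a newer release, check whether log_threshold also accepts an array of levels, which solves this directly). A hedged sketch of the usual workaround, assuming CodeIgniter 2.x conventions: override the logger with application/libraries/MY_Log.php and drop debug entries there.

      <?php
      // application/libraries/MY_Log.php -- assumes CodeIgniter 2.x class-extension rules.
      class MY_Log extends CI_Log {

          // Same signature as CI_Log::write_log(); silently discard debug messages
          // so a threshold of 3 or 4 still records only error and info lines.
          public function write_log($level = 'error', $msg = '', $php_error = FALSE)
          {
              if (strtoupper($level) === 'DEBUG')
              {
                  return TRUE; // pretend it was logged
              }
              return parent::write_log($level, $msg, $php_error);
          }
      }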

    Read the article

  • How to log SQL statements in Grails

    - by damian
    Hi, I want to log, in the console or in a file, all the queries that Grails executes, to check performance. I configured [this][1] without success. Any idea would help. [1]: http://www.grails.org/FAQ#Q: How can I turn on logging for hibernate in order to see SQL statements, input parameters and output results?
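
    Two common ways, sketched under the assumption of a standard Grails 1.x/2.x project layout: enable logSql on the dataSource, or raise the Hibernate SQL loggers in Config.groovy's log4j block.

      // grails-app/conf/DataSource.groovy -- prints every SQL statement to the console
      dataSource {
          logSql = true
      }

      // grails-app/conf/Config.groovy -- more detail, including bound parameters
      log4j = {
          debug 'org.hibernate.SQL'      // the statements themselves
          trace 'org.hibernate.type'     // bound parameter values (verbose)
      }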

    Read the article

  • Tracing log for user doing transaction

    - by user295334
    Dear all, I would like your help creating a log file for tracing user transactions. When a user adds or updates some information, I want to trace which information that user changed. The tracing file should be easy to read and search, whether it is a text file or an XML file. Thanks, Sopolin
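
    No platform or language is named, so purely as a generic sketch: a dedicated audit table that the application writes to on every add or update, recording who changed which field, the old and new values, and when (all names below are hypothetical):

      -- Hypothetical audit table; adjust the types to your database.
      CREATE TABLE user_audit_log (
          id          INTEGER      PRIMARY KEY,
          user_name   VARCHAR(64)  NOT NULL,
          action      VARCHAR(16)  NOT NULL,   -- 'ADD' or 'UPDATE'
          field_name  VARCHAR(64)  NOT NULL,
          old_value   VARCHAR(255),
          new_value   VARCHAR(255),
          changed_at  TIMESTAMP    NOT NULL
      );

      -- The application inserts one row per changed field:
      INSERT INTO user_audit_log (id, user_name, action, field_name, old_value, new_value, changed_at)
      VALUES (1, 'sopolin', 'UPDATE', 'email', 'old@example.com', 'new@example.com', CURRENT_TIMESTAMP);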

    Read the article

  • Log files legal aspect?

    - by relwarc
    I like data. That is why I added a standalone PHP script which logs all relevant HTTP variables like:
      - Date of visit
      - IP
      - User-agent
      - Request URI
      - Referer
    Am I allowed to store all this in non-public text files? Am I allowed to evaluate the data? What am I allowed to do with the log files? Do I have to delete them after some time?

    Read the article

  • Perl - Compare two files and copy entire lines to final.log

    - by user2977141
    I need a Perl script that compares each line from File1 with File2 and copies the matching lines to a final file, something like this:
      File1.txt:
      ASPO01
      ASPO02
      ASPO03
      File2.txt:
      ASPO01 2013-11-10 19hrs
      ASPO10 2013-11-09 24hrs
      ASPO02 2013-11-08 10hrs
      ASPO16 2013-11-05 9hrs
      ASPO17 2013-11-06 6hrs
      ASPO03 2013-11-07 15hrs
      ASPO18 2013-11-02 25hrs
      ...
    Search in File2 and copy the matching lines to a final file called final.log, like this:
      final.log:
      ASPO01 2013-11-10 19hrs
      ASPO02 2013-11-08 10hrs
      ASPO03 2013-11-07 15hrs
    Thanks to all the good friends who can help me!
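
    A minimal sketch, using the file names from the question and assuming the key is the first whitespace-delimited field of each File2.txt line:

      #!/usr/bin/perl
      use strict;
      use warnings;

      # Load the keys we are looking for (one per line in File1.txt).
      open my $keys_fh, '<', 'File1.txt' or die "Cannot open File1.txt: $!";
      my %wanted;
      while (my $line = <$keys_fh>) {
          chomp $line;
          $wanted{$line} = 1 if length $line;
      }
      close $keys_fh;

      # Copy every File2.txt line whose first field matches a key.
      open my $data_fh, '<', 'File2.txt' or die "Cannot open File2.txt: $!";
      open my $out_fh,  '>', 'final.log' or die "Cannot open final.log: $!";
      while (my $line = <$data_fh>) {
          my ($key) = split ' ', $line;
          print {$out_fh} $line if defined $key && $wanted{$key};
      }
      close $data_fh;
      close $out_fh;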

    Read the article

  • Simple MultiThread Safe Log Class

    - by Robert
    What is the best approach to creating a simple multithread safe logging class? Is something like this sufficient?
      public class Logging
      {
          public Logging()
          {
          }

          public void WriteToLog(string message)
          {
              object locker = new object();
              lock (locker)
              {
                  StreamWriter SW;
                  SW = File.AppendText("Data\\Log.txt");
                  SW.WriteLine(message);
                  SW.Close();
              }
          }
      }
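
    Worth noting: the locker object above is created anew on every call, so the lock never actually blocks a second thread. A hedged sketch of a corrected version shares one static lock object and disposes the writer with using:

      using System.IO;

      public class Logging
      {
          // One lock object shared by every caller and every instance.
          private static readonly object _locker = new object();
          private const string LogPath = @"Data\Log.txt";

          public void WriteToLog(string message)
          {
              lock (_locker)
              {
                  // using ensures the writer is flushed and closed even on exceptions.
                  using (StreamWriter sw = File.AppendText(LogPath))
                  {
                      sw.WriteLine(message);
                  }
              }
          }
      }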

    Read the article

  • How to log failures in PHPUnit?

    - by user187809
    How can I log only failures to an external file from PHPUnit? I want the complete information, including the actual value, expected value, line number, etc. Right now I am using fwrite and logging all passing and failing tests to a file; is there a better way to do it?
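
    One approach in the PHPUnit of that era (3.x) is a custom test listener: only the failure and error hooks write to the file, so passing tests never reach it. The interface and method signatures below assume PHPUnit 3.x and change in later versions, so treat this as a sketch.

      <?php
      // FailureFileListener.php -- assumes the PHPUnit 3.x listener interface.
      class FailureFileListener implements PHPUnit_Framework_TestListener
      {
          public function addFailure(PHPUnit_Framework_Test $test,
                                     PHPUnit_Framework_AssertionFailedError $e, $time)
          {
              // The assertion message already contains the expected vs. actual values;
              // the trace contains the test file and line where the assertion was made.
              $entry = sprintf("FAIL %s\n%s\n%s\n\n",
                               get_class($test), $e->getMessage(), $e->getTraceAsString());
              file_put_contents('failures.log', $entry, FILE_APPEND);
          }

          public function addError(PHPUnit_Framework_Test $test, Exception $e, $time)
          {
              file_put_contents('failures.log',
                  sprintf("ERROR %s: %s\n", get_class($test), $e->getMessage()), FILE_APPEND);
          }

          // Required by the interface, but passes and skips are deliberately not logged.
          public function addIncompleteTest(PHPUnit_Framework_Test $test, Exception $e, $time) {}
          public function addSkippedTest(PHPUnit_Framework_Test $test, Exception $e, $time) {}
          public function startTestSuite(PHPUnit_Framework_TestSuite $suite) {}
          public function endTestSuite(PHPUnit_Framework_TestSuite $suite) {}
          public function startTest(PHPUnit_Framework_Test $test) {}
          public function endTest(PHPUnit_Framework_Test $test, $time) {}
      }

    Register it under <listeners> in phpunit.xml. Alternatively, the built-in --log-junit switch writes an XML report that can be filtered for failure elements after the run.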

    Read the article

  • Is it safe to delete rotated MySQL binary logs?

    - by Milan Babuškov
    I have a MySQL server with binary logging active. Once a day the log file is "rotated", i.e. MySQL seems to stop writing to it and creates a new log file. For example, I currently have these files in /var/lib/mysql:
      -rw-rw---- 1 mysql mysql 10485760 Jun 7 09:26 ibdata1
      -rw-rw---- 1 mysql mysql  5242880 Jun 7 09:26 ib_logfile0
      -rw-rw---- 1 mysql mysql  5242880 Jun 2 15:20 ib_logfile1
      -rw-rw---- 1 mysql mysql  1916844 Jun 6 09:20 mybinlog.000004
      -rw-rw---- 1 mysql mysql 61112500 Jun 7 09:26 mybinlog.000005
      -rw-rw---- 1 mysql mysql 15609789 Jun 7 13:57 mybinlog.000006
      -rw-rw---- 1 mysql mysql       54 Jun 7 09:26 mybinlog.index
    and mybinlog.000006 is growing. Can I simply take mybinlog.000004 and mybinlog.000005, zip them up and transfer them to another server, or do I need to do something else first? What info is stored in mybinlog.index? Only the info about the latest binary log? UPDATE: I understand I can delete the logs with PURGE BINARY LOGS, which updates the mybinlog.index file. However, I need to transfer the logs to another computer before deleting them (I test whether the backup is valid on another machine). To reduce the transfer size, I want to bzip2 the files. What will PURGE BINARY LOGS do if the log files are not "there" anymore?
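
    For what it's worth, mybinlog.index is just a plain-text list of the binary log file names the server still knows about. The usual sequence is to copy and compress the old logs elsewhere first, then let the server delete them and rewrite the index, rather than removing the originals by hand (a hand-removed file can make the purge complain because the index no longer matches the disk). A sketch:

      -- See which binary logs exist and which one is current.
      SHOW BINARY LOGS;
      SHOW MASTER STATUS;

      -- After copying/bzip2-ing mybinlog.000004 and mybinlog.000005 elsewhere,
      -- let MySQL delete them and rewrite mybinlog.index in one step:
      PURGE BINARY LOGS TO 'mybinlog.000006';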

    Read the article

  • How to automate a monitoring system for ETL runs

    - by Jeffrey McDaniel
    Upon completion of the Primavera ETL process there are a few ways to determine whether the process finished successfully. First, in the <installation directory>\log folder, there are staretlprocess.log and staretl.html files. These files give the output results of the ETL run. The staretl.html file gives a detailed summary of each step of the process, its run time, and its status. The .log file, depending on the logging level set in the Configuration tool, can give extensive information about the ETL process, and it can be used to validate process completion. To automate the monitoring of these log files, perform the following steps (a small checker sketch appears after the query below):

    1. Write a custom application to parse through the log file and search for [ERROR]. In most cases a major [ERROR] could cause the ETL process to fail, so finding this value in the log is worthy of an alert.
    2. Determine the total number of steps in the ETL process, and validate that the log file recorded an entry for the final step. For example, validate that your log file contains an entry for Step 39/39 (this could differ based on the version you are running). If there is no Step 39/39, then either the process is taking longer than expected or it didn't make it to the end. Either way, this would be a good cause for an alert.
    3. Check the last line in the log file. It should indicate that the ETL run completed successfully. For example, the last line of a log file will say (results could differ based on Reporting Database versions): [INFO] (Message) Finished Writing Report
    4. You could write an Ant script to execute the ETL process with failonerror="true", and from there send the results to an external monitoring tool, to email, or to a database.

    With each ETL run, the log file appends to the existing log file by default. Because of this behavior, I recommend renaming the existing log files before running a new ETL process. That way, only log entries for the currently running ETL process are recorded in the new log files. Based on these log entries, alerts can be set up to notify the administrator or DBA.

    Another way to determine whether the ETL process has completed successfully is to monitor the etl_processmaster table. Depending on the Reporting Database version this could be in the Stage or Star database; as of Reporting Database 2.2 and higher it is in the Star database. The etl_processmaster table records an entry for the ETL run along with a Start and Finish time. If the ETL process has failed, the Finish date will be null, so the table can be queried at the time the ETL process is expected to be finished and an alert sent if the value is still null. These are just some options; there are additional ways this can be accomplished based around these two areas, log files or the database.
    Here is an additional query to gather more information about your ETL run (connect as Staruser):

      SELECT SYSDATE,
             test_script,
             decode(loc, 0, PROCESSNAME, trim(SUBSTR(PROCESSNAME, loc + 1))) PROCESSNAME,
             duration duration
        FROM (SELECT (e.endtime - b.starttime) * 1440 duration,
                     to_char(b.starttime, 'hh24:mi:ss') starttime,
                     to_char(e.endtime, 'hh24:mi:ss') endtime,
                     b.PROCESSNAME,
                     instr(b.PROCESSNAME, ']') loc,
                     b.infotype test_script
                FROM (SELECT processid, infodate starttime, PROCESSNAME, INFOMSG, INFOTYPE
                        FROM etl_processinfo
                       WHERE processid = (SELECT max(PROCESSID) FROM etl_processinfo)
                         AND infotype = 'BEGIN') b
               INNER JOIN
                     (SELECT processid, infodate endtime, PROCESSNAME, INFOMSG, INFOTYPE
                        FROM etl_processinfo
                       WHERE processid = (SELECT max(PROCESSID) FROM etl_processinfo)
                         AND infotype = 'END') e
                  ON b.processid = e.processid
                 AND b.PROCESSNAME = e.PROCESSNAME
               ORDER BY b.starttime)
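
    A small sketch of the "custom application" idea from steps 1-3 above, written here in Python; the log path and the 39-step count are taken from the article and should be adjusted for your installation:

      # check_etl_log.py -- scan the staretl log for failures after a run.
      # Assumes the log lives at <installation directory>\log\staretlprocess.log
      # and that a successful run logs the final "Step 39/39" and a closing
      # "[INFO] (Message) Finished ..." line, as described above.
      import sys

      LOG_PATH = r"C:\P6RDB\log\staretlprocess.log"   # hypothetical install path
      FINAL_STEP = "Step 39/39"

      def check(path):
          with open(path, "r", errors="replace") as fh:
              lines = fh.readlines()

          problems = []
          if any("[ERROR]" in line for line in lines):
              problems.append("log contains [ERROR] entries")
          if not any(FINAL_STEP in line for line in lines):
              problems.append("final step %s not reached" % FINAL_STEP)
          if not lines or "Finished" not in lines[-1]:
              problems.append("last line does not report a finished run")
          return problems

      if __name__ == "__main__":
          issues = check(sys.argv[1] if len(sys.argv) > 1 else LOG_PATH)
          if issues:
              print("ETL run needs attention: " + "; ".join(issues))
              sys.exit(1)   # non-zero exit is easy to hook into any alerting tool
          print("ETL run looks OK")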

    Read the article
