Search Results

Search found 14434 results on 578 pages for 'moss 2010'.

Page 234/578

  • Sorting by date

    - by user62367
    Original: Jan 23 2011 10:42 SOMETHING 2007.12.20.avi Jun 26 2009 SOMETHING 2009.06.25.avi Feb 12 2010 SOMETHING 2010.02.11.avi Jan 29 2011 09:17 SOMETHING 2011.01.27.avi Feb 11 2011 20:06 SOMETHING 2011.02.10.avi Feb 27 2011 23:05 SOMETHING 2011.02.24.avi Output: Feb 27 2011 23:05 SOMETHING 2011.02.24.avi Feb 11 2011 20:06 SOMETHING 2011.02.10.avi Jan 29 2011 09:17 SOMETHING 2011.01.27.avi Jan 23 2011 10:42 SOMETHING 2007.12.20.avi Feb 12 2010 SOMETHING 2010.02.11.avi Jun 26 2009 SOMETHING 2009.06.25.avi How could I get the output where the newest file is at the top?
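
    The desired output is the original listing reordered by the date embedded in each filename, newest embedded date first. That ordering rule is easy to prototype; here is a small sketch in Python, with a few illustrative entries rather than the full listing:

        # Sketch: sort by the YYYY.MM.DD token inside each name, newest first.
        # Filenames are illustrative, not the real directory contents.
        import re
        from datetime import datetime

        lines = [
            "Jan 23 2011 10:42 SOMETHING 2007.12.20.avi",
            "Feb 12 2010 SOMETHING 2010.02.11.avi",
            "Feb 27 2011 23:05 SOMETHING 2011.02.24.avi",
        ]

        def embedded_date(line):
            # Pull the YYYY.MM.DD token out of the filename.
            match = re.search(r"(\d{4})\.(\d{2})\.(\d{2})", line)
            return datetime(*map(int, match.groups())) if match else datetime.min

        for line in sorted(lines, key=embedded_date, reverse=True):
            print(line)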

    Read the article

  • access_log item w/out IP. Starts with "::1 - - [<date>]"

    - by Meltemi
    Looking at our Apache log I see normal requests like: 174.133.xxx.xxx - - [20/May/2010:17:36:44 -0700] "GET /index.html HTTP/1.1" 200 2004 but every so often I get a cluster of these without an IP address: ::1 - - [20/May/2010:18:47:21 -0700] "OPTIONS * HTTP/1.0" 200 - ::1 - - [20/May/2010:18:47:22 -0700] "OPTIONS * HTTP/1.0" 200 - ::1 - - [20/May/2010:18:47:23 -0700] "OPTIONS * HTTP/1.0" 200 - What do they mean, and what causes them?

    Read the article

  • powerpoint compatibility

    - by John
    Imagine you work in Office 2007 or 2010: you create an effect-heavy presentation and save it as 'ppt'. Office 2007/2010 warns you that some effects will be lost, but you save it anyway. Question: if you open that ppt file in the Office version you're working with (2007/2010), will it run just as it would on '97 or '03, with the same effects missing and no other problems? I'm asking so I know whether I have to install an older Office to check compatibility. I use 2010. Thanks in advance

    Read the article

  • What issues ensue from having multiple versions of Office installed?

    - by Michael Sorens
    My ultimate question is embodied in the title, but I thought it might be helpful to others if I detail what instigated my inquiry and my examination of the problem. To me, the first rule of software updates is Primum non nocere -- first, do no harm. So with my Windows 7 system containing both Office 2003 and Office 2010, I blithely proceeded to install this month's updates from Microsoft, containing updates for both versions of Office. While Microsoft officially does not recommend running multiple versions (see, for example, Running Multiple Versions of Microsoft Excel), it is possible; I have had two versions installed for a year or more and have never run into an issue before. One thing that is always mentioned is installation order, i.e., the one you want to open files by default should be installed last. I wanted 2010 as my default, so I had indeed installed 2003 first and then, years later, 2010. So with this round of Windows updates, either it installed patches to 2010 before 2003, knocking out the file association, or the 2003 patch was more comprehensive, in the sense of touching the file association while the 2010 patch did not. In any case, after updates, double-clicking a .xls file opened 2003 rather than 2010. Web searches suggested several remedies:
      - Use the file associations control panel to re-associate .xls files with the correct version of Excel. I looked at this first, but it showed what seemed to be an unversioned "Excel" associated with .xls files, so I did not check further. (This turned out to be an error on my part; more later.)
      - Re-install versions in the desired order; I find this unreasonable.
      - Run the repair option of the Office installer on the desired version; still more work than one should need.
      - Run Excel from the command line with "/regserver" on the one to be the default and "/unregserver" on the other. Good idea, but further search indicated that neither 2007 nor 2010 supports "/regserver", contrary to some posts (e.g. Default Program With Multiple Versions Installed).
    Since this was a Windows Update issue and Microsoft provides free support for such, I inquired there as well, but succeeded only in getting the suggestion to uninstall all other versions, period; not acceptable to me. What worked for me was going back to the file associations control panel and manually selecting the Office 2010 version of Excel. While it appeared no different in the control panel, it did fix the double-click issue. So if all it takes is this simple fix after an update, I can live with that. What I am wondering is: has anyone seen any other problems related to having multiple versions of Office installed?

    Read the article

  • Problem obfuscating struts2 webapplication using Proguard / Tomcat SEVERE: Error filterStart ... org

    - by Xinus
    I am trying to obfuscate a Struts2 web application using Proguard. I have separated out all action and servlet classes into a separate jar file which does not take part in obfuscation; I am obfuscating everything else. But for some reason Tomcat is giving a weird error as follows: Apr 21, 2010 11:22:44 AM org.apache.coyote.http11.Http11Protocol init INFO: Initializing Coyote HTTP/1.1 on http-8080 Apr 21, 2010 11:22:44 AM org.apache.catalina.startup.Catalina load INFO: Initialization processed in 607 ms Apr 21, 2010 11:22:44 AM org.apache.catalina.core.StandardService start INFO: Starting service Catalina Apr 21, 2010 11:22:44 AM org.apache.catalina.core.StandardEngine start INFO: Starting Servlet Engine: Apache Tomcat/6.0.16 Enter the interceptor Apr 21, 2010 11:22:45 AM org.apache.catalina.core.StandardContext start SEVERE: Error filterStart Apr 21, 2010 11:22:45 AM org.apache.catalina.core.StandardContext start SEVERE: Context [/DataSubmissionToolFinal] startup failed due to previous errors log4j:ERROR LogMananger.repositorySelector was null likely due to error in class reloading, using NOPLoggerRepository. I do not have any idea what's happening behind the scenes. Any suggestions?

    Read the article

  • What is in scala-android.jar?

    - by synic
    I've been trying to experiment with developing Android applications with Scala. I've gotten to the point where I can get the app to compile, but there are no helper functions for things like: button.setOnClickListener( () => { text.setText("test") }) (I'm talking about the closure there) I see lots of references to scala-android.jar, and have this file in my project, but I'm not sure what it does, or how to use it. I get the feeling it has these helper conversion functions, but I'm not sure. Running jar -tvf scala-android.jar on the file gives me this: 401 Sun Jun 06 10:06:02 MDT 2010 scala/Function0$class.class 431 Sun Jun 06 10:06:02 MDT 2010 scala/Function0.class 572 Sun Jun 06 10:06:02 MDT 2010 scala/Function1.class 282 Sun Jun 06 10:06:02 MDT 2010 scala/ScalaObject$class.class 271 Sun Jun 06 10:06:02 MDT 2010 scala/ScalaObject.class 458 Sun Jun 06 10:06:02 MDT 2010 scala/runtime/BoxedUnit.class If this isn't what I'm looking for, is there a simple library that'll do conversions for this kind of stuff?

    Read the article

  • Ejabberd clustering problem with amazon EC2 server

    - by user353362
    Hello Guys! I have been trying to install ejabberd server on Amazons EC2 instance. I am kinds a stuck at this step right now. I am following this guide: http://tdewolf.blogspot.com/2009/07/clustering-ejabberd-nodes-using-mnes... From the guide I have sucessfully completed the Set up First Node (on ejabberd1) part. But am stuck in part 4 of Set up Second Node (on ejabberd2) So all in all, I created the main node and am able to run the server on that node and access its admin console from then internet. In the second node I have installed ejabberd. But I am stuck at point 4 of setting up the node instruction presented in this blog (http://tdewolf.blogspot.com/2009/07/clustering-ejabberd-nodes-using-mnes...). I execute this command " erl -sname ejabberd@domU-12-31-39-0F-7D-14 -mnesia dir '"/var/lib/ejabberd/"' -mnesia extra_db_nodes "['ejabberd@domU-12-31-39-02-C8-36']" -s mnesia " on the second server and get a crashing error: root@domU-12-31-39-0F-7D-14:/var/lib/ejabberd# erl -sname ejabberd@domU-12-31-39-0F-7D-14 -mnesia dir '"/var/lib/ejabberd/"' -mnesia extra_db_nodes "['ejabberd@domU-12-31-39-02-C8-36']" -s mnesia {error_logger,{{2010,5,28},{23,52,25}},"Protocol: ~p: register error: ~p~n",["inet_tcp",{{badmatch,{error,duplicate_name}},[{inet_tcp_dist,listen,1},{net_kernel,start_protos,4},{net_kernel,start_protos,3},{net_kernel,init_node,2},{net_kernel,init,1},{gen_server,init_it,6},{proc_lib,init_p_do_apply,3}]}]} {error_logger,{{2010,5,28},{23,52,25}},crash_report,[[{pid,<0.21.0},{registered_name,net_kernel},{error_info,{exit,{error,badarg},[{gen_server,init_it,6},{proc_lib,init_p_do_apply,3}]}},{initial_call,{net_kernel,init,['Argument__1']}},{ancestors,[net_sup,kernel_sup,<0.8.0]},{messages,[]},{links,[#Port<0.52,<0.18.0]},{dictionary,[{longnames,false}]},{trap_exit,true},{status,running},{heap_size,610},{stack_size,23},{reductions,518}],[]]} {error_logger,{{2010,5,28},{23,52,25}},supervisor_report,[{supervisor,{local,net_sup}},{errorContext,start_error},{reason,{'EXIT',nodistribution}},{offender,[{pid,undefined},{name,net_kernel},{mfa,{net_kernel,start_link,[['ejabberd@domU-12-31-39-0F-7D-14',shortnames]]}},{restart_type,permanent},{shutdown,2000},{child_type,worker}]}]} {error_logger,{{2010,5,28},{23,52,25}},supervisor_report,[{supervisor,{local,kernel_sup}},{errorContext,start_error},{reason,shutdown},{offender,[{pid,undefined},{name,net_sup},{mfa,{erl_distribution,start_link,[]}},{restart_type,permanent},{shutdown,infinity},{child_type,supervisor}]}]} {error_logger,{{2010,5,28},{23,52,25}},crash_report,[[{pid,<0.7.0},{registered_name,[]},{error_info,{exit,{shutdown,{kernel,start,[normal,[]]}},[{application_master,init,4},{proc_lib,init_p_do_apply,3}]}},{initial_call,{application_master,init,['Argument_1','Argument_2','Argument_3','Argument_4']}},{ancestors,[<0.6.0]},{messages,[{'EXIT',<0.8.0,normal}]},{links,[<0.6.0,<0.5.0]},{dictionary,[]},{trap_exit,true},{status,running},{heap_size,233},{stack_size,23},{reductions,123}],[]]} {error_logger,{{2010,5,28},{23,52,25}},std_info,[{application,kernel},{exited,{shutdown,{kernel,start,[normal,[]]}}},{type,permanent}]} {"Kernel pid terminated",application_controller,"{application_start_failure,kernel,{shutdown,{kernel,start,[normal,[]]}}}"} Crash dump was written to: erl_crash.dump Kernel pid terminated (application_controller) ({application_start_failure,kernel,{shutdown,{kernel,start,[normal,[]]}}}) root@domU-12-31-39-0F-7D-14:/var/lib/ejabberd# any idea what going on? 
I am not really sure how to solve this problem :S how to let ejabberd only access register from one special server? › Is that the right way of copying .erlang.cookie file? Submitted by privateson on Sat, 2010-05-29 00:11. before this I was getting this error (see below), I solved it by running this command: chmod 400 .erlang.cookie Also to copy the cookie I simply created a file using vi on the second server and copied the secret code from server one to the second server. Is that the right way of copying .erlang.cookie file? ERROR ~~~~~~~~~~ root@domU-12-31-39-0F-7D-14:/etc/ejabberd# erl -sname ejabberd@domU-12-31-39-0F-7D-14 -mnesia dir '"/var/lib/ejabberd/"' -mnesia extra_db_nodes "['ejabberd@domU-12-31-39-02-C8-36']" -s mnesia {error_logger,{{2010,5,28},{23,28,56}},"Cookie file /root/.erlang.cookie must be accessible by owner only",[]} {error_logger,{{2010,5,28},{23,28,56}},crash_report,[[{pid,<0.20.0},{registered_name,auth},{error_info,{exit,{"Cookie file /root/.erlang.cookie must be accessible by owner only",[{auth,init_cookie,0},{auth,init,1},{gen_server,init_it,6},{proc_lib,init_p_do_apply,3}]},[{gen_server,init_it,6},{proc_lib,init_p_do_apply,3}]}},{initial_call,{auth,init,['Argument__1']}},{ancestors,[net_sup,kernel_sup,<0.8.0]},{messages,[]},{links,[<0.18.0]},{dictionary,[]},{trap_exit,true},{status,running},{heap_size,987},{stack_size,23},{reductions,439}],[]]} {error_logger,{{2010,5,28},{23,28,56}},supervisor_report,[{supervisor,{local,net_sup}},{errorContext,start_error},{reason,{"Cookie file /root/.erlang.cookie must be accessible by owner only",[{auth,init_cookie,0},{auth,init,1},{gen_server,init_it,6},{proc_lib,init_p_do_apply,3}]}},{offender,[{pid,undefined},{name,auth},{mfa,{auth,start_link,[]}},{restart_type,permanent},{shutdown,2000},{child_type,worker}]}]} {error_logger,{{2010,5,28},{23,28,56}},supervisor_report,[{supervisor,{local,kernel_sup}},{errorContext,start_error},{reason,shutdown},{offender,[{pid,undefined},{name,net_sup},{mfa,{erl_distribution,start_link,[]}},{restart_type,permanent},{shutdown,infinity},{child_type,supervisor}]}]} {error_logger,{{2010,5,28},{23,28,56}},crash_report,[[{pid,<0.7.0},{registered_name,[]},{error_info,{exit,{shutdown,{kernel,start,[normal,[]]}},[{application_master,init,4},{proc_lib,init_p_do_apply,3}]}},{initial_call,{application_master,init,['Argument_1','Argument_2','Argument_3','Argument_4']}},{ancestors,[<0.6.0]},{messages,[{'EXIT',<0.8.0,normal}]},{links,[<0.6.0,<0.5.0]},{dictionary,[]},{trap_exit,true},{status,running},{heap_size,233},{stack_size,23},{reductions,123}],[]]} {error_logger,{{2010,5,28},{23,28,56}},std_info,[{application,kernel},{exited,{shutdown,{kernel,start,[normal,[]]}}},{type,permanent}]} {"Kernel pid terminated",application_controller,"{application_start_failure,kernel,{shutdown,{kernel,start,[normal,[]]}}}"} Crash dump was written to: erl_crash.dump Kernel pid terminated (application_controller) ({application_start_failure,kernel,{shutdown,{kernel,start,[normal,[]]}}}) root@domU-12-31-39-0F-7D-14:/var/lib/ejabberd# cat /var/log/ejabberd/ejabberd.log =INFO REPORT==== 2010-05-28 22:48:53 === I(<0.321.0:mod_pubsub:154) : pubsub init "localhost" [{access_createnode, pubsub_createnode}, {plugins, ["default","pep"]}] =INFO REPORT==== 2010-05-28 22:48:53 === I(<0.321.0:mod_pubsub:210) : ** tree plugin is nodetree_default =INFO REPORT==== 2010-05-28 22:48:53 === I(<0.321.0:mod_pubsub:214) : ** init default plugin =INFO REPORT==== 2010-05-28 22:48:53 === I(<0.321.0:mod_pubsub:214) : ** init pep plugin =ERROR 
REPORT==== 2010-05-28 23:40:08 === ** Connection attempt from disallowed node 'ejabberdctl1275090008486951000@domU-12-31-39-0F-7D-14' ** =ERROR REPORT==== 2010-05-28 23:41:10 === ** Connection attempt from disallowed node 'ejabberdctl1275090070163253000@domU-12-31-39-0F-7D-14' **

    Read the article

  • view to select specific period or latest when null

    - by edosoft
    Hi, I have a product table which simplifies to this: create table product(id int primary key identity, productid int, year int, quarter int, price money) and some sample data: insert into product select 11, 2010, 1, 1.11 insert into product select 11, 2010, 2, 2.11 insert into product select 11, 2010, 3, 3.11 insert into product select 12, 2010, 1, 1.12 insert into product select 12, 2010, 2, 2.12 insert into product select 13, 2010, 1, 1.13 Prices can be changed each quarter, but not all products get a new price each quarter. Now I could duplicate the data each quarter, keeping the price the same, but I'd rather use a view. How can I create a view that can be used to return prices for (for example) quarter 2? I've written this to return the current (=latest) price: CREATE VIEW vwCurrentPrices AS SELECT * FROM ( SELECT *, ROW_NUMBER() OVER (PARTITION BY productid ORDER BY year DESC, quarter DESC) AS Ranking FROM product ) p WHERE p.Ranking = 1 I'd like to create a view so I can use queries like select * from vwProduct where quarter = 2
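
    For clarity, the rule the quarter-specific view needs to express is: for each productid, take the row with the greatest (year, quarter) that is still on or before the requested quarter. Purely as an illustration of that rule (not the SQL itself), a Python sketch over the sample rows:

        # Illustration of the selection rule only: per product, keep the latest
        # price whose (year, quarter) is on or before the requested quarter.
        rows = [
            (11, 2010, 1, 1.11), (11, 2010, 2, 2.11), (11, 2010, 3, 3.11),
            (12, 2010, 1, 1.12), (12, 2010, 2, 2.12),
            (13, 2010, 1, 1.13),
        ]

        def prices_as_of(year, quarter):
            latest = {}
            for productid, y, q, price in rows:
                if (y, q) <= (year, quarter):
                    current = latest.get(productid)
                    if current is None or (y, q) > current[0]:
                        latest[productid] = ((y, q), price)
            return {pid: price for pid, ((y, q), price) in latest.items()}

        print(prices_as_of(2010, 2))  # {11: 2.11, 12: 2.12, 13: 1.13}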

    Read the article

  • Tools to help with analysing log files

    - by peter
    I am developing a C# .NET application. In the app.config file I add trace logging as shown, <?xml version="1.0" encoding="UTF-8" ?> <configuration> <system.diagnostics> <trace autoflush="true" /> <sources> <source name="System.Net.Sockets" maxdatasize="1024"> <listeners> <add name="MyTraceFile"/> </listeners> </source> </sources> <sharedListeners> <add name="MyTraceFile" type="System.Diagnostics.TextWriterTraceListener" initializeData="System.Net.trace.log" /> </sharedListeners> <switches> <add name="System.Net" value="Verbose" /> </switches> </system.diagnostics> </configuration> Are there any good tools around to analyse the log file that is output? The output looks like this, System.Net.Sockets Verbose: 0 : [5900] Data from Socket#8764489::Send DateTime=2010-04-07T03:22:40.1067012Z System.Net.Sockets Verbose: 0 : [5900] 00000000 : 4D 49 4D 45 2D 56 65 72-73 69 6F 6E 3A 20 31 2E : MIME-Version: 1. DateTime=2010-04-07T03:22:40.1067012Z System.Net.Sockets Verbose: 0 : [5900] 00000060 : 65 3A 20 37 20 41 70 72-20 32 30 31 30 20 31 35 : e: 7 Apr 2010 15 DateTime=2010-04-07T03:22:40.1067012Z System.Net.Sockets Verbose: 0 : [5900] 00000070 : 3A 32 32 3A 34 30 20 2B-31 32 30 30 0D 0A 53 75 : :22:40 +1200..Su DateTime=2010-04-07T03:22:40.1067012Z System.Net.Sockets Verbose: 0 : [5900] 00000080 : 62 6A 65 63 74 3A 20 5B-45 72 72 6F 72 5D 20 45 : bject: [Error] E DateTime=2010-04-07T03:22:40.1067012Z System.Net.Sockets Verbose: 0 : [5900] 00000090 : 78 63 65 70 74 69 6F 6E-20 69 6E 20 53 79 6E 63 : xception in Sync DateTime=2010-04-07T03:22:40.1067012Z System.Net.Sockets Verbose: 0 : [5900] 000000A0 : 53 65 72 76 69 63 65 20-28 32 30 30 38 2E 30 2E : Service (2008.0. DateTime=2010-04-07T03:22:40.1067012Z System.Net.Sockets Verbose: 0 : [5900] 000000B0 : 33 30 34 2E 31 32 33 34-32 29 0D 0A 43 6F 6E 74 : 304.12342)..Cont DateTime=2010-04-07T03:22:40.1067012Z Is there anything that can take the output shown above (my output is a text file 100mb in size), group together packets, and help out with finding particular issues I would like to hear about it. Thanks.
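
    One low-tech option, sketched below in Python, is to pull out the ASCII column of each hex-dump line and group it by the [thread id] tag, so each socket conversation reads as a continuous stream. This is only a rough sketch against the format shown above (the log path comes from the config; the regex is an assumption about the layout):

        # Rough sketch, not a polished tool: extract the ASCII column from
        # System.Net hex-dump lines and group the text by the [thread id] tag.
        import re
        from collections import defaultdict

        DUMP = re.compile(r"\[(\d+)\]\s+[0-9A-F]{8} : [0-9A-F -]+: (.{1,16})")

        streams = defaultdict(list)
        with open("System.Net.trace.log", errors="replace") as log:  # path from the config above
            for line in log:
                m = DUMP.search(line)
                if m:
                    thread, text = m.groups()
                    streams[thread].append(text)

        for thread, chunks in streams.items():
            print(f"--- thread {thread} ---")
            print("".join(chunks))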

    Read the article

  • DateJS parsing mystery

    - by Herb Caudill
    I'm using DateJS to parse user-inputted dates, and getting some strange results. Date.parse("15 Jan 2010") returns Fri Jan 15 00:00:00 EST 2010 (right) Date.parse("15-Apr-2010") returns Thu Apr 15 00:00:00 EDT 2010 (right) Date.parse("15 Apr 2010") returns Thu Apr 1 00:00:00 EDT 2010 (wrong) As far as I can tell, the d MMM yyyy input format works fine for every month except April and August; in those two cases, it returns the first of the month no matter what day is entered. Is this a bug, or is there a logical explanation I'm missing?

    Read the article

  • mysql query clarification

    - by JPro
    I have a query and I am wondering whether the result I am getting is the one I should expect. The table structure goes like this:
    Table: results
    ID TestCase Set  Analyzed Verdict StartTime           Platform
    1  1010101  ros2 false    fail    18/04/2010 20:23:44 P1
    2  1010101  ros3 false    fail    19/04/2010 22:22:33 P1
    3  1232323  ros2 true     pass    19/04/2010 22:22:33 P1
    4  1232323  ros3 false    fail    29/04/2010 22:22:33 P2
    Table: testcases
    ID TestCase type
    1  1010101  NOSNOS
    2  1232323  N212NS
    Is there any way to display only the latest fails on each platform? In the above case the result should be:
    ID TestCase Set  Analyzed Verdict StartTime           Platform type
    2  1010101  ros3 false    fail    19/04/2010 22:22:33 P1       NOSNOS
    4  1232323  ros3 false    fail    29/04/2010 22:22:33 P2       N212NS
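
    One reading of the requirement is: for each TestCase, keep only the most recent failing run and then attach the test type from testcases. Purely to pin down that logic (not as the MySQL answer itself), a Python sketch over the sample rows:

        # Illustration only (sample rows from the question): for each TestCase,
        # keep the most recent failing run, then attach the test type.
        from datetime import datetime

        results = [
            (1, 1010101, "ros2", False, "fail", "18/04/2010 20:23:44", "P1"),
            (2, 1010101, "ros3", False, "fail", "19/04/2010 22:22:33", "P1"),
            (3, 1232323, "ros2", True,  "pass", "19/04/2010 22:22:33", "P1"),
            (4, 1232323, "ros3", False, "fail", "29/04/2010 22:22:33", "P2"),
        ]
        types = {1010101: "NOSNOS", 1232323: "N212NS"}

        latest_fail = {}
        for row in results:
            rid, case, rset, analyzed, verdict, start, platform = row
            when = datetime.strptime(start, "%d/%m/%Y %H:%M:%S")
            if verdict == "fail" and (case not in latest_fail or when > latest_fail[case][0]):
                latest_fail[case] = (when, row)

        for case, (when, row) in sorted(latest_fail.items()):
            print(*row, types[case])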

    Read the article

  • DeprecationWarning when pushing to Mercurial repo

    - by Josh Nankin
    I'm trying to serve a Mercurial repository with Apache, and when I try to push to the repo I see this in the Apache error.log. On the client side I get a 500 error. How do I get this to go away? [Sun Jun 06 14:43:25 2010] [error] [client 192.168.1.8] /var/lib/python-support/python2.6/mercurial/hgweb/common.py:24: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6 [Sun Jun 06 14:43:25 2010] [error] [client 192.168.1.8] self.message = message [Sun Jun 06 14:43:25 2010] [error] [client 192.168.1.8] /var/lib/python-support/python2.6/mercurial/hgweb/hgweb_mod.py:104: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6 [Sun Jun 06 14:43:25 2010] [error] [client 192.168.1.8] if not inst.message: [Sun Jun 06 14:43:25 2010] [error] [client 192.168.1.8] /var/lib/python-support/python2.6/mercurial/hgweb/hgweb_mod.py:106: DeprecationWarning: BaseException.message has been deprecated as of Python 2.6 [Sun Jun 06 14:43:25 2010] [error] [client 192.168.1.8] return '0\\n%s\\n' % inst.message,

    Read the article

  • how to retrieve inserted checkbox values in checkbox on asp.net page load ?

    - by user522211
    I have 5 checkboxes on a web form and TextBox1. When I search for a record using the date specified in TextBox1: if I enter 11-Dec-2010 in TextBox1 and click the submit button, then CheckBox1, CheckBox2 and CheckBox3 will be disabled and unchecked; after that, when I type 13-Dec-2010 in TextBox1 and click the submit button, then CheckBox1, CheckBox4 and CheckBox5 will be disabled and unchecked, and all the checkboxes of 11-Dec-2010 will be enabled again for 13-Dec-2010. I am currently working in ASP.NET (VB). My database structure:
    ID Name  Seats Date
    1  Sumit 1,2,3 11-Dec-2010
    2  Mili  1,4,5 13-Dec-2010
    For an example of what I want, this site does it; have a look to know more: http://www.redbus.in/Booking/SeatSelection.aspx?rt=4034093&doj=28-Feb-2011&dep=05:00%20PM&showSpInst=false
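
    The core lookup being described is: find the row whose Date matches the entered date, split its Seats value on commas, and disable exactly those checkboxes while enabling the rest. A language-neutral sketch of just that rule in Python, using the two sample rows (the real page would query the database and then set each CheckBox's Enabled/Checked properties):

        # Sketch of the seat lookup only (sample rows from the question).
        bookings = {"11-Dec-2010": "1,2,3", "13-Dec-2010": "1,4,5"}

        def checkbox_states(date, total_boxes=5):
            taken = {int(s) for s in bookings.get(date, "").split(",") if s}
            # True = enabled, False = disabled/unchecked
            return {box: box not in taken for box in range(1, total_boxes + 1)}

        print(checkbox_states("11-Dec-2010"))  # boxes 1-3 disabled, 4-5 enabled
        print(checkbox_states("13-Dec-2010"))  # boxes 1, 4, 5 disabled, 2-3 enabled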

    Read the article

  • PowerShell: Read text, regex sort, write output to file and formatting

    - by Bill Hunter
    I am a PowerShell novice and have run into a challenge in reading, sorting, and outputting a csv file. The input csv has no headers; the data is as follows: 05/25/2010,18:48:33,Stop,a1usak,10.128.212.212 05/25/2010,18:48:36,Start,q2uhal,10.136.198.231 05/25/2010,18:48:09,Stop,s0upxb,10.136.198.231 I use the following piping construct to read the file, sort and output to a file: (Get-Content d:\vpnData\u62gvpn2.csv) | %{,[regex]::Split($_, ",")} | sort @{Expression={$_[3]}},@{Expression={$_[1]}} | out-file d:\vpnData\u62gvpn3.csv The new file is written with the following format: 05/25/2010 07:41:57 Stop a0uaar 10.128.196.160 05/25/2010 12:24:24 Start a0uaar 10.136.199.51 05/25/2010 20:00:56 Stop a0uaar 10.136.199.51 What I would like to see in the output file is a similar format to the original input file with comma delimiters: 05/25/2010,07:41:57,Stop,a0uaar,10.128.196.160 05/25/2010,12:24:24,Start,a0uaar,10.136.199.51 05/25/2010,20:00:56,Stop,a0uaar,10.136.199.51 But I can't quite seem to get there. I'm almost of the mind that I'll have to write another segment to read the newly produced file and reset its contents to the preferred format for further processing. Thoughts?
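
    Not PowerShell, but as a cross-check of the intended round trip, the same job reads as: load the header-less CSV into rows, sort by user (field 3) then time (field 1), and write the rows back comma-delimited instead of space-joined. A minimal Python sketch, assuming the same file paths:

        # Minimal sketch of the round trip: read rows, sort by user then time,
        # write back comma-delimited (paths taken from the question).
        import csv

        with open(r"d:\vpnData\u62gvpn2.csv", newline="") as src:
            rows = list(csv.reader(src))

        rows.sort(key=lambda r: (r[3], r[1]))

        with open(r"d:\vpnData\u62gvpn3.csv", "w", newline="") as dst:
            csv.writer(dst).writerows(rows)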

    Read the article

  • What sort of object is this and how to use it?

    - by Gary
    What would be the correct name for this type of array? There are 3 main sections and 4 sub-parts consisting of "issuedTime", "text", "url" and "validToTime"; how do you start to convert this to an object? If there were only 1 main section it would be fairly simple to do, but 3 main parts with no identification for each main section has me scratching my head as to where to start. Any advice appreciated. [{ "issuedTime":"7:13pm Sunday 13 June 2010", "text":"\nAmended 7:10pm.\n\nText text and more text\n", "url":"\/folder\/fc\/name.png", "validToTime":"12:00am Monday 14 June 2010" },{ "issuedTime":"8:33pm Sunday 13 June 2010", "text":"\nText and more text.\n", "url":"\/folder\/fc\/name.png", "validToTime":"12:00pm Monday 14 June 2010" },{ "issuedTime":"10:40am Sunday 13 June 2010", "text":"\nAnd even more text.", "url":"\/folder\/fc\/name.png", "validToTime":"12:00am Tuesday 15 June 2010" } ]
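
    For what it's worth, the snippet itself is a JSON array of three objects that all share the same four keys, so the usual approach is to parse it and loop over the elements. A small Python sketch with the same shape of data, trimmed down:

        # The snippet is a JSON array of objects sharing the same four keys,
        # so after parsing you simply iterate; data below is the same shape, trimmed.
        import json

        raw = """[
          {"issuedTime": "7:13pm Sunday 13 June 2010", "text": "...", "url": "/folder/fc/name.png", "validToTime": "12:00am Monday 14 June 2010"},
          {"issuedTime": "8:33pm Sunday 13 June 2010", "text": "...", "url": "/folder/fc/name.png", "validToTime": "12:00pm Monday 14 June 2010"}
        ]"""

        forecasts = json.loads(raw)          # a list of dicts
        for item in forecasts:
            print(item["issuedTime"], "->", item["validToTime"])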

    Read the article

  • What is the best way to restore(rollback) data in an application to a specified state(date) ?

    - by panzerschreck
    Hello, an example will set the context. The example below captures the various states of the entity, which needs to be reverted (rolled back).
    State 1 - Recorded on 01-Mar-2010: Column1 = Data1, Column2 = 0.56
    State 2 - Recorded on 02-Mar-2010: Column1 = Data1, Column2 = 0.57
    State 3 - Recorded on 03-Mar-2010: Column1 = Data1, Column2 = 0.58
    The user notices that state 3 is not what he intended to be in and decides to revert back to state 2. One approach that I can think of, without modifying the entity, is to "audit" all the inserts/updates, as below; the rollback information captures the data just before each update/modification on the entity, so that it can be applied in order when you need to revert. Please note that changing the entity's schema is not an option.
    Rollback - R1 recorded on 01-Mar-2010: Column1 = Data1, Column2 = 0.56
    Rollback - R2 recorded on 02-Mar-2010: Column1 = Data1, Column2 = 0.56
    Rollback - R3 recorded on 03-Mar-2010: Column1 = Data1, Column2 = 0.57
    So, to get to state 2, we would start with rollback information R1 and apply R2 onto it. Is there a better approach to achieve this? Thanks for your time.
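
    To make the audit idea concrete, here is a minimal sketch of one common variant (an assumption, not necessarily the exact scheme above): each audit record stores the values as they were just before a change, and reverting to a target date means applying the before-images of every change made after that date, newest first, so the oldest qualifying image ends up as the result. Python is used only for illustration:

        # Sketch of one common variant of the audit idea (an assumption, not
        # necessarily the poster's exact scheme): each audit record holds the
        # values as they were just before a change; to revert, apply the
        # before-images of every change made after the target date.
        from datetime import date

        current = {"Column1": "Data1", "Column2": 0.58}      # state 3
        audit = [  # (date the change was made, values just before that change)
            (date(2010, 3, 2), {"Column1": "Data1", "Column2": 0.56}),
            (date(2010, 3, 3), {"Column1": "Data1", "Column2": 0.57}),
        ]

        def revert_to(state, trail, target):
            for changed_on, before in sorted(trail, key=lambda t: t[0], reverse=True):
                if changed_on > target:
                    state = dict(before)
            return state

        print(revert_to(current, audit, date(2010, 3, 2)))   # back to state 2: 0.57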

    Read the article

  • mktime and date questions..

    - by mjboggess
    Hello SO! I am working on a small project and am playing with date() and mktime() in PHP. Compare the two code blocks and their output, and notice that the second sample adds one to the month in its first mktime. $monthis = 5; echo date('F', mktime(0,0,0,$monthis,0,0)) . " 1, 2010 is on a " . date("l F", mktime(0, 0, 0, $monthis, 1, 2010)); puts out "April 1, 2010 is on a Saturday May", but if I change it to $monthis = 5; echo date('F', mktime(0,0,0,$monthis + 1,0,0)) . " 1, 2010 is on a " . date("l F", mktime(0, 0, 0, $monthis, 1, 2010)); it puts out "May 1, 2010 is on a Saturday May". Why do I have to add one to the month in the first mktime so that both emit the same month? Any help or clarity would be appreciated. Thanks :)

    Read the article

  • Long connection times from PHP to MySQL on EC2

    - by Erik Giberti
    I'm having an intermittent issue connecting to a database slave with InnoDB. Intermittently I get connections taking longer than 2 seconds. These servers are hosted on Amazon's EC2. The app server is PHP 5.2/Apache running on Ubuntu. The DB slave is running Percona's XtraDB 5.1 on Ubuntu 9.10. It's using an EBS Raid array for the data storage. We already use skip name resolve and bind to address 0.0.0.0. This is a stub of the PHP code that's failing: $tmp = mysqli_init(); $start_time = microtime(true); $tmp->options(MYSQLI_OPT_CONNECT_TIMEOUT, 2); $tmp->real_connect($DB_SERVERS[$server]['server'], $DB_SERVERS[$server]['username'], $DB_SERVERS[$server]['password'], $DB_SERVERS[$server]['schema'], $DB_SERVERS[$server]['port']); if(mysqli_connect_errno()){ $timer = microtime(true) - $start_time; mail($errors_to,'DB connection error',$timer); } There's more than 300Mb available on the DB server for new connections and the server is nowhere near the max allowed (60 of 1,200). Loading on both servers is < 2 on 4 core m1.xlarge instances. Some highlights from the mysql config: max_connections = 1200 thread_stack = 512K thread_cache_size = 1024 thread_concurrency = 16 innodb-file-per-table innodb_additional_mem_pool_size = 16M innodb_buffer_pool_size = 13G Any help on tracing the source of the slowdown is appreciated. [EDIT] I have been updating the sysctl values for the network but they don't seem to be fixing the problem. I made the following adjustments on both the database and application servers. net.ipv4.tcp_window_scaling = 1 net.ipv4.tcp_sack = 0 net.ipv4.tcp_timestamps = 0 net.ipv4.tcp_fin_timeout = 20 net.ipv4.tcp_keepalive_time = 180 net.ipv4.tcp_max_syn_backlog = 1280 net.ipv4.tcp_synack_retries = 1 net.core.rmem_max = 16777216 net.core.wmem_max = 16777216 net.ipv4.tcp_rmem = 4096 87380 16777216 net.ipv4.tcp_wmem = 4096 87380 16777216 [EDIT] Per jaimieb's suggestion, I added some tracing and captured the following data using time. This server handles about 51 queries/second at this time of day. The connection error was raised once (at 13:06:36) during the 3 minute window outlined below. Since there was 1 failure and roughly 9,200 successful connections, I think this isn't going to produce anything meaningful in terms of reporting. Script:
    date >> /root/database_server.txt
    (time mysql -h database_Server -D schema_name -u appuser -p apppassword -e '') > /dev/null 2>> /root/database_server.txt
    Results: === Application Server 1 === Mon Feb 22 13:05:01 EST 2010 real 0m0.008s user 0m0.001s sys 0m0.000s Mon Feb 22 13:06:01 EST 2010 real 0m0.007s user 0m0.002s sys 0m0.000s Mon Feb 22 13:07:01 EST 2010 real 0m0.008s user 0m0.000s sys 0m0.001s === Application Server 2 === Mon Feb 22 13:05:01 EST 2010 real 0m0.009s user 0m0.000s sys 0m0.002s Mon Feb 22 13:06:01 EST 2010 real 0m0.009s user 0m0.001s sys 0m0.003s Mon Feb 22 13:07:01 EST 2010 real 0m0.008s user 0m0.000s sys 0m0.001s === Database Server === Mon Feb 22 13:05:01 EST 2010 real 0m0.016s user 0m0.000s sys 0m0.010s Mon Feb 22 13:06:01 EST 2010 real 0m0.006s user 0m0.010s sys 0m0.000s Mon Feb 22 13:07:01 EST 2010 real 0m0.016s user 0m0.000s sys 0m0.010s [EDIT] Per a suggestion received on a LinkedIn question, I tried setting the back_log value higher. We had been running the default value (50) and increased it to 150. We also raised the kernel value /proc/sys/net/core/somaxconn (maximum socket connections) to 256 on both the application and database server from the default 128.
We did see some elevation in processor utilization as a result but still received connection timeouts.

    Read the article

  • Telerik RadEditor for MOSS - How do I suppress min-width inline CSS?

    - by James
    I'm having an issue with the RadEditor for MOSS, and I'm baffled as to its source. I tried using Firebug to find where any min-* CSS settings are coming from and the search came up empty, but I know they are added at runtime because the downloaded page markup does not contain that inline CSS. I believe that one of the JavaScript files emitted by the Telerik control is adding inline CSS to the top-level div of the editor, namely min-height and min-width. This is causing layout issues on my page. My question is: why is it doing this, and more importantly, how do I prevent it? <div style="height: 300px; width: 100%; min-height: 300px; min-width: 1133px;" class="RadEditor Default reWrapper ms-input"> Related thread

    Read the article

  • What is the best way to migrate documents into Sharepoint (MOSS) 2007?

    - by Jeramie Mercker
    I'm working with a customer that needs to migrate documents from their current document management system (not Sharepoint) into Sharepoint MOSS 2007 retaining document history and metadata. I've written a proof of concept using the Sharepoint web services and that looks promising, but the snag so far seems to be programmatically setting the created date/time and user. The webservices allow the fields to be set but implicitly override them to be the currently logged in user + date/time. For obvious reasons, I need to be able to keep the original created date/time and user on migration. Does anyone know the best way to approach this problem?

    Read the article

  • SQLAuthority News – Microsoft SQL Server 2005/2008 Query Optimization & Performance Tuning Training

    - by pinaldave
    Only 3 days are left to register for the courses. This is a one-time offer with a big discount; the deadline for course registration is 5th May, 2010. Two different courses are offered by Solid Quality Mentors:
    1) Microsoft SQL Server 2005/2008 Query Optimization & Performance Tuning – Pinal Dave. Date: May 12-14, 2010. Price: Rs. 14,000/person for 3 days. Discount Code: ‘SQLAuthority.com’. Effective Price: Rs. 11,000/person for 3 days.
    2) SharePoint 2010 – Joy Rathnayake. Date: May 10-11, 2010. Price: Rs. 11,000/person for 2 days. Discount Code: ‘SQLAuthority.com’. Effective Price: Rs. 8,000/person for 2 days.
    Download the complete PDF brochure. To register, either send an email to [email protected] or call +91 95940 43399. Feel free to drop me an email at pinal "at" SQLAuthority.com for any additional information and clarification. Training Venue: Abridge Solutions, #90/B/C/3/1, Ganesh GHR & MSY Plaza, Vittalrao Nagar, Near Image Hospital, Madhapur, Hyderabad – 500 081. Additionally, there is a special program, SolidQ India Insider, which is available only to the first few registrants of the courses. Read more details about the course here. Read my TechEd India 2010 experience here. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Optimization, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, SQL Training, SQLAuthority News, T SQL, Technology

    Read the article
