Search Results


  • mod_mono 'Service Temporarily Unavailable' issue

    - by Charlie Somerville
    I've deployed an ASP.NET web application on a Linux (Debian) server running Apache 2.2 and mod_mono 1.9. It's working well; however, Mono occasionally segfaults and pegs the CPU, which causes the website to stop working and display 'Service Temporarily Unavailable'. Killing mono fixes it, but obviously this isn't a good solution. I tailed the system log after this happened and saw the following error messages from the kernel:

        Apr 20 01:49:37 charliesomerville kernel: [1596436.204158] mono[17909]: segfault at b645f671 ip b645f671 sp b4ffb604 error 4
        <6>mono[19047]: segfault at b645f66e ip b645f66e sp b4bf7604 error 4
        <6>mono[18017]: segfault at b645f66e ip b645f66e sp b52fe604 error 4
        <6>mono[19668]: segfault at b645f5e6 ip b645f5e6 sp b48f4604 error 4
        <6>mono[22565]: segfault at b645f674 ip b645f674 sp b45f1604 error 4
        <6>mono[17700]: segfault at b645f661 ip b645f661 sp b51fd604 error 4
        <6>mono[19596]: segfault at b645f5e6 ip b645f5e6 sp b49f5604 error 4
        Apr 20 01:49:37 charliesomerville kernel: [1596436.208172] mono[23219]: segfault at b645f66e ip b645f66e sp b44f0604 error 4

    At the end of Apache's error.log are the following errors:

        [Tue Apr 20 03:10:23 2010] [error] (70014)End of file found: read_data failed
        [Tue Apr 20 03:10:23 2010] [error] Command stream corrupted, last command was 1
        [Tue Apr 20 03:10:23 2010] [error] Command stream corrupted, last command was 1
        [Tue Apr 20 03:10:23 2010] [error] Command stream corrupted, last command was 1
        System.ArgumentNullException: null key
        Parameter name: key
          at System.Collections.Hashtable.get_Item (System.Object key) [0x00000]
          at System.Runtime.Serialization.SerializationCallbacks.GetSerializationCallbacks (System.Type t) [0x00000]
          at System.Runtime.Serialization.ObjectManager.RaiseOnDeserializingEvent (System.Object obj) [0x00000]
          at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ReadObjectContent (System.IO.BinaryReader reader, System.Runtime.Serialization.Formatters.Binary.TypeMetadata metadata, Int64 objectId, System.Object& objectInstance, System.Runtime.Serialization.SerializationInfo& info) [0x00000]
          at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ReadObjectInstance (System.IO.BinaryReader reader, Boolean isRuntimeObject, Boolean hasTypeInfo, System.Int64& objectId, System.Object& value, System.Runtime.Serialization.SerializationInfo& info) [0x00000]
          at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ReadObject (BinaryElement element, System.IO.BinaryReader reader, System.Int64& objectId, System.Object& value, System.Runtime.Serialization.SerializationInfo& info) [0x00000]
          at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ReadNextObject (System.IO.BinaryReader reader) [0x00000]
          at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ReadObjectGraph (System.IO.BinaryReader reader, Boolean readHeaders, System.Object& result, System.Runtime.Remoting.Messaging.Header[]& headers) [0x00000]
          at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.NoCheckDeserialize (System.IO.Stream serializationStream, System.Runtime.Remoting.Messaging.HeaderHandler handler) [0x00000]
          at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Deserialize (System.IO.Stream serializationStream) [0x00000]
          at System.Runtime.Remoting.Channels.CADSerializer.DeserializeObject (System.IO.MemoryStream mem) [0x00000]
          at System.Runtime.Remoting.RemotingServices.GetDomainProxy (System.AppDomain domain) [0x00000]
          at System.AppDomain.CreateDomain (System.String friendlyName, System.Security.Policy.Evidence securityInfo, System.AppDomainSetup info) [0x00000]
          at System.Web.Hosting.ApplicationHost.CreateApplicationHost (System.Type hostType, System.String virtualDir, System.String physicalDir) [0x00000]
          at Mono.WebServer.VPathToHost.CreateHost (Mono.WebServer.ApplicationServer server, Mono.WebServer.WebSource webSource) [0x00000]
          at Mono.WebServer.ApplicationServer.GetApplicationForPath (System.String vhost, Int32 port, System.String path, Boolean defaultToRoot) [0x00000]
          at (wrapper remoting-invoke-with-check) Mono.WebServer.ApplicationServer:GetApplicationForPath (string,int,string,bool)
          at Mono.WebServer.ModMonoWorker.GetOrCreateApplication (System.String vhost, Int32 port, System.String filepath, System.String virt) [0x00000]
          at Mono.WebServer.ModMonoWorker.InnerRun (System.Object state) [0x00000]
          at Mono.WebServer.ModMonoWorker.Run (System.Object state) [0x00000]
        [Tue Apr 20 03:10:26 2010] [error] (70014)End of file found: read_data failed
        [Tue Apr 20 03:10:26 2010] [error] Command stream corrupted, last command was -1

    Along with the above errors, Apache's error.log is packed with hundreds (if not thousands) of the following error:

        Maximum number (20) of concurrent mod_mono requests to /tmp/mod_mono_dashboard_default_2.lock reached. Droping request.

    At the moment, I'm thinking there might be something wrong with the configuration here (it's basically running on an out-of-the-box config).
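
    One knob worth checking for the "Maximum number (20)" flood is mod_mono's dashboard request cap, which is configurable from the Apache vhost. This is a hedged sketch - the directive names below come from later mod_mono releases, so verify they exist in a 1.9 build before relying on them:

        # Apache vhost excerpt (assumption: directives available in your mod_mono build)
        MonoMaxActiveRequests 50     # raises the cap of 20 seen in the log
        MonoMaxWaitingRequests 100   # how many requests may queue beyond the active cap

    Raising the cap only buys headroom, of course; the segfaults themselves point at a Mono 1.9 runtime bug, and upgrading Mono/mod_mono would be the more durable fix.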

  • SNTP, why do you mock me?!

    - by Matthew
    --- SOLVED, SEE EDIT 5 ---

    My W2K3 PDC is configured as an authoritative time server. Other servers on the domain are able to sync with it if I manually specify it in the peer list. But if I try to sync from flags 'domhier', it won't resync; I get the error message "The computer did not resync because no time data was available." I can only think that it is not querying the PDC. I also tried setting the registry as shown here (http://support.microsoft.com/kb/193825), but no luck (I have not restarted the server; I am hoping I won't have to since it is the PDC). If you would like any further information on my config, please let me know.

    Edit 1: I have set the w32time service config AnnounceFlags to 0x05 as documented at www.krr.org/microsoft/authoritative_time_servers.php and a number of other places. The PDC syncs to an external time source (NTP). I can get the stripchart on the client from the PDC no problem. The logon server for the host I am trying to configure is shown as the PDC.

    Edit 2: The packet capture has revealed something interesting. The client is contacting the correct server and getting a valid response, but I still get the same error message. Here is the NTP excerpt from the client to the server:

        Flags: 11.. .... = Leap Indicator: alarm condition (clock not synchronized) (3)
               ..01 1... = Version number: NTP Version 3 (3)
               .... .011 = Mode: client (3)
        Peer Clock Stratum: unspecified or unavailable (0)
        Peer Polling Interval: 10 (1024 sec)
        Peer Clock Precision: 0.015625 sec
        Root Delay: 0.0000 sec
        Root Dispersion: 1.0156 sec
        Reference Clock ID: NULL
        Reference Clock Update Time: Sep 1, 2010 05:29:39.8170 UTC
        Originate Time Stamp: NULL
        Receive Time Stamp: NULL
        Transmit Time Stamp: Nov 8, 2010 01:44:44.1450 UTC
        Key ID: DC080000

    Here is the reply NTP excerpt from the server to the client:

        Flags: 0x1c
               00.. .... = Leap Indicator: no warning (0)
               ..01 1... = Version number: NTP Version 3 (3)
               .... .100 = Mode: server (4)
        Peer Clock Stratum: secondary reference (3)
        Peer Polling Interval: 10 (1024 sec)
        Peer Clock Precision: 0.00001 sec
        Root Delay: 0.1484 sec
        Root Dispersion: 0.1060 sec
        Reference Clock ID: 192.189.54.17
        Reference Clock Update Time: Nov 8, 2010 01:18:04.6223 UTC
        Originate Time Stamp: Nov 8, 2010 01:44:44.1450 UTC
        Receive Time Stamp: Nov 8, 2010 01:46:44.1975 UTC
        Transmit Time Stamp: Nov 8, 2010 01:46:44.1975 UTC
        Key ID: 00000000

    Edit 3: dumpreg for Parameters on the PDC:

        Value Name    Value Type       Value Data
        ------------------------------------------------------------------------
        ServiceMain   REG_SZ           SvchostEntry_W32Time
        ServiceDll    REG_EXPAND_SZ    C:\WINDOWS\system32\w32time.dll
        NtpServer     REG_SZ           bhvmmgt01.domain.com,0x1
        Type          REG_SZ           AllSync

    and Config:

        Value Name               Value Type    Value Data
        --------------------------------------------------------------------------
        LastClockRate            REG_DWORD     156249
        MinClockRate             REG_DWORD     155860
        MaxClockRate             REG_DWORD     156640
        FrequencyCorrectRate     REG_DWORD     4
        PollAdjustFactor         REG_DWORD     5
        LargePhaseOffset         REG_DWORD     50000000
        SpikeWatchPeriod         REG_DWORD     900
        HoldPeriod               REG_DWORD     5
        LocalClockDispersion     REG_DWORD     10
        EventLogFlags            REG_DWORD     2
        PhaseCorrectRate         REG_DWORD     7
        MinPollInterval          REG_DWORD     6
        MaxPollInterval          REG_DWORD     10
        UpdateInterval           REG_DWORD     100
        MaxNegPhaseCorrection    REG_DWORD     -1
        MaxPosPhaseCorrection    REG_DWORD     -1
        AnnounceFlags            REG_DWORD     5
        MaxAllowedPhaseOffset    REG_DWORD     300
        FileLogSize              REG_DWORD     10000000
        FileLogName              REG_SZ        C:\Windows\Temp\w32time.log
        FileLogEntries           REG_SZ        0-300

    Edit 4: Here are some notable lines from the NTP log file on the PDC:

        ReadConfig: failed. Use default one 'TimeJumpAuditOffset'=0x00007080
        DomainHierachy: we are now the domain root.
        ClockDispln: we're a reliable time service with no time source: LS: 0, TN: 864000000000, WAIT: 86400000

    Edit 5: F&^%ING SOLVED! I had been reading about people with similar problems; some mentioned w32time server settings applied by GPO, but I tested this early on and there were no settings applied to this service by GPO. Others said that the reporting software may not be picking up some old GPO settings. So I searched the registry for all w32time instances and came across an interesting key indicating that some other NTP software might be running on the server. Sure enough, I looked through the installed software list and there the little F*&%ER was. Uninstalled, and now it's working like a dream. FFFFFFFUUUUUUUUUUUU
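
    For reference, the stock commands for pointing a Server 2003-era member server back at the domain hierarchy are below. These are standard w32tm switches rather than the fix this particular poster needed (his culprit was third-party NTP software), but they are the usual first resort (substitute your own PDC's name in the stripchart line):

        w32tm /config /syncfromflags:domhier /update
        w32tm /resync /rediscover
        w32tm /stripchart /computer:yourpdc.domain.com /samples:3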

  • Batch script breaks off after calling another batch file

    - by Sven Arno Jopen
    I have a problem: my batch process breaks off each time after calling another batch file. The batch files are used to run a make process out of IBM Rhapsody; they translate the call from Rhapsody to the Visual Studio tools, so nmake is called from the batch after making various settings. The scripts weren't written entirely by me; I only adapted them to run under both Windows architectures, x86 and x64.

    The first script (vs2005_make.bat) is called from Rhapsody and runs up to the "call" statement. The second script (Vcvars_VisualStudio2005.bat) runs to the end. But the first script doesn't resume working; at that point the process breaks off without an error message. I'm not very familiar with batch files - this is the first time I've put more than simple console commands in one - so I hope I have given all the information needed; otherwise, ask me.

    Here is the start script (vs2005_make.bat):

        :: parameter 1 - Makefile which should be used
        :: parameter 2 - The make target mark
        @echo off
        IF "%2"=="" set target=all
        IF "%2"=="all" set target=all
        IF "%2"=="build" set target=all
        IF "%2"=="rebuild" set target=clean all
        IF "%2"=="clean" set target=clean
        set RegQry="HKLM\Hardware\Description\System\CentralProcessor\0"
        REG.exe Query %RegQry% > checkOS.txt
        Find /i "x86" < CheckOS.txt > StringCheck.txt
        IF %ERRORLEVEL%==0 (
            set arch=x86
        ) ELSE (
            set arch=x64
        )
        call "%ProgramFiles%\IBM\Rhapsody752\Share\etc\Vcvars_VisualStudio2005.bat" %arch%
        IF %ERRORLEVEL%==0 (
            set makeflags=
            nmake /nologo /S /F %1 %target%
        )
        del checkOS.txt
        del StringCheck.txt
        exit

    and here the called script (Vcvars_VisualStudio2005.bat):

        :: param 1 - Processor architecture
        @echo off
        ECHO param 1 = %1
        IF %1==x86 (
            SET ProgrammPath=%ProgramFiles%
        ) ELSE IF %1==x64 (
            SET ProgrammPath=%ProgramFiles(x86)%
        ) ELSE (
            ECHO Unknowen architectur
            EXIT /B 1
        )
        SET VSINSTALLDIR="%ProgrammPath%\Microsoft Visual Studio 8\Common7\IDE"
        SET VCINSTALLDIR="%ProgrammPath%\Microsoft Visual Studio 8"
        SET FrameworkDir=C:\WINDOWS\Microsoft.NET\Framework
        SET FrameworkVersion=v2.0.50727
        SET FrameworkSDKDir="%ProgrammPath%\Microsoft Visual Studio 8\SDK\v2.0"
        rem Root of Visual Studio common files.
        IF %VSINSTALLDIR%=="" GOTO Usage
        IF %VCINSTALLDIR%=="" SET VCINSTALLDIR=%VSINSTALLDIR%
        rem
        rem Root of Visual Studio ide installed files.
        rem
        SET DevEnvDir=%VSINSTALLDIR%
        rem
        rem Root of Visual C++ installed files.
        rem
        SET MSVCDir=%VCINSTALLDIR%\VC
        SET PATH=%DevEnvDir%;%MSVCDir%\BIN;%VCINSTALLDIR%\Common7\Tools;%VCINSTALLDIR%\Common7\Tools\bin\prerelease;%VCINSTALLDIR%\Common7\Tools\bin;%FrameworkSDKDir%\bin;%FrameworkDir%\%FrameworkVersion%;%PATH%;
        SET INCLUDE=%MSVCDir%\ATLMFC\INCLUDE;%MSVCDir%\INCLUDE;%MSVCDir%\PlatformSDK\include\gl;%MSVCDir%\PlatformSDK\include;%FrameworkSDKDir%\include;%INCLUDE%
        SET LIB=%MSVCDir%\ATLMFC\LIB;%MSVCDir%\LIB;%MSVCDir%\PlatformSDK\lib;%FrameworkSDKDir%\lib;%LIB%
        GOTO end
        :Usage
        ECHO. VSINSTALLDIR variable is not set.
        ECHO.
        ECHO SYNTAX: %0
        GOTO end
        :end

    Here is the console output. What I don't understand, and find very suspicious, is that after the "IF %ERRORLEVEL%" statement in the first script everything is echoed, regardless of echo being set to off:

        Executing: "C:\Programme\IBM\Rhapsody752\Share\etc\vs2005_make.bat" Simulation.mak build
        IF %ERRORLEVEL%==0 (
        Mehr? set arch=x86
        Mehr? ) ELSE (
        Mehr? set arch=x64
        Mehr? )
        call "%ProgramFiles%\IBM\Rhapsody752\Share\etc\Vcvars_VisualStudio2005.bat" %arch%
        :: param 1 - Processor architecture
        @echo off
        ECHO param 1 = %1
        param 1 = x86
        IF %1==x86 (
        Mehr? SET ProgrammPath=%ProgramFiles%
        Mehr? ) ELSE IF %1==x64 (
        Mehr? SET ProgrammPath=%ProgramFiles(x86)%
        Mehr? ) ELSE (
        Mehr? ECHO Unknowen architectur
        Mehr? EXIT /B 1
        Mehr? )
        SET VSINSTALLDIR="%ProgrammPath%\Microsoft Visual Studio 8\Common7\IDE"
        SET VCINSTALLDIR="%ProgrammPath%\Microsoft Visual Studio 8"
        SET FrameworkDir=C:\WINDOWS\Microsoft.NET\Framework
        SET FrameworkVersion=v2.0.50727
        SET FrameworkSDKDir="%ProgrammPath%\Microsoft Visual Studio 8\SDK\v2.0"
        rem Root of Visual Studio common files.
        IF %VSINSTALLDIR%=="" GOTO Usage
        IF %VCINSTALLDIR%=="" SET VCINSTALLDIR=%VSINSTALLDIR%
        rem
        rem Root of Visual Studio ide installed files.
        rem
        SET DevEnvDir=%VSINSTALLDIR%
        rem
        rem Root of Visual C++ installed files.
        rem
        SET MSVCDir=%VCINSTALLDIR%\VC
        SET PATH=%DevEnvDir%;%MSVCDir%\BIN;%VCINSTALLDIR%\Common7\Tools;%VCINSTALLDIR%\Common7\Tools\bin\prerelease;%VCINSTALLDIR%\Common7\Tools\bin;%FrameworkSDKDir%\bin;%FrameworkDir%\%FrameworkVersion%;%PATH%;
        SET INCLUDE=%MSVCDir%\ATLMFC\INCLUDE;%MSVCDir%\INCLUDE;%MSVCDir%\PlatformSDK\include\gl;%MSVCDir%\PlatformSDK\include;%FrameworkSDKDir%\include;%INCLUDE%
        SET LIB=%MSVCDir%\ATLMFC\LIB;%MSVCDir%\LIB;%MSVCDir%\PlatformSDK\lib;%FrameworkSDKDir%\lib;%LIB%
        GOTO end
        Build Done

    I hope someone has an idea; I've been working on this for two days now and can't find the error. Thanking you in anticipation.

    Note: The word "Mehr?" in the output is German for "More?" - it is cmd.exe's localized line-continuation prompt.
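
    Two observations, offered as educated guesses rather than a confirmed diagnosis. First, the "Mehr?" ("More?") continuation prompt only appears when cmd reads commands interactively, which suggests Rhapsody is piping the script into cmd's standard input rather than executing the file - multi-line (...) blocks then stall waiting for "more" input. Second, on x64 the unquoted SET expands %ProgramFiles(x86)% to "C:\Program Files (x86)", whose closing parenthesis can terminate an IF block early. A sketch that sidesteps both issues by quoting the assignments and keeping each IF on one line (hypothetical rewrite, not tested against Rhapsody):

        :: One-line IFs: nothing for a stdin-fed cmd to prompt "More?" about,
        :: and the quotes keep the ')' in "Program Files (x86)" from closing a block.
        IF "%1"=="x86" SET "ProgrammPath=%ProgramFiles%"
        IF "%1"=="x64" SET "ProgrammPath=%ProgramFiles(x86)%"
        IF NOT DEFINED ProgrammPath (ECHO Unknown architecture & EXIT /B 1)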

  • Remote Debug Windows Azure Cloud Service

    - by Shaun
    Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/11/02/remote-debug-windows-azure-cloud-service.aspx

    On the 22nd of October Microsoft announced the new Windows Azure SDK 2.2. It introduced a lot of cool features, but one stood out: remote debug support for Windows Azure Cloud Service (a.k.a. WACS).

    Live Debug is a Nightmare for Cloud Applications

    When we develop against a public cloud, debugging might be the most difficult task, especially after the application has been deployed. To minimize the debugging effort, Microsoft provided a local emulator for cloud service and storage once the Windows Azure platform was announced. Using the local emulator, developers can run their application on the local machine with almost the same behavior as on Windows Azure, and debug it easily and quickly. But once we deploy our application to Azure, we have to rely on logs and the diagnostic monitor to debug, which is very inefficient. Visual Studio 2012 introduced a new feature named "anonymous remote debug" which allows any workstation under any user to attach to a remote process. This is less secure compared to authenticated remote debugging, but much easier and simpler to use. Now, with Windows Azure SDK 2.2, we can attach to our application on Windows Azure from our local machine - and it's very easy.

    How to Use the Remote Debugger

    First, create a new Windows Azure Cloud Project in Visual Studio and select ASP.NET Web Role, then create an ASP.NET WebForms application. Right-click on the cloud project and select "Publish". In the publish dialog we need to make sure the application will be built in debug mode, since a .NET assembly cannot be debugged in release mode. I enabled Remote Desktop because I will log into the virtual machine later in this post; it is NOT necessary for remote debugging. On the "advanced settings" tab, make sure "Enable Remote Debugger for all roles" is checked. In WACS, a cloud service can have one or more roles, and each role can have one or more instances. If checked, the remote debugger will be enabled for all roles and all instances; currently there is no way to specify which role(s) and which instance(s) to enable. Finally, click the "publish" button. In the Windows Azure activity window in Visual Studio we can find some information about the remote debugger.

    Attaching to the remote process is easy. Open the "Server Explorer" window in Visual Studio, expand the "Cloud Services" node, find the cloud service, role and instance we just published and want to debug, right-click on the instance and select "Attach Debugger". After a while (depending on how fast our Internet connection to the Windows Azure data center is), Visual Studio switches to debug mode. Let's add a breakpoint in the default web page's form load function and refresh the page in the browser to see what happens. We can see that our application stopped at the breakpoint; the call stack and watch features are all available to use. Now let's hit F5 to continue, and back in the browser the page renders successfully.

    What's Under the Hood

    The remote debugger is a WACS plugin. When we check "enable remote debugger" in the publish dialog, Visual Studio adds two cloud configuration settings to the CSCFG file. Since they are appended at deployment time, we cannot find them in our project's CSCFG file, but they are visible if we open the publish package. At the same time, Visual Studio generates a certificate and includes it in the package for the remote debugger. In the Azure management portal we will find a certificate under our application that was created and uploaded by the remote debugger plugin. (Since I enabled Remote Desktop, there were two certificates on my deployment; the other one is for the remote debugger.) When our application is deployed, the Windows Azure system opens the related ports for the remote debugger - on my application there were two new ports opened. Finally, on our WACS virtual machine, the Windows Azure system copies and starts the remote debug component matching the version of Visual Studio we are using. Our application can then be debugged remotely through the Visual Studio remote debugger, as the task manager on the virtual machine of my WACS application showed.

    Summary

    In this post I demonstrated one of the features introduced in Windows Azure SDK 2.2: the remote debugger. It allows us to attach to our application on a Windows Azure virtual machine from the local machine once it has been deployed. The remote debugger is powerful and easy to use, but it brings more security risk, and since it's only available for debug builds, performance will be worse than a release build. Hence we should only use this feature for staging tests and bug fixing (publish the beta version to the Azure staging slot), rather than for production.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

  • BizTalk 2009 - Installing BizTalk Server 2009 on XP for Development

    - by StuartBrierley
    At my previous employer, when developing for BizTalk Server 2004 using Visual Studio 2003, we made use of separate development and deployment environments: developing in Visual Studio on our client PCs and then deploying to a separate shared BizTalk 2004 server from there. This server was part of a multi-server Standard BizTalk environment comprising separate BizTalk Server 2004 and SQL Server 2000 servers. The environment was implemented a number of years ago by an outside consulting company, and while it worked, it did occasionally cause contention issues with three developers deploying to the same server to carry out unit testing!

    Now that I am making the design and implementation decisions about the environment that BizTalk will be developed in and deployed to, I have chosen to create a single "server" installation on my development PC, installing SQL Server 2008, Visual Studio 2008 and BizTalk Server 2009 on a single system. The client PC in use is actually a MacBook Pro running Windows XP; not the most powerful of systems for high-volume processing, but it should be powerful enough to allow development and initial unit testing to take place. I did not need to, and so chose not to, install all of the components detailed in the Microsoft guide for installing BizTalk 2009 on Windows XP, but I did follow the basics of the procedures detailed within. Outlined below are the highlights of this process and the choices I made.

    Install IIS

    I had previously installed Windows XP, including all current service packs and critical updates. At the time of installation this included Service Pack 3, the .NET Framework 3.5 and MS Windows Installer 3.1. With a running XP system, my first step was to install IIS - this is quite straightforward and posed no difficulties.

    Install Visual Studio 2008

    The next step for me was to install Visual Studio 2008. Making sure to select a custom installation is crucial at this point, as you need to deselect SQL Server 2005 Express Edition because it can cause the BizTalk installation to fail. The installation guide suggests that you only select Visual C# when selecting features to install, but I decided that, due to some legacy systems I have code for, I would also select the VB and ASP options.

    Visual Studio 2008 Service Pack 1

    Following the completion of the installation of Visual Studio itself, install Visual Studio 2008 Service Pack 1.

    SQL Server 2008 Standard Edition

    The next step before installing BizTalk Server 2009 itself is to install SQL Server 2008 Standard Edition. On the feature selection screen make sure that you select the following options:

      - Database Engine Services
      - SQL Server Replication
      - Full-Text Search
      - Analysis Services
      - Reporting Services
      - Business Intelligence Development Studio
      - Client Tools Connectivity
      - Integration Services
      - Management Tools Basic and Complete

    Use the default instance and the same accounts for all SQL Server instances - in my case I used the Network Service and Local Service accounts for the two sets of accounts. On the database engine configuration screen I selected Windows authentication and added the current user, adding the same user again on the Analysis Services configuration screen. All other screens were left on the default settings. The SQL Server 2008 installation also included the installation of hotfix for XP KB942288-v3, the Windows Installer 4.5 Redistributable.

    System Configuration

    At this stage I took a moment to disable the SQL Server Shared Memory protocol and enable the Named Pipes and TCP/IP protocols. These can be found in SQL Server Configuration Manager > SQL Server Network Configuration > Protocols for MSSQLSERVER. I also made sure that the DTC settings were configured correctly.

    BizTalk Server 2009

    The penultimate step is to install BizTalk Server 2009 Standard Edition. I had previously downloaded the redistributable prerequisites as a CAB file, so I was able to make use of this when carrying out the installation. When selecting which components to install I selected:

      - Server Runtime
      - BizTalk EDI/AS2 Runtime
      - WCF Adapter Runtime
      - Portal Components
      - Administrative Tools
      - WCF Administration Tools
      - Developer Tools and SDK
      - Enterprise SSO Administration Module
      - Enterprise SSO Master Secret Server
      - Business Rules Components
      - BAM Alert Provider
      - BAM Client
      - BAM Eventing

    Once installation has completed, clear the "Launch BizTalk Server Configuration" check box and select Finish.

    Verify the Installation

    Before configuring BizTalk Server it is a good idea to check that BizTalk Server 2009 is installed and that SQL Server 2008 has started correctly. The easiest way to verify the BizTalk installation is to check Programs and Features in Control Panel. Check that SQL has started by looking in SQL Server Configuration Manager.

    Configure BizTalk Server 2009

    Finally we are ready to configure BizTalk Server 2009. To start this I opted for a custom configuration that allowed me to choose the settings to be used in more detail. For all databases I selected the local server and default database names. For all accounts I used a local account that had been created specifically for the BizTalk services. For all Windows groups I allowed the configuration wizard to create the default local groups. The configuration wizard then ran.

    Upon completion you will be presented with a screen detailing the success or failure of the configuration. If your configuration failed, you will need to sort out the issues and try again (it is possible to save the configuration settings for later use if you want to - except passwords, of course!). If you see lots of nice green ticks - congratulations, BizTalk Server 2009 on XP is now installed and configured ready for development.
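
    As a scriptable alternative to the "Verify the Installation" step above, the SQL services can also be queried from a command prompt (service names assume the default database and Analysis Services instances):

        sc query MSSQLSERVER
        sc query MSSQLServerOLAPService

    Both should report a STATE of RUNNING before you start the BizTalk configuration wizard.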

  • Adventures in Windows 8: Understanding and debugging design time data in Expression Blend

    - by Laurent Bugnion
    One of my favorite features in Expression Blend is the ability to attach a Visual Studio debugger to Blend. First, let's answer the question: why exactly would you want to do that?

    Note: If you are familiar with the creation and usage of design time data, feel free to scroll down to the paragraph titled "When design time data fails".

    Creating design time data for your app

    When a designer works on an app, he needs to see something to design. For "static" UI such as buttons, backgrounds, etc., the user interface elements are going to show up in Blend just fine. If however the data is fetched dynamically from a service (web, database, etc.) or created dynamically, Blend will most probably show just an empty element. The classical way to design at that stage is to run the application, navigate to the screen under construction (which can involve delays, the need to log in, etc.), measure what is on the screen (colors, margins, width and height, etc.) using various tools, go back to Blend, edit the properties of the elements, run again, and so on. Obviously this is not ideal. The solution is to create design time data. For more information about creating design time data by mocking services, you can refer to two talks of mine, "Deep dive MVVM" and "MVVM Applied From Silverlight to Windows Phone to Windows 8". The source code for these talks is here and here.

    Design time data in MVVM Light

    One of the main reasons why I developed MVVM Light is to facilitate the creation of design time data. To illustrate this, let's create a new MVVM Light application in Visual Studio:

      - Install MVVM Light from http://mvvmlight.codeplex.com (use the MSI in the Download section). After installing, make sure to read the Readme that opens up in your favorite browser; you will need one more step to install the project templates.
      - Start Visual Studio 2012.
      - Create a new MvvmLight (Win8) app.
      - Run the application. You will see a string showing "Welcome to MVVM Light".
      - In the Solution Explorer, right-click on MainPage.xaml and select Open in Blend. Now you should see "Welcome to MVVM Light [Design]".

    What happens here is that Expression Blend runs different code at design time than the application runs at runtime. To do this, we use design-time detection (as explained in a previous article) and use that information to initialize a different data service at design time. To understand this better, open the ViewModelLocator.cs file in the ViewModel folder and see how DesignDataService is used at design time, while DataService is used at runtime. In a real-life application, DataService would be used to connect to a web service, for instance.

    When design time data fails

    Sometimes, however, the creation of design time data fails. It can be very difficult to understand exactly what is happening, and Expression Blend does not give a lot of information about what went wrong. Thankfully, we can use a trick: attaching a Visual Studio debugger to Expression Blend and debugging the design time code. In WPF and Silverlight (including Windows Phone 7), you could simply attach the debugger to Blend.exe (using the "Managed (v4.5, v4.0) code" option, even for Silverlight!). In Windows 8, however, things are just a bit different, because the designer that renders the actual representation of the Windows 8 app runs in its own process. Let's illustrate that:

      - Open the file DesignDataService in the Design folder.
      - Modify the GetData method to look like this:

        public void GetData(Action<DataItem, Exception> callback)
        {
            throw new Exception();

            // Use this to create design time data
            var item = new DataItem("Welcome to MVVM Light [design]");
            callback(item, null);
        }

      - Go to Blend and build the application. The build succeeds, but now the page is empty. The creation of the design time data failed, but we don't get a warning message. We need to investigate what's wrong.
      - Close MainPage.xaml.
      - Go to Visual Studio and select the menu Debug > Attach to Process. (Update: Make sure that you select "Managed (v4.5, v4.0) code" in the "Attach to" field.)
      - Find the process named XDesProc.exe. You should have at least two: one for the Visual Studio 2012 designer surface, and one for Expression Blend. Unfortunately, in this screen it is not obvious which is which, so let's find out in the Task Manager.
      - Press Ctrl-Alt-Del and select Task Manager.
      - Go to the Details tab and sort the processes by name. Find the one that says "Blend for Microsoft Visual Studio 2012 XAML UI Designer" and write down the process ID.
      - Go back to the Attach to Process dialog in Visual Studio, sort the processes by ID, and attach the debugger to the correct instance of XDesProc.exe.
      - Open the MainViewModel (in the ViewModel folder) and place a breakpoint on the first line of the MainViewModel constructor.
      - Go to Blend and open MainPage.xaml again.

    At this point, the debugger breaks in Visual Studio and you can execute your code step by step. Simply step into the data service call and find the exception that you had placed there. Visual Studio gives you additional information which helps you to solve the issue.

    More info and Conclusion

    I want to thank the amazing people on the Expression Blend team for being very fast in guiding me in this matter and encouraging me to blog about it. More information about the XDesProc.exe process can be found here. I had to work on a Windows 8 app for a few days without design time data because of an exception thrown somewhere in the code, and it was really painful. With the debugger, finding the issue was a simple matter of stepping into the code until it threw the exception.

    Laurent Bugnion (GalaSoft)
    Subscribe | Twitter | Facebook | Flickr | LinkedIn
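
    For context, the design-time switch this article relies on lives in ViewModelLocator.cs. A minimal sketch of that pattern as the MVVM Light V4 templates generate it (written from memory, so treat the exact member names as assumptions and check your generated file):

        // ViewModelBase.IsInDesignModeStatic and SimpleIoc ship with MVVM Light;
        // IDataService, DataService and DesignDataService come from the project template.
        static ViewModelLocator()
        {
            ServiceLocator.SetLocatorProvider(() => SimpleIoc.Default);

            if (ViewModelBase.IsInDesignModeStatic)
            {
                // Blend and the VS designer get the mocked service
                SimpleIoc.Default.Register<IDataService, DesignDataService>();
            }
            else
            {
                // The running app gets the real one
                SimpleIoc.Default.Register<IDataService, DataService>();
            }

            SimpleIoc.Default.Register<MainViewModel>();
        }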

  • Package Manager Console For More Than Managing Packages

    - by Steve Michelotti
    Like most developers, I prefer not to pick up the mouse if I don't have to. I use the Executor launcher for almost everything, so it's extremely rare for me to ever click the "Start" button in Windows. I also use shortcut keys where I can, again to avoid picking up the mouse. By now most people know that the Package Manager Console that comes with NuGet is PowerShell embedded inside of Visual Studio. It is based on its PowerConsole predecessor, which was the first (that I'm aware of) to embed PowerShell inside of Visual Studio and give access to the Visual Studio automation DTE object. It does this through an inherent $dte variable that is automatically available and ready for use; this variable is also available inside the NuGet Package Manager Console.

    Adding a new class file to a Visual Studio project is one of those mundane tasks that should be easier. First I have to pick up the mouse. Then I have to right-click where I want the file to go and select "Add -> New Item..." or "Add -> Class...". If you know the Ctrl+Shift+A shortcut, you can avoid the mouse for adding a new item, but you have to manually assign a shortcut for adding a new class. At this point it pops up a dialog just so I can enter the name of the class I want. Since this is one of the most common tasks developers do, I figure there has to be an easier way - one that avoids picking up the mouse and popping up dialogs. This is where your embedded PowerShell prompt in Visual Studio comes in.

    The first thing you should do is assign a keyboard shortcut so that you can get a PowerShell prompt (i.e., the Package Manager Console) quickly, without ever picking up the mouse. I assign "Ctrl+P, Ctrl+M" because "P + M" stands for "Package Manager", so it is easy to remember. At this point I can type this command to add a new class:

        PM> $dte.ItemOperations.AddNewItem("Code\Class", "Foo.cs")

    which results in the class being added. At this point I've satisfied my original goal of not having to pick up a mouse and not having the "Add New Item" dialog pop up. However, having to remember that $dte method call is not very user-friendly at all. The best thing to do is to make this a reusable function that always loads when Visual Studio starts up. There is a $profile variable that you can use to figure out where that location is on your machine:

        PM> $profile
        C:\Users\steve.michelotti\Documents\WindowsPowerShell\NuGet_profile.ps1

    If the NuGet_profile.ps1 file does not already exist, you can just create it yourself and place it in that directory. Now you can put a function inside like this:

        function addClass($className)
        {
            if ($className.EndsWith(".cs") -eq $false) {
                $className = $className + ".cs"
            }

            $dte.ItemOperations.AddNewItem("Code\Class", $className)
        }

    Since it's in the NuGet_profile.ps1 file, this function will automatically be available every time I start Visual Studio. Now I can simply do this:

        PM> addClass Foo

    At this point, we have a *very* nice developer experience. All I did to add a new class was: "Ctrl-P, Ctrl-M", then "addClass Foo". No mouse, no pop-up dialogs, no complex commands to remember. In fact, PowerShell gives you auto-completion as well: if I type "addc" followed by [TAB], intellisense pops up with my custom function in it, and I can type the next letter "c" and [TAB] to auto-complete the command.

    And if that's still too many keystrokes for you, you can create your own PowerShell custom alias for the function like this:

        PM> Set-Alias addc addClass
        PM> addc Foo

    While all this is very useful, I did run into some issues which prompted me to make even further customizations. The command above adds the new class file to the current active directory, and depending on your context, this may not be what you want. For example, by convention all view model objects go in the "Models" folder in an MVC project. So if the current document is in the Controllers folder, the command will add your class to that folder, which is not what you want - when you are adding a new model in an MVC project, it should always go to the "Models" folder. For this situation, I added a new function called "addModel" which looks like this:

        function addModel($className)
        {
            if ($className.EndsWith(".cs") -eq $false) {
                $className = $className + ".cs"
            }

            $modelsDir = $dte.ActiveSolutionProjects[0].UniqueName.Replace(".csproj", "") + "\Models"
            $dte.Windows.Item([EnvDTE.Constants]::vsWindowKindSolutionExplorer).Activate()
            $dte.ActiveWindow.Object.GetItem($modelsDir).Select([EnvDTE.vsUISelectionType]::vsUISelectionTypeSelect)
            $dte.ItemOperations.AddNewItem("Code\Class", $className)
        }

    First I figure out the path to the Models directory (the line that sets $modelsDir). Then I activate the Solution Explorer window. Then I make sure the Models directory is selected, so that my context is correct when I add the new class and it is added to the Models directory as desired.

    These are just a couple of examples of things you can do with the PowerShell prompt that you have available in the Package Manager Console. As developers, we spend so much time in Visual Studio - why not customize it so that you can work in whatever way you want to work?! The next time you're not happy about the way Visual Studio makes you do a particular task - automate it! The sky is the limit.

  • "The system time has changed" events after waking from sleep

    - by Damir Arh
    Sometimes when my computer running Windows 7 wakes up from sleep, it has to adjust the time. When this happens, the following system event is logged:

        <Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'>
          <System>
            <Provider Name='Microsoft-Windows-Kernel-General' Guid='{A68CA8B7-004F-D7B6-A698-07E2DE0F1F5D}'/>
            <EventID>1</EventID>
            <Version>0</Version>
            <Level>4</Level>
            <Task>0</Task>
            <Opcode>0</Opcode>
            <Keywords>0x8000000000000010</Keywords>
            <TimeCreated SystemTime='2010-03-06T19:09:57.500000000Z'/>
            <EventRecordID>10672</EventRecordID>
            <Correlation/>
            <Execution ProcessID='4' ThreadID='56'/>
            <Channel>System</Channel>
            <Computer>GAME</Computer>
            <Security/>
          </System>
          <EventData>
            <Data Name='NewTime'>2010-03-06T19:09:57.500000000Z</Data>
            <Data Name='OldTime'>2010-03-06T17:34:32.870117200Z</Data>
          </EventData>
          <RenderingInfo Culture='sl-SI'>
            <Message>The system time has changed to 2010-03-06T19:09:57.500000000Z from 2010-03-06T17:34:32.870117200Z.</Message>
            <Level>Information</Level>
            <Task></Task>
            <Opcode>Info</Opcode>
            <Channel>System</Channel>
            <Provider>Microsoft-Windows-Kernel-General</Provider>
            <Keywords>
              <Keyword>Time</Keyword>
            </Keywords>
          </RenderingInfo>
        </Event>

    When this happens (I have noticed it twice so far), the old time always corresponds to the time when the computer entered sleep. The problem is that if Windows Media Center is scheduled to record during this time, it just skips the recording as if the computer were turned off. I never had this problem running Windows Vista on the same machine. Any ideas about what could be causing this problem and how to solve it are welcome.
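
    To gauge how often this happens, the matching Kernel-General time-change events can be pulled straight from the System log with wevtutil (a stock Windows 7 tool; event ID 1 matches the record shown above):

        wevtutil qe System /q:"*[System[Provider[@Name='Microsoft-Windows-Kernel-General'] and (EventID=1)]]" /f:text /c:10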

  • Failed mdadm array with ext4 partition - "e2fsck: unable to set superblock flags on /dev/md0"

    - by Matthew Hodgkins
    Had a power failure and now my mdadm array is having problems.

        [hodge@hodge-fs ~]$ sudo mdadm -D /dev/md0
        /dev/md0:
                Version : 0.90
          Creation Time : Sun Apr 25 01:39:25 2010
             Raid Level : raid5
             Array Size : 8790815232 (8383.57 GiB 9001.79 GB)
          Used Dev Size : 1465135872 (1397.26 GiB 1500.30 GB)
           Raid Devices : 7
          Total Devices : 7
        Preferred Minor : 0
            Persistence : Superblock is persistent
            Update Time : Sat Aug 7 19:10:28 2010
                  State : clean, degraded, recovering
         Active Devices : 6
        Working Devices : 7
         Failed Devices : 0
          Spare Devices : 1
                 Layout : left-symmetric
             Chunk Size : 128K
         Rebuild Status : 10% complete
                   UUID : 44a8f730:b9bea6ea:3a28392c:12b22235 (local to host hodge-fs)
                 Events : 0.1307608

            Number   Major   Minor   RaidDevice State
               0       8       81        0      active sync   /dev/sdf1
               1       8       97        1      active sync   /dev/sdg1
               2       8      113        2      active sync   /dev/sdh1
               3       8       65        3      active sync   /dev/sde1
               4       8       49        4      active sync   /dev/sdd1
               7       8       33        5      spare rebuilding   /dev/sdc1
               6       8       16        6      active sync   /dev/sdb

        [hodge@hodge-fs ~]$ sudo mount -a
        mount: wrong fs type, bad option, bad superblock on /dev/md0,
               missing codepage or helper program, or other error
               In some cases useful info is found in syslog - try
               dmesg | tail or so

        [hodge@hodge-fs ~]$ sudo fsck.ext4 /dev/md0
        e2fsck 1.41.12 (17-May-2010)
        fsck.ext4: Group descriptors look bad... trying backup blocks...
        /dev/md0: recovering journal
        fsck.ext4: unable to set superblock flags on /dev/md0

        [hodge@hodge-fs ~]$ sudo dumpe2fs /dev/md0 | grep -i superblock
        dumpe2fs 1.41.12 (17-May-2010)
        Primary superblock at 0, Group descriptors at 1-524
        Backup superblock at 32768, Group descriptors at 32769-33292
        Backup superblock at 98304, Group descriptors at 98305-98828
        Backup superblock at 163840, Group descriptors at 163841-164364
        Backup superblock at 229376, Group descriptors at 229377-229900
        Backup superblock at 294912, Group descriptors at 294913-295436
        Backup superblock at 819200, Group descriptors at 819201-819724
        Backup superblock at 884736, Group descriptors at 884737-885260
        Backup superblock at 1605632, Group descriptors at 1605633-1606156
        Backup superblock at 2654208, Group descriptors at 2654209-2654732
        Backup superblock at 4096000, Group descriptors at 4096001-4096524
        Backup superblock at 7962624, Group descriptors at 7962625-7963148
        Backup superblock at 11239424, Group descriptors at 11239425-11239948
        Backup superblock at 20480000, Group descriptors at 20480001-20480524
        Backup superblock at 23887872, Group descriptors at 23887873-23888396
        Backup superblock at 71663616, Group descriptors at 71663617-71664140
        Backup superblock at 78675968, Group descriptors at 78675969-78676492
        Backup superblock at 102400000, Group descriptors at 102400001-102400524
        Backup superblock at 214990848, Group descriptors at 214990849-214991372
        Backup superblock at 512000000, Group descriptors at 512000001-512000524
        Backup superblock at 550731776, Group descriptors at 550731777-550732300
        Backup superblock at 644972544, Group descriptors at 644972545-644973068
        Backup superblock at 1934917632, Group descriptors at 1934917633-1934918156

        [hodge@hodge-fs ~]$ sudo e2fsck -b 32768 /dev/md0
        e2fsck 1.41.12 (17-May-2010)
        /dev/md0: recovering journal
        e2fsck: unable to set superblock flags on /dev/md0

        [hodge@hodge-fs ~]$ sudo dmesg | tail
        EXT4-fs (md0): ext4_check_descriptors: Checksum for group 0 failed (59837!=29115)
        EXT4-fs (md0): group descriptors corrupted!
        EXT4-fs (md0): ext4_check_descriptors: Checksum for group 0 failed (59837!=29115)
        EXT4-fs (md0): group descriptors corrupted!

    Please help!!!
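
    Two things stand out in the -D output: the array is still rebuilding onto the spare (10% complete), and e2fsck keeps dying while replaying the journal. A commonly suggested sequence - offered here as a sketch with standard mdadm/e2fsck options, not a guaranteed recovery - is to let the rebuild finish, confirm the filesystem block size, and only then retry with a backup superblock:

        cat /proc/mdstat                                  # wait for the rebuild to reach 100%
        sudo dumpe2fs -h /dev/md0 | grep 'Block size'     # 4096 is an assumption; read the real value
        sudo e2fsck -f -b 32768 -B 4096 /dev/md0          # -B must match the reported block size

    Running repairs while the array is degraded and rebuilding risks making things worse, so imaging or backing up anything still readable first is cheap insurance.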

  • Backup VM: "Copy virtual disk xxx.vmdk The virtual disk is either corrupted or not a supported format"

    - by boiteavinc
    Hi, I'm French. I'm a student and I use ESXi 4 for my studies. When I try to back up my VMs with ghettoVCBg2.pl and the vSphere Management Assistant, I get the following error message in the vSphere Client: "Copy virtual disk xxx.vmdk: The virtual disk is either corrupted or not a supported format." I have no errors in the VCB log:

        03-23-2010 08:31:56 -- debug: Main: Login by vi-fastpass to: esxi
        03-23-2010 08:31:56 -- debug: copyTask: Task START
        03-23-2010 08:31:56 -- debug: copyTask: waiting for next job and sleep ...
        03-23-2010 08:32:00 -- info: Initiate backup for AD_DNS_DHCP found on esxi
        03-23-2010 08:32:09 -- debug: AD_DNS_DHCP original powerState: poweredOn
        03-23-2010 08:32:09 -- debug: Creating Snapshot "ghettoVCBg2-snapshot-2010-03-23" for AD_DNS_DHCP
        03-23-2010 08:33:19 -- info: AD_DNS_DHCP has 1 VMDK(s)
        03-23-2010 08:33:19 -- debug: backupVMDK: Backing up "Raptor1 AD_DNS_DHCP/AD_DNS_DHCP.vmdk" to "Backup_VM VM/AD_DNS_DHCP/AD_DNS_DHCP-2010-03-23/$
        03-23-2010 08:33:19 -- debug: backupVMDK: Signal copyThread to start
        03-23-2010 08:33:19 -- debug: backupVMDK: Backup progress: Elapsed time 0 min
        03-23-2010 08:33:19 -- debug: copyTask: Wake up and follow the white rabbit, with status: doCopy
        03-23-2010 08:33:19 -- debug: CopyThread: Start backing up VMDK(s) ...
        03-23-2010 08:33:25 -- debug: copyTask: send copySuccess message ...
        03-23-2010 08:33:25 -- debug: copyTask: waiting for next job and sleep ...
        03-23-2010 08:34:20 -- debug: backupVMDK: Successfully completed backup for Raptor1 AD_DNS_DHCP/AD_DNS_DHCP.vmdk Elapsed time: 1 min
        03-23-2010 08:34:22 -- debug: Removing Snapshot "ghettoVCBg2-snapshot-2010-03-23" for AD_DNS_DHCP
        03-23-2010 08:34:24 -- debug: checkVMBackupRotation: Starting ...
        03-23-2010 08:34:26 -- debug: Purging Backup_VM VM/AD_DNS_DHCP/AD_DNS_DHCP-2010-03-23--1 due to rotation max
        03-23-2010 08:34:28 -- info: Backup completed for AD_DNS_DHCP!
        03-23-2010 08:34:28 -- debug: Main: Disconnect from: esxi
        03-23-2010 08:34:28 -- debug: Main: Calling final clean up
        03-23-2010 08:34:28 -- debug: cleanUP: Thread clean up starting ...
        03-23-2010 08:34:28 -- debug: cleanUp: Send exit to copyThread
        03-23-2010 08:34:28 -- debug: copyTask: Wake up and follow the white rabbit, with status: exit
        03-23-2010 08:34:28 -- debug: copyTask: die ...
        03-23-2010 08:34:28 -- debug: cleanUp: Join passed
        03-23-2010 08:34:28 -- info: ============================== ghettoVCBg2 LOG END ==============================

    My ghetto conf file is:

        VM_BACKUP_DATASTORE = "Backup_VM"
        VM_BACKUP_DIRECTORY = "VM"
        VM_BACKUP_ROTATION_COUNT = "3"
        DISK_BACKUP_FORMAT = "thin"
        ADAPTER_FORMAT = "lsilogic"
        POWER_VM_DOWN_BEFORE_BACKUP = "0"
        VM_SNAPSHOT_MEMORY = "1"
        VM_SNAPSHOT_QUIESCE = "1"
        LOG_LEVEL = "info"
        VM_VMDK_FILES = "all"

    I tried several DISK_BACKUP_FORMAT values and get the same error. Despite the error, I do get files on the NFS share. But when I try to open the vmx file with VMware Workstation, I get this error:

        Cannot open the disk 'G:\Backup ESX\VM\AD_DNS_DHCP\AD_DNS_DHCP-2010-03-23--1\AD_DNS_DHCP.vmdk' or one of the snapshot disks it depends on.
        Reason: The called function cannot be performed on partial chains. Please open the parent virtual disk.

    I have no snapshots on my VM on ESXi. Can you help me?
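
    One hedged suggestion, since the Workstation error complains about a "partial chain": the ghettoVCB documentation of the era discouraged taking the backup snapshot with memory or quiesce enabled, and either flag can leave the copied disk looking like a member of a snapshot chain. It may be worth re-running with both turned off in the conf file:

        VM_SNAPSHOT_MEMORY = "0"
        VM_SNAPSHOT_QUIESCE = "0"

    This is an assumption drawn from the error text and the docs, not a confirmed diagnosis.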

  • "The system time has changed" avents after waking from sleep

    - by Damir Arh
    Sometimes when my computer running Windows 7 wakes up from sleep, it has to adjust the time. When this happens the following system event is logged: <Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'> <System> <Provider Name='Microsoft-Windows-Kernel-General' Guid='{A68CA8B7-004F-D7B6-A698-07E2DE0F1F5D}'/> <EventID>1</EventID> <Version>0</Version> <Level>4</Level> <Task>0</Task> <Opcode>0</Opcode> <Keywords>0x8000000000000010</Keywords> <TimeCreated SystemTime='2010-03-06T19:09:57.500000000Z'/> <EventRecordID>10672</EventRecordID> <Correlation/> <Execution ProcessID='4' ThreadID='56'/> <Channel>System</Channel> <Computer>GAME</Computer> <Security/> </System> <EventData> <Data Name='NewTime'>2010-03-06T19:09:57.500000000Z</Data> <Data Name='OldTime'>2010-03-06T17:34:32.870117200Z</Data> </EventData> <RenderingInfo Culture='sl-SI'> <Message>The system time has changed to ?2010?-?03?-?06T19:09:57.500000000Z from ?2010?-?03?-?06T17:34:32.870117200Z.</Message> <Level>Information</Level> <Task></Task> <Opcode>Info</Opcode> <Channel>System</Channel> <Provider>Microsoft-Windows-Kernel-General</Provider> <Keywords> <Keyword>Time</Keyword> </Keywords> </RenderingInfo> </Event> When this happens (I noticed it twice until now) the old time always corresponds to the time when computer entered sleep. The problem is that if Windows Media Center is scheduled for recording during this time, it just skips it as if the computer was turned off. I never had this problem running Windows Vista on the same machine. Any ideas what could be causing this problem and how to solve it are welcome.

    Read the article

  • Email can't be sent to my domain

    - by Jack W-H
    Hi folks,

    Basically, I have my domain howcode.com, bought at DomainMonster.com. I have set it all up to point to MediaTemple's nameservers and everything works - mostly - fine. I have registered an email address, [email protected]. The setup is, I presume, working correctly: I can successfully send emails with the account, and I presume I can receive them - but the problem is, nobody can send them to me. Emailing from a regular, non-Googlemail account appears to send fine, but the message never arrives in my inbox. When you email from a GoogleMail address, an error message is instantly returned saying this:

        Delivery to the following recipient failed permanently: [email protected]

        Technical details of permanent failure: Google tried to deliver your message, but it was rejected by the recipient domain. We recommend contacting the other email provider for further information about the cause of this error. The error that the other server returned was: 550 550 relay not permitted (state 14).

        ----- Original message -----
        Received: by 10.216.91.12 with SMTP id g12mr3673969wef.77.1271503997091; Sat, 17 Apr 2010 04:33:17 -0700 (PDT)
        Return-Path:
        Received: from [192.168.0.3] (client-81-98-94-79.cht-bng-014.adsl.virginmedia.net [81.98.94.79]) by mx.google.com with ESMTPS id x1sm29451927wbx.19.2010.04.17.04.33.15 (version=TLSv1/SSLv3 cipher=RC4-MD5); Sat, 17 Apr 2010 04:33:16 -0700 (PDT)
        From: Jack Webb-Heller
        Mime-Version: 1.0 (Apple Message framework v1078)
        Content-Type: multipart/alternative; boundary=Apple-Mail-7--1008464685
        Subject: Re: Hi
        Date: Sat, 17 Apr 2010 12:33:14 +0100
        In-Reply-To: <[email protected]
        To: Jack Webb-Heller
        References: <[email protected]
        Message-Id: <[email protected]
        X-Mailer: Apple Mail (2.1078)

        Does this work?

        On 17 Apr 2010, at 12:32, Jack Webb-Heller wrote: Hi

    I thought this may be something to do with my MX DNS settings. They are set up like so:

        MX name: howcode.com
        TTL: 43200
        Type: MX
        Data: 10 mail.howcode.com.

    The A-record for mail.howcode.com is set up like this:

        Name: mail.howcode.com
        TTL: 43200
        Type: A
        Data: 205.186.187.129

    Is this what's going wrong with the issue? Thanks very much,

    Jack
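
    For a "550 relay not permitted" bounce, the usual first check is to confirm what the outside world actually resolves for the records above (standard dig invocations, runnable from any Unix-ish box):

        dig MX howcode.com +short
        dig A mail.howcode.com +short

    If those return the expected values, the remaining suspect is the server at 205.186.187.129 not being configured to accept mail as the final destination for howcode.com - a guess from the error text alone, but that is what "relay not permitted" typically means on the receiving side.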

  • PHP 5.3.3 access log

    - by irolla
    Hi, I'm using php-fpm. In 5.3.2, when I open the phpinfo page, I get this in the access log:

        ip - - [26/Aug/2010:16:35:32 +0400] "GET /phpinfo.php HTTP/1.1" 200 13322 "-" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5"

    But in 5.3.3 I'm getting:

        ip - - [26/Aug/2010:16:30:30 +0400] "GET /phpinfo.php HTTP/1.1" 200 11891 "-" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5"
        ip - - [26/Aug/2010:16:30:30 +0400] "GET /phpinfo.php?=PHPE9568F34-D428-11d2-A769-00AA001ACF42 HTTP/1.1" 200 2536 "http://site.com/phpinfo.php" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5"
        ip - - [26/Aug/2010:16:30:30 +0400] "GET /phpinfo.php?=SUHO8567F54-D428-14d2-A769-00DA302A5F18 HTTP/1.1" 200 2825 "http://site.com/phpinfo.php" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5"
        ip - - [26/Aug/2010:16:30:30 +0400] "GET /phpinfo.php?=PHPE9568F35-D428-11d2-A769-00AA001ACF42 HTTP/1.1" 200 2158 "http://site.com/phpinfo.php" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5"

    Why are there 4 lines instead of 1? And what does "?=PHPE..." mean - is it PHP sessions? My PHP 5.3.3 fpm config:

        [global]
        pid = /var/run/php5-fpm.pid
        error_log = /var/log/php5-fpm.log
        log_level = notice

        [pool_0]
        listen = 127.0.0.1:9000
        listen.backlog = -1
        listen.allowed_clients = 127.0.0.1
        user = www-data
        group = www-data
        pm = dynamic
        pm.max_children = 50
        pm.min_spare_servers = 5
        pm.max_spare_servers = 35
        pm.max_requests = 500
        pm.status_path = /pool_0/status
        rlimit_files = 1024
        rlimit_core = 0
        catch_workers_output = yes
        php_admin_flag[register_globals] = true
        php_admin_value[error_reporting] = E_ALL & ~E_DEPRECATED
        php_admin_value[max_execution_time] = 15
        php_admin_flag[short_open_tag] = true
        php_admin_flag[display_errors] = false
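
    For what it's worth, those GUID query strings look like PHP's built-in logo URLs (PHPE... for the PHP and Zend logos, and - judging by the SUHO prefix - the Suhosin patch that many 5.3.3 distro builds carried). phpinfo() embeds them as <img> tags, so the browser's three extra image fetches each land in the access log; they are not sessions. If that reading is right, the easter-egg URLs can be switched off in php.ini:

        ; also stops PHP from advertising itself in response headers
        expose_php = Off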

  • Microsoft, please help me diagnose TFS Administration permission issues!

    - by Martin Hinshelwood
    I recently had a fun time trying to debug a permission issue I ran into using TFS 2010's TfsConfig.

    Update 5th March 2010 - In its style of true excellence, my company has added this rant to its "Suggestions for Better TFS".

    <rant>

    I was trying to run the TfsConfig tool and I kept getting the message: "TF55038: You don't have sufficient privileges to run this tool. Contact your Team Foundation system administrator." This message made me think that it was something to do with the install permissions, as it is always recommended to use a single account to do every install of TFS. I did not install the original TFS on our network, and my account was not used to do the TFS 2010 install. But I did do the upgrade from 2010 Beta 2 to 2010 RC with my current account. So I proceeded to do some checking:

      - Am I in the administrators group on the server?
        Figure: Yes, I am in the administrators group on the server.
      - Am I in the Administration Console users list?
        Figure: Yes, I am in the Administration Console users list.
      - Have I reapplied the permissions in the Administration Console users list, ticking all the options?
        Figure: Make sure you check all of the boxes if you want to have all the admin options.
        Figure: Yes, I have made sure that all my options are correct.
      - Am I in the Team Foundation Administrators group?
        Figure: Yes, I am in the Team Foundation Administrators group.
      - Is my account explicitly SysAdmin on the database server?
        Figure: Yes, I do have explicit SysAdmin on the database.

    Can you guess what the problem was? The command line window was not running as administrator! As with most other applications, there should be an explicit error message that states: "You are not currently running in administrator mode; please restart the command line with elevated privileges!" This would have saved me 30 minutes, although I agree that I should change my name to Muppet and just be done with it.

    </rant>

    Technorati Tags: Visual Studio ALM, Administration, Team Foundation Server Admin Console, TFS Admin Console
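
    The takeaway generalizes: on a UAC-enabled server, TfsConfig needs an elevated console even for a local administrator. A mouse-free way to get one from any PowerShell prompt (standard PowerShell 2.0 syntax):

        # Launches cmd.exe elevated via the UAC prompt; run TfsConfig from inside it
        Start-Process cmd -Verb RunAs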

  • Older SAS1 hardware Vs. newer SAS2 hardware

    - by user12620172
    I got a question today from someone asking about the older SAS1 hardware from over a year ago that we had on the older 7x10 series. They didn't leave an email, so I couldn't respond directly, but I said this blog would be blunt, frank, and open, so I have no problem addressing it publicly.

    A quick history lesson here: When Sun first put out the 7x10 family hardware, the 7410 and 7310 used a SAS1 backend connection to a JBOD that had SATA drives in it. This JBOD was not manufactured by Sun, nor did Sun own the IP for it. Now, when Oracle took over, they had a problem with that, and I really can't blame them. The decision was made to cut off that JBOD and its manufacturer completely and use our own, where Oracle controlled both the IP and the manufacturing. So in the summer of 2010 the cut was made, and the 7410 and 7310 had a hardware refresh and now had a SAS2 backend going to a SAS2 JBOD with SAS2 drives instead of SATA. This new hardware had two big advantages. First, there was a nice performance increase, mostly due to the faster backend. Even better, the SAS2 interface on the drives allowed for a MUCH faster failover between cluster heads, as the SATA drives were the bottleneck on the older hardware. In September of 2010 there was a major refresh of the rest of the 7000 hardware, the controllers and the other family members, and that's where we got today's current line-up of the 7x20 series. So the 7x20 has always used the new trays, and the 7410 and 7310 have used the new SAS2 trays since July of 2010.

    Now for the bad news. People who have a 7410 or 7310 from BEFORE the July 2010 cutoff have the models with SAS1 HBAs in them to connect to the older SAS1 trays. Remember, that manufacturer cut all ties with us and stopped making the JBOD, so there's just no way to get more of them, as they don't exist.

    There are some options, however. Oracle support does support taking the SAS1 HBAs out of the old 7410 and 7310 and putting in newer SAS2 HBAs which can talk to the new trays. Hey, I didn't say it was a great option, I just said it's an option. I fully realize that you would then have a SAS1 JBOD full of SATA drives that you could no longer connect. I do know a client that did this, then connected the SAS1 JBOD to another server, formatted the drives, and is using it as a plain, non-7000 JBOD. This is not supported by Oracle support. The other option is to just keep it as-is, as it works just fine - you just can't expand it. Then you can get a newer 7x20 series and use the built-in ZFSSA replication feature to move the data over. Now you can use the newer one for your production data and use the older one for DR, snaps and clones.

    Read the article

  • Free Online Performance Tuning Event

    - by Andrew Kelly
  On June 9th 2010 I will be presenting several sessions on performance tuning for SQL Server, and they are the best kind because they are free :). So mark your calendars. Here is the event info and URL: June 29, 2010 - 10:00 am - 3:00 pm Eastern. SQL Server is the platform for business. In this day-long free virtual event, well-known SQL Server performance expert Andrew Kelly will provide you with the tools and knowledge you need to stay on top of three key areas related to peak performance...(read more)

    Read the article

  • SQLPeople Interviews - Crys Manson, Jeremiah Peschka, and Tim Mitchell

    - by andyleonard
Introduction Late last year I announced an exciting new endeavor called SQLPeople. At the end of 2010 I announced the 2010 SQLPeople Person of the Year. Check out these interviews from your favorite SQLPeople! Interviews To Date Tim Mitchell Jeremiah Peschka Crys Manson Ben McEwan Thomas LaRock Lori Edwards Brent Ozar Michael Coles Rob Farley Jamie Thomson Conclusion I plan to post two or three interviews each week for the foreseeable future. SQLPeople is just one of the cool new things I get to...(read more)

    Read the article

  • Tulsa Azure Boot Camp

    - by dmccollough
Windows Azure Boot Camp presented by HyperVize & TulsaTech When: Thursday July 1st and Friday July 2nd Registration: Click here. Where: TulsaTech Riverside Campus 801 East 91st Street Tulsa, OK 74132-4008 Click here for a map. Summary: Tulsa Windows Azure Boot Camp is a comprehensive 2-day training program for members of the development community in Tulsa, Oklahoma. At the conclusion of this program, attendees should have a deep understanding of Azure, BPOS, and advanced development techniques for both platforms. Who should attend: Web developers, backend developers, SQL DBAs, consultants, and IT leaders who are interested in using Azure for development, data storage, or processing. Attending both days is suggested, but if you can't attend both, contact us for a special one-day pass. Schedule: Day one of the training sessions will be held on July 1st 2010 between 9AM and 4:30PM. Topics covered on day 1: Azure basics, web development, and data storage. Day two of the training sessions will be held on July 2nd 2010 between 9AM and 4:30PM. Topics covered on day 2: Architecture, business value, SOA development, SQL Azure, and advanced development. Pre-requisites: If you want to stay up to speed on the Windows Azure labs, you will need to install the tools and updates listed on the Windows Azure Boot Camp website: http://windowsazurebootcamp.com/whattobring Boot Camp Agenda Day 1 – July 1st 2010: · 8:30 – 9:00 - Registration · 9:00 – 10:00 - Module 1: Intro to Azure & Cloud Computing · 10:00 – 11:00 - Module 2: Using Web Roles · 11:00 – Noon - Lab 1 & workstation configuration · Noon – 1:00 - Lunch · 1:00 – 2:00 - Module 3: Blobs · 2:00 – 3:00 - Module 4: Tables · 3:00 – 4:00 - Module 5: Queues · 4:00 – ? - Q&A / Open Discussion Day 2 – July 2nd 2010: · 9:00 – 10:00 - Module 6: Building a business with Azure · 10:00 – 11:00 - Module 7: Cloud Scenarios · 11:00 – Noon - Module 8: SQL Azure · Noon – 1:00 - Lunch · 1:00 – 2:00 - Module 9: Basic Worker Roles · 2:00 – 3:00 - Module 10: Advanced Worker Roles · 3:00 – 4:00 - Module 11: Azure Diagnostics · 4:00 – ? - Module 12: App Fabric

    Read the article

  • Oracle announces Q3 FY10 results

    - by Paulo Folgado
Oracle Reports GAAP EPS of $0.23, Non-GAAP EPS of $0.38. New Software Licenses Up 13%, Applications New Licenses Up 21%. Oracle Corporation today announced fiscal 2010 Q3 GAAP total revenues were up 17% to $6.4 billion, while non-GAAP total revenues were up 18% to $6.5 billion. Excluding the impact of Sun Microsystems, Inc., which Oracle acquired on January 26, 2010, GAAP total revenue grew 7%. GAAP new software license revenues were up 13% to $1.7 billion, and up 10% to $1.7 billion excluding Sun. GAAP software license updates and product support revenues were up 13% to $3.3 billion, while non-GAAP software license updates and product support revenues were up 12% to $3.3 billion. GAAP operating income was down 5% to $1.8 billion, and GAAP operating margin was 29%. Non-GAAP operating income was up 13% to $2.9 billion, and non-GAAP operating margin was 45%. GAAP net income was down 10% to $1.2 billion, while non-GAAP net income was up 9% to $1.9 billion. GAAP earnings per share were $0.23, down 11% compared to last year, while non-GAAP earnings per share were up 9% to $0.38. GAAP operating cash flow on a trailing twelve-month basis was $8.2 billion. "Our solid top line growth, coupled with disciplined expense management, was key in generating $8.0 billion of free cash flow over the last twelve months," said Oracle CFO Jeff Epstein. "The Sun integration is going even better than we expected," said Oracle President, Safra Catz. "We believe that Sun will make a significant contribution to our fourth quarter earnings per share as well as meet the profitability goals we set for next year." "Exadata is the fastest growing product in Oracle's history," said Oracle President, Charles Phillips. "Introduced a little over a year ago, the Exadata pipeline is now approaching $400 million with Q4 bookings forecast at nearly $100 million. This strengthens both sales growth and profitability in our Sun server and storage businesses." "Every quarter we grab huge chunks of market share from SAP," said Oracle CEO, Larry Ellison. "SAP's most recent quarter was the best quarter of their year, only down 15%, while Oracle's application sales were up 21%. But SAP is well ahead of us in the number of CEOs for this year, announcing their third and fourth, while we only had one." In addition, Oracle's Board of Directors declared a cash dividend of $0.05 per share of outstanding common stock to be paid to stockholders of record as of the close of business on April 14, 2010, with a payment date of May 5, 2010. Future declarations of quarterly dividends and the establishment of future record and payment dates are subject to the final determination of Oracle's Board of Directors. Q3 Earnings Conference Call and Webcast: Oracle will hold a conference call and web broadcast today to discuss these results at 2:00 p.m. Pacific. You may listen to the call by dialing (800) 214-0694 or (719) 955-1425, Passcode: 567035. To access the live Web broadcast of this event, please visit the Oracle Investor Relations Web site at http://www.oracle.com/investor.

    Read the article

  • Team Review @ TSUG

    - by dmckinstry
In case you haven’t heard, JB Brown is going to be presenting online at the Team System User Group this Thursday. This month’s presentation will explain how Team Review (freely available) can be used with Team Foundation Server 2005, 2008 and even 2010! Meeting Date: Thursday, March 18th, 2010 Time: 5:00PM Pacific

    Read the article

  • SQLPeople Interviews Wrap Up January 2011 with Matt Velic

    - by andyleonard
Introduction Late last year I announced an exciting new endeavor called SQLPeople. At the end of 2010 I announced the 2010 SQLPeople Person of the Year. Check out this interview with Matt Velic! SQLPeople is off to a great start. Thanks to all who have made our first month awesome - those willing to share and respond to interview requests and those who are enjoying the interviews! Here's a wrap-up of January 2011: January 2011 Interviews Matt Velic Cindy Gross Steve Fibich Tim Mitchell Jeremiah Peschka...(read more)

    Read the article

  • SQLPeople Interviews - Michael Coles and Brent Ozar

    - by andyleonard
Introduction Late last year I announced an exciting new endeavor called SQLPeople. At the end of 2010 I announced the 2010 SQLPeople Person of the Year. More interviews have been posted. Interviews To Date Jamie Thomson Rob Farley Michael Coles Brent Ozar Conclusion I plan to post two or three interviews each week for the foreseeable future. SQLPeople is just one of the cool new things I get to do in 2011! :{>...(read more)

    Read the article

  • Java Space on Parleys

    - by Yolande Poirier
Now available! A great selection of JavaOne 2010 and JVM Language Summit 2010 sessions, as well as Oracle Technology Network TechCasts, on the new Java Space on the Parleys website. Oracle partnered with Stephan Janssen, founder of Parleys, to make this happen. The Parleys website offers a user-friendly experience for viewing online content. You can download some of the talks to your desktop or watch them on the go on mobile devices. The current selection is a wealth of expertise from top Java luminaries and Oracle experts. JavaOne 2010 sessions: Best practices for signing code by Sean Mullan · Building software using rich client platforms by Rickard Thulin · Developing beyond the component libraries by Ryan Cuprak · Java API for keyhole markup language by Florian Bachmann · Avoiding common user experience anti-patterns by Burk Hufnagel · Accelerating Java workloads via GPUs by Gary Frost. JVM Language Summit 2010 sessions: Mixed language project compilation in Eclipse by Andy Clement · Gathering the threads by John Rose · LINQ: language features for concurrency by Neal Gafter · Improvements in OpenJDK useful for JVM languages by Eric Caspole · Symmetric Multilanguage - VM Architecture by Oleg Pliss. Special interviews with Oracle experts on product innovations: Ludovic Champenois, Java EE architect, on GlassFish 3.1 and Java EE · John Jullion-Ceccarelli and Martin Ryzl on NetBeans IDE 6.9. You can choose to listen to a section of talks using the agenda view and search for related content while watching a presentation. Enjoy the Java content and vote on it!

    Read the article

  • Google Docs: even more sharing and collaboration features, the effect of new competition from Microsoft?

Update of 18/06/10 – Google Docs: even more sharing and collaboration features, the effect of new competition from Microsoft? Well, well... Barely three days after the official release of Microsoft Office 2010 (see also our feature on what's new in Microsoft Office 2010), and just a short week after the arrival of its free online version (

    Read the article
