Search Results

Search found 31884 results on 1276 pages for 'microsoft sync framework'.

Page 381/1276 | < Previous Page | 377 378 379 380 381 382 383 384 385 386 387 388  | Next Page >

  • Query Execution Failed in Reporting Services reports

    - by Chris Herring
    I have some reporting services reports that talk to Analysis Services and at times they fail with the following error: An error occurred during client rendering. An error has occurred during report processing. Query execution failed for dataset 'AccountManagerAccountManager'. The connection cannot be used while an XmlReader object is open. This occurs sometimes when I change selections in the filter. It also occurs when the machine has been under heavy load and then will consistently error until SSAS is restarted. The log file contains the following error: processing!ReportServer_0-18!738!04/06/2010-11:01:14:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: Query execution failed for dataset 'AccountManagerAccountManager'., ; Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: Query execution failed for dataset 'AccountManagerAccountManager'. ---> System.InvalidOperationException: The connection cannot be used while an XmlReader object is open. at Microsoft.AnalysisServices.AdomdClient.XmlaClient.CheckConnection() at Microsoft.AnalysisServices.AdomdClient.XmlaClient.ExecuteStatement(String statement, IDictionary connectionProperties, IDictionary commandProperties, IDataParameterCollection parameters, Boolean isMdx) at Microsoft.AnalysisServices.AdomdClient.AdomdConnection.XmlaClientProvider.Microsoft.AnalysisServices.AdomdClient.IExecuteProvider.ExecuteTabular(CommandBehavior behavior, ICommandContentProvider contentProvider, AdomdPropertyCollection commandProperties, IDataParameterCollection parameters) at Microsoft.AnalysisServices.AdomdClient.AdomdCommand.ExecuteReader(CommandBehavior behavior) at Microsoft.AnalysisServices.AdomdClient.AdomdCommand.System.Data.IDbCommand.ExecuteReader(CommandBehavior behavior) at Microsoft.ReportingServices.DataExtensions.AdoMdCommand.ExecuteReader(CommandBehavior behavior) at Microsoft.ReportingServices.OnDemandProcessing.RuntimeDataSet.RunDataSetQuery() Can anyone shed light on this issue?
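
    For context on the error itself: ADOMD.NET streams results over an XmlReader, and it refuses to run a new statement while a previous reader is still open on the same connection. The following is only a minimal sketch of that rule, not the Reporting Services fix; the connection string and MDX are placeholder assumptions:

    // Minimal ADOMD.NET sketch: a reader must be closed/disposed before the same
    // AdomdConnection can execute another statement, otherwise the provider throws
    // "The connection cannot be used while an XmlReader object is open".
    using System;
    using Microsoft.AnalysisServices.AdomdClient;

    class AdomdReaderSketch
    {
        static void Main()
        {
            // Placeholder connection string and MDX - not taken from the report above.
            const string connectionString = "Data Source=localhost;Catalog=Adventure Works DW";
            const string mdx = "SELECT [Measures].DefaultMember ON COLUMNS FROM [Adventure Works]";

            using (var connection = new AdomdConnection(connectionString))
            {
                connection.Open();

                using (var command = new AdomdCommand(mdx, connection))
                using (AdomdDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // consume the result set
                    }
                }   // reader disposed here - only now is the connection free again

                using (var second = new AdomdCommand(mdx, connection))
                using (var secondReader = second.ExecuteReader())
                {
                    // Safe: the previous reader has been closed.
                    while (secondReader.Read()) { }
                }
            }
        }
    }

    The report-side symptom (failures that persist until SSAS is restarted) would be consistent with a pooled connection being left in that state, for example after a timeout under heavy load, but that part is conjecture.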

    Read the article

  • Troubleshooting SQL Azure Connectivity

    - by kaleidoscope
    Technorati Tags: Rituraj, Connectivity Issues with SQL Azure

    How to resolve some of the common connectivity error messages you may see while connecting to SQL Azure:

    A transport-level error has occurred when receiving results from the server. (Provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)
    System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. The statement has been terminated.
    An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.
    Error: Microsoft SQL Native Client: Unable to complete login process due to delay in opening server connection.
    A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.

    Some troubleshooting tips:
    a) Verify Azure firewall settings and service availability. Reference: SQL Azure Firewall - http://msdn.microsoft.com/en-us/library/ee621782.aspx
    b) Verify that you can reach the virtual IP. Reference: Telnet Troubleshooting Guide - http://technet.microsoft.com/en-us/library/cc753360(WS.10).aspx; How to Use TRACERT to Troubleshoot TCP/IP Problems in Windows - http://support.microsoft.com/kb/314868
    c) Check the Windows Firewall on the local machine. Reference: Frequently Asked Questions - http://msdn.microsoft.com/en-us/library/bb736261(VS.85).aspx; Windows Firewall with Advanced Security Getting Started Guide - http://technet.microsoft.com/en-us/library/cc748991(WS.10).aspx
    d) Check other firewall products. Reference: http://www.whatismyip.com/
    e) Generate a network trace using the Microsoft Network Monitor tool. Reference: How to capture network traffic with Network Monitor - http://support.microsoft.com/kb/148942
    f) SQL Azure Denial of Service (DoS) Guard: SQL Azure uses techniques to prevent denial-of-service attacks. If your connection is being reset by the service due to a potential DoS attack, you will see a three-way handshake established and then a RESET in your network trace.
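
    Beyond the checklist above, transport-level errors and timeouts against SQL Azure are frequently transient, so client code is usually written to retry a failed connection a few times before giving up. A minimal retry sketch, assuming a .NET client on System.Data.SqlClient; the server name, credentials and retry values are illustrative placeholders, not taken from the post:

    // Minimal retry sketch for transient SQL Azure connection failures.
    using System;
    using System.Data.SqlClient;
    using System.Threading;

    class ConnectivityCheck
    {
        static void Main()
        {
            // Placeholder connection string - replace with your own server and credentials.
            const string connectionString =
                "Server=tcp:yourserver.database.windows.net;Database=yourdb;" +
                "User ID=user@yourserver;Password=yourpassword;Encrypt=True;Connection Timeout=30;";

            const int maxAttempts = 3;
            for (int attempt = 1; attempt <= maxAttempts; attempt++)
            {
                try
                {
                    using (var connection = new SqlConnection(connectionString))
                    using (var command = new SqlCommand("SELECT 1", connection))
                    {
                        connection.Open();
                        command.ExecuteScalar();
                        Console.WriteLine("Connected on attempt {0}.", attempt);
                        return;
                    }
                }
                catch (SqlException ex)
                {
                    // Transport errors and timeouts are often transient; log, back off, retry.
                    Console.WriteLine("Attempt {0} failed: {1}", attempt, ex.Message);
                    if (attempt == maxAttempts) throw;
                    Thread.Sleep(TimeSpan.FromSeconds(5 * attempt));
                }
            }
        }
    }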

    Read the article

  • When does a product run out of support?

    That is a question I get regularly from customers. Microsoft has a great site where you can find that information. Unfortunately this site is not easy to find, and a lot of people are not aware of this site. A good reason to promote it a little. So if you ever get a question on this topic, go to http://support.microsoft.com/lifecycle/search/Default.aspx. At that site, you can find also the details of the policy Microsoft Support Lifecycle Policy The Microsoft Support Lifecycle policy took effect in October 2002, and applies to most products currently available through retail purchase or volume licensing and most future release products. Through the policy, Microsoft will offer a minimum of: 10 years of support (5 years Mainstream Support and 5 years Extended Support) at the supported service pack level for Business and Developer products 5 years Mainstream Support at the supported service pack level for Consumer/Hardware/Multimedia products 3 years of Mainstream Support for products that are annually released (for example, Money, Encarta, Picture It!, and Streets & Trips) Phases of the Support Lifecycle Mainstream Support Mainstream Support is the first phase of the product support lifecycle. At the supported service pack level, Mainstream Support includes: Incident support (no-charge incident support, paid incident support, support charged on an hourly basis, support for warranty claims) Security update support The ability to request non-security hotfixes Please note: Enrollment in a maintenance program may be required to receive these benefits for certain products Extended Support The Extended Support phase follows Mainstream Support for Business and Developer products. At the supported service pack level, Extended Support includes: Paid support Security update support at no additional cost Non-security related hotfix support requires a separate Extended Hotfix Support Agreement to be purchased (per-fix fees also apply) Please note: Microsoft will not accept requests for warranty support, design changes, or new features during the Extended Support phase Extended Support is not available for Consumer, Hardware, or Multimedia products Enrollment in a maintenance program may be required to receive these benefits for certain products Self-Help Online Support Self-Help Online Support is available throughout a product's lifecycle and for a minimum of 12 months after the product reaches the end of its support. Microsoft online Knowledge Base articles, FAQs, troubleshooting tools, and other resources, are provided to help customers resolve common issues. Please note: Enrollment in a maintenance program may be required to receive these benefits for certain products (source: http://support.microsoft.com/lifecycle/#tab1)

    Read the article

  • What I’m Reading – 2 – Microsoft Silverlight 4 Data and Services Cookbook

    - by Dave Campbell
    A while back I mentioned that I had a couple of books on my desktop that I’ve been “shooting holes” in … in other words, reading pieces that are interesting at the time, or looking something up rather than starting at the front and heading for the back. The book I want to mention today is Microsoft Silverlight 4 Data and Services Cookbook : by Gill Cleeren and Kevin Dockx. As opposed to the authors of the last book I reviewed, I don’t personally know Gill or Kevin, but I’ve blogged a lot of their articles… both are prolific and on-topic writers. The ‘recipe’ style of the book shouldn’t put you off. It’s more about the way the chapters are laid out than anything else, and once you see one of them, you recognize the pattern. This is a great eBook to have around to open when you need to find something useful. As with the other PACKT book I talked about, I have the eBook because for technical material, at least lately, I’ve gravitated toward that format. I can have it with me on a USB stick at work, or at home. Read the free chapter, then check out their blogs. You may be surprised by some of the items you’ll find inside the covers. One such nugget is one I don’t think I’ve seen blogged: “Converting Your Existing Applications to Use Silverlight”. Another good job! Technorati Tags: Silverlight 4

    Read the article

  • Which Integrated Development Environment do you use to code with Symfony, the free PHP MVC framework?

    Which IDE do you use to code with Symfony? I have been developing with symfony for a little while now, using good old Notepad++. And I notice that, for C++, I code quite a bit faster with a decent IDE (Visual Studio) than with Notepad++. It's not the first editor I've tried, but it's the only one that suits me. I have already tried NetBeans with symfony support, but I never really got the hang of it; not very intuitive, etc. So I quickly went back to my Notepad++. But that was at the very beginning of symfony support (NetBeans 6.5, I think). Since then we've moved on to 6.9 (7.0 in beta), so is it worth giving that IDE another try? ...

    Read the article

  • Exchange ActiveSync Exception

    - by Dmeglio
    One of the users on my network is having an issue with his iPhone syncing via ActiveSync. Overall it's working, but every now and then he gets a "Synchronization with your iPhone failed for 3 items." I asked him to go into OWA and turn on the Mobile Phone logging. I looked through the logs and this is what stood out to me: SyncCommand_GenerateResponsesXmlNode_AddChange_Exception : Microsoft.Exchange.Data.Storage.PropertyErrorException: Property: [{00062008-0000-0000-c000-000000000046}:0x8501] ReminderMinutesBeforeStartInternal, PropertyErrorCode: NotFound, PropertyErrorDescription: . at Microsoft.Exchange.Data.Storage.PropertyBag.ThrowIfPropertyError(StorePropertyDefinition propertyDefinition, Object propertyValue) at Microsoft.Exchange.Data.Storage.StoreObject.GetProperty(PropertyDefinition propertyDefinition) at Microsoft.Exchange.Data.Storage.MeetingMessage.get_Item(PropertyDefinition propertyDefinition) at Microsoft.Exchange.AirSync.SchemaConverter.XSO.XsoMeetingRequestProperty.get_NestedData() at Microsoft.Exchange.AirSync.SchemaConverter.AirSync.AirSyncMeetingRequestProperty.InternalCopyFrom(IProperty srcProperty) at Microsoft.Exchange.AirSync.SchemaConverter.AirSync.AirSyncProperty.CopyFrom(IProperty srcProperty) at Microsoft.Exchange.AirSync.SchemaConverter.AirSync.AirSyncDataObject.CopyFrom(IProperty srcRootProperty) at Microsoft.Exchange.AirSync.SyncCollection.ConvertServerToClientObject(ISyncItem syncItem, XmlNode airSyncParentNode, SyncOperation changeObject) at Microsoft.Exchange.AirSync.SyncCollection.GenerateCommandsXmlNode(XmlDocument xmlResponse, IAirSyncVersionFactory versionFactory, String deviceType, ProtocolLogger protocolLogger, MailboxLogger mailboxLogger) Does anyone have any idea what might cause this? We have 4 iPhone users connected to our Exchange via ActiveSync. Right now, this seems to be the only user experiencing this issue. I'd appreciate any help anyone can provide. Thanks.

    Read the article

  • Which framework would you recommend to use to add "social networking" components to a website?

    - by blueberryfields
    Given a website which already enables users to create and publish content, is there a service or tool which can add the standard social networking suite of components? Specifically, I'm looking to quickly add functionality which allows users to friend each other, vote on/like/rank content on the site, send each other links to parts they find interesting, chat, and send offline messages to each other. There's no specific limitation on the technology used for these components - as long as it's been proven to work and scales without issue. I'd slightly prefer a solution which is offered as a service rather than one that I have to install. Edit: Some additional, commenter-requested clarifications - there are no restrictions that the site imposes on user identification or authentication. Feel free to assume that portion of the work is not relevant to the answers.

    Read the article

  • Outlook and IMAP - Outlook doesn't allow the Drafts and Trash folders to sync with the respective IMAP folders

    - by Matt
    I'm using Outlook 2007 and Outlook 2010 against an IMAP server (the problem exists across many servers, like Gmail, you name it). Outlook lets you set your Outlook "Sent" folder to map to the IMAP server's Sent folder (the other choice is to map your Outlook Sent to your Personal Folders Sent) - this is good. When you send a message from Outlook and then look in the sent folder of the IMAP server (e.g. from a different client or from a browser), the messages are there. This is the behavior I want. Outlook does NOT support the same behavior for Drafts and Trash. In both cases, items deleted (or drafts saved) in Outlook go into Outlook's local folders and do NOT show up in the IMAP server's Trash or Drafts folders. Same problem in reverse. Thunderbird, on the other hand, does support the proper mapping of Drafts, Sent and Trash. I expected this to be IMAP-specific, but it appears to be client-specific. Why does Outlook implement it this way, and is there a workaround?

    Read the article

  • please explain my fio results - is O_SYNC|O_DIRECT misbehaving on linux?

    - by Zoltan
    I'm going mad over figuring out what the problem could be with one of our storage boxes. With a simple fio script I'm testing random writes using bs=1M and direct=1. The SSD is a Samsung 840pro attached to an LSI HBA (3Gbit/s ports). This is the result I'm getting under FreeBSD 9.1: WRITE: io=13169MB, aggrb=224743KB/s, minb=224743KB/s, maxb=224743KB/s, mint=60002msec, maxt=60002msec This is regardless of sync being set to 0 or 1. On Linux, this is the result with sync=0: WRITE: io=14828MB, aggrb=253060KB/s, minb=253060KB/s, maxb=253060KB/s, mint=60001msec, maxt=60001msec and with sync=1: WRITE: io=6360.0MB, aggrb=108542KB/s, minb=108542KB/s, maxb=108542KB/s, mint=60001msec, maxt=60001msec My understanding is that since I'm operating on the raw block device, O_SYNC should not make any difference - there's no filesystem, no barriers, nothing between the writes and the drive itself. Especially with O_DIRECT|O_SYNC set. Any ideas? For reference, here's the fio script I'm testing with:

    [global]
    bs=1M
    ioengine=sync
    iodepth=4
    size=16g
    direct=1
    runtime=60
    filename=/dev/sdh
    sync=1

    [rand-write]
    rw=randwrite
    stonewall

    Read the article

  • Team Foundation Server 2012 Build Global List Problems

    - by Bob Hardister
    My experience with the upgrade and use of TFS 2012 has been very positive. I did come across a couple of issues recently that tripped things up for a while. ISSUE 1 The first issue is that 2012 prior to Update 1 published an invalid build list item value to the collection global list. In 2010, the build global list, list item value syntax is an underscore between the build definition and the build number. In the 2012 RTM this underscore was replaced with a backslash, which is invalid.  Specifically, an upload of the global list fails when the backslash is followed at some point by a period. The error when using the API is: <detail ExceptionMessage="TF26204: The account you entered is not recognized. Contact your Team Foundation Server administrator to add your account." BaseExceptionName="Microsoft.TeamFoundation.WorkItemTracking.Server.ValidationException"><details id="600019" http://schemas.microsoft.com/TeamFoundation/2005/06/WorkItemTracking/faultdetail/03"http://schemas.microsoft.com/TeamFoundation/2005/06/WorkItemTracking/faultdetail/03" /></detail> when uploading the global list via the process editor the error is: This issue is corrected in Update1 as the backslash is changed to a forward slash. ISSUE 2 The second issue is that when upgrading from 2010 to 2012, the builds in 2010 are not published to the 2012 global list.  After the upgrade the 2012 global lists doesn’t have any builds and only builds run in 2012 are published to the global list. This was reported to the MSDN forums and Connect. To correct this I wrote a utility to pull all the builds and recreate the builds global list for each project in each collection.  This is a console application with a program.cs, a globallists.cs and a app.config (not published here). The utility connects to TFS 2012, loops through the collections or a target collection as specified in the app.config. Then loops through the projects, the build definitions, and builds.  It creates a global list for each project if that project has at least one build. Then it imports the new list to TFS.  Here’s the code for program and globalists classes. Program.CS using System; using System.Collections.Generic; using System.Linq; using System.Text; using Microsoft.TeamFoundation.Framework.Client; using Microsoft.TeamFoundation.Framework.Common; using Microsoft.TeamFoundation.Client; using Microsoft.TeamFoundation.Server; using System.IO; using System.Xml; using Microsoft.TeamFoundation.WorkItemTracking.Client; using System.Diagnostics; using Utilities; using System.Configuration; namespace TFSProjectUpdater_CLC { class Program { static void Main(string[] args) { DateTime temp_d = System.DateTime.Now; string logName = temp_d.ToShortDateString(); logName = logName.Replace("/", "_"); logName = logName + "_" + temp_d.TimeOfDay; logName = logName.Replace(":", "."); logName = "TFSGlobalListBuildsUpdater_" + logName + ".log"; Trace.Listeners.Add(new TextWriterTraceListener(Path.Combine(ConfigurationManager.AppSettings["logLocation"], logName))); Trace.AutoFlush = true; Trace.WriteLine("Start:" + DateTime.Now.ToString()); Console.WriteLine("Start:" + DateTime.Now.ToString()); string tfsServer = ConfigurationManager.AppSettings["TargetTFS"].ToString(); GlobalLists gl = new GlobalLists(); //replace this with the URL to your TFS instance. 
Uri tfsUri = new Uri("https://" + tfsServer + "/tfs"); //bool foundLite = false; TfsConfigurationServer config = new TfsConfigurationServer(tfsUri, new UICredentialsProvider()); config.EnsureAuthenticated(); ITeamProjectCollectionService collectionService = config.GetService<ITeamProjectCollectionService>(); IList<TeamProjectCollection> collections = collectionService.GetCollections().OrderBy(collection => collection.Name.ToString()).ToList(); //target Collection string targetCollection = ConfigurationManager.AppSettings["targetCollection"]; foreach (TeamProjectCollection coll in collections) { if (targetCollection.Equals(string.Empty)) { if (!coll.Name.Equals("TFS Archive") && !coll.Name.Equals("DefaultCol") && !coll.Name.Equals("Team Project Template Gallery")) { doWork(coll, tfsServer); } } else { if (coll.Name.Equals(targetCollection)) { doWork(coll, tfsServer); } } } Trace.WriteLine("Finished:" + DateTime.Now.ToString()); Console.WriteLine("Finished:" + DateTime.Now.ToString()); if (System.Diagnostics.Debugger.IsAttached) { Console.WriteLine("\nHit any key to exit..."); Console.ReadKey(); } Trace.Close(); } static void doWork(TeamProjectCollection coll, string tfsServer) { GlobalLists gl = new GlobalLists(); //target Collection string targetProject = ConfigurationManager.AppSettings["targetProject"]; Trace.WriteLine("Collection: " + coll.Name); Uri u = new Uri("https://" + tfsServer + "/tfs/" + coll.Name.ToString()); TfsTeamProjectCollection c = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(u); ICommonStructureService icss = c.GetService<ICommonStructureService>(); try { Trace.WriteLine("\tChecking Collection Global Lists."); gl.RebuildBuildGlobalLists(c); } catch (Exception ex) { Console.WriteLine("Exception! :" + coll.Name); } } } } GlobalLists.CS using System; using System.Collections.Generic; using System.Linq; using System.Text; using Microsoft.TeamFoundation.Client; using Microsoft.TeamFoundation.Framework.Client; using Microsoft.TeamFoundation.Framework.Common; using Microsoft.TeamFoundation.Server; using Microsoft.TeamFoundation.WorkItemTracking.Client; using Microsoft.TeamFoundation.Build.Client; using System.Configuration; using System.Xml; using System.Xml.Linq; using System.Diagnostics; namespace Utilities { public class GlobalLists { string GL_NewList = @"<gl:GLOBALLISTS xmlns:gl=""http://schemas.microsoft.com/VisualStudio/2005/workitemtracking/globallists""> <GLOBALLIST> </GLOBALLIST> </gl:GLOBALLISTS>"; public void RebuildBuildGlobalLists(TfsTeamProjectCollection _tfs) { WorkItemStore wis = new WorkItemStore(_tfs); //export the current globals lists file for the collection to save as a backup XmlDocument globalListsFile = wis.ExportGlobalLists(); globalListsFile.Save(@"c:\temp\" + _tfs.Name.Replace("\\", "_") + "_backupGlobalList.xml"); LogExportCurrentCollectionGlobalListsAsBackup(_tfs); //Build a new global build list from each build definition within each team project IBuildServer buildServer = _tfs.GetService<IBuildServer>(); foreach (Project p in wis.Projects) { XmlDocument newProjectGlobalList = new XmlDocument(); newProjectGlobalList.LoadXml(GL_NewList); LogInstanciateNewProjectBuildGlobalList(_tfs, p); BuildNewProjectBuildGlobalList(_tfs, wis, newProjectGlobalList, buildServer, p); LogEndOfProject(_tfs, p); } } // Private Methods private static void BuildNewProjectBuildGlobalList(TfsTeamProjectCollection _tfs, WorkItemStore wis, XmlDocument newProjectGlobalList, IBuildServer buildServer, Project p) { //locate the template node XmlNamespaceManager 
nsmgr = new XmlNamespaceManager(newProjectGlobalList.NameTable); nsmgr.AddNamespace("gl", "http://schemas.microsoft.com/VisualStudio/2005/workitemtracking/globallists"); XmlNode node = newProjectGlobalList.SelectSingleNode("//gl:GLOBALLISTS/GLOBALLIST", nsmgr); LogLocatedGlobalListNode(_tfs, p); //add the name attribute for the project build global list XmlElement buildListNode = (XmlElement)node; buildListNode.SetAttribute("name", "Builds - " + p.Name); LogAddedBuildNodeName(_tfs, p); //add new builds to the team project build global list bool buildsExist = false; if (AddNewBuilds(_tfs, newProjectGlobalList, buildServer, p, node, buildsExist)) { //import the new build global list for each project that has builds newProjectGlobalList.Save(@"c:\temp\" + _tfs.Name.Replace("\\", "_") + "_" + p.Name + "_" + "newGlobalList.xml"); //write out temp copy of the global list file to be imported LogImportReady(_tfs, p); wis.ImportGlobalLists(newProjectGlobalList.InnerXml); LogImportComplete(_tfs, p); } } private static bool AddNewBuilds(TfsTeamProjectCollection _tfs, XmlDocument newProjectGlobalList, IBuildServer buildServer, Project p, XmlNode node, bool buildsExist) { var buildDefinitions = buildServer.QueryBuildDefinitions(p.Name); foreach (var buildDefinition in buildDefinitions) { var builds = buildDefinition.QueryBuilds(); foreach (var build in builds) { //insert the builds into the current build list node in the correct 2012 format buildsExist = true; XmlElement listItem = newProjectGlobalList.CreateElement("LISTITEM"); listItem.SetAttribute("value", buildDefinition.Name + "/" + build.BuildNumber.ToString().Replace(buildDefinition.Name + "_", "")); node.AppendChild(listItem); } } if (buildsExist) LogBuildListCreated(_tfs, p); else LogNoBuildsInProject(_tfs, p); return buildsExist; } // Logging Methods private static void LogExportCurrentCollectionGlobalListsAsBackup(TfsTeamProjectCollection _tfs) { Trace.WriteLine("\tExported Global List for " + _tfs.Name + " collection."); Console.WriteLine("\tExported Global List for " + _tfs.Name + " collection."); } private void LogInstanciateNewProjectBuildGlobalList(TfsTeamProjectCollection _tfs, Project p) { Trace.WriteLine("\t\tInstanciated the new build global list for project " + p.Name + " in the " + _tfs.Name + " collection."); Console.WriteLine("\t\tInstanciated the new build global list for project \n\t\t\t" + p.Name + " in the \n\t\t\t" + _tfs.Name + " collection."); } private static void LogLocatedGlobalListNode(TfsTeamProjectCollection _tfs, Project p) { Trace.WriteLine("\t\tLocated the build global list node for project " + p.Name + " in the " + _tfs.Name + " collection."); Console.WriteLine("\t\tLocated the build global list node for project \n\t\t\t" + p.Name + " in the \n\t\t\t" + _tfs.Name + " collection."); } private static void LogAddedBuildNodeName(TfsTeamProjectCollection _tfs, Project p) { Trace.WriteLine("\t\tAdded the name attribute to the build global list for project " + p.Name + " in the " + _tfs.Name + " collection."); Console.WriteLine("\t\tAdded the name attribute to the build global list for project \n\t\t\t" + p.Name + " in the \n\t\t\t" + _tfs.Name + " collection."); } private static void LogBuildListCreated(TfsTeamProjectCollection _tfs, Project p) { Trace.WriteLine("\t\tAdded all builds into the " + "Builds - " + p.Name + " list in the " + _tfs.Name + " collection."); Console.WriteLine("\t\tAdded all builds into the " + "Builds - \n\t\t\t" + p.Name + " list in the \n\t\t\t" + _tfs.Name + " collection."); } private static 
void LogNoBuildsInProject(TfsTeamProjectCollection _tfs, Project p) { Trace.WriteLine("\t\tNo builds found for project " + p.Name + " in the " + _tfs.Name + " collection."); Console.WriteLine("\t\tNo builds found for project " + p.Name + " \n\t\t\tin the " + _tfs.Name + " collection."); } private void LogEndOfProject(TfsTeamProjectCollection _tfs, Project p) { Trace.WriteLine("\t\tEND OF PROJECT " + p.Name); Trace.WriteLine(" "); Console.WriteLine("\t\tEND OF PROJECT " + p.Name); Console.WriteLine(); } private static void LogImportReady(TfsTeamProjectCollection _tfs, Project p) { Trace.WriteLine("\t\tReady to import the build global list for project " + p.Name + " to the " + _tfs.Name + " collection."); Console.WriteLine("\t\tReady to import the build global list for project \n\t\t\t" + p.Name + " to the \n\t\t\t" + _tfs.Name + " collection."); } private static void LogImportComplete(TfsTeamProjectCollection _tfs, Project p) { Trace.WriteLine("\t\tImport of the build global list for project " + p.Name + " to the " + _tfs.Name + " collection completed."); Console.WriteLine("\t\tImport of the build global list for project \n\t\t\t" + p.Name + " to the \n\t\t\t" + _tfs.Name + " collection completed."); } } }

    Read the article

  • How do you get SharePoint back in sync when you change a user's sAMAccountName?

    - by Kirk Liemohn
    I have observed on SharePoint 2010 that if you change the sAMAccountName of a user after the user has logged into a SharePoint site collection, the tp_Login field in the UserInfo table does not get updated. It still has the old user Id. While the user can log into SharePoint under the new account, these new logins do not update the table. I have code that looks at the SPUser.LoginName and this value appears to be the tp_Login field value which is now old. The fact that this value is old causes my code to fail. I suspect this behavior is identical in SharePoint 2007. Is there any way to force SharePoint to recognize the new sAMAccountName? I suspect that profile synchronization might help, but I would like for my solution to work with WSS 3.0 and SharePoint 2010 Foundation. I considered manually updating the database table, but I would like to stick with supported approaches.
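
    Not part of the original question, but the commonly documented supported route for a renamed Windows account, rather than editing the UserInfo table directly, is SharePoint's user migration operation, which rewrites the stored login for the account. Shown below as a hedged sketch only; the account names are placeholders, and the exact behaviour should be verified for your version (SharePoint 2010 also exposes this through the Move-SPUser PowerShell cmdlet):

    stsadm -o migrateuser -oldlogin DOMAIN\old.samaccountname -newlogin DOMAIN\new.samaccountname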

    Read the article

  • Which Java web framework do you recommend for intranet webapp (not content website)?

    - by pregzt
    I'm about to start development of a small, purpose-built intranet web application for a small software vendor. It will be the administration console of the server managing licenses for off-the-shelf software installed by users. There will be a few users who need to be able to sign in, issue a batch of license codes, revoke some, renew outdated ones, resolve issues, etc. Bear in mind that my customer requires Java for this solution. I'm a seasoned Java programmer and have used different frameworks to implement webapps before, mainly Apache Struts in the past and Spring MVC recently. I was wondering what else you could recommend for such a specific intranet webapp. I have looked at using: Google Web Toolkit (possibly with SmartGWT); Ext JS for fancy widgets in the UI with a REST back-end in Spring MVC; Spring MVC with jQuery UI. Could you please offer any recommendation with regard to the choice I'm going to make?

    Read the article

  • How to add assemblies in a 64-bit machine?

    - by marko
    My old cmd-script: C:\Windows\Microsoft.NET\Framework\v2.0.50727\RegAsm blabla.dll C:\Windows\Microsoft.NET\Framework\v2.0.50727\GacUtil -i blabla.dll (Which works fine in my old machine.) But now I have a script for a 64-bit machine (Windows Server 2008 R2): C:\Windows\Microsoft.NET\Framework64\v2.0.50727\RegAsm blabla.dll C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\NETFX 4.0 Tools\GacUtil -i blabla.dll Then I get this message: C:\Windows\Microsoft.NET\Framework64\v2.0.50727\RegAsm blabla.dll Microsoft (R) .NET Framework Assembly Registration Utility 2.0.50727.5420 Copyright (C) Microsoft Corporation 1998-2004. All rights reserved. Types registered successfully C:\Program Files\Microsoft SDKs\Windows\v7 .1\Bin\NETFX 4.0 Tools\GacUtil -i blabla.dll 'C:\Program' is not recognized as an internal or external command, operable program or batch file. The second command is not successful.
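
    The second failure isn't really a 64-bit problem: the GacUtil path contains spaces ("Program Files", "NETFX 4.0 Tools"), so cmd cuts the command off at 'C:\Program'. Quoting the executable path should be enough; a corrected version of the same script (same paths as above):

    C:\Windows\Microsoft.NET\Framework64\v2.0.50727\RegAsm blabla.dll
    "C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\NETFX 4.0 Tools\GacUtil" -i blabla.dll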

    Read the article

  • What I’m Reading – Microsoft Silverlight 4 Business Application Development: Beginner’s Guide

    - by Dave Campbell
    I don’t have a lot of time for reading lately, so James Patterson and all those guys are *way* ahead of me … but I do try to make time to read technical material. A couple books have come across just recently and I thought I’d mention them one at a time. The book I want to mention tonight is Microsoft Silverlight 4 Business Application Development: Beginner’s Guide : by Cameron Albert and Frank LaVigne. Cameron and Frank are both great guys and you’ve seen their blog posts come across my SilverlightCream posts many times. I like the writing and format of the book. It leads you quite well from one concept to the next and for a technical book, it holds your interest. You can check out a free chapter here. I have the eBook because for technical material, at least lately, I’ve gravitated toward that. I can have it with me on a USB stick at work, or at home. Read the free chapter then check out their blogs. Even if you think you know a lot of this material, I think you’ll find yourself learning something, and besides, it’s a great one-place reference. Good work guys! Technorati Tags: Silverlight 4

    Read the article

  • Need data on disk drive management by OS: getting base I/O unit size, "sync" option, Direct Memory Access

    - by Richard T
    Hello All, I want to ensure I have done all I can to configure a system's disks for serious database use. The three areas I know of (any others?) to be concerned about are: I/O size: the database engine and disk's native size should either match, or the database's native I/O size should be a multiple of the disk's native I/O size. Disks that are capable of Direct Memory Access (eg. IDE) should be configured for it. When a disk says it has written data persistently, it must be so! No keeping it in cache and lying about it. I have been looking for information on how to ensure these are so for CENTOS and Ubuntu, but can't seem to find anything at all! I want to be able to check these things and change them if needed. Any and all input appreciated.
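
    On CentOS/Ubuntu the three checks map onto a handful of standard commands; a sketch assuming the disk shows up as /dev/sda (the device name is a placeholder, and the sysfs attribute needs a reasonably recent kernel):

    # Base I/O unit: logical and physical sector sizes reported by the block layer
    blockdev --getss /dev/sda
    cat /sys/block/sda/queue/physical_block_size

    # DMA: hdparm -I lists the supported/selected transfer modes (the active udma mode is starred)
    hdparm -I /dev/sda

    # Write cache: with caching on, the drive may acknowledge writes before they are persistent;
    # hdparm -W shows the current setting, and -W0 turns it off
    hdparm -W /dev/sda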

    Read the article

  • BizTalk: History of one project architecture

    - by Leonid Ganeline
    "In the beginning God made heaven and earth. Then he started to integrate." At the very start was the requirement: integrate two working systems. Small digging up: It was one system. It was good but IT guys want to change it to the new one, much better, chipper, more flexible, and more progressive in technologies, more suitable for the future, for the faster world and hungry competitors. One thing. One small, little thing. We cannot turn off the old system (call it A, because it was the first), turn on the new one (call it B, because it is second but not the last one). The A has a hundreds users all across a country, they must study B. A still has a lot nice custom features, home-made features that cannot disappear. These features have to be moved to the B and it is a long process, months and months of redevelopment. So, the decision was simple. Let’s move not jump, let’s both systems working side-by-side several months. In this time we could teach the users and move all custom A’s special functionality to B. That automatically means both systems should work side-by-side all these months and use the same data. Data in A and B must be in sync. That’s how the integration projects get birth. Moreover, the specific of the user tasks requires the both systems must be in sync in real-time. Nightly synchronization is not working, absolutely.   First draft The first draft seems simple. Both systems keep data in SQL databases. When data changes, the Create, Update, Delete operations performed on the data, and the sync process could be started. The obvious decision is to use triggers on tables. When we are talking about data, we are talking about several entities. For example, Orders and Items [in Orders]. We decided to use the BizTalk Server to synchronize systems. Why it was chosen is another story. Second draft   Let’s take an example how it works in more details. 1.       User creates a new entity in the A system. This fires an insert trigger on the entity table. Trigger has to pass the message “Entity created”. This message includes all attributes of the new entity, but I focused on the Id of this entity in the A system. Notation for this message is id.A. System A sends id.A to the BizTalk Server. 2.       BizTalk transforms id.A to the format of the system B. This is easiest part and I will not focus on this kind of transformations in the following text. The message on the picture is still id.A but it is in slightly different format, that’s why it is changing in color. BizTalk sends id.A to the system B. 3.       The system B creates the entity on its side. But it uses different id-s for entities, these id-s are id.B. System B saves id.A+id.B. System B sends the message id.A+id.B back to the BizTalk. 4.       BizTalk sends the message id.A+id.B to the system A. 5.       System A saves id.A+id.B. Why both id-s should be saved on both systems? It was one of the next requirements. Users of both systems have to know the systems are in sync or not in sync. Users working with the entity on the system A can see the id.B and use it to switch to the system B and work there with the copy of the same entity. The decision was to store the pairs of entity id-s on both sides. If there is only one id, the entities are not in sync yet (for the Create operation). Third draft Next problem was the reliability of the synchronization. The synchronizing process can be interrupted on each step, when message goes through the wires. 
It can be communication problem, timeout, temporary shutdown one of the systems, the second system cannot be synchronized by some internal reason. There were several potential problems that prevented from enclosing the whole synchronization process in one transaction. Decision was to restart the whole sync process if it was not finished (in case of the error). For this purpose was created an additional service. Let’s call it the Resync service. We still keep the id pairs in both systems, but only for the fast access not for the synchronization process. For the synchronizing these id-s now are kept in one main place, in the Resync service database. The Resync service keeps record as: ·       Id.A ·       Id.B ·       Entity.Type ·       Operation (Create, Update, Delete) ·       IsSyncStarted (true/false) ·       IsSyncFinished (true/false0 The example now looks like: 1.       System A creates id.A. id.A is saved on the A. Id.A is sent to the BizTalk. 2.       BizTalk sends id.A to the Resync and to the B. id.A is saved on the Resync. 3.       System B creates id.B. id.A+id.B are saved on the B. id.A+id.B are sent to the BizTalk. 4.       BizTalk sends id.A+id.B to the Resync and to the A. id.A+id.B are saved on the Resync. 5.       id.A+id.B are saved on the B. Resync changes the IsSyncStarted and IsSyncFinished flags accordingly. The Resync service implements three main methods: ·       Save (id.A, Entity.Type, Operation) ·       Save (id.A, id.B, Entity.Type, Operation) ·       Resync () Two Save() are used to save id-s to the service storage. See in the above example, in 2 and 4 steps. What about the Resync()? It is the method that finishes the interrupted synchronization processes. If Save() is started by the trigger event, the Resync() is working as an independent process. It periodically scans the Resync storage to find out “unfinished” records. Then it restarts the synchronization processes. It tries to synchronize them several times then gives up.     One more thing, both systems A and B must tolerate duplicates of one synchronizing process. Say on the step 3 the system B was not able to send id.A+id.B back. The Resync service must restart the synchronization process that will send the id.A to B second time. In this case system B must just send back again also created id.A+id.B pair without errors. That means “tolerate duplicates”. Fourth draft Next draft was created only because of the aesthetics. As it always happens, aesthetics gave significant performance gain to the whole system. First was the stupid question. Why do we need this additional service with special database? Can we just master the BizTalk to do something like this Resync() does? So the Resync orchestration is doing the same thing as the Resync service. It is started by the Id.A and finished by the id.A+id.B message. The first works as a Start message, the second works as a Finish message.     Here is a diagram the whole process without errors. It is pretty straightforward. The Resync orchestration is waiting for the Finish message specific period of time then resubmits the Id.A message. It resubmits the Id.A message specific number of times then gives up and gets suspended. It can be resubmitted then it starts the whole process again: waiting [, resubmitting [, get suspended]], finishing. Tuning up The Resync orchestration resubmits the id.A message with special “Resubmitted” flag. The subscription filter on the Resync orchestration includes predicate as (Resubmit_Flag != “Resubmitted”). 
That means only the first Sync orchestration starts the Resync orchestration. Other Sync orchestration instantiated by the resubmitting can finish this Resync orchestration but cannot start another instance of the Resync   Here is a diagram where system B was inaccessible for some period of time. The Resync orchestration resubmitted the id.A two times. Then system B got the response the id.A+id.B and this finished the Resync service execution. What is interesting about this, there were submitted several identical id.A messages and only one id.A+id.B message. Because of this, the system B and the Resync must tolerate the duplicate messages. We also told about this requirement for the system B. Now the same requirement is for the Resunc. Let’s assume the system B was very slow in the first response and the Resync service had time to resubmit two id.A messages. System B responded not, as it was in previous case, with one id.A+id.B but with two id.A+id.B messages. First of them finished the Resync execution for the id.A. What about the second id.A+id.B? Where it goes? So, we have to add one more internal requirement. The whole solution must tolerate many identical id.A+id.B messages. It is easy task with the BizTalk. I added the “SinkExtraMessages” subscriber (orchestration with one receive shape), that just get these messages and do nothing. Real design Real architecture is much more complex and interesting. In reality each system can submit several id.A almost simultaneously and completely unordered. There are not only the “Create entity” operation but the Update and Delete operations. And these operations relate each other. Say the Update operation after Delete means not the same as Update after Create. In reality there are entities related each other. Say the Order and Order Items. Change on one of it could start the series of the operations on another. Moreover, the system internals are the “black boxes” and we cannot predict the exact content and order of the operation series. It worth to say, I had to spend a time to manage the zombie message problems. The zombies are still here, but this is not a problem now. And this is another story. What is interesting in the last design? One orchestration works to help another to be more reliable. Why two orchestration design is more reliable, isn’t it something strange? The Synch orchestration takes all the message exchange between systems, here is the area where most of the errors could happen. The Resync orchestration sends and receives messages only within the BizTalk server. Is there another design? Sure. All Resync functionality could be implemented inside the Sync orchestration. Hey guys, some other ideas?

    Read the article

  • Web deploy (msdeploy), syncing everything but sites and pools (but include siteDefaults)

    - by jishi
    Today I do the following to sync two webservers but skip all site configuration: msdeploy -verb:sync -source:webServer -dest:webServer,computerName=web25:8080 -skip:objectName=section,absolutePath=system.applicationHost/sites -skip:objectName=section,absolutePath=system.applicationHost/applicationPools However, this effectively also skips the siteDefaults, which I would like to sync (system.applicationHost/sites/siteDefaults). There doesn't seem to be a way to "include" a section to override the skip directive. And there doesn't seem to be a way to sync only the siteDefaults section from applicationHost either, since the appHostConfig source only seems to sync a specified site, not siteDefaults. Maybe it is possible to "skip" using an XPath expression or similar, to skip only the site nodes but include siteDefaults, but I find the documentation a bit confusing and my XPath is rusty.

    Read the article

  • CodePlex Daily Summary for Saturday, June 16, 2012

    CodePlex Daily Summary for Saturday, June 16, 2012Popular ReleasesCosmos (C# Open Source Managed Operating System): Release 92560: Prerequisites Visual Studio 2010 - Any version including Express. Express users must also install Visual Studio 2010 Integrated Shell runtime VMWare - Cosmos can run on real hardware as well as other virtualization environments but our default debug setup is configured for VMWare. VMWare Player (Free). or Workstation VMWare VIX API 1.11AutoUpdaterdotNET : Autoupdate for VB.NET and C# Developer: AutoUpdater.NET 1.1: Release Notes *New feature added that allows user to select remind later interval.Sumzlib: API document: API documentMicrosoft SQL Server Product Samples: Database: AdventureWorks 2008 OLTP Script: Install AdventureWorks2008 OLTP database from script The AdventureWorks database can be created by running the instawdb.sql DDL script contained in the AdventureWorks 2008 OLTP Script.zip file. The instawdb.sql script depends on two path environment variables: SqlSamplesDatabasePath and SqlSamplesSourceDataPath. The SqlSamplesDatabasePath environment variable is set to the default Microsoft ® SQL Server 2008 path. You will need to change the SqlSamplesSourceDataPath environment variable to th...HigLabo: HigLabo_20120613: Bug fix HigLabo.Mail Decode header encoded by CP1252Jasc (just another script compressor): 1.3.1: Updated Ajax Minifier to 4.55.WipeTouch, a jQuery touch plugin: 1.2.0: Changes since 1.1.0: New: wipeMove event, triggered while moving the mouse/finger. New: added "source" to the result object. Bug fix: sometimes vertical wipe events would not trigger correctly. Bug fix: improved tapToClick handler. General code refactoring. Windows Phone 7 is not supported, yet! Its behaviour is completely broken and would require some special tricks to make it work. Maybe in the future...Phalanger - The PHP Language Compiler for the .NET Framework: 3.0.0.3026 (June 2012): Fixes: round( 0.0 ) local TimeZone name TimeZone search compiling multi-script-assemblies PhpString serialization DocDocument::loadHTMLFile() token_get_all() parse_url()BlackJumboDog: Ver5.6.4: 2012.06.13 Ver5.6.4  (1) Web???????、???POST??????????????????Yahoo! UI Library: YUI Compressor for .Net: Version 2.0.0.0 - Ferret: - Merging both 3.5 and 2.0 codebases to a single .NET 2.0 assembly. - MSBuild Task. - NAnt Task.Bumblebee: Version 0.3.1: Changed default config values to decent ones. Restricted visibility of Hive.fs to internal. Added some XML documentation. Added Array.shuffle utility. The dll is also available on NuGet My apologies, the initial source code referenced was missing one file which prevented it from building The source code contains two examples, one in C#, one in F#, illustrating the usage of the framework on the Travelling Salesman Problem: Source CodeSharePoint XSL Templates: SPXSLT 0.0.9: Added new template FixAmpersands. Fixed the contents of the MultiSelectValueCheck.xsl file, which was missing the stylesheet wrapper.ExcelFileEditor: .CS File: nothingBizTalk Scheduled Task Adapter: Release 4.0: Works with BizTalk Server 2010. Compiled in .NET Framework 4.0. In this new version are available small improvements compared to the current version (3.0). We can highlight the following improvements or changes: 24 hours support in “start time” property. Previous versions had an issue with setting the start time, as it shown 12 hours watch but no AM/PM. Daily scheduler review. 
Solved a small bug on Daily Properties: unable to switch between “Every day” and “on these days” Installation e...Weapsy - ASP.NET MVC CMS: 1.0.0 RC: - Upgrade to Entity Framework 4.3.1 - Added AutoMapper custom version (by nopCommerce Team) - Added missed model properties and localization resources of Plugin Definitions - Minor changes - Fixed some bugsXenta Framework - extensible enterprise n-tier application framework: Xenta Framework 1.8.0 Beta: Catalog and Publication reviews and ratings Store language packs in data base Improve reporting system Improve Import/Export system A lot of WebAdmin app UI improvements Initial implementation of the WebForum app DB indexes Improve and simplify architecture Less abstractions Modernize architecture Improve, simplify and unify API Simplify and improve testing A lot of new unit tests Codebase refactoring and ReSharpering Utilize Castle Windsor Utilize NHibernate ORM ...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.55: Properly handle IE extension to CSS3 grammar that allows for multiple parameters to functional pseudo-class selectors. add new switch -braces:(new|same) that affects where opening braces are placed in multi-line output. The default, "new" puts them on their own new line; "same" outputs them at the end of the previous line. add new optional values to the -inline switch: -inline:(force|noforce), which can be combined with the existing boolean value via comma-separators; value "force" (which...Microsoft Media Platform: Player Framework: MMP Player Framework 2.7 (Silverlight and WP7): Additional DownloadsSMFv2.7 Full Installer (MSI) - This will install everything you need in order to develop your own SMF player application, including the IIS Smooth Streaming Client. It only includes the assemblies. If you want the source code please follow the link above. Smooth Streaming Sample Player - This is a pre-built player that includes support for IIS Smooth Streaming. You can configure the player to playback your content by simplying editing a configuration file - no need to co...Liberty: v3.2.1.0 Release 10th June 2012: Change Log -Added -Liberty is now digitally signed! If the certificate on Liberty.exe is missing, invalid, or does not state that it was developed by "Xbox Chaos, Open Source Developer," your copy of Liberty may have been altered in some (possibly malicious) way. -Reach Mass biped max health and shield changer -Fixed -H3/ODST Fixed all of the glitches that users kept reporting (also reverted the changes made in 3.2.0.2) -Reach Made some tag names clearer and more consistent between m...Media Companion: Media Companion 3.503b: It has been a while, so it's about time we release another build! Major effort has been for fixing trailer downloads, plus a little bit of work for episode guide tag in TV show NFOs.New Projects.NinJa (dotNinja): An extensive JavaScript Framework revolving around principles found in .NET and aiming to integrate full Intellisense support. bab-rizg: solve unemployment problemBizTalk Multi-part Message Attachments Zipper Pipeline Component: This pipeline component replaces all attachments of a multi-part message, in a send pipeline, for its zipped equivalent.Boggle.Net: A basic implementation of Boggle for WPF.CFScript: CFScript is an ANT-like scripting system for Compact Framework. 
Tasks like copying files, setting registry values o install CAB files can be done with CFScript.Diablo3: Diablo3Dygraphs.NET: Dygraphs.NETDynamics CRM plugin for nopCommerce: This plugins is a bridge between nopCommerce and Dynamics CRM. nms.gaming: Place holderProject Bright Star: Project Bright Star. Deal with it.RDFSharp: RDFSharp is a library designed to ease the development of .NET applications based on the RDF and Semantic Web data model.SlamCMS: An application framework that allows you to build content managed sites leveraging SharePoint 2010 for publishing with tools to query and manifest your data.test02: no

    Read the article

  • What tools, libraries, or framework is needed to create a completely offline Javascript application?

    - by makerofthings7
    I am interested in creating an HTML application that can run as disconnected from the server as possible. Two examples of this include OWA in Exchange 2013 and, to a lesser extent, the client available at www.ripple.com. With the focus on OWA in Exchange 2013, what is needed to replicate that offline functionality in a different application? A list of technologies, frameworks, etc. would be immensely helpful.

    Read the article

  • SQL2008R2 install issues on windows 7 - unable to install setup support files?

    - by Liam
    I am trying to install the above but am getting the following errors when its attempting to install the setup support files, This is the first error that occurs during installation of the setup support files TITLE: Microsoft SQL Server 2008 R2 Setup ------------------------------ The following error has occurred: The installer has encountered an unexpected error. The error code is 2337. Could not close file: Microsoft.SqlServer.GridControl.dll GetLastError: 0. Click 'Retry' to retry the failed action, or click 'Cancel' to cancel this action and continue setup. For help, click: http://go.microsoft.com/fwlink?LinkID=20476&ProdName=Microsoft+SQL+Server&EvtSrc=setup.rll&EvtID=50000&ProdVer=10.50.1600.1&EvtType=0xDF039760%25401201%25401 This is the second error that occurs after clicking continue in the installer after the first error is generated TITLE: Microsoft SQL Server 2008 R2 Setup ------------------------------ The following error has occurred: SQL Server Setup has encountered an error when running a Windows Installer file. Windows Installer error message: The Windows Installer Service could not be accessed. This can occur if the Windows Installer is not correctly installed. Contact your support personnel for assistance. Windows Installer file: C:\Users\watto_uk\Desktop\In-Digital\Software\Microsoft\SQL Server 2008 R2\1033_ENU_LP\x64\setup\sqlsupport_msi\SqlSupport.msi Windows Installer log file: C:\Program Files\Microsoft SQL Server\100\Setup Bootstrap\Log\20110713_205508\SqlSupport_Cpu64_1_ComponentUpdate.log Click 'Retry' to retry the failed action, or click 'Cancel' to cancel this action and continue setup. For help, click: http://go.microsoft.com/fwlink?LinkID=20476&ProdName=Microsoft+SQL+Server&EvtSrc=setup.rll&EvtID=50000&ProdVer=10.50.1600.1&EvtType=0xDC80C325 These errors are generated from an ISO package downloaded from Microsoft. I have also tried using the web platform installer to install the express version instead but the SQL Server Installation fails with that also. The management studio installs fine but not the server. I have checked to make sure that the Windows Installer is started and it is. Cant seem to find an answer for this anywhere as all previous reported issues appear to be related to XP. I did have the express edition installed on the machine previously but uninstalled it to upgrade to the full version, I wish I hadn't now. Can anyone kindly offer any advice or point me in the right direction to stop me going insane with this? Any advice will be appreciated. Update======================= After digging a bit deeper ive located details of the error from the setup log file, i can also upload the log file if required. MSI (s) (E8:28) [23:35:18:705]: Assembly Error:The module '%1' was expected to contain an assembly manifest. MSI (s) (E8:28) [23:35:18:705]: Note: 1: 1935 2: 3: 0x80131018 4: IStream 5: Commit 6: MSI (s) (E8:28) [23:35:18:705]: Note: 1: 2337 2: 0 3: Microsoft.SqlServer.GridControl.dll MSI (s) (E8:28) [23:35:22:869]: Product: Microsoft SQL Server 2008 R2 Setup (English) -- Error 2337. The installer has encountered an unexpected error. The error code is 2337. Could not close file: Microsoft.SqlServer.GridControl.dll GetLastError: 0. MSI (s) (E8:28) [23:35:22:916]: Internal Exception during install operation: 0xc0000005 at 0x000007FEE908A23E. MSI (s) (E8:28) [23:35:22:916]: WER report disabled for silent install. MSI (s) (E8:28) [23:35:22:932]: Internal MSI error. Installer terminated prematurely. Error 2337. 
The installer has encountered an unexpected error. The error code is 2337. Could not close file: Microsoft.SqlServer.GridControl.dll GetLastError: 0. MSI (s) (E8:28) [23:35:22:932]: MainEngineThread is returning 1603 MSI (s) (E8:58) [23:35:22:932]: RESTART MANAGER: Session closed. Installer stopped prematurely. MSI (c) (0C:14) [23:35:22:947]: Decrementing counter to disable shutdown. If counter >= 0, shutdown will be denied. Counter after decrement: -1 MSI (c) (0C:14) [23:35:22:947]: MainEngineThread is returning 1601 === Verbose logging stopped: 13/07/2011 23:35:22 ===

    Read the article

  • My system administrator set up 2 databases that sync. Master-Master. However, these two databases a

    - by Alex
    DB1 and DB2. I made changes to DB1, and it does not seem to be on DB2. When I do "SHOW SLAVE STATUS\G" on DB2, there seems to be an error: mysql> show slave status\G *************************** 1. row *************************** Slave_IO_State: Waiting for master to send event Master_Host: Master_User: Master_Port: Connect_Retry: 60 Master_Log_File: mysql-bin.0005496 Read_Master_Log_Pos: 5445649315 Relay_Log_File: mysqld-relay-bin.0041705 Relay_Log_Pos: 1624302119 Relay_Master_Log_File: mysql-bin.0004461 Slave_IO_Running: Yes Slave_SQL_Running: No Replicate_Do_DB: Replicate_Ignore_DB: Replicate_Do_Table: Replicate_Ignore_Table: Replicate_Wild_Do_Table: Replicate_Wild_Ignore_Table: Last_Errno: 1062 Last_Error: Error 'Duplicate entry '4779' for key 1' on query. Default database: 'falc'. Query: 'INSERT INTO `log` (`anon_id`, `created_at`, `query`, `episode_url`, `detail_id`, `ip`) VALUES ('fdzn1d45kMavF4qbyePv', '2009-11-19 04:19:13', 'amazon', '', '', '130.126.40.57')' Skip_Counter: 0 Exec_Master_Log_Pos: 162301982 Relay_Log_Space: 136505187184 Until_Condition: None Until_Log_File: Until_Log_Pos: 0 Master_SSL_Allowed: No Master_SSL_CA_File: Master_SSL_CA_Path: Master_SSL_Cert: Master_SSL_Cipher: Master_SSL_Key: Seconds_Behind_Master: NULL 1 row in set (0.00 sec) Then, I did show tables, and it seems like DB2 is lacking a table that I created on DB1...that means that for some reason, DB2 stopped syncing with DB1. How can I simply allow them to be in full synchronization again? All I want is DB2 to be exactly the same as DB1!
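
    For context, the slave SQL thread on DB2 has stopped on a duplicate-key error (1062) and will not apply anything further until that statement is handled. The commonly used quick remedy, sketched below, is to skip the offending statement and restart replication; in a master-master setup, though, it is worth first working out why the row already exists (or re-seeding DB2 from a fresh dump of DB1), because skipping statements can leave the two databases silently different:

    -- Run on DB2, the side reporting Slave_SQL_Running: No
    STOP SLAVE;
    -- Skip the duplicate INSERT that stopped the SQL thread
    SET GLOBAL SQL_SLAVE_SKIP_COUNTER = 1;
    START SLAVE;
    -- Then confirm Slave_SQL_Running is back to Yes
    SHOW SLAVE STATUS\G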

    Read the article

  • Why isn't Stripes popular, even though it's an awesome web framework?

    - by Mr.Chowdary
    I'm new to Stripes. I have worked on MVC frameworks like Struts 1.x and 2.x. When I started learning Stripes, I found its features awesome and very lightweight; it has in-depth validation and offers easy integration with other frameworks too. There is no configuration and everything is simplified with annotations. I don't understand why Stripes is not popular compared with other Java web frameworks like Struts or JSF. I didn't find any drawbacks in Stripes. Any ideas why?

    Read the article

  • What is required for a scope in an injection framework?

    - by johncarl
    Working with libraries like Seam, Guice and Spring I have become accustomed to dealing with variables within a scope. These libraries give you a handful of scopes and allow you to define your own. This is a very handy pattern for dealing with variable lifecycles and dependency injection. I have been trying to identify where scoping is the proper solution, and where another solution is more appropriate (context variable, singleton, etc). I have found that if the scope lifecycle is not well defined, it is very difficult and often failure-prone to manage injections this way. I have searched on this topic but have found little discussion of the pattern. Are there any good articles discussing where to use scoping and what the required/suggested prerequisites for scoping are? I'm interested both in reference discussion and in your view on what is required or suggested for a proper scope implementation. Keep in mind that I am referring to scoping as a general idea; this includes things like globally scoped singletons, request- or session-scoped web variables, conversation scopes, and others.

    Edit: Some simple background on custom scopes: Google Guice custom scope. Some definitions relevant to the above:

    “scoping” - A set of requirements that define what objects get injected at what time. A simple example of this is thread scope, based on a ThreadLocal: the scope would inject a variable based on what thread instantiated the class (see the Guice custom scope link above for an example implementation).

    “context variable” - A repository passed from one object to another holding relevant variables. Much like scoping, this is a more brute-force way of accessing variables based on the calling code. Example: void methodOne(Context context){ methodTwo(context); } void methodTwo(Context context){ ... // same context as in methodOne, if called from methodOne }

    “globally scoped singleton” - Following the singleton pattern, there is one object per application instance. This applies to scopes because there is a basic lifecycle to this object: there is only one of these objects instantiated. Here's an example of a JSR-330 Singleton-scoped object: @Singleton public class SingletonExample { ... } Usage: public class One { @Inject SingletonExample example1; } public class Two { @Inject SingletonExample example2; } After injection: one.example1 == two.example2 // true

    Read the article

  • Why won't Mail sync To Do/Tasks with Exchange?

    - by cebjyre
    I'm using Apple Mail (Snow Leopard, everything is fully up-to-date), and am happily using an Exchange 2007 server for email needs, but I can't get it to synchronise the To Do notes from Mail with the Tasks from Exchange. I've tried creating a task in each and neither of them went to the other side. Bizarrely I have a single task from before I actually upgraded to Snow Leopard that did get into Mail from Exchange. Right-clicking on the Inbox and hitting 'Get Account Info' in Mail reports the correct number of entries in the 'To Do' folder for 'Messages on Server'.

    Read the article
