Search Results

Search found 5262 results on 211 pages for 'operation'.


  • Does SQL Server Management Studio 2008 Activity Monitor work with SQL Server 2000?

    - by Andrew Janke
    I am trying to use SQL Server Management Studio 2008's Activity Monitor with a SQL Server 2000 instance to diagnose some query performance issues. I can connect SSMS 2008 to the db fine, and use it to browse objects and run queries. But when I press the Activity Monitor button, it pops up an error message saying:

        Microsoft SQL Server Management Studio
        This operation does not support connections to Microsoft SQL Server Personal Edition version 8.00.818.

    This MSDN article implies that Activity Monitor works with SQL Server 2000. Is it the fact that it's Personal Edition that's preventing it from working? The error message isn't clear about whether the edition or the version is the problem.
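
    (Not part of the original question, but a quick way to confirm what the instance reports itself as; SERVERPROPERTY exists on SQL Server 2000, so this should run against the 8.00.818 instance.)

        SELECT SERVERPROPERTY('Edition')        AS Edition,
               SERVERPROPERTY('ProductVersion') AS ProductVersion;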

    Read the article

  • Unkillable process problem

    - by skevar7
    I closed an application, but its process remained in the list. I try to kill it from Task Manager, but nothing happens: no error messages, the process just stays in the list. I tried to debug it, but the debugger says:

        Unable to attach to the crashing process. The requested operation is not supported.

    This happens sometimes with VS2008 and some other programs. What? The? Hell? How do I kill it?
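
    (Not from the original post; the usual escalation path on Windows is a forced kill by PID. A minimal sketch, where devenv and PID 1234 are placeholders; note that if the process is stuck inside the kernel, even a forced kill can fail.)

        REM find the PID of the stuck process
        tasklist | findstr /i devenv

        REM force-kill by PID; /F forces, /T also kills child processes
        taskkill /F /T /PID 1234

        REM Sysinternals alternative if taskkill fails
        pskill 1234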

    Read the article

  • Linux Raid: Can mdadm --grow a raid1 while mounted?

    - by Chris
    I have two 500 GB drives in a RAID1 setup that I needed to upgrade for more space. I mdadm --fail'ed each drive in turn, used dd to copy each drive to its respective larger drive (2 TB each), removed the smaller drives and replaced them with the larger ones, then reassembled the array and forced a resync. So now I've got a 500 GB RAID1 sitting on 2 TB drives, and I wish to grow it. The plan is to use mdadm --manage /dev/md0 --grow to grow the array, then boot a rescue CD, assemble the array under that environment, and run resize2fs on it. Can I use mdadm --grow on a mounted and live filesystem? Also, do I need more options to make sure the grow operation stays RAID1?
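
    (A sketch of the grow sequence being described, assuming the array is /dev/md0 with an ext3/ext4 filesystem; the device name is a placeholder, and whether the resize2fs step can be done online rather than from a rescue CD is part of what the question is asking.)

        # grow the md device to fill the new 2 TB partitions
        mdadm --grow /dev/md0 --size=max

        # watch the resync complete
        cat /proc/mdstat

        # then grow the filesystem to fill the device
        resize2fs /dev/md0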

    Read the article

  • How to use XSLT to tag specific nodes with unique, sequential, increasing integer ids?

    - by ~otakuj462
    Hi, I'm trying to use XSLT to transform a document by tagging a group of XML nodes with integer ids, starting at 0 and increasing by one for each node in the group. The XML passed into the stylesheet should be echoed out, but augmented to include this extra information. Just to be clear about what I am talking about, here is how this transformation would be expressed using the DOM:

        states = document.getElementsByTagName("state");
        for (i = 0; i < states.length; i++) {
            states[i].stateNum = i;
        }

    This is very simple with the DOM, but I'm having much more trouble doing it with XSLT. The current strategy I've devised is to start with the identity transformation, then create a global variable which selects and stores all of the nodes that I wish to number. I then create a template that matches that kind of node. The idea, then, is that in the template I would look up the matched node's position in the global variable's nodelist, which would give me a unique number that I could then set as an attribute. The problem with this approach is that the position function can only be used with the context node, so something like the following is illegal:

        <template match="state">
            <variable name="stateId" select="@id"/>
            <variable name="uniqueStateNum" select="$globalVariable[@id = $stateId]/position()"/>
        </template>

    The same is true for the following:

        <template match="state">
            <variable name="stateId" select="@id"/>
            <variable name="stateNum" select="position($globalVariable[@id = $stateId])"/>
        </template>

    In order to use position() to look up the position of an element in $globalVariable, the context node must be changed. I have found a solution, but it is highly suboptimal. Basically, in the template, I use for-each to iterate through the global variable. for-each changes the context node, so this allows me to use position() in the way I described. The problem is that this turns what would normally be an O(n) operation into an O(n^2) operation, where n is the length of the nodelist, as this requires iterating through the whole list whenever the template is matched. I think that there must be a more elegant solution. Altogether, here is my current (slightly simplified) XSLT stylesheet:

        <?xml version="1.0"?>
        <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                        xmlns:s="http://www.w3.org/2005/07/scxml"
                        xmlns="http://www.w3.org/2005/07/scxml"
                        xmlns:c="http://msdl.cs.mcgill.ca/"
                        version="1.0">
            <xsl:output method="xml"/>

            <!-- we copy them, so that we can use their positions as identifiers -->
            <xsl:variable name="states" select="//s:state"/>

            <!-- identity transform -->
            <xsl:template match="@*|node()">
                <xsl:copy>
                    <xsl:apply-templates select="@*|node()"/>
                </xsl:copy>
            </xsl:template>

            <xsl:template match="s:state">
                <xsl:variable name="stateId">
                    <xsl:value-of select="@id"/>
                </xsl:variable>
                <xsl:copy>
                    <xsl:apply-templates select="@*"/>
                    <xsl:for-each select="$states">
                        <xsl:if test="@id = $stateId">
                            <xsl:attribute name="stateNum" namespace="http://msdl.cs.mcgill.ca/">
                                <xsl:value-of select="position()"/>
                            </xsl:attribute>
                        </xsl:if>
                    </xsl:for-each>
                    <xsl:apply-templates select="node()"/>
                </xsl:copy>
            </xsl:template>
        </xsl:stylesheet>

    I'd appreciate any advice anyone can offer. Thanks.
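
    (Not from the original post, but one standard XSLT 1.0 idiom for this is to count the matched node's preceding occurrences directly, which avoids the O(n^2) for-each entirely; a sketch using the question's namespaces, producing the zero-based numbering asked for. Add 1 to match the 1-based position() output of the current stylesheet.)

        <xsl:template match="s:state">
            <xsl:copy>
                <xsl:apply-templates select="@*"/>
                <!-- zero-based document-order index among all s:state elements -->
                <xsl:attribute name="stateNum" namespace="http://msdl.cs.mcgill.ca/">
                    <xsl:value-of select="count(preceding::s:state) + count(ancestor::s:state)"/>
                </xsl:attribute>
                <xsl:apply-templates select="node()"/>
            </xsl:copy>
        </xsl:template>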

    Read the article

  • When I connect my iPhone 3GS to my PC, iTunes "freezes": Activity Monitor shows iTunes as "Not Responding"

    - by reachmanoj74
    When I connect my iPhone 3GS to my PC, iTunes "freezes": Activity Monitor shows iTunes as "Not Responding". The iPhone indicates that it is being charged, and otherwise seems normal. When I disconnect the iPhone, iTunes immediately returns to normal operation. Connect again, and it instantly freezes again. I have rebooted the PC, uninstalled and reinstalled iTunes, and reset the iPhone. No help. Everything worked fine yesterday. I haven't installed software or changed anything since the last time I synced. The iPhone 3GS is running version 3.1.2 (7D11). It is jailbroken.

    Read the article

  • Why does Outlook 2010 give the message "Creating a new item from the selected items could take some time...are you sure you want to create a new item...?"

    - by Matt
    I'm using Outlook 2010 with Exchange 2007. I am moving emails from my Deleted Items folder to a user-created folder. When I move a "low" number of messages, say a few hundred or less, the operation completes successfully. When I move a "large" number of messages (in this example, over 800), I get the message shown in the screenshot below. If I click Yes, a new email is generated with links to all the emails I selected in the Attachment field. When I cancel that email, not only have the messages not moved, but they appear to be deleted entirely. What does the message mean, and why is it presented? Why does clicking Yes produce the behavior I described above?

    Read the article

  • Opening spWeb.ContentTypes gives SOAP Exception 0x80004004

    - by mdi
    Hi everybody! I have code that goes through the SharePoint content types and changes the display names of the fields that need it. On my local server everything works fine, but on the client's side it gives me an error:

        Microsoft.SharePoint.SPException: Operation aborted (Exception from HRESULT: 0x80004004 (E_ABORT)) ---> System.Runtime.InteropServices.COMException (0x80004004): Operation aborted (Exception from HRESULT: 0x80004004 (E_ABORT))
           at Microsoft.SharePoint.Library.SPRequestInternalClass.OpenWebInternal(String bstrUrl, Guid& pguidID, String& pbstrRequestAccessEmail, UInt32& pwebVersion, String& pbstrServerRelativeUrl, UInt32& pnLanguage, UInt32& pnLocale, String& pbstrDefaultTheme, String& pbstrDefaultThemeCSSUrl, String& pbstrAlternateCSSUrl, String& pbstrCustomizedCssFileList, String& pbstrCustomJSUrl, String& pbstrAlternateHeaderUrl, String& pbstrMasterUrl, String& pbstrCustomMasterUrl, String& pbstrSiteLogoUrl, String& pbstrSiteLogoDescription, Object& pvarUser, Boolean& pvarIsAuditor, Int32& plSiteFlags)
           at Microsoft.SharePoint.Library.SPRequest.OpenWebInternal(String bstrUrl, Guid& pguidID, String& pbstrRequestAccessEmail, UInt32& pwebVersion, String& pbstrServerRelativeUrl, UInt32& pnLanguage, UInt32& pnLocale, String& pbstrDefaultTheme, String& pbstrDefaultThemeCSSUrl, String& pbstrAlternateCSSUrl, String& pbstrCustomizedCssFileList, String& pbstrCustomJSUrl, String& pbstrAlternateHeaderUrl, String& pbstrMasterUrl, String& pbstrCustomMasterUrl, String& pbstrSiteLogoUrl, String& pbstrSiteLogoDescription, Object& pvarUser, Boolean& pvarIsAuditor, Int32& plSiteFlags)
           --- End of inner exception stack trace ---
           at Microsoft.SharePoint.Library.SPRequest.OpenWebInternal(String bstrUrl, Guid& pguidID, String& pbstrRequestAccessEmail, UInt32& pwebVersion, String& pbstrServerRelativeUrl, UInt32& pnLanguage, UInt32& pnLocale, String& pbstrDefaultTheme, String& pbstrDefaultThemeCSSUrl, String& pbstrAlternateCSSUrl, String& pbstrCustomizedCssFileList, String& pbstrCustomJSUrl, String& pbstrAlternateHeaderUrl, String& pbstrMasterUrl, String& pbstrCustomMasterUrl, String& pbstrSiteLogoUrl, String& pbstrSiteLogoDescription, Object& pvarUser, Boolean& pvarIsAuditor, Int32& plSiteFlags)
           at Microsoft.SharePoint.SPWeb.InitWebPublic()
           at Microsoft.SharePoint.SPWeb.get_ServerRelativeUrl()
           at Microsoft.SharePoint.SPWeb.get_Url()
           at Microsoft.SharePoint.SPContentTypeCollection.FetchCollection()
           at Microsoft.SharePoint.SPContentTypeCollection..ctor(SPWeb web, Boolean bAll)
           at Microsoft.SharePoint.SPWeb.get_ContentTypes()

    The code is below:

        SPWebApplication webApp = SPWebService.ContentService.WebApplications[someGuid];
        foreach (SPSite spSite in webApp.Sites)
        {
            using (SPWeb spWeb = spSite.RootWeb)
            {
                try
                {
                    foreach (SPContentType spContentType in spWeb.ContentTypes)
                    {
                        // ...
                    }
                }
                // catch/cleanup elided in the original
            }
        }

    Could anybody provide me with a workaround, or with the reason for the problem?
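
    (Not from the original post, and it may not be the cause of the E_ABORT, but the snippet has a known SharePoint disposal pitfall worth ruling out: SPSite objects enumerated from webApp.Sites must be disposed by the caller, while SPSite.RootWeb must not be disposed explicitly. A sketch of the corrected pattern:)

        foreach (SPSite spSite in webApp.Sites)
        {
            try
            {
                // do not wrap RootWeb in using(); the SPSite owns and disposes it
                SPWeb spWeb = spSite.RootWeb;
                foreach (SPContentType spContentType in spWeb.ContentTypes)
                {
                    // ...
                }
            }
            finally
            {
                // sites enumerated from SPWebApplication.Sites are caller-owned
                spSite.Dispose();
            }
        }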

    Read the article

  • Ubuntu Linux -- create custom burnable/bootable DVD image?

    - by ashgromnies
    I recently developed some kiosk software that runs on Ubuntu Linux, and my client needs me to set up ten more computers with the complete software package (and that number will only grow in the future). So I'm looking for a way to make this less of a pain in the neck and to keep me from shooting myself in the foot -- I had to disable some things on the installed operating systems, like screensavers and automatic updates, that would pop up and disrupt the kiosk operation. I don't feel comfortable doing that by hand across 10 computers; it seems stupid. Does anybody have recommendations for software that would let me burn an installable DVD with a complete image of the hard drive from one of the devices? I've looked at Clonezilla, G4L, and PartImage, and I'm still not quite sure whether any of them offers what I need. I know PartImage for sure won't work, because it doesn't support ext4.
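
    (Not the burnable-DVD workflow being asked for, but for comparison this is the raw-image operation those tools wrap; a minimal dd sketch where /dev/sda and the mount point are placeholders, run from a live CD with the source disk unmounted. A DVD-based restore would additionally need the image split and burned.)

        # capture the master machine's disk to an image file
        sudo dd if=/dev/sda of=/mnt/external/kiosk-master.img bs=4M conv=sync,noerror

        # restore onto a target machine (same-size or larger disk) booted from a live CD
        sudo dd if=/mnt/external/kiosk-master.img of=/dev/sda bs=4M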

    Read the article

  • Unable to format/add a partition in Windows XP

    - by sma
    I recently added a 1 TB disk to an XP machine. I found the disk in Disk Management and clicked "Initialize" to initialize it, then created a primary partition of 950 GB and selected quick format to format the disk. Disk Management then complains:

        The disk configuration operation did not complete. Check the system event log for more information on the error. Verify the status of your storage devices before retrying. If that does not solve the problem, close the disk management console, then restart disk management or restart the computer.

    What could be the reason?
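
    (Not from the original question, but a command-line cross-check that bypasses the Disk Management GUI; "disk 1" is an assumption, so verify with "list disk" first, since picking the wrong disk destroys its data.)

        diskpart
        list disk
        select disk 1
        clean
        create partition primary
        assign letter=E
        exit

        format E: /FS:NTFS /Q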

    Read the article

  • Ruby on Rails server is intermittently slow

    - by Richard
    My Rails installation was chugging along nicely. Last night we had to perform a hot patch, which was really a standard deploy of some exception code. Once Capistrano finished the operation, one of our admins discovered that there were two long-running Passenger processes. While we have deployed releases over the past two weeks, it would appear that these processes had been there, alive, the whole time. Granted, they could have been zombies or some other artifact; at this point we do not know what state they were in. Which leads me to the question: there are so many moving parts between the Rails application and the OS/hardware that being an SME is probably no longer possible. So how does a sysadmin perform root-cause analysis with any certainty? And: when do I just start rebooting servers?
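
    (Not from the original post; Passenger ships inspection tools that would have surfaced the stale workers, and they are a reasonable first stop for this kind of root-cause work.)

        # list Passenger worker processes with their status
        passenger-status

        # per-process memory breakdown
        passenger-memory-stats

        # cross-check process ages against the deploy time
        ps -eo pid,etime,rss,command | grep -i passenger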

    Read the article

  • Office 2007 Mail Merge: How do I view field names instead of data?

    - by One Monkey
    I've just received, as an attachment, a document which forms the basis of a mail merge, and I need to view the field names as they display in Word 2003, with the double chevrons, e.g. <<titles>><<initials>><<surname>>. However, even though I get a dialogue as I open the docx file saying that it is going to attempt to merge from a file (which I don't have), and I cancel that operation, the document still displays merge data, e.g. "Mr A Test", instead of the field names. I have clicked on the fields, which turn grey, demonstrating that they are fields, but I can't find a way to make the document display the field names rather than the data. I don't even know where it's getting the data from, as I don't have the data source file for the document to use.

    Read the article

  • How can I delete Time Machine files using the command line

    - by Tim
    I want to delete some files/directories from my Time Machine partition using rm, but am unable to do so. I'm pretty sure the problem is related to some sort of access-control extended attributes on files in the backup, but I don't know how to override/disable them in order to get rm to work. An example of the error I'm getting is:

        % sudo rm -rf Backups.backupdb/MacBook/Latest/MacBook/somedir
        rm: Backups.backupdb/MacBook/Latest/MacBook/somedir: Directory not empty
        rm: Backups.backupdb/MacBook/Latest/MacBook/somedir/somefile: Operation not permitted

    There are a number of reasons I do not want to use either the Time Machine GUI or Finder for this. If possible, I'd like to maintain the extended protection for all other files (I'd rather not disable it globally, unless I can re-enable it once I've done my work).
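
    (One approach often suggested for this, though the helper path below is from Snow Leopard-era OS X and should be verified on your version: Time Machine's own "bypass" helper runs a command with the backup safety net disabled, leaving the protections on everything else intact.)

        # path may differ or not exist on other OS X versions
        sudo /System/Library/Extensions/TMSafetyNet.kext/Contents/Helpers/bypass \
            rm -rf Backups.backupdb/MacBook/Latest/MacBook/somedir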

    Read the article

  • Is a memory upgrade a viable option to fix performance issues? [closed]

    - by ratchet freak
    I'm currently seeing my PC get bogged down by Firefox 11.0 alone, with only one hundred tabs open. This results in memory use of over 530 MB, a VM size of over 800 MB, and an insane number of page faults (easily reaching 100 million over the course of the day). The PF delta during normal operation easily reaches 7k, with peaks to 15k and sometimes over 20k. This leads to a real deterioration in response time when switching, opening and closing tabs, opening menus, typing, etc. My question is: am I right in assuming that plugging in more RAM (either adding 2x1 GB, or replacing the existing RAM with 2x2 GB or 4x1 GB) will solve this problem? My specs:

        Windows XP Home Edition SP3 (32-bit)
        Intel Core Duo 2.4 GHz, 4 MB unified cache
        2x512 MB RAM, 800 MHz DDR2 (dual channel)
        320 GB HDD
        Intel G33 (X3100) onboard graphics (no graphics card, but a PCI Express x16 slot is available)

    Read the article

  • What is the best way to shut down a hard disk?

    - by Sunil
    Right now I'm using the hdparm command on Unix to shut down the hard disk, but there are a few issues with it:

    1. When the disk wakes back up, it consumes a lot of power. Is there any other way to do it?
    2. Many times when I put my hard disk to sleep, I see a few bursts of activity at the beginning, and then after a while it goes to sleep. I think it's because of the journaling filesystem in Ubuntu (which I use). Has anybody encountered that?
    3. What would be the best Linux/Unix operating system (e.g. Ubuntu/CentOS/RedHat) for extensive hard disk operations?

    I would highly appreciate it if you could share any problems you encountered while doing this operation.
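
    (For reference, the hdparm invocations in question; /dev/sdb is a placeholder.)

        # spin the drive down into standby immediately
        sudo hdparm -y /dev/sdb

        # or the lowest-power sleep state (needs a reset to wake)
        sudo hdparm -Y /dev/sdb

        # or set an idle spindown timeout: 120 * 5 s = 10 minutes
        sudo hdparm -S 120 /dev/sdb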

    Read the article

  • Is it possible to change a user's home directory permissions in OS X?

    - by Sosiska
    Most of our staff use OS X as their main operating system. The problem is that recently we were attacked with some odd malware: users receive a zip file via mail, and when they open this zip file they execute the binary keylogger malware that is inside it (one click is enough). We have some non-technical limitations, and due to these limitations we can't configure the users' mail servers. But we do have physical access to their laptops. As far as I know, it is possible to mount a user's home directory without the "x" (execute) permission in Linux and *BSD, so users can't run a binary file inside the home directory. Is it possible to configure OS X so that users can't execute files inside /Users/?
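
    (For comparison, the Linux/*BSD behaviour the question refers to is the noexec mount option; a minimal /etc/fstab sketch with a placeholder device and filesystem. Whether OS X can do the equivalent for /Users is exactly what is being asked.)

        # /etc/fstab: mount /home with execution and setuid disabled
        /dev/sda3  /home  ext4  defaults,noexec,nosuid  0  2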

    Read the article

  • Sharing RAM resources between 2 or more computers

    - by davee44
    I know there was a somewhat similar question before: "How to share CPU or RAM?" But let me just specify it a little more... When Microsoft Windows requires more RAM capacity than is available, it uses a swap file to temporarily store the data there; this is effectively hard-drive-based RAM, and the technology has been used for many years. Theoretically, it shouldn't be too hard to implement a similar technology that uses the RAM of other computer(s) on the network for temporary data storage. This just requires software running on the computers in the network that accepts data from the main computer, keeps it in RAM, and returns it on request; plus the operating system of the main computer must have the ability to use computers on the network instead of (or in addition to) the swap file. I wonder, are there any implementations of this idea? This would allow users to build RAM clusters using all of their home or office computers, boosting the performance of a single computer for development/gaming/video tasks, etc.
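
    (There is at least one concrete realization of this on Linux: swap on a network block device. A sketch, with host, port, and paths as placeholders; note that swapping over the network is fragile and can deadlock under memory pressure, which is one reason it never became mainstream.)

        # on the machine donating RAM (the file can live on a tmpfs ramdisk)
        nbd-server 2000 /ramdisk/swapfile

        # on the main machine
        nbd-client donor-host 2000 /dev/nbd0
        mkswap /dev/nbd0
        swapon /dev/nbd0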

    Read the article

  • Rearrange content of a file

    - by VikJES
    I'd like to rearrange the content of a file on a per-line basis (see below), ideally without using Perl or Python (I'm not allowed to... don't ask). The input file contains unordered header lines and lines with backup operation results. The output file should contain the lines ordered as shown below.

    Original file:

        Completed Backups
        Backups with Warnings
        Failed Backups
        Server A backup was completed with warnings
        Server B backup was successful
        Server C backup failed
        Server D backup was completed with warnings

    End result:

        Completed Backups
        Server B backup was successful
        Backups with Warnings
        Server A backup was completed with warnings
        Server D backup was completed with warnings
        Failed Backups
        Server C backup failed
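
    (Since Perl and Python are out, a plain-shell sketch; backups.txt and the exact result wording are assumptions taken from the example above.)

        #!/bin/sh
        # emit each header followed by its matching result lines
        {
            echo "Completed Backups"
            grep "was successful" backups.txt
            echo "Backups with Warnings"
            grep "completed with warnings" backups.txt
            echo "Failed Backups"
            grep "backup failed" backups.txt
        } > sorted.txt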

    Read the article

  • Firefox addon to display all shortcuts

    - by p1
    Is there a Firefox addon that would display all the shortcuts on a web page, and also the browser shortcuts? For example, "c" is the keyboard shortcut for compose in Gmail. So either on a particular key combination, or on hovering over the Compose button, it should show a "c" to indicate there is a shortcut for this operation. I figure that if we keep seeing the shortcuts pop up each time we perform an action, we can start using and remembering more and more of them. Thanks. P.S. If this is not the forum to ask this question, please suggest an appropriate one in the comments.

    Read the article

  • Fastest security check of file tree on NFS

    - by fungs
    I am currently experiencing very bad performance using the following on an NFS network folder:

        time find . | while read f; do
            test -L "$f" && f=$(readlink -m "$f")
            grp="$(stat -c %G "$f")"
            perm="$(stat -c %A "$f")"
        done

    Question 1) Within the loop, permissions are checked using the variables grp and perm. Is there a way to lower the amount of disk I/O for these kinds of checks over the network (e.g. read all the metadata at once using find)?

    Question 2) It seems like the NFS isn't tuned very well; the same operation on a similar network link via SSHFS takes only one third of the time. All parameters are auto-negotiated. Any suggestions?
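
    (On the "read all the metadata at once" idea: GNU find can print the group and symbolic permissions itself, turning the two stat calls per file into a single tree walk; whether the output format below fits the rest of the check is an assumption.)

        # -L follows symlinks; %M = symbolic permissions, %g = group name, %p = path
        time find -L . -printf '%M %g %p\n'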

    Read the article

  • Router for Infrastructure Network

    - by amfortas
    We have an HPC operation that over the years has grown to several racks of gear at three sites, hooked up via gigabit fiber and Catalyst 2960s (we control the links and switches). Thus far all machines have been on a flat RFC 1918 10.0.0.0/8 network, but we are looking to segment it in order to streamline matters for iSCSI and generally keep infrastructure equipment away from our end users. We have now reached a point where we need to consider introducing VLANs for specific subnets, and are wondering if it would be worthwhile in the longer run to acquire a small router to keep track of all this stuff and cut down on the complexity of netmasks and routes on the host machines, etc. Has anyone here had a similar experience? Suggestions as to suitable equipment would be welcome.

    Read the article

  • Tomcat processParameters complains about "invalid chunk ignored"

    - by cgicgi
    I am hosting a software system running under Tomcat for quite a number of customers. Some of them send invalid URLs as requests. These URLs may contain "&=" or "&&", which is not within the HTTP specs. Now my Tomcat complains with the following:

        08.09.2010 12:36:04 org.apache.tomcat.util.http.Parameters processParameters
        WARNING: Parameters: Invalid chunk '' ignored.

    It is not a problem as such, as it doesn't affect operation in any way. The only problem is that tomcat/logs/catalina.out grows with every single request. On the net you can find suggestions like:

    - Fix your URLs (which I can't, as it is the customers who send them).
    - Raise Tomcat's log level to ERROR (which I don't want to do, as it would suppress INFO messages like "INFO: Reloading context [/ContextName]" and other stuff you want to know about).
    - Redirect the log to the application log (which won't solve the problem, as the messages will just flood another log).

    Does anyone know how to solve the problem at its root, which means: tell Tomcat not to complain about invalid request parameters any longer?
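
    (One targeted option, assuming the logger name matches the class shown in the warning, which is worth verifying against your Tomcat version: silence just that logger in conf/logging.properties, leaving global INFO logging intact.)

        # conf/logging.properties: suppress only the Parameters warnings
        org.apache.tomcat.util.http.Parameters.level = SEVERE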

    Read the article

  • How to reduce the pain of the command prompt

    - by Adam
    I want to learn to use the command prompt on Windows better, to have more control over what I do, and just for the learning experience. The main annoyance I have right now is all of the typing. If I want to perform an operation on a file with a long path, I'm sitting there typing it out for a minute at least, and if I make a mistake I have to press the up arrow key, scroll through the entire thing, and find what I did wrong. Are there any tools to make this easier?

    Read the article

  • Best ways to move a server room and 80 desktops to a new building

    - by Marko
    I need to plan the best way to physically move a server room (12 IBM x3400 towers) and all associated networking equipment to a new building. Everything is going: comms (an 80-strong call centre operation), desktops (80-90), the whole lot, and it has to be done efficiently over a weekend. What I can't seem to find are the best ways to move the equipment, i.e. is it best to remove the hard drives and pack them separately? Do I pack the chassis with foam and/or surround them in bubble wrap? Are there specific products designed for moving computer equipment, and where would I find them? The internet has failed me so far; please help.

    Read the article

  • KVM vs Hyper-V: which hypervisor is best for Windows guests?

    - by user198851
    I am currently testing OpenStack for Windows guests (XP and 7). I have deployed OpenStack "all in one" on a system with the following specs:

        Processor: Core i5 (4 physical cores, 8 threads with HT Technology)
        RAM: 8 GB
        HD: 500 GB

    I have created 4 Windows XP guests, each with 512 MB RAM and 1 VCPU. On each Windows guest I have installed only Visual Studio 2008. In nova.conf the CPU overcommit ratio is 2, for better performance (as mentioned in the OpenStack operations guide). I am using KVM as the hypervisor. I have observed poor performance when simultaneously using Visual Studio in all four Windows instances. How can I improve performance? Should I use KVM or Hyper-V, or do you have any other suggestion?
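
    (For reference, the overcommit setting mentioned is a single nova.conf option; the value shown is just the one from the question.)

        # /etc/nova/nova.conf: allow 2 VCPUs to be scheduled per physical CPU
        cpu_allocation_ratio = 2.0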

    Read the article

  • ASUS WL-500gP v2 network between two local machines

    - by Epsiloncool
    I have two Windows XP machines in my home network, connected with an ASUS WL-500gP v2, which is also used as the internet router. The problem: both computers get to the internet normally (DHCP is used, static routes are on, the routing table is empty, and the operation mode is Home Gateway). On the 1st computer (wired to the router), I see both computers listed in Network Neighborhood; I can browse my own computer, but cannot open the other one. On the 2nd computer (connected through Wi-Fi), I see only one computer (the 2nd) in Network Neighborhood; I can browse it, but I get an error when I enter the 1st computer's address in the address line (like \\My1stComp). What is the problem? It has been driving me crazy for about 3 months.
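
    (Not from the original post; a few standard first checks for this symptom, where the IP address and computer name are placeholders. Errors 5 or 53 from net view usually point at the XP firewall or File and Printer Sharing settings.)

        REM from the 2nd computer, test raw connectivity to the 1st
        ping 192.168.1.2

        REM then test SMB browsing by name
        net view \\My1stComp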

    Read the article
