Search Results

Search found 20049 results on 802 pages for 'virtual drive'.

Page 324/802

  • Ubuntu 12.10 MBR not loading from dual boot selector

    - by Justin Holmes
    Since I am new to the forums here I cannot post pictures, but hopefully this link to my error screen will work: "Error when selecting to boot the Ubuntu MBR". Basically, no matter what I do, Windows Boot Manager wants to load Windows and Windows only. I tried running from my hard drive and also from a USB drive that had a live image installed. Both times I ended up with this message. I am running Windows 7 with all the newest updates on a Samsung Series 7 laptop. I have had many dual-boot machines in the past and never seen this issue. I even had a dual-booting Windows 7 machine a few months ago and had no issues at all. Any help would be appreciated! Thanks! Justin

    Read the article

  • Clean up after Visual Studio

    - by psheriff
    As programmers, we know that if we create a temporary file during the running of our application we need to make sure it is removed when the application or process is complete. We do this, but why can’t Microsoft do it? Visual Studio leaves tons of temporary files all over your hard drive. This is why, over time, your computer loses hard disk space. This blog post will show you some of the most common places where these files are left and which ones you can safely delete.

    .NET Left Overs
    Visual Studio is a great development environment for creating applications quickly. However, it will leave a lot of miscellaneous files all over your hard drive. There are a few locations on your hard drive that you should be checking to see if there are left-over folders or files that you can delete. I have attempted to gather as much data as I can about the various versions of .NET and operating systems. Of course, your mileage may vary on the folders and files I list here. In fact, this problem is so prevalent that PDSA has created a Computer Cleaner specifically for the Visual Studio developer. Instructions for downloading our PDSA Developer Utilities (of which Computer Cleaner is one) are at the end of this blog entry. Each version of Visual Studio will create “temporary” files in different folders. The problem is that the files created are not always “temporary”. Most of the time these files do not get cleaned up like they should. Let’s look at some of the folders that you should periodically review and delete files within.

    Temporary ASP.NET Files
    As you create and run ASP.NET applications from Visual Studio, temporary files are placed into the <sysdrive>:\Windows\Microsoft.NET\Framework[64]\<vernum>\Temporary ASP.NET Files folder. The folders and files under this folder can be removed with no harm to your development computer. Do not remove the "Temporary ASP.NET Files" folder itself, just the folders underneath it. If you use IIS for ASP.NET development, you may need to run the iisreset.exe utility from the command prompt prior to deleting any files/folders under this folder. IIS will sometimes keep files in this folder in use, and iisreset will release the locks so the files/folders can be deleted.

    Website Cache
    This folder is similar to the ASP.NET Temporary Files folder in that it contains files from ASP.NET applications run from Visual Studio. This folder is located in each user's local settings folder. The location will be a little different on each operating system. For example, on Windows Vista/Windows 7 the folder is located at <sysdrive>:\Users\<UserName>\AppData\Local\Microsoft\WebsiteCache. If you are running Windows XP, this folder is located at <sysdrive>:\Documents and Settings\<UserName>\Local Settings\Application Data\Microsoft\WebsiteCache. Check these locations periodically and delete all files and folders under them.

    Visual Studio Backup
    This backup folder is used by Visual Studio to store temporary files while you develop in Visual Studio. This folder never gets cleaned out, so you should periodically delete all files and folders under this directory. On Windows XP, this folder is located at <sysdrive>:\Documents and Settings\<UserName>\My Documents\Visual Studio 200[5|8]\Backup Files. On Windows Vista/Windows 7 this folder is located at <sysdrive>:\Users\<UserName>\Documents\Visual Studio 200[5|8]\.

    Assembly Cache
    No, this is not the global assembly cache (GAC). It appears that this cache is only created when doing WPF or Silverlight development with Visual Studio 2008 or Visual Studio 2010. This folder is located at <sysdrive>:\Users\<UserName>\AppData\Local\assembly\dl3 on Windows Vista/Windows 7. On Windows XP this folder is located at <sysdrive>:\Documents and Settings\<UserName>\Local Settings\Application Data\assembly. If you have not done any WPF or Silverlight development, you may not find this particular folder on your machine.

    Project Assemblies
    This is yet another folder where Visual Studio stores temporary files. You will find a folder for each project you have opened and worked on. This folder is located at <sysdrive>:\Documents and Settings\<UserName>\Local Settings\Application Data\Microsoft\Visual Studio\[8|9].0\ProjectAssemblies on Windows XP. On Windows Vista/Windows 7 you will find this folder at <sysdrive>:\Users\<UserName>\AppData\Local\Microsoft\Visual Studio\[8|9].0\ProjectAssemblies.

    Remember, not all of these folders will appear on your particular machine. Which ones show up will depend on what version of Visual Studio you are using, whether you are doing desktop or web development, and the operating system you are using.

    Summary
    Taking the time to periodically clean up after Visual Studio will aid in keeping your computer running quickly and increase the space on your hard drive. Another place to make sure you are cleaning up is your TEMP folder. Check your OS settings for the location of your particular TEMP folder and be sure to delete any files in it that are not in use. I routinely clean up the files and folders described in this blog post and I find that I actually eliminate errors in Visual Studio and increase my hard disk space.

    NEW! PDSA has just published a “pre-release” of our PDSA Developer Utilities at http://www.pdsa.com/DeveloperUtilities that contains a Computer Cleaner utility which will clean up the above-mentioned folders, as well as a lot of other miscellaneous folders where Visual Studio build-up collects. You can download a free trial at http://www.pdsa.com/DeveloperUtilities. If you wish to purchase our utilities through the month of November, 2011 you can use the RSVP code DUNOV11 to get them for only $39. This is $40 off the regular price.

    NOTE: You can download this article and many samples like the one shown in this blog entry at my website, http://www.pdsa.com/downloads. Select “Tips and Tricks”, then “Developer Machine Clean Up” from the drop-down list.

    Good Luck with your Coding,
    Paul Sheriff

    ** SPECIAL OFFER FOR MY BLOG READERS **
    We frequently offer a FREE gift for readers of my blog. Visit http://www.pdsa.com/Event/Blog for your FREE gift!
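
    If you would rather script the cleanup than click through Explorer, a small console sketch along the following lines can empty the folders listed above. This is a minimal, hypothetical example rather than PDSA's Computer Cleaner; the paths are the ones named in the article, and the framework and Visual Studio version numbers shown are placeholders you would adjust for your own machine.

        using System;
        using System.IO;

        class VsTempCleaner
        {
            static void Main()
            {
                string windows = Environment.GetFolderPath(Environment.SpecialFolder.Windows);
                string localAppData = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);

                // Folders named in the article; adjust the framework/VS version numbers as needed.
                string[] targets =
                {
                    Path.Combine(windows, @"Microsoft.NET\Framework\v4.0.30319\Temporary ASP.NET Files"),
                    Path.Combine(localAppData, @"Microsoft\WebsiteCache"),
                    Path.Combine(localAppData, @"Microsoft\Visual Studio\9.0\ProjectAssemblies")
                };

                foreach (string target in targets)
                {
                    if (!Directory.Exists(target))
                        continue;

                    // Delete the contents, never the folder itself.
                    foreach (string dir in Directory.GetDirectories(target))
                    {
                        try { Directory.Delete(dir, true); }
                        catch (Exception ex) { Console.WriteLine("Skipped {0}: {1}", dir, ex.Message); }
                    }
                    foreach (string file in Directory.GetFiles(target))
                    {
                        try { File.Delete(file); }
                        catch (Exception ex) { Console.WriteLine("Skipped {0}: {1}", file, ex.Message); }
                    }
                    Console.WriteLine("Cleaned " + target);
                }
            }
        }

    Run something like this from an elevated prompt and, as the article notes, run iisreset first if IIS still has any of the ASP.NET files locked.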

    Read the article

  • Unlock the Full Value of Oracle CRM On Demand

    - by ruth.donohue
    Register for this live Oracle CRM On Demand Virtual Community Session! Oracle CRM On Demand delivers the most complete On Demand CRM solution on the market. But how can you ensure you are getting maximum value from the many powerful features that Oracle CRM On Demand offers? Join our interactive Oracle CRM On Demand Virtual Community Session on Tuesday, January 11, 2011 from 10-11 a.m. PT / 1-2 p.m. ET to get expert advice and discuss the best ways to unlock the full potential of Oracle CRM On Demand with Mike Lairson, author of 'Oracle CRM On Demand Reporting'. Book Offer: Send your Oracle CRM On Demand configuration ideas before the Webcast to [email protected] and you could win a free copy of 'Oracle CRM On Demand Reporting' by Mike Lairson. Learn more and register now!

    Read the article

  • How do I disable unwanted iPXE boot attempt in Libvirt/qemu-kvm?

    - by gertvdijk
    Somehow after upgrading to 12.04, my virtual machines always boot with an attempt to boot from the network first, even though I don't have any PXE configuration set. I've tried:
      - disabling SPICE, by changing the emulator to /usr/bin/kvm from /usr/bin/kvm-spice by editing the XML;
      - Ctrl+B to configure the iPXE firmware, but it doesn't let me disable this as a boot option;
      - setting another type of NIC - not an option, I need virtio for performance reasons. However, e1000e doesn't work either;
      - removing the NIC: works. However, I need network;
      - Googling around. Hard. Lots of results are about configured PXE boots that fail.
    Not a big issue, but it increases boot times by 50-100% here (booting from SSD), so it's relatively long and annoys me. How can I disable this and boot from the virtual hard disk directly?

    Read the article

  • Getting Started With nServiceBus on VAN Mar 31

    - by van
    Topic: nServiceBus is a mature and powerful open source framework that enables you to design robust, scalable, message-based, service-oriented architectures. The latest improvements in the configuration API enable developers to quickly get started and build a simple working system that uses the messaging infrastructure. The goal of this session is to give you a jump start with the framework, introduce basic concepts such as message handlers, Sagas, Pub/Sub and the Generic Host, and also create a working demo application that uses publish/subscribe messaging. The content of the session is addressed to developers who are interested in learning how to get started using nServiceBus in order to design and build distributed systems.

    Bio: Bernard Kowalski is currently a Software Developer at Microdesk, one of Autodesk's leading partners in providing a variety of Geospatial and Computer-Aided Design solutions. Bernard has experience developing .NET framework-based applications utilizing Windows Forms, Windows Services, ASP.NET MVC, and Web services. In a recent project, Bernard architected and implemented a distributed system based on SOA principles using an open source implementation of an Enterprise Service Bus. Bernard develops software with Agile patterns and practices using Domain Driven Design combined with TDD (Test Driven Development). He is familiar with all of the following APIs: Autodesk Vault/Product Stream API, AutoCAD ActiveX/VBA/.NET API, AutoCAD Mechanical API, Autodesk Inventor API, Autodesk MapGuide Enterprise. Prior to joining Microdesk, Bernard worked as a researcher and teacher at the University of Science and Technology in Krakow, Poland, where he was awarded a PhD in Computer Methods in Materials Science. He participated in research projects where he developed applications for analysis of hot compression test results using advanced optimization techniques. He also developed Finite Element Method-based programs for thermal and stress analysis using C++ and FORTRAN. Bernard is a member of the Domain Driven Design and ALT.NET user groups in NYC.

    Virtual ALT.NET (VAN) is the online gathering place of the ALT.NET community. Through conversations, presentations, pair programming and dojos, we strive to improve, explore, and challenge the way we create software. Using net conferencing technology such as Skype and LiveMeeting, we hold regular meetings, open to anyone, usually taking the form of a presentation or an Open Space Technology-style conversation. Please see the Calendar (http://www.virtualaltnet.com/Home/Calendar) to find a VAN group that meets at a time convenient to you, and feel welcome to join a meeting. Past sessions can be found on the Recording page. To stay informed about VAN activities, you can subscribe to the Virtual ALT.NET Google Group and follow the Virtual ALT.NET blog.

    Times below are Central Standard Time
    Start Time: Wed, Mar 31, 2010 8:00 PM UTC/GMT -5 hours
    End Time: Wed, Mar 31, 2010 10:00 PM UTC/GMT -5 hours
    Attendee URL: http://www.virtualaltnet.com/van

    Zach Young
    http://www.virtualaltnet.com
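
    For a flavor of the "message handler" concept the session introduces, a subscriber-side handler in the NServiceBus versions of that era (2.x) looked roughly like the sketch below. The message type and handler here are hypothetical examples, not material from the session itself.

        using System;
        using NServiceBus;

        // A message published when an order is placed (hypothetical example type).
        public class OrderPlaced : IMessage
        {
            public Guid OrderId { get; set; }
            public decimal Amount { get; set; }
        }

        // A subscriber's handler: NServiceBus discovers classes implementing
        // IHandleMessages<T> and invokes Handle whenever a matching message
        // arrives on the endpoint's queue.
        public class OrderPlacedHandler : IHandleMessages<OrderPlaced>
        {
            public void Handle(OrderPlaced message)
            {
                Console.WriteLine("Order {0} received for {1:C}", message.OrderId, message.Amount);
            }
        }

    Because the host wires handlers up automatically, a publish/subscribe demo like the one described can stay very small.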

    Read the article

  • A Small Utility to Delete Files recursively by Date

    - by Rick Strahl
    It's funny, but for me the following seems to be a recurring theme: every few months or years I end up with a host of files on my server that need pruning selectively and often under program control. Today I realized that my SQL Server logs on my server were really piling up and nearly ran my backup drive out of drive space. So occasionally I need to check on that server drive and clean out files. Now with a bit of work this can be done with PowerShell or even a complicated DOS batch file, but heck, to me it's always easier to just create a small Console application that handles this sort of thing with a full command line parser and a few extra options; plus in the end I end up with code that I can actually modify and add features to, as is invariably the case. No more searching for a script each time :-)

    So for my typical needs the requirements are:
      - Need to recursively delete files
      - Need to be able to specify a filespec (ie. *.bak)
      - Be able to specify a cut off date before which to delete files
      - And it'd be nice to have an option to send files to the Recycle Bin just in case for operator error :-) (and yes that came in handy as I blew away my entire database backup folder by accident - oops!)

    The end result is a small Console file delete utility that I popped up on GitHub: https://github.com/RickStrahl/DeleteFiles. The source code is up there along with the binary file you can just run.

    Creating DeleteFiles
    It's pretty easy to create a simple utility like DeleteFiles of course, so I'm not going to spend any time talking about how it works. You can check it out in the repository or download and compile it. The nice thing about using a full programming language like C# over something like PowerShell or a batch file is that you can make short work of the recursive tree walking that's required to make this work. There's very little code, but there's also a very small, self-contained command line parser in there that might be useful and can be plugged into any project - I've been using it quite a bit for just about any Console application I've been building.

    If you're like me and don't have the patience or the persistence (that funky syntax requires some 'sticking with it' that I simply can't get over) to get into PowerShell coding, having an executable file that I can just copy around or keep in my Utility directory is the only way I'll ever get to reuse this functionality without going on a wild search each time :-)

    Anyway, hope some of you might find this useful.

    © Rick Strahl, West Wind Technologies, 2005-2012
    Posted in Windows, CSharp
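
    The real utility lives in the GitHub repository above. Purely as an illustration of the recursive delete-by-date idea described in the requirements list, a stripped-down sketch (not Rick's actual code, and omitting the Recycle Bin option and command line parser) could look like this:

        using System;
        using System.IO;

        class DeleteFilesSketch
        {
            // Hypothetical usage: DeleteFilesSketch <path> <filespec> <daysOld>
            // e.g. DeleteFilesSketch D:\SqlBackups *.bak 14
            static void Main(string[] args)
            {
                if (args.Length < 3)
                {
                    Console.WriteLine("Usage: DeleteFilesSketch <path> <filespec> <daysOld>");
                    return;
                }

                string root = args[0];
                string spec = args[1];
                DateTime cutoff = DateTime.Now.AddDays(-int.Parse(args[2]));

                DeleteOldFiles(root, spec, cutoff);
            }

            static void DeleteOldFiles(string folder, string spec, DateTime cutoff)
            {
                // Delete matching files older than the cutoff in this folder.
                foreach (string file in Directory.GetFiles(folder, spec))
                {
                    if (File.GetLastWriteTime(file) < cutoff)
                    {
                        Console.WriteLine("Deleting " + file);
                        File.Delete(file);
                    }
                }

                // Recurse into subfolders.
                foreach (string sub in Directory.GetDirectories(folder))
                    DeleteOldFiles(sub, spec, cutoff);
            }
        }

    A fuller version would add the Recycle Bin option (for example via the Microsoft.VisualBasic file system APIs) and a proper command line parser, which is what the published utility provides.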

    Read the article

  • VirtualBox Ubuntu Server with DNS

    - by Boris Karl Schlein
    I just want to have a local server inside my VirtualBox that offers DNS functionality like www.example.local = 127.0.0.1. Host: Ubuntu 11.10, Guest: Ubuntu 10.10. On my server I can already ping www.example.local - so I configured my virtual host correctly. Question is, how can I address www.example.local from outside my VirtualBox? I searched Google and Ask Ubuntu and found dnsmasq. I installed dnsmasq on my server and followed all the steps on help.ubuntu. On my host system I've added 192.168.178.91 to the list of DNS servers (192.168.178.91 is the IP address of my local server, which uses a bridged network adapter). Thing is, I still cannot ping (or connect to) my example.local virtual host. It gives me an unknown host response. How can I set up my DNS server correctly?

    Read the article

  • Can’t install Transport HUB in Exchange 2010

    - by Kelly Jones
    When building my latest SharePoint 2010 demo virtual machines, I decided to try installing Exchange 2010 as well.  Now, I’m not an Exchange admin, but I thought “how hard can this be?”  Well, a little more than I thought. Pretty early during the install, I got an error saying that it couldn’t “install Transport HUB”.  I double checked that my VM was meeting all of the requirements, both hardware and software, and everything looked fine. After much researching, it turns out that the error was caused by not having IPv6 enabled on the network adapter inside the virtual machine.  I had turned it off because I thought I wouldn’t need it.  I guess Exchange 2010 does.

    Read the article

  • Sublime text 2 syntax highlighter?

    - by BigSack
    I have coded my first custom syntax highlighter for sublime text 2, but i don't know how to install it. It is based on notepad++ highlighter found here https://70995658-a-62cb3a1a-s-sites.googlegroups.com/site/lohanplus/files/smali_npp.xml?attachauth=ANoY7criVTO9bDmIGrXwhZLQ_oagJzKKJTlbNDGRzMDVpFkO5i0N6hk_rWptvoQC1tBlNqcqFDD5NutD_2vHZx1J7hcRLyg1jruSjebHIeKdS9x0JCNrsRivgs6DWNhDSXSohkP1ZApXw0iQ0MgqcXjdp7CkJJ6pY_k5Orny9TfK8UWn_HKFsmPcpp967NMPtUnd--ad-BImtkEi-fox2tjs7zc5LabkDQ%3D%3D&attredirects=0&d=1 <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd"> <plist version="1.0"> <dict> <key>fileTypes</key> <array> <string>smali</string> </array> <dict> <key>Word1</key> <string>add-double add-double/2addr add-float add-float/2addr add-int add-int/2addr add-int/lit16 add-int/lit8 add-long add-long/2addr aget aget-boolean aget-byte aget-char aget-object aget-short aget-wide and-int and-int/2addr and-int/lit16 and-int/lit8 and-long and-long/2addr aput aput-boolean aput-byte aput-char aput-object aput-short aput-wide array-length check-cast cmp-long cmpg-double cmpg-float cmpl-double cmpl-float const const-class const-string const-string-jumbo const-wide const-wide/16 const-wide/32 const-wide/high16 const/16 const/4 const/high16 div-double div-double/2addr div-float div-float/2addr div-int div-int/2addr div-int/lit16 div-int/lit8 div-long div-long/2addr double-to-float double-to-int double-to-long execute-inline fill-array-data filled-new-array filled-new-array/range float-to-double float-to-int float-to-long goto goto/16 goto/32 if-eq if-eqz if-ge if-gez if-gt if-gtz if-le if-lez if-lt if-ltz if-ne if-nez iget iget-boolean iget-byte iget-char iget-object iget-object-quick iget-quick iget-short iget-wide iget-wide-quick instance-of int-to-byte int-to-char int-to-double int-to-float int-to-long int-to-short invoke-direct invoke-direct-empty invoke-direct/range invoke-interface invoke-interface/range invoke-static invoke-static/range invoke-super invoke-super-quick invoke-super-quick/range invoke-super/range invoke-virtual invoke-virtual-quick invoke-virtual-quick/range invoke-virtual/range iput iput-boolean iput-byte iput-char iput-object iput-object-quick iput-quick iput-short iput-wide iput-wide-quick long-to-double long-to-float long-to-int monitor-enter monitor-exit move move-exception move-object move-object/16 move-object/from16 move-result move-result-object move-result-wide move-wide move-wide/16 move-wide/from16 move/16 move/from16 mul-double mul-double/2addr mul-float mul-float/2addr mul-int mul-int/2addr mul-int/lit8 mul-int/lit16 mul-long mul-long/2addr neg-double neg-float neg-int neg-long new-array new-instance nop not-int not-long or-int or-int/2addr or-int/lit16 or-int/lit8 or-long or-long/2addr rem-double rem-double/2addr rem-float rem-float/2addr rem-int rem-int/2addr rem-int/lit16 rem-int/lit8 rem-long rem-long/2addr return return-object return-void return-wide rsub-int rsub-int/lit8 sget sget-boolean sget-byte sget-char sget-object sget-short sget-wide shl-int shl-int/2addr shl-int/lit8 shl-long shl-long/2addr shr-int shr-int/2addr shr-int/lit8 shr-long shr-long/2addr sparse-switch sput sput-boolean sput-byte sput-char sput-object sput-short sput-wide sub-double sub-double/2addr sub-float sub-float/2addr sub-int sub-int/2addr sub-int/lit16 sub-int/lit8 sub-long sub-long/2addr throw throw-verification-error ushr-int ushr-int/2addr ushr-int/lit8 ushr-long ushr-long/2addr xor-int xor-int/2addr 
xor-int/lit16 xor-int/lit8 xor-long xor-long/2addr</string> </dict> <dict> <key>Word2</key> <string>v0 v1 v2 v3 v4 v5 v6 v7 v8 v9 v10 v11 v12 v13 v14 v15 v16 v17 v18 v19 v20 v21 v22 v23 v24 v25 v26 v27 v28 v29 v30 v31 v32 v33 v34 v35 v36 v37 v38 v39 v40 v41 v42 v43 v44 v45 v46 v47 v48 v49 v50 p0 p1 p2 p3 p4 p5 p6 p7 p8 p9 p10 p11 p12 p13 p14 p15 p16 p17 p18 p19 p20 p21 p22 p23 p24 p25 p26 p27 p28 p29 p30</string> </dict> <dict> <key>Word3</key> <string>array-data .catch .catchall .class .end .end\ local .enum .epilogue .field .implements .line .local .locals .parameter .prologue .registers .restart .restart\ local .source .subannotation .super</string> </dict> <dict> <key>Word4</key> <string>abstract bridge constructor declared-synchronized enum final interface native private protected public static strictfp synchronized synthetic system transient varargs volatile</string> </dict> <dict> <key>Word4</key> <string>(&quot;0)&quot;0</string> </dict> <dict> <key>Word5</key> <string>.method .annotation .sparse-switch .packed-switch</string> </dict> <dict> <key>word6</key> <string>.end\ method .end\ annotation .end\ sparse-switch .end\ packed-switch</string> </dict> <dict> <key>word7</key> <string>&quot; ( ) , ; { } &gt;</string> </dict> <key>uuid</key> <string>27798CC6-6B1D-11D9-B8FA-000D93589AF6</string> </dict> </plist>

    Read the article

  • Changing the default installation path to a newly installed hard disk

    - by mgj
    Hi, I am currently working on a dual-booted PC. I am using Windows XP and Ubuntu 10.04 Lucid Lynx, released in April 2010. The partition allocated to Ubuntu that I am making use of is almost exhausted. Current disk allocations on the PC wrt the Ubuntu OS look like this:

    bodhgaya@pc146724-desktop:~$ df -h
    Filesystem            Size  Used Avail Use% Mounted on
    /dev/sda2             8.6G  8.0G  113M  99% /
    none                  998M  268K  998M   1% /dev
    none                 1002M  580K 1002M   1% /dev/shm
    none                 1002M  100K 1002M   1% /var/run
    none                 1002M     0 1002M   0% /var/lock
    none                 1002M     0 1002M   0% /lib/init/rw
    /dev/sda1              25G   16G  9.8G  62% /media/C
    /dev/sdb1              37G  214M   35G   1% /media/ubuntulinuxstore
    bodhgaya@pc146724-desktop:~$ cd /tmp

    I am trying to mount a 40GB (/dev/sdb1 - given below) new hard disk along with my existing Ubuntu system to overcome the hard disk space issues. I referred to the following tutorial to mount a new hard disk onto the system: http://www.smorgasbord.net/how-to-in...untu-linux%20/. I was able to successfully mount this hard disk for the Ubuntu OS. I have this new hard disk set up in the /media/ubuntulinuxstore directory. The current partitioning on my system looks like this:

    bodhgaya@pc146724-desktop:/media/ubuntulinuxstore$ sudo fdisk -l
    [sudo] password for bodhgaya:

    Disk /dev/sda: 40.0 GB, 40000000000 bytes
    255 heads, 63 sectors/track, 4863 cylinders
    Units = cylinders of 16065 * 512 = 8225280 bytes
    Sector size (logical/physical): 512 bytes / 512 bytes
    I/O size (minimum/optimal): 512 bytes / 512 bytes
    Disk identifier: 0x446eceb5

    Device Boot    Start       End      Blocks   Id  System
    /dev/sda1   *      2      3264    26210047+   7  HPFS/NTFS
    /dev/sda2       3265      4385     9004432+  83  Linux
    /dev/sda3       4386      4863     3839535   82  Linux swap / Solaris

    Disk /dev/sdb: 40.0 GB, 40000000000 bytes
    255 heads, 63 sectors/track, 4863 cylinders
    Units = cylinders of 16065 * 512 = 8225280 bytes
    Sector size (logical/physical): 512 bytes / 512 bytes
    I/O size (minimum/optimal): 512 bytes / 512 bytes
    Disk identifier: 0xfa8afa8a

    Device Boot    Start       End      Blocks   Id  System
    /dev/sdb1          1      4862    39053983+   7  HPFS/NTFS
    bodhgaya@pc146724-desktop:/media/ubuntulinuxstore$

    Now, I have a concern about the location where new software will be installed. Generally software is installed via the terminal, and by default a fixed path is used for where the post-installation setup files can be found (I am talking in the context of the drive). This is like the typical case of Windows, where software by default is installed on the C: drive. These days people customize their installations to a drive which they find apt to serve their purpose (generally based on availability of hard disk space). I am trying to figure out how to customize the same for Ubuntu. As we all know, most software is installed via commands given from the terminal. My roadblock is how to redirect the default path, used by the terminal when installing files, to this new hard disk. This, if done, will help me overcome the space constraints I am currently facing with the partition on which my Ubuntu was initially installed. I would also save time by not formatting my system and reinstalling Ubuntu and other software all over again. Please help me with this; your suggestions are much appreciated.

    Read the article

  • NEC Corporation uPD720200 USB 3.0 controller doesn't run at full speed

    - by Radek Zyskowski
    I have fresh install of Ubuntu 10.10. I have external HD on USB 3.0. Trying to connect this via PCI Express NEC controller. dmesg: [ 8966.820078] usb 6-3: new high speed USB device using xhci_hcd and address 0 [ 8966.839831] xhci_hcd 0000:02:00.0: WARN: short transfer on control ep [ 8966.840580] xhci_hcd 0000:02:00.0: WARN: short transfer on control ep [ 8966.841329] xhci_hcd 0000:02:00.0: WARN: short transfer on control ep [ 8966.842079] xhci_hcd 0000:02:00.0: WARN: short transfer on control ep [ 8966.843343] scsi8 : usb-storage 6-3:1.0 [ 8967.847144] scsi 8:0:0:0: Direct-Access SAMSUNG HD204UI 1AQ1 PQ: 0 ANSI: 5 [ 8967.847589] sd 8:0:0:0: Attached scsi generic sg2 type 0 [ 8967.847923] sd 8:0:0:0: [sdb] 3907029168 512-byte logical blocks: (2.00 TB/1.81 TiB) [ 8967.848341] xhci_hcd 0000:02:00.0: WARN: Stalled endpoint [ 8967.850959] sd 8:0:0:0: [sdb] Write Protect is off [ 8967.850963] sd 8:0:0:0: [sdb] Mode Sense: 23 00 00 00 [ 8967.850966] sd 8:0:0:0: [sdb] Assuming drive cache: write through [ 8967.851818] xhci_hcd 0000:02:00.0: WARN: Stalled endpoint [ 8967.852365] sd 8:0:0:0: [sdb] Assuming drive cache: write through [ 8967.852370] sdb: sdb1 [ 8967.871315] xhci_hcd 0000:02:00.0: WARN: Stalled endpoint [ 8967.871853] sd 8:0:0:0: [sdb] Assuming drive cache: write through [ 8967.871856] sd 8:0:0:0: [sdb] Attached SCSI disk [ 8967.950728] xhci_hcd 0000:02:00.0: WARN: Stalled endpoint [ 8967.951355] sd 8:0:0:0: [sdb] Sense Key : Recovered Error [current] [descriptor] [ 8967.951361] Descriptor sense data with sense descriptors (in hex): [ 8967.951363] 72 01 04 1d 00 00 00 0e 09 0c 00 00 00 00 00 00 [ 8967.951375] 00 00 00 00 00 50 [ 8967.951380] sd 8:0:0:0: [sdb] ASC=0x4 ASCQ=0x1d [ 8968.790076] xhci_hcd 0000:02:00.0: HC died; cleaning up [ 8968.790076] usb 6-3: USB disconnect, address 2 [ 8999.008554] scsi 8:0:0:0: [sdb] Unhandled error code [ 8999.008558] scsi 8:0:0:0: [sdb] Result: hostbyte=DID_TIME_OUT driverbyte=DRIVER_OK [ 8999.008562] scsi 8:0:0:0: [sdb] CDB: Read(10): 28 00 74 70 97 39 00 00 3e 00 [ 8999.008573] end_request: I/O error, dev sdb, sector 1953535801 [ 8999.008578] Buffer I/O error on device sdb1, logical block 1953535738 [ 8999.008582] Buffer I/O error on device sdb1, logical block 1953535739 [ 8999.008585] Buffer I/O error on device sdb1, logical block 1953535740 [ 8999.008589] Buffer I/O error on device sdb1, logical block 1953535741 [ 8999.008592] Buffer I/O error on device sdb1, logical block 1953535742 [ 8999.008595] Buffer I/O error on device sdb1, logical block 1953535743 [ 8999.008600] Buffer I/O error on device sdb1, logical block 1953535744 [ 8999.008603] Buffer I/O error on device sdb1, logical block 1953535745 [ 8999.008606] Buffer I/O error on device sdb1, logical block 1953535746 [ 8999.008609] Buffer I/O error on device sdb1, logical block 1953535747 [ 8999.008642] scsi 8:0:0:0: rejecting I/O to offline device [ 8999.008747] scsi 8:0:0:0: [sdb] Unhandled error code [ 8999.008749] scsi 8:0:0:0: [sdb] Result: hostbyte=DID_NO_CONNECT driverbyte=DRIVER_OK [ 8999.008752] scsi 8:0:0:0: [sdb] CDB: Read(10): 28 00 74 70 97 77 00 00 3e 00 [ 8999.008760] end_request: I/O error, dev sdb, sector 1953535863 sudo lspci -v 2:00.0 USB Controller: NEC Corporation uPD720200 USB 3.0 Host Controller (rev 03) (prog-if 30) Physical Slot: 32 Flags: bus master, fast devsel, latency 0, IRQ 16 Memory at fe9fe000 (64-bit, non-prefetchable) [size=8K] Capabilities: [50] Power Management version 3 Capabilities: [70] MSI: Enable- Count=1/8 Maskable- 64bit+ Capabilities: [90] 
    MSI-X: Enable- Count=8 Masked- Capabilities: [a0] Express Endpoint, MSI 00 Capabilities: [100] Advanced Error Reporting Capabilities: [140] Device Serial Number ff-ff-ff-ff-ff-ff-ff-ff Capabilities: [150] #18 Kernel driver in use: xhci_hcd Kernel modules: xhci-hcd If I connect any USB 2.0 device to this controller, it works fine, but USB 3.0 devices do not. Any ideas?

    Read the article

  • Cannot right click with synaptics touchpad

    - by fluteflute
    I have a Sony VAIO E14. The touchpad detects all clicks as left clicks. In Windows 7, pressing on the right side of the touchpad is recognised as a right click. How can I enable right clicking?

    greg@greg-SVE14A1C5E:~$ xinput
    ⎡ Virtual core pointer                  id=2    [master pointer  (3)]
    ⎜   ↳ Virtual core XTEST pointer        id=4    [slave  pointer  (2)]
    ⎜   ↳ SynPS/2 Synaptics TouchPad        id=11   [slave  pointer  (2)]
    ...
    greg@greg-SVE14A1C5E:~$ grep "TouchPad: buttons:" /var/log/Xorg.0.log
    [    23.112] (--) synaptics: SynPS/2 Synaptics TouchPad: buttons: left double triple

    Read the article

  • Setup a vpn server on Ubuntu server, but client can't setup a connection

    - by Jing
    I set up a VPN server using pptpd on an Ubuntu server which is installed in a VirtualBox machine on a Windows 7 desktop. I configured the VPN server and port forwarding in Windows. However, I cannot set up a connection to the server. Here's the Ubuntu log I got when I tried to connect to this VPN server. Can anyone help me solve this problem? Also, is there any alternative way to set up a VPN server on Windows or on a virtual machine installed on Windows? Thanks!

    Read the article

  • Managing Multiple dedicated servers centrally using a Web GUI tool?

    - by Sampath
    Application Architecture
    I have a single Ruby on Rails application running as multiple instances (i.e. each client having an identical sub domain) on multiple dedicated servers using Phusion Passenger + nginx. The sub domain setup is done using the vhost option in the nginx Passenger module. For example:
      - server 1 serves clients 1-100 with identical sub domains www.client1.product.com up to www.client100.product.com
      - server 2 serves clients 101-200 with identical sub domains www.client101.product.com up to www.client200.product.com
      - server 3 serves clients 201-300 with identical sub domains www.client201.product.com up to www.client300.product.com
    My question is that I need to centrally manage all my N dedicated servers, and I am looking for a Web GUI tool to manage tasks like:
      1) back up all MySQL databases automatically from all dedicated servers and send them to some FTP backup drive
      2) back up files and folders from all dedicated servers and send them to some FTP backup drive
      3) manage the firewall (CSF http://configserver.com/cp/csf.html) centrally for all dedicated servers
      4) view server load and bandwidth used in a graphical manner for all N dedicated servers
    Note: I prefer an open source solution.

    Read the article

  • The JRockit Book is Now in Print!

    - by Marcus Hirt
    Yes. I know. It's been in print for some days already, but I haven't found time to write about it until now. The book is a good guide for JVMs in general, and for JRockit in particular. If you've ever wondered how the innards of the Java Virtual Machine work, or how to use JRockit Mission Control to hunt down problems in your Java applications, this book is for you. The book is written for intermediate to advanced Java developers. These are the chapters:
      - Getting Started
      - Adaptive Code Generation
      - Adaptive Memory Management
      - Threads and Synchronization
      - Benchmarking and Tuning
      - JRockit Mission Control
      - The Management Console
      - The Runtime Analyzer
      - The Flight Recorder
      - The Memory Leak Detector
      - JRCMD
      - Using the JRockit Management APIs
      - JRockit Virtual Edition
      - Appendix A: Bibliography
      - Appendix B: Glossary
      - Index
    The book is 588 pages long. For more information about the book, see the book page at Packt.

    Read the article

  • QotD: Peter Wayner on Programming trend No. 1

    Programming trend No. 1: The JVM is not just for Java anymore.
    A long time ago, Sun created Java and shared the virtual machine with the world. By the time Microsoft created C#, people recognized that the VM didn't have to be limited to one language. Anything that could be transformed into the byte code could use it. Now, it seems that everyone is building their language to do just that. Leave the job of building a virtual machine to Sun/Oracle, and concentrate your efforts on the syntactic bells and structural whistles, goes the mantra today.
    Peter Wayner in an article on "11 programming trends to watch" at ITWorld.

    Read the article

  • Oracle VM Administration: Oracle VM Server for x86 - new Training

    - by Antoinette O'Sullivan
    Oracle VM Administration: Oracle VM Server for x86 - new course just released. This 3-day hands-on course teaches students how to build a virtualization platform using Oracle VM Manager and Oracle VM Server for x86. Students learn how to deploy and manage highly configurable, inter-connected virtual machines. The course teaches students how to install and configure Oracle VM Server for x86 as well as the details of network and storage configuration, pool and repository creation, and virtual machine management. You can follow this class, which brings you great hands-on experience, either in class or from your own desk.

    Read the article

  • Installing SharePoint 2010 and PowerPivot for SharePoint on Windows 7

    - by smisner
    Many people like me want (or need) to do their business intelligence development work on a laptop. As someone who frequently speaks at various events or teaches classes on all subjects related to the Microsoft business intelligence stack, I need a way to run multiple server products on my laptop with reasonable performance. Once upon a time, that requirement meant only that I had to load the current version of SQL Server and the client tools of choice. In today's post, I'll review my latest experience with trying to make the newly released Microsoft BI products work with a Windows 7 operating system. The entrance of Microsoft Office SharePoint Server 2007 into the BI stack complicated matters and I started using Virtual Server to establish a "suitable" environment. As part of the team that delivered a lot of education as part of the Yukon pre-launch activities (that would be SQL Server 2005 for the uninitiated), I was working with four - yes, four - virtual servers. That was a pretty brutal workload for a 2GB laptop, which worked if I was very, very careful. It could also be a finicky and unreliable configuration as I learned to my dismay at one TechEd session several years ago when I had to reboot a very carefully cached set of servers just minutes before my session started. Although it worked, it came back to life very, very slowly much to the displeasure of the audience. They couldn't possibly have been less pleased than me. At that moment, I resolved to get the beefiest environment I could afford and consolidate to a single virtual server. Enter the 4GB 64-bit laptop to preserve my sanity and my livelihood. Likewise, for SQL Server 2008, I managed to keep everything within a single virtual server and I could function reasonably well with this approach. Now we have SQL Server 2008 R2 plus Office SharePoint Server 2010. That means a 64-bit operating system. Period. That means no more Virtual Server. That means I must use Hyper-V or another alternative. I've heard alternatives exist, but my few dabbles in this area did not yield positive results. It might have been just me having issues rather than any failure of those technologies to adequately support the requirements. My first run at working with the new BI stack configuration was to set up a 64-bit 4GB laptop with a dual-boot to run Windows Server 2008 R2 with Hyper-V. However, I was generally not happy with running Windows Server 2008 R2 on my laptop. For one, I couldn't put it into sleep mode, which is helpful if I want to prepare for a presentation beforehand and then walk to the podium without the need to hold my laptop in its open state along the way (my strategy at the TechEd session long, long ago). Secondly, it was finicky with projectors. I had issues from time to time and while I always eventually got it to work, I didn't appreciate those nerve-wracking moments wondering whether this would be the time that it wouldn't work. Somewhere along the way, I learned that it was possible to load SharePoint 2010 in a Windows 7 which piqued my interest. I had just acquired a new laptop running Windows 7 64-bit, and thought surely running the BI stack natively on my laptop must be better than running Hyper-V. (I have not tried booting to Hyper-V VHD yet, but that's on my list of things to try so the jury of one is still out on this approach.) Recently, I had to build up a server with the RTM versions of SQL Server 2008 R2 and Sharepoint Server 2010 and decided to follow suit on my Windows 7 Ultimate 64-bit laptop. 
The process is slightly different, but I'm happy to report that it IS possible, although I had some fits and starts along the way. DISCLAIMER: These products are NOT intended to be run in production mode on the Windows 7 operating system. The configuration described in this post is strictly for development or learning purposes and not supported by Microsoft. If you have trouble, you will NOT get help from them. I might be able to help, but I provide no guarantees of my ability or availablity to help. I won't provide the step-by-step instructions in this post as there are other resources that provide these details, but I will provide an overview of my approach, point you to the relevant resources, describe some of the problems I encountered, and explain how I addressed those problems to achieve my desired goal. Because my goal was not simply to set up SharePoint Server 2010 on my laptop, but specifically PowerPivot for SharePoint, I started out by referring to the installation instructions at the PowerPiovt-Info site, but mainly to confirm that I was performing steps in the proper sequence. I didn't perform the steps in Part 1 because those steps are applicable only to a server operating system which I am not running on my laptop. Then, the instructions in Part 2, won't work exactly as written for the same reason. Instead, I followed the instructions on MSDN, Setting Up the Development Environment for SharePoint 2010 on Windows Vista, Windows 7, and Windows Server 2008. In general, I found the following differences in installation steps from the steps at PowerPivot-Info: You must copy the SharePoint installation media to the local drive so that you can edit the config.xml to allow installation on a Windows client. You also have to manually install the prerequisites. The instructions provides links to each item that you must manually install and provides a command-line instruction to execute which enables required Windows features. I will digress for a moment to save you some grief in the sequence of steps to perform. I discovered later that a missing step in the MSDN instructions is to install the November CTP Reporting Services add-in for SharePoint. When I went to test my SharePoint site (I believe I tested after I had a successful PowerPivot installation), I ran into the following error: Could not load file or assembly 'RSSharePointSoapProxy, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. I was rather surprised that Reporting Services was required. Then I found an article by Alan le Marquand, Working Together: SQL Server 2008 R2 Reporting Services Integration in SharePoint 2010,that instructed readers to install the November add-in. My first reaction was, "Really?!?" But I confirmed it in another TechNet article on hardware and software requirements for SharePoint Server 2010. It doesn't refer explicitly to the November CTP but following the link took me there. (Interestingly, I retested today and there's no longer any reference to the November CTP. Here's the link to download the latest and greatest Reporting Services Add-in for SharePoint Technologies 2010.) You don't need to download the add-in anymore if you're doing a regular server-based installation of SharePoint because it installs as part of the prerequisites automatically. 
When it was time to start the installation of SharePoint, I deviated from the MSDN instructions and from the PowerPivot-Info instructions: On the Choose the installation you want page of the installation wizard, I chose Server Farm. On the Server Type page, I chose Complete. At the end of the installation, I did not run the configuration wizard. Returning to the PowerPivot-Info instructions, I tried to follow the instructions in Part 3 which describe installing SQL Server 2008 R2 with the PowerPivot option. These instructions tell you to choose the New Server option on the Setup Role page where you add PowerPivot for SharePoint. However, I ran into problems with this approach and got installation errors at the end. It wasn't until much later as I was investigating an error that I encountered Dave Wickert's post that installing PowerPivot for SharePoint on Windows 7 is unsupported. Uh oh. But he did want to hear about it if anyone succeeded, so I decided to take the plunge. Perseverance paid off, and I can happily inform Dave that it does work so far. I haven't tested absolutely everything with PowerPivot for SharePoint but have successfully deployed a workbook and viewed the PowerPivot Management Dashboard. I have not yet tested the data refresh feature, but I have installed. Continue reading to see how I accomplished my objective. I unintalled SQL Server 2008 R2 and started again. I had different problems which I don't recollect now. However, I uninstalled again and approached installation from a different angle and my next attempt succeeded. The downside of this approach is that you must do all of the things yourself that are done automatically when you install PowerPivot as a new server. Here are the steps that I followed: Install SQL Server 2008 R2 to get a database engine instance installed. Run the SharePoint configuration wizard to set up the SharePoint databases. In Central Administration, create a Web application using classic mode authentication as per a TechNet article on PowerPivot Authentication and Authorization. Then I followed the steps I found at How to: Install PowerPivot for SharePoint on an Existing SharePoint Server. Especially important to note - you must launch setup by using Run as administrator. I did not have to manually deploy the PowerPivot solution as the instructions specify, but it's good to know about this step because it tells you where to look in Central Administration to confirm a successful deployment. I did spot some incorrect steps in the instructions (at the time of this writing) in How To: Configure Stored Credentials for PowerPivot Data Refresh. Specifically, in the section entitled Step 1: Create a target application and set the credentials, both steps 10 and 12 are incorrect. They tell you to provide an actual Windows user name and password on the page where you are simply defining the prompts for your application in the Secure Store Service. To add the Windows user name and password that you want to associate with the application - after you have successfully created the target application - you select the target application and then click Set credentials in the ribbon. Lastly, I followed the instructions at How to: Install Office Data Connectivity Components on a PowerPivot server. However, I have yet to test this in my current environment. I did have several stops and starts throughout this process and edited those out to spare you from reading non-essential information. 
    I believe the explanation I have provided here accurately reflects the steps I followed to produce a working configuration. If you follow these steps and get a different result, please let me know so that together we can work through the issue and correct these instructions. I'm sure there are many other folks in the Microsoft BI community that will appreciate the ability to set up the BI stack in a Windows 7 environment for development or learning purposes.

    Read the article

  • Why not Green Threads?

    - by redjamjar
    Whilst I know questions on this have been covered already (e.g. http://stackoverflow.com/questions/5713142/green-threads-vs-non-green-threads), I don't feel like I've got a satisfactory answer. The question is: why don't JVMs support green threads anymore? It says this on the code-style Java FAQ: A green thread refers to a mode of operation for the Java Virtual Machine (JVM) in which all code is executed in a single operating system thread. And this over on java.sun.com: The downside is that using green threads means system threads on Linux are not taken advantage of and so the Java virtual machine is not scalable when additional CPUs are added. It seems to me that the JVM could have a pool of system processes equal to the number of cores, and then run green threads on top of that. This could offer some big advantages when you have a very large number of threads which block often (mostly because current JVMs cap the number of threads). Thoughts?
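
    The M:N idea raised here - many lightweight threads of work multiplexed over a pool of OS threads sized roughly to the core count - is essentially what managed thread pools already do for short work items. As a rough, hypothetical illustration (in C# rather than Java, with tasks standing in for true green threads):

        using System;
        using System.Collections.Generic;
        using System.Threading;
        using System.Threading.Tasks;

        class ManyCheapThreads
        {
            static void Main()
            {
                Console.WriteLine("Cores: " + Environment.ProcessorCount);

                // 100000 units of work -- far more than the OS could reasonably give us
                // dedicated threads for -- are multiplexed by the runtime over a small
                // pool of worker threads: the M:N model described in the question.
                // (The pool can still grow if work items block, which is exactly the
                // scheduling problem a green-thread runtime has to solve.)
                var tasks = new List<Task>();
                var workersSeen = new HashSet<int>();
                object gate = new object();

                for (int i = 0; i < 100000; i++)
                {
                    tasks.Add(Task.Run(() =>
                    {
                        lock (gate)
                        {
                            workersSeen.Add(Thread.CurrentThread.ManagedThreadId);
                        }
                    }));
                }

                Task.WaitAll(tasks.ToArray());
                Console.WriteLine("Distinct worker threads used: " + workersSeen.Count);
            }
        }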

    Read the article

  • DBCC CHECKDB (BatmanDb, REPAIR_ALLOW_DATA_LOSS) – Are you Feeling Lucky?

    - by David Totzke
    I’m currently working for a client on a PowerBuilder to WPF migration.  It’s one of those “I could tell you, but I’d have to kill you” kind of clients and the quick-lime pits are currently occupied by the EMC tech…but I’ve said too much already. At approximately 3 or 4 pm that day users of the Batman[1] application here in Gotham[1] started to experience problems accessing the application.  Batman[2] is a document management system here that also integrates with the ERP system.  Very little goes on here that doesn’t involve Batman in some way.  The errors being received seemed to point to network issues (TCP protocol error, connection forcibly closed by the remote host etc…) but the real issue was much more insidious. Connecting to the database via SSMS and performing selects on certain tables underlying the application areas that were having problems started to reveal the issue.  You couldn’t do a SELECT * FROM MyTable without it bombing and giving the same error noted above.  A run of DBCC CHECKDB revealed 14 tables with corruption.  One of the tables with issues was the Document table.  Pretty central to a “document management” system.  Information was obtained from IT that a single drive in the SAN went bad in the night.  A new drive was in place and was working fine.  The partition that held the Batman database is configured for RAID Level 5 so a single drive failure shouldn’t have caused any trouble and yet, the database is corrupted.  They do hourly incremental backups here so the first thing done was to try a restore.  A restore of the most recent backup failed so they worked backwards until they hit a good point.  This successful restore was for a backup at 3AM – a full day behind.  This time also roughly corresponds with the time the SAN started to report the drive failure.  The plot thickens… I got my hands on the output from DBCC CHECKDB and noticed a pattern.  What’s sad is that nobody that should have noticed the pattern in the DBCC output did notice.  There was a rush to do things to try and recover the data before anybody really understood what was wrong with it in the first place.  Cooler heads must prevail in these circumstances and some investigation should be done and a plan of action laid out or you could end up making things worse[3].  DBCC CHECKDB also told us that: repair_allow_data_loss is the minimum repair level for the errors found by DBCC CHECKDB Yikes.  That means that the database is so messed up that you’re definitely going to lose some stuff when you repair it to get it back to a consistent state.  All the more reason to do a little more investigation into the problem.  Rescuing this database is preferable to having to export all of the data possible from this database into a new one.  This is a fifteen year old application with about seven hundred tables.  There are TRIGGERS everywhere not to mention the referential integrity constraints to deal with.  Only fourteen of the tables have an issue.  We have a good backup that is missing the last 24 hours of business which means we could have a “do-over” of yesterday but that’s not a very palatable option either. All of the affected tables had TEXT columns and all of the errors were about LOB data types and orphaned off-row data which basically means TEXT, IMAGE or NTEXT columns.  If we did a SELECT on an affected table and excluded those columns, we got all of the rows.  We exported that data into a separate database.  Things are looking up.  
    Working on a copy of the production database we then ran DBCC CHECKDB with REPAIR_ALLOW_DATA_LOSS and that “fixed” everything up. The allow data loss option will delete the bad rows. This isn’t too horrible as we have all of those rows minus the text fields from our earlier export. Now I could LEFT JOIN to the exported data to find the missing rows and INSERT them minus the TEXT column data. We had the restored data from the good 3AM backup that we could now JOIN to and, with fingers crossed, recover the missing TEXT column information. We got lucky in that all of the affected rows were old and in the end we didn’t lose anything. :O All of the row counts along the way worked out and it looks like we dodged a major bullet here.

    We’ve heard back from EMC and it turns out the SAN firmware that they were running here is apparently buggy. This thing is only a couple of months old. Grrr…. They dispatched a technician that night to come and update it. That explains why RAID didn’t save us. All-in-all this could have been a lot worse. Given the root cause here, they basically won the lottery in not losing anything.

    Here are a few links to some helpful posts on the SQL Server Engine blog. I love the title of the first one:
      - Which part of 'REPAIR_ALLOW_DATA_LOSS' isn't clear?
      - CHECKDB (Part 8): Can repair fix everything? (in fact, read the whole series)
      - Ta da! Emergency mode repair (we didn’t have to resort to this one thank goodness)

    Dave
    Just because I can…

    [1] Names have been changed to protect the guilty.
    [2] I'm Batman.
    [3] And if I'm the coolest head in the room, you've got even bigger problems...

    Read the article

  • Installing Ubuntu 10.04 to external HDD overwrites the MBR of the internal HDD

    - by zkrpar
    I have an Asus A42F laptop which has Windows 7 32-bit installed on its internal HDD. I have just installed Ubuntu 10.04 on a portable HDD using the laptop. Now my laptop does not boot Windows 7 if the portable HDD is disconnected. I can only get the boot menu when the portable HDD is connected. The portable HDD does not boot when connected to another computer. Please help me, I want to:
      - boot Windows from the internal drive, without GRUB
      - boot Ubuntu from the external drive via the BIOS boot menu (F8 or F12)

    Read the article

  • How to boot Ubuntu 12.04 64-bit from a USB stick on a Compaq CQ58

    - by user208092
    I am trying to boot Ubuntu 12.04 64-bit on my Compaq CQ58 laptop from a USB stick, but it is not working. I correctly installed Ubuntu on my pen drive following the instructions on the Ubuntu website (http://www.ubuntu.com/download/desktop/create-a-usb-stick-on-windows). These are my BIOS settings:
      Post Hotkey Delay (sec) <0
      CD-ROM Boot
      Internal Network Adapter Boot
      Network Boot Protocol
      Legacy Support
      Secure Boot
      Platform Key Enrolled
      Pending Action None
      Clear All Secure Boot
      UEFI Boot Order:
      - USB Diskette on Key/USB Hard Disk
      - OS Boot Manager
      - Internal CD/DVD ROM Drive
      - ! Network Adapter
    With these settings, when I restart my computer what shows up is: Boot Device Not Found. This is what I get in the Boot Manager:
      Boot Option Menu
      - OS boot Manager
      - Boot From EFI File
      (Arrow Up) and (Arrow Down) to change option, ENTER to select an option. Press F10 to BIOS Setup Options, ESC to exit.
    PLEASE HELP...
    P.S. My laptop has Windows 8.

    Read the article
