Search Results

Search found 21006 results on 841 pages for 'virtual desktop'.


  • How to render remote assistance to a person using Live Messenger?

    - by Cheeso
There is a feature within Windows Live Messenger v9 that allows a person to ask for remote assistance. But as I understand it, this works only if the router is UPnP enabled on both ends. Today I tried this with a friend during an active chat session, and nothing happened. I suspect a router problem. As I am remote, I cannot configure the router for them. What's a good way to render remote assistance? Here's the scenario: it will be based on invitation only (it's not a remote desktop or "logmein" situation). It's a younger person, a computer novice, on the other end of the wire. I'll be assisting with their use of applications on the PC. I'd like to be able to SEE the screen, and also use the mouse and keyboard. I have used UltraVNC on the target machine and vncviewer on my machine, on a LAN. It works well. But I don't think I can use that, because it's my kids' computer in my ex-wife's place, and I don't want her to accuse me of spying on her computer. That's why I need it to be invitation only. Advice please. Is there an easy way for me to set up Remote Assistance? Is there some other tool I can use?
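
One invitation-only route worth a try is Windows' built-in Remote Assistance, which can be driven from the command line (msra.exe on Vista/7). A sketch, assuming the novice's side creates the invitation and sends you the file plus the password out of band; the path and password are placeholders:

    rem On the novice's PC: create a password-protected invitation file
    msra /saveasfile C:\Users\kid\Desktop\invite.msrcIncident S3cretPass

    rem On your PC: double-click the received .msrcIncident file and enter the password

The caveat: Remote Assistance relies on the same NAT traversal (UPnP/Teredo) that may have failed here, so if the invitation also stalls, a reverse-connection VNC or a hosted service such as TeamViewer sidesteps the router configuration entirely.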

    Read the article

  • Can't connect to a Hyper-V VM from anywhere but the host OS

    - by Elbelcho
I have an unusual situation on hand where I'm able to connect to a Hyper-V guest VM from the HOST, but not from anywhere but the host. The VM is running Win2k8R2 and has IIS installed and Remote Desktop enabled. If I browse to the IP from the host OS, the IIS7 page displays. I can also RDP into the guest OS from the host, as well as ping it. From OFF the host, RDP, web, and ping all fail. If I completely shut off the guest VM's firewall, ping starts to respond, but RDP and port 80 still don't. The physical host machine has 2 NICs installed, but only one is plugged in. The one plugged in has a static IP. I have one Hyper-V virtual network and it's set to external. The guest VM has one NIC with a different static IP than the host, but both are on the same subnet. The host machine is joined to the domain; the guest VM is not. Any suggestions? Thanks so much for any help you may be able to provide!
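
One hedged thing to check inside the guest: since the VM is not domain-joined, its firewall is likely applying the Public profile, whose built-in RDP and web rules may be disabled or scoped to the local subnet. Enabling the rule group and inspecting the active profile is quick:

    netsh advfirewall firewall set rule group="remote desktop" new enable=Yes
    netsh advfirewall show currentprofile

If RDP and port 80 still fail with the firewall completely off, check whether an off-host machine can even resolve the guest's MAC (ping it, then run arp -a); a missing ARP entry points at the virtual switch/NIC binding rather than anything inside the guest.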

    Read the article

  • CD-ROM Can't Be Accessed After Installing VMware Tools on VMware Server 2.0.2

    - by Optimal Solutions
Using VMware Server 2.0.2, I set up a new VM (Windows XP Pro), applied all of the updates, added Windows add-ons from the install CD, etc... I got it to a stable point, and up through that point I was able to access the CD-ROM drive (E: on my host). What I never did before was install "VMware Tools", and since it claims to give better mouse and video support, I gave it a shot. What it does is place the install package in a virtual CD-ROM drive. I ran the install, no errors, and it wants a reboot. I log back in after the reboot and pop in the install CD for Microsoft Office 2003, and I receive the message "Please Insert A Disc Into Drive D:". Drive D: would be the next logical drive after the C: drive where I chose to install the OS. The message box sits there, and if I click "Cancel" to return to Windows Explorer, the status bar seems to blink every 1/2 second - as if it's polling for a CD-ROM drive or something. No bangs or exclamations in the Device Manager for any hardware. I had taken a snapshot prior to the VMware Tools install, and upon restoring it, the CD-ROM is back. I made copies of two other VMs, installed the VMware Tools on those VMs, and both experienced the same issue: Windows 2003 Server and Windows 7 (32-bit). Has anyone seen this issue and know of a fix for this? It would be nice to have the better graphics and better mouse control AND use my CD-ROM drive as well! Thank you.
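
A hedged guess at the mechanism: the Tools installer switches the VM's virtual CD-ROM from the host's physical drive to the Tools ISO image, and in this case it never switched back. With the VM powered off, the mapping is visible in the VM's .vmx file; a physical-drive mapping typically looks like this sketch (assuming the device is ide1:0 and the host drive is E:):

    ide1:0.present = "TRUE"
    ide1:0.deviceType = "cdrom-raw"
    ide1:0.fileName = "E:"
    ide1:0.startConnected = "TRUE"

If deviceType instead reads "cdrom-image" and fileName points at a windows.iso, that is the leftover Tools mount; editing it back (or reconnecting the drive in the VM's settings UI) should restore the host CD-ROM.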

    Read the article

  • VMware Newbie - looking for hardware recommendations and help :) [closed]

    - by Dan
I am looking for some hardware recommendations on an upcoming virtualization project. We are a small company (80 users - 25 in site 1, 55 in site 2) currently using Windows Server 2003 - no VM servers yet. Our AD is set up where site 1 is the root domain and site 2 is a subdomain/subnet - connected by T1 and VPN for failover. The current DCs also serve as file servers, print servers, and antivirus servers. Email is in the cloud. Additionally, in site 1 we have 3 more member servers - one running IBM WebSphere for a customer-specific app, one running Infor PowerLink (no real heavy load) and another that we use for Visual Studio apps and that also runs DirSync for Exchange Online. No heavy workloads on any of these machines really. We also have an AS400 box that we run ERP/CRM software on, which site 2 connects to over the WAN link. In site 2 we also have a SQL machine that runs on Win2K Server. Database files are not large, less than 5 GB. Light to medium workload on this machine. File servers in each site store less than 500 GB of data and probably won't grow to more than 1 TB in the next 5 years. I am looking to go to VMware in both sites and virtualize all servers. What recommendations do you have for server and storage hardware? Is it safe to virtualize all of your DCs? Any help or advice would be greatly appreciated. Thanks.

    Read the article

  • Virtualized Screen Resolution

    - by Jim R
I have a 64-bit Ubuntu 9.10 workstation with two virtualized guest OSes using KVM/QEMU, also both 64-bit. One is Fedora 12, the other a beta of Ubuntu 10.04. The problem is that I would like to use a larger display size than is configured by default. Both guest OSes have a maximum screen resolution of 1024x768. I would like to increase this to something like 1280x900 or 1440x900. The resolution of the host system is 1920x1080. This configuration appears to be a result of the installation detecting the resolution reported by the virtual screen during installation. The only information I have found on the subject suggests modifying the xorg.conf file in the /etc/X11 directory. Neither guest system has this file. I tried creating one by hand in the Fedora system and managed to render it completely unusable. Not a big deal, as it was recently installed and can be reinstalled easily. Is what I want to do possible? If so, how do I accomplish it?
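
One route that avoids hand-writing a full xorg.conf is adding the mode at runtime from inside the guest with cvt and xrandr. A sketch, assuming the guest's X output is named VGA-0 (check the real name with xrandr -q; on QEMU it may differ) - the modeline numbers come straight from cvt's output:

    cvt 1440 900
    # output resembles: Modeline "1440x900_60.00" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
    xrandr --newmode "1440x900_60.00" 106.50 1440 1528 1672 1904 900 903 909 934 -hsync +vsync
    xrandr --addmode VGA-0 "1440x900_60.00"
    xrandr --output VGA-0 --mode "1440x900_60.00"

Whether the mode sticks also depends on the emulated video device: the default cirrus adapter tops out low, so switching the KVM video model (e.g. to std or vmvga) may be part of the fix.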

    Read the article

  • How can I connect an integrated webcam to VirtualBox?

    - by Mike Stumpf
I am trying to use a Windows XP VM in VirtualBox on my Windows 8.1 laptop. I have tried the usual attaching of the USB device, but I get an error saying "USB device is busy with a previous request". My webcam is not active in any applications, and this happens after a clean reboot of the host, the guest, and VirtualBox. Here are the details:
Host:
- HP Pavilion 17 Notebook PC (stock)
- Windows 8.1
- AMD A10-5750M APU
- HP Truevision HD (integrated webcam)
VM: I got the VM here: http://www.modern.ie/en-us/virtualization-tools
VirtualBox:
- VirtualBox 4.3.12 installed
- VirtualBox Extension Pack installed
- Guest Additions are installed for 4.3.12
- Enable USB Controller is checked
- It does not matter if Enable USB 2.0 Controller is checked or not
- It does not matter if a USB device filter is set up for the webcam or not
Here is the error message:
    Failed to attach the USB device DDFEQ01G45BFBV HP Truevision HD [0004] to the virtual machine IE8 - WinXP. USB device 'DDFEQ01G45BFBV HP Truevision HD' with UUID {7a2e2a45-974d-482b-9b4e-9f9abbcd0ebb} is busy with a previous request. Please try again later.
    Result Code: E_INVALIDARG (0x80070057)
    Component: HostUSBDevice
    Interface: IHostUSBDevice {173b4b44-d268-4334-a00d-b6521c9a740a}
    Callee: IConsole {8ab7c520-2442-4b66-8d74-4ff1e195d2b6}
I read on some VirtualBox forums that disabling USB 2.0 support in the host BIOS solved their issue, but I wanted to know if there were any other ideas before I muck around in there. Thanks
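
For what it's worth, the same attach can be attempted from the host command line, which sometimes reports a more specific error than the GUI does. A sketch, using the VM name and device UUID from the error message above:

    VBoxManage list usbhost
    VBoxManage controlvm "IE8 - WinXP" usbattach 7a2e2a45-974d-482b-9b4e-9f9abbcd0ebb

If the list output shows the webcam as "Captured" or "Unavailable" even with no app running, something on the Windows side (camera stack, antivirus, another VirtualBox process) still holds it, which would also explain the "busy" error.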

    Read the article

  • C#/ReSharper 5 structural search, detect and warn if any non-virtual public methods on classes with

    - by chillitom
Hi All, I'm using LinFu's dynamic proxy to add some advice to some classes. The problem is that the proxied objects can only intercept virtual methods and will return the return type's default value for non-virtual methods. I can tell whether a class is proxied or not based on whether the class or any of its methods has an interception attribute, e.g. [Transaction]. Is it possible to write a ReSharper 5 structural search that would warn if any non-virtual public methods are defined on a class with an interception attribute? E.g.
Ok:
    public class InterceptedClass
    {
        [Transaction]
        public virtual void TransactionalMethod() { ... }

        public virtual void AnotherMethod() { ... }
    }
Bad:
    public class InterceptedClass
    {
        [Transaction]
        public virtual void TransactionalMethod() { ... }

        public void AnotherMethod() // non-virtual method will not be called by proxy
        { ... }
    }
Many Thanks.

    Read the article

  • LogMeIn does not work at home?

    - by Littlet-ENG
I've been using LogMeIn successfully in many situations and have had very good success. Our company has a LogMeIn Pro account. I have used this to share my desktop with customers. At work, I have had no problem with my laptop. At home, one program (SolidWorks) that I need to share with my customers will not display the active screen. I spent 45 min on the phone with both the software vendor for the CAD system and LogMeIn support, with no help. I need help narrowing down what the problem is on my computer. The support guys at SolidWorks got other remote software to work, so it's not the program. I can get LogMeIn to work at the office, so it's not the settings of the LogMeIn Pro account. The LMI people say it's a setting on my computer.
- Internet is fast enough at home
- Can't narrow down the problem
- Changed graphics settings and that didn't work
Any suggestions?

    Read the article

  • VMware Workstation Linux Host performance tuning

    - by Hoghweed
I need to improve my Linux-hosted VMware Workstation setup for using multiple virtual machines at the same time. I feel very stupid: I lost a great blog post link which I found last month (and I'm not able to find it again...), so I'll ask here in case anyone can help me. This is my host (laptop):
- 16GB DDR3 RAM
- Hybrid HDD, 750GB 7200 RPM (8GB SSD cache)
- Mint 15 x64, kernel 3.9.7
- swappiness set to 10
The above are the important things about the host. My need is the ability to run 2 or 3 VMs at the same time. The lack of performance is about the disk. The last time, following that blog post I lost, I set up /tmp to be mounted as a memory partition, and in my previous installation that worked well; now I'm not able to find a good way to tweak things. I think with 16GB of RAM there will be no problem running multiple VMs, but when they start to swap or use /tmp things go bad (guest cursor going too fast after a freeze, guest freezes and so on). Can anyone help me find a good host tweak and configuration to get better performance? Thanks in advance
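
A sketch of the two tweaks this kind of blog post usually combines - treat the size and option names as assumptions to verify against your own setup. First, mounting /tmp in RAM via /etc/fstab:

    tmpfs   /tmp   tmpfs   defaults,noatime,size=4G   0   0

Second, per-VM options in the .vmx file that stop Workstation from backing guest RAM with an on-disk .vmem file and from aggressively trimming guest memory:

    mainMem.useNamedFile = "FALSE"
    MemTrimRate = "0"
    sched.mem.pshare.enable = "FALSE"

With 16GB of host RAM, it also helps to cap the summed VM memory below what Mint plus the tmpfs needs, so the host itself never swaps.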

    Read the article

  • What is the best VM for developing WPF apps from within OS X?

    - by MarqueIV
All of my machines are Macs (Mac Pro, MacBook Pro, MacBook Air and Mac Mini (and Apple TV 2.0 too! :) ), but for my day job I develop .NET/WPF applications. Normally I just boot into Boot Camp and develop that way, which of course works great, but there are times when I need to simultaneously get to things on my Mac side of the equation, so I've bought both VMware 3.1 and Parallels 6. Both work; however, even on my Mac Pro, where I paid to upgrade to the better video cards (the Nvidia 8600s I think, vs. the stock ATI cards), the WPF performance bites!! Now this confuses me, since both boast that they support not only hardware-accelerated OpenGL 2.1 but also hardware-accelerated DirectX 9 (VMware even allegedly supports DirectX 10!) via their respective virtual drivers, and both can run 3D games just fine, even in a window. But even the simple act of resizing a WPF window that has a tiled background results in some HIDEOUS repainting and resizing behaviors. It's damn near closer to what you'd expect over RDP, let alone a software-only renderer (forget accelerated hardware completely!) So... can anyone please tell me WTF WPF is doing differently? More importantly, how can I speed up the WPF performance? Should I switch to VirtualBox, which also has support for DirectX? Or am I just gonna have to 'byte' the bullet (sorry... had to. So I like puns! Thank Jon Stewart!) and continue using Boot Camp?
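
One experiment that narrows this down is forcing WPF to its software rendering tier inside the VM: if resizing behaves just as badly, the virtual GPU path was never being used effectively anyway. WPF honors this per-user registry value (set it back to 0, or delete it, to re-enable hardware acceleration):

    reg add "HKCU\SOFTWARE\Microsoft\Avalon.Graphics" /v DisableHWAcceleration /t REG_DWORD /d 1 /f

A plausible (hedged) explanation for the gap: games render through the virtualized Direct3D swap chain, while WPF's retained-mode composition leans on render-target and dirty-rect features that the virtual drivers accelerate poorly, dropping it into a slow mixed path.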

    Read the article

  • Ubuntu 10.04 RGBA (transparent theme) bug

    - by Nrew
I tried to follow this tutorial from ghacks.net, but I ended up with a bug. Every time I try to change the desktop background or the theme, it opens up lots of folders and then closes them again, so I cannot do anything when I try to change the background or the theme back to the default. Here is the tutorial. And I can't even shut down my machine now; please help. EDIT: I tried executing these commands in the terminal:
    sudo add-apt-repository ppa:erik-b-andersen/rgba-gtk
    sudo apt-get update && sudo apt-get upgrade
    sudo apt-get install gnome-color-chooser gtk2-module-rgba
    sudo apt-get install murrine-themes
It installed GNOME Color Chooser. It works, and the windows became transparent, but I can't change back to the default theme and I can't set a background. Here's the screenshot, when I try to change something in the preferences: It's just okay if I can't change the theme, but it won't let me change the background either. Or maybe the background is changed, but it's transparent just like the other windows. You can see that in the taskbar (or whatever it's called in Ubuntu); it's full. What can I do to at least make the background visible?
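
A reasonable way to unwind the tutorial is to purge the PPA (which downgrades its packages back to the stock Ubuntu versions), remove the extra packages, and reset the GNOME 2 wallpaper key directly. A sketch - the wallpaper path is an assumption, substitute any image you have:

    sudo apt-get install ppa-purge
    sudo ppa-purge ppa:erik-b-andersen/rgba-gtk
    sudo apt-get remove gnome-color-chooser gtk2-module-rgba
    gconftool-2 --type string --set /desktop/gnome/background/picture_filename /usr/share/backgrounds/warty-final-ubuntu.png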

    Read the article

  • Ubuntu VM "read only file system" fix?

    - by David
I was going to install VMware Tools on an Ubuntu Server virtual machine, but I ran into the issue of not being able to create a cdrom directory in the /mnt directory. I then tested to see if it was just a permissions issue, but I couldn't even create a folder in the home directory. It continues to state that it is a read-only file system. I know a little about Linux, but I'm not comfortable with it yet. Any advice would be much appreciated. Requested information from a comment:
    username@servername:~$ mount
    /dev/sda1 on / type ext4 (rw,errors=remount-ro)
    proc on /proc type proc (rw)
    none on /sys type sysfs (rw,noexec,nosuid,nodev)
    none on /sys/fs/fuse/connections type fusectl (rw)
    none on /sys/kernel/debug type debugfs (rw)
    none on /sys/kernel/security type securityfs (rw)
    udev on /dev type tmpfs (rw,mode=0755)
    none on /dev/pts type devpts (rw,noexec,nosuid,gid=5,mode=0620)
    none on /dev/shm type tmpfs (rw,nosuid,nodev)
    none on /var/run type tmpfs (rw,nosuid,mode=0755)
    none on /var/lock type tmpfs (rw,noexec,nosuid,nodev)
    none on /lib/init/rw type tmpfs (rw,nosuid,mode=0755)
    binfmt_misc on /proc/sys/fs/binfmt_misc type binfmt_misc (rw,noexec,nosuid,nodev)
For sure, root output:
    root@server01:~# mount
    /dev/sda1 on / type ext4 (rw,errors=remount-ro)
    proc on /proc type proc (rw)
    none on /sys type sysfs (rw,noexec,nosuid,nodev)
    none on /sys/fs/fuse/connections type fusectl (rw)
    none on /sys/kernel/debug type debugfs (rw)
    none on /sys/kernel/security type securityfs (rw)
    udev on /dev type tmpfs (rw,mode=0755)
    none on /dev/pts type devpts (rw,noexec,nosuid,gid=5,mode=0620)
    none on /dev/shm type tmpfs (rw,nosuid,nodev)
    none on /var/run type tmpfs (rw,nosuid,mode=0755)
    none on /var/lock type tmpfs (rw,noexec,nosuid,nodev)
    none on /lib/init/rw type tmpfs (rw,nosuid,mode=0755)
    binfmt_misc on /proc/sys/fs/binfmt_misc type binfmt_misc (rw,noexec,nosuid,nodev)
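
Given the errors=remount-ro option on /, a likely story is that the virtual disk hit an I/O or journal error and the kernel flipped the root filesystem read-only at runtime. A hedged diagnostic/repair sequence - the filesystem check itself is best done from recovery mode or a live CD rather than on the mounted root:

    dmesg | grep -iE 'ext4|i/o error|read-only'
    sudo mount -o remount,rw /
    sudo touch /forcefsck    # schedules a check of / on the next boot (Ubuntu of this era)
    sudo reboot

If the error recurs, check the health of the underlying VMware virtual disk and the host's datastore before trusting the guest filesystem again.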

    Read the article

  • How can I change the default location/action of 'Open Outlook Data File' in Outlook 2010?

    - by Chadddada
I have recently deployed a Remote Desktop Host server that functions as a remote Microsoft Office 2010 workspace for users. As part of locking down this server, I have installed all programs on the D: drive and, through the use of Group Policy, hidden all the drives on the server from standard users. In addition to hiding these drives, I am not allowing users to save anything locally (on the server) or open Libraries. However, one of the functions of the server is to provide the Outlook client. Often users will have the .PST file stored on a network location and want to open this in Outlook. Can I change the default action or location that File > Open > Open Outlook Data File looks in or tries to pull the file from? The default location seems to be under Users / Libraries. When clicking 'Open' you get a warning: "This operation has been cancelled due to restrictions in effect on this computer." Clicking OK drops the user into a small menu that shows attached network drives under Computer. Can I instead have the 'Open' click drop the users into a defined network drive, or just open Computer and allow them to select a share? I don't want them to see the error message. A solution that looks to have been used for Office 2000/03 is:
    Key: HKEY_CURRENT_USER\Software\Microsoft\Office\<version>\Outlook
    Value name: ForceOSTPath
    Value type: REG_EXPAND_SZ
    Value: path to your storage folder
I am not sure if there is a better way to do this now OR if this even works with Office 2010.
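
For what it's worth, Outlook 2010 keeps its settings under the 14.0 hive, and the PST counterpart of the value quoted above is ForcePSTPath, which steers where Outlook creates and browses for data files. A sketch with a hypothetical share path - verify the value name against your environment before deploying:

    reg add "HKCU\Software\Microsoft\Office\14.0\Outlook" /v ForcePSTPath /t REG_EXPAND_SZ /d "\\fileserver\pst" /f

Rolling this out through Group Policy Preferences (User Configuration > Preferences > Registry) keeps it applied for every user of the RD Session Host.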

    Read the article

  • Windows and file system abstraction - how much does it matter where something comes from?

    - by deceze
I have come across the following phenomenon and would like to know how leaky Windows' file system abstraction is, or if there's something else involved. I partitioned the hard disk of my MacBook Pro and installed Windows 7 (64 bit). The Boot Camp driver package includes file system drivers (right term?) that enable Windows to access the Mac OS HFS+ partition. AFAIK it's read-only access, but it works. Now, I have some disk images of stuff I usually install, so I grabbed a copy of Daemon Tools to mount them. When I mount an image saved on the HFS+ partition, about two out of three installers on these disks (usually InstallShield) crash with all sorts of weird errors. Most are just gibberish that lead to all sorts of non-solutions on Google; one was "This application is not the right type for your computer, check if you need 32 or 64 bit versions." When moving the image files to another Windows 7 computer on the network and mounting them from the network share, they work fine. My question now is, why do applications behave differently depending on whether the read-only image file, which should be abstracted away through the read-only virtual Daemon Tools drive, is located on a read-only HFS+ partition or on a Windows network share? And I'll just roll this into the question as well, since I was wondering: does the file system of a network share matter? Does the client system need to understand the file system of the share host, or is that abstracted away in SMB?

    Read the article

  • Recommendation on remote access setup for accessing customer systems

    - by gregmac
I'm looking for a product recommendation (open or commercial) that will allow remote access to customer sites for tech support purposes. We need to be able to gain access to help troubleshoot problems on servers. Currently we end up using anything from RDP on a public IP, to various VPNs that clients happen to have, to webex-type sessions that require lots of interaction from both sides to get things working. This often means a problem that could take 10 minutes to solve takes an extra 30+ minutes messing around trying to get a connection up. There are multiple customer sites, which should NOT have access to each other. At each site, there are anywhere from 1 to 8 servers (Windows 2003 or 2008) that need to be accessed. Requirements:
- Support connection to machines even if they're behind a firewall/router with no public IP
- Be able to selectively allow/deny access from the customer site; the customer site should not be able to connect outbound to anywhere else (our systems, or other customer sites)
- Support multiple users from our end
- If not a VPN connection (where RDP could be used over top), should support remote desktop access, including copy/paste, and file transfers
- Preferably would have some way to list all remote systems, showing online/offline
Anyone have any suggestions?

    Read the article

  • Using my old PC as a web/file server?

    - by Garrett
I have an old desktop computer that I've been trying to sell for AGES. I guess nobody is looking for computers, because it was advertised at a dirt cheap price on craigslist, local papers, etc. Anyway, I was wondering if it would be worth it to set it up as a home file server, a web dev server (I have a web host for actual production use), and maybe host a few server applications (ex: Ventrilo). The computer is actually an old Dell that I cannibalized after the motherboard was destroyed by lightning, so it has fairly new parts in it. The specs are:
- P4 3.4GHz w/ HT and Arctic Cooling Freezer 7
- 3GB DDR2 533 RAM
- 80GB HDD (will upgrade the hard drive if it's even worth using as a server)
- basic DVD-ROM
- 430 Watt Thermaltake PSU (it might be important to note that it is only 60% efficient)
- ATI Radeon X600 256MB
- Antec 300 case
It's not a really beefy machine; I just can't see giving it away or putting it in the corner to collect dust. I have Windows Server 2008 R2 Standard, and I am confident in my skills in operating most Linux operating systems. I'd also be using it to tinker with as I learn new things in my server admin classes (I'm finishing my 2nd year in college at the moment, so I'm still learning). Also, my house is quite old and the electrical wiring is pretty poor (it MIGHT be up to code; then again, where I live most people don't even know what regulations are, let alone how to spell the word...). Would it be safe to leave it running all day, and is it going to run up my electric bill because of the PSU efficiency? I only have 5mbit cable internet, but I won't be running very bandwidth-intense services on it, so it should be ok. I should elaborate on why I am concerned about the power. The circuits should be fine, but I'm more concerned about fire hazard. What is the likelihood that the server could cause an electrical fire? Again, thank you all for the feedback!
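
A rough running-cost estimate under assumed numbers (a P4 box of this vintage drawing about 90 W at the wall when lightly loaded - the ~60% PSU efficiency is already reflected in wall draw - and a rate of $0.12/kWh, both to adjust for your situation):

    90 W x 24 h x 30 days = 64.8 kWh/month
    64.8 kWh x $0.12/kWh ≈ $7.78/month

On the fire question: low PSU efficiency wastes energy as heat but is not by itself a fire hazard; keeping the machine dust-free, on a circuit that is not already heavily loaded, and behind a surge protector or UPS covers the realistic risks.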

    Read the article

  • One server with multiple desktop "heads" via VNC

    - by Alexis K
I am managing a system of kiosks. Each kiosk is currently running a web browser with the application for the kiosk running in the browser. Each kiosk needs to be able to display separate content. At times, the application running in the web browser freezes, and I have to go out to the site to refresh the page. I want to see if there is a way to have one central server that has multiple browser "heads". Each kiosk would then run a program like VNC to display one of the heads. This way, when the program freezes, I just have to log in to the central server and refresh the page. Getting VNC or other remote desktop software installed on the clients is no problem. What I am looking for is a way to have VNC remote into a specific head of a web browser. Does such a thing exist? Or do I have to run a VM for each kiosk to remote into? Any advice, pointers, or solutions would be helpful.
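
One VM-free way to get per-kiosk heads on a single Linux server is a virtual X display per kiosk: run each browser on its own Xvfb display and export that display with x11vnc. A sketch for one head, assuming Xvfb, x11vnc, and a browser are installed; repeat with :2/5902, :3/5903, and so on:

    Xvfb :1 -screen 0 1280x1024x24 &
    DISPLAY=:1 firefox &
    x11vnc -display :1 -rfbport 5901 -forever -shared &

Each kiosk's VNC viewer then connects to serverip:5901 (etc.), and refreshing a frozen page is a matter of VNCing into that one display yourself.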

    Read the article

  • How to clone a HDD and then use the clone with VMware (so that Windows works!)?

    - by Ahmad
I have a system on which Windows 7 is installed, and I am trying to make a clone of its HDD image, which I then want to use on my main PC with VMware, so that I can boot Windows 7 off the cloned HDD. I used Ultimate Boot CD v5.1.1 with the system whose HDD I wanted to clone, and I cloned it using EaseUs Disk Copy, which comes with Ultimate Boot CD. The source HDD was 250 GB in size and had 3 partitions, while the USB HDD I attached to the system, which was supposed to be the destination/clone HDD, was 320 GB in size. I chose to create an exact replica, so 250 GB worth of data (partitions, etc.) was copied exactly, and the rest of the space was left unallocated. I then connected this USB HDD to my main PC, fired up VMware Workstation 8, defined a new virtual machine, and chose to boot off the USB HDD. The result is that when Windows is booting (from the cloned HDD inside VMware), I get a blue screen error before I reach the login screen. How can I change my methodology so that Windows boots from the clone? I can change any tools I use, etc.
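
A common cause of a blue screen at exactly this point (hedged guess: a STOP 0x0000007B) is that Windows 7 disables boot-start storage drivers it didn't boot with, so the driver for VMware's virtual controller is off. If the BSOD code matches, one known fix is enabling the relevant services on the source system before cloning (or offline in the cloned image by loading its SYSTEM hive):

    reg add "HKLM\SYSTEM\CurrentControlSet\services\intelide" /v Start /t REG_DWORD /d 0 /f
    reg add "HKLM\SYSTEM\CurrentControlSet\services\msahci"  /v Start /t REG_DWORD /d 0 /f
    reg add "HKLM\SYSTEM\CurrentControlSet\services\LSI_SAS" /v Start /t REG_DWORD /d 0 /f

Alternatively, VMware's free vCenter Converter performs the physical-to-virtual conversion end to end and handles this driver fix-up itself.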

    Read the article

  • Why can't I mount an image hosted on a read-only HFS+ partition via Boot Camp?

    - by deceze
    I have come across the following phenomenon and would like to know how leaky Windows' file system abstraction is or if there's something else involved. I partitioned the hard disk of my MacBook Pro and installed Windows 7 (64 bit). The Boot Camp driver package includes file system drivers that enable Windows to access the Mac OS HFS+ partition. It's read-only access, but it works. Now, I have some disk images of stuff I usually install, so I grabbed a copy of Daemon Tools to mount them. When I mount an image saved on the HFS+ partition, about two out of three installers on these disks (usually InstallShield) crash with all sorts of weird errors. Most are just gibberish that lead to all sorts of non-solutions on Google, one was "This application is not the right type for your computer, check if you need 32 or 64 bit versions." When moving the image files to another Windows 7 computer on the network and mounting them from the network share, they work fine. My question now is, why do applications behave differently depending on whether the read-only image file, which should be abstracted away through the read-only virtual Daemon Tools drive, is located on a read-only HFS+ partition or on a Windows network share? And I'll just roll this into the question as well since I was wondering: Does the file system of a network share matter? Does the client system need to understand the file system of the share host or is that abstracted away in SMB?

    Read the article

  • Arch Linux: How to handle patches which only you will use?

    - by user12932
    I'm using freerdp together with xmonad and it has been giving me a lot of trouble. The super key (or "windows key") is my mod key in xmonad and it has been interfering with my freerdp usage rather annoyingly. Whenever I switched workspaces (or did anything else in xmonad involving the super key), windows (controlled by the freerdp instance in focus) registered a keypress as well. This event combined with the loss of focus got the super key stuck in windows indefinitely: the press of the keys d and r would first show my desktop, then open the run dialog (as if I was pressing the windows key constantly). I've tried several versions of freerdp, but all exhibited this annoying behavior. So I resorted to patching freerdp myself to just ignore the left super key on my keyboard. I love free software for a lot of reasons (especially the ability to alter things like this myself), however I still find it annoying to patch and rebuild freerdp on all version (and dependency) changes. How do you deal with situations like this? Is there even a "right way" to resolve this issue?
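
The Arch-idiomatic answer is to carry the patch in your own PKGBUILD so pacman still owns the files: copy the official build script from ABS, add the patch to the source array, apply it during the build, and rebuild with makepkg whenever the package updates. A sketch with a hypothetical patch name (ignore-super.patch):

    cp -r /var/abs/community/freerdp ~/build/freerdp && cd ~/build/freerdp
    # in PKGBUILD: add ignore-super.patch to source=(...), update the
    # checksum array (e.g. with updpkgsums), and at the start of build() apply:
    #     patch -p1 -d "$srcdir"/FreeRDP-* < "$srcdir/ignore-super.patch"
    makepkg -si

This reduces the maintenance to re-running makepkg per release instead of keeping a hand-built install outside the package manager.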

    Read the article

  • Windows VirtualBox failed to attach USB device to Linux Guest

    - by joltmode
I have a Windows 7 64-bit host system, and I am using VirtualBox 4.1.18 (r78361). I have an Arch Linux guest OS. I have installed the VirtualBox Extension Pack (to enable USB 2.0 support) and added my USB device filter to the VM. I have also installed the Guest Additions provided by Arch: virtualbox-archlinux-additions (but I have no idea whether it's actually needed for my environment). I can see my USB device from the VirtualBox Devices menu. Whenever I try to access it, I end up with:
    Failed to attach the USB device Kingston DT 100 G2 [0100] to the virtual machine Archlinux. USB device 'Kingston DT 100 G2' with UUID {a836ec33-0f41-4ca7-a31d-09cceaf5d173} is busy with a previous request. Please try again later.
    Result Code: E_INVALIDARG (0x80070057)
    Component: HostUSBDevice
    Interface: IHostUSBDevice {173b4b44-d268-4334-a00d-b6521c9a740a}
    Callee: IConsole {1968b7d3-e3bf-4ceb-99e0-cb7c913317bb}
From what I have googled, most guides show how to solve this the other way around - Linux host to Windows guest. How do I resolve this? Update: I have tried to Eject (virtually, not physically) the device from my Windows host system and then tried to access the device from the guest. Same error.
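
Two hedged things to try from the host command line: confirm what state the host reports the stick in, and attach it by UUID directly, which occasionally succeeds where the Devices menu fails (the VM name and UUID below come from the error message itself):

    VBoxManage list usbhost
    VBoxManage controlvm "Archlinux" usbattach a836ec33-0f41-4ca7-a31d-09cceaf5d173

The "busy with a previous request" wording usually means something on the Windows side still holds the device open - an Explorer window, indexing, or an antivirus scan - so ejecting, unplugging, and replugging immediately before the attach is worth one more attempt.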

    Read the article

  • Tools required for a Web Development Project

    - by RBA
Hi, I want to build a project on Linux which could involve several programming languages (C, Perl, PHP, HTML, XML, etc.) - basically a web-based project. Why I have chosen to build on Linux: it is open source, and a lot of things can be automated through scripting languages, which I don't know how to do in Windows. So, I have installed Linux on a virtual machine (host: Windows 2007, guest: CentOS Linux, command-line interface). Since I am a beginner, I want to know what tools can be used to facilitate and ease my development process. Some which I know are listed below, and I request you to please share your experience on this.
1) Using PuTTY so I can access the Linux machine from anywhere within the network.
2) Since I want to develop on Linux but use Windows as the development platform, I have downloaded the Eclipse editor (C/PHP) on Windows. But I want to know: how can I access Linux files from there?
3) Installed Samba, and still trying to figure out how I can access Linux files remotely on Windows.
4) Please share your experience on how I can ease my development process, and what other tools I can use.
Please let me know if you need any other clarification.
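
For point 3, a minimal Samba share exposing a development directory to Windows could look like the sketch below; the path, share name, and user are assumptions, and on CentOS of this era the daemon is managed with the service command:

    # /etc/samba/smb.conf (excerpt)
    [webdev]
        path = /srv/www
        valid users = devuser
        read only = no

    # then, as root:
    smbpasswd -a devuser
    service smb restart

With that in place, Eclipse on Windows can open the code over \\servername\webdev, or - often smoother over a network - via its Remote System Explorer (SFTP) instead of SMB.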

    Read the article

  • RAID 10 or RAID 5 for multiple VMs - what is the best choice?

    - by Lars Fastrup
I have just ordered a new rig for my business. We do a lot of software development for Microsoft SharePoint and need the rig to run several virtual machines for development and test purposes. We will be using the free VMware ESXi for virtualization. For a start, we plan to build and run the following VMs - all with Windows Server 2008 R2 x64:
- Active Directory server
- MS SQL Server 2008 R2
- automated build server
- SharePoint 2010 server for hosting our public web site and our internal intranet for a few people (the load on this server is going to be quite insignificant)
- 2x SharePoint 2007 development servers
- 2x SharePoint 2010 development servers
Beyond that, we will need to build several SharePoint farms for testing purposes. These VMs will only be started when needed. The specs of the new rig are:
- Dell R610 rack server
- 2x Intel Xeon E5620
- 48GB RAM
- 6x 146GB SAS drives
- Dell H700 RAID controller
We believe the new server is going to make our VMs perform a lot better than our existing setup (2x Intel Xeon, 16GB RAM, 2x 500GB SATA in RAID 1). But we are not sure about the RAID level for the new rig. Should we go for having the 6x 146GB SAS drives in a RAID 10 configuration or a RAID 5 configuration? RAID 10 seems to offer better write performance and a lower risk of RAID failure, but it comes at the cost of less drive space. Do we need RAID 10, or would RAID 5 also be a good choice for us?
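
For comparison, the usable capacities work out as follows (simple arithmetic, ignoring controller overhead):

    RAID 10: (6 / 2) x 146 GB = 438 GB usable (striped mirrors)
    RAID 5:  (6 - 1) x 146 GB = 730 GB usable (one drive's worth of parity)

The usual rule of thumb for write-heavy VM mixes (SQL Server, build servers) favors RAID 10: a small random write on RAID 5 costs a read-modify-write of data plus parity, roughly four disk operations per write, while RAID 10 costs two.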

    Read the article
