Search Results

Search found 11618 results on 465 pages for 'shared storage'.

Page 184/465

  • Printing on Remote Desktop session

    - by Arindam Banerjee
    We have to connect to a Windows 2008 server using Remote Desktop from a Windows XP machine. A barcode printer is attached to the XP machine, and the printer is shared as a Local Resource in the RDC session to the server. On the server we have to print from an application which prints either to an LPT port or to a shared printer (UNC path). Since I don't know of a way to access the terminal-session printer via a UNC path, I configure print pooling combining LPT1 and the (Terminal Server) TSxxx port. I have the following issues:
    - Every time I connect to a remote session, the printer from my local Win XP machine shows up in Printers and Faxes on the Win 2008 server (terminal server), but I am not allowed to manage the Win XP printer from the terminal server to enable pooling. On the server I have to change the security permissions every time and then enable print pooling. How can I keep the security permissions unchanged?
    - I created a batch file to enable print pooling: rundll32 printui.dll,PrintUIEntry /Xs /n "Printer (from CLIENT)" Portname "LPT1:,TS005" - but every time, the printer in the terminal session connects on a different terminal-session port. Any solution to make the TS port fixed? Help from anyone will be highly appreciated.
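
    The best idea I've had so far is to look up the redirected printer's current port at logon and re-apply the pooling command with whatever port it reports. A rough, untested sketch (the printer name matches my batch file; wmic's piped-output encoding is an assumption):

        import subprocess

        # Ask WMI for the redirected client printer's current port (e.g. "TS005").
        # wmic tends to emit UTF-16 when its output is piped, hence the decode.
        out = subprocess.check_output(
            ['wmic', 'printer', 'where', "Name like '%(from CLIENT)%'",
             'get', 'PortName']).decode('utf-16', 'ignore')
        lines = [l.strip() for l in out.splitlines() if l.strip()]
        port = lines[1]  # lines[0] is the "PortName" column header

        # Re-apply the pooling command from the batch file with the live port.
        subprocess.check_call(
            ['rundll32', 'printui.dll,PrintUIEntry', '/Xs',
             '/n', 'Printer (from CLIENT)', 'Portname', 'LPT1:,' + port])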

  • Where do vendors publish internal transfer rates of HDDs?

    - by red888
    So I've started to dig into storage fundamentals and found that in order to calculate the IOPS of an HDD you need to know the internal transfer rate of the drive (the time it takes data to move from the platters to the disk's internal cache). I went on Newegg and even a few vendor sites and could not find this figure published for any HDD. Is it sometimes called something else? Take this link to a Seagate HDD for instance: nowhere do I see "internal transfer rate", but I do see something called "Sustained Data Rate OD" - is that the same thing? Just so you know where I'm getting this info (book: "Information Storage and Management: Storing, Managing..."), consider an example with the following specifications provided for a disk:
    - The average seek time is 5 ms in a random I/O environment; therefore, T = 5 ms.
    - Disk rotation speed of 15,000 revolutions per minute, or 250 revolutions per second, from which rotational latency (L) can be determined; it is one-half of the time taken for a full rotation, L = (0.5/250) s = 2 ms.
    - 40 MB/s internal data transfer rate, from which the internal transfer time (X) is derived based on the block size of the I/O; for example, for an I/O with a block size of 32 KB, X = 32 KB / (40 MB/s) = 0.8 ms.
    Consequently, the time taken by the I/O controller to serve an I/O of block size 32 KB is TS = 5 ms + 2 ms + 0.8 ms = 7.8 ms. Therefore, the maximum number of I/Os serviced per second, or IOPS, is 1/TS = 1/(7.8 × 10^-3) = 128 IOPS.
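
    Spelled out as code, the book's arithmetic looks like this (numbers straight from the excerpt):

        # Worked IOPS example from the book excerpt above.
        seek_ms = 5.0                              # average seek time T
        rotation_ms = 0.5 / 250 * 1000             # half a rotation at 250 rps = 2 ms
        transfer_ms = 32.0 / (40 * 1024) * 1000    # 32 KB at 40 MB/s, ~0.8 ms
        ts_ms = seek_ms + rotation_ms + transfer_ms
        print('service time = %.1f ms, IOPS = %d' % (ts_ms, 1000 / ts_ms))
        # -> service time = 7.8 ms, IOPS = 128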

  • How should a small company administer their web server?

    - by John Isaacks
    We currently have our website hosted by a small company that is actually a reseller for Rackspace. They act as our server administrators: they configured the servers, they handle the backups, and if there is a problem, we call them and they fix it. We are growing and want to move away from our shared server to either a cloud or a dedicated server; I am leaning toward cloud myself, but I am open to either. The current company doesn't seem to want to offer us anything more than a shared hosting plan. I looked into cloud solutions at vps.net; with them I would have to be the server administrator myself. I am the website programmer, but administering the server is outside my comfort zone. vps.net does have a $99/month plan for Pro-Active Managed Support, but I am not sure if this is the equivalent of a server admin who is there when you need them. We could hire someone in-house, but I think that would be overkill for our needs. I am not exactly sure what we need; I do know we need as close to 100% uptime as we possibly can get, and we need the ability to add/remove/change the server configuration/software/etc. when needed (though changes shouldn't be very frequent once everything is set up right). Can someone point me in the right direction? What do other companies do?

  • How can one use online backup with large amounts of static data?

    - by Billy ONeal
    I'd like to set up an offsite backup solution for about 500GB of data that's currently stored across my various machines. I don't care about data retention, as this is only a backup of, not primary storage for, my data. If the backup is stored on crappy non-redundant systems, that does not matter. The data set is almost entirely static, and mostly consists of things like installers for Visual Studio and installer disk images for all of my games. I have found two services which meet most of this: Mozy and Carbonite. However, both services impose low bandwidth caps, on the order of 50kb/s, which prevent me from backing up a data set of this size in any reasonable time (somewhere on the order of 6 weeks), despite the fact that I get multiple-MB/s upload speeds to everywhere else from this location. Carbonite has the additional problem that it ignores pretty much every file in my backup set by default, because iso files and vmdk files aren't backed up by default. There are other services such as EC2 which don't have such bandwidth caps, but they typically store data on highly redundant servers and therefore cost on the order of 10 cents/GB/month, which is insanely expensive for storage of this kind of data set. (At $50/month I could build my own NAS to hold the data, which would pay for itself after ~2-3 months.) (To be fair, they're offering quite a bit more service than I'm looking for at that price, such as public HTTP access to the data.) Does anything exist meeting those requirements, or am I basically hosed?
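
    For what it's worth, the cost arithmetic spelled out (the NAS price is a made-up figure consistent with the ~2-3 month payback I mentioned):

        # Redundant-cloud pricing vs. a one-off DIY NAS, per the numbers above.
        data_gb = 500
        cloud_per_gb_month = 0.10
        cloud_monthly = data_gb * cloud_per_gb_month
        print('cloud storage: $%.0f/month' % cloud_monthly)        # -> $50/month

        nas_cost = 120.0   # hypothetical price of a DIY NAS
        print('NAS pays for itself in %.1f months' % (nas_cost / cloud_monthly))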

  • Create account for service

    - by Andy
    I am configuring a new server. The server runs Hudson, which is going to copy some files from this server to another. The other server is a virtual machine. Both run Windows Server 2012. Hudson is started on server A logging on as "Local System". When I come to the copy phase it says "Access denied". Changing the log on to "Administrator" works, but I guess this is bad practice. I do not have much experience with user management. I tried to create my own hudson account on both servers A and B, and tried to set the service to log on with that hudson account in service management, but then the service doesn't start. How would you create an account for this particular service that has access to the shared folder on server B and can be used to start the service on server A? I guess I need two accounts with the same username and password on server A and server B? The folder on server B is shared with Everyone and the guest account is enabled.
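
    For reference, my understanding of the usual workgroup approach, as a sketch (account name, password and service name are placeholders, and the account presumably still needs the "Log on as a service" right under Local Security Policy):

        net user hudson Sample!Passw0rd /add
            (run the same command on both server A and server B)
        sc config hudson obj= ".\hudson" password= "Sample!Passw0rd"
            (on server A; "hudson" after "sc config" is the service name, which may differ)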

  • Copying a Windows Home Server backup offsite

    - by Simon
    What ways are there to copy a Windows Home Server backup to an offsite location? I'm talking specifically (and only) about the automated backup of my entire machine, not the shared network folders. I work away from home 90% of the time on my laptop, which has a 640GB drive, so the shared folders are essentially useless to me. I back up every night, but if my house burns down or is broken into, I'm in serious, serious trouble! I'm really looking for some alternative way to back up my entire machine, which must not interfere with the reliability or speed with which my WHS backs up my laptop every night. Either a way to 'export' a complete machine backup from the server, or recommendations on non-conflicting software I can use to back up to a 1TB drive at work, are what I'm looking for. Note: I believe that WHS uses its own completely proprietary backup and doesn't use things like a 'backup bit' or 'archive bit'. I just don't want to install some other backup software that will conflict. PS: I'm now running Windows 7 and just realized that I should probably check out the backup functionality it gives me. I assume that won't conflict, right? Edit: Thanks for the hosted solutions. I'd also appreciate ways to back up to an 'offsite' location that I control - like my office vs. my home. The hosted solutions I think will be too slow or expensive for my needs.

  • Permissions in OS X for iTunes library with multiple users

    - by John
    I currently have a lot of music on an external drive, with my iTunes library set up from there. However, periodically, when the external drive isn't connected, iTunes will default back to the library location under my home directory. I don't want to mess with an external drive, as my Mac's HD is large enough to house the music collection. However, I have 4 family members - all with their own logins - using this same gob of music. I don't want four copies of the library, only one that all libraries reference. So, what I want to do is:
    - Move all music files to a shared directory at /Macintosh HD/users/music. I created this directory and adjusted permissions so all four users can read and write to it.
    - Get all four accounts to reference this library instead of the external or local home locations.
    I am hoping I can just check the box to keep the library organized in my account, which is the admin, and let iTunes move it all, then delete the current libraries for each account and re-add from the new shared location. Will the iTunes organization process cause permissions issues, either by restricting access to the files to my account only, or by limiting write permissions, or any other 'gotcha'? I am having a hard time coming up with a smooth solution that won't break everything and cause me to have mega duplicates or access issues. I would prefer not to do any XML library file editing if possible. Am I dreaming?
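
    One recipe I've seen suggested for the permissions half, as an untested sketch (in POSIX terms the folder above would be /Users/music; the inherited ACL is meant to keep files writable by everyone regardless of which account adds them):

        chmod -R +a "everyone allow read,write,delete,add_file,add_subdirectory,file_inherit,directory_inherit" /Users/music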

  • Painless deployment of a Django app (port from Drupal). Do I have to switch to a VPS?

    - by Monden
    I'm about to complete porting my Drupal-based community site to Django. My Drupal site has been hosted on shared hosting (Dreamhost) for the last 4 years, and stability and performance have been satisfactory. The site gets around 5k unique visitors and 70-80k page views a day. This will be my first deployment of a Django application, and I'm not comfortable with managing my own VPS. I use Ubuntu as a dev server, but I don't have experience with it in a production environment. I have an unrelated internal CRM app (Django) that I host with Webfaction; however, security and performance aren't an issue there, as it's only accessed by 5 people. Unfortunately, I don't have much time to learn and maintain a VPS at this moment. I would like to know if I can host a site with this much traffic in Webfaction's shared environment? How would performance differ in comparison to Linode or Slicehost? Google App Engine isn't an option at the moment, as I'll be using my current PostgreSQL database.

  • Delay init from starting a service for a period of time?

    - by Matthew
    I am trying to get a rudimentary NFS server up and running. Right now the server is configured as an NFS server as a workaround for a vendor issue: the vendor doesn't support direct-attached clustered storage, which we are trying to get them to resolve. The vendor software is Splunk. The Splunk feature we are using requires files to be located on shared storage (which for us is /mnt/nfs until they support a real clustered filesystem). Currently the server has a GFS2 filesystem mounted at boot (it is the only server with the filesystem actively mounted, so there should be no problems with locking). We went with GFS2 so that switching over to a clustered filesystem is easy should the vendor begin supporting it. NFS is configured to mount that filesystem at /mnt/nfs, which the Splunk installation then sees; Splunk is configured to find its configuration files in /mnt/nfs. However, I am running into a problem where the Splunk daemon starts before NFS has finished loading, and because it sees nothing at /mnt/nfs it starts creating files there; then, when those files disappear (NFS finishes mounting the share), Splunk craps out. Splunk is set to run at runlevel 3, S90. NFS is set at runlevels 2-5, S60. Is there any way to delay the startup of the Splunk process further?
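
    The best workaround I can think of so far is to start Splunk through a wrapper that blocks until the share is really mounted. A sketch (paths and init script match the setup above; the timeout is arbitrary):

        #!/usr/bin/env python
        # Wait until /mnt/nfs is a live mount point, then hand off to Splunk.
        import os, subprocess, sys, time

        deadline = time.time() + 300          # give NFS at most five minutes
        while not os.path.ismount('/mnt/nfs'):
            if time.time() > deadline:
                sys.exit('timed out waiting for /mnt/nfs to mount')
            time.sleep(5)

        subprocess.check_call(['/etc/init.d/splunk', 'start'])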

  • Ubuntu 12.10 64bit host reboots when trying to install any guest system using VirtualBox

    - by gts123
    I am having a really nasty problem with VirtualBox: every time I try to install any guest OS (using an ISO file as the CD installation media), the installation starts normally, but as soon as it is about to start installing to the virtual hard drive or loading (e.g. as a live filesystem), the host system reboots abruptly. Config is as below:
    - Host system: Ubuntu 12.10 64bit - Intel® Core i7-2640M CPU @ 2.80GHz × 4
    - VirtualBox version: 4.1.18_Ubuntu r78361
    - Guest OS systems tried: 32bit versions of FreeBSD 9, Debian 6, Tails 0.14
    VM setup: I tried to keep the setup as minimal as possible for each system, just in case that would avoid conflicts. I've tried different values and combinations of the below, but the problem still persists:
    - Shared Clipboard: Disabled
    - Show in fullscreen/seamless: Disabled
    - Remember runtime changes: Disabled
    - Base Memory: 2048 MB
    - Chipset: PIIX3
    - IO APIC: Disabled
    - EFI: Disabled
    - Absolute pointing device: Disabled
    - Processor(s): 1 CPU
    - PAE/NX: Disabled
    - VT-x/AMD-V: Disabled
    - Video Memory: 12 MB
    - 3D/2D acceleration: Disabled
    - Storage IDE controller: PIIX3 (same as chipset, instead of PIIX4)
    - Use host I/O cache: No
    - Audio: Disabled
    - Network adapter: NAT
    - USB controller: Disabled
    - No shared folders
    Another side effect of the reboot is that nothing appears to get logged to the error log files, which doesn't make things any easier. Please help.

  • Copying files between linux machines with strong authentication but without encryption

    - by Zizzencs
    I'm looking for a suitable program to copy files from one Linux machine to another. The program should be able to do authentication, but it should not do encryption. The reason for the latter is the lack of CPU power to do the encryption. I copy backups from ~70 machines to a single backup server simultaneously. The single server is an HP ProLiant DL360 G7, with a 10 Gbps ethernet connection and an FC storage backend that can do 4 Gbps. Through FTP I can write ~400MB/sec to the storage (that's about what I want), but through ssh with arcfour I can only do ~100MB/sec while having 100% CPU usage. That's why I want file transfers not to be encrypted. The alternatives that I found are not really suitable:
    - rcp: no authentication, forget it.
    - FTP: making the authentication "secure" (at least preventing plain-text password exchange) is possible but not really easy, and I haven't found a way to force any FTP daemon to encrypt the control channel (for the authentication) while leaving the data channel (for data transfers) unencrypted.
    - SCP/SFTP: in fairly recent ssh(d) implementations you can't turn off encryption. The best you can do is use the arcfour cipher for the encryption, but it still uses too much CPU power for my needs.
    - rsync over ssh: same problems as with SCP/SFTP.
    - plain rsync: from the documentation of rsyncd: "The authentication protocol used in rsync is a 128 bit MD4 based challenge response system. This is fairly weak protection, though (with at least one brute-force hash-finding algorithm publicly available), so if you want really top-quality security, then I recommend that you run rsync over ssh." It's a no-go.
    Is there a protocol/program that can do exactly what I want? (A big plus would be if it could also work on Windows and/or support rsync-style copying/synchronization, e.g. copying only the differences.)
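
    For reference, the rsync-daemon setup those docs describe looks roughly like this (module name, user and paths are invented for illustration):

        # /etc/rsyncd.conf on the backup server
        [backups]
            path = /srv/backups
            auth users = backupuser
            secrets file = /etc/rsyncd.secrets
            read only = false

        # a client then pushes only the differences, unencrypted:
        rsync -a /data/ backupuser@backupserver::backups/host1/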

  • What is going on when I can't access an SMB server share (not accessible error) until I run cmdkey to delete the credential?

    - by Warren P
    I have a network share connection issue. The first connection works, and seems to stay connected for at least a few hours. However, after each time my Windows 7 PC reboots, it can no longer connect to or browse the shared folder until I not only unmap and remap the mapped drive, but also use cmdkey to delete the stored credentials, like this: cmdkey /delete:Domain:target=HOSTNAME. My work PC is on a domain and I am not the IT administrator, but I'm curious if there is anything I can do to investigate this issue: any settings in the registry or group policy that I could examine to see why the first connection works, but each subsequent attempt (once a stored credential exists) to browse or use the connection fails with an error saying the share is "not accessible"? I do not even get an error until several minutes go by; the first thing I see is a frozen, empty window, and then the error dialog appears. This has happened when connecting to a share on a DROBO device, and on a share which is not on the domain but was on a Microsoft Home Server. I wonder if there's something broken in Windows 7 Professional with regard to connecting to non-domain shares when an Active Directory domain controller exists and the workstation is joined to a domain. The problem only occurs if I click "remember credentials". It is not fixed by any amount of working with net use; using cmdkey to delete all stored credentials for the host is the only way to get back in, and it affects all non-domain shared folders. Update: I'm hoping there are some registry locations I could check that could be misconfigured in some way that might explain why SMB/CIFS stored credentials for non-domain systems seem to be auto-invalidated in this weird way. Knowing how whacko Microsoft Windows domain and security handling is sometimes, this could be some kind of stupid "feature".
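
    As a stopgap I could automate the cmdkey dance in a logon script. A sketch (the hostname is a placeholder, and the "Target:" line format of cmdkey /list is an assumption):

        import subprocess

        HOST = 'HOSTNAME'   # the server whose stored credentials keep going stale

        listing = subprocess.check_output(['cmdkey', '/list']).decode('mbcs', 'replace')
        for line in listing.splitlines():
            line = line.strip()
            if line.startswith('Target:') and HOST.lower() in line.lower():
                target = line.split(':', 1)[1].strip()  # e.g. "Domain:target=HOSTNAME"
                subprocess.call(['cmdkey', '/delete:' + target])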

  • External HDD incorrectly detected as internal - how to change this to enable hot swap/eject?

    - by Sam
    Hi All, I have Win 7 x64 Home Premium. The HDD is a Seagate Barracuda 7200.7 ST3120827AS, 3.5", Serial: 3ms006n6, Firmware: 3.42 (no further updates), in a NexStar CX external case (drivers installed). I have three drives:
    - WD320 with the OS installed
    - WD750 for data storage (internal)
    - Seagate 120 (external) - connected via an eSATA board wired to a SATA port on the motherboard (MSI P43 Neo)
    I tried uninstalling the HDD in Device Manager, to no effect. Also, the internal WD750 is detected as an external drive, and the Windows taskbar icon allows it to be ejected (unlike the Seagate). All drives are configured Online, Simple, Basic, NTFS, Active, Primary Partition (except the C drive). The Seagate was previously used as a primary disk with an XP operating system, so I deleted the volume and created/reformatted it (not quick); the HDD is no longer "Active", but that did not fix the problem. Background: originally, I installed Win 7 with the BIOS set to IDE and forgot to install the chipset drivers. Then I changed Win 7 to install the AHCI drivers, changed the BIOS to AHCI and rebooted. Win 7 loaded drivers, but the WD HDD gave problems/crashed. I installed the chipset drivers and the latest Intel storage matrix software (in safe mode). Everything worked fine after that, except for the problem of not correctly detecting the external drive. I have noticed that under the driver properties (and similarly in the registry) the two drives are configured differently - e.g. in the driver details the capabilities value for the WD is set to 0000006 (CM_DEVCAP_REMOVABLE & CM_DEVCAP_EJECTSUPPORTED), whereas the Seagate shows 0000080 (CM_DEVCAP_SURPRISEREMOVALOK). Any easy way to configure things? I tried physically swapping the SATA connections on the mainboard without success. So far I have found that a solution to my problem might be to perform some registry changes: http://superuser.com/questions/12955/how-do-i-remove-the-option-to-eject-sata-drives-from-the-windows-7-tray-icon

  • While using an NTFS SMB share for Mac users, do symbolic links and extended attributes work?

    - by scape
    We have a majority of Mac users, but we'd rather support their file sharing using a Windows server with an NTFS drive, or at least a Linux server with ext3. We've had trouble, much trouble, with the OS X Server software, and after all these years we are now looking to abandon it. What's mostly holding us back is that the Mac users very often use symbolic links and other special features that exist on an HFS+ partition. The shared locations are mostly primary storage, not just archive storage. While there is an option to create symbolic links under NTFS, I'm curious whether there is anything I need to look out for if I were to move the files from the HFS+ partition over to a new partition hosted from a Windows server, and also how well creating a symbolic link from a Mac would work. I am also worried that Windows backup software might ruin these special symlinks, and about how placing permissions on sub-folders will work. Alternatively, I could remotely back up the files using a Mac and Bru; nonetheless, I still want to get away from a Mac server for hosting the shares.

  • IIS 7 and ASP.NET State Service Configuration

    - by Shawn
    We have 2 web servers load-balanced, and we wanted to get away from sticky sessions for obvious reasons. Our attempted approach is to use the ASP.NET State Service on one of the boxes to store the session state for both. I realize that it's best to have a server dedicated to storing sessions, but we don't have the resources for that. I've followed these instructions to no avail: the session still isn't being shared between the two servers, and I'm not receiving any errors. I have the same machine key on both servers, and I've set the application ID to a unique value that matches between the two servers. Any suggestions on how I can troubleshoot this issue? Update: I turned on the session state service on my local machine and pointed both servers to the IP address of my local machine, and it worked as expected: the session was shared between both servers. This leads me to believe that the problem might be that I'm not using a standalone server as my state service, or perhaps that I'm using the address 127.0.0.1 on one server and a different IP address on the other. Unfortunately, when I try to use the network IP address instead of localhost, the connection doesn't seem to work from the host server itself. Any insight on whether my suspicions are correct would be appreciated.
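
    For reference, the two knobs involved as I understand them (the IP and timeout below are placeholders): both web servers point at the same state server in web.config, and the state-service box has to be told to accept non-local connections, which takes a registry change plus a service restart:

        <!-- web.config on both web servers -->
        <sessionState mode="StateServer"
                      stateConnectionString="tcpip=192.168.1.10:42424"
                      timeout="20" />

        On the state-service box: set AllowRemoteConnection (DWORD) = 1 under
        HKLM\SYSTEM\CurrentControlSet\Services\aspnet_state\Parameters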

  • Webcam security camera software that runs as a service

    - by hurfdurf
    I've been looking for Windows webcam software that will run as a Windows service without any user login. The goal is to use the webcam as a cheap security camera and log the results to secure networked storage (a Windows share, not FTP). The requirements are:
    - Motion detection
    - Video capture
    - Runs as a service (should start recording immediately after reboot)
    Nice to have: round-robin storage, e.g. a 10GB limit, with the oldest files overwritten/deleted when space gets low. I've read the other webcam questions but still haven't stumbled across anything suitable. Evaluations thus far:
    Title                     MotionDetect  Service  Snapshots  Video  SpaceLimit  License
    Yawcam                    Yes           Yes      Yes        No     No          GPL
    WebCam ZoneTrigger        Yes           No       Yes        Yes    No          Commercial
    Dorgem                    Yes           No       Yes        Yes    No          GPL
    AbelCam                   Yes           No       Yes        Yes    No          Commercial
    Logitech                  Yes           No       Yes        Yes    No          Paired with camera
    IspyConnect               Yes           No       Yes        Yes    Yes         Free
    SecureCam (SourceForge)   Yes           No       Yes        Yes    No          GPL
    Active WebCam             Yes           Yes(?)   Yes        Yes    Volume      Free/Commercial
    WebCam Surveyor           Yes           No       Yes        Yes    No          Commercial
    WebCamsPy                 NA            NA       NA         NA     NA          GPL
    Camera: Logitech Webcam Pro 9000; Windows 7 32-bit. WebCamsPy failed to initialize, so it couldn't be tested. So far, the contenders: Active WebCam comes the closest, and claims to run as a service, but I haven't been able to get it to record after a cold boot even though a service is running. Yawcam can be set up as a service but doesn't record video. IspyConnect has exactly the type of space limit I want and looks great, but doesn't run as a service (it also seems to be a bit of a CPU hog). Any other suggestions? I'm locked into Windows, so I can't use Linux's Motion, which looks almost perfect. Any pointers to rich Windows webcam/motion-detection libraries out there that could easily be turned into a command-line program would also be appreciated.

  • Doesn't VirtualBox 4.0 support drag-drop file copy yet?

    - by Benjamin
    Version 4.0.0 will be a new major release. The following major new features were added:
    - New settings/disk file layout for VM portability; see the manual for more information.
    - Open Virtualization Format Archive (OVA) support; see the manual for more information.
    - VMM: support more than 1.5/2 GB guest RAM on 32-bit hosts.
    - Language bindings: uniform Java bindings for both local (COM/XPCOM) and remote (SOAP) invocation APIs.
    - Chipset: added support for the Intel ICH9 chipset with 3 PCI buses, PCI Express and Message Signaled Interrupts (MSI).
    - Audio: Intel HD Audio is now available as guest hardware, for better support with modern guest operating systems (e.g. 64-bit Windows; bug #2785).
    - GUI: redesigned user interface with guest window preview.
    - GUI: new display mode with downscaled guest display.
    - Resource control: added support for limiting a VM's CPU time and IO bandwidth.
    - Storage: support asynchronous I/O for iSCSI, VMDK, VHD and Parallels images.
    - Storage: support for resizing VDI and VHD images.
    - Windows Additions: support for automatically updating the Guest Additions (requires installed Windows Guest Additions 4.0 or later).
    - Guest Additions: support for copying files into the guest file system.
    What does the last line mean? I thought this was a drag-drop file copy feature like VMware's. I tried that, but I couldn't copy by drag-drop or by Ctrl-C/Ctrl-V either. Edit: I mean VBox 4.0 beta, not 3.x. The release note is here. Download link is here.
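
    My current reading is that the line refers to the new VBoxManage guestcontrol command-line interface rather than host-to-guest drag and drop. Something along these lines (VM name, paths and guest credentials are placeholders):

        VBoxManage guestcontrol "WinXP" copyto C:\host\file.txt "C:\guest dir\" --username admin --password secret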

  • Virtualbox Networking: XP Guest, Ubuntu Host: Connecting to Windows servers & local network?

    - by user51833
    Here's what I have: Windows XP running in VirtualBox 3.0.8_OSE r53138; host OS = Ubuntu 9.10 "Karmic Koala"; a Windows network in my office with SMB file servers; the guest OS is connected to the internet and is sharing folders with the host OS; limited networking expertise. Here's what I actually need to do:
    - Use MS Outlook in my XP guest with all its calendar-sharing features and such (if this is all done through the internet then great) - or find a Linux app that can do the same things;
    - Map Windows network servers, e.g. smb://server01/, in my XP guest (I can already access these in Ubuntu).
    Here's what I've tried with no luck:
    - Entering the server address (example above) in my XP guest's Windows Explorer address bar (got a "could not access the file, path or drive" error message - maybe it would work if I could enter login/password information? But I don't know how);
    - Mapping the server as a network drive (Windows could not find the path);
    - Mounting the server as one of my shared folders (I couldn't find it through the shared-folders browser in VirtualBox - is there somewhere in the Linux filesystem that Ubuntu keeps links to mounted servers?).
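
    On the login/password part, I gather the XP-side syntax would be roughly this (drive letter, share name and account are placeholders; the trailing * makes it prompt for the password):

        net use Z: \\server01\share /user:DOMAIN\username *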

  • What can I do with a home server?

    - by Joel Coehoorn
    I have an old 700 MHz Pentium III at home running Windows 2000 Server, with a home router set up to pass incoming requests to it and a DynDNS account set up so it's easy to find. Right now I'm using it for a number of things:
    - Shared folders + backup inside the home network
    - Shared printer inside the home network
    - Domain controller, just because I feel like it and because it's useful practice to keep those "enterprise" administration skills
    - Web server
    - FTP remote access for my files (I abandoned this for security reasons, but it's still worth mentioning)
    - Remote Desktop into the home network (thinking about adding VPN service)
    - SVN repository
    - MySQL - will be moving to SQL Server 2008 Standard soon
    - After I upgrade my wife's laptop from Home to Pro later this year, it will also become a domain controller
    - It's the only place I still have access to Internet Explorer 6 without setting up a new virtual machine, so I use it for testing code with that browser
    The question is: what else could I be doing with this machine? Update: additional ideas based on the suggestions:
    - Media server/DVR
    - Build server
    - PBX
    - SSH proxy server
    - Continuous integration server
    - Personal OpenID provider
    Update 2: just a note that this server was recently upgraded to an Atom 330 with 2 GB RAM and a bigger hard drive. Slow as that is for a "modern" CPU, it should still be much faster than the old Pentium III, and the expected power savings should make the upgrade essentially free over the course of the next year or two. Also, it's now running Windows Server 2008.

  • virtual machines, dual booting and data disks on SSD

    - by stevemarvell
    This is in planning, so if I've got the strategy wrong, please let me know. There are multiple questions here, but I think they all reduce to the same answers. The hardware is a laptop with a single SSD, and I'm trying not to lose the performance of the SSD. I plan a natively dual-booting Windows (plus Cygwin) and Linux machine, which is my BYOD and represents the development environment. I keep the codebase on a shared partition (though sometimes this is an external Thunderbolt SSD) which can be natively "mounted" by whichever OS is in operation. I boot into one or the other environment depending on the task at hand. Sometimes I have to develop with Windows tools, but generally Linux is my preferred development environment. It would be ideal if I could VM the other OS and run either in either. I'm going to assume, because I've not found a sensible VM-based solution, that I have to get Samba involved to share the code partition between VMs. Is this going to blow my SSD performance in the VM? The client also supplies me with a VM for the target environment, usually Linux. This is not often suited to development and is used for testing only. I normally keep two copies of this, one as a sandbox and one which I deploy to using the client's preferred method. I keep these VM snapshots on the shared partition. The latter is interacted with over the network and so has no disk-sharing requirements. However, it would be useful for the sandbox to be able to "mount" the codebase from the natively running OS. Is this Samba or NFS again, depending on the native OS? Am I missing a trick which allows this all to work smoothly with all four environments running at once, without losing the SSD performance?

  • IBM BladeCenter S: Disk Configuration

    - by gravyface
    I have just the one storage bay right now (SAS 15K 600GB x 6) and have configured one storage pool in RAID 10 with 4 disks (and two global spares). For each blade, I've created a volume and mapped it accordingly:
    - Blade #1: 400 GB
    - Blade #2: 200 GB
    - Blade #3: 100 GB
    - Blade #4: 100 GB
    When I boot up Blade #1 and enter the UEFI Setup (F1), followed by the Adapters and UEFI Drivers > LSI Logic Fusion MPT SAS Driver Utility, I see 4 disks: two are the on-board 73GB drives, and the other two are 200GB each, so I assume I'm being presented with two logical disks from the volume I created and mapped to this blade. I was a bit surprised by this: I figured I'd be presented with one logical drive per volume, not two. I'm assuming I can just configure whatever RAID level I wish that supports two disks, but I'm not really sure what the benefits/trade-offs are here. Should I go with RAID 10 on top of RAID 10? RAID 0? Software RAID 0/1/10? Does it even matter? If it is "normal" to see two disks, then I'm likely just going to do some benchmarking and see if changing the RAID levels makes a difference (my guess is no); if this is not normal, well, please let me know. :)

  • Network Sharing Issues

    - by Mark Kramer
    I have two computers I want on a network share together (so the laptop can print through the desktop's printer). Both are connected to the same router (one wired and one wireless), and they both have the same workgroup name. However, when I type "net view" in the command prompt, the only computer that shows up is the computer I typed the command into. How do I get these computers to see each other? Update: the tower is running Windows XP, the laptop Windows 7. I have disabled the firewall on both until I get this set up. Update 2: typing net view on my desktop is acting very strangely. Now sometimes it shows only itself in the list of computers, sometimes it shows both itself and the laptop, and sometimes it doesn't work at all and displays system error 6118. I typed net view into the command prompt on the Windows 7 laptop and both computers showed up, so I went to connect to the desktop from the laptop, and it said the desktop could not be found, even though it was showing up in the list of shared computers. (Supporting screenshots, taken from the laptop, are not reproduced here.)

  • Why is my yum repository able to search packages but not install them in RHEL?

    - by mandy
    I set up yum from the installation DVD. The following is the content of my .repo file:
    [dvd]
    name=Red Hat Enterprise Linux Installation DVD
    baseurl=file:///media/dvd
    enabled=0
    I'm able to search packages; however, on installation I get the error below:
    [root@localhost dvd]# yum install libstdc++.x86_64
    Loaded plugins: rhnplugin, security
    This system is not registered with RHN. RHN support will be disabled.
    Setting up Install Process
    Nothing to do
    My yum search output:
    [root@localhost dvd]# yum search gcc
    Loaded plugins: rhnplugin, security
    This system is not registered with RHN. RHN support will be disabled.
    ======================== Matched: gcc ========================
    compat-libgcc-296.i386 : Compatibility 2.96-RH libgcc library
    compat-libstdc++-296.i386 : Compatibility 2.96-RH standard C++ libraries
    compat-libstdc++-33.i386 : Compatibility standard C++ libraries
    compat-libstdc++-33.x86_64 : Compatibility standard C++ libraries
    cpp.x86_64 : The C Preprocessor
    libgcc.i386 : GCC version 4.1 shared support library
    libgcc.x86_64 : GCC version 4.1 shared support library
    libgcj.i386 : Java runtime library for gcc
    libgcj.x86_64 : Java runtime library for gcc
    libstdc++.i386 : GNU Standard C++ Library
    libstdc++.x86_64 : GNU Standard C++ Library
    libtermcap.i386 : A basic system library for accessing the termcap database
    libtermcap.x86_64 : A basic system library for accessing the termcap database
    Please guide me on this; I want to install gcc on my RHEL machine.
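
    Note: with enabled=0 the repository is ignored unless it is explicitly enabled per command, which would explain installs doing nothing (the search hits above may be coming from already-installed packages). A corrected .repo, as a sketch (gpgcheck=0 is an assumption for an unsigned local DVD repo):

        [dvd]
        name=Red Hat Enterprise Linux Installation DVD
        baseurl=file:///media/dvd
        enabled=1
        gpgcheck=0

    Alternatively, the repo can be enabled for a single run: yum --enablerepo=dvd install gcc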

  • Windows XP Disappearing Folders

    - by XenoFoxx
    I am researching a problem for a friend and unfortunately do not have direct access to his computer. I've tried to gather as much information as possible, and I have researched it on various websites, but I've not found anyone having the same problem my friend is having. So here goes: he has a media server in his home running Microsoft Windows XP. It has 3 drives: 1 for the OS and 2 for mass storage. Not long ago he went to access one of the mass-storage drives and it was empty, except for a single folder. His first assumption was that his roommate had deleted everything on the drive (excluding the remaining folder). He then checked the properties of the drive, and it still said the hard drive was nearly full. I told him to check the recycle bin, thinking that whoever deleted the files didn't clear them from the recycle bin and that they were still taking up space on the drive. My friend said the recycle bin was empty. So we have a drive that Windows Explorer says is empty (again, except for the remaining folder), but the properties of the drive say it's mostly full. Now it gets weirder: my friend tried to create a new folder on this drive, and it auto-named itself "New Folder(1)", which means it recognizes there is already a "New Folder" in that directory. He tried to rename it to a name that he KNEW was there previously, and Windows wouldn't allow it because it was a duplicate folder name. So now it seems the folders are there, but not displaying in Windows Explorer. Both of us have no idea why this is occurring, why the folders vanished, why the one remaining folder didn't vanish, or how to make them visible again. Anyone else ever experience this? I can get more details if needed.
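
    One quick check consistent with those symptoms - hidden/system attribute bits set on the folders, a trick some malware pulls (the drive letter is a placeholder):

        dir /a D:\
            (lists everything, including hidden and system entries)
        attrib -h -s D:\* /s /d
            (clears the hidden/system bits if that turns out to be the cause)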