Search Results

Search found 16386 results on 656 pages for 'flash drive'.


  • Gateway GT5220 Boot/POST Failure

    - by John Rudy
    I have a Gateway GT5220 I'm troubleshooting. It is, in fact, the machine I just gave my father for his birthday a couple of months ago. (Prior to that, it was my home PC. My home PC is now the MacBook on which I'm writing this.) Before going any further, I suspect that the answer will be, "It's worse than that, it's dead, Jim, it's dead, Jim, it's dead, Jim." At least, the mobo and/or CPU. The initial symptoms were as follows:
      - Turn on power.
      - All fans fire up (making it impossible to hear whether the hard drive is spinning, and my hands aren't sensitive enough anymore to feel it).
      - No LEDs remain lit on the front panel. (Initially, the hard drive indicator flashed briefly.)
      - No beep, no video, no nothing.
    Following some advice I found here, I tried to "drain the stored power." After following those steps, the new symptoms were:
      - Turn on power.
      - All fans fire up.
      - The front panel LEDs remained lit!
      - After about 20, maybe 30 seconds, we had video! Sort of.
    We got to the Gateway splash/POST screen, which appeared thoroughly corrupted. How corrupted? Well, I imagine it's what a POST screen would look like after reading the wrong passage out of the Necronomicon. It stayed there: I gave it at least 5, maybe 6 minutes, and it didn't move. So I shut her down, started her up again, and now (this is where we currently stand, symptomatically) we have this:
      - Turn on power.
      - All fans fire up.
      - The front panel LEDs remain lit.
      - No video, no beep, no nothing.
    I'm a software guy; I haven't done real hardware troubleshooting in years. My gut tells me that the mobo and/or CPU is fried, and unfortunately my gut didn't get to be as big as it is by being wrong all the time. :( In addition to the link above, I have read all of the following (trying to save you some LMGTFY trouble):
      - Gateway Support's pages on POST error messages and handling
      - about a zillion (useless) POST beep code sites
      - a kioskea.net post indicating that most likely we're at what I consider "total loss" (mobo and/or CPU)
    My questions:
      - Are there any conditions other than mobo/CPU failure that could cause symptoms like these?
      - Is it worth my time to try the next hardware troubleshooting step (i.e., remove all non-critical hardware from the machine, try to boot, and systematically replace components one by one until we find the failing one)?
      - Which mobos will fit in the Gateway GT5220 case (with rear ports correctly aligned)?
    (Why this is not a dupe: I wouldn't have posted this question if it hadn't been for the funkadelic possessed video display on the one occasion we got video out. I think that justified this not being an exact dupe. Of course, if the community overrules, I will understand.)

    Read the article

  • Merge LVM Partition with unallocated Space

    - by David
    I have a Linux hard drive with three areas:
      - /dev/hda1: ext3 boot partition (20 MB)
      - /dev/hda2: LVM2 main partition (6 GB)
      - unpartitioned space: 12 GB
    I would like to merge the unpartitioned space into the LVM2 partition known as /dev/hda2. I tried using GParted, but it does not support LVM2. What commands or utilities could I use to add the unpartitioned space to hda2 without losing my existing data?
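
    One common approach is not to grow /dev/hda2 itself but to turn the free space into a new partition and add it to the same volume group, which has the same net effect without moving data. A minimal command-line sketch, assuming the new partition comes out as /dev/hda3 and the volume group and logical volume are named VolGroup00 and LogVol00 (all three names are placeholders; read the real ones from vgdisplay and lvdisplay):

        # create a new partition of type 8e (Linux LVM) in the free space
        fdisk /dev/hda            # n to create, t to set type 8e, w to write; then reboot or run partprobe
        # turn it into a physical volume and add it to the existing volume group
        pvcreate /dev/hda3
        vgextend VolGroup00 /dev/hda3
        # grow the logical volume into the new space and resize the ext3 filesystem inside it
        lvextend -l +100%FREE /dev/VolGroup00/LogVol00
        resize2fs /dev/VolGroup00/LogVol00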

    Read the article

  • How to upgrade XBMC Live from 9.04.1 to 9.11 via command line?

    - by sunpech
    I've been unable to do a fresh install of XBMC Live 9.11 to my hard drive; every time, it fails at the "Install System" step. But I am able to get XBMC Live 9.04.1 to install successfully. How do I upgrade XBMC Live 9.04.1 to 9.11? I understand that Ctrl+Shift+F2 brings up the command line, but what are the next commands to run?
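
    A hedged sketch of the usual Ubuntu-style release upgrade from the command line, on the assumption that XBMC Live 9.04.1 is built on Ubuntu 9.04 (jaunty) and 9.11 on Ubuntu 9.10 (karmic); those repository names are the assumption here, so back up first and expect to answer prompts along the way:

        # point APT at the newer release
        sudo sed -i 's/jaunty/karmic/g' /etc/apt/sources.list
        sudo apt-get update
        # pull in the upgraded base system and XBMC packages
        sudo apt-get dist-upgrade
        sudo reboot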

    Read the article

  • Connecting a 2560x1440 display to a laptop?

    - by tjollans
    Having read Jeff Atwood's blog post on Korean 27" IPS LCDs, I've been wondering to what extent these are useful in a notebook + large display situation. I own a Lenovo ThinkPad Edge E320 with 2nd gen. integrated Intel graphics. According to the spec from Intel, this should support HDMI version 1.4 and, using DisplayPort, resolutions up to 2560x1600. HDMI version 1.4 supports resolutions up to 4096×2160; however, according to c't (German), the HDMI interface used with Intel chips only supports 1920x1200. The same goes for the DVI output: dual-link DVI-D, apparently, is not supported by Intel. It would appear that my laptop cannot digitally drive this kind of resolution.

    Now what about other laptops? According to the article in c't above, AMD's integrated graphics chips have the same limitation as Intel's. NVIDIA graphics cards apparently only offer resolutions up to 1920x1200 over HDMI out of the box, but it's possible, when using Linux at least, to trick the driver into enabling higher resolutions. Is this still true? What's the situation on Windows and OS X? I found no information on whether discrete AMD chips support ultra-high resolutions over HDMI. Owners of laptops with (Mini) DisplayPort / Thunderbolt won't have any issues with displays this large, but if you're planning to go for a display with dual-link DVI-D input only (like the Korean ones), you're going to need an adapter, which will set you back something like €70-€100 (since the protocols are incompatible).

    The big question mark in this equation is VGA: a lot of laptops have it, and I don't see any reason to think this resolution is not supported by the hardware (an oft-quoted figure appears to be 2048x1536@75Hz, so 2560x1440@60Hz should be possible, right?), but are the drivers likely to cause problems? Perhaps more critically, you'd need a VGA to dual-link DVI-D adapter that converts analog to digital signals. Do these exist? How good are they? How expensive are they? Is there a performance penalty involved?

    Please correct me if I'm wrong on any points. In summary, what are the requirements on a laptop to drive an external LCD at 2560x1440, in particular one that supports dual-link DVI-D only, and what tools and adapters can be used to lower the bar?
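
    For the "trick the driver" case on Linux, a minimal xrandr sketch for forcing a custom 2560x1440 mode; the output name VGA1 is a placeholder (read the real one from xrandr -q), the modeline is whatever cvt prints, and whether the GPU, cable, and monitor actually accept it is exactly the open question above:

        cvt 2560 1440 60    # prints a CVT modeline for 2560x1440 at 60 Hz
        xrandr --newmode "2560x1440_60.00" 312.25 2560 2752 3024 3488 1440 1443 1448 1493 -hsync +vsync
        xrandr --addmode VGA1 "2560x1440_60.00"
        xrandr --output VGA1 --mode "2560x1440_60.00"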

    Read the article

  • Windows 2003 Dynamic Disk error

    - by ChrisH
    Hi, I was trying to ghost a partition on a Windows 2003 server, using Ghost 2003. Unfortunately things went horribly wrong, and now I can't boot back into my system. As you can see, Ghost creates a wee little partition to do its dirty work, and has dislodged my other partitions. Partition 2 in the image below is my C drive. Any suggestions as to how I might get this active again so that it boots? Cheers, Chris
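
    If the disk can be reached from a bootable Windows environment (or slaved into another Windows machine), one hedged way to mark the Windows partition active again is a diskpart script; the disk and partition numbers below are placeholders, so confirm them against the "list" output before running "active":

        rem save as make-active.txt and run: diskpart /s make-active.txt
        rem disk/partition numbers are examples; verify with list disk / list partition first
        select disk 0
        list partition
        select partition 2
        active

    If the machine will not boot at all, the Windows 2003 Recovery Console's fixmbr and fixboot commands are the other usual suspects for a Ghost-mangled boot sector.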

    Read the article

  • Is there an easy way to copy an audio CD in Mac OS X?

    - by Bob D
    (Not a commercial CD.) I did some recordings of a band years ago and ran into one of the band members, who asked me if I could make copies. I assumed that this would be easy. I know that I can rip the CD into iTunes and then burn a new CD, but I have two optical drives available; is there a way to simply copy the CD from one drive to the other in one step?

    Read the article

  • How to change hardcoded database paths in a Lotus Approach Database

    - by asoltys
    One of our Lotus Approach files (file extension .apr) has hardcoded paths to additional underlying database files (file extension .dbf). We've moved the Approach database and all the underlying files to a new network drive, so these paths need to be updated. You can see the old paths by opening the File menu and selecting "Approach File Properties", under the "Databases" heading. How can you change these paths?

    Read the article

  • Ruby on Rails gems and Windows 7

    - by cbrulak
    I have a networked home share (at work). This is my HOMEPATH, HOMESHARE, etc. When I installed the Rails gem and others, the .gems folder was created there (which is on a network drive), and consequently any code that needs a gem or does a gem lookup is slow. I'm wondering which environment variables need to be corrected. I've googled this, but most results deal with Cygwin, so any ideas? Thanks
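
    A hedged sketch of pointing RubyGems at a local folder instead of the networked home share, using the standard GEM_HOME/GEM_PATH environment variables (the C:\Ruby\gems path is a placeholder, and whether this removes every slow lookup depends on what else still reads HOMEPATH):

        rem from a command prompt; setx makes the change permanent for new shells
        setx GEM_HOME "C:\Ruby\gems"
        setx GEM_PATH "C:\Ruby\gems"
        rem verify where RubyGems now installs and searches
        gem env gemdir
        gem env gempath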

    Read the article

  • Free CD burning software for Windows?

    - by MiffTheFox
    I need to burn CDs (maybe DVDs too) from images (both BIN/CUE and ISO format), and also from files. What is a good piece of software to handle this? Note that I don't have the software that may have come with the drive, but I'm assuming the drive works because I can burn files with Explorer. I just need something that can handle images too.

    Read the article

  • RHEL5: Can't create sparse file bigger than 256GB in tmpfs

    - by John Kugelman
    /var/log/lastlog gets written to when you log in. The size of this file is based on the largest UID in the system: the larger the maximum UID, the larger this file is. Thankfully it's a sparse file, so the size on disk is much smaller than the size ls reports (ls -s reports the size on disk).

    On our system we're authenticating against an Active Directory server, and the UIDs users are assigned end up being really, really large. Like, say, UID 900,000,000 for the first AD user, 900,000,001 for the second, etc. That's strange but should be okay. It results in /var/log/lastlog being huuuuuge, though: once an AD user logs in, lastlog shows up as 280GB. Its real size is still small, thankfully.

    This works fine when /var/log/lastlog is stored on the hard drive on an ext3 filesystem. It breaks, however, if lastlog is stored in a tmpfs filesystem. Then it appears that the max file size for any file on the tmpfs is 256GB, so the sessreg program errors out trying to write to lastlog. Where is this 256GB limit coming from, and how can I increase it? As a simple test for creating large sparse files I've been doing:

        dd if=/dev/zero of=sparse-file bs=1 count=1 seek=300GB

    I've tried Googling for "tmpfs max file size", "256GB filesystem limit", "linux max file size", things like that. I haven't been able to find much. The only mention of 256GB I can find is that ext3 filesystems with 2KB blocks are limited to 256GB files. But our hard drives are formatted with 4K blocks, so that doesn't seem to be it; not to mention this is happening in a tmpfs mounted ON TOP of the hard drive, so the ext3 partition shouldn't be a factor.

    This is all happening on a 64-bit Red Hat Enterprise Linux 5.4 system. Interestingly, on my personal development machine, which is a 32-bit Fedora Core 6 box, I can create 300GB+ files in tmpfs filesystems no problem. On the RHEL 5.4 systems it is no go.
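
    For comparison, a small sketch of the same test pointed at a stock tmpfs mount such as /dev/shm, with size-inspection commands alongside; on the affected RHEL 5.4 box the dd step is expected to fail around the 256GB mark, which is precisely the behavior in question:

        dd if=/dev/zero of=/dev/shm/sparse-file bs=1 count=1 seek=300GB
        ls -ls /dev/shm/sparse-file                  # first column: blocks actually allocated
        du -h --apparent-size /dev/shm/sparse-file   # apparent (sparse) size
        rm -f /dev/shm/sparse-file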

    Read the article

  • Installing Windows 8 on Windows 7

    - by ZainShah120
    I had Windows 7 installed on my laptop, and I installed Windows 8 on it without formatting the system drive. As a result, I can now see two folders, Windows.old and Windows; the Windows folder contains my new Windows 8 installation. Is there any way that I can go back to Windows 7 without formatting? This question is not programming related, but I hope I can find some good answers here. If someone thinks it is not fruitful, please comment and I will delete it rather than have it downvoted. Looking forward to hearing from you. Thanks
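
    Microsoft documents a manual rollback that shuffles folders between Windows.old and the drive root from a recovery command prompt. The sketch below is only the shape of that procedure with assumed drive letters and folder names (64-bit systems also have "Program Files (x86)"), so treat it as a map rather than a recipe and back everything up first:

        rem boot from Windows 7 media, choose Repair your computer, then Command Prompt
        c:
        rem move the new Windows 8 folders out of the way
        md C:\Win8
        move C:\Windows C:\Win8\Windows
        move "C:\Program Files" "C:\Win8\Program Files"
        move C:\Users C:\Win8\Users
        rem put the Windows 7 folders from Windows.old back at the root
        move C:\Windows.old\Windows C:\
        move "C:\Windows.old\Program Files" C:\
        move C:\Windows.old\Users C:\
        rem restore the Windows 7 boot files
        bcdboot C:\Windows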

    Read the article

  • Mount network drives over ssh on Windows

    - by petersohn
    There is a remote filesystem I can reach through SSH. On Linux, there are several ways of dealing with it. I like sshfs, because with it I can work with the remote files the same way as with my local files. Is there anything similar for Windows that can map a network drive through SSH? The best I can use is WinSCP, which is good, but not good enough.
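
    One hedged option (newer than most of the classic answers, so verify the project names): the WinFsp driver plus the sshfs-win port expose SSHFS as a UNC provider, which lets a drive letter be mapped with plain net use:

        rem after installing WinFsp and sshfs-win
        net use X: \\sshfs\username@remote.host
        rem paths after the host are relative to the remote home directory
        net use Y: \\sshfs\username@remote.host\projects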

    Read the article

  • IIS Strategies for Accessing Secured Network Resources

    - by Emtucifor
    Problem: A user connects to a service on a machine, such as an IIS web site or a SQL Server database. The site or the database needs to gain access to network resources such as file shares (the most common) or a database on a different server. Permission is denied. This is because the user the service is running as doesn't have network permissions in the first place, or if it does, it doesn't have rights to access the remote resource. I keep running into this problem over and over again and am tired of not having a really solid way of handling it. Here are some workarounds I'm aware of:
      - Run IIS as a custom-created domain user who is granted high permissions. If permissions are granted one file share at a time, then every time I want to read from a new share, I would have to ask a network admin to add it for me. Eventually, with many web sites reading from many shares, it is going to get really complicated. If permissions are just opened up wide for the user to access any file share in our domain, then this seems like an unnecessary security surface area to present. This also applies to all the sites running on IIS, rather than just the selected site or virtual directory that needs the access, a further surface area problem.
      - Still use the IUSR account but give it network permissions and set up the same user name on the remote resource (not a domain user, a local user). This also has its problems. For example, there's a file share I am using that I have full rights to for sharing, but I can't log in to the machine. So I have to find the right admin and ask him to do it for me. Any time something has to change, it's another request to an admin.
      - Allow IIS users to connect as anonymous, but set the account used for anonymous access to a high-privilege one. This is even worse than giving the IIS IUSR full privileges, because it means my web site can't use any kind of security in the first place.
      - Connect using Kerberos, then delegate. This sounds good in principle but has all sorts of problems. First of all, if you're using virtual web sites where the domain name you connect to the site with is not the base machine name (as we do frequently), then you have to set up a Service Principal Name on the web server using Microsoft's SetSPN utility. It's complicated and apparently prone to errors. Also, you have to ask your network/domain admin to change security policy for the web server so it is "trusted for delegation." If you don't get everything perfectly right, suddenly your intended Kerberos authentication is NTLM instead, and you can only impersonate rather than delegate, and thus cannot reach out over the network as the user. Also, this method can be problematic because sometimes you need the web site or database to have permissions that the connecting user doesn't have.
      - Create a service or COM+ application that fetches the resource for the web site. Services and COM+ packages are run with their own set of credentials. Running as a high-privilege user is okay since they can do their own security and deny requests that are not legitimate, putting control in the hands of the application developer instead of the network admin. Problems: I am using a COM+ package that does exactly this on Windows Server 2000 to deliver highly sensitive images to a secured web application. I tried moving the web site to Windows Server 2003 and was suddenly denied permission to instantiate the COM+ object, very likely due to registry permissions. I trolled around quite a bit and did not solve the problem, partly because I was reluctant to give the IUSR account full registry permissions. That seems like the same bad practice as just running IIS as a high-privilege user. Note: this is actually really simple. In a programming language of your choice, you create a class with a function that returns an instance of the object you want (an ADODB.Connection, for example), and build a DLL, which you register as a COM+ object. In your web server-side code, you create an instance of the class and use the function, and since it is running under a different security context, calls to network resources work.
      - Map drive letters to shares. This could theoretically work, but in my mind it's not really a good long-term strategy. Even though mappings can be created with specific credentials, and this can be done by people other than a network admin, this is also going to mean that there are either way too many shared drives (small granularity) or too much permission granted to entire file servers (large granularity). Also, I haven't figured out how to map a drive so that the IUSR gets the drives. Mapping a drive is for the current user; I don't know the IUSR account password to log in as it and create the mappings.
      - Move the resources local to the web server/database. There are times when I've done this, especially with Access databases. Does the database have to live out on the file share? Sometimes it was just easiest to move the database to the web server or to the SQL database server (so the linked server to it would work). But I don't think this is a great all-around solution, either. And it won't work when the resource is a service rather than a file.
      - Move the service to the final web server/database. I suppose I could run a web server on my SQL Server database machine, so the web site can connect to it using impersonation and make me happy. But do we really want random extra web servers on our database servers just so this is possible? No.
      - Virtual directories in IIS. I know that virtual directories can help make remote resources look as though they are local, and this supports using custom credentials for each virtual directory. I haven't been able to come up with, yet, how this would solve the problem for system calls. Users could reach file shares directly, but this won't help, say, classic ASP code access resources. I could use a URL instead of a file path to read remote data files in a web page, but this isn't going to help me make a connection to an Access database, a SQL Server database, or any other resource that uses a connection library rather than being able to just read all the bytes and work with them.
    I wish there were some kind of "service tunnel" that I could create. Think about how a VPN makes remote resources look like they are local. With a richer aliasing mechanism, perhaps code-based, why couldn't even database connections occur under a defined security context? Why not a special Windows component that lets you specify, per user, what resources are available and what alternate credentials are used for the connection? File shares, databases, web sites, you name it. I guess I'm almost talking about a specialized local proxy server.
    Anyway, so there's my list. I may update it if I think of more. Does anyone have any ideas for me? My current problem today is, yet again, I need a web site to connect to an Access database on a file share. Here we go again...
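
    As a concrete illustration of the first workaround (a fixed domain identity for one site or virtual directory), a minimal, hedged ASP.NET web.config fragment; the account name is a placeholder, and keeping the password in clear text here is exactly the kind of trade-off discussed above (encrypting the section, or setting the identity on the application pool instead, is usually preferable):

        <configuration>
          <system.web>
            <!-- run requests for this application as a specific (hypothetical) domain account -->
            <identity impersonate="true" userName="MYDOMAIN\svc_webshare" password="..." />
          </system.web>
        </configuration>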

    Read the article

  • Uninstall Linux Mint without deleting partition?

    - by user3525308
    I've got a dual-boot system using Windows 7 and Linux Mint 13. When I first installed Mint using the regular Mint installer, the installer did not give me the option of partitioning my drive, presumably because my computer already had 4 partitions. I decided that I wanted to uninstall Mint and opt for Ubuntu instead, but I have no idea how to remove Mint! Every guide I look at tells me to just delete/reformat the Linux partition... But in my case, no separate partition exists.

    Read the article

  • Backing up the master boot

    - by petersohn
    I want to back up the master boot record on my hard drive, in case something screws it up. What software do you recommend for this? My first idea is to boot from a Linux CD and dd the first 512 bytes of /dev/sda, then dd it back to recover. Will this solution work, and is it safe?
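
    A minimal sketch of the dd approach described above, with one wrinkle worth knowing: the 512-byte MBR contains both the boot code (first 446 bytes) and the partition table, so restoring only the first 446 bytes leaves a later, possibly changed partition table untouched:

        # back up the full MBR (boot code + partition table + signature)
        dd if=/dev/sda of=mbr-backup.img bs=512 count=1
        # full restore (also overwrites the partition table)
        dd if=mbr-backup.img of=/dev/sda bs=512 count=1
        # safer restore of the boot code only, keeping the current partition table
        dd if=mbr-backup.img of=/dev/sda bs=446 count=1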

    Read the article

  • Local links ( in browsers ) on *nix systems

    - by meder
    On Windows I can access files directly from the browser (or at least I have it configured that way currently; I forget if it was native like this) with the file:// protocol, so I can access files from, say, the C drive. I'm wondering what the equivalent would be for accessing my files from the browser, if at all possible, on a *nix system such as Debian.
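
    The same file:// scheme works in browsers on Debian; a couple of hypothetical examples of what goes in the address bar (the paths are placeholders):

        file:///home/username/Documents/
        file:///etc/hostname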

    Read the article

  • Snow Leopard not reading NTFS

    - by dtmunir
    I just upgraded a brand new MacBook Pro, which I bought with 10.5 installed, to Snow Leopard. After doing the upgrade I am trying to access NTFS partitions on my external hard drive, but they are just not showing up. I went ahead and installed Windows 7 using Parallels, and I can access the NTFS partitions from within Windows, but not from Mac OS X. I know that Snow Leopard should be able to at least read NTFS drives, so what is going on here?
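
    A hedged way to check whether the NTFS slices are at least visible to the OS, and to attempt a manual read-only mount; the device identifier and mount point below are placeholders taken from diskutil output:

        diskutil list                                   # find the NTFS slice, e.g. /dev/disk1s1
        sudo mkdir -p /Volumes/ntfs
        sudo mount -t ntfs -o ro /dev/disk1s1 /Volumes/ntfs
        ls /Volumes/ntfs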

    Read the article

  • Best practice for setting up SQL server on a Virtual Machine

    - by CrazyCoderz
    This is my first attempt at virtualizing SQL Server on VMware, and I want to make sure I am doing things correctly. Should I have SQL Server installed on the C: drive (the same partition as the OS), then add a virtual disk for the data files, say 300GB, and another virtual disk for the log files, say 100GB? Or should I add two 300GB vdisks for the data files, mirror them in the operating system, and then add a non-mirrored vdisk of 100GB for the log files?

    Read the article

  • Trying to upgrade SQL Server 2008 to R2 but SQL is sleeping or dead?

    - by oJM86o
    I've used the option to upgrade SQL Server 2008 to R2, but I noticed it gets to about 20-30% and then just sits there. I've left it alone for over 2 hours; the PC is definitely not frozen, because I can click Help or move the window around, but it has said "Install_sql_common_core_loc_Cpu64_1033_action: Install Files. Copying new files" for the past 2 hours. I have tried to do the install from a CD as well as from a network drive, both with the same issue. Is there anything I can check or do?
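
    One hedged thing to check while it sits there: the setup log that SQL Server 2008 R2 writes under its Setup Bootstrap folder usually shows whether file copying is still progressing or has stalled on a specific file (the timestamped folder name below is an example):

        rem list the timestamped log folders, newest last
        dir /od "%ProgramFiles%\Microsoft SQL Server\100\Setup Bootstrap\Log"
        rem then open Detail.txt (and Summary.txt) in the newest folder, for example:
        notepad "%ProgramFiles%\Microsoft SQL Server\100\Setup Bootstrap\Log\20120101_120000\Detail.txt"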

    Read the article
