Search Results

Search found 3324 results on 133 pages for 'gb j'.


  • Debug Apache mod_status showing 151 requests/sec - 2.7 GB/second

    - by James Hackett
    One of my web servers went crazy this morning and showed "151 requests/sec - 2.7 GB/second - 140.7 MB/request"; the normal is more like "11.1 requests/sec - 65.6 kB/second - 5.9 kB/request". I don't even think that kind of throughput is possible on my server. It was also listing odd symbols for the URLs, and the amount of data transferred for connections was off the meter:
      246-0 -1286402072 0/0/0 ? 0.00 -1444841118 0 -5416403825852416.0 0.00 0.00 °Rk³
      247-0 18 0/0/0 ? -13112985.76 2094967848 0 -5428200825946112.0 0.00 0.00
      248-2 23437 0/0/2 _ 0.00 0 0 0.0 0.00 -5340330065920.00 74.53.23.134 web2.mydomain.com OPTIONS / HTTP/1.0
      249-2 23279 0/2981898840/0 W 16673317.60 11 0 0.0 2844.06 0.00 201.144.221.245 www.mydomain.com GET /cb8ff49a2395a7b1accbbce1e4cf164f/view/256 HTTP/1.1
      250-0 0 40600/3009863336/0 ? 3816369.92 910209710 0 2913775.3 -5323551899648.00 -5324315849947.28 èøϲ
    Has anyone seen anything like this before, and do you know what might be causing it? I posted the full mod_status output here: http://pastie.org/916066

    Read the article

  • I want to upgrade my GPU to MSI NVIDIA N630GT-MD4GD3 4 GB DDR3 Graphic card

    - by jatin singh
    Hello everybody, this is my first post here. I want to know my motherboard's PCI Express version. Since I don't have 10 reputation, I am providing an image link here: http://i.stack.imgur.com/6PWV9.png I have a PCI Express slot and I want to upgrade to an MSI NVIDIA N630GT-MD4GD3 4 GB DDR3 graphics card; this GPU needs PCI Express version 2. Here in my city, the shopkeeper said that we have to check whether the PCI Express slot on my board is compatible with that GPU, but I want to purchase it from Flipkart because they sell it for less money (sorry for bad English :/). I mailed my computer's company (Acer) but they didn't reply to any of my mails, so my friend told me about Super User.

    Read the article

  • Open and scroll through 42 GB text file in Mac OS X

    - by Django Johnson
    I am running Mac OS X 10.8.4 (Mountain Lion) and I am trying to open and scroll through a 42 GB .xml file. I plan on using an XML parser to parse through it and delete parts, but first I need to know how the document is structured so I know which parts to save. How can I open this text/XML file and scroll through it so I can get a glimpse of its structure? I tried my default text editor, TextMate, and it couldn't open it. I tried gedit, and that shows the first 10 or so lines but then quits after trying to load the rest. I would greatly appreciate any and all suggestions!
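
    A minimal way to get that first glimpse without loading the whole file is to stream it. The sketch below (in Python, with a placeholder path) prints the first few kilobytes verbatim and then walks the first few hundred element tags with iterparse, which never holds more than one element in memory.

    ```python
    import xml.etree.ElementTree as ET

    path = "/path/to/huge.xml"  # placeholder path

    # Peek at the raw head of the file (works even if the XML has no line breaks).
    with open(path, "r", encoding="utf-8", errors="replace") as f:
        print(f.read(4096))

    # Walk the first ~500 parse events to sketch the element hierarchy, then stop.
    depth = 0
    for i, (event, elem) in enumerate(ET.iterparse(path, events=("start", "end"))):
        if event == "start":
            print("  " * depth + elem.tag)
            depth += 1
        else:
            depth -= 1
            elem.clear()  # discard finished elements so memory stays flat
        if i >= 500:
            break
    ```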

    Read the article

  • Optimal Configuration for five 300 GB 15K SAS Drives

    - by Bob
    I recently acquired an HP Z800 workstation that has five 300 GB 15K SAS Drives. This system will be dedicated to running multiple virtual machines under VMware Workstation (Note: I'm not using ESXi because I do plan to use the system for other purposes.). For the host OS, I plan to install RHEL 5. My number one concern is guest performance. For example, should I create a RAID 10 array for the OS and virtual machine storage with four of the drives and reserve the 5th? Or, is there a solution that will provide better performance?

    Read the article

  • Change XRDP keyboard layout to en-gb Ubuntu 12.04

    - by Earl Sven
    Does anybody know how to change the keyboard layout to en-gb in an XRDP session on Ubuntu 12.04? I am using mstsc.exe to connect to an XRDP server hosting an Xvnc session, but I cannot work out how to apply the UK keyboard layout. A bit of googling has yielded these instructions, which let me change the keymap, but with the keymap file I downloaded from here I lose the ability to use the arrow keys, Home/End, etc. Comparing the file with the standard one, there are substantially more differences than I would expect considering the similarity between the layouts. I only have RDP access to the box, so I don't seem to be able to actually generate a new layout per the instructions above; maybe it's a local-console thing? Also, I can't change either the RDP client used or the RDP server, as they are my only access to the system; I don't have local console access. I do have root privileges on the OS, however. Any thoughts? Edit: I have found http://xrdp.sourceforge.net/documents/keymap/newkeymap.html, the documentation on the XRDP SourceForge page that describes the keymap file format. It indicates the values in the keymap files are Unicode (0x64 etc.), but the files I already have on my system seem to use a different format (0:0 or 65307:27 etc.). Does anybody know what the difference is?

    Read the article

  • Is there a way to enable 4 GB RAM in a 32-bit Windows OS?

    - by Wahid Bitar
    I upgraded my PC to 4 GB of RAM but I only get 3 GB. Windows 7 32-bit reports that I have 4 GB of RAM but doesn't use more than 3 GB. Someone told me that 32-bit Windows doesn't support more than 3 GB of RAM. So, is there any way to make my OS, Windows 7 32-bit, use more than 3 GB of RAM? Note: I can't move to 64-bit because I have many programs that don't work on a 64-bit OS. Edit: I tried what Mr. Wonsungi advised, but whenever I check the option "Enable support for 4 GB of RAM" I get the following error: 'Cannot access the registry key HKEY_CLASSES_ROOT\CLSID\{E88DCCE0-11d1-A9F0-00AA0060FA31}.' There is no "CLSID" in my registry, and I don't know why.
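
    For reference, the 3 GB figure is mostly an address-space issue rather than a hard Windows rule: a 32-bit OS has 4 GiB of addresses in total, and the firmware reserves a slice of that below 4 GiB for devices, so only the remainder shows up as usable RAM. A rough back-of-the-envelope check in Python (the reserved size here is an assumption; the real amount depends on the chipset and graphics card):

    ```python
    # Total 32-bit address space.
    address_space_gib = 2**32 / 2**30            # 4.0 GiB

    # Assumed space reserved below 4 GiB for PCI, video memory and other MMIO.
    assumed_device_reservation_gib = 0.75        # varies by machine; this is a guess

    usable_ram_gib = address_space_gib - assumed_device_reservation_gib
    print(f"RAM usable by a 32-bit OS here: ~{usable_ram_gib:.2f} GiB")   # ~3.25 GiB
    ```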

    Read the article

  • Details of 5GB and 50GB SQL Azure databases have now been released, along with new price points

    - by Eric Nelson
    Like many others signed up to the Windows Azure Platform, I received an email overnight detailing the upcoming database size changes for SQL Azure. I know from our work with early adopters over the last 12 months that the 1 GB and 10 GB limits were sometimes seen as blockers, especially when migrating existing applications to SQL Azure. On June 28th 2010, we will be increasing the size limits:
      - SQL Azure Web Edition database: from 1 GB to 5 GB
      - SQL Azure Business Edition database: from 10 GB to 50 GB
    Along with these changes come new price points, including the option to increase in increments of 10 GB.
    Web Edition:
      - Up to 1 GB relational database = $9.99 / month
      - Up to 5 GB relational database = $49.95 / month
    Business Edition:
      - Up to 10 GB relational database = $99.99 / month
      - Up to 20 GB relational database = $199.98 / month
      - Up to 30 GB relational database = $299.97 / month
      - Up to 40 GB relational database = $399.96 / month
      - Up to 50 GB relational database = $499.95 / month
    Check out the full SQL Azure pricing.
    Related links:
      - http://ukazure.ning.com (UK community site)
      - Getting started with the Windows Azure Platform
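
    Read as a formula, the Business Edition price points above are simply $99.99 per started 10 GB increment, and the Web Edition has just the two tiers. A quick sanity check of that reading in Python (not an official calculator, just the list above re-expressed in code):

    ```python
    import math

    def web_edition_price(size_gb):
        """Web Edition: $9.99 up to 1 GB, $49.95 up to the 5 GB cap."""
        if size_gb <= 1:
            return 9.99
        if size_gb <= 5:
            return 49.95
        raise ValueError("Web Edition tops out at 5 GB")

    def business_edition_price(size_gb):
        """Business Edition: $99.99 per started 10 GB increment, up to 50 GB."""
        if size_gb > 50:
            raise ValueError("Business Edition tops out at 50 GB")
        return round(math.ceil(size_gb / 10) * 99.99, 2)

    print(web_edition_price(5))        # 49.95
    print(business_edition_price(35))  # 399.96 (the "up to 40 GB" tier)
    ```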

    Read the article

  • Leverage cloud and programming to share GB's of photos

    - by jcmoney
    My friends and I went on a trip and we have over 8 GB of photos we want to share. We live in different geographic locations, and all of us (14 people) hold a part of the 8 GB. I was wondering if there's a way to leverage my PHP skills to share all these photos. My current plan is to build a simple site where people can upload a bunch of files, and which also lists those files for people to download (probably as a compressed folder of selected ones), but I was wondering if there's a better way, or if I'm grossly underestimating the scalability issues. All of us have high-speed internet (essentially T1) and I was planning on using Amazon EC2, since this is a heavy task but only for a short time period. That's also the reason I can't use Dropbox or similar services: they have a 2 GB cap (and I don't want to make everyone sign up and install something). I also don't want to set up anything too tricky, since not all of them are tech-savvy.
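
    One way to avoid writing the upload/download plumbing yourself (an alternative to the PHP-on-EC2 plan, assuming everyone is happy to push their slice to Amazon S3 instead) is to drop the archives in an S3 bucket and hand out time-limited presigned links; nobody on the receiving end needs an AWS account. A minimal sketch with boto3, with the bucket and key names as placeholders:

    ```python
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "trip-photos-example"   # placeholder bucket name

    # Upload one archive per person (run by whoever holds that slice of the 8 GB).
    s3.upload_file("my_photos.zip", BUCKET, "uploads/my_photos.zip")

    # Generate a download link that works for a week, no account needed on the other end.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": "uploads/my_photos.zip"},
        ExpiresIn=7 * 24 * 3600,
    )
    print(url)
    ```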

    Read the article

  • Looking for updated BIOS for '99 Gateway in order to format/recognize >127 GB HD

    - by Jeff
    I have a '99 Gateway that's apparently too old for even Gateway to acknowledge it exists. I want to use it as a media hub and put in a 320 GB HD, but it will not format above 127 GB even running Windows XP SP3. I read somewhere that upgrading the BIOS may do the trick, but I can't find the correct BIOS, and Gateway has been no help. I'm hoping I can just upgrade the BIOS, which is 11 years old. Any help would be much appreciated! I don't know where to look, and searches have been fruitless.
    System info:
      - OS Name: Microsoft Windows XP Home Edition
      - Version: 5.1.2600 Service Pack 3 Build 2600
      - OS Manufacturer: Microsoft Corporation
      - System Name: xxxx
      - System Manufacturer: Gateway
      - System Model: TABOR_II
      - System Type: X86-based PC
      - Processor: x86 Family 6 Model 7 Stepping 3 GenuineIntel ~596 MHz
      - BIOS Version/Date: Intel Corp. 4W4SB0X0.15A.0015.P10, 9/28/1999
      - SMBIOS Version: 2.1
    BIOS info (from a free app I located):
      - BIOS Type: Phoenix
      - BIOS Date: September 28th 1999
      - BIOS ID: 4W4SB0X0.15A.0015.P10.9909281445-None
      - BIOS OEM: 4W4SB0X0.15A.0015.P10
      - Chipset: Intel 440BX/ZX rev 3
      - SuperIO: SMC 70x or 80x rev 0 at port 0370
      - Manufacturer: Gateway
      - Motherboard: WS440BX
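
    For context on where the 127 GB figure comes from: it matches the 28-bit LBA limit of older BIOSes and pre-SP1 Windows XP, i.e. 2^28 sectors of 512 bytes each. A quick arithmetic check in Python:

    ```python
    max_sectors = 2**28   # 28-bit LBA addressing
    sector_size = 512     # bytes per sector

    limit = max_sectors * sector_size
    print(limit)          # 137438953472 bytes
    print(limit / 10**9)  # ~137.4 GB in decimal gigabytes
    print(limit / 2**30)  # 128.0 GiB, which Windows reports as roughly 127-128 GB
    ```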

    Read the article

  • Sharing large (multi-Gb) files with clients

    - by Tim Long
    I wasn't sure if this was the best place for this question, but I think it is squarely in the realm of the IT admin, so that's why I put it here. We need to share large files (several gigabytes) with external clients, and we need a simple way of reliably and automatically publishing these files so that clients can then download them. Our organization has Windows desktops and a Windows SBS 2011 server. Sharing from our server is probably suboptimal from the client's perspective because of the low upstream bandwidth of typical ADSL (around 1 Mbps): it would take all day (9 hours for a 4 GB file) for the client to download the file. Uploading to a 3rd-party server is good for the client but painful for us, because we then have to deal with a multi-hour upload. Uploading to a third-party server would be less problematic if it could be made reliable and automatic, e.g. something like a Groove/SharePoint Workspace where you simply drop the file in and wait for it to synchronize - but Groove has a 2 GB limit, which is not big enough. So ideally I'd like a service with the following attributes:
      - Must work for files of at least 5 GB, preferably 10 GB
      - Once the transfer is started, it must be reliable (i.e. not sensitive to disconnections and service outages) and completely automatic
      - Ideally, the sender would get a notification when the transfer completes
      - Has to work with Windows-based systems
    Any suggestions?

    Read the article

  • Extract large zip file (50 GB) on Mac OS X

    - by chingjun
    I was trying to move the files to another hard drive, so I archived all my photos in one large ZIP file using the Mac OS X built-in compress function. But the file failed to extract. I've tried many programs, but none of the programs I tried was able to extract the file: Mac OS X's Archive Utility, StuffIt Expander and 7-Zip (command line) all failed. The Archive Utility and StuffIt don't seem to support large files, and 7-Zip's command-line version gave an error stating the archive is unsupported. I have no luck in Windows either, as many of my files have Chinese filenames and couldn't be extracted to the correct names under Windows. Are there any programs that can support large files, can handle files compressed using Mac OS X's compress function, and support UTF-8 filenames? With or without a GUI is fine. Update: Well, I made the wrong decision to compress the files, and it's already too late. I thought I would be able to extract the file if I could compress it. It's too late; the original copies are gone, and only a large ZIP file is left here. I have tried using 'unzip', but it says "End-of-central-directory signature not found". I guess it doesn't have large-file support either. I would try the Windows Vista method as stated by SuperMagic, but I need to borrow a computer for that. Anyway, thank you everyone, but please provide more suggestions on what software could possibly extract that file.
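
    If a command-line attempt is acceptable, Python's zipfile module reads ZIP64 archives (the format a zip tool has to fall back to above 4 GB) and keeps UTF-8 filenames when the archive marks them as such. A minimal attempt, with placeholder paths; it will only succeed if the archive itself is intact:

    ```python
    import zipfile

    archive = "/Volumes/Backup/photos.zip"   # placeholder path
    dest = "/Volumes/Backup/extracted"       # placeholder path

    try:
        with zipfile.ZipFile(archive) as zf:
            # Reading the listing already requires a valid ZIP64 central directory.
            print(f"{len(zf.namelist())} entries found")
            zf.extractall(dest)
    except zipfile.BadZipFile as exc:
        # If the archive is actually truncated or damaged, the same
        # "End-of-central-directory signature not found" class of error shows up here too.
        print("not a readable zip:", exc)
    ```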

    Read the article

  • WSS Search fills 10 GB limit on SBS server 2011

    - by Kactus
    I've got an SBS 2011 Standard SP1 server that isn't very busy: 2 local users and 2 remote. We have SharePoint with maybe a dozen small documents at most. I've just started getting the following two errors: "Could not allocate space for object 'dbo.MSSBatchHistory'.'IX_MSSBatchHistory' in database 'WSS_Search_SERVER' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup." and "CREATE DATABASE or ALTER DATABASE failed because the resulting cumulative database size would exceed your licensed limit of 10240 MB per database." Digging around in SQL Server Management Studio I see that the WSS Search DB file size is 10241 MB, while the log file is only 147 MB. Firstly, why is WSS Search taking up so much space? How can I stop it from doing so, and what can I do now to get things running OK? I know about log-file truncating, but that isn't the issue here since the log is tiny. Any help is appreciated. There is plenty of free space on the disk (791 GB free). Thanks, Kactus

    Read the article

  • MacBook Air i5 4 GB RAM shows screen tearing when scrolling in browsers

    - by Sandro Dzneladze
    I see screen tearing pretty often while scrolling webpages up and down, and it has been like this from day 1. But now I have purchased an external monitor which is huge (23 inches compared to the Mac's 11), and the effect is more visible. It is driving me nuts and giving me headaches. I wonder if you see the same. I have been reading a lot about this problem, and it seems to be present on MacBook Pros as well? Can someone confirm I'm not alone? In other words: I'm deciding whether to go through warranty repair, or whether that makes no sense if all of them exhibit the same behavior. It is a shame to see this beautiful machine with an insane price tag lagging when browsing the web, when the processor is just sitting there idle at 4-12%! There is an example on Wikipedia that more or less describes what happens when I scroll up and down. The effect is not so pronounced and clears when I stop scrolling, but it sure is annoying the hell out of me. Firefox tears like hell, Chrome less; Safari exhibits jagged scrolling but less tearing than Chrome and Firefox. With synthetic benchmarks I don't see any problems with the hardware, but that is not particularly revealing.

    Read the article

  • Array on servers which receive several hundred GB of data a day

    - by Matthew
    This is hopefully a simple question. Right now we are deploying servers which will serve as data warehouses. I know that with RAID 5 the best practice is 6 disks per RAID 5 array. However, our plan is to use RAID 10 (both for performance and safety). We have a total of 14 disks (16 actually, but two are being used for the OS). Keeping in mind that performance is very much an issue, which is better: doing several RAID 1s, or one large RAID 10? One large RAID 10 had been our original plan, but I want to see if anyone has any opinions I haven't thought of. Please note: this system was designed for using RAID 1+0, so losing half of the raw storage capacity is not an issue; sorry I hadn't mentioned that initially. The concern is more whether we want to use one large RAID 1+0 containing all 14 disks, or several smaller RAID 1+0s and then stripe across them using LVM. I know the best practice for higher RAID levels is to never use more than 6 disks in an array.

    Read the article

  • Unnamed, hidden partitions on my 500 GB HD, HP Pavilion dm4 Laptop

    - by emotionull
    I have multiple doubts here. It's a Seagate 500 GB 7200 RPM HD. I installed it a few months back after my original laptop HD stopped working. The current drives on my laptop, as shown by Windows Disk Management, are: After installing the new HD, I did a complete clean install of Windows 7 and I didn't create any partition myself, manually. So there are 4 drives. Even previously, before I installed this new HD, my laptop had 4 partitions, but there were no unnamed partitions like the two in this case; the other two were HP Tools and Recovery or something. It was pre-configured, factory-installed Windows. Also, now when I right-click on the unnamed drives in Disk Management, all the options are greyed out (see image) except the delete partition option. So how do I know what's inside those partitions? Will it be OK if I delete them? I want to install Ubuntu and dual-boot it with my current Windows installation. I cannot do it in the current setup, as there are already 4 partitions on my HD, and if I try to make a new partition it will be a logical one (correct me if I am wrong here). So can I delete the unnamed, hidden partitions and use them for Ubuntu? A bit of an unrelated question: as a backup option, can I use Windows 7's Backup and Restore facility to keep a complete backup of all the drivers and system software?

    Read the article

  • Opening Large (24 GB) File In C

    - by zacaj
    I'm trying to read a 24 GB XML file in C, but it won't work. I'm printing out the current position using ftell() as I read it in, but once it gets to a big enough number, it goes back to a small number and starts over, never even getting 20% of the way through the file. I assume this is a problem with the range of the variable used to store the position (long), which can go up to about 4,000,000,000 according to http://msdn.microsoft.com/en-us/library/s3f49ktz%28VS.80%29.aspx, while my file is 25,000,000,000 bytes in size. A long long should work, but how would I change what my compiler (Cygwin/MinGW32) uses, or get it to have fopen64?
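
    The numbers quoted bear that out: the offsets involved simply don't fit in a 32-bit long, which is why ftell() appears to wrap around. A quick check of the ranges (plain arithmetic, shown here in Python):

    ```python
    file_size = 25_000_000_000       # the 24 GB file, in bytes

    print(file_size > 2**31 - 1)     # True  -> overflows a signed 32-bit long
    print(file_size > 2**32 - 1)     # True  -> overflows even an unsigned 32-bit long
    print(file_size < 2**63 - 1)     # True  -> fits easily in a 64-bit offset (long long)
    ```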

    Read the article

  • Higher speed options for executing very large (20 GB) .sql file in MySQL

    - by Jonogan
    My firm was delivered a 20+ GB .sql file in response to a request for data from the government. I don't have many options for getting the data in a different format, so I need options for how to import it in a reasonable amount of time. I'm running it on a high-end server (Win 2008 64-bit, MySQL 5.1) using Navicat's batch execution tool. It's been running for 14 hours and shows no signs of being near completion. Does anyone know of any higher-speed options for such a transaction? Or is this what I should expect given the large file size? Thanks
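
    For what it's worth, a large chunk of the time in an import like this usually goes into per-statement commits and index/constraint checking; one common approach is to relax those for the duration and stream the dump straight into the mysql command-line client instead of a GUI tool. A rough sketch in Python (the path, credentials and database name are placeholders, and it assumes the dump is plain SQL the client can read):

    ```python
    import shutil
    import subprocess

    DUMP = r"D:\data\delivery.sql"                                         # placeholder path
    MYSQL = ["mysql", "--user=root", "--password=changeme", "target_db"]   # placeholder credentials

    prologue = b"SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;\n"
    epilogue = b"\nCOMMIT;\nSET unique_checks=1; SET foreign_key_checks=1;\n"

    proc = subprocess.Popen(MYSQL, stdin=subprocess.PIPE)
    proc.stdin.write(prologue)
    with open(DUMP, "rb") as dump:
        shutil.copyfileobj(dump, proc.stdin, length=1024 * 1024)   # stream in 1 MB chunks
    proc.stdin.write(epilogue)
    proc.stdin.close()
    proc.wait()
    ```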

    Read the article

  • How to restore a very large .bak file (180 GB) in SQL Server 2008

    - by Umutos
    Hello! I have a very large .bak file (180 GB) which was created by Microsoft SQL Server 2008, and I have to restore it. I first installed Microsoft SQL Server 2008 Express and tried to restore it in SQL Server Management Studio Express, but it didn't work because there is a size limit. Does anybody know a method for restoring the file? It's the first time I have worked with Microsoft SQL Server and I have no clue what to do. It's really urgent and I would be really grateful for any help! Thanks a lot! Umutos

    Read the article
