Search Results

Search found 9025 results on 361 pages for 'quad core'.

Page 39/361

  • What do you need to implement to provide a Content Set for an NSArrayController?

    - by whuuh
    Hey, I am writing something in Xcode. I use Core Data for persistence and link the view and the model together with Cocoa Bindings; pretty much your ordinary Core Data application. I have an array controller (NSArrayController) in my Xib. Its managedObjectContext is bound to the AppDelegate, as is convention, and it tracks an entity. So far so good. Now, the "Content Set" binding of this NSArrayController limits its content set (as you'd expect) by a key path from the selection in another NSArrayController (otherAc.selection.detailsOfMaster). This is the usual way to implement a master-detail relationship. I want to change the key path at runtime, using other controls. That way I could return a content set that combines several other content sets, which is beyond what Interface Builder can express. To achieve this, I think I should bind the Content Set to my AppDelegate instead. I have tried to do this, but don't know what methods to implement. If I just create the KVC accessors (objectSet, setObjectSet), then I can provide a content set for the array controller from that getter. However, I don't think I'm binding it properly, because it doesn't refresh. I'm new to bindings; what do I need to implement so the Content Set is properly updated when other things, like the selection in the master NSArrayController, change?
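    One common pattern, sketched below in a hedged form: bind the detail controller's Content Set to a KVO-compliant getter on the AppDelegate and fire a change notification for that key whenever the master selection changes. The names detailItems and masterController are invented for illustration; detailsOfMaster comes from the question.

        #import <Cocoa/Cocoa.h>

        @interface AppDelegate : NSObject {
            IBOutlet NSArrayController *masterController;   // the master controller from the Xib
        }
        - (NSSet *)detailItems;   // bind the detail controller's Content Set to this key
        @end

        @implementation AppDelegate

        - (void)awakeFromNib {
            // Watch the master selection; any change should refresh the bound content set.
            [masterController addObserver:self
                               forKeyPath:@"selection.detailsOfMaster"
                                  options:0
                                  context:NULL];
        }

        - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                                change:(NSDictionary *)change context:(void *)context {
            // Tell observers (i.e. the Content Set binding) that detailItems changed,
            // so the detail controller re-reads the getter below.
            [self willChangeValueForKey:@"detailItems"];
            [self didChangeValueForKey:@"detailItems"];
        }

        // Return whichever union of sets should currently be shown; this is where the
        // runtime-variable logic would live.
        - (NSSet *)detailItems {
            return [[masterController selection] valueForKey:@"detailsOfMaster"];
        }

        @end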

    Read the article

  • Archiver Securing SQLite Data without using Encryption on iPhone

    - by Redrocks
    I'm developing an iPhone app that uses Core Data with a SQLite data store and lots of images in the resource bundle. I want a "simple" way to obfuscate the file structure of the SQLite database and the image files, to prevent the casual hacker or unscrupulous developer from gaining access to them. When the app is deployed, the database file and image files would be obfuscated. Upon launching the app, it would read in and un-obfuscate the database file, write the un-obfuscated version to the user's tmp directory for use by Core Data, and read and un-obfuscate image files as needed. I'd like to apply a simple algorithm to the files that scrambles the file data so that the SQLite database contents aren't discernible when the file is opened in a text editor, and so that neither the database nor the images are recognized by other applications (SQLite Manager, Photoshop, etc.). From what I've read, it seems I could use NSFileManager, NSKeyedArchiver, and NSData to accomplish this, but I'm not sure how to proceed. I've been developing software for many years, but I'm new to everything Cocoa Touch, Mac, and iPhone, and I've never had to secure or encrypt my data, so this is new. Any thoughts, suggestions, or links to solutions are appreciated.
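    A minimal sketch of the NSData route, with an arbitrary single-byte XOR key and placeholder paths (both assumptions): the same function scrambles and unscrambles a file, which is enough to defeat casual inspection but is deliberately not encryption.

        #import <Foundation/Foundation.h>

        // XOR every byte with a fixed key. Running it twice restores the original,
        // so the same call de-obfuscates the shipped file at launch.
        static NSData *XORData(NSData *input, uint8_t key) {
            NSMutableData *output = [NSMutableData dataWithData:input];
            uint8_t *bytes = [output mutableBytes];
            for (NSUInteger i = 0; i < [output length]; i++) {
                bytes[i] ^= key;   // trivially reversible: obfuscation, not security
            }
            return output;
        }

        // At launch (paths are placeholders):
        //   NSData *scrambled = [NSData dataWithContentsOfFile:bundlePath];
        //   [XORData(scrambled, 0x5A) writeToFile:tmpPath atomically:YES];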

    Read the article

  • iPhone Core Data fetch request sorting

    - by satyam
    I'm using Core Data and fetching the results successfully. I have a few questions regarding Core Data:
    1. When I add a record, will it be added at the end or at the start of the entity?
    2. I'm using the following code to fetch the data. The array is populated with all the records, but they are not in the same order as I entered them into the entity. Why? On what basis is the default ordering decided?

        NSFetchRequest* allLatest = [[NSFetchRequest alloc] init];
        [allLatest setEntity:[NSEntityDescription entityForName:@"Latest" inManagedObjectContext:[self managedObjectContext]]];
        NSError* error = nil;
        NSArray* records = [managedObjectContext executeFetchRequest:allLatest error:&error];
        [allLatest release];

    3. I enter the records as 1, 2, 3, 4, ... After some time, I want to delete the records that I entered first (i.e. the oldest data), something like "delete the oldest two records". How do I do that?
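    For point 3, a sketch assuming the entity gains a createdAt date attribute set at insertion time (an assumption, since Core Data itself guarantees no insertion order): sort on it ascending, limit the fetch to two, and delete what comes back. context stands in for the managed object context.

        NSFetchRequest *request = [[NSFetchRequest alloc] init];
        [request setEntity:[NSEntityDescription entityForName:@"Latest" inManagedObjectContext:context]];
        NSSortDescriptor *byDate = [[[NSSortDescriptor alloc] initWithKey:@"createdAt" ascending:YES] autorelease];
        [request setSortDescriptors:[NSArray arrayWithObject:byDate]];
        [request setFetchLimit:2];                       // only the two oldest records

        NSError *error = nil;
        NSArray *oldest = [context executeFetchRequest:request error:&error];
        for (NSManagedObject *record in oldest) {
            [context deleteObject:record];               // mark for deletion
        }
        [context save:&error];                           // commit the deletions
        [request release];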

    Read the article

  • Detail text in UITableView from Core Data on iPhone

    - by user554867
    Hey, this is my first question here at Stack Overflow, so let me try to be as specific as I can. The project is as follows: I am parsing an XML file in an iPhone app, which already works, and I am saving the parsed data into Core Data on the iPhone. So far so good. There are two elements of the XML file which I want to show in the table view, as the text and the detail text of each cell. Now the strange behaviour occurs: if I take the data from Core Data and try to display it in the table view, both elements are displayed in separate cells rather than one being the detail text of the other. If I just use a constant string for the detail text it works, but if I take a specific element from Core Data for every cell, it is shown separately in the table view. I have googled a lot, reading here and there. I don't know which code I should show, because I don't have a clue where exactly the mistake could be. Maybe someone can answer this immediately, because it is a common, simple mistake somewhere. Thanks for your help.
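    One frequent cause is creating the cell with the default style, which has no detail label. A sketch, assuming an NSFetchedResultsController property and attribute names title and subtitle (all three are assumptions): the key point is that both labels are set on the same cell, created with UITableViewCellStyleSubtitle.

        - (UITableViewCell *)tableView:(UITableView *)tableView
                 cellForRowAtIndexPath:(NSIndexPath *)indexPath {
            static NSString *CellId = @"Cell";
            UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellId];
            if (cell == nil) {
                // Subtitle style is what makes detailTextLabel visible at all.
                cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleSubtitle
                                               reuseIdentifier:CellId] autorelease];
            }
            NSManagedObject *item = [self.fetchedResultsController objectAtIndexPath:indexPath];
            cell.textLabel.text = [item valueForKey:@"title"];
            cell.detailTextLabel.text = [item valueForKey:@"subtitle"];
            return cell;
        }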

    Read the article

  • Moving UITableView cells and maintaining consistent data

    - by Mark F
    I've enabled editing mode and cell reordering to allow users to arrange the table view content in the order they please. I'm using Core Data as the data source, sorted by the attribute "userOrder". When content is first inserted, userOrder is set to a random value. The idea is that when the user moves a cell, the userOrder of that cell changes to reflect its new position. These are the problems I am running into:
    1. Successfully saving the new location of the cell and adjusting the positions of all other affected cells.
    2. Keeping the data consistent. For example, the table view handles the move fine, but when I tap the cell's new location, it displays the data of the cell that used to be at that location, and the data of the other affected cells gets mixed up as well.
    I know I have to implement this in

        - (void)tableView:(UITableView *)tableView moveRowAtIndexPath:(NSIndexPath *)sourceIndexPath toIndexPath:(NSIndexPath *)destinationIndexPath

    I just don't know how. The Apple docs are not particularly helpful when Core Data is involved, as in my situation. Any guidance greatly appreciated!
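    A sketch of one common approach (fetchedObjects and managedObjectContext are assumed properties, not from the question): rebuild the ordered array in memory, then renumber userOrder for every object so the persisted order matches what the table now shows.

        - (void)tableView:(UITableView *)tableView
            moveRowAtIndexPath:(NSIndexPath *)sourceIndexPath
                   toIndexPath:(NSIndexPath *)destinationIndexPath {
            // Reorder the in-memory data source first.
            NSMutableArray *items = [[self.fetchedObjects mutableCopy] autorelease];
            NSManagedObject *moved = [items objectAtIndex:sourceIndexPath.row];
            [items removeObjectAtIndex:sourceIndexPath.row];
            [items insertObject:moved atIndex:destinationIndexPath.row];

            // Renumber sequentially so every affected row gets a consistent value.
            [items enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
                [obj setValue:[NSNumber numberWithUnsignedInteger:idx] forKey:@"userOrder"];
            }];

            NSError *error = nil;
            [self.managedObjectContext save:&error];
            self.fetchedObjects = items;   // keep the local data source consistent with the table
        }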

    Read the article

  • What type of data store should I use for my iOS app?

    - by mwiederrecht
    I am pretty new to iOS and to working with servers, so forgive me. I am building an iOS app for research. I need to monitor things that the user does and then push the data up to a server for analysis (yes, with user and IRB permission). On the client side I need to keep quite a bit of data that won't really change except when an updated version is pulled from the server, plus a minimal amount of user-specific data. Most of the data I collect needs to be pushed to the server for analysis and can then be deleted from the client. I am struggling to figure out what kind of data store to use, especially since I am not quite sure how the pushing to and pulling from the server will work yet. Does it make sense to use Core Data? XML? SQLite? I like the Core Data idea, but I am not sure what kind of problems I will run into when I need to send large amounts of data to and from the server. I imagine I might need to send the data in a different form than it is stored in on either end, so what kind of overhead am I likely to incur converting it? Is there a format that would work well for me on both ends and for sending the data? As you can probably tell, I could use some advice. Thanks!
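    If Core Data is chosen on the device, the conversion overhead for uploading is usually just flattening attributes into JSON. A rough sketch (generic attribute handling; assumes string/number/date attributes, with dates converted to Unix timestamps as an arbitrary choice; requires iOS 5+ for NSJSONSerialization):

        - (NSData *)JSONForRecords:(NSArray *)records error:(NSError **)error {
            NSMutableArray *rows = [NSMutableArray array];
            for (NSManagedObject *record in records) {
                NSMutableDictionary *row = [NSMutableDictionary dictionary];
                for (NSString *key in [[record entity] attributesByName]) {
                    id value = [record valueForKey:key];
                    if ([value isKindOfClass:[NSDate class]]) {
                        // JSON has no date type; send seconds since 1970 instead.
                        value = [NSNumber numberWithDouble:[value timeIntervalSince1970]];
                    }
                    if (value) [row setObject:value forKey:key];
                }
                [rows addObject:row];
            }
            // POST the result to the analysis server, then delete the uploaded
            // records from the local context once the upload succeeds.
            return [NSJSONSerialization dataWithJSONObject:rows options:0 error:error];
        }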

    Read the article

  • Cmdlets for AD CS deployment: Install-ADcsCertificationAuthority cmdlet failing when attempting to install an offline policy CA

    - by red888
    I installed an offline root CA without issue using this command:

        Install-ADcsCertificationAuthority `
            -OverwriteExistingKey <# in the case of a re-installation #> `
            -AllowAdministratorInteraction `
            -CACommonName "LAB Corporate Root CA" `
            -CADistinguishedNameSuffix 'O=LAB Inc.,C=US' `
            -CAType StandaloneRootCA `
            -CryptoProviderName "RSA#Microsoft Software Key Storage Provider" `
            -HashAlgorithmName SHA256 `
            -KeyLength 2048 `
            -ValidityPeriod Years `
            -ValidityPeriodUnits 20 `
            -DatabaseDirectory 'E:\CAData\CertDB' `
            -LogDirectory 'E:\CAData\CertLog' `
            -Verbose

    I installed the root CA's certificate and CRL on the policy CA, installed the AD CS binaries, and attempted to run this command to install the policy CA and export a request file:

        Install-ADcsCertificationAuthority `
            -OverwriteExistingKey <# in the case of a re-installation #> `
            -AllowAdministratorInteraction `
            -CACommonName "LAB Corporate Policy Internal CA" `
            -CADistinguishedNameSuffix 'O=LAB Inc.,C=US' `
            -CAType StandaloneSubordinateCA `
            -ParentCA rootca `
            -OutputCertRequestFile 'e:\polca-int.req' `
            -CryptoProviderName "RSA#Microsoft Software Key Storage Provider" `
            -HashAlgorithmName SHA256 `
            -KeyLength 2048 `
            -ValidityPeriod Years `
            -ValidityPeriodUnits 10 `
            -DatabaseDirectory 'E:\CAData\CertDB' `
            -LogDirectory 'E:\CAData\CertLog' `
            -Verbose

    When doing this I receive the following error:

        VERBOSE: Calling InitializeDefaults method on the setup object.
        Install-ADcsCertificationAuthority :
        At line:1 char:1
        + Install-ADcsCertificationAuthority `
        + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
            + CategoryInfo          : InvalidArgument: (:) [Install-AdcsCertificationAuthority], CertificationAuthoritySetupException
            + FullyQualifiedErrorId : ValidateParameters,Microsoft.CertificateServices.Deployment.Commands.CA.InstallADCSCertificationAuthority

    Is there a parameter I am entering incorrectly, or something else I am missing?

    Read the article

  • Can I use an LGA775 heatsink on an LGA1156 CPU?

    - by Ghostrider
    Will a cooler designed for LGA775 CPUs install on an LGA1156 CPU, or are they mechanically incompatible? I'm building my first 1U server and, surprise surprise, the Intel stock cooler doesn't fit (unless I'm willing to keep the case open). The places I prefer to buy from are currently out of stock on 1U LGA1156 heatsinks, so I'm wondering if an LGA775 one would do the trick.

    Read the article

  • Unlocking AMD Phenom II X2 550 Black Edition cores - what are the risks?

    - by Vilx-
    I've got the above mentioned CPU and a GigaByte GA-MA790XT-UD4P motherboard, which should be capable of unlocking the extra two cores - if I'm lucky and they're not faulty. The Internet is full of instructions on how to do that. What I don't have is spare money to buy new hardware if I brick something. What are the risks when attempting to do this? Is it completely safe, or can I be left with an expensive pile of junk? What should I keep in mind when doing so? Bigger cooler maybe (I'm running with the default box cooler)? Should I lower the frequencies too? I've never done any OC before.

    Read the article

  • Forgot the username

    - by prithviraj
    Hello all. I have Fedora installed on my system. I know the password, but I forgot the username. I can access the system through the terminal, but I don't know how to log in through the GUI. Please help me. Thanks in advance.

    Read the article

  • Processor Upgrade HP Elite M9510F

    - by DaveM
    I have an HP 9510F that uses the ASUS IPIBL-LB motherboard. It ships with an Intel Q8200 quad-core processor, which does not support virtualization. The specs for the board from HP (ASUS does not list this OEM board) do not show support for the Q8200 it ships with (obviously incorrect), but only these:
    • Supports the following processors:
      o Intel Core 2 Quad (Yorkfield core) Q9xxx
      o Intel Core 2 Duo (Wolfdale core) E8xxx
      o Intel Core 2 Quad (Kentsfield core) up to Q6600
      o Core 2 Duo E6x00 (Conroe core) up to E6700
      o Core 2 Duo E4x00 (Conroe core) up to E4400
    Can this motherboard support the Q8400, or will it only support the listed Q9xxx series? Naturally HP is little help here. Specs are located here: HP/ASUS MB specs.

    Read the article

  • ffmpeg not using all cores

    - by user2783132
    I just got my new server with two Intel E5-2695s, but I was shocked to see that ffmpeg (or Ubuntu) doesn't utilize all the cores. Here is top while ffmpeg was running:

        top - 23:35:25 up 2:41, 2 users, load average: 5.35, 4.37, 3.12
        Tasks: 333 total, 2 running, 331 sleeping, 0 stopped, 0 zombie
        %Cpu0  :  0.0 us, 1.0 sy, 35.6 ni, 63.4 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu1  :  0.0 us, 0.7 sy, 35.5 ni, 63.9 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu2  :  0.0 us, 0.7 sy, 33.4 ni, 65.9 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu3  :  0.0 us, 0.0 sy, 32.7 ni, 67.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu4  :  0.0 us, 0.3 sy, 32.3 ni, 67.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu5  :  0.0 us, 0.3 sy, 33.0 ni, 66.7 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu6  :  0.0 us, 0.0 sy, 32.6 ni, 67.4 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu7  :  0.0 us, 0.3 sy, 32.7 ni, 67.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu8  :  0.0 us, 0.7 sy, 32.6 ni, 66.8 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu9  :  0.0 us, 0.3 sy, 33.9 ni, 65.8 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu10 :  0.0 us, 0.0 sy, 35.0 ni, 65.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu11 :  0.0 us, 0.7 sy, 30.0 ni, 69.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu12 : 21.1 us, 0.0 sy,  0.0 ni, 78.9 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu13 :  0.7 us, 0.0 sy,  4.3 ni, 95.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu14 :  0.3 us, 0.0 sy,  5.0 ni, 94.6 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu15 : 24.9 us, 0.0 sy,  0.0 ni, 75.1 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu16 :  0.3 us, 0.0 sy,  3.7 ni, 96.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu17 :  0.7 us, 0.3 sy,  4.9 ni, 94.1 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu18 :  1.0 us, 0.0 sy,  4.6 ni, 94.4 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu19 :  0.7 us, 0.0 sy,  4.7 ni, 94.7 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu20 : 11.1 us, 0.0 sy,  0.0 ni, 88.9 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu21 :  1.3 us, 0.0 sy,  4.6 ni, 94.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu22 :  2.0 us, 0.3 sy,  4.3 ni, 93.4 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu23 : 96.7 us, 1.0 sy,  0.0 ni,  2.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu24 :  0.0 us, 0.0 sy,  0.7 ni, 99.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu25 :  0.0 us, 0.0 sy,  3.0 ni, 97.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu26 :  0.0 us, 0.0 sy,  1.3 ni, 98.7 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu27 :  0.0 us, 0.0 sy,  4.0 ni, 96.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu28 :  0.0 us, 0.0 sy,  1.7 ni, 98.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu29 :  0.0 us, 0.0 sy,  1.7 ni, 98.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu30 :  0.0 us, 0.0 sy,  1.7 ni, 98.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu31 :  0.0 us, 0.0 sy,  1.0 ni, 99.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu32 :  0.0 us, 0.0 sy,  0.7 ni, 99.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu33 :  0.0 us, 0.0 sy,  1.7 ni, 98.3 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu34 :  0.0 us, 0.0 sy,  2.0 ni, 98.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu35 :  0.0 us, 0.0 sy,  1.0 ni, 99.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
        %Cpu36 :  0.0 us, 0.0 sy,  0.0 ni,100.0 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st

    ffmpeg was started with -threads 0. I also tried running ffmpeg with -threads 500; it made no difference.

    Read the article

  • Need Windows XP VGA driver for i3 Haswell

    - by AFH
    Background: I have recently upgraded my hardware because the previous Pentium system started failing to the point that it would not run long enough to boot. It was obviously a hardware fault, but I had no way of knowing whether it was in the motherboard, CPU or memory. Not all the components were still available, so I decided to replace all three. In order to get some benefit from the expenditure, I thought I would put in faster components, and for future-proofing went for recently released ones: an MSI Z87-G41 PC Mate and an Intel i3-4130 with HD 4400 graphics. The system performs excellently with Ubuntu 13.10, so I know there are no hardware problems, but I need to continue running XP because it runs several thousand pounds (UK) worth of software which meets my needs more than adequately: in some cases there is no longer support for later Windows releases, and in most others an expensive and, to me, unnecessary upgrade is required.
    Problem: The motherboard specifications claim Windows XP support for the live driver update utility, which misled me into believing that XP drivers were available. Not true: Intel have apparently refused to provide XP drivers for Haswell chips. The update program runs on XP, but finds no suitable Intel drivers. The system is more or less running on the default fail-safe VGA driver, but DirectX will not load, which stops a number of my applications from running. I have been trawling the internet for a month now, but I have not found a graphics driver which will load successfully: all show "This device cannot start. (code 10)". I don't need HDMI support: my monitor is 1280x1024 and connected through the VGA port, so all I need is a driver which will handle this resolution well enough to support DirectX. Has anyone found a driver which will do this? Please don't reply with information found from internet searches unless you have actually solved this problem: be assured that I have been all round the houses looking at solutions which others have reported as working, but none of them does for me. Incidentally, I did find an Intel HD sound driver which XP accepts (winxp_145111.exe from Intel), though without connecting to an HDMI port on a TV or sound system I have no idea if it works in practice. However, the graphics section of the same driver fails, like all the others I've tried.

    Read the article

  • AMD switchable graphics are not working OR I don't know how to make them work

    - by Deus Deceit
    I don't know if this is the right place for this question, but I'll give it a shot. I have a Dell Inspiron 17R 5721. It's supposed to be using switchable graphics: it has an Intel HD 4000 and a Radeon HD 8730M, and I'm running Windows 8. My problem is this: I installed the drivers that Dell provides, but I don't see the AMD graphics card anywhere (I do see it in Device Manager, but nowhere I can select it to play a game on the AMD card). I installed the latest drivers from AMD and it's the same; I can't run a game on the AMD graphics card. I changed the application preferences in the Catalyst Control Center, but even after that, games don't give me the option to select the AMD card; they list only the Intel HD 4000. Can someone tell me what I have to do to make this work?
    Update: After looking around and messing with things, I think you don't really get the option to select a graphics card. Switchable graphics is all about switching automatically depending on the application's needs. When I uninstalled AMD's drivers (or actually, screwed up, lol), games played much worse; when I re-installed them, games went back to looking good. So even if a game sees only the Intel HD 4000, Windows or AMD's drivers will switch to the Radeon automatically. I hope someone can verify this, because I seriously don't think you get to play Skyrim on high or ultra graphics settings with the Intel HD graphics card.

    Read the article

  • How to get Windows Server 2008 VM to use multiple cores

    - by David Fraser
    I have a Windows Server 2008 machine running in VirtualBox. On initial installation, only one processor was made available, but now I want to run it as a multiprocessor machine. I have made all four cores available in the VirtualBox settings (as well as enabling VT-x/AMD-V and Nested Paging), but Task Manager still only shows one CPU. However, the four CPU cores are visible in Device Manager under Processors. In the event log on startup, I can see the following relevant events:

        EventLog.6009             Microsoft (R) Windows (R) 6.00.6002 Service Pack 2 Multiprocessor Free
        Kernel-Processor-Power.4  Processor 0 exposes the following: 1 idle state(s), 0 performance state(s), 0 throttle state(s)
        Kernel-Processor-Power.4  Processor 255 exposes the following: 0 idle state(s), 0 performance state(s), 0 throttle state(s)
        Kernel-Processor-Power.4  Processor 255 exposes the following: 0 idle state(s), 0 performance state(s), 0 throttle state(s)
        Kernel-Processor-Power.4  Processor 255 exposes the following: 0 idle state(s), 0 performance state(s), 0 throttle state(s)

    How can I make this system actually boot up as a multiprocessor machine?

    Read the article

  • Intel HD graphics with integrated TV tuner

    - by Tamir
    Hi all! I have a new Dell laptop with Intel HD graphics and an integrated TV tuner. How can I use this TV tuner? Should I install third-party software to use it, or just configure something? I tried to Google it but couldn't find a thing :-( So, how can I use this TV tuner? Many thanks!

    Read the article

  • Display with Intel integrated graphics, Bitcoin mining with Radeon 6950

    - by karategeek6
    I'm on Ubuntu Linux 11.04, 64-bit. I have an Intel i5 with integrated graphics and a Radeon 6950, with one monitor. I would like to run my display on the integrated graphics and run bitcoin mining on the 6950. I have bitcoin mining working when I use the 6950 for both display and mining. Every time I try to use the integrated graphics instead, OpenCL doesn't recognize my 6950. Running aticonfig --initial while using the integrated graphics for display breaks things, so I used the xorg.conf it created as a basis and tried to edit it manually. I really don't know what I'm doing, though. My last attempt is given below: the graphics ran off the integrated card, but the 6950 wasn't recognized. Any help would be greatly appreciated!

    xorg.conf:

        #Section "ServerLayout"
        #    Identifier "Intel Layout"
        #    Screen "Default Screen"
        #    Identifier "aticonfig Layout"
        #    Screen "aticonfig-Screen[0]-0"
        #    Screen 0 "aticonfig-Screen[0]-0" 0 0
        #EndSection

        Section "Module"
            Load "glx"
        EndSection

        # Intel
        Section "Device"
            Identifier "Intel Integrated Graphics"
            Driver "intel"
            BusID "PCI:0:2:0"
        EndSection

        Section "Monitor"
            Identifier "Default Monitor"
            Option "VendorName" "Monitor Vendor"
            Option "ModelName" "Monitor Name"
            Option "DPMS" "true"
        EndSection

        Section "Screen"
            Identifier "Default Screen"
            Device "Intel Integrated Graphics"
            Monitor "Default Monitor"
            DefaultDepth 24
        EndSection

        # ATI
        Section "Device"
            Identifier "aticonfig-Device[0]-0"
            Driver "fglrx"
            BusID "PCI:1:0:0"
        EndSection

        Section "Monitor"
            Identifier "aticonfig-Monitor[0]-0"
            Option "VendorName" "ATI Proprietary Driver"
            Option "ModelName" "Generic Autodetecting Monitor"
            Option "DPMS" "true"
        EndSection

        Section "Screen"
            Identifier "aticonfig-Screen[0]-0"
            Device "aticonfig-Device[0]-0"
            Monitor "aticonfig-Monitor[0]-0"
            DefaultDepth 24
            SubSection "Display"
                Viewport 0 0
                Depth 24
            EndSubSection
        EndSection

    Read the article

  • Computer freezes for 2+ seconds, mouse still moves

    - by xsaero00
    I have this problem on my workstation. The computer effectively freezes for 2-5 seconds for no apparent reason, then continues as normal. While frozen, the mouse is still movable, but only on one of the screens in my multi-screen setup. What is the likely cause?

    System:
    CPU: i7-920
    Memory: 12 GB of Patriot DDR3, 6 modules
    OS: SLED 11 (SUSE Linux Enterprise Desktop), using GNOME
    Main board: Asus P6T
    Video: two Nvidia 9500GT cards connected to three displays

    I am using the memory at its recommended settings of 8-8-8-1333; it has an XMP profile. The CPU is slightly overclocked to 3.3 GHz, but my cooling more than allows for it. I ran the computer with all overclocks off and a lower memory speed, but the issue was still there. Any ideas? Where should I start looking?

    Read the article

  • Nvidia 9 series or Intel HD 2000? [closed]

    - by EApubs
    I just tested an Nvidia 9300 GS card against the Intel HD 2000 graphics integrated in a Core i3 system. Here are the Windows Experience Index scores I got:

    Nvidia 9300 GS:
    Base Score: 3.9
    Processor: 7.1
    Memory: 7.5
    Graphics: 3.9
    Gaming Graphics: 5.1
    Hard Disk: 5.9

    Intel HD 2000:
    Base Score: 5.2
    Processor: 7.1
    Memory: 5.9
    Graphics: 5.2
    Gaming Graphics: 5.8
    Hard Disk: 5.9

    My questions are: When using the Intel HD graphics, it reduces the score of my RAM! How is that possible? It checks the speed of the RAM, not the size (I think). Intel graphics take some of the RAM space, but how can that affect the speed? And of the two, which is the better choice?

    Read the article

  • kill -SIGABRT does not generate a core file from a daemon started from crontab

    - by Guma
    I am running CentOS 5.5 and working on a server application for which I sometimes need to force a core dump so I can see what is going on. If I start my server from a shell and send it SIGABRT, a core file is created. If I start the same program from crontab and then send the same signal to it, the server is "killed" but no core file is generated. Does anyone know why that is, and what needs to be added to my code or changed in the system settings to allow core file generation? Just a side note: I have ulimit set to unlimited in /etc/profile, and I have set the following in /etc/sysctl.conf:

        kernel.core_uses_pid = 1
        kernel.core_pattern=/var/cores/%h-%e-%p.core

    Also, my server app was added to crontab under the same login ID as I use when running it from the shell. Any help greatly appreciated.

    Read the article

  • SQL server availability issue: large query stops other connections from connecting

    - by Carlos
    I've got a high-spec (multicore, RAID) server running MS SQL 2008, with several databases on it. I have a low-throughput process that periodically needs a small amount of information from one of the DBs, and the code seems to work fine. However, sometimes when one of my colleagues runs a huge query against one of the other DBs, I see full CPU usage on the machine, and connections from my app time out. Why does this happen? I would have thought that the many cores and hard disks (together with a cleverly written DB server) would somehow be able to keep at least some resources free for other apps. I'm pretty sure he doesn't use multiple connections for his query. What can I do to prevent this?

    Read the article
