Search Results

Search found 3887 results on 156 pages for 'processor simulator'.

Page 13 of 156

  • 912 stream processors available in OpenCL

    - by tugrul büyükisik
    I am thinking of assembling this system: an AMD A8-3870 APU (which has a Radeon HD 6550D inside: 400 stream processors, xxx GFLOPS) for nearly $110, an AMD HD 7750 graphics card (512 stream processors, 819 GFLOPS peak performance) for nearly $170, appropriate RAM (1600 MHz bus), and a mainboard. What GFLOPS level can I reach, stably, using OpenCL and similar frameworks? Can I use all 912 stream processors at the same time? I am not trying to start a versus debate; I need to know what would be better for scientific computing (75% of the time) and gaming (25% of the time), because I have a low budget. By "scientific calculations" I mean fluid dynamics and solid-state physics simulations; by games I mean those that use OpenCL and PhysX.
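
    Assuming the drivers expose both the APU's HD 6550D and the discrete HD 7750, they should show up as two separate OpenCL devices, so using all 912 stream processors means splitting work across two devices (two command queues) rather than addressing them as one. A minimal enumeration sketch in C, assuming the AMD platform is the first one returned:

        #include <stdio.h>
        #include <CL/cl.h>

        /* List every GPU device on the first OpenCL platform and print its
           compute-unit count; the HD 6550D and the HD 7750 should each appear
           as a separate device that kernels can be queued to independently. */
        int main(void) {
            cl_platform_id platform;
            cl_uint count = 0;
            clGetPlatformIDs(1, &platform, NULL);
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 0, NULL, &count);
            if (count > 8) count = 8;

            cl_device_id devices[8];
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, count, devices, NULL);

            for (cl_uint i = 0; i < count; i++) {
                char name[128];
                cl_uint units;
                clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
                clGetDeviceInfo(devices[i], CL_DEVICE_MAX_COMPUTE_UNITS,
                                sizeof(units), &units, NULL);
                printf("Device %u: %s (%u compute units)\n", i, name, units);
            }
            return 0;
        }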

    Read the article

  • Is this processor burned?

    - by Jhonnytunes
    I recently exchanged the processors in two PCChips boards. Both boards are LGA775: board A is a P17G (Pentium 4 with Hyper-Threading, 3 GHz) and board B is a P49G (Pentium Dual-Core, 3 GHz). I use board A to watch videos, some of which are 3 GB in size, which is why I swapped the CPUs. I installed the Dual-Core in board A and it worked out of the box; 3 GB videos now use 5% of the CPU instead of 50%. When I installed the Pentium 4 in board B, I forgot to connect the 4-pin power connector, and when I powered on the PC the CPU fan stayed off. I then connected everything correctly, but now the board shows no video. I think the CPU is not working, but I'm not sure. The PC turns on, the hard drive spins, the CPU fan spins and the network socket blinks, but there is no video and the case power LED isn't blinking either. I tried another PSU and the result was the same. I noticed the CPU still has thermal paste on it. I really don't know what's happening, and I hope I don't have to buy another CPU. Is it burned?

    Read the article

  • Windows Server 2008 R2 Maximum Processor Limit Confusion

    - by Stevoni
    As I was looking through the Windows Server 2008 R2 specifications, I saw that the maximum number of supported processors is 64 sockets for the Datacenter edition. That puts the maximum number of cores at 256 (if all sockets hold quad cores), which I think is just silly, but whatever. And now the questions: How does one set something like that up? (Obviously not for me, but humor me.) Are there multiple dual-socket motherboards running in a giant case with a ton of memory? How does the OS see all of the CPUs if they're on different boards? What would be a real-world example of a need to have 64 sockets attached to one operating system versus 32 two-socket servers?

    Read the article

  • Limit a process's relative (not absolute) processor consumption in Linux

    - by BobBanana
    What is the standard way in Linux to enforce a system policy that limits the relative CPU use of a single process? That is, on a quad-core machine I never want a process to use more than 2 CPUs at once, even if the process creates more threads. I do not want an absolute time limit, just a relative limit, so that one task cannot dominate the machine. This is also different from renice, which lets a process use all the resources but politely step aside if others need them too. ulimit is the usual resource-limiting tool, but it does not allow such CPU restrictions; it can limit the number of processes per user, or absolute CPU time, but not the maximum number of active threads of a single process. I've found a couple of user-level tools, like cpulimit, but no system-level tool or setting. Does such a standard resource controller exist in Linux (Red Hat Enterprise Linux, if it matters)? And if such a limit were imposed, how would a user identify it?
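
    One mechanism that gives this kind of relative cap is a CPU affinity mask: an administrator can launch a task with taskset -c 0,1, or confine it with a cpuset cgroup, so the process and every thread it creates stay on two cores. A minimal programmatic sketch in C, assuming Linux and glibc (the CPU numbers are illustrative):

        #define _GNU_SOURCE
        #include <sched.h>
        #include <stdio.h>
        #include <unistd.h>

        /* Confine the calling process to CPUs 0 and 1; threads and children
           created afterwards inherit the mask, so the task can never occupy
           more than two cores no matter how many threads it spawns. */
        int main(int argc, char *argv[]) {
            cpu_set_t set;
            CPU_ZERO(&set);
            CPU_SET(0, &set);
            CPU_SET(1, &set);
            if (sched_setaffinity(0, sizeof(set), &set) != 0) {
                perror("sched_setaffinity");
                return 1;
            }
            if (argc > 1) {
                execvp(argv[1], &argv[1]);   /* run the real workload under the mask */
                perror("execvp");
            }
            return 0;
        }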

    Read the article

  • How is InfiniBand speed related to processor speed?

    - by user223231
    I have two identical servers and am curious how to set up an InfiniBand interconnect between them. Both servers' basic specs are: CPU: 32 GHz total = 2x Intel Xeon X5650 (6 cores, 2.66 GHz); RAM: 24 GB per server. How do I determine what InfiniBand speed will be enough for a good interconnect: SDR, DDR, QDR or FDR? My logic is that 32 GHz = 32 Gb/s, so a 40 Gb/s link is enough. Am I right, or is it not that simple?
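
    For rough reference (speeds quoted per 4x port, the usual server configuration): SDR signals at 10 Gb/s, DDR at 20 Gb/s, QDR at 40 Gb/s and FDR at about 56 Gb/s; SDR/DDR/QDR use 8b/10b encoding, so the usable data rate for QDR is roughly 40 x 8/10 = 32 Gb/s. Note also that clock speed and link bandwidth are different quantities, so 2 x 6 x 2.66 GHz ~ 32 GHz of aggregate clock does not translate directly into 32 Gb/s of traffic.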

    Read the article

  • What's the name of this pattern?

    - by Wes
    I see this a lot in frameworks. You have a master class which other classes register with. The master class then decides which of the registered classes to delegate the request to. An example based on the class of the passed-in object might look something like this:

        public interface Processor {
            boolean canHandle(Object objectToHandle);
            void handle(Object objectToHandle);
        }

        public class EvenNumberProcessor implements Processor {
            public boolean canHandle(Object objectToHandle) {
                if (!isNumeric(objectToHandle)) {
                    return false;
                }
                return isEven(objectToHandle);
            }

            public void handle(Object objectToHandle) {
                // Optionally call canHandle again to ensure the calling class is fulfilling its contract
                doSomething();
            }
        }

        public class OddNumberProcessor implements Processor {
            public boolean canHandle(Object objectToHandle) {
                if (!isNumeric(objectToHandle)) {
                    return false;
                }
                return isOdd(objectToHandle);
            }

            public void handle(Object objectToHandle) {
                // Optionally call canHandle again to ensure the calling class is fulfilling its contract
                doSomething();
            }
        }

        // Can optionally implement the Processor interface itself
        public class ProcessorDelegator {
            private final List<Processor> processors = new ArrayList<Processor>();

            public void addProcessor(Processor processor) {
                processors.add(processor);
            }

            public void process(Object objectToProcess) {
                // Look up the relevant processor, either by keeping a list of what each can process
                // or by querying each one to see if it can process the object.
                Processor chosenProcessor = chooseProcessor(objectToProcess);
                chosenProcessor.handle(objectToProcess);
            }
        }

    Note there are a few variations I see on this. In one variation the subclasses provide a list of things they can process, which the ProcessorDelegator understands. The other variation, sketched above in fake code, is where each is queried in turn. This is similar to Chain of Command, but I don't think it's the same, since Chain of Command means the processor needs to pass the request on to other processors. The other variation is where the ProcessorDelegator itself implements the interface, which means you can get trees of ProcessorDelegators that specialise further. In the above example you could have a numeric ProcessorDelegator which delegates to an even/odd processor, and a string ProcessorDelegator which delegates to different strings. My question is: does this pattern have a name?

    Read the article

  • iPad application crash in Apple review - cannot replicate in simulator, have crash log

    - by Mike
    I am clearly missing something obvious here and would really appreciate some input. I have tried repeatedly to submit an application to Apple (iPad in this case) that crashes during their testing, but I cannot replicate the situation on my end (obviously I only have the damned simulator to work with at this point). The crash log is as follows:

        Date/Time:       2010-04-01 05:39:47.226 -0700
        OS Version:      iPhone OS 3.2 (7B367)
        Report Version:  104

        Exception Type:  EXC_CRASH (SIGABRT)
        Exception Codes: 0x00000000, 0x00000000
        Crashed Thread:  0

        Thread 0 Crashed:
        0   libSystem.B.dylib   0x000790a0 __kill + 8
        1   libSystem.B.dylib   0x00079090 kill + 4
        2   libSystem.B.dylib   0x00079082 raise + 10
        3   libSystem.B.dylib   0x0008d20a abort + 50
        4   libstdc++.6.dylib   0x00044a1c __gnu_cxx::__verbose_terminate_handler() + 376
        5   libobjc.A.dylib     0x000057c4 _objc_terminate + 104
        6   libstdc++.6.dylib   0x00042dee __cxxabiv1::__terminate(void (*)()) + 46
        7   libstdc++.6.dylib   0x00042e42 std::terminate() + 10
        8   libstdc++.6.dylib   0x00042f12 __cxa_throw + 78
        9   libobjc.A.dylib     0x000046a4 objc_exception_throw + 64
        10  CoreFoundation      0x00090c6e +[NSException raise:format:arguments:] + 74
        11  CoreFoundation      0x00090d38 +[NSException raise:format:] + 28
        12  Foundation          0x00002600 -[NSCFDictionary setObject:forKey:] + 184
        13  iPadMosaic          0x00003282 -[iPadMosaicViewController getAlbumThumbs] (iPadMosaicViewController.m:468)
        14  Foundation          0x000728fe __NSFireDelayedPerform + 314
        15  CoreFoundation      0x00022d1c CFRunLoopRunSpecific + 2092
        16  CoreFoundation      0x000224da CFRunLoopRunInMode + 42
        17  GraphicsServices    0x000030d4 GSEventRunModal + 108
        18  GraphicsServices    0x00003180 GSEventRun + 56
        19  UIKit               0x000034c2 -[UIApplication _run] + 374
        20  UIKit               0x000019ec UIApplicationMain + 636
        21  iPadMosaic          0x00002234 main (main.m:14)
        22  iPadMosaic          0x00002204 start + 32

    My understanding here is that I am botching the dictionary add somehow. The relevant lines of code are:

        for (NSDictionary *album in self.albumList) {
            // Get image for each album cover
            UIImage *albumCover;
            // Loop through photos to get URL of cover based on photo ID match
            NSString *coverURL = @"";
            for (NSDictionary *photo in self.photoList) {
                if ([[photo objectForKey:@"pid"] isEqualToString:[album objectForKey:@"cover_pid"]]) {
                    coverURL = [photo objectForKey:@"src"];
                }
            }
            NSURL *albumCoverURL = [NSURL URLWithString:coverURL];
            NSData *albumCoverData = [NSData dataWithContentsOfURL:albumCoverURL];
            albumCover = [UIImage imageWithData:albumCoverData];
            if (albumCover == nil || albumCover == NULL) {
                //NSLog(@"No album cover for some reason");
                albumCover = [UIImage imageNamed:@"noImage.png"];
            }
            [[self.albumList objectAtIndex:albumCurrent] setObject:albumCover forKey:@"coverThumb"];
        }

    This is part of a loop that runs over the existing dictionaries stored in an array. If retrieving the album cover fails for some reason, the object is filled with a default image and then added. The last line of the code is what shows up in the crash log. It runs fine in the simulator but apparently crashes 100% of the time in on-device testing. Can anyone tell me what I am missing here?

    Read the article

  • Interface builder UIButton custom background image not working on simulator/device

    - by xenonii
    I'm trying to do something really simple. I have an image for a button and I'm trying to set it on a custom button in Interface Builder. I set the background image accordingly (no case-sensitivity problem here). In Interface Builder the image shows up, but in the simulator or on the device it doesn't appear at all; just the button's text appears. Do I need to turn on some flag or something of the sort?
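
    A quick way to narrow this down (a sketch, not from the original post; the image and outlet names are hypothetical) is to set the background image in code and log whether the image even loads from the bundle:

        UIImage *bg = [UIImage imageNamed:@"buttonBackground.png"];
        NSLog(@"background image: %@", bg);   // nil here means the PNG is not in the app bundle/target
        [myButton setBackgroundImage:bg forState:UIControlStateNormal];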

    Read the article

  • Modifying PROCTHROTTLEMAX with powercfg has no effect in 2008 R2

    - by AlexC
    I am trying to make the CPU transition to a lower P-state. I used pwrtest to determine the tests, and now I want to set the processor frequency to 50%. I executed the following command:

        powercfg -setacvalue SCHEME_BALANCED SUB_PROCESSOR PROCTHROTTLEMAX 50

    When I query the scheme, the value is set to the desired value. However, the processor frequency is not modified (I am using CPU-Z to check the frequency). My system is running Windows 2008 R2. Any ideas? Thanks!
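
    One thing worth checking (an assumption, not something the poster confirmed): values changed with -setacvalue generally only take effect once the scheme is re-applied, e.g. with powercfg -setactive SCHEME_BALANCED, so re-activating the balanced plan after the change is a cheap first test.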

    Read the article

  • NSUserDefaults not present on first run on simulator

    - by Ben Collins
    I've got some settings saved in my Settings.bundle, and I try to use them in application:didFinishLaunchingWithOptions:, but on a first run on the simulator, accessing objects by key always returns nil (or 0 in the case of ints). Once I go to the Settings screen and then exit, they work fine for every run thereafter. What's going on? Isn't the point of providing default values in the Settings.bundle to be able to use them without requiring the user to enter them first?
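
    The behaviour described is consistent with Settings.bundle defaults not being written into NSUserDefaults until the Settings app has actually been visited. A common workaround (a sketch, assuming the standard Root.plist layout inside Settings.bundle) is to read the specifiers at launch and register them manually:

        NSString *plistPath = [[[NSBundle mainBundle] pathForResource:@"Settings" ofType:@"bundle"]
                                  stringByAppendingPathComponent:@"Root.plist"];
        NSDictionary *settings = [NSDictionary dictionaryWithContentsOfFile:plistPath];
        NSMutableDictionary *defaults = [NSMutableDictionary dictionary];

        for (NSDictionary *specifier in [settings objectForKey:@"PreferenceSpecifiers"]) {
            NSString *key = [specifier objectForKey:@"Key"];
            id defaultValue = [specifier objectForKey:@"DefaultValue"];
            if (key && defaultValue) {
                [defaults setObject:defaultValue forKey:key];   // seed each declared default
            }
        }
        [[NSUserDefaults standardUserDefaults] registerDefaults:defaults];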

    Read the article

  • UITextField keyboard difference between simulator and real device

    - by Raphael Pinto
    Hi, I am trying to get a float value from a UITextField. On the simulator (3.1.3, in English) I have to use '.' to separate my float values, but on my iPhone 3GS (3.1.3, in French) I have to use ','. If I use '.' on my iPhone 3GS, my float is truncated: 3.22222 becomes 3.0000000. Is there a way to detect the locale and use the right separator automatically?
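
    A locale-aware way to do this (a sketch; textField stands in for whatever UITextField outlet is being read) is to parse the text with an NSNumberFormatter tied to the current locale, so the decimal separator follows the device's language settings:

        NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
        [formatter setLocale:[NSLocale currentLocale]];
        [formatter setNumberStyle:NSNumberFormatterDecimalStyle];

        NSNumber *parsed = [formatter numberFromString:textField.text];   // nil if the text is not a valid number
        float value = [parsed floatValue];
        [formatter release];   // pre-ARC, matching the era of this code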

    Read the article

  • What is the iPhone simulator IP address?

    - by Chris G
    Hi, I have been looking for the answer to this question for some time. I am doing network programming for the iPhone, and I need to use the IP address of the device. This isn't a problem on a physical device, as it has its own IP address on the network, but I was wondering what the situation is on the simulator. Does it get assigned an IP address of its own? Thanks in advance for any help, CG
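
    The simulator runs as a process on the Mac and shares the host's network stack, so the addresses it sees are the Mac's own interfaces rather than a separate simulator IP. A minimal sketch (plain C, usable from Objective-C on both device and simulator) that lists the IPv4 address of each interface at runtime:

        #include <ifaddrs.h>
        #include <sys/socket.h>
        #include <netinet/in.h>
        #include <arpa/inet.h>
        #include <stdio.h>

        /* Print the IPv4 address of every interface; on a device "en0" is normally
           the Wi-Fi interface, while on the simulator these are the Mac's interfaces. */
        void printIPv4Addresses(void) {
            struct ifaddrs *interfaces = NULL;
            if (getifaddrs(&interfaces) != 0) {
                return;
            }
            for (struct ifaddrs *ifa = interfaces; ifa != NULL; ifa = ifa->ifa_next) {
                if (ifa->ifa_addr && ifa->ifa_addr->sa_family == AF_INET) {
                    char buf[INET_ADDRSTRLEN];
                    struct sockaddr_in *sin = (struct sockaddr_in *)ifa->ifa_addr;
                    inet_ntop(AF_INET, &sin->sin_addr, buf, sizeof(buf));
                    printf("%s: %s\n", ifa->ifa_name, buf);
                }
            }
            freeifaddrs(interfaces);
        }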

    Read the article

  • iPhone audio plays on simulator but not on device

    - by amarsh-anand
    The following code plays fine on the simulator, but the audio doesn't play on the actual device. I have tried aif, wav and mp3 ... all three with the same behaviour. Please suggest what could be wrong.

        SystemSoundID aSound;
        AudioServicesCreateSystemSoundID(CFBundleCopyResourceURL(CFBundleGetMainBundle(),
                                                                 CFSTR("drop"), CFSTR("wav"), NULL), &aSound);
        AudioServicesPlaySystemSound(aSound);

    Read the article
