Search Results

Search found 3887 results on 156 pages for 'processor simulator'.


  • Computer makes hissing noise, turns off after few seconds

    - by Kaustubh P
    I have a problem similar to the questions posted here and here. This is my config: Asus M3N78-EM with AMD Phenom X3 720 2800 Black Edition, 4 GB Transcend DDR2 RAM, and an Nvidia 9400GT. The HD is a 160 GB IDE drive, plus an LG IDE DVD-ROM. The power button is a bit off; I have removed the cover of the switch, and the only way it turns on is by giving the "stick" under the cover a gentle press. It turns on sometimes, and at other times I have to cut off the power from the PSU and try again. I will describe my problem in as much detail as possible, please bear with me.

    The problem started in the last week, a few months after I changed to the power-switch arrangement described above. The PC makes a hissing noise, and I wasn't able to pinpoint the source because of the various other fans. At first, removing the HD, rebooting without the HD, turning it off, reconnecting and booting made the problem go away. But of late, it doesn't. As suggested in the other questions, I tried reducing the load by disconnecting both IDE drives, and the problem (noise + turn-off) still occurs. I also connected another 80 GB IDE HD this morning, and it still made that noise and turned off. I also opened up the PSU, but I couldn't see any fault in it. I tried rotating the fan by blowing into the blades and with my fingers, but the hissing noise didn't come from there; or maybe the speed wasn't enough to evoke that noise.

    A few weeks ago I had cleaned the cabinet and repasted the processor and its fan with some thermal paste. Could that be at fault? I also used a vacuum to blow the dust out of the PSU; could the power have been too much, enough to maybe offset the fan or something? A label on the PSU says it uses a ball-bearing fan. That only leaves me with the processor fan and the processor itself. I didn't try removing the processor fan and processor from the motherboard and then turning the PC on, fearing damage. Will doing so cause any damage? What can I do to localize and pinpoint the problem?

    Also, after a few tries, the computer starts up. Sometimes it turns off within 2 seconds, sometimes after the POST. Once it turned off at GRUB. Another time it booted completely and then turned off. The only way to ensure that the PC won't turn off is if the hissing noise stops.

    EDIT: I suspect the processor/processor fan, owing to the source of the noise. All the components, except for the cabinet, are just over a year old.

    EDIT2: I also just remembered that I had set "On-power resume" to turn on, i.e. if I supply the PC with power, it turns itself on without me needing to press the switch. I had done that to work around the faulty power switch, as noted above.

    EDIT3: I calculated the power my system needs on the Antec site, and arrived at 292 W.

    Read the article

  • Performance monitoring on Linux/Unix

    - by ervingsb
    I run a few Windows servers as well as (Debian and Ubuntu) Linux and AIX servers. I would like to continuously monitor performance on these systems in order to easily identify bottlenecks and to have an overview of the general activity on the servers. On Windows, I use Windows Performance Monitor (perfmon) for this, with these counters set up:

    For bottlenecks:
    - Processor utilization: System\Processor Queue Length
    - Memory utilization: Memory\Pages Input/Sec
    - Disk utilization: PhysicalDisk\Current Disk Queue Length\driveletter
    - Network problems: Network Interface\Output Queue Length\nic name

    For general activity:
    - Processor utilization: Processor\% Processor Time_Total
    - Memory utilization: Process\Working Set_Total (or per specific process)
    - Memory utilization: Memory\Available MBytes
    - Disk utilization: PhysicalDisk\Bytes/sec_Total (or per process)
    - Network utilization: Network Interface\Bytes Total/Sec\nic name

    (More information on the choice of these counters at http://itcookbook.net/blog/windows-perfmon-top-ten-counters)

    This works really well; it allows me to look in one place and identify the most common bottlenecks. So my question is: how can I do something equivalent (or just very similar) on Linux servers? I have looked a bit at nmon (http://www.ibm.com/developerworks/aix/library/au-analyze_aix/), a free performance monitoring tool developed for AIX but also available for Linux. However, I am not sure if nmon lets me set up the above counters; maybe Linux and AIX do not expose these exact same measures. If so, which ones should I choose and why? And if nmon is not the tool to use for this, what do you recommend?
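
    For a rough sense of the Linux equivalents, here is a minimal sketch using Python's psutil library (an illustration only; the mapping to the perfmon counters is approximate, and several psutil values are cumulative since boot, so you would diff them over an interval):

    ```python
    # Approximate Linux analogues of the perfmon counters above, via psutil.
    # Assumes `pip install psutil`; the mapping is rough, not one-to-one.
    import os
    import psutil

    print("CPU %:", psutil.cpu_percent(interval=1))    # ~ Processor\% Processor Time
    print("Load (1 min):", os.getloadavg()[0])         # ~ System\Processor Queue Length
    print("Available MB:", psutil.virtual_memory().available // 2**20)  # ~ Available MBytes
    print("Swapped in (bytes):", psutil.swap_memory().sin)  # ~ Memory\Pages Input, cumulative
    disk = psutil.disk_io_counters()
    print("Disk bytes r/w:", disk.read_bytes, disk.write_bytes)  # cumulative, diff over time
    net = psutil.net_io_counters()
    print("NIC bytes tx/rx:", net.bytes_sent, net.bytes_recv)    # cumulative, diff over time
    ```

    On the command line, sar (from the sysstat package), vmstat and iostat record the same classes of metrics over time.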

    Read the article

  • List of all IBM motherboard models with a certain socket?

    - by Ricket
    I just got a really good deal on two Intel quad-core Xeon L5420 processors, and I have access to other deals on bare IBM servers (case + motherboard, no processor/RAM/HDD). How can I easily find out which server or motherboard models will be compatible with this processor? I am ideally looking for a dual-processor motherboard, and I see that the CPU uses Socket LGA771. So I guess the underlying question is: how can I find which IBM motherboards (and servers) have dual Socket LGA771?

    Read the article

  • What is the difference between these Pentium Extreme Edition CPUs?

    - by Giffyguy
    The CPU in question is the Pentium Extreme Edition 955. Intel's website shows four "versions", but for the most part they all look identical; they even share the same set of ordering codes. But one of them has a substantially lower TDP, which seems inexplicable, since everything else is the same. Two of them say "LGA775, Tray", and I have no idea what "Tray" means either. Also, two of them have a different SPEC code. What I need to know is:

    - What does "LGA775, Tray" mean?
    - Why does the one CPU have a lower TDP, and what does that mean for me? Does it mean lower maximum power consumption? Does it mean the CPU may be more stable or more durable because of a lower heat output?
    - Why do two of them have a different SPEC code, and what does this mean?
    - Finally, what does PLGA775 (as opposed to LGA775) mean, and do I need to be worried about that?

    Information from Intel's website (all four are the Intel® Pentium® Processor Extreme Edition 955, 4M Cache, 3.46 GHz, 1066 MHz FSB):

    | # | Packaging | Socket | Package | Stepping | TDP | Ordering code | SPEC code |
    |---|-----------|--------|---------|----------|-----------|----------------|-------|
    | 1 | Boxed | LGA775 | PLGA775 | B1 | 95 Watts | BX80553955 | SL94N |
    | 2 | Tray | LGA775 | PLGA775 | B1 | 130 Watts | HH80553PH0994M | SL94N |
    | 3 | Boxed | LGA775 | PLGA775 | B1 | 130 Watts | BX80553955 | SL8WM |
    | 4 | Tray | LGA775 | PLGA775 | B1 | 130 Watts | HH80553PH0994M | SL8WM |

    Read the article

  • What are these CPU cache settings? Snoop Filter, ACL prefetch, HW prefetch

    - by eater
    I was in my BIOS setup turning on VT-x support today and saw these other settings. A little googling indicates that they each seem to turn on some sort of optimization to do with the CPU's L2 cache. They were all turned off by default. The processor in question is an Intel Xeon quad-core 3.4 GHz (X5492). My OS is Linux 2.6.35.10-74.fc14.x86_64 #1 SMP Thu Dec 23 16:04:50 UTC 2010 x86_64 x86_64 x86_64 GNU/Linux. I have 4 GB of RAM, if that matters. Here's what the BIOS manufacturer has to say:

    Snoop Filter: "Enabling the snoop filter typically improves performance by reducing snoop traffic on the frontside bus in dual processor configurations." Well, I like the sound of improved performance. Why would the BIOS have this off by default? Or by dual processor do they not mean multi-core? Regardless, is there a downside if this is on?

    ACL Prefetch: "When enabled, the Adjacent Cache Line Prefetcher fetches both cache lines that comprise a cache line pair when it determines required data is not currently in its cache. When disabled, the processor will only fetch the cache line required by the processor."

    HW Prefetch: "Fetches an extra line of data into L2 from external memory."

    Both of these sound like optimizations that have some drawbacks. What are the reasons to turn them on? What are the reasons to leave them off? Why is the default off?
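
    Incidentally, on Linux you can check what the BIOS actually set without rebooting. A hedged sketch (assuming root, the msr kernel module, and a Core-microarchitecture CPU like the X5492, for which Intel's SDM documents these bits in IA32_MISC_ENABLE; bit meanings vary by CPU family):

    ```python
    # Read IA32_MISC_ENABLE (MSR 0x1A0) on CPU 0 to inspect prefetcher state.
    # Requires root and `modprobe msr`. Bit positions are per Intel's SDM for
    # Core-microarchitecture CPUs and differ on other families - a sketch only.
    import os
    import struct

    fd = os.open("/dev/cpu/0/msr", os.O_RDONLY)
    val = struct.unpack("<Q", os.pread(fd, 8, 0x1A0))[0]
    os.close(fd)

    print("HW prefetcher disabled: ", bool(val >> 9 & 1))    # bit 9
    print("ACL prefetcher disabled:", bool(val >> 19 & 1))   # bit 19
    ```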

    Read the article

  • Does duplication of architectural state mean physically extra registers?

    - by Doopy Doo
    Hyper-Threading Technology makes a single physical processor appear as two logical processors; the physical execution resources are shared and the architectural state is duplicated for the two logical processors.

    So, does this mean that there are two sets of basic registers (such as the next-instruction pointer and processor registers like AX, BX, CX, etc.) physically embedded in the microprocessor chip, or is the architectural state made to look like two sets by some low-level duplication in software or the OS?
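
    One quick way to see that the duplication is visible to software: the OS enumerates one schedulable CPU per architectural state, so the logical count is twice the physical core count when Hyper-Threading is on. A small sketch (assuming Python's psutil, purely as an illustration):

    ```python
    # Logical CPUs (one per architectural state) vs physical cores.
    import psutil

    print("physical cores:", psutil.cpu_count(logical=False))
    print("logical CPUs:  ", psutil.cpu_count(logical=True))  # 2x physical with HT
    ```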

    Read the article

  • What is the official Microsoft name for Windows 8 versions: Intel compatible vs. ARM?

    - by Clay Nichols
    Windows 8 will, AFAIK, be available in two very different flavors: one that supports old Windows programs (Intel processors, I think), and one for ARM processors which does NOT support x86 programs. I need to know how to refer to these so that customers clearly know which version of Windows we (currently) support. It looks like the terminology is:

    - Windows 8: backward compatible with Win32 apps.
    - Windows RT: runs on ARM-based processor devices (probably mainly tablets) and does not support Win32/x86 apps.

    Read the article

  • Node.js Lockstep Multiplayer Architecture

    - by Wakaka
    Background

    I'm using the lockstep model for a multiplayer Node.js/Socket.IO game in a client-server architecture. User input (mouse or keypress) is parsed into commands like 'attack' and 'move' on the client, which are sent to the server and scheduled to be executed on a certain tick. This is in contrast to sending state data to clients, which I don't wish to use due to bandwidth issues. Each tick, the server will send the list of commands on that tick (possibly empty) to each client. The server and all clients will then process the commands and simulate that tick in exactly the same way. With Node.js this is actually quite simple due to the possibility of code sharing between server and client: I'll just put the deterministic simulator in the /shared folder, which can be run by both server and client. The server simulation is required so that there is an authoritative version of the simulation which clients cannot alter.

    Problem

    Now, the game has many entity classes, like Unit, Item, Tree etc. Entities are created in the simulator. However, each class has some methods that are shared and some that are client-specific. For instance, the Unit class has an addHp method which is shared. It also has methods like getSprite (gets the image of the entity), isVisible (checks if the unit can be seen by the client), onDeathInClient (does a bunch of client-only things when it dies, like adding announcements) and isMyUnit (a quick check whether the client owns the unit). Up till now, I have been piling all the client functions into the shared Unit class and adding a this.game.isServer() check when necessary. For instance, when the unit dies, it will call if (!this.game.isServer()) { this.onDeathInClient(); }. This approach has worked pretty well so far in terms of functionality, but as the codebase grows bigger, this style of coding seems a little strange. Firstly, the client code is clearly not shared, and yet is placed under the /shared folder. Secondly, client-specific variables for each entity are also instantiated on the server entity (like unit.sprite), which runs into problems when the server cannot instantiate the variable (it doesn't have an Image class like browsers do). So my question is: is there a better way to organize the client code, or is this a common way of doing things for lockstep multiplayer games? I can think of a possible workaround, but it has its own problems.

    Possible workaround (with problems)

    I could use JavaScript mixins that are only added when in a browser. Thus, in the /shared/unit.js file I would have this code at the end: if (typeof exports !== 'undefined') module.exports = Unit; else mixin(Unit, LocalUnit); Then /client/localunit.js would store an object LocalUnit of client-side methods for Unit. Now, I already have a publish-subscribe system in place for events in the simulator. To remove the this.game.isServer() checks, I could publish entity-specific events whenever I want the client to do something. For instance, I would do this.publish('Death') in /shared/unit.js and this.subscribe('Death', this.onDeathInClient) in /client/localunit.js. But this would make the simulator's event listener lists different on the server and the client; now if I want to clear all subscribed events from the shared simulator alone, I can't. Of course, it is possible to create two event subscription systems (one client-specific and one shared), but then the publish() method would have to do if (!this.game.isServer()) { this.publishOnClient(event); }. All in all, the workaround off the top of my head seems pretty complicated for something as simple as separating the client and shared code. So I wonder: is there an established and simpler method for better code organization, hopefully specific to Node.js games?

    Read the article

  • Subscription based eCommerce Site selling Physical Goods

    - by Kash
    I want to implement a system where customers can buy physical products as a subscription with recurring billing. Customers will come to my site, visit a product page and buy stuff. They need to register an account before purchasing. After registration they will be forwarded to the payment processor for payment, and after successful payment they will be returned to my site.

    Requirements:
    - I don't want to use PayPal as the payment processor.
    - It will be a recurring payment on a monthly, quarterly, bi-annual or annual basis.
    - Customers can log in to my site to view their order history, manage their profile, and edit their billing and/or shipping details.
    - Customers can cancel their subscription at any time after logging in to their account on my site.

    Plan: Currently I am using WordPress with the WooCommerce plugin for eCommerce functionality. I am planning to use Amazon as the payment processor. I have searched the WooCommerce extension directory and found the WC Subscription extension, which could work for my site. But I just confirmed with Support that WC Subscription only supports recurring payment with PayPal and Stripe; no other payment processor is supported at the moment.

    Questions: Given my needs, what would be the best solution? I am open to switching the platform from WordPress to something else that can fulfill my requirements.

    Read the article

  • Questioning the motivation for dependency injection: Why is creating an object graph hard?

    - by oberlies
    Dependency injection frameworks like Google Guice give the following motivation for their usage (source): "To construct an object, you first build its dependencies. But to build each dependency, you need its dependencies, and so on. So when you build an object, you really need to build an object graph. Building object graphs by hand is labour intensive (...) and makes testing difficult."

    But I don't buy this argument: even without dependency injection, I can write classes which are both easy to instantiate and convenient to test. E.g. the example from the Guice motivation page could be rewritten in the following way:

        class BillingService {
            private final CreditCardProcessor processor;
            private final TransactionLog transactionLog;

            // constructor for tests, taking all collaborators as parameters
            BillingService(CreditCardProcessor processor, TransactionLog transactionLog) {
                this.processor = processor;
                this.transactionLog = transactionLog;
            }

            // constructor for production, calling the (production) constructors of the collaborators
            public BillingService() {
                this(new PaypalCreditCardProcessor(), new DatabaseTransactionLog());
            }

            public Receipt chargeOrder(PizzaOrder order, CreditCard creditCard) { ... }
        }

    So dependency injection may really be an advantage in advanced use cases, but I don't need it for easy construction and testability, do I?

    Read the article

  • Cannot get temperatures in Dell Studio 1558

    - by Athul Iddya
    I could never get proper temperatures on my Dell Studio 1558; lm-sensors and acpi give wrong readings. The output of sensors is:

        $ sensors
        acpitz-virtual-0
        Adapter: Virtual device
        temp1:        +26.8°C  (crit = +100.0°C)
        temp2:         +0.0°C  (crit = +100.0°C)

    acpi -V gives me:

        $ acpi -V
        Battery 0: Full, 100%
        Battery 0: design capacity 414 mAh, last full capacity 369 mAh = 89%
        Adapter 0: on-line
        Thermal 0: ok, 0.0 degrees C
        Thermal 0: trip point 0 switches to mode critical at temperature 100.0 degrees C
        Thermal 0: trip point 1 switches to mode passive at temperature 95.0 degrees C
        Thermal 0: trip point 2 switches to mode active at temperature 71.0 degrees C
        Thermal 0: trip point 3 switches to mode active at temperature 55.0 degrees C
        Thermal 1: ok, 26.8 degrees C
        Thermal 1: trip point 0 switches to mode critical at temperature 100.0 degrees C
        Thermal 1: trip point 1 switches to mode active at temperature 71.0 degrees C
        Thermal 1: trip point 2 switches to mode active at temperature 55.0 degrees C
        Cooling 0: LCD 0 of 15
        Cooling 1: Processor 0 of 10
        Cooling 2: Processor 0 of 10
        Cooling 3: Processor 0 of 10
        Cooling 4: Processor 0 of 10
        Cooling 5: Fan 0 of 1
        Cooling 6: Fan 0 of 1

    I suspect even hddtemp gives bogus readings, as it's always at 46:

        $ sudo hddtemp /dev/sda
        /dev/sda: ST9500420AS: 46°C

    I have gone through some bug reports, and some people used to have the same problem after resuming from suspend, but I always have this problem. I updated to the latest BIOS from Windows a couple of weeks ago; will updating from Ubuntu change anything?

    CORRECTION: hddtemp's readings do change. It's now at 45.
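
    As a cross-check, the ACPI thermal zones can be read straight out of sysfs; if the firmware reports bogus values there too, that would point at the BIOS/ACPI tables rather than lm-sensors. A sketch (standard sysfs paths on recent kernels; which zones exist varies by machine):

    ```python
    # Dump ACPI thermal zone temperatures directly from sysfs.
    import glob

    for zone in sorted(glob.glob("/sys/class/thermal/thermal_zone*")):
        with open(zone + "/type") as f:
            ztype = f.read().strip()
        with open(zone + "/temp") as f:
            millic = int(f.read().strip())   # millidegrees Celsius
        print(f"{zone} ({ztype}): {millic / 1000:.1f} °C")
    ```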

    Read the article

  • UITableView backgroundColor always gray on iPad

    - by rjobidon
    Hi, when I set the backgroundColor for my UITableView it works fine on iPhone (device and simulator) but NOT on the iPad simulator. Instead I get a light gray background for any color I set, including groupTableViewBackgroundColor. Steps to reproduce:

    1. Create a new navigation-based project.
    2. Open RootViewController.xib and set the table view style to "Grouped".
    3. Add this responder to the RootViewController:

           - (void)viewDidLoad {
               [super viewDidLoad];
               self.view.backgroundColor = [UIColor blackColor];
           }

    4. Select Simulator SDK 3.2, build and run. You will get a black background (device and simulator).
    5. Select your target in the project tree.
    6. Click on Project : Upgrade Current Target for iPad.
    7. Build and run. You will get a light gray background.
    8. Revert the table view style to Plain and you will get a black background.

    Thanks for your help!

    Read the article

  • Unable to change the value of the variable

    - by Legend
    I'm using a discrete event simulator called ns-2 that was built using Tcl and C++. I was trying to write some code in Tcl:

        set ns [new Simulator]
        set state 0

        $ns at 0.0 "puts \"At 0.0 value of state is: $state\""
        $ns at 1.0 "changeVal"
        $ns at 2.0 "puts \"At 2.0 values of state is: $state\""

        proc changeVal {} {
            global state
            global ns
            $ns at-now "set state [expr $state+1]"
            puts "Changed value of state to $state"
        }

        $ns run

    Here's the output:

        At 0.0 value of state is: 0
        Changed value of state to 0
        At 2.0 values of state is: 0

    The value of state does not seem to change. I am not sure if I am doing something wrong in using Tcl. Does anyone have an idea as to what might be going wrong here?

    EDIT: Thanks for the help. Actually, ns-2 is something over which I do not have much control (unless I recompile the simulator itself). I tried out the suggestions, and here's the output. For the code:

        set ns [new Simulator]
        set state 0

        $ns at 0.0 "puts \"At 0.0 value of state is: $state\""
        $ns at 1.0 "changeVal"
        $ns at 9.0 "puts \"At 2.0 values of state is: $state\""

        proc changeVal {} {
            global ns
            set ::state [expr {$::state+1}]
            $ns at-now "puts \"At [$ns now] changed value of state to $::state\""
        }

        $ns run

    the output is:

        At 0.0 value of state is: 0
        At 1 changed value of state to 1
        At 2.0 values of state is: 0

    And for the code:

        set ns [new Simulator]
        set state 0

        $ns at 0.0 "puts \"At 0.0 value of state is: $state\""
        $ns at 1.0 "changeVal"
        $ns at 9.0 "puts \"At 2.0 values of state is: $state\""

        proc changeVal {} {
            global ns
            set ::state [expr {$::state+1}]
            $ns at 1.0 {puts "At 1.0 values of state is: $::state"}
        }

        $ns run

    the output is:

        At 0.0 value of state is: 0
        At 1.0 values of state is: 1
        At 2.0 values of state is: 0

    Doesn't seem to work... Not sure if it's a problem with ns-2 or my code...

    Read the article

  • MIPS assembly: big and little endian confusion

    - by Barney
    I've run the following code snippet on the MIPS MARS simulator. That simulator is little endian. So the results are as follows:

        lui $t0,0x1DE        # $t0 = 0x01DE0000
        ori $t0,$t0,0xCADE   # $t0 = 0x01DECADE
        lui $t1,0x1001       # $t1 = 0x10010000
        sw  $t0,200($t1)     # $t1 + 200 bytes = 0x01DECADE
        lw  $t2,200($t1)     # $t2 = 0x01DECADE

    So on a little endian MIPS simulator, the value of $t2 at the end of the program is 0x01DECADE. If this simulator were big endian, what would the value be? Would it be 0xDECADE01 or would it still be 0x01DECADE?
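
    For intuition about what byte order does and does not affect, here is a quick Python sketch (an illustration, not MIPS-specific): endianness changes how the bytes of the word are laid out in memory, but a store followed by a load of the same width round-trips identically either way.

    ```python
    # 0x01DECADE in memory under each byte order.
    import struct

    little = struct.pack("<I", 0x01DECADE)   # b'\xde\xca\xde\x01'
    big    = struct.pack(">I", 0x01DECADE)   # b'\x01\xde\xca\xde'
    print(little.hex(), big.hex())           # decade01 01decade

    # A same-width store/load round-trip is unaffected by endianness:
    print(hex(struct.unpack("<I", little)[0]))   # 0x1decade
    print(hex(struct.unpack(">I", big)[0]))      # 0x1decade
    ```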

    Read the article

  • Interface Builder "Simulate Interface" not working

    - by bpapa
    I am using Interface Builder to play around with some ideas. I never noticed that there is a "Simulate Interface" feature which apparently will render the nib in the iPhone simulator. So, I created a view, put one component in there (a Segmented Control), saved it, selected "Simulate Interface", the simulator launched but... nothing rendered in the simulator. Just a black screen. I thought maybe my nib wasn't complete enough, so I've tried it with all of my old nibs and I'm having the same problem with all of them. None of them render in the simulator at all. Is there some trick that I'm missing?

    Read the article

  • iPad tab bar rotation

    - by MaKo
    Hi, please help with this noob question, it's really making me go crazy. If I create a project from scratch (using the window-based app template) for the iPad and add a tab bar with a FirstViewController and a SecondViewController, it works fine and starts in landscape mode. But in Info.plist I set it to Landscape (left home button), and in the simulator it actually starts with the button on the right side! In FirstViewController.m I have:

        - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
            if (interfaceOrientation == UIInterfaceOrientationLandscapeLeft ||
                interfaceOrientation == UIInterfaceOrientationLandscapeRight)
                return YES;
            else
                return NO;
        }

    So it starts in landscape and rotates as the simulator rotates. But if I create an app from the iPhone tab bar template, set Initial interface orientation in Info.plist to Landscape (left home button) and add the code above, IT DOESN'T WORK! The simulator starts with the button at the left but the tab bar on the side, the same problem I have with an app that I'm porting from iPhone to iPad (intended for landscape): I get it to start in landscape, but the tab bar remains on the side!

    Also, the only way to make the old ported app start the simulator on its side was with the UIInterfaceOrientation key set to UIInterfaceOrientationLandscapeLeft (it didn't work with Initial interface orientation, which does not let me choose the value for the key, but it does show the simulator in landscape).

    So, what can I do to show the tab bar in landscape mode? The tab-bar-from-scratch project was made to see if the code would work, but it didn't. Why does it work in the tab bar made from the window-based app and not in the tab bar app? I just want the tab bar to show in landscape. Thanks!

    Read the article

  • Static Libraries on iPhone device

    - by Akusete
    I have two projects, a Cocoa iPhone application and a static library which it uses. I've tested it successfully on the iPhone simulator, but when I try to deploy it to my iPhone device I get "symbol not found" link errors. If I remove the dependency on the library, the project builds and runs fine. I have made sure all the build settings are set to iPhoneOS, not the simulator. I'm sure it's something simple, but has anyone run into similar problems moving from the iPhone simulator to the device?

    EDIT: I have managed to create new projects (one for the application and one for the static library) and successfully get them to run on the iPhone or the simulator. But I have a very strange problem: for each specific project I cannot get it working for BOTH the device and the simulator. I have double-checked the build settings and made sure the libraries being referenced match the build settings (I believe), but I cannot resolve these linking errors. I think I must be doing something very wrong; all the Apple documentation says "it's super simple - one click", but this is giving me a lot of problems. This is probably something to do with Xcode build settings, but I cannot seem to understand why selecting the different build platforms and rebuilding the libraries does not work.

    Read the article

  • Transferring a container of data to different classes

    - by user340699
    I am passing a vector of bids from the Trader class to the Simulator class, which then passes it on to the Auctioneer class. Something seems messed up; can anyone spot it, please? Below is part of the code.

    Error: 199: expected primary-expression before '&' token

        // Class of origin of the vector
        class Trader {
        private:
            int nextBidId;
        public:
            Trader();
            ~Trader(){};
            Bid getNextBid();
            Bid getNextBid(int trdId, int qty, int price, char type);
            void loadRange( vector <Bid> & bids ) {};
            void loadRange(BidList &, int trdId, int qty, int price, char type, int size);
        };

        // To be received by the Simulator
        class Simulator {
            vector <Bid> list;
            Trader trader;
            Auctioneer auctioneer;
        public:
            void run();
        };

        // Passing the vector into a function in Simulator
        Simulator::accept_bids(bid_vector::const_iterator begin, bid_vector::const_iterator end){
            vector<Bid>::iterator itr;
        }

        // Its journey should end with the Auctioneer, who displays the data
        class Auctioneer {
        public:
            vector <Bid> v2; // created a new vector to hold the objects
            void accept_bids(vector<Bid> & bids);
            void displayBids(){return bids}
        };

    Read the article

  • Segmented control in iPhone not appearing as in Interface Builder

    - by zecougar
    Here is what I see in IB, and here is what appears in the simulator. I've used a segmented control with style = "Bezelled". When I change the style to "Bar", IB and the simulator are consistent in the display. The style is set in Interface Builder and not in code, if that matters. Also, the edges look rather ugly in the simulator; not what I expected even when it rendered incorrectly. Thanks in advance.

    Read the article

  • How to pass vector objects from one class to another

    - by gnwillix
    I am trying to make a program that generates bids automatically, then packs them into a vector and sends them to the Simulator class; the simulator can then pass the bids to the auctioneer. How do I implement this communication? I am thinking of sending the whole vector of bids to the simulator. Can anyone demonstrate a better way of sending vector objects from one class to another?

    Read the article
