Search Results

Search found 25519 results on 1021 pages for 'virtual machine'.

  • How to infer the type of a derived class in base class?

    - by enzi
    I want to create a method that allows me to change arbitrary properties of classes that derive from my base class. The result should look like this: SetPropertyValue("size.height", 50); – where size is a property of my derived class and height is a property of size. I'm almost done with my implementation, but there's one final obstacle I want to solve before moving on. To describe it I first have to explain my implementation a bit:

    - Properties that can be modified are decorated with an attribute.
    - A method in my base class searches for all derived classes and their decorated properties.
    - For each property I generate a "property modifier": a class that contains 2 delegates, one to set and one to get the value of the property.
    - Property modifiers are stored in a dictionary, with the name of the property as key.
    - In my base class, there is another dictionary that contains all property-modifier dictionaries, with the Type of the respective class as key.

    What the SetPropertyValue method does is this:

    1. Get the correct property-modifier dictionary, using the concrete type of the derived class (<- yet to solve).
    2. Get the property modifier of the property to change (e.g. of the property size).
    3. Use the get or set delegate to modify the property's value.

    Some example code to clarify further:

        private static Dictionary<RuntimeTypeHandle, object> EditableTypes; // property-modifier dictionary

        protected void SetPropertyValue<T>(EditablePropertyMap<T> map, string property, object value)
        {
            var modifier = map[property]; // get the property modifier
            modifier.Set((T)this, value); // use the set delegate (encapsulated in a method)
        }

    In the above code, T is the type of the actual (derived) class. I need this type for the get/set delegates. The problem is how to get the EditablePropertyMap<T> when I don't know what T is. My current (ugly) solution is to pass the map in an overridden virtual method in the derived class:

        public override void SetPropertyValue(string property, object value)
        {
            base.SetPropertyValue((EditablePropertyMap<ExampleType>)EditableTypes[typeof(ExampleType)], property, value);
        }

    What this does is: get the correct dictionary containing the property modifiers of this class using the class's type, cast it to the appropriate type and pass it to the SetPropertyValue method. I want to get rid of the SetPropertyValue override in my derived classes (since there are a lot of derived classes), but don't know yet how to accomplish that. I cannot just make a virtual GetEditablePropertyMap<T> method, because I cannot infer a concrete type for T then. I also cannot access my dictionary directly with a type and retrieve an EditablePropertyMap<T> from it, because I cannot cast to it from object in the base class, since again I do not know T. I found some neat tricks to infer types (e.g. by adding a dummy T parameter), but cannot apply them to my specific problem. I'd highly appreciate any suggestions you may have for me.
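
    One direction worth sketching, with the caveat that every name below (PropertyModifier, IEditablePropertyMap, EditableBase) is a hypothetical stand-in for the poster's types: hide T behind a non-generic interface, so the base class can look the map up by GetType() and the cast to T happens inside the map, where T is statically known. No derived class needs an override then:

        using System;
        using System.Collections.Generic;

        // Stand-in for the poster's get/set delegate pair.
        public class PropertyModifier<T>
        {
            public Action<T, object> Set;
            public Func<T, object> Get;
        }

        // Non-generic view of the map: the base class never has to name T.
        public interface IEditablePropertyMap
        {
            void Set(object target, string property, object value);
        }

        public class EditablePropertyMap<T> : IEditablePropertyMap
        {
            private readonly Dictionary<string, PropertyModifier<T>> modifiers =
                new Dictionary<string, PropertyModifier<T>>();

            public void Set(object target, string property, object value)
            {
                // The cast to T happens here, where T is statically known.
                modifiers[property].Set((T)target, value);
            }
        }

        public abstract class EditableBase
        {
            protected static readonly Dictionary<RuntimeTypeHandle, IEditablePropertyMap>
                EditableTypes = new Dictionary<RuntimeTypeHandle, IEditablePropertyMap>();

            public void SetPropertyValue(string property, object value)
            {
                // GetType() returns the concrete derived type at runtime,
                // so no per-class override is required.
                EditableTypes[GetType().TypeHandle].Set(this, property, value);
            }
        }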

  • C++ class member functions instantiated by traits

    - by Jive Dadson
    I am reluctant to say I can't figure this out, but I can't figure this out. I've googled and searched Stack Overflow, and come up empty. The abstract, and possibly overly vague, form of the question is: how can I use the traits pattern to instantiate non-virtual member functions?

    The question came up while modernizing a set of multivariate function optimizers that I wrote more than 10 years ago. The optimizers all operate by selecting a straight-line path through the parameter space away from the current best point (the "update"), then finding a better point on that line (the "line search"), then testing for the "done" condition, and if not done, iterating.

    There are different methods for doing the update, the line search, and conceivably the done test, and other things. Mix and match. Different update formulae require different state-variable data. For example, the LMQN update requires a vector, and the BFGS update requires a matrix. If evaluating gradients is cheap, the line search should do so. If not, it should use function evaluations only. Some methods require more accurate line searches than others. Those are just some examples.

    The original version instantiates several of the combinations by means of virtual functions. Some traits are selected by setting mode bits that are tested at runtime. Yuck. It would be trivial to define the traits with #defines and the member functions with #ifdefs and macros. But that's so twenty years ago. It bugs me that I cannot figure out a whiz-bang modern way.

    If there were only one trait that varied, I could use the curiously recurring template pattern. But I see no way to extend that to arbitrary combinations of traits. I tried doing it using boost::enable_if, etc. The specialized state information was easy. I managed to get the functions done, but only by resorting to non-friend external functions that have the this-pointer as a parameter. I never even figured out how to make the functions friends, much less member functions. The compiler (VC++ 2008) always complained that things didn't match. I would yell, "SFINAE, you moron!" but the moron is probably me.

    Perhaps tag-dispatch is the key. I haven't gotten very deeply into that. Surely it's possible, right? If so, what is best practice?

  • What causes POCO proxy entities to only sometimes be created in Entity Framework 4?

    - by Kohan
    I have set up my POCOs and marked their public properties as virtual, and I am successfully getting proxies most of the time (95%). But at random, EF returns a mix of proxies and non-proxies. Recycling the app pool when this happens fixes that instance of the error, and it goes away for a while. Then it re-occurs in some other seemingly random place. What can cause this sort of behaviour? Thanks, Kohan
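
    For anyone comparing against their own setup, a hedged sketch of when EF4 hands back a proxy (the context and entity names are hypothetical): proxy creation must be enabled on the context, every mapped property must be public virtual, and the instance has to come from a query or CreateObject<T>() – anything constructed with new stays a plain POCO:

        using (var context = new MyEntities()) // hypothetical ObjectContext
        {
            // Defaults to true, but a stray "false" here would explain missing proxies.
            context.ContextOptions.ProxyCreationEnabled = true;

            var proxied = context.CreateObject<Customer>(); // proxy, if Customer qualifies
            var plain = new Customer();                     // never a proxy

            Console.WriteLine(proxied.GetType()); // dynamically generated proxy type
            Console.WriteLine(plain.GetType());   // Customer
        }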

  • How to diagnose and fix an "on-site" crash of a .NET application?

    - by Dmitriy Matveev
    Hello! I'm working on an application which has an auto-update function. The implemented idea is simple, as follows:

    - There is a "starter" application which is installed to "Program Files/whatever/...". It's the application that is intended to be started by the user.
    - Each time the "starter" application is executed, it checks the server for updates and downloads them to "%APPDATA%/some/...". It then starts the real application from that folder.

    The above approach works on my development machine (running Vista) and on some other machines under XP, but on one different machine (running Windows 7) it doesn't. When the "starter" executes the real application, it crashes with an unknown problem (Signature = System.UnauthorizedAccess). When the real application is executed manually from the %APPDATA%/some/ folder, everything works fine. I've tried setting the same working directory in ProcessStartInfo, so the "starter" also executes the real application in that folder, but this hasn't helped. How can I diagnose and/or fix this issue?
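
    A minimal diagnostic sketch (not a fix): hook AppDomain.UnhandledException in the real application so the Windows 7 machine records the full exception rather than just the Watson signature:

        using System;
        using System.IO;

        static class Program
        {
            [STAThread]
            static void Main()
            {
                AppDomain.CurrentDomain.UnhandledException += (sender, e) =>
                {
                    // %APPDATA% is per-user writable, so logging there should not
                    // itself throw UnauthorizedAccessException.
                    string log = Path.Combine(
                        Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
                        "crash.log");
                    File.AppendAllText(log, e.ExceptionObject + Environment.NewLine);
                };

                // ... normal application startup ...
            }
        }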

  • Periodic GPU performance problem

    - by Peter Lillevold
    Hi folks! I have a WinForms application that uses XNA to animate 3D models in a control. The app has been doing just fine for months, but recently I've started to experience periodic pauses in the animation. Setting out to investigate what is going on, I have established these facts:

    1. It (currently) happens on my machine only.
    2. Removing everything from my render loop does not improve the problem.

    In 2. I didn't actually remove everything: I limited my loop to setting the viewport on my GraphicsDevice and then doing a GraphicsDevice.Present.

    Trying to dig further, I fired up PIX to capture some statistics. Screenshots of two PIX runs can be viewed here (Run6) and here (Run14). Run6 is using my original render loop and Run14 is using the bare-bones Present loop. PIX tells me that the GPU is periodically doing something, and I assume this is causing the pauses. What could be the cause of this? Or how do I go about finding out what the GPU is actually doing?

    Note: I'm using XNA 3.1 on a Windows 7 x64 dual-core machine with 8GB RAM.
    Note2: I also posted this question on the XNA Creators forums here.
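
    One cheap way to narrow down where the pause lives, as a hedged sketch (device and viewport stand for the control's existing fields): time the Present call itself, since a driver-side stall usually shows up as Present blocking:

        var timer = new System.Diagnostics.Stopwatch();

        // inside the render loop:
        device.Viewport = viewport;
        timer.Reset();
        timer.Start();
        device.Present();
        timer.Stop();
        if (timer.ElapsedMilliseconds > 50) // arbitrary threshold for "a pause"
            System.Diagnostics.Debug.WriteLine(
                "Present blocked for " + timer.ElapsedMilliseconds + " ms");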

  • some problem with iOS 4.2.1

    - by bicbac
    Hi, I've been developing an app on a MacBook Pro (MBP) so far. Last week one of my friends gave me a new MacBook Air 11" (MBA), so now I can test my code on more than one machine with the same versions of the development tools – both machines have Xcode 3.2.5 and iOS SDK 4.2.1.

    At some point my app started getting terminated suddenly (in the iPhone Simulator), and I was using the MBP. I got no error message whatsoever; it just stops. I reckon the crash comes from dealing with memory, like a 'release'/'double-release' (I'm not 100% sure, though). Anyway, I thought there must be some mistake in my code for sure.

    The confusion starts at this part: with my MBA, on the other hand, I don't see any crash. It just works fine. There is nothing different between the MBA and MBP except the hardware specifications – same code, same versions of Xcode and iOS SDK. Does the fact that there is no crash on the MBA suggest that I have to look somewhere other than the code itself?

    I read some articles and Q&As on iOS 4.2.1 and Xcode 3.2.5 saying that the most recent version of Xcode doesn't recognize iOS 4.2.1, since 4.2.1 came out later than 3.2.5. Is that the reason? I have no idea at this moment what the next move should be. Thanks

  • debug=true in web.config = BAD thing?

    - by MateloT
    We're seeing lots of virtual memory fragmentation and out-of-memory errors, and then the process hits the 3GB limit. Compilation debug is set to true in the web.config, but I get different answers from everyone I ask: does debug="true" cause each .aspx to compile into random areas of RAM, thus fragmenting that RAM and eventually causing the out-of-memory problems?
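
    For what it's worth, debug="true" does disable batch compilation, so each page gets its own assembly, which drives up address-space usage. Whatever the root cause of the fragmentation, the usual hardening step is to make sure debug compilation is off in production – a hedged sketch of the two places it can be forced:

        <!-- web.config for the application: -->
        <configuration>
          <system.web>
            <compilation debug="false" />
          </system.web>
        </configuration>

        <!-- machine.config on the server, to override any app that forgot: -->
        <configuration>
          <system.web>
            <deployment retail="true" />
          </system.web>
        </configuration>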

  • Linux configurations that would affect Java memory usage?

    - by wmacura
    Hi, Background: I have a set of Java background workers I start as part of my webapp. I develop locally on Ubuntu 10.10 and deploy to an Ubuntu 10.04 LTS server (a Media Temple (ve) instance). They're both running the same JVM: Sun JVM 1.6.0_22-b04. As part of the initialization script, each worker is started with explicit Xmx, Xms, and XX:MaxPermGen settings. Yet somehow locally all 10 workers use 250MB, while on the server they use more than 2.7GB.

    I don't know how to begin to track this down. I thought the Ubuntu (and thus, kernel) version might make a difference, but I tried an old 10.04 VM and it behaves as expected. I've noticed that the machine does not seem to ever use memory for buffer or cache (according to htop), which seems a bit strange, but perhaps normal for a server?

    (edited) Some info:

        (server)
        root@devel:/app/axir/target# uname -a
        Linux devel 2.6.18-028stab069.5 #1 SMP Tue May 18 17:26:16 MSD 2010 x86_64 GNU/Linux

        (local)
        wiktor@beastie:~$ uname -a
        Linux beastie 2.6.35-25-generic #44-Ubuntu SMP Fri Jan 21 17:40:44 UTC 2011 x86_64 GNU/Linux

    (edited) Comparing ps output: (ps -eo "ppid,pid,cmd,rss,sz,vsz")

        PPID   PID   CMD                          RSS     SZ      VSZ
        (local)
        1588   1615  java -cp axir-distribution.  25484   234382  937528
        1615   1631  java -cp /home/wiktor/Code/  83472   163059  652236
        1615   1657  java -cp /home/wiktor/Code/  70624   89135   356540
        1615   1658  java -cp /home/wiktor/Code/  37652   77625   310500
        1615   1669  java -cp /home/wiktor/Code/  38096   77733   310932
        1615   1675  java -cp /home/wiktor/Code/  37420   61395   245580
        1615   1684  java -cp /home/wiktor/Code/  38000   77736   310944
        1615   1703  java -cp /home/wiktor/Code/  39180   78060   312240
        1615   1712  java -cp /home/wiktor/Code/  38488   93882   375528
        1615   1719  java -cp /home/wiktor/Code/  38312   77874   311496
        1615   1726  java -cp /home/wiktor/Code/  38656   77958   311832
        1615   1727  java -cp /home/wiktor/Code/  78016   89429   357716
        (server)
        22522  23560  java -cp axir-distribution. 24860   285196  1140784
        23560  23585  java -cp /app/axir/target/a 100764  161629  646516
        23560  23667  java -cp /app/axir/target/a 72408   92682   370728
        23560  23670  java -cp /app/axir/target/a 39948   97671   390684
        23560  23674  java -cp /app/axir/target/a 40140   81586   326344
        23560  23739  java -cp /app/axir/target/a 39688   81542   326168

    They look very similar. In fact, the question now is: why, if I add up the virtual memory usage on the server (3.2GB), does it more closely reflect the 2.4GB of memory used (according to free), yet locally the virtual memory used adds up to a much more substantial 4.7GB but only actually uses ~250MB? It seems that perhaps memory isn't being shared as aggressively (if that's even possible).

    Thank you for your help, Wiktor

  • SQL programming interface to external storage application

    - by Gopala
    My application is a non-relational database application with a Tcl interface to retrieve data. I would like to add a SQL programming interface to my application. Is there any library that converts SQL/PLSQL statements to API calls? It should also support stored procedures. SQLite (embedded) has a 'virtual table' mechanism that suits my requirement, but it lacks a stored procedure feature. -Gopala

  • Converting to MVC3 - some views still want 'System.Web.Mvc, Version=1.0.0.0,

    - by justSteve
    I've used the directions from the release notes and have been able to navigate most pages – my unit tests are not comprehensive, but almost all pass. However, when I attempt to edit an existing user or create a new one, I get the error pasted below – notice that it references Version=1.0.0.0. This project started life as a v1 project and was converted to MVC 2 at the RTM. I'm still working with v2 projects, but no longer any v1. Am I due for a GAC cleansing?

        Server Error in '/' Application.
        Could not load file or assembly 'System.Web.Mvc, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

        === Pre-bind state information ===
        LOG: User = STUDIO11\mUser
        LOG: DisplayName = System.Web.Mvc, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35 (Fully-specified)
        LOG: Appbase = file:///C:/Users/C:\Users\[path to project]/
        LOG: Initial PrivatePath = C:\Users\[path to project]\bin
        Calling assembly : App_Web_qcjylaoc, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null.
        ===
        LOG: This bind starts in default load context.
        LOG: Using application configuration file: C:\Users\[path to project]\web.config
        LOG: Using host configuration file:
        LOG: Using machine configuration file from C:\Windows\Microsoft.NET\Framework\v4.0.30319\config\machine.config.
        LOG: Post-policy reference: System.Web.Mvc, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
        LOG: The same bind was seen before, and was failed with hr = 0x80070002.
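
    If some compiled view or library is still asking for the old assembly, a binding redirect in web.config (of the kind the MVC 3 upgrade notes describe) forwards the request to version 3 – a hedged sketch:

        <runtime>
          <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
            <dependentAssembly>
              <assemblyIdentity name="System.Web.Mvc"
                                publicKeyToken="31bf3856ad364e35" />
              <bindingRedirect oldVersion="1.0.0.0-2.0.0.0" newVersion="3.0.0.0" />
            </dependentAssembly>
          </assemblyBinding>
        </runtime>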

  • Migrating MachineKey from iis6 on old server to iis7 on new server

    - by MaseBase
    I am migrating our hosting environment to a totally new data center, with new boxes and hardware and software – the whole deal. Our website cookies are encrypted using the machineKey, so when I make a request to my domain and point it to the new web server (by overriding the local hosts file), I get an error because the cookie cannot be decrypted, since the machine key is different. I'd like to avoid any problems a frequent user might have when they arrive at the new server for the first time.

    To the best of my knowledge, at this point I think I need to set the same machineKey from our current servers on our new servers. This way, when past visitors with a cookie arrive at our website served by the new server, the cookie will be decrypted properly with the machineKey it was encrypted with, and they will be logged in properly.

    My question is: where do I find my machineKey value (on an IIS 6 / Win2k3 server) so I can use that value to set it statically on my new servers? I've pulled up my machine.config file, but it doesn't specify the key; it only specifies a configSection where the key can be defined. It's not in my web.config for the app or elsewhere. I did find a great article on some machineKey and web garden woes (which could explain some other bugs I've been experiencing with regard to the machineKey).

    Update: I am back to this issue and am still faced with a similar problem. The machineKey is auto-generated on the IIS 6 server, but I need to get that exact key so I can set it explicitly and not have it auto-generated anymore. Any help is appreciated...
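
    For context, a hedged sketch of the end state: auto-generated keys are machine-local and not meant to be copied out, so the usual route is to set one explicit key, identical on every old and new server, and let it take over from the auto-generated one. The values below are placeholders, not usable keys:

        <system.web>
          <machineKey
              validationKey="REPLACE-WITH-128-HEX-CHARS"
              decryptionKey="REPLACE-WITH-48-HEX-CHARS"
              validation="SHA1"
              decryption="AES" />
        </system.web>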

  • javac will not compile enum (Windows Sun 1.6 --> OpenJDK 1.6)

    - by avgvstvs
        package com.scheduler.process;

        public class Process {
            public enum state {
                NOT_SUBMITTED, SUBMITTED, BLOCKED, READY, RUNNING, COMPLETED
            }

            private state currentState;

            public state getCurrentState() {
                return currentState;
            }

            public void setCurrentState(state currentState) {
                this.currentState = currentState;
            }
        }

        package com.scheduler.machine;

        import com.scheduler.process.Process;
        import com.scheduler.process.Process.state;

        public class Machine {
            com.scheduler.process.Process p = new com.scheduler.process.Process();
            state s = state.READY; // fails if I don't also explicitly import Process.state
            p.setCurrentState(s); // says I need a declarator id after 's'... this is wrong.
            p.setCurrentState(state.READY);
        }

    Modified the example to try and direct attention to the issue. I cannot change the state in this code. Eclipse suggests importing Process.state like I had in my previous example, but this doesn't work either. That allows state s = state.READY, but the call to p.setCurrentState(s); fails, as does p.setCurrentState(state.READY);.

  • Boost Include Files in VC++

    - by Dr. K
    For the last few years, I have been exclusively a C# developer. Previously, I developed in C++ and have a C++ application that I built about 3 years ago using VS2005. It made extensive use of the Boost libraries. I recently decided to brush off the old app and rebuild it in VS2008 with the latest version of Boost (the latest version with the "easy" installation program from BoostPro Computing), 1.39. Previously, when I had the program running, I was at 1.33. Also, the last time the program was running was at least 2 OS installations ago. The Boost installation is located on my machine at "C:\Program Files\boost\boost_1_39".

    Anyway, I have done the following:

    - Set the project's "Additional Include Directories" directory to "C:\Program Files\boost\boost_1_39"
    - Added "C:\Program Files\boost\boost_1_39" to VS2008's Tools - Options - Projects and Solutions - VC++ Directories - Include Files

    I have a number of Boost includes in my stdafx.h file. The compiler fails upon attempting to open the first one:

        #include <boost/algorithm/string/string.hpp>

    I have confirmed that the above file is indeed located at "C:\Program Files\boost\boost_1_39\boost\algorithm\string\string.hpp". I continue to get:

        fatal error C1083: Cannot open include file: 'boost/algorithm/string/string.hpp': No such file or directory

    Any tips on what else to check would be greatly appreciated. Again, this is an application that compiled fine a few years ago, but the source has now been moved to a new machine/compiler.

  • TCP/IP RST being sent differently in different browsers.

    - by Brian
    On Mac OS X (10.6), if I start a YouTube video download and pull the Ethernet cable for 5 or so seconds, then plug it back in, I get varying results depending on the browser. With Opera and Chrome, after I plug the cable back in the video continues to load. But with Safari and Firefox, it never does.

    Using Wireshark to look at the traffic, I found that Opera and Chrome simply ACK the first packet from YouTube after the cable has been plugged back in, but Safari and Firefox set the RST flag (0x4) in the TCP header, and no more traffic follows. If I put a hub in between the machine and the internet connection, the problem goes away and all four browsers continue loading the video when the cable is plugged back into the hub. Again, looking at the Wireshark logs, it's evident that the machine doesn't see the multicast connection close and there is simply a delay in the packets flowing through.

    So it seems that if Safari and Firefox see a multicast connection close, and then later see data on that same connection, they will send a RST. My question is: why? What is the correct course of action, and why are 2 of the 4 browsers doing it one way, while the other 2 are doing it another way? Is there somewhere in the code where I can see this happening in Firefox, for instance? Thank you very much.

  • Design for fastest page download

    - by mexxican
    I have a file with millions of URLs/IPs and have to write a program to download the pages really fast. The connection rate should be at least 6000/s and the file download rate at least 2000/s, with an average file size of 15kb. The network bandwidth is 1 Gbps.

    My approach so far has been:

    - Create 600 socket threads, each with 60 sockets, and use WSAEventSelect to wait for data to read.
    - As soon as a file download is complete, add that memory address (of the downloaded file) to a pipeline (a simple vector) and fire another request.
    - When the total downloaded is more than 50MB across all socket threads, write all the downloaded files to disk and free the memory.

    So far, this approach has not been very successful: the connection rate I can hit doesn't shoot beyond 2900/s, and the downloaded data rate is even less. Can somebody suggest an alternative approach which could give me better stats?

    Also, I am working on a Windows Server 2008 machine with 8 gigs of memory. Do we need to hack the kernel so we could use more threads and memory? Currently I can create a max of 1500 threads, and memory usage doesn't go beyond 2 gigs [which technically should be much more, as this is a 64-bit machine]. And IOCP is out of the question, as I have no experience with it so far and have to fix this application today. Thanks Guys!

  • Try method in PowerShell

    - by Willy
    So I want to build a try method into my PowerShell script below. If I am denied access to a server, I want it to skip that server. Please help.

        $Computers = "server1", "server2"
        Get-WmiObject Win32_LogicalMemoryConfiguration -Computer $Computers | Select-Object `
            @{n='Server';e={ $_.__SERVER }}, `
            @{n='Physical Memory';e={ "$('{0:N2}' -f ($_.TotalPhysicalMemory / 1024))mb" }}, `
            @{n='Virtual Memory';e={ "$('{0:N2}' -f ($_.TotalPageFileSpace / 1024))mb" }} | `
            Export-CSV "output.csv"

  • ActiveSync File Explorer alternative

    - by Andy White
    Is there an alternative to the ActiveSync "Explore" for looking at the file system of a Windows Mobile device? (This same thing can also be accessed from My Computer - Mobile Device). It would be nice if you could just navigate into the device and view or edit files without having to copy them back and forth from the PC. Why does it work this way in the first place? Is the Explore view sort of a "virtual" version of your phone's file system that cannot be edited directly?

  • JsonIgnore attributes not working in ASP.NET?

    - by Geoffrey
    I've got an object in my project with circular references. I've put [JsonIgnore] above the field like so:

        [JsonIgnore]
        public virtual Foobar ChildObject { get; set; }

    I'm still getting circular reference errors when I serialize the object. The only fields that do not have JsonIgnore are string fields and should not cause this. Is there something else I need to do to get JsonIgnore to work? Thanks!
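
    One thing worth ruling out, sketched under an assumption: [JsonIgnore] belongs to Json.NET, so if the serialization is actually done by ASP.NET's built-in JavaScriptSerializer, the attribute it honors is [ScriptIgnore] instead (Foobar is the poster's type):

        using System.Web.Script.Serialization;

        public class ParentObject
        {
            public string Name { get; set; }

            [ScriptIgnore] // JavaScriptSerializer skips this; [JsonIgnore] only affects Json.NET
            public virtual Foobar ChildObject { get; set; }
        }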

  • Cannot .Count() on IQueryable (NHibernate)

    - by Bruno Reis
    Hello, I have an irritating problem. It might be something stupid, but I couldn't find out what. I'm using Linq to NHibernate, and I would like to count how many items there are in a repository. Here is a very simplified definition of my repository, with the code that matters:

        public class Repository {
            private ISession session;
            /* ... */
            public virtual IQueryable<Product> GetAll() {
                return session.Linq<Product>();
            }
        }

    All the relevant code is at the end of the question. To count the items in my repository, I do something like:

        var total = productRepository.GetAll().Count();

    The problem is that total is 0. Always. However, there are items in the repository. Furthermore, I can .Get(id) any of them. My NHibernate log shows that the following query was executed:

        SELECT count(*) as y0_ FROM [Product] this_ WHERE not (1=1)

    That "WHERE not (1=1)" clause must be the cause of this problem. What can I do to be able to .Count() the items in my repository? Thanks!

    EDIT: Actually the repository.GetAll() code is a little bit different... and that might change something! It is actually a generic repository for entities. Some of the entities also implement the ILogicalDeletable interface (it contains a single bool property, "IsDeleted"). Just before the return inside the GetAll() method, I check whether the entity I'm querying implements ILogicalDeletable.

        public interface IRepository<TEntity, TId> where TEntity : Entity<TEntity, TId> {
            IQueryable<TEntity> GetAll();
            ...
        }

        public abstract class Repository<TEntity, TId> : IRepository<TEntity, TId>
            where TEntity : Entity<TEntity, TId> {
            public virtual IQueryable<TEntity> GetAll() {
                if (typeof (ILogicalDeletable).IsAssignableFrom(typeof (TEntity))) {
                    return session.Linq<TEntity>()
                        .Where(x => (x as ILogicalDeletable).IsDeleted == false);
                } else {
                    return session.Linq<TEntity>();
                }
            }
        }

        public interface ILogicalDeletable {
            bool IsDeleted { get; set; }
        }

        public class Product : Entity<Product, int>, ILogicalDeletable { ... }

        public interface IProductRepository : IRepository<Product, int> { }

        public class ProductRepository : Repository<Product, int>, IProductRepository { }

    EDIT 2: Actually .GetAll() is always returning an empty result set for entities that implement the ILogicalDeletable interface (i.e. it ALWAYS adds a WHERE NOT (1=1) clause). I think Linq to NHibernate does not like the typecast.
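
    A hedged workaround sketch, on the assumption that the provider chokes on the "as" cast inside the lambda (plausible, since the old Linq-to-NHibernate provider can emit a false criterion for expressions it cannot translate): route the filter through a helper whose type parameter is constrained to ILogicalDeletable, so the predicate is a plain typed property access:

        public virtual IQueryable<TEntity> GetAll()
        {
            if (typeof(ILogicalDeletable).IsAssignableFrom(typeof(TEntity)))
            {
                // TEntity has no static ILogicalDeletable constraint here, so
                // bounce through the constrained helper via reflection.
                var filter = typeof(Repository<TEntity, TId>)
                    .GetMethod("FilterDeleted", System.Reflection.BindingFlags.NonPublic
                                              | System.Reflection.BindingFlags.Static)
                    .MakeGenericMethod(typeof(TEntity));
                return (IQueryable<TEntity>)filter.Invoke(
                    null, new object[] { session.Linq<TEntity>() });
            }
            return session.Linq<TEntity>();
        }

        private static IQueryable<T> FilterDeleted<T>(IQueryable<T> items)
            where T : class, ILogicalDeletable
        {
            // Typed access to IsDeleted, no cast in the expression tree.
            return items.Where(x => x.IsDeleted == false);
        }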

  • Accessing smart card with Java

    - by Tom Brito
    I'm trying to learn how Java accesses smart cards, as part of analysis for a project. I wonder if there is any kind of virtual smart card which I could use to run some tests with Java? By the way, I've read about Java Card, and it looks like it is used to run Java on cards, not for accessing smart card data, right?
