Search Results

Search found 752 results on 31 pages for 'leaks'.

Page 17/31 | < Previous Page | 13 14 15 16 17 18 19 20 21 22 23 24  | Next Page >

  • Freeing ImageData when deleting a Canvas

    - by user578770
    I'm writing a XUL application that uses HTML Canvas to display bitmap images. I generate ImageData objects and import them into canvases using the putImageData function:

        for (var pageIndex = 0; pageIndex < 100; pageIndex++) {
            this.img = imageDatas[pageIndex];
            /* Create the Canvas element */
            var imgCanvasTmp = document.createElementNS("http://www.w3.org/1999/xhtml", 'html:canvas');
            imgCanvasTmp.setAttribute('width', this.img.width);
            imgCanvasTmp.setAttribute('height', this.img.height);
            /* Import the image into the Canvas */
            imgCanvasTmp.getContext('2d').putImageData(this.img, 0, 0);
            /* Use the Canvas in another part of the program (commented out for testing) */
            // this.displayCanvas(imgCanvasTmp, pageIndex);
        }

    The images are imported correctly, but there seems to be a memory leak caused by the putImageData function. When the for loop exits, I would expect the memory allocated for the canvases to be freed, but when I run the code without the putImageData call I notice that my program ends up using about 100 MB less (my images are quite big). I came to the conclusion that putImageData prevents the garbage collector from freeing the allocated memory. Do you have any idea how I could force the garbage collector to free that memory? Is there any way to empty a canvas? I already tried deleting the canvas with the delete operator and clearing it with clearRect, but neither did anything. I also tried reusing the same canvas to display each image, but the amount of memory used did not change, as if the new images were imported without the existing ones being released...

    Read the article

  • C#/XNA Giant Memory Leak

    - by user1440926
    This is my first post here. I'm making a game in Visual Studio 2010 using XNA, and I've hit a giant memory leak. My game starts out using 17k of RAM and after ten minutes it's up to 65k. I ran some memory profilers, and they all say that new instances of the String object are being created, but they aren't live; the number of live String instances hasn't changed at all. It's also creating instances of Char[] (which I'd expect), Object[], and StringBuilder. My game is pretty new, but there's too much code to post here. I have no idea how to get rid of instances that aren't live. Please help!

    Read the article

  • Potential leak of an object allocated

    - by idober
    Using Xcode's Build and Analyze, I saw that I have a memory leak in my code:

        - (NSString *) doIt {
            NSString *var = [[NSString alloc] init];
            return var;
        }

    This is of course a simplified snippet of my problem. Where do I release the object?

    Read the article

  • Release resources in .Net C#

    - by zaidwaqi
    Hi, I'm new to C# and .NET and have been reading up on them. Why and when do I need to release resources? Doesn't the garbage collector take care of everything? When do I need to implement IDisposable, and how is it different from a destructor in C++? Also, if my program is rather small (e.g. a screensaver), do I need to care about releasing resources? Thanks.

    Read the article

  • Visual C++ function suddenly 170x slower

    - by Mikael
    For the past few months I've been working on a Visual C++ project that takes images from cameras and processes them. Up until today this took about 65 ms to update the data, but now it has suddenly increased significantly. What happens is: I launch my program and for the first 30 or so iterations it performs as expected; then the loop time suddenly jumps from 65 ms to 250 ms. The odd thing is that, after timing each function, I found the part of the code causing the slowdown is fairly basic and has not been modified in over a month. The data that goes into it is unchanged and identical every iteration, but its execution time, initially less than 1 ms, suddenly increases to 170 ms, while the rest of the code still performs as expected time-wise. Basically, I am calling the same function over and over; for the first 30 calls it performs as it should, and after that it slows down for no apparent reason. It might also be worth noting that it is a sudden change in execution time, not a gradual increase. What could be causing this? The code is leaking some memory (~50 KB/s), but not nearly enough to warrant a sudden 4x slowdown. If anyone has any ideas I'd love to hear them!

    Read the article

  • Will this be garbage collected in JVM?

    - by stjowa
    I am running the following code every two minutes via a Timer: object = new Object(this); Potentially, this is a lot of objects being created and a lot of objects being overwritten. Do the overwritten objects get garbage collected, even though each newly created object is passed a reference back to the enclosing object (this)? I am using JDK 1.6.0_13. Thanks for the help.
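
    As a minimal, self-contained sketch of the pattern being described (the Holder class, the field name and the two-minute interval are hypothetical stand-ins for the asker's code): once the field is reassigned, the object created on the previous tick is unreachable from any GC root and becomes eligible for collection; the back-reference it holds to its creator does not keep it alive, because the JVM collects unreachable object graphs as a whole, cycles included.

        import java.util.Timer;
        import java.util.TimerTask;

        // Minimal sketch (hypothetical names): a Holder keeps a reference back to its
        // creator, and the creator overwrites its 'holder' field on every timer tick.
        public class GcSketch {
            static class Holder {
                private final Object owner;          // back-reference to the creator
                Holder(Object owner) { this.owner = owner; }
            }

            private Holder holder;                   // overwritten on every tick

            void start() {
                new Timer(true).schedule(new TimerTask() {
                    @Override public void run() {
                        // The Holder created on the previous tick has no other referents in
                        // this sketch, so after this assignment it is unreachable and eligible
                        // for collection; its back-reference to the creator does not pin it.
                        holder = new Holder(GcSketch.this);
                    }
                }, 0, 2 * 60 * 1000);                // every two minutes
            }

            public static void main(String[] args) throws InterruptedException {
                new GcSketch().start();
                Thread.sleep(10 * 60 * 1000);        // let a few ticks run
            }
        }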

    Read the article

  • Freeing memory twice

    - by benjamin button
    Hi. AFAIK, freeing a NULL pointer results in nothing: no work is done by the implementation. Still, I see statements where people say that one scenario in which memory corruption can occur is freeing the same memory twice. Is this still true?

    Read the article

  • How can I tell what is using the memory when there is a heap overflow in Java?

    - by Grae
    Hi all, I know a little about profiling, but what I am particularly interested in is what is holding all the memory when I get these heap overflow exceptions. I start getting them after about an hour of debugging. I am hoping there is some sort of dump, or something similar, that I can use to get a list of what instances are around at the time the program starts. By the way, sorry if this is a lazy question; I really should put some time into learning about profiling. Grae
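
    For what it's worth, on a HotSpot JVM (JDK 6+) a heap dump can answer exactly this question: jmap -dump:live,format=b,file=heap.hprof <pid> from the command line, or the -XX:+HeapDumpOnOutOfMemoryError startup flag, both produce an .hprof file that tools such as Eclipse MAT, VisualVM or jhat can open to show which instances exist and what is keeping them reachable. Below is a minimal sketch of triggering the same dump programmatically via the HotSpot-specific diagnostic MXBean (class and file names are illustrative):

        import com.sun.management.HotSpotDiagnosticMXBean;
        import java.io.IOException;
        import java.lang.management.ManagementFactory;

        // Minimal sketch: write an .hprof heap dump that a heap analyzer can open
        // to list live instances and their retaining references. HotSpot-specific API.
        public class HeapDumpSketch {
            public static void dump(String path) throws IOException {
                HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                        ManagementFactory.getPlatformMBeanServer(),
                        "com.sun.management:type=HotSpotDiagnostic",
                        HotSpotDiagnosticMXBean.class);
                bean.dumpHeap(path, true);   // true = dump only live (reachable) objects
            }

            public static void main(String[] args) throws IOException {
                dump("heap.hprof");
            }
        }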

    Read the article

  • Is this a php memory leak?

    - by mseifert
    I have memory_get_usage() in the footer of my page, and with each refresh I watch it increase by about 100k. My page load creates many objects and destroys them when done. My parent objects each have a __destruct() which uses unset() on all child objects, and child objects with a reference back to the parent have a __destruct() that unset()s those references. Inserting memory_get_usage() before and after processing different parts of my page only tells me how much of the total usage was added by that part of the script. How do I go about determining what memory is lost and not recycled for garbage collection after the page finishes loading? I have one global $_SESSION var containing objects that store user info, but I have verified using strlen(serialize($object)) that this object is not growing in size. I presume that what I am seeing is a memory leak and that PHP garbage collection should take effect after the script ends. Any ideas how to debug this?

    Read the article

  • DOMDocument::load in PHP 5

    - by Abs
    Hello all, I open a 10 MB+ XML file several times in my script, in different functions:

        $dom = DOMDocument::load( $file ) or die('couldnt open');

    1) Is the above the old style of loading a document (opening it statically)? I am using PHP 5. 2) Do I need to close the loaded XML file, if that is possible? I suspect it's causing memory problems, because I loop through the nodes of the XML file several thousand times and sometimes my script just ends abruptly. Thanks all for any help.

    Read the article

  • What's the best way to fix memory leaks on the iPhone?

    - by Anthony Glyadchenko
    I'm trying to use Xcode's Leaks tool to fix some memory leaks in my code. Is there a better, more understandable way to check for leaks, with explanations that pinpoint the exact line and/or offer suggestions? Another question: I'm using AVAudioRecorder in one of my view controllers. Should I load the recorder in viewDidLoad or in viewWillAppear?

    Read the article

  • Best practice for handling memory leaks in large Java projects?

    - by knorv
    In almost all of the larger Java projects I've been involved with, I've noticed that the quality of service of the application degrades with the uptime of the container, most probably due to memory leaks in the code. The correct way to solve this problem is obviously to trace the leaks back to their root cause and fix them in the code; the quick and dirty way is simply to restart Tomcat (or whichever servlet container you're using). These are my three questions: (1) Assuming you choose to solve the problem by tracing it to its root cause (the memory leaks), how would you collect data to zoom in on the problem? (2) Assuming you choose the quick and dirty way of speeding things up by simply restarting the container, how would you collect data to choose the optimal restart cycle? (3) Have you been able to deploy and run projects over an extended period of time without ever restarting the servlet container to regain snappiness, or is an occasional servlet container restart something one simply has to accept?
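
    As a sketch of one low-overhead way to collect data for questions (1) and (2) above: sample heap usage at a fixed interval and watch the post-GC baseline; a leak shows up as a floor that keeps rising, and the slope gives a rough restart budget. The class name and 60-second interval below are illustrative, the output could just as well go to your normal logging framework, and tools like jstat -gcutil or a heap dump opened in Eclipse MAT are the natural next step when you need to zoom in on the actual culprits.

        import java.lang.management.ManagementFactory;
        import java.lang.management.MemoryMXBean;
        import java.lang.management.MemoryUsage;
        import java.util.Timer;
        import java.util.TimerTask;

        // Minimal sketch: periodically record heap usage so that a slowly rising
        // baseline (a likely leak) is visible long before the container needs a restart.
        public class HeapTrendLogger {
            public static void start() {
                final MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
                new Timer("heap-trend", true).schedule(new TimerTask() {
                    @Override public void run() {
                        MemoryUsage heap = memory.getHeapMemoryUsage();
                        System.out.println("heap used=" + (heap.getUsed() >> 20)
                                + " MB, committed=" + (heap.getCommitted() >> 20)
                                + " MB, max=" + (heap.getMax() >> 20) + " MB");
                    }
                }, 0, 60000);   // every 60 seconds
            }

            public static void main(String[] args) throws InterruptedException {
                start();
                Thread.sleep(5 * 60000);   // run for five minutes as a demo
            }
        }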

    Read the article

  • Leaks on Wikis: "Corporations...You're Next!" Oracle Desktop Virtualization Can Help.

    - by adam.hawley
    Between all the press coverage of the unauthorized release of 251,287 diplomatic documents and of previous extensive releases of classified documents on the events in Iraq and Afghanistan, one could be forgiven for thinking massive leaks are really an issue for governments. But it is not: it is an issue for corporations as well. In fact, corporations are apparently set to be the next big target for things like Wikileaks. Just the threat of such a release against one corporation recently caused the price of its stock to drop 3% after the leak organization claimed to have 5GB of information from inside the company, with the implication that it might be damaging or embarrassing information.

    At the time of this blog post, anyway, we don't know yet whether that is true or how they got the information. But how did the diplomatic cable leak happen? According to press reports, a private in the military with an appropriate level of security clearance (that is, he apparently had the correct clearance to be accessing the information; he reportedly didn't "hack" his way through anything to get to the documents, which might have raised some red flags) is accused of accessing the material, copying it onto a writeable CD labeled "Lady Gaga", and walking out the door with it. Upload and... done. In the same article, the accused is quoted as saying "Information should be free. It belongs in the public domain."

    Now think about all the confidential information in your company or non-profit: credit card information, phone records, customer or donor lists, corporate strategy documents, product cost information, and so on. Then think about that last quote coming from a very junior person in the organization. Still feeling comfortable with your ability to control all your information?

    So what can you do to guard against these types of breaches, where there is no outsider (or even insider) intrusion to detect per se, but rather someone with malicious intent physically walking out the door with data they are otherwise allowed to access in their daily work? A major first step is to make it physically and logistically much harder to walk away with the information. If the user with malicious intent has no way to copy to removable or mobile media (USB sticks, thumb drives, CDs, DVDs, memory cards, or even laptop disk drives), then as a practical matter it is much more difficult to physically move the information outside the firewall. But how can you control access tightly and reliably and still keep your hundreds or even thousands of users productive in their daily jobs? Oracle Desktop Virtualization products can help.

    Oracle's comprehensive suite of desktop virtualization and access products allows your applications and, most importantly, the related data to stay in the (highly secured) data center while still allowing secure access from just about anywhere your users need to be to be productive. Users can securely access all the data they need to do their job, whether from work, from home, or on the road and in the field, but fully configurable policies set up centrally by privileged administrators allow you to control whether, for instance, they are allowed to print documents or use USB devices or other removable media. Centrally set policies can also control not only whether they can download to removable devices, but also whether they can upload information (see Stuxnet for why that is important...).

    In fact, by using Sun Ray Client desktop hardware, which does not contain any disk drives or removable media drives, even theft of the desktop device itself would not make you vulnerable to data loss, unlike a laptop that can be stolen with hundreds of gigabytes of information on its disk drive. And for extreme security situations, Sun Ray Clients even come standard with the ability to use fibre-optic Ethernet networking to each client to prevent the possibility of unauthorized monitoring of network traffic. But even without Sun Ray Client hardware, users can leverage Oracle's Secure Global Desktop software or the Oracle Virtual Desktop Client to securely access server-resident applications, desktop sessions, or full desktop virtual machines without persisting any application data on the desktop or laptop being used to access the information. And, again, even in this context, the Oracle products allow you to control what gets uploaded, downloaded, or printed, for example.

    Another benefit of Oracle's desktop virtualization and access products is the ability to rapidly and easily shut off user access centrally through administrative policies if, for example, an employee changes roles or leaves the company and should no longer have access to the information. Oracle's desktop virtualization suite of products can help reduce operating expense and increase user productivity, and those are good reasons alone to consider their use. But the dynamics of today's world dictate that security is one of the top reasons for implementing a virtual desktop architecture in enterprises. For more information on these products, view the webpages on www.oracle.com and the Oracle Technology Network website.

    Read the article

  • Website with a large number of users keeps going down due to memory leaks: Tomcat 6 and Java 6

    - by user1766478
    We host many websites on one of our two virtual servers, using Tomcat 6 and Java 6. The application follows an MVC model with a Hibernate-like layer. The problem is that the site of one of our biggest clients, the one with the most members, keeps crashing the server every 6-8 hours (precisely in the mornings, when most members log in). We have been having this issue for 4 days now and are trying to figure out the problem; we suspect memory leaks. Any suggestions?
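
    A minimal sketch of one way to gather evidence before the next morning crash, assuming a standard servlet stack (HeapLoggingFilter is a made-up name, and the filter would still need a mapping in web.xml): log used heap around each request so that growth can be correlated with the login spike. Running the Tomcat JVM with -XX:+HeapDumpOnOutOfMemoryError is also worth considering, so a crash at least leaves an .hprof file behind to analyze.

        import java.io.IOException;
        import javax.servlet.Filter;
        import javax.servlet.FilterChain;
        import javax.servlet.FilterConfig;
        import javax.servlet.ServletException;
        import javax.servlet.ServletRequest;
        import javax.servlet.ServletResponse;

        // Hypothetical diagnostic filter: logs used heap before and after each request.
        public class HeapLoggingFilter implements Filter {
            public void init(FilterConfig config) {}
            public void destroy() {}

            public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
                    throws IOException, ServletException {
                Runtime rt = Runtime.getRuntime();
                long before = rt.totalMemory() - rt.freeMemory();
                try {
                    chain.doFilter(req, resp);
                } finally {
                    long after = rt.totalMemory() - rt.freeMemory();
                    System.out.println("heap used: " + (after >> 20) + " MB (delta "
                            + ((after - before) >> 20) + " MB)");
                }
            }
        }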

    Read the article

  • How Can I Prevent Memory Leaks in IE Mobile?

    - by Jake Howlett
    Hi all, I've written an application for offline use (with Google Gears) on devices running IE Mobile. The devices are leaking memory at such a rate that they become unusable over time. The problem page fetches entries from the local Gears database and renders a table of the entries, with a link in the last column of each row to open the entry (the link is just onclick="open('myID')"). When the user is done with an entry they return to the table, which is re-rendered. It's the repeated building of this table that appears to be the problem, mainly the onclick events. The table is generated, in essence, like this:

        var tmp = "";
        for (var i = 0; i < 100; i++) {
            tmp += "<tr><td>row " + i + "</td><td><a href=\"#\" id=\"LINK-" + i + "\"" +
                   " onclick=\"afunction();return false;\">link</a></td></tr>";
        }
        document.getElementById('view').innerHTML = "<table>" + tmp + "</table>";

    I've read up on common causes of memory leaks and tried setting the onclick handler of each link to null before re-rendering the table, but it still seems to leak. Anybody got any ideas? In case it matters, the function called from each link looks like this:

        function afunction() {
            document.getElementById('view').style.display = "none";
        }

    Would that constitute a circular reference in any way? Jake

    Read the article

  • NSDecimalNumber leaks memory if not used with AutoRelease pool?

    - by bioffe
        NSString* str = [[NSString alloc] initWithString:@"0.05"];
        NSDecimalNumber* num = [[NSDecimalNumber alloc] initWithString:str];
        NSLog(@" %@", num);
        [str release];
        [num release];

    leaks memory:

        *** __NSAutoreleaseNoPool(): Object 0x707990 of class NSCFString autoreleased with no pool in place - just leaking

    Can someone suggest a workaround?

    Read the article

  • Does a lambda in List.ForEach lead to memory leaks and performance problems?

    - by Monomachus
    I have a problem which I could solve using something like this:

        sortedElements.ForEach((XElement el) => PrintXElementName(el, i++));

    which means the lambda passed to ForEach lets me use variables like int i. I like that way of doing it, but I read somewhere that anonymous methods and lambda delegates lead to a lot of memory leaks, because each time the lambda is executed something is instantiated but not released, or something like that. Could you please tell me whether this is true in this situation, and if it is, why?

    Read the article

  • 47 memory leaks. STL pointers.

    - by icelated
    I have a major amount of memory leaks. I know that the sets hold pointers and I cannot change that! I cannot change anything; I can only clean up the mess I have... I am allocating memory with new in just about every function to add information to the sets. I have CD, DVD and Book classes derived from an Item class, plus a Library class. In the Library class I have two functions for cleaning up the sets. Also, the CD, DVD and Book destructors are not being called. Here are my potential leaks.

    library.h:

        #pragma once
        #include <ostream>
        #include <map>
        #include <set>
        #include <string>
        #include "Item.h"
        using namespace std;

        typedef set<Item*> ItemSet;
        typedef map<string, Item*> ItemMap;
        typedef map<string, ItemSet*> ItemSetMap;

        class Library {
        public:
            // general functions
            void addKeywordForItem(const Item* const item, const string& keyword);
            const ItemSet* itemsForKeyword(const string& keyword) const;
            void printItem(ostream& out, const Item* const item) const;

            // book-related functions
            const Item* addBook(const string& title, const string& author, const int nPages);
            const ItemSet* booksByAuthor(const string& author) const;
            const ItemSet* books() const;

            // music-related functions
            const Item* addMusicCD(const string& title, const string& band, const int nSongs);
            void addBandMember(const Item* const musicCD, const string& member);
            const ItemSet* musicByBand(const string& band) const;
            const ItemSet* musicByMusician(const string& musician) const;
            const ItemSet* musicCDs() const;

            // movie-related functions
            const Item* addMovieDVD(const string& title, const string& director, const int nScenes);
            void addCastMember(const Item* const movie, const string& member);
            const ItemSet* moviesByDirector(const string& director) const;
            const ItemSet* moviesByActor(const string& actor) const;
            const ItemSet* movies() const;

            ~Library();
            void Purge(ItemSet& set);
            void Purge(ItemSetMap& map);
        };

    Here are some of the functions that add information using new in the Library class. Keep in mind I am cutting out a lot of code to keep this post short.

    library.cpp:

        #include "Library.h"
        #include "book.h"
        #include "cd.h"
        #include "dvd.h"
        #include <iostream>

        // general functions
        ItemSet allBooks;
        ItemSet allCDS;
        ItemSet allDVDs;
        ItemSetMap allBooksByAuthor;
        ItemSetMap allmoviesByDirector;
        ItemSetMap allmoviesByActor;
        ItemSetMap allMusicByBand;
        ItemSetMap allMusicByMusician;

        const ItemSet* Library::itemsForKeyword(const string& keyword) const {
            const StringSet* kw;
            ItemSet* obj = new ItemSet();
            return obj;
        }

        const Item* Library::addBook(const string& title, const string& author, const int nPages) {
            ItemSet* obj = new ItemSet();
            Book* item = new Book(title, author, nPages);
            allBooks.insert(item);   // add to set of all books
            obj->insert(item);
            return item;
        }

        const Item* Library::addMusicCD(const string& title, const string& band, const int nSongs) {
            ItemSet* obj = new ItemSet();
            CD* item = new CD(title, band, nSongs);
            return item;
        }

        void Library::addBandMember(const Item* musicCD, const string& member) {
            ItemSet* obj = new ItemSet();
            (((CD*) musicCD)->addBandMember(member));
            obj->insert((CD*) musicCD);
        }

    Here is the Library destructor:

        Library::~Library() {
            Purge(allBooks);
            Purge(allCDS);
            Purge(allDVDs);
            Purge(allBooksByAuthor);
            Purge(allmoviesByDirector);
            Purge(allmoviesByActor);
            Purge(allMusicByBand);
            Purge(allMusicByMusician);
        }

        void Library::Purge(ItemSet& set) {
            for (ItemSet::iterator it = set.begin(); it != set.end(); ++it)
                delete *it;
            set.clear();
        }

        void Library::Purge(ItemSetMap& map) {
            for (ItemSetMap::iterator it = map.begin(); it != map.end(); ++it)
                delete it->second;
            map.clear();
        }

    So, basically, the Item, CD and DVD classes all have a set like this:

        typedef set<string> StringSet;

        class CD : public Item {
            StringSet* music;
        };

    and I am deleting it like this, but the derived-class destructors are not being called; only the Item destructor is:

        CD::~CD() {
            delete music;
        }

    Do I need a copy constructor? How do I delete the objects I am creating in the Library class, and how can I get the CD and DVD destructors called? Would the addBandMember function in library.cpp mean I need a copy constructor? I would really appreciate any real help cleaning up this mess, rather than being told not to use pointers in my sets. How can I delete the memory I am allocating in those functions? I cannot delete it inside the functions!

    Read the article

  • Can I fix memory leaks in browsers? (Win7 64bit IE9 Chrome)

    - by Randy
    I have a habit of opening websites in their own window, then getting interested in something else in another window while leaving the first window open, with the intention of returning to finish reading. Sometimes I'll leave browser windows open for quite a while. When I do, the browser locks up gigabytes of memory until it uses so much that IE will no longer respond. I've seen one browser window use over 2 GB just sitting there doing nothing. Chrome appears to do it too, but it doesn't use nearly as much as IE. Is there a way to stop this? Any ideas what may be causing it? Is it a problem with the OS or the browsers? Any insight is appreciated.

    Read the article
